Epistemic Beliefs and Conceptual Understanding in Biotechnology: A Case Study
NASA Astrophysics Data System (ADS)
Rebello, Carina M.; Siegel, Marcelle A.; Witzig, Stephen B.; Freyermuth, Sharyn K.; McClure, Bruce A.
2012-04-01
The purpose of this investigation was to explore students' epistemic beliefs and conceptual understanding of biotechnology. Epistemic beliefs can influence reasoning, how individuals evaluate information, and informed decision-making abilities. These skills are important for an informed citizenry that will participate in debates regarding areas in science such as biotechnology. We report on an in-depth case study analysis of three undergraduate, non-science majors in a biotechnology course designed for non-biochemistry majors. We selected participants who performed above average and below average on the first in-class exam. Data from multiple sources—interviews, exams, and a concept instrument—were used to construct (a) individual profiles and (b) a cross-case analysis of our participants' conceptual development and epistemic beliefs from two different theoretical perspectives—Women's Ways of Knowing and the Reflective Judgment Model. Two trained researchers independently coded all case records for both theoretical perspectives, with initial Cohen's kappa values above .715 (substantial agreement), and then reached consensus on the codes. Results indicate that a student with a more sophisticated epistemology demonstrated greater conceptual understanding at the end of the course than a student with a less sophisticated epistemology, even though the latter performed higher initially. In addition, a student with a less sophisticated epistemology and low initial conceptual performance did not demonstrate gains in overall conceptual understanding. The results suggest the need for instructional interventions that foster learners' epistemological development in order to facilitate their conceptual growth.
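As an illustrative aside, the inter-rater agreement statistic reported above (Cohen's kappa above .715) can be computed directly from two coders' label sequences. This is a generic sketch with made-up labels, not the study's data:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two raters labeling the same set of items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items given identical labels.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)
```

Values of .61 to .80 are conventionally read as "substantial agreement" (Landis and Koch), which is the interpretation the abstract applies to .715.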
Enhanced analysis and users manual for radial-inflow turbine conceptual design code RTD
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1995-01-01
Modeling enhancements made to a radial-inflow turbine conceptual design code are documented in this report. A stator-endwall clearance-flow model was added for use with pivoting vanes. The rotor calculations were modified to account for swept blades and splitter blades. Stator and rotor trailing-edge losses and a vaneless-space loss were added to the loss model. Changes were made to the disk-friction and rotor-clearance loss calculations. The loss model was then calibrated based on experimental turbine performance. A complete description of code input and output along with sample cases are included in the report.
Performance and Feasibility Analysis of a Wind Turbine Power System for Use on Mars
NASA Technical Reports Server (NTRS)
Lichter, Matthew D.; Viterna, Larry
1999-01-01
A wind turbine power system for future missions to the Martian surface was studied for performance and feasibility. A C++ program was developed from existing FORTRAN code to analyze the power capabilities of wind turbines under different environments and design philosophies. Power output, efficiency, torque, thrust, and other performance criteria could be computed given design geometries, atmospheric conditions, and airfoil behavior. After reviewing performance of such a wind turbine, a conceptual system design was modeled to evaluate feasibility. More analysis code was developed to study and optimize the overall structural design. Findings of this preliminary study show that turbine power output on Mars could be as high as several hundred kilowatts. The optimized conceptual design examined here would have a power output of 104 kW, total mass of 1910 kg, and specific power of 54.6 W/kg.
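For context on the power figures above, rotor output follows the standard relation P = ½ρC<sub>p</sub>Av³. The sketch below is a generic illustration; the power coefficient, rotor size, wind speed, and the rough Martian surface density are assumptions for illustration, not the study's design values:

```python
import math

def rotor_power_kw(rho, diameter_m, wind_speed_ms, cp=0.4):
    """Rotor power P = 0.5 * rho * Cp * A * v**3, in kW.
    Cp = 0.4 is an assumed power coefficient (below the Betz limit, ~0.593)."""
    area = math.pi * (diameter_m / 2.0) ** 2  # swept disk area, m^2
    return 0.5 * rho * cp * area * wind_speed_ms ** 3 / 1000.0
```

Because the Martian surface air density (roughly 0.02 kg/m³, an order-of-magnitude figure, versus 1.225 kg/m³ at sea level on Earth) is so low, comparable output requires much larger rotors or much higher wind speeds, which is consistent with the study's large mass budget per kilowatt.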
The Deceptive Mean: Conceptual Scoring of Cloze Entries Differentially Advantages More Able Readers
ERIC Educational Resources Information Center
O'Toole, J. M.; King, R. A. R.
2011-01-01
The "cloze" test is one possible investigative instrument for predicting text comprehensibility. Conceptual coding of student replacement of deleted words has been considered to be more valid than exact coding, partly because conceptual coding seemed fairer to poorer readers. This paper reports a quantitative study of 447 Australian…
NASA Astrophysics Data System (ADS)
Roshanian, Jafar; Jodei, Jahangir; Mirshams, Mehran; Ebrahimi, Reza; Mirzaee, Masood
A new automated multi-level of fidelity Multi-Disciplinary Design Optimization (MDO) methodology has been developed at the MDO Laboratory of K.N. Toosi University of Technology. This paper explains a new design approach by formulation of developed disciplinary modules. A conceptual design for a small, solid-propellant launch vehicle was considered at two levels of fidelity structure. Low and medium level of fidelity disciplinary codes were developed and linked. Appropriate design and analysis codes were defined according to their effect on the conceptual design process. Simultaneous optimization of the launch vehicle was performed at the discipline level and system level. Propulsion, aerodynamics, structure and trajectory disciplinary codes were used. To reach the minimum launch weight, the Low LoF code first searches the whole design space to achieve the mission requirements. Then the medium LoF code receives the output of the low LoF and gives a value near the optimum launch weight with more details and higher fidelity.
Decoding the neural representation of fine-grained conceptual categories.
Ghio, Marta; Vaghi, Matilde Maria Serena; Perani, Daniela; Tettamanti, Marco
2016-05-15
Neuroscientific research on conceptual knowledge based on the grounded cognition framework has shed light on the organization of concrete concepts into semantic categories that rely on different types of experiential information. Abstract concepts have traditionally been investigated as an undifferentiated whole, and have only recently been addressed in a grounded cognition perspective. The present fMRI study investigated the involvement of brain systems coding for experiential information in the conceptual processing of fine-grained semantic categories along the abstract-concrete continuum. These categories consisted of mental state-, emotion-, mathematics-, mouth action-, hand action-, and leg action-related meanings. Thirty-five sentences for each category were used as stimuli in a 1-back task performed by 36 healthy participants. A univariate analysis failed to reveal category-specific activations. Multivariate pattern analyses, in turn, revealed that fMRI data contained sufficient information to disentangle all six fine-grained semantic categories across participants. However, the category-specific activity patterns showed no overlap with the regions coding for experiential information. These findings demonstrate the possibility of detecting specific patterns of neural representation associated with the processing of fine-grained conceptual categories, crucially including abstract ones, though bearing no anatomical correspondence with regions coding for experiential information as predicted by the grounded cognition hypothesis. Copyright © 2016 Elsevier Inc. All rights reserved.
A survey of compiler optimization techniques
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1972-01-01
Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at the source-code level is also presented.
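Constant folding is a classic instance of the architecture-independent, source-level optimization the survey describes. A minimal sketch (not from the paper) using Python's own AST as a stand-in source language:

```python
import ast

def fold_constants(expr: str) -> str:
    """Constant-fold a Python arithmetic expression at the source level."""
    tree = ast.parse(expr, mode="eval")

    class Folder(ast.NodeTransformer):
        def visit_BinOp(self, node):
            self.generic_visit(node)  # fold children first (bottom-up)
            if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
                # Evaluate the all-constant subexpression at "compile time".
                value = eval(compile(ast.Expression(node), "<fold>", "eval"))
                return ast.copy_location(ast.Constant(value), node)
            return node

    return ast.unparse(Folder().visit(tree))
```

A real compiler would do this on its own IR rather than via `eval`; the point here is only that the transformation needs no knowledge of the target machine or its instruction set.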
ERIC Educational Resources Information Center
Cardenas-Claros, Monica Stella; Gruba, Paul A.
2013-01-01
This paper proposes a theoretical framework for the conceptualization and design of help options in computer-based second language (L2) listening. Based on four empirical studies, it aims at clarifying both conceptualization and design (CoDe) components. The elements of conceptualization consist of a novel four-part classification of help options:…
1984-06-29
effort that requires hard copy documentation. As a result, there are generally numerous delays in providing current quality information. In the FoF...process have had fixed controls or were based on "hard-coded" information. A template, for example, is hard-coded information defining the shape of a...represents soft-coded control information. (Although manual handling of punch tapes still possesses some of the limitations of "hard-coded" controls
Conceptual Design and Performance Analysis for a Large Civil Compound Helicopter
NASA Technical Reports Server (NTRS)
Russell, Carl; Johnson, Wayne
2012-01-01
A conceptual design study of a large civil compound helicopter is presented. The objective is to determine how a compound helicopter performs when compared to both a conventional helicopter and a tiltrotor using a design mission that is shorter than optimal for a tiltrotor and longer than optimal for a helicopter. The designs are generated and analyzed using conceptual design software and are further evaluated with a comprehensive rotorcraft analysis code. Multiple metrics are used to determine the suitability of each design for the given mission. Plots of various trade studies and parameter sweeps as well as comprehensive analysis results are presented. The results suggest that the compound helicopter examined for this study would not be competitive with a tiltrotor or conventional helicopter, but multiple possibilities are identified for improving the performance of the compound helicopter in future research.
Mean Line Pump Flow Model in Rocket Engine System Simulation
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Lavelle, Thomas M.
2000-01-01
A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
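One-dimensional meanline pump analysis of the kind described above is conventionally built around Euler's turbomachinery equation, H = (U₂C<sub>u2</sub> − U₁C<sub>u1</sub>)/g. The sketch below illustrates that relation with an assumed slip factor and hydraulic efficiency; it is a generic textbook formulation, not PUMPA's actual model:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def euler_head(u2, cu2, u1=0.0, cu1=0.0):
    """Ideal (Euler) head rise H = (U2*Cu2 - U1*Cu1) / g across a rotor, in m."""
    return (u2 * cu2 - u1 * cu1) / G

def impeller_head(u2, cm2, beta2_deg, slip=0.9, eta_hyd=0.85):
    """Meanline estimate of delivered head: exit swirl Cu2 from the blade exit
    angle, reduced by an assumed slip factor, then scaled by an assumed
    hydraulic efficiency (both values illustrative)."""
    cu2 = slip * (u2 - cm2 / math.tan(math.radians(beta2_deg)))
    return eta_hyd * euler_head(u2, cu2)
```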
Development of an Object-Oriented Turbomachinery Analysis Code within the NPSS Framework
NASA Technical Reports Server (NTRS)
Jones, Scott M.
2014-01-01
During the preliminary or conceptual design phase of an aircraft engine, the turbomachinery designer has a need to estimate the effects of a large number of design parameters such as flow size, stage count, blade count, and radial position on the weight and efficiency of a turbomachine. Computer codes are invariably used to perform this task; however, such codes are often very old, written in outdated languages with arcane input files, and rarely adaptable to new architectures or unconventional layouts. Given the need to perform these kinds of preliminary design trades, a modern 2-D turbomachinery design and analysis code, the Object-Oriented Turbomachinery Analysis Code (OTAC), has been written using the Numerical Propulsion System Simulation (NPSS) framework. This paper discusses the development of the governing equations and the structure of the primary objects used in OTAC.
ETF system code: composition and applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reid, R.L.; Wu, K.F.
1980-01-01
A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yidong Xia; Mitch Plummer; Robert Podgorney
2016-02-01
Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km of depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that (1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, (2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and (3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite-element-based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code demonstrates a development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
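The headline quantities above follow from simple energy balances: reservoir temperature from the stated 65 K/km gradient, and extracted thermal power from Q = ṁc<sub>p</sub>ΔT. A generic sketch; the surface temperature, specific heat, and conversion efficiency values are assumptions for illustration, not the study's parameters:

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg*K), approximate

def reservoir_temp_c(depth_km, surface_c=15.0, grad_k_per_km=65.0):
    """Undisturbed reservoir temperature using the abstract's 65 K/km gradient."""
    return surface_c + grad_k_per_km * depth_km

def thermal_power_mw(m_dot_kg_s, t_prod_c, t_inj_c):
    """Thermal power carried by the circulating water, Q = m_dot * cp * dT, in MW."""
    return m_dot_kg_s * CP_WATER * (t_prod_c - t_inj_c) / 1e6

def electric_power_mw(m_dot_kg_s, t_prod_c, t_inj_c, eta=0.10):
    """Electric output for an assumed (illustrative) conversion efficiency."""
    return eta * thermal_power_mw(m_dot_kg_s, t_prod_c, t_inj_c)
```

The close coupling of production rate and lifespan to mass flow rate noted in point (3) is visible here: Q scales linearly with ṁ, but higher ṁ also draws down the reservoir temperature faster.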
Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY
2018-01-01
A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853
Equivalent plate modeling for conceptual design of aircraft wing structures
NASA Technical Reports Server (NTRS)
Giles, Gary L.
1995-01-01
This paper describes an analysis method that generates conceptual-level design data for aircraft wing structures. A key requirement is that this data must be produced in a timely manner so that it can be used effectively by multidisciplinary synthesis codes for performing systems studies. Such a capability is being developed by enhancing an equivalent plate structural analysis computer code to provide a more comprehensive, robust, and user-friendly analysis tool. The paper focuses on recent enhancements to the Equivalent Laminated Plate Solution (ELAPS) analysis code that significantly expand the modeling capability and improve the accuracy of results. Modeling additions include use of out-of-plane plate segments for representing winglets and advanced wing concepts such as C-wings, along with a new capability for modeling the internal rib and spar structure. The accuracy of calculated results is improved by including transverse shear effects in the formulation and by using multiple sets of assumed displacement functions in the analysis. Typical results are presented to demonstrate these new features. Example configurations include a C-wing transport aircraft, a representative fighter wing, and a blended-wing-body transport. These applications are intended to demonstrate and quantify the benefits of using equivalent plate modeling of wing structures during conceptual design.
NASA Technical Reports Server (NTRS)
Lam, David W.
1995-01-01
The transonic performance of a dual-throat, single-expansion-ramp nozzle (SERN) was investigated with a PARC computational fluid dynamics (CFD) code, an external flow Navier-Stokes solver. The nozzle configuration was from a conceptual Mach 5 cruise aircraft powered by four air-breathing turboramjets. Initial test cases used the two-dimensional version of PARC in Euler mode to investigate the effect of geometric variation on transonic performance. Additional cases used the two-dimensional version in viscous mode and the three-dimensional version in both Euler and viscous modes. Results of the analysis indicate low nozzle performance and a highly three-dimensional nozzle flow at transonic conditions. In another comparative study using the PARC code, a single-throat SERN configuration for which experimental data were available at transonic conditions was used to validate the results of the over/under turboramjet nozzle.
NASA Astrophysics Data System (ADS)
Mbewe, Simeon
The purpose of this study was threefold: examine middle school teachers' familiarity with, interest in, conceptual knowledge of, and performance on light; examine their ability to identify misconceptions on light and their suggested pedagogical ideas to address the identified misconceptions; and establish the relationship between the middle school teachers' interest, familiarity, conceptual understanding, performance, misconception identification, and pedagogical ideas for light. Sixty-six (66) middle school science teachers enrolled in three math and science teacher professional development projects at Southern Illinois University Carbondale participated in this study. This study used a mixed-methods approach to collect and analyze data. The participants responded in writing to four different instruments: a Familiarity and Interest Questionnaire, a Conceptual Knowledge Test, a Two-tier Performance Test, and a Misconceptions Identification Questionnaire. Data were analyzed quantitatively by conducting non-parametric (Wilcoxon, Mann-Whitney U, and Kruskal-Wallis) and parametric (paired-samples, independent-samples, and one-way ANOVA) tests. Qualitative data were analyzed using thematic analysis and open coding to identify emerging themes and categories. The results showed that the teachers reported high levels of familiarity with and interest in learning more about light concepts. However, they had low conceptual knowledge of and performance on light concepts. As such, middle school teachers' perceived knowledge of light concepts was not consistent with their actual knowledge of light. To some extent, the teachers identified students' misconceptions expressed in some scenarios on light and also suggested pedagogical ideas for addressing such misconceptions in middle school science classrooms. However, most teachers did not provide details on their pedagogical ideas for light.
Correlations among the four constructs (familiarity, interest, conceptual understanding, and performance) were significant only between performance and conceptual understanding, r(64) = .50, p < .001. There was no significant relationship between conceptual understanding and familiarity, or between performance and familiarity. In view of these findings, it is evident that some teachers did not have the sound conceptual understanding and pedagogical ideas needed to effectively help their students develop the understanding of light concepts accentuated in the US national science education standards. These findings have implications for teacher education and for science teaching and learning.
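The reported r(64) = .50 is a Pearson product-moment correlation. A minimal sketch of the statistic on hypothetical paired scores (not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between paired scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

The degrees of freedom in r(64) are n − 2, so the statistic was computed over all 66 participants.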
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
...) disposal facilities. The workshop has been developed to facilitate communication among Federal and State... and conceptual models, and (3) the selection of computer codes. Information gathered from invited.... NRC Public Meeting The purpose of this public meeting is to facilitate communication and gather...
High density arrays of micromirrors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Folta, J. M.; Decker, J. Y.; Kolman, J.
We established and achieved our goals to (1) fabricate and evaluate test structures based on the micromirror design optimized for maskless lithography applications, (2) perform system analysis and code development for the maskless lithography concept, and (3) identify specifications for micromirror arrays (MMAs) for LLNL's adaptive optics (AO) applications and conceptualize new devices.
Users manual for updated computer code for axial-flow compressor conceptual design
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
An existing computer code that determines the flow path for an axial-flow compressor either for a given number of stages or for a given overall pressure ratio was modified for use in air-breathing engine conceptual design studies. This code uses a rapid approximate design methodology that is based on isentropic simple radial equilibrium. Calculations are performed at constant-span-fraction locations from tip to hub. Energy addition per stage is controlled by specifying the maximum allowable values for several aerodynamic design parameters. New modeling was introduced to the code to overcome perceived limitations. Specific changes included variable rather than constant tip radius, flow path inclination added to the continuity equation, input of mass flow rate directly rather than indirectly as inlet axial velocity, solution for the exact value of overall pressure ratio rather than for any value that met or exceeded it, and internal computation of efficiency rather than the use of input values. The modified code was shown to be capable of computing efficiencies that are compatible with those of five multistage compressors and one fan that were tested experimentally. This report serves as a users manual for the revised code, Compressor Spanline Analysis (CSPAN). The modeling modifications, including two internal loss correlations, are presented. Program input and output are described. A sample case for a multistage compressor is included.
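Energy addition per stage maps to stage pressure ratio through the isentropic relation PR = (1 + ηΔT₀/T₀)^(γ/(γ−1)). The sketch below illustrates how per-stage ratios compound into an overall pressure ratio; the stage efficiency and temperature-rise values are generic assumptions, not CSPAN's internal correlations:

```python
GAMMA = 1.4  # ratio of specific heats for air

def stage_pressure_ratio(t0_in_k, dt0_k, eta_stage=0.88):
    """Stage total-pressure ratio from the stage total-temperature rise,
    PR = (1 + eta * dT0/T0) ** (gamma / (gamma - 1))."""
    return (1.0 + eta_stage * dt0_k / t0_in_k) ** (GAMMA / (GAMMA - 1.0))

def overall_pressure_ratio(t0_in_k, dt0_k, n_stages, eta_stage=0.88):
    """Compound the per-stage ratios, updating the inlet temperature each stage."""
    pr, t0 = 1.0, t0_in_k
    for _ in range(n_stages):
        pr *= stage_pressure_ratio(t0, dt0_k, eta_stage)
        t0 += dt0_k
    return pr
```

Because each stage inherits a hotter inlet, equal temperature rises yield progressively smaller stage pressure ratios, which is why solving for an exact overall pressure ratio (one of the code modifications above) requires iteration rather than a fixed per-stage value.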
User's manual for COAST 4: a code for costing and sizing tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sink, D. A.; Iwinski, E. M.
1979-09-01
The purpose of this report is to document the computer program COAST 4 for the user/analyst. COAST, COst And Size Tokamak reactors, provides complete and self-consistent size models for the engineering features of D-T burning tokamak reactors and associated facilities involving a continuum of performance including highly beam driven through ignited plasma devices. TNS (The Next Step) devices with no tritium breeding or electrical power production are handled as well as power producing and fissile producing fusion-fission hybrid reactors. The code has been normalized with a TFTR calculation which is consistent with cost, size, and performance data published in the conceptual design report for that device. Information on code development, computer implementation, and detailed user instructions are included in the text.
Centrifugal and Axial Pump Design and Off-Design Performance Prediction
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
1995-01-01
A meanline pump-flow modeling method has been developed to provide a fast capability for modeling pumps of cryogenic rocket engines. Based on this method, a meanline pump-flow code PUMPA was written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The design-point rotor efficiency and slip factors are obtained from empirical correlations to rotor-specific speed and geometry. The pump code can model axial, inducer, mixed-flow, and centrifugal pumps and can model multistage pumps in series. The rapid input setup and computer run time for this meanline pump flow code make it an effective analysis and conceptual design tool. The map-generation capabilities of the code provide the information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of PUMPA permit the user to do parametric design space exploration of candidate pump configurations and to provide head-flow maps for engine system evaluation.
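Off-design map generation for pumps commonly leans on similarity (affinity) scaling for a fixed impeller: Q ∝ N, H ∝ N², P ∝ N³. A generic sketch of that scaling, not PUMPA's internal method:

```python
def scale_pump_point(q, h, p, speed_ratio):
    """Pump affinity laws for a fixed impeller: flow scales with speed ratio,
    head with its square, and shaft power with its cube."""
    return q * speed_ratio, h * speed_ratio ** 2, p * speed_ratio ** 3
```

Sweeping `speed_ratio` over an operating range converts a single design point into the kind of head-flow map that a system simulation such as NPSS consumes.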
Flowers, Natalie L
2010-01-01
CodeSlinger is a desktop application that was developed to aid medical professionals in the intertranslation, exploration, and use of biomedical coding schemes. The application was designed to provide a highly intuitive, easy-to-use interface that simplifies a complex business problem: a set of time-consuming, laborious tasks that were regularly performed by a group of medical professionals involving manually searching coding books, searching the Internet, and checking documentation references. A workplace observation session with a target user revealed the details of the current process and a clear understanding of the business goals of the target user group. These goals drove the design of the application's interface, which centers on searches for medical conditions and displays the codes found in the application's database that represent those conditions. The interface also allows the exploration of complex conceptual relationships across multiple coding schemes.
Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design
NASA Astrophysics Data System (ADS)
Iqbal, Liaquat Ullah
An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low- to medium-fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift-over-drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact on the performance of the aircraft more accurately. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and hybrid wing body (HWB) concepts. Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for performance gains in aerodynamics and help ascertain the risks of structural failure due to the different pressure distribution in the fuselage as compared with a tube-and-wing design. The higher fidelity aerodynamics and structural models can also lead to better cost estimates that help reduce financial risks.
This helps in achieving better designs with reduced risk in less time and at lower cost. The approach is shown to eliminate the traditional boundary between the conceptual and preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples of the validation and utilization of the Multidisciplinary Design and Optimization (MDO) tool are presented using missions for Medium and High Altitude Long Range/Endurance Unmanned Aerial Vehicles (UAVs).
NASA Astrophysics Data System (ADS)
Knowlton, R. G.; Arnold, B. W.; Mattie, P. D.; Kuo, M.; Tien, N.
2006-12-01
For several years now, Taiwan has been engaged in a process to select a low-level radioactive waste (LLW) disposal site. Taiwan generates LLW from operational and decommissioning wastes associated with nuclear power reactors, as well as from research, industrial, and medical radioactive wastes. The preliminary selection process has narrowed the search to four potential candidate sites. These sites are to be evaluated in a performance assessment analysis to determine the likelihood of meeting the regulatory criteria for disposal. Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research have been working together to develop the necessary performance assessment methodology and associated computer models to perform these analyses. The methodology utilizes both deterministic (i.e., single-run) and probabilistic (i.e., multiple statistical realizations) analyses to achieve these goals. The probabilistic approach provides a means of quantitatively evaluating uncertainty in the model predictions and a more robust basis for performing sensitivity analyses to better understand what is driving the dose predictions from the models. Two types of disposal configurations are under consideration: a shallow land burial concept and a cavern disposal concept. The shallow land burial option includes a protective cover to limit infiltration into the waste. Both conceptual designs call for the disposal of 55-gallon waste drums within concrete-lined trenches or tunnels, backfilled with grout. Waste emplaced in the drums may be solidified. Both types of sites are underlain by, or situated within, saturated fractured bedrock. These factors have influenced the conceptual model development for each site, as well as the selection of the models to employ for the performance assessment analyses. Several existing codes were integrated in order to facilitate a comprehensive performance assessment methodology to evaluate the potential disposal sites.
First, a need existed to simulate the failure processes of the waste containers, with subsequent leaching of the waste form to the underlying host rock. The Breach, Leach, and Transport Multiple Species (BLT-MS) code was selected to meet these needs. BLT-MS also has a 2-D finite-element advective-dispersive transport module, with radionuclide in-growth and decay. BLT-MS does not solve the groundwater flow equation, but instead requires the input of Darcy flow velocity terms. These terms were abstracted from a groundwater flow model using the FEHM code. For the shallow land burial site, the HELP code was also used to evaluate the performance of the protective cover. The GoldSim code was used for two purposes: quantifying uncertainties in the predictions, and providing a platform to evaluate an alternative conceptual model involving matrix-diffusion transport. Results of the preliminary performance assessment analyses, with examples illustrating the computational framework, will be presented. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
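The deterministic versus probabilistic split described above can be illustrated generically. The sketch below uses a toy dose model and made-up input distributions as stand-ins; it is not the BLT-MS/GoldSim workflow, only the pattern of single-run versus multi-realization analysis with a simple sensitivity screen:

```python
import numpy as np

rng = np.random.default_rng(42)

def dose(source, velocity, retardation):
    """Toy dose model: release scaled by Darcy velocity, slowed by retardation."""
    return source * velocity / retardation

# Deterministic analysis: one run at nominal parameter values.
nominal = dose(1.0, 0.05, 10.0)

# Probabilistic analysis: many statistical realizations of uncertain inputs.
n = 5000
src = rng.lognormal(mean=0.0, sigma=0.5, size=n)
vel = rng.uniform(0.01, 0.10, size=n)
ret = rng.uniform(5.0, 20.0, size=n)
doses = dose(src, vel, ret)
p95 = np.quantile(doses, 0.95)  # uncertainty band to compare with a limit

# Crude sensitivity screen: correlation of each input with the dose,
# indicating which uncertainty drives the predictions.
sens = {name: float(np.corrcoef(x, doses)[0, 1])
        for name, x in (("source", src), ("velocity", vel), ("retardation", ret))}
```

In a real assessment the correlation screen would typically be replaced by rank-based or variance-decomposition sensitivity measures, but the structure is the same.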
Oak Ridge Spallation Neutron Source (ORSNS) target station design integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamy, T.; Booth, R.; Cleaves, J.
1996-06-01
The conceptual design for a 1- to 3-MW short-pulse spallation source with a liquid mercury target has been started recently. The design tools and methods being developed to define requirements, integrate the work, and provide early cost guidance will be presented with a summary of the current target station design status. The initial design point was selected with performance and cost estimate projections by a systems code. This code was developed recently using cost estimates from the Brookhaven Pulsed Spallation Neutron Source study and experience from the Advanced Neutron Source Project's conceptual design. It will be updated and improved as the design develops. Performance was characterized by a simplified figure of merit based on a ratio of neutron production to costs. A work breakdown structure was developed, with simplified systems diagrams used to define interfaces and system responsibilities. A risk assessment method was used to identify potential problems, to identify required research and development (R&D), and to aid contingency development. Preliminary 3-D models of the target station are being used to develop remote maintenance concepts and to estimate costs.
Parametric Model of an Aerospike Rocket Engine
NASA Technical Reports Server (NTRS)
Korte, J. J.
2000-01-01
A suite of computer codes was assembled to simulate the performance of an aerospike engine and to generate the engine input for the Program to Optimize Simulated Trajectories. First an engine simulator module was developed that predicts the aerospike engine performance for a given mixture ratio, power level, thrust vectoring level, and altitude. This module was then used to rapidly generate the aerospike engine performance tables for axial thrust, normal thrust, pitching moment, and specific thrust. Parametric engine geometry was defined for use with the engine simulator module. The parametric model was also integrated into the iSIGHT multidisciplinary framework so that alternate designs could be determined. The computer codes were used to support in-house conceptual studies of reusable launch vehicle designs.
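Performance tables like those generated for the trajectory code are typically queried by interpolation at each flight condition. A generic bilinear-lookup sketch follows; the table values are made up for illustration, not the actual aerospike data:

```python
import numpy as np

# Hypothetical axial-thrust table (values invented for illustration):
# rows indexed by altitude (km), columns by engine power level.
alts   = np.array([0.0, 10.0, 20.0])
powers = np.array([0.50, 0.75, 1.00])
thrust = np.array([[ 800.0, 1200.0, 1600.0],
                   [ 760.0, 1150.0, 1540.0],
                   [ 700.0, 1080.0, 1460.0]])

def bilinear(alt, pwr):
    """Bilinear table lookup, the usual way a trajectory code samples such tables."""
    i = int(np.clip(np.searchsorted(alts, alt) - 1, 0, len(alts) - 2))
    j = int(np.clip(np.searchsorted(powers, pwr) - 1, 0, len(powers) - 2))
    u = (alt - alts[i]) / (alts[i + 1] - alts[i])
    v = (pwr - powers[j]) / (powers[j + 1] - powers[j])
    return ((1 - u) * (1 - v) * thrust[i, j] + u * (1 - v) * thrust[i + 1, j]
            + (1 - u) * v * thrust[i, j + 1] + u * v * thrust[i + 1, j + 1])
```

Tables for normal thrust, pitching moment, and the other quantities would be sampled the same way, one lookup per quantity per trajectory step.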
Probabilistic margin evaluation on accidental transients for the ASTRID reactor project
NASA Astrophysics Data System (ADS)
Marquès, Michel
2014-06-01
ASTRID is a technological demonstrator of the Sodium-cooled Fast Reactor (SFR) under development. The conceptual design studies are being conducted in accordance with the Generation IV reactor objectives, particularly in terms of improving safety. For the hypothetical events, belonging to the accidental category "severe accident prevention situations" and having a very low frequency of occurrence, the safety demonstration is no longer based on a deterministic demonstration with conservative assumptions on models and parameters but on a "Best-Estimate Plus Uncertainty" (BEPU) approach. This BEPU approach is presented in this paper for an Unprotected Loss-of-Flow (ULOF) event. The Best-Estimate (BE) analysis of this ULOF transient is performed with the CATHARE2 code, which is the French reference system code for SFR applications. The objective of the BEPU analysis is twofold: first, evaluate the safety margin to sodium boiling taking into account the uncertainties on the input parameters of the CATHARE2 code (twenty-two uncertain input parameters have been identified, which can be classified into five groups: reactor power, accident management, pump characteristics, reactivity coefficients, and thermal parameters and head losses); secondly, quantify the contribution of each input uncertainty to the overall uncertainty of the safety margins, in order to refocus R&D efforts on the most influential factors. This paper focuses on the methodological aspects of the evaluation of the safety margin. At least for the preliminary phase of the project (conceptual design), a probabilistic criterion has been fixed in the context of this BEPU analysis; this criterion is the value of the margin to sodium boiling which has a 95% probability of being exceeded, obtained with a confidence level of 95% (i.e. the M5,95 percentile of the margin distribution).
This paper presents two methods used to assess this percentile: the Wilks method and the Bootstrap method; the effectiveness of the two methods is compared on the basis of 500 simulations performed with the CATHARE2 code. We conclude that, with only 100 simulations performed with the CATHARE2 code, which is a workable number of simulations in the conceptual design phase of the ASTRID project where the models and the hypotheses are often modified, it is best to evaluate the percentile M5,95 of the margin to sodium boiling with the bootstrap method, which provides a slightly conservative result. On the other hand, in order to obtain an accurate estimation of the percentile M5,95, for the safety report for example, it will be necessary to perform at least 300 simulations with the CATHARE2 code. In this case, both methods (Wilks and Bootstrap) would give equivalent results.
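The two percentile estimators named above can be sketched generically. The margin samples below are synthetic stand-ins, not CATHARE2 output; only the Wilks order-statistic rule and the bootstrap-of-the-quantile construction are as described:

```python
import numpy as np
from math import comb

def wilks_rank(n, p=0.05, conf=0.95):
    """Largest k such that the k-th smallest of n samples is a lower
    confidence bound on the p-quantile at level conf (one-sided Wilks):
    requires P(Binom(n, p) >= k) >= conf. Needs n >= 59 for p=0.05, conf=0.95."""
    def tail(k):
        return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))
    k = 1
    while k < n and tail(k + 1) >= conf:
        k += 1
    return k

rng = np.random.default_rng(0)
# Hypothetical margin-to-boiling samples, a stand-in for 100 CATHARE2 runs.
margins = rng.normal(loc=50.0, scale=10.0, size=100)

# Wilks estimate of M5,95: the k-th smallest sample.
k = wilks_rank(margins.size)
wilks_m5_95 = np.sort(margins)[k - 1]

# Bootstrap estimate: 5% quantile of the bootstrap distribution of the
# empirical 5% quantile.
boot = np.array([
    np.quantile(rng.choice(margins, size=margins.size, replace=True), 0.05)
    for _ in range(2000)
])
bootstrap_m5_95 = np.quantile(boot, 0.05)
```

For n = 59 the Wilks rule reduces to the classic result that the sample minimum is the 95/95 bound; at n = 100 the second-smallest sample suffices, which is why the estimate sharpens as more code runs become affordable.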
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.; Lavelle, Thomas M.
1995-01-01
Modifications made to the axial-flow compressor conceptual design code CSPAN are documented in this report. Endwall blockage and stall margin predictions were added. The loss-coefficient model was upgraded. Default correlations for rotor and stator solidity and aspect-ratio inputs and for stator-exit tangential velocity inputs were included in the code along with defaults for aerodynamic design limits. A complete description of input and output along with sample cases are included.
Evaluating the performance of parallel subsurface simulators: An illustrative example with PFLOTRAN
Hammond, G E; Lichtner, P C; Mills, R T
2014-01-01
To better inform the subsurface scientist on the expected performance of parallel simulators, this work investigates performance of the reactive multiphase flow and multicomponent biogeochemical transport code PFLOTRAN as it is applied to several realistic modeling scenarios run on the Jaguar supercomputer. After a brief introduction to the code's parallel layout and code design, PFLOTRAN's parallel performance (measured through strong and weak scalability analyses) is evaluated in the context of conceptual model layout, software and algorithmic design, and known hardware limitations. PFLOTRAN scales well (with regard to strong scaling) for three realistic problem scenarios: (1) in situ leaching of copper from a mineral ore deposit within a 5-spot flow regime, (2) transient flow and solute transport within a regional doublet, and (3) a real-world problem involving uranium surface complexation within a heterogeneous and extremely dynamic variably saturated flow field. Weak scalability is discussed in detail for the regional doublet problem, and several difficulties with its interpretation are noted. PMID:25506097
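The strong- and weak-scaling measures referred to above reduce to simple runtime ratios. A minimal sketch with hypothetical timings (not PFLOTRAN measurements):

```python
def strong_scaling_efficiency(t_base, p_base, t_p, p):
    """Fixed total problem size: observed speedup over the base core
    count, divided by the ideal speedup p / p_base."""
    return (t_base / t_p) / (p / p_base)

def weak_scaling_efficiency(t_base, t_p):
    """Fixed work per core: ideal runtime is constant as cores grow,
    so efficiency is simply t_base / t_p."""
    return t_base / t_p

# Hypothetical: the same fixed-size problem takes 1000 s on 64 cores
# and 150 s on 512 cores.
eff = strong_scaling_efficiency(1000.0, 64, 150.0, 512)  # ~0.83
```

One interpretation difficulty the paper alludes to is visible even here: weak-scaling "work" must be defined carefully (grid cells, solver iterations, or chemistry evaluations per core), since different choices give different efficiencies for the same runs.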
Tactile communication, cooperation, and performance: an ethological study of the NBA.
Kraus, Michael W; Huang, Cassey; Keltner, Dacher
2010-10-01
Tactile communication, or physical touch, promotes cooperation between people, communicates distinct emotions, soothes in times of stress, and is used to make inferences of warmth and trust. Based on this conceptual analysis, we predicted that in group competition, physical touch would predict increases in both individual and group performance. In an ethological study, we coded the touch behavior of players from the National Basketball Association (NBA) during the 2008-2009 regular season. Consistent with hypotheses, early season touch predicted greater performance for individuals as well as teams later in the season. Additional analyses confirmed that touch predicted improved performance even after accounting for player status, preseason expectations, and early season performance. Moreover, coded cooperative behaviors between teammates explained the association between touch and team performance. Discussion focused on the contributions touch makes to cooperative groups and the potential implications for other group settings.
L1 and L2 Picture Naming in Mandarin-English Bilinguals: A Test of Bilingual Dual Coding Theory
ERIC Educational Resources Information Center
Jared, Debra; Poh, Rebecca Pei Yun; Paivio, Allan
2013-01-01
This study examined the nature of bilinguals' conceptual representations and the links from these representations to words in L1 and L2. Specifically, we tested an assumption of the Bilingual Dual Coding Theory that conceptual representations include image representations, and that learning two languages in separate contexts can result in…
ACSYNT - A standards-based system for parametric, computer aided conceptual design of aircraft
NASA Technical Reports Server (NTRS)
Jayaram, S.; Myklebust, A.; Gelhausen, P.
1992-01-01
A group of eight US aerospace companies, together with several NASA and Navy centers (led by the NASA Ames Systems Analysis Branch) and Virginia Tech's CAD Laboratory, agreed in 1990, through the assistance of the American Technology Initiative, to form the ACSYNT (Aircraft Synthesis) Institute. The Institute is supported by a Joint Sponsored Research Agreement to continue the research and development in computer aided conceptual design of aircraft initiated by NASA Ames Research Center and Virginia Tech's CAD Laboratory. The result of this collaboration, a feature-based, parametric computer aided aircraft conceptual design code called ACSYNT, is described. The code is based on analysis routines begun at NASA Ames in the early 1970's. ACSYNT's CAD system is based entirely on the ISO standard Programmer's Hierarchical Interactive Graphics System and is graphics-device independent. The code includes a highly interactive graphical user interface, automatically generated Hermite and B-Spline surface models, and shaded image displays. Numerous features to enhance aircraft conceptual design are described.
Propulsion System Models for Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2014-01-01
The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.
Assessment of Alternative Conceptual Models Using Reactive Transport Modeling with Monitoring Data
NASA Astrophysics Data System (ADS)
Dai, Z.; Price, V.; Heffner, D.; Hodges, R.; Temples, T.; Nicholson, T.
2005-12-01
Monitoring data proved very useful in evaluating alternative conceptual models, simulating contaminant transport behavior, and reducing uncertainty. A graded approach using three alternative conceptual site models was formulated to simulate a field case of tetrachloroethene (PCE) transport and biodegradation. These models ranged from simple to complex in their representation of subsurface heterogeneities. The simplest model was a single-layer homogeneous aquifer that employed an analytical reactive transport code, BIOCHLOR (Aziz et al., 1999). Due to over-simplification of the aquifer structure, this simulation could not reproduce the monitoring data. The second model consisted of a multi-layer conceptual model, in combination with numerical modules, MODFLOW and RT3D within GMS, to simulate flow and reactive transport. Although the simulation results from the second model were comparatively better than those from the simple model, they still did not adequately reproduce the monitoring well concentrations because the geological structures were still inadequately defined. Finally, a more realistic conceptual model was formulated that incorporated heterogeneities and geologic structures identified from well logs and seismic survey data using the Petra and PetraSeis software. This conceptual model included both a major channel and a younger channel that were detected in the PCE source area. In this model, these channels control the local ground-water flow direction and provide a preferential chemical transport pathway. Simulation results using this conceptual site model proved compatible with the monitoring concentration data. This study demonstrates that the bias and uncertainty from inadequate conceptual models are much larger than those introduced from an inadequate choice of model parameter values (Neuman and Wierenga, 2003; Meyer et al., 2004; Ye et al., 2004). 
This case study integrated conceptual and numerical models, based on interpreted local hydrogeologic and geochemical data, with detailed monitoring plume data. It provided key insights for confirming alternative conceptual site models and assessing the performance of monitoring networks. A monitoring strategy based on this graded approach for assessing alternative conceptual models can provide the technical bases for identifying critical monitoring locations, adequate monitoring frequency, and performance indicator parameters for performance monitoring involving ground-water levels and PCE concentrations.
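The graded comparison of alternative conceptual models against monitoring data comes down to a goodness-of-fit ranking. A minimal sketch with hypothetical well concentrations (not the actual site data or the BIOCHLOR/RT3D outputs):

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between model predictions and monitoring data."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

# Hypothetical PCE concentrations (mg/L) at four monitoring wells.
observed      = [0.8, 2.1, 5.0, 1.2]
model_simple  = [2.0, 2.0, 2.0, 2.0]  # single-layer homogeneous aquifer
model_layered = [1.0, 1.8, 4.2, 1.5]  # multi-layer model with channels

scores = {"simple": rmse(model_simple, observed),
          "layered": rmse(model_layered, observed)}
best = min(scores, key=scores.get)
```

In practice the ranking would also weigh model complexity and parameter uncertainty (as in the cited Neuman and Wierenga framework), not fit alone, but the fit-to-monitoring-data comparison is the core of the graded approach.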
Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN
NASA Technical Reports Server (NTRS)
Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.
1996-01-01
A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The updated code includes new methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, description of the code, user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.
Concept For Generation Of Long Pseudorandom Sequences
NASA Technical Reports Server (NTRS)
Wang, C. C.
1990-01-01
Conceptual very-large-scale integrated (VLSI) digital circuit performs exponentiation in finite field. Algorithm that generates unusually long sequences of pseudorandom numbers executed by digital processor that includes such circuits. Concepts particularly advantageous for such applications as spread-spectrum communications, cryptography, and generation of ranging codes, synthetic noise, and test data, where usually desirable to make pseudorandom sequences as long as possible.
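The finite-field exponentiation idea can be shown in miniature with a toy prime field (the abstract's VLSI circuit works in a large binary extension field, but the principle is the same): successive powers of a primitive element visit every nonzero field element before repeating, giving a maximal-length sequence.

```python
def power_sequence(g, p):
    """Successive powers of g in the prime field GF(p). If g is a
    primitive element, the sequence visits every nonzero field element
    before repeating, i.e. it has maximal period p - 1."""
    x, out = 1, []
    for _ in range(p - 1):
        x = (x * g) % p
        out.append(x)
    return out

# 3 is a primitive root modulo 17, so the period is 16 (maximal).
seq = power_sequence(3, 17)
```

The VLSI attraction is that in GF(2^m) each step is a shift-and-XOR circuit rather than an integer multiply, so very long periods (2^m - 1) come cheaply in hardware.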
Integration of Rotor Aerodynamic Optimization with the Conceptual Design of a Large Civil Tiltrotor
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.
2010-01-01
Coupling of aeromechanics analysis with vehicle sizing is demonstrated with the CAMRAD II aeromechanics code and NDARC sizing code. The example is optimization of cruise tip speed with rotor/wing interference for the Large Civil Tiltrotor (LCTR2) concept design. Free-wake models were used for both rotors and the wing. This report is part of a NASA effort to develop an integrated analytical capability combining rotorcraft aeromechanics, structures, propulsion, mission analysis, and vehicle sizing. The present paper extends previous efforts by including rotor/wing interference explicitly in the rotor performance optimization and implicitly in the sizing.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts.
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts.
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tilt-rotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne R.
2009-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts.
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter; tandem helicopter; coaxial helicopter; and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC - NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2015-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
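The component build-up the abstract describes (aircraft attributes obtained from the sum of component attributes) can be sketched as follows. This is purely an illustration with invented component names and values, not NDARC source code or its actual weight and drag models:

```python
# Illustrative sketch (not NDARC): aircraft-level attributes as the sum of
# component attributes. The dataclass fields and the example values below
# are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    weight: float      # lb
    drag_area: float   # ft^2, flat-plate equivalent

def aircraft_totals(components):
    """Aggregate aircraft attributes from its components."""
    weight = sum(c.weight for c in components)
    drag_area = sum(c.drag_area for c in components)
    return weight, drag_area

parts = [
    Component("fuselage", 1200.0, 4.0),
    Component("main rotor", 900.0, 1.5),
    Component("tail rotor", 150.0, 0.3),
]
print(aircraft_totals(parts))  # → (2250.0, 5.8)
```

In a sizing tool, totals like these would be recomputed inside an iteration loop until the design closes against the specified conditions and missions.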
NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NASA Astrophysics Data System (ADS)
El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel
This paper proposes an optimization framework enabling the integration of multi-scale/multi-physics simulation codes to perform structural optimization design for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.

Development of 1D Liner Compression Code for IDL
NASA Astrophysics Data System (ADS)
Shimazu, Akihisa; Slough, John; Pancotti, Anthony
2015-11-01
A 1D liner compression code is developed to model liner implosion dynamics in the Inductively Driven Liner Experiment (IDL), in which an FRC plasmoid is compressed via inductively driven metal liners. The driver circuit, magnetic field, joule heating, and liner dynamics calculations are performed in sequence at each time step to couple these effects in the code. To obtain more realistic magnetic field results for a given drive coil geometry, 2D and 3D effects are incorporated into the 1D field calculation through a correction-factor table-lookup approach. A commercial low-frequency electromagnetics solver, ANSYS Maxwell 3D, is used to solve the magnetic field profile for the static liner condition at various liner radii in order to derive correction factors for the 1D field calculation in the code. The liner dynamics results from the code are verified to be in good agreement with results from a commercial explicit dynamics solver, ANSYS Explicit Dynamics, and with a previous liner experiment. The developed code is used to optimize the capacitor bank and driver coil design for better energy transfer and coupling. FRC gain calculations are also performed using the liner compression data from the code for the conceptual design of a reactor-sized system for fusion energy gains.
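The correction-factor table-lookup approach described in the abstract can be sketched as below. The radii, factor values, and the toy field model are invented for illustration; they are not taken from the IDL code or its ANSYS Maxwell 3D results:

```python
# Illustrative sketch (not the IDL code): a 1D field estimate scaled by a
# correction factor tabulated against liner radius from 3D static solutions.
# All numbers here are assumptions for illustration.
import bisect

radii = [0.02, 0.04, 0.06, 0.08, 0.10]    # m, static liner radii solved in 3D
factors = [0.62, 0.71, 0.78, 0.84, 0.88]  # assumed B_3D / B_1D at each radius

def correction(r):
    """Linear interpolation in the correction-factor table (clamped at ends)."""
    if r <= radii[0]:
        return factors[0]
    if r >= radii[-1]:
        return factors[-1]
    i = bisect.bisect_right(radii, r)
    t = (r - radii[i - 1]) / (radii[i] - radii[i - 1])
    return factors[i - 1] + t * (factors[i] - factors[i - 1])

def field_1d(coil_current, r):
    """Toy solenoid-like 1D field estimate, scaled by the tabulated correction."""
    b_ideal = 4e-7 * 3.141592653589793 * coil_current / (2 * r)
    return b_ideal * correction(r)
```

The point of the lookup is that the expensive 3D solves happen once, offline, while the time-stepping loop pays only for an interpolation.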
Processing Motion: Using Code to Teach Newtonian Physics
NASA Astrophysics Data System (ADS)
Massey, M. Ryan
Prior to instruction, students often possess a common-sense view of motion, which is inconsistent with Newtonian physics. Effective physics lessons therefore involve conceptual change. To provide a theoretical explanation for concepts and how they change, the triangulation model brings together key attributes of prototypes, exemplars, theories, Bayesian learning, ontological categories, and the causal model theory. The triangulation model provides a theoretical rationale for why coding is a viable method for physics instruction. As an experiment, thirty-two adolescent students participated in summer coding academies to learn how to design Newtonian simulations. Conceptual and attitudinal data were collected using the Force Concept Inventory and the Colorado Learning Attitudes about Science Survey. Results suggest that coding is an effective means for teaching Newtonian physics.
An Object-Oriented Collection of Minimum Degree Algorithms: Design, Implementation, and Experiences
NASA Technical Reports Server (NTRS)
Kumfert, Gary; Pothen, Alex
1999-01-01
The multiple minimum degree (MMD) algorithm and its variants have enjoyed 20+ years of research and progress in generating fill-reducing orderings for sparse, symmetric positive definite matrices. Although conceptually simple, efficient implementations of these algorithms are deceptively complex and highly specialized. In this case study, we present an object-oriented library that implements several recent minimum degree-like algorithms. We discuss how object-oriented design forces us to decompose these algorithms in a different manner than earlier codes and demonstrate how this impacts the flexibility and efficiency of our C++ implementation. We compare the performance of our code against other implementations in C or Fortran.
Biasing spatial attention with semantic information: an event coding approach.
Amer, Tarek; Gozli, Davood G; Pratt, Jay
2017-04-21
We investigated the influence of conceptual processing on visual attention from the standpoint of Theory of Event Coding (TEC). The theory makes two predictions: first, an important factor in determining the influence of event 1 on processing event 2 is whether features of event 1 are bound into a unified representation (i.e., selection or retrieval of event 1). Second, whether processing the two events facilitates or interferes with each other should depend on the extent to which their constituent features overlap. In two experiments, participants performed a visual-attention cueing task, in which the visual target (event 2) was preceded by a relevant or irrelevant explicit (e.g., "UP") or implicit (e.g., "HAPPY") spatial-conceptual cue (event 1). Consistent with TEC, we found relevant explicit cues (which featurally overlap to a greater extent with the target) and implicit cues (which featurally overlap to a lesser extent), respectively, facilitated and interfered with target processing at compatible locations. Irrelevant explicit and implicit cues, on the other hand, both facilitated target processing, presumably because they were less likely selected or retrieved as an integrated and unified event file. We argue that such effects, often described as "attentional cueing", are better accounted for within the event coding framework.
The new Italian code of medical ethics.
Fineschi, V; Turillazzi, E; Cateni, C
1997-01-01
In June 1995, the Italian code of medical ethics was revised in order that its principles should reflect the ever-changing relationship between the medical profession and society and between physicians and patients. The updated code is also a response to new ethical problems created by scientific progress; the discussion of such problems often shows up a need for better understanding on the part of the medical profession itself. Medical deontology is defined as the discipline for the study of norms of conduct for the health care professions, including moral and legal norms as well as those pertaining more strictly to professional performance. The aim of deontology is, therefore, the in-depth investigation and revision of the code of medical ethics. It is in the light of this conceptual definition that one should interpret a review of the different codes which have attempted, throughout the various periods of Italy's recent history, to adapt ethical norms to particular social and health care climates. PMID:9279746
Efficient, Multi-Scale Designs Take Flight
NASA Technical Reports Server (NTRS)
2003-01-01
Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.
The LAC Test: A New Look at Auditory Conceptualization and Literacy Development K-12.
ERIC Educational Resources Information Center
Lindamood, Charles; And Others
The Lindamood Auditory Conceptualization (LAC) Test was constructed with the recognition that the process of decoding involves an integration of the auditory, visual, and motor senses. Requiring the manipulation of colored blocks to indicate conceptualization of test patterns spoken by the examiner, subtest 1 entails coding of identity, number,…
NDARC-NASA Design and Analysis of Rotorcraft Theoretical Basis and Architecture
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2010-01-01
The theoretical basis and architecture of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are described. The principal tasks of NDARC are to design (or size) a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated. The aircraft attributes are obtained from the sum of the component attributes. NDARC provides a capability to model general rotorcraft configurations, and estimate the performance and attributes of advanced rotor concepts. The software has been implemented with low-fidelity models, typical of the conceptual design environment. Incorporation of higher-fidelity models will be possible, as the architecture of the code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis and optimization.
Improving aircraft conceptual design - A PHIGS interactive graphics interface for ACSYNT
NASA Technical Reports Server (NTRS)
Wampler, S. G.; Myklebust, A.; Jayaram, S.; Gelhausen, P.
1988-01-01
A CAD interface has been created for the 'ACSYNT' aircraft conceptual design code that permits the execution and control of the design process via interactive graphics menus. This CAD interface was coded entirely with the new three-dimensional graphics standard, the Programmer's Hierarchical Interactive Graphics System. The CAD/ACSYNT system is designed for use by state-of-the-art high-speed imaging work stations. Attention is given to the approaches employed in modeling, data storage, and rendering.
Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.
2016-01-01
The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.
Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis
NASA Technical Reports Server (NTRS)
Sexstone, Matthew G.
1998-01-01
This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.
Terahertz-Regime, Micro-VEDs: Evaluation of Micromachined TWT Conceptual Designs
NASA Technical Reports Server (NTRS)
Booske, John H.; Kory, Carol L.; Gallagher, D.; van der Weide, Daniel W.; Limbach, S; Gustafson, P; Lee, W.-J.; Gallagher, S.; Jain, K.
2001-01-01
Summary form only given. The Terahertz (THz) region of the electromagnetic spectrum (approx. 300-3000 GHz) has enormous potential for high-data-rate communications, spectroscopy, astronomy, space research, medicine, biology, surveillance, remote sensing, industrial process control, etc. The most critical roadblock to full exploitation of the THz band is the lack of coherent radiation sources that are powerful (0.01-10.0 W continuous wave), efficient (>1%), frequency agile (instantaneously tunable over 1% bandwidths or more), reliable, and relatively inexpensive. Micro-machined Vacuum Electron Devices (micro-VEDs) represent a promising solution. We describe prospects for miniature, THz-regime TWTs fabricated using micromachining techniques. Several approx. 600 GHz conceptual designs are compared. Their expected performance has been analyzed using 1D, 2.5D, and 3D TWT codes. A folded waveguide (FWG) TWT forward-wave amplifier design is presented based on a Northrop Grumman (NGC) optimized design procedure. This conceptual device is compared to the simulated performance of a novel, micro-VED helix TWT. Conceptual FWG TWT backward-wave amplifiers and oscillators are also discussed. A scaled (100 GHz) FWG TWT operating at a relatively low voltage (approx. 12 kV) is under development at NGC. Also, actual-size micromachining experiments are planned to evaluate the feasibility of arrays of micro-VED TWTs. Progress and results of these efforts are described. This work was supported in part by AFOSR, ONR, and NSF.
System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO
NASA Technical Reports Server (NTRS)
Olds, John R.
1994-01-01
This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
Creating Semantic Waves: Using Legitimation Code Theory as a Tool to Aid the Teaching of Chemistry
ERIC Educational Resources Information Center
Blackie, Margaret A. L.
2014-01-01
This is a conceptual paper aimed at chemistry educators. The purpose of this paper is to illustrate the use of the semantic code of Legitimation Code Theory in chemistry teaching. Chemistry is an abstract subject which many students struggle to grasp. Legitimation Code Theory provides a way of separating out abstraction from complexity both of…
A probabilistic methodology for radar cross section prediction in conceptual aircraft design
NASA Astrophysics Data System (ADS)
Hines, Nathan Robert
System effectiveness has increasingly become the prime metric for the evaluation of military aircraft. As such, it is the decision maker's/designer's goal to maximize system effectiveness. Industry and government research documents indicate that all future military aircraft will incorporate signature reduction as an attempt to improve system effectiveness and reduce the cost of attrition. Today's operating environments demand low-observable aircraft that are able to reliably take out valuable, time-critical targets. Thus it is desirable to be able to design vehicles that are balanced for increased effectiveness. Previous studies have shown that shaping of the vehicle is one of the most important contributors to radar cross section, a measure of radar signature, and must be considered from the very beginning of the design process. Radar cross section estimation should be incorporated into conceptual design to develop more capable systems. This research strives to meet these needs by developing a conceptual design tool that predicts radar cross section for parametric geometries. This tool predicts the absolute radar cross section of the vehicle as well as the impact of geometry changes, allowing for the simultaneous tradeoff of the aerodynamic, performance, and cost characteristics of the vehicle with the radar cross section. Furthermore, this tool can be linked to a campaign theater analysis code to demonstrate the changes in system and system of system effectiveness due to changes in aircraft geometry. A general methodology was developed and implemented, and sample computer codes were applied to prototype the proposed process. Studies utilizing this radar cross section tool were subsequently performed to demonstrate the capabilities of this method and show the impact that various inputs have on the outputs of these models.
The F/A-18 aircraft configuration was chosen as a case study vehicle to perform a design space exercise and to investigate the relative impact of shaping parameters on radar cross section. Finally, two unique low observable configurations were analyzed to examine the impact of shaping for stealthiness.
ERIC Educational Resources Information Center
Erduran, Sibel
Eight physical science textbooks were analyzed for coverage on acids, bases, and neutralization. At the level of the text, clarity and coherence of statements were investigated. The conceptual framework for this topic was represented in a concept map which was used as a coding tool for tracing concepts and links present in textbooks. Cognitive…
Computer Code Aids Design Of Wings
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.
1993-01-01
AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.
The Landscape of long non-coding RNA classification
St Laurent, Georges; Wahlestedt, Claes; Kapranov, Philipp
2015-01-01
Advances in the depth and quality of transcriptome sequencing have revealed many new classes of long non-coding RNAs (lncRNAs). lncRNA classification has mushroomed to accommodate these new findings, even though the real dimensions and complexity of the non-coding transcriptome remain unknown. Although evidence of functionality of specific lncRNAs continues to accumulate, conflicting, confusing, and overlapping terminology has fostered ambiguity and lack of clarity in the field in general. The lack of a fundamental, unambiguous conceptual classification framework results in a number of challenges in the annotation and interpretation of non-coding transcriptome data. It also might undermine integration of new genomic methods and datasets in the effort to unravel the function of lncRNAs. Here, we review existing lncRNA classifications, nomenclature, and terminology. Then we describe the conceptual guidelines that have emerged for their classification and functional annotation based on expanding and more comprehensive use of large systems biology-based datasets. PMID:25869999
Evaluation of CFETR as a Fusion Nuclear Science Facility using multiple system codes
NASA Astrophysics Data System (ADS)
Chan, V. S.; Costley, A. E.; Wan, B. N.; Garofalo, A. M.; Leuer, J. A.
2015-02-01
This paper presents the results of a multi-system codes benchmarking study of the recently published China Fusion Engineering Test Reactor (CFETR) pre-conceptual design (Wan et al 2014 IEEE Trans. Plasma Sci. 42 495). Two system codes, General Atomics System Code (GASC) and Tokamak Energy System Code (TESC), using different methodologies to arrive at CFETR performance parameters under the same CFETR constraints show that the correlation between the physics performance and the fusion performance is consistent, and the computed parameters are in good agreement. Optimization of the first wall surface for tritium breeding and the minimization of the machine size are highly compatible. Variations of the plasma currents and profiles lead to changes in the required normalized physics performance; however, they do not significantly affect the optimized size of the machine. GASC and TESC have also been used to explore a lower aspect ratio, larger volume plasma taking advantage of the engineering flexibility in the CFETR design. Assuming the ITER steady-state scenario physics, the larger plasma together with a moderately higher BT and Ip can result in a high-gain Qfus ~ 12, Pfus ~ 1 GW machine approaching DEMO-like performance. It is concluded that the CFETR baseline mode can meet the minimum goal of the Fusion Nuclear Science Facility (FNSF) mission and advanced physics will enable it to address comprehensively the outstanding critical technology gaps on the path to a demonstration reactor (DEMO). Before proceeding with CFETR construction, steady-state operation has to be demonstrated, further development is needed to solve the divertor heat load issue, and blankets have to be designed with a tritium breeding ratio (TBR) >1 as a target.
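For readers unfamiliar with the quoted figures, the fusion gain Qfus is the ratio of fusion power to the externally supplied heating power. A minimal sketch of that relationship (an assumed textbook definition, not GASC or TESC internals):

```python
# Illustrative sketch (assumed definition, not from the system codes):
# fusion gain as the ratio of fusion power to external heating power.
def fusion_gain(p_fusion_mw, p_heating_mw):
    return p_fusion_mw / p_heating_mw

# A Qfus ~ 12, Pfus ~ 1 GW operating point implies roughly 83 MW of heating.
print(fusion_gain(1000.0, 83.0))  # ≈ 12.05
```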
Vectorization, threading, and cache-blocking considerations for hydrocodes on emerging architectures
Fung, J.; Aulwes, R. T.; Bement, M. T.; ...
2015-07-14
This work reports on considerations for improving computational performance in preparation for current and expected changes to computer architecture. The algorithms studied include increasingly complex prototypes for radiation hydrodynamics codes, such as gradient routines and diffusion matrix assembly (e.g., in [1-6]). The meshes considered for the algorithms are structured or unstructured meshes. The considerations applied for performance improvements are meant to be general in terms of architecture (not specific to graphical processing units (GPUs) or multi-core machines, for example) and include techniques for vectorization, threading, tiling, and cache blocking. Out of a survey of optimization techniques on applications such as diffusion and hydrodynamics, we make general recommendations with a view toward making these techniques conceptually accessible to the applications code developer. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
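As a minimal illustration of the vectorization theme surveyed above (not code from the paper), consider a scalar loop for a 1D diffusion stencil and its array-slice equivalent, which array libraries and compilers can map onto SIMD hardware. The array contents and coefficient are invented for illustration:

```python
# Illustrative sketch: scalar vs. vectorized update for a 1D diffusion stencil
# u_i <- u_i + alpha * (u_{i-1} - 2 u_i + u_{i+1}), boundaries held fixed.
import numpy as np

def diffuse_scalar(u, alpha):
    out = u.copy()
    for i in range(1, len(u) - 1):          # interior points only
        out[i] = u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
    return out

def diffuse_vectorized(u, alpha):
    out = u.copy()
    # Same stencil expressed on whole-array slices: no Python-level loop.
    out[1:-1] = u[1:-1] + alpha * (u[:-2] - 2 * u[1:-1] + u[2:])
    return out

u = np.array([0.0, 1.0, 4.0, 9.0, 16.0])
assert np.allclose(diffuse_scalar(u, 0.1), diffuse_vectorized(u, 0.1))
```

Cache blocking applies the same idea in the large: restructuring loops so each tile of data is reused while it is still resident in cache.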
NASA Technical Reports Server (NTRS)
Stack, S. H.
1981-01-01
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
User's Guide for ENSAERO_FE Parallel Finite Element Solver
NASA Technical Reports Server (NTRS)
Eldred, Lloyd B.; Guruswamy, Guru P.
1999-01-01
A high fidelity parallel static structural analysis capability is created and interfaced to the multidisciplinary analysis package ENSAERO-MPI of Ames Research Center. This new module replaces ENSAERO's lower fidelity simple finite element and modal modules. Full aircraft structures may be more accurately modeled using the new finite element capability. Parallel computation is performed by breaking the full structure into multiple substructures. This approach is conceptually similar to ENSAERO's multizonal fluid analysis capability. The new substructure code is used to solve the structural finite element equations for each substructure in parallel. NASTRAN/COSMIC is utilized as a front end for this code. Its full library of elements can be used to create an accurate and realistic aircraft model. It is used to create the stiffness matrices for each substructure. The new parallel code then uses an iterative preconditioned conjugate gradient method to solve the global structural equations for the substructure boundary nodes.
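A preconditioned conjugate gradient iteration of the kind the abstract mentions can be sketched as follows. This is a generic serial illustration with a Jacobi (diagonal) preconditioner and an invented 2x2 system, not ENSAERO_FE's parallel implementation:

```python
# Illustrative sketch (not ENSAERO_FE): Jacobi-preconditioned conjugate
# gradient solve of K u = f for a symmetric positive definite K.
import numpy as np

def pcg(K, f, tol=1e-10, max_iter=200):
    n = len(f)
    m_inv = 1.0 / np.diag(K)          # Jacobi preconditioner: M = diag(K)
    u = np.zeros(n)
    r = f - K @ u                     # initial residual
    z = m_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Kp = K @ p
        alpha = rz / (p @ Kp)
        u += alpha * p
        r -= alpha * Kp
        if np.linalg.norm(r) < tol:
            break
        z = m_inv * r                 # apply preconditioner
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return u

K = np.array([[4.0, 1.0], [1.0, 3.0]])  # SPD stiffness-like matrix (invented)
f = np.array([1.0, 2.0])
print(pcg(K, f))  # ≈ [0.0909, 0.6364]
```

In the parallel substructure setting, the matrix-vector products would be assembled from per-substructure contributions, with communication only at the shared boundary nodes.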
The effect of multiple internal representations on context-rich instruction
NASA Astrophysics Data System (ADS)
Lasry, Nathaniel; Aulls, Mark W.
2007-11-01
We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities: verbal, visual, kinesthetic, logico-mathematical, and social. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed in the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed along with conceptual learning using the Force Concept Inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.
The Current Status of Behaviorism and Neurofeedback
ERIC Educational Resources Information Center
Fultz, Dwight E.
2009-01-01
There appears to be no dominant conceptual model for the process and outcomes of neurofeedback among practitioners or manufacturers. Behaviorists are well-positioned to develop a neuroscience-based source code in which neural activity is described in behavioral terms, providing a basis for behavioral conceptualization and education of…
Axial and Centrifugal Compressor Mean Line Flow Analysis Method
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
2009-01-01
This paper describes a method to estimate key aerodynamic parameters of single and multistage axial and centrifugal compressors. The mean-line compressor code COMDES provides the capability of sizing single and multistage compressors quickly during the conceptual design process. Based on the compressible fluid flow equations and the Euler equation, the code can estimate rotor inlet and exit blade angles when run in the design mode. The design-point rotor efficiency and stator losses are inputs to the code and are modeled at off-design conditions. When run in the off-design analysis mode, it can be used to generate performance maps based on simple models for losses due to rotor incidence and inlet guide vane reset angle. The code can provide an improved understanding of basic aerodynamic parameters such as diffusion factor, loading levels, and incidence when matching multistage compressor blade rows at design and at part-speed operation. Rotor loading levels and relative velocity ratio are correlated to the onset of compressor surge. NASA Stage 37 and the three-stage NASA 74-A axial compressors were analyzed and the results compared to test data. The code has been used to generate the performance map for the NASA 76-B three-stage axial compressor featuring variable geometry. The compressor stages were aerodynamically matched at off-design speeds by adjusting the variable inlet guide vane and variable stator geometry angles to control the rotor diffusion factor and incidence angles.
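As a rough illustration of the loading parameters such a mean-line code tracks, the diffusion factor can be computed from row velocities and solidity. The formula below is the standard Lieblein textbook form, and the numbers are hypothetical; neither is necessarily the exact correlation used inside COMDES.

```python
# Standard Lieblein diffusion factor for a compressor blade row.
def diffusion_factor(w1, w2, dwt, sigma):
    """D = 1 - w2/w1 + |dWtheta| / (2 * sigma * w1)
    w1, w2 : relative inlet/exit velocities (m/s)
    dwt    : change in tangential (swirl) velocity across the row (m/s)
    sigma  : blade-row solidity (chord/pitch)
    """
    return 1.0 - w2 / w1 + abs(dwt) / (2.0 * sigma * w1)

# Example rotor with moderate loading (made-up numbers):
D = diffusion_factor(w1=250.0, w2=180.0, dwt=90.0, sigma=1.2)
```

Values much above roughly 0.5 to 0.6 are commonly associated with excessive loss and, eventually, stall, which is why a conceptual-design code monitors this parameter when matching stages.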
Transition Models for Engineering Calculations
NASA Technical Reports Server (NTRS)
Fraser, C. J.
2007-01-01
While future theoretical and conceptual developments may promote a better understanding of the physical processes involved in the latter stages of boundary layer transition, the designers of rotodynamic machinery and other fluid dynamic devices need effective transition models now. This presentation therefore centers on some transition models developed as design aids to improve the prediction codes used in the performance evaluation of gas turbine blading. All models are based on Narasimha's concentrated breakdown and spot growth.
NASA Astrophysics Data System (ADS)
May, David B.
2002-11-01
To explore students' epistemological beliefs in a variety of conceptual domains in physics, and in a specific and novel context of measurement, this Dissertation makes use of Weekly Reports, a class assignment in which students reflect in writing on what they learn each week and how they learn it. Reports were assigned to students in the introductory physics course for honors engineering majors at The Ohio State University in two successive years. The Weekly Reports of several students from the first year were analyzed for the kinds of epistemological beliefs exhibited therein, called epistemological self-reflection, and a coding scheme was developed for categorizing and quantifying this reflection. The connection between epistemological self-reflection and conceptual learning in physics seen in a pilot study was replicated in a larger study, in which the coded reflections from the Weekly Reports of thirty students were correlated with their conceptual learning gains. Although the total amount of epistemological self-reflection was not found to be related to conceptual gain, different kinds of epistemological self-reflection were. Describing learning physics concepts in terms of logical reasoning and making personal connections were positively correlated with gains; describing learning from authority figures or by observing phenomena without making inferences were negatively correlated. Linear regression equations were determined in order to quantify the effects on conceptual gain of specific ways of describing learning. In an experimental test of this model, the regression equations and the Weekly Report coding scheme developed from the first year's data were used to predict the conceptual gains of thirty students from the second year. The prediction was unsuccessful, possibly because these students were not given as much feedback on their reflections as were the first-year students. 
These results show that epistemological beliefs are important factors affecting the conceptual learning of physics students. Also, getting students to reflect meaningfully on their knowledge and learning is difficult and requires consistent feedback. Research into the epistemological beliefs of physics students in different contexts and from different populations can help us develop more complete models of epistemological beliefs, and ultimately improve the conceptual and epistemological knowledge of all students.
Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-01-01
Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625
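The five predetermined code types can be illustrated with a toy tagging structure; the excerpts and subcodes below are invented for the example and do not come from the article.

```python
# Toy sketch of the five predetermined code types applied to transcript
# excerpts (all excerpts and subcode names are hypothetical).
CODE_TYPES = {"conceptual", "relationship", "perspective",
              "participant_characteristics", "setting"}

coded_excerpts = [
    ("Nurses often catch medication errors first.",
     "conceptual", "error_detection"),
    ("Physicians rely on pharmacists to flag interactions.",
     "relationship", "md_pharmacist"),
    ("I think the new charting system slows us down.",
     "perspective", "staff_view"),
    ("Respondent: ICU nurse, 12 years of experience.",
     "participant_characteristics", "role"),
    ("Interview conducted during the night shift.",
     "setting", "shift"),
]

# Per the approach described: conceptual codes/subcodes feed taxonomy
# development, while relationship and perspective codes feed themes/theory.
taxonomy_input = [sub for _, ctype, sub in coded_excerpts
                  if ctype == "conceptual"]
theme_input = [sub for _, ctype, sub in coded_excerpts
               if ctype in {"relationship", "perspective"}]
```

Keeping the code type explicit on every excerpt is what later enables the intersectional comparisons across participant characteristics and settings.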
What Is FRBR? A Conceptual Model for the Bibliographic Universe
ERIC Educational Resources Information Center
Tillett, Barbara
2005-01-01
From 1992 to 1995 the IFLA Study Group on Functional Requirements for Bibliographic Records (FRBR) developed an entity relationship model as a generalised view of the bibliographic universe, intended to be independent of any cataloguing code or implementation. The FRBR report itself includes a description of the conceptual model (the entities,…
Recurrent Coupling Improves Discrimination of Temporal Spike Patterns
Yuan, Chun-Wei; Leibold, Christian
2012-01-01
Despite the ubiquitous presence of recurrent synaptic connections in sensory neuronal systems, their general functional purpose is not well understood. A recent conceptual advance has been achieved by theories of reservoir computing in which recurrent networks have been proposed to generate short-term memory as well as to improve neuronal representation of the sensory input for subsequent computations. Here, we present a numerical study on the distinct effects of inhibitory and excitatory recurrence in a canonical linear classification task. It is found that both types of coupling improve the ability to discriminate temporal spike patterns as compared to a purely feed-forward system, although in different ways. For a large class of inhibitory networks, the network’s performance is optimal as long as a fraction of roughly 50% of neurons per stimulus is active in the resulting population code. Thereby the contribution of inactive neurons to the neural code is found to be even more informative than that of the active neurons, generating an inherent robustness of classification performance against temporal jitter of the input spikes. Excitatory couplings are found to not only produce a short-term memory buffer but also to improve linear separability of the population patterns by evoking more irregular firing as compared to the purely inhibitory case. As the excitatory connectivity becomes more sparse, firing becomes more variable, and pattern separability improves. We argue that the proposed paradigm is particularly well-suited as a conceptual framework for processing of sensory information in the auditory pathway. PMID:22586392
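The linear classification setup can be sketched minimally: binary population patterns with roughly 50% of neurons active per stimulus, read out by a linear classifier. The network sizes and the least-squares readout below are illustrative choices, not the exact network model studied in the paper.

```python
# Minimal sketch: linear readout of binary population codes in which
# about half the neurons are active per stimulus, so inactive neurons
# carry label information too.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_patterns = 50, 10

# Each stimulus activates a random ~50% subset of the neurons.
X = (rng.random((n_patterns, n_neurons)) < 0.5).astype(float)
y = np.where(np.arange(n_patterns) % 2 == 0, 1.0, -1.0)  # two classes

# Linear readout trained by least squares, then thresholded by sign.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
acc = np.mean(np.sign(X @ w) == y)
```

With far fewer patterns than neurons, a linear readout separates the training patterns easily; the interesting questions in the paper concern how recurrent inhibition and excitation reshape these population patterns to improve separability and robustness to spike-time jitter.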
Ascent Aerodynamic Pressure Distributions on WB001
NASA Technical Reports Server (NTRS)
Vu, B.; Ruf, J.; Canabal, F.; Brunty, J.
1996-01-01
To support the reusable launch vehicle concept study, the aerodynamic data and surface pressures for WB001 were predicted using three computational fluid dynamic (CFD) codes at several flow conditions. The results were compared code to code, against the aerodynamic database, and with available experimental data. A set of particular solutions has been selected and recommended for use in preliminary conceptual designs. These CFD results have also been provided to the structures group for wing loading analysis.
Leveraging Quick Response Code Technology to Facilitate Simulation-Based Leaderboard Competition.
Chang, Todd P; Doughty, Cara B; Mitchell, Diana; Rutledge, Chrystal; Auerbach, Marc A; Frisell, Karin; Jani, Priti; Kessler, David O; Wolfe, Heather; MacKinnon, Ralph J; Dewan, Maya; Pirie, Jonathan; Lemke, Daniel; Khattab, Mona; Tofil, Nancy; Nagamuthu, Chenthila; Walsh, Catharine M
2018-02-01
Leaderboards provide feedback on relative performance and a competitive atmosphere for both self-guided improvement and social comparison. Because simulation can provide substantial quantitative participant feedback, leaderboards can be used, not only locally but also in a multidepartment, multicenter fashion. Quick Response (QR) codes can be integrated to allow participants to access and upload data. We present the development, implementation, and initial evaluation of an online leaderboard employing principles of gamification using points, badges, and leaderboards designed to enhance competition among healthcare providers. This article details the fundamentals behind the development and implementation of a user-friendly, online, multinational leaderboard that employs principles of gamification to enhance competition and integrates a QR code system to promote both self-reporting of performance data and data integrity. An open-ended survey was administered to capture perceptions of leaderboard implementation. Conceptual step-by-step instructions detailing how to apply the QR code system to any leaderboard using simulated or real performance metrics are outlined using an illustrative example of a leaderboard that employed simulated cardiopulmonary resuscitation performance scores to compare participants across 17 hospitals in 4 countries for 16 months. The following three major descriptive categories that captured perceptions of leaderboard implementation emerged from initial evaluation data from 10 sites: (1) competition, (2) longevity, and (3) perceived deficits. A well-designed leaderboard should be user-friendly and encompass best practices in gamification principles while collecting and storing data for research analyses. Easy storage and export of data allow for longitudinal record keeping that can be leveraged both to track compliance and to enable social competition.
Development of a Conceptual Framework to Measure the Social Impact of Burns.
Marino, Molly; Soley-Bori, Marina; Jette, Alan M; Slavin, Mary D; Ryan, Colleen M; Schneider, Jeffrey C; Resnik, Linda; Acton, Amy; Amaya, Flor; Rossi, Melinda; Soria-Saucedo, Rene; Kazis, Lewis E
Measuring community reintegration following burn injury is important to assess the efficacy of therapies designed to optimize recovery. This project aims to develop and validate a conceptual framework for understanding the social impact of burn injuries in adults. The framework is critical for developing the item banks used for a computerized adaptive test. We performed a comprehensive literature review and consulted with clinical experts and burn survivors about social life areas impacted by burn injury. Focus groups with burn survivors and clinicians were conducted to inform and validate the framework. Transcripts were coded using grounded theory methodology. The World Health Organization's International Classification of Functioning, Disability and Health, was chosen to ground the content model. The primary construct identified was social participation, which contains two concepts: societal role and personal relationships. The subdomains chosen for item development were work, recreation and leisure, relating with strangers, and romantic, sexual, family, and informal relationships. Qualitative results strongly suggest that the conceptual model fits the constructs for societal role and personal relationships with the respective subdomains. This conceptual framework has guided the implementation of a large-scale calibration study currently underway which will lead to a computerized adaptive test for monitoring the social impacts of burn injuries during recovery.
An Object-Oriented Computer Code for Aircraft Engine Weight Estimation
NASA Technical Reports Server (NTRS)
Tong, Michael T.; Naylor, Bret A.
2009-01-01
Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn Research Center (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It also would facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case.
ERIC Educational Resources Information Center
Nicholas, Mark C.
2011-01-01
Empirical research on how faculty across disciplines conceptualize or assess CT is scarce. This investigation focused on a group of 14 faculty drawn from multiple disciplines in the humanities and natural sciences. Using in-depth interviews, focus group discussions, assessment artifacts and qualitative coding strategies, this study examined how…
A Method to Reveal Fine-Grained and Diverse Conceptual Progressions during Learning
ERIC Educational Resources Information Center
Lombard, François; Merminod, Marie; Widmer, Vincent; Schneider, Daniel K.
2018-01-01
Empirical data on learners' conceptual progression is required to design curricula and guide students. In this paper, we present the Reference Map Change Coding (RMCC) method for revealing students' progression at a fine-grained level. The method has been developed and tested through the analysis of successive versions of the productions of eight…
NASA Astrophysics Data System (ADS)
Honnell, Kevin; Burnett, Sarah; Yorke, Chloe'; Howard, April; Ramsey, Scott
2017-06-01
The Noh problem is a classic verification problem in the field of compressible flows. Simple to conceptualize, it is nonetheless difficult for numerical codes to predict correctly, making it an ideal code-verification test bed. In its original incarnation, the fluid is a simple ideal gas; once validated, however, these codes are often used to study highly non-ideal fluids and solids. In this work the classic Noh problem is extended beyond the commonly studied polytropic ideal gas to more realistic equations of state (EOS), including the stiff gas, the Noble-Abel gas, and the Carnahan-Starling hard-sphere fluid, thus enabling verification studies to be performed on more physically realistic fluids. Exact solutions are compared with numerical results obtained from the Lagrangian hydrocode FLAG, developed at Los Alamos. For these more realistic EOSs, the simulation errors decreased in magnitude both at the origin and at the shock, but also spread more broadly about these points compared to the ideal EOS. The overall spatial convergence rate remained first order.
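For reference, the ideal-gas baseline that such studies extend has a closed-form solution. The relations below are the standard Noh results for a cold gas with unit inward speed; the extended-EOS solutions derived in the work itself are not reproduced here.

```python
# Standard ideal-gas Noh solution for a cold gas with uniform inward
# speed |u0| = 1: the shock moves outward at (gamma - 1)/2, and the
# post-shock density is rho0 * ((gamma + 1)/(gamma - 1))**k, where
# k = 1, 2, 3 for planar, cylindrical, and spherical geometry.
def noh_post_shock_density(rho0, gamma, k):
    return rho0 * ((gamma + 1.0) / (gamma - 1.0)) ** k

def noh_shock_position(gamma, t, u0=1.0):
    return 0.5 * (gamma - 1.0) * u0 * t

# Classic spherical test case with gamma = 5/3: the density jumps to 64,
# and at t = 0.6 the shock sits at r = 0.2.
rho2 = noh_post_shock_density(1.0, 5.0 / 3.0, 3)
r_shock = noh_shock_position(5.0 / 3.0, 0.6)
```

Having these exact values is what makes the problem such a clean test bed: a hydrocode's density plateau and shock location can be compared point-by-point against the analytic answer.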
Reference results for time-like evolution up to
NASA Astrophysics Data System (ADS)
Bertone, Valerio; Carrazza, Stefano; Nocera, Emanuele R.
2015-03-01
We present high-precision numerical results for time-like Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution in the factorisation scheme, for the first time up to next-to-next-to-leading order accuracy in quantum chromodynamics. First, we scrutinise the analytical expressions of the splitting functions available in the literature, in both x and N space, and check their mutual consistency. Second, we implement time-like evolution in two publicly available, entirely independent and conceptually different numerical codes, in x and N space respectively: the already existing APFEL code, which has been updated with time-like evolution, and the new MELA code, which has been specifically developed to perform the study in this work. Third, by means of a model for fragmentation functions, we provide results for the evolution in different factorisation schemes, for different ratios between renormalisation and factorisation scales and at different final scales. Our results are collected in the format of benchmark tables, which could be used as a reference for global determinations of fragmentation functions in the future.
A Computer Code for Gas Turbine Engine Weight And Disk Life Estimation
NASA Technical Reports Server (NTRS)
Tong, Michael T.; Ghosn, Louis J.; Halliwell, Ian; Wickenheiser, Tim (Technical Monitor)
2002-01-01
Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. In this paper, the major enhancements to NASA's engine-weight estimate computer code (WATE) are described. These enhancements include the incorporation of improved weight-calculation routines for the compressor and turbine disks using the finite-difference technique. Furthermore, the stress distribution for various disk geometries was also incorporated, for a life-prediction module to calculate disk life. A material database, consisting of the material data of most of the commonly-used aerospace materials, has also been incorporated into WATE. Collectively, these enhancements provide a more realistic and systematic way to calculate the engine weight. They also provide additional insight into the design trade-off between engine life and engine weight. To demonstrate the new capabilities, the enhanced WATE code is used to perform an engine weight/life trade-off assessment on a production aircraft engine.
Secondary School Students' Reasoning about Evolution
ERIC Educational Resources Information Center
To, Cheryl; Tenenbaum, Harriet R.; Hogh, Henriette
2017-01-01
This study examined age differences in young people's understanding of evolution theory in secondary school. A second aim of this study was to propose a new coding scheme that more accurately described students' conceptual understanding about evolutionary theory. We argue that coding schemes adopted in previous research may have overestimated…
Conceptual-driven classification for coding advise in health insurance reimbursement.
Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando
2011-01-01
With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's Law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build up ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model to traditional classification techniques, including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insight into the relationships between relevant medical terms. 
Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits to patients, hospitals, and the healthcare system. Copyright © 2010 Elsevier B.V. All rights reserved.
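The Zipf's-law-motivated term processing mentioned above can be sketched as mid-rank filtering of a term-frequency distribution: very frequent (generic) terms and very rare terms are discarded, keeping the mid-frequency terms that tend to be most discriminative. The toy corpus and thresholds below are invented for illustration and do not reproduce the actual ICD-AS pipeline.

```python
# Sketch of Zipf-style term selection on a toy discharge-summary corpus.
from collections import Counter

docs = [
    "patient with cerebrovascular accident and left hemiparesis",
    "acute ischemic stroke with right hemiparesis",
    "patient with transient ischemic attack",
]

# Rank terms by frequency (the Zipf rank-frequency curve).
counts = Counter(term for doc in docs for term in doc.split())
ranked = [term for term, _ in counts.most_common()]

# Keep mid-frequency terms: drop the single most common (generic) level
# and the singletons (too rare to generalize). Thresholds are illustrative.
max_freq = counts[ranked[0]]
keywords = [t for t in ranked if 1 < counts[t] < max_freq]
```

In the full system, terms surviving such filtering (and matched against MeSH) become the objects over which fuzzy formal concept analysis builds its term-relationship lattice.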
Method for Estimating the Sonic-Boom Characteristics of Lifting Canard-Wing Aircraft Concepts
NASA Technical Reports Server (NTRS)
Mack, Robert J.
2005-01-01
A method for estimating the sonic-boom overpressures from a conceptual aircraft where the lift is carried by both a canard and a wing during supersonic cruise is presented and discussed. Computer codes used for the prediction of the aerodynamic performance of the wing, the canard-wing interference, the nacelle-wing interference, and the sonic-boom overpressures are identified and discussed as the procedures in the method are discussed. A canard-wing supersonic-cruise concept was used as an example to demonstrate the application of the method.
High Efficiency Nuclear Power Plants Using Liquid Fluoride Thorium Reactor Technology
NASA Technical Reports Server (NTRS)
Juhasz, Albert J.; Rarick, Richard A.; Rangarajan, Rajmohan
2009-01-01
An overall system analysis approach is used to propose potential conceptual designs of advanced terrestrial nuclear power plants based on Oak Ridge National Laboratory (ORNL) Molten Salt Reactor (MSR) experience and utilizing Closed Cycle Gas Turbine (CCGT) thermal-to-electric energy conversion technology. In particular conceptual designs for an advanced 1 GWe power plant with turbine reheat and compressor intercooling at a 950 K turbine inlet temperature (TIT), as well as near term 100 MWe demonstration plants with TITs of 950 and 1200 K are presented. Power plant performance data were obtained for TITs ranging from 650 to 1300 K by use of a Closed Brayton Cycle (CBC) systems code which considered the interaction between major sub-systems, including the Liquid Fluoride Thorium Reactor (LFTR), heat source and heat sink heat exchangers, turbo-generator machinery, and an electric power generation and transmission system. Optional off-shore submarine installation of the power plant is a major consideration.
Forecasting of construction and demolition waste in Brazil.
Paz, Diogo Hf; Lafayette, Kalinny Pv
2016-08-01
The objective of this article is to develop a computerised tool (software) that facilitates the analysis of strategies for waste management on construction sites through the use of indicators of construction and demolition waste generation. The development involved four steps: knowledge acquisition, structuring of the system, coding, and system evaluation. The knowledge acquisition step provides the inputs needed to represent that knowledge through models. The structuring step formalises this knowledge for the development of the system in two stages: construction of the conceptual model and its subsequent instantiation. The coding step implements the conceptual model as a computational (digital) model. The results showed that the system is useful and applicable on construction sites, helping to improve the quality of waste management and creating a database that will support new research. © The Author(s) 2016.
Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results
NASA Technical Reports Server (NTRS)
Jones, Scott
2015-01-01
Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al. (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. 
Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial compressors and turbines at design and off-design conditions.
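The simple radial equilibrium relation used in streamline mode, dp/dr = ρVθ²/r, can be sketched numerically. The free-vortex swirl profile, the numbers, and the midpoint integration scheme below are illustrative assumptions, not OTAC internals.

```python
# Integrate simple radial equilibrium, dp/dr = rho * Vtheta^2 / r,
# from hub to tip for an assumed swirl distribution.
import numpy as np

def radial_pressure_rise(r, v_theta, rho):
    """Hub-to-tip static pressure rise via a midpoint-rule integration."""
    dp = 0.0
    for i in range(len(r) - 1):
        rm = 0.5 * (r[i] + r[i + 1])              # midpoint radius
        vm = 0.5 * (v_theta[i] + v_theta[i + 1])  # midpoint swirl velocity
        dp += rho * vm**2 / rm * (r[i + 1] - r[i])
    return dp

r = np.linspace(0.3, 0.5, 101)   # hub-to-tip radii, m (made-up annulus)
v_theta = 60.0 / r               # free vortex: r * Vtheta = const
dp = radial_pressure_rise(r, v_theta, rho=1.2)
```

For a free vortex the integral has the closed form ρK²(1/r_hub² − 1/r_tip²)/2 with K = r·Vθ, which gives a quick check on the numerical result; in a flowthrough code this balance is what sets the spanwise static pressure variation behind each blade row.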
Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results
NASA Technical Reports Server (NTRS)
Jones, Scott M.
2015-01-01
Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. 
Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial compressors and turbines at design and off-design conditions.
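The streamline analysis described above uses simple radial equilibrium, dp/dr = rho * Vtheta^2 / r, to recover spanwise property variations. The following sketch illustrates that relation numerically; the function names, the trapezoidal integration, and the free-vortex test profile are illustrative assumptions, not OTAC's actual implementation.

```python
def radial_pressure_gradient(rho, v_theta, r):
    """Simple radial equilibrium: dp/dr = rho * v_theta**2 / r,
    balancing the radial pressure force against the centripetal
    acceleration of the swirling flow on each streamline."""
    return rho * v_theta ** 2 / r

def pressure_rise_across_span(rho, swirl, radii):
    """Integrate dp/dr with the trapezoidal rule over discrete
    streamline radii; `swirl` maps radius -> tangential velocity."""
    dp = 0.0
    for r_in, r_out in zip(radii, radii[1:]):
        g_in = radial_pressure_gradient(rho, swirl(r_in), r_in)
        g_out = radial_pressure_gradient(rho, swirl(r_out), r_out)
        dp += 0.5 * (g_in + g_out) * (r_out - r_in)
    return dp
```

For a free-vortex swirl distribution (Vtheta = K/r) the integral has a closed form, which makes the sketch easy to check against the analytic result.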
Solar Power System Options for the Radiation and Technology Demonstration Spacecraft
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Haraburda, Francis M.; Riehl, John P.
2000-01-01
The Radiation and Technology Demonstration (RTD) Mission has the primary objective of demonstrating high-power (10 kilowatts) electric thruster technologies in Earth orbit. This paper discusses the conceptual design of the RTD spacecraft photovoltaic (PV) power system and mission performance analyses. These power system studies assessed multiple options for PV arrays, battery technologies and bus voltage levels. To quantify performance attributes of these power system options, a dedicated Fortran code was developed to predict power system performance and estimate system mass. The low-thrust mission trajectory was analyzed and important Earth orbital environments were modeled. Baseline power system design options are recommended on the basis of performance, mass and risk/complexity. Important findings from parametric studies are discussed, along with the resulting impacts on spacecraft design and cost.
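As a flavor of the kind of relation such a performance code evaluates, a first-order end-of-life array power estimate can be sketched as follows; the function name, the compound degradation model, and the parameters are illustrative assumptions, not the RTD study's Fortran code.

```python
def array_power(p_bol_kw, annual_degradation, years, sun_fraction=1.0):
    """First-order PV array power estimate: beginning-of-life power
    derated by a compound annual degradation factor (radiation and
    thermal-cycling damage) and by the orbit-average sunlight
    fraction. Returns power in the same units as p_bol_kw."""
    return p_bol_kw * (1.0 - annual_degradation) ** years * sun_fraction
```

A real system study would add orbit-dependent eclipse fractions, battery round-trip losses, and bus voltage effects on harness mass; this sketch only shows the headline trend of array output over mission life.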
NASA Astrophysics Data System (ADS)
Tseitlin, Michael; Galili, Igal
The crisis in physics education necessitates searching for new relevant meanings of physics knowledge. This paper advocates regarding physics as a dialogue among discipline-cultures, rather than as a cluster of disciplines, as the appropriate subject of science education. In a discipline-culture one can distinguish elements of knowledge as belonging to either (1) central principles and paradigms - the nucleus, (2) the normal disciplinary area - the body of knowledge or (3) rival knowledge of the subject - the periphery. It appears that physics cannot be represented as a simple dynamic wholeness, that is, cannot be arranged in a single tripartite (triadic) structure (this result presents a deconstruction), but incorporates several discipline-cultures. Bound together by family similarity, they maintain a conceptual discourse. Teaching physics as a culture is performed in a polyphonic space of different worldviews; in other words, it is performed in a Kontrapunkt. Implications of the tripartite code are suggested with regard to the representation of scientific revolutions, individual conceptual change, physics curricula and the typology of students learning science.
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
Rosenthal, Jennifer L; Okumura, Megumi J; Hernandez, Lenore; Li, Su-Ting T; Rehm, Roberta S
2016-01-01
Children with special health care needs often require health services that are only provided at subspecialty centers. Such children who present to nonspecialty hospitals might require a hospital-to-hospital transfer. When transitioning between medical settings, communication is an integral aspect that can affect the quality of patient care. The objectives of the study were to identify barriers and facilitators to effective interfacility pediatric transfer communication to general pediatric floors from the perspectives of referring and accepting physicians, and then develop a conceptual model for effective interfacility transfer communication. This was a single-center qualitative study using grounded theory methodology. Referring and accepting physicians of children with special health care needs were interviewed. Four researchers coded the data using ATLAS.ti (version 7, Scientific Software Development GmbH, Berlin, Germany), using a 2-step process of open coding, followed by focused coding until no new codes emerged. The research team reached consensus on the final major categories and subsequently developed a conceptual model. Eight referring and 9 accepting physicians were interviewed. Theoretical coding resulted in 3 major categories: streamlined transfer process, quality handoff and 2-way communication, and positive relationships between physicians across facilities. The conceptual model unites these categories and shows how these categories contribute to effective interfacility transfer communication. Proposed interventions involved standardizing the communication process and incorporating technology such as telemedicine during transfers. Communication is perceived to be an integral component of interfacility transfers. We recommend that transfer systems be re-engineered to make the process more streamlined, to improve the quality of the handoff and 2-way communication, and to facilitate positive relationships between physicians across facilities.
Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Ahi, Berat
2016-01-01
This study aimed to determine mental models and identify codes (schemes) used in conceptualizing a desert environment. The sample for this study consisted of 184--out of a total population of 3,630--children in preschool education in the central district of Kastamonu, Turkey. Within the scope of this study, the children were initially asked to…
Preparing for the Downsizing and Closure of Letterman Army Medical Center: A Case Study
1991-06-17
and closure of Lieutenant Colonel F. William Brown believed in the value of this project, encouraged, and guided me during conceptualization, design... issues directed in the RW document repository were coded within this framework. The mission category was coded 1 if primary or secondary care was affected
Three Mentor Texts that Support Code-Switching Pedagogies
ERIC Educational Resources Information Center
Hill, Dara
2013-01-01
This article informs us about the need for facilitating code-switching pedagogies that call for teacher-led scaffolding of students' home languages to negotiate informal and formal contexts for writing and speaking. Varied strategies are guided by three mentor texts the author has conceptualized or enacted in practice and research among middle…
An Object-oriented Computer Code for Aircraft Engine Weight Estimation
NASA Technical Reports Server (NTRS)
Tong, Michael T.; Naylor, Bret A.
2008-01-01
Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It also would facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case. Keywords: NASA, aircraft engine, weight, object-oriented
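The component-wise, object-oriented decomposition described above can be illustrated with a minimal component/assembly pattern; the class names and the linear sizing-parameter weight relation below are illustrative assumptions, not the WATE++ architecture.

```python
class Component:
    """A leaf engine part whose weight is estimated from a sizing
    parameter (e.g., airflow) times an empirical specific weight."""
    def __init__(self, name, sizing_value, weight_per_unit):
        self.name = name
        self.sizing_value = sizing_value
        self.weight_per_unit = weight_per_unit

    def weight(self):
        return self.sizing_value * self.weight_per_unit


class Assembly:
    """A node that aggregates child weights, mirroring how an
    object-oriented weight code rolls component estimates up
    from blade rows and casings to the full engine."""
    def __init__(self, name, children):
        self.name = name
        self.children = children

    def weight(self):
        return sum(child.weight() for child in self.children)
```

The point of the pattern is that an `Assembly` does not care whether a child is a leaf component or another assembly, so the engine model can be nested to any depth and extended with new component types without touching the rollup logic.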
Lim, David W; White, Jonathan S
2015-11-01
There remains debate regarding the value of the written comments that medical students are traditionally asked to provide to evaluate the teaching they receive. The purpose of this study was to examine written teaching evaluations to understand how medical students conceptualize teachers' behaviors and performance. All written comments collected from medical students about teachers in the two surgery clerkships at the University of Alberta in 2009-2010 and 2010-2011 were collated and anonymized. A grounded theory approach was used for analysis, with iterative reading and open coding to identify recurring themes. A framework capturing variations observed in the data was generated until data saturation was achieved. Domains and subdomains were named using an in situ coding approach. The conceptual framework contained three main domains: "Physician as Teacher," "Physician as Person," and "Physician as Physician." Under "Physician as Teacher," students commented on specific acts of teaching and subjective perceptions of an educator's teaching values. Under the "Physician as Physician" domain, students commented on elements of their educator's physicianship, including communication and collaborative skills, medical expertise, professionalism, and role modeling. Under "Physician as Person," students commented on how both positive and negative personality traits impacted their learning. This framework describes how medical students perceive their teachers and how they use written language to attach meaning to the behaviors they observe. Such a framework can be used to help students provide more constructive feedback to teachers and to assist in faculty development efforts aimed at improving teaching performance.
Trade Studies for a Manned High-Power Nuclear Electric Propulsion Vehicle
NASA Technical Reports Server (NTRS)
SanSoucie, Michael; Hull, Patrick V.; Irwin, Ryan W.; Tinker, Michael L.; Patton, Bruce W.
2005-01-01
Nuclear electric propulsion (NEP) vehicles will be needed for future manned missions to Mars and beyond. Candidate vehicles must be identified through trade studies for further detailed design from a large array of possibilities. Genetic algorithms have proven their utility in conceptual design studies by effectively searching a large design space to pinpoint unique optimal designs. This research combines analysis codes for NEP subsystems with genetic algorithm-based optimization. Trade studies for a NEP reference mission to the asteroids were conducted to identify important trends, and to determine the effects of various technologies and subsystems on vehicle performance. It was found that the electric thruster type and thruster performance have a major impact on the achievable system performance, and that significant effort in thruster research and development is merited.
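A minimal sketch of genetic-algorithm design-space search of the kind described above, assuming a generic fitness function in place of the actual NEP subsystem analysis codes; the operator choices (tournament selection, uniform crossover, Gaussian mutation) are common defaults, not necessarily those used in the study.

```python
import random

def genetic_search(fitness, bounds, pop_size=40, generations=60,
                   mutation=0.1, seed=0):
    """Minimal real-coded genetic algorithm: tournament selection,
    uniform crossover, and bounded Gaussian mutation over a
    box-constrained design space. Returns the best design found."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            # Uniform crossover, then bounded Gaussian mutation.
            child = [rng.choice(genes) for genes in zip(p1, p2)]
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation:
                    child[i] += rng.gauss(0.0, 0.1 * (hi - lo))
                    child[i] = min(hi, max(lo, child[i]))
            new_pop.append(child)
        pop = new_pop
        best = max(pop + [best], key=fitness)  # keep the best ever seen
    return best
```

In a trade study the fitness function would wrap the subsystem analysis codes (reactor, power conversion, thruster, radiator models) and score each candidate vehicle on mission-level metrics such as delivered payload or trip time.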
Baucom, Brian R W; Leo, Karena; Adamo, Colin; Georgiou, Panayiotis; Baucom, Katherine J W
2017-12-01
Observational behavioral coding methods are widely used for the study of relational phenomena. There are numerous guidelines for the development and implementation of these methods that include principles for creating new and adapting existing coding systems as well as principles for creating coding teams. While these principles have been successfully implemented in research on relational phenomena, the ever expanding array of phenomena being investigated with observational methods calls for a similar expansion of these principles. Specifically, guidelines are needed for decisions that arise in current areas of emphasis in couple research including observational investigation of related outcomes (e.g., relationship distress and psychological symptoms), the study of change in behavior over time, and the study of group similarities and differences in the enactment and perception of behavior. This article describes conceptual and statistical considerations involved in these 3 areas of research and presents principle- and empirically based rationale for design decisions related to these issues. A unifying principle underlying these guidelines is the need for careful consideration of fit between theory, research questions, selection of coding systems, and creation of coding teams. Implications of (mis)fit for the advancement of theory are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Temporal and Rate Coding for Discrete Event Sequences in the Hippocampus.
Terada, Satoshi; Sakurai, Yoshio; Nakahara, Hiroyuki; Fujisawa, Shigeyoshi
2017-06-21
Although the hippocampus is critical to episodic memory, neuronal representations supporting this role, especially relating to nonspatial information, remain elusive. Here, we investigated rate and temporal coding of hippocampal CA1 neurons in rats performing a cue-combination task that requires the integration of sequentially provided sound and odor cues. The majority of CA1 neurons displayed sensory cue-, combination-, or choice-specific (simply, "event"-specific) elevated discharge activities, which were sustained throughout the event period. These event cells underwent transient theta phase precession at event onset, followed by sustained phase locking to the early theta phases. As a result of this unique single neuron behavior, the theta sequences of CA1 cell assemblies of the event sequences had discrete representations. These results help to update the conceptual framework for space encoding toward a more general model of episodic event representations in the hippocampus. Copyright © 2017 Elsevier Inc. All rights reserved.
FY2017 Updates to the SAS4A/SASSYS-1 Safety Analysis Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fanning, T. H.
The SAS4A/SASSYS-1 safety analysis software is used to perform deterministic analysis of anticipated events as well as design-basis and beyond-design-basis accidents for advanced fast reactors. It plays a central role in the analysis of U.S. DOE conceptual designs, proposed test and demonstration reactors, and in domestic and international collaborations. This report summarizes the code development activities that have taken place during FY2017. Extensions to the void and cladding reactivity feedback models have been implemented, and Control System capabilities have been improved through a new virtual data acquisition system for plant state variables and an additional Block Signal for a variable lag compensator to represent reactivity feedback for novel shutdown devices. Current code development and maintenance needs are also summarized in three key areas: software quality assurance, modeling improvements, and maintenance of related tools. With ongoing support, SAS4A/SASSYS-1 can continue to fulfill its growing role in fast reactor safety analysis and help solidify DOE’s leadership role in fast reactor safety both domestically and in international collaborations.
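As an illustration of the kind of dynamics a lag-compensator block represents, a generic discrete first-order lag can be written as follows; this is an assumed textbook filter, not the SAS4A/SASSYS-1 Block Signal implementation.

```python
def first_order_lag(inputs, tau, dt, y0=0.0):
    """Discrete first-order lag y' = (u - y) / tau, integrated with
    backward Euler so the update is stable for any time step dt > 0.
    Returns the filtered output sequence."""
    y, out = y0, []
    for u in inputs:
        y = (y + dt * u / tau) / (1.0 + dt / tau)
        out.append(y)
    return out
```

For a step input the output rises toward the input value with time constant tau, which is the qualitative behavior a lag block contributes when shaping a feedback signal.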
Zyvoloski, G.; Kwicklis, E.; Eddebbarh, A.-A.; Arnold, B.; Faunt, C.; Robinson, B.A.
2003-01-01
This paper presents several different conceptual models of the Large Hydraulic Gradient (LHG) region north of Yucca Mountain and describes the impact of those models on groundwater flow near the potential high-level repository site. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain. This model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The numerical model is calibrated by matching available water level measurements using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM and parameter estimation software PEST) and model setup allow for efficient calibration of multiple conceptual models. Until now, the Large Hydraulic Gradient has been simulated using a low-permeability, east-west oriented feature, even though direct evidence for this feature is lacking. In addition to this model, we investigate and calibrate three additional conceptual models of the Large Hydraulic Gradient, all of which are based on a presumed zone of hydrothermal chemical alteration north of Yucca Mountain. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the potential repository that record differences in the predicted groundwater flow regime. The results show that the Large Hydraulic Gradient can be represented with the alternative conceptual models that include the hydrothermally altered zone. The predicted pathways are mildly sensitive to the choice of the conceptual model and more sensitive to the quality of calibration in the vicinity of the repository. These differences are most likely due to different degrees of fit of model to data, and do not represent important differences in hydrologic conditions for the different conceptual models. © 2002 Elsevier Science B.V. All rights reserved.
ERIC Educational Resources Information Center
Henning, Elizabeth
2012-01-01
From the field of developmental psycholinguistics and from conceptual development theory there is evidence that excessive linguistic "code-switching" in early school education may pose some hazards for the learning of young multilingual children. In this article the author addresses the issue, invoking post-Piagetian and neo-Vygotskian…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pakin, Scott
2004-05-13
A frequently reinvented wheel among network researchers is a suite of programs that test a network's performance. A problem with having umpteen versions of performance tests is that it leads to a variety in the way results are reported; colloquially, apples are often compared to oranges. Consider a bandwidth test. Does a bandwidth test run for a fixed number of iterations or a fixed length of time? Is bandwidth measured as ping-pong bandwidth (i.e., 2 * message length / round-trip time) or unidirectional throughput (N messages in one direction followed by a single acknowledgement message)? Is the acknowledgement message of minimal length or as long as the entire message? Does its length contribute to the total bandwidth? Is data sent unidirectionally or in both directions at once? How many warmup messages (if any) are sent before the timing loop? Is there a delay after the warmup messages (to give the network a chance to reclaim any scarce resources)? Are receives nonblocking (possibly allowing overlap in the NIC) or blocking? The motivation behind creating coNCePTuaL, a simple specification language designed for describing network benchmarks, is that it enables a benchmark to be described sufficiently tersely as to fit easily in a report or research paper, facilitating peer review of the experimental setup and timing measurements. Because coNCePTuaL code is simple to write, network tests can be developed and deployed with low turnaround times -- useful when the results of one test suggest a following test that should be written. Because coNCePTuaL is special-purpose, its run-time system can perform the following functions, which benchmark writers often neglect to implement:
* logging information about the environment under which the benchmark ran: operating system, CPU architecture and clock speed, timer type and resolution, etc.
* aborting a program if it takes longer than a predetermined length of time to complete
* writing measurement data and descriptive statistics to a variety of output formats, including the input formats of various graph-plotting programs
coNCePTuaL is not limited to network performance tests, however. It can also be used for network verification. That is, coNCePTuaL programs can be used to locate failed links or to determine the frequency of bit errors -- even those that may sneak past the network's CRC hardware. In addition, because coNCePTuaL is a very high-level language, the coNCePTuaL compiler's backend has a great deal of potential. It would be possible for the backend to produce a variety of target formats such as Fortran + MPI, Perl + sockets, C + a network vendor's low-level messaging layer, and so forth. It could directly manipulate a network simulator. It could feed into a graphics program to produce a space-time diagram of a coNCePTuaL program. The possibilities are endless.
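The two bandwidth definitions contrasted above can be pinned down in a few lines; this is an illustrative sketch of the metrics themselves, not coNCePTuaL code.

```python
def ping_pong_bandwidth(message_bytes, round_trip_s):
    """Ping-pong bandwidth as defined in the text: two message
    transfers (there and back) divided by the round-trip time.
    Returns bytes per second."""
    return 2 * message_bytes / round_trip_s

def unidirectional_throughput(message_bytes, n_messages, elapsed_s):
    """Unidirectional throughput: N one-way messages followed by a
    single acknowledgement, measured over the total elapsed time.
    Returns bytes per second."""
    return n_messages * message_bytes / elapsed_s
```

The two metrics can disagree substantially on the same hardware, which is exactly the apples-to-oranges problem the abstract describes: a result labeled only "bandwidth" is ambiguous without the defining formula.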
High Efficiency Nuclear Power Plants using Liquid Fluoride Thorium Reactor Technology
NASA Technical Reports Server (NTRS)
Juhasz, Albert J.; Rarick, Richard A.; Rangarajan, Rajmohan
2009-01-01
An overall system analysis approach is used to propose potential conceptual designs of advanced terrestrial nuclear power plants based on Oak Ridge National Laboratory (ORNL) Molten Salt Reactor (MSR) experience and utilizing Closed Cycle Gas Turbine (CCGT) thermal-to-electric energy conversion technology. In particular, conceptual designs for an advanced 1 GWe power plant with turbine reheat and compressor intercooling at a 950 K turbine inlet temperature (TIT), as well as near-term 100 MWe demonstration plants with TITs of 950 K and 1200 K, are presented. Power plant performance data were obtained for TITs ranging from 650 to 1300 K by use of a Closed Brayton Cycle (CBC) systems code which considered the interaction between major sub-systems, including the Liquid Fluoride Thorium Reactor (LFTR), heat source and heat sink heat exchangers, turbo-generator machinery, and an electric power generation and transmission system. Optional off-shore submarine installation of the power plant is a major consideration.
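To illustrate why cycle performance is sensitive to TIT, consider the textbook ideal closed Brayton cycle with a perfect recuperator, whose thermal efficiency is 1 - (T1/TIT) * r^((gamma-1)/gamma). This is a generic thermodynamic sketch under ideal-cycle assumptions, not the CBC systems code used in the study.

```python
def ideal_recuperated_brayton_efficiency(t_inlet_k, t_turbine_inlet_k,
                                         pressure_ratio, gamma=1.667):
    """Thermal efficiency of an ideal closed Brayton cycle with a
    perfect recuperator: eta = 1 - (T1/T3) * r**((gamma-1)/gamma),
    where T1 is compressor inlet temperature, T3 is turbine inlet
    temperature, and r is the compressor pressure ratio. The default
    gamma assumes a monatomic working gas such as helium."""
    k = (gamma - 1.0) / gamma
    return 1.0 - (t_inlet_k / t_turbine_inlet_k) * pressure_ratio ** k
```

In this idealized form, efficiency rises monotonically with TIT at fixed inlet temperature and pressure ratio, which is the trend motivating the 950 K versus 1200 K demonstration plant comparison; a real systems code would add component efficiencies, pressure losses, and recuperator effectiveness.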
Reactive transport modeling in fractured rock: A state-of-the-science review
NASA Astrophysics Data System (ADS)
MacQuarrie, Kerry T. B.; Mayer, K. Ulrich
2005-10-01
The field of reactive transport modeling has expanded significantly in the past two decades and has assisted in resolving many issues in Earth Sciences. Numerical models allow for detailed examination of coupled transport and reactions, or more general investigation of controlling processes over geologic time scales. Reactive transport models serve to provide guidance in field data collection and, in particular, enable researchers to link modeling and hydrogeochemical studies. In this state-of-the-science review, the key objectives were to examine the applicability of reactive transport codes for exploring issues of redox stability to depths of several hundred meters in sparsely fractured crystalline rock, with a focus on the Canadian Shield setting. A conceptual model of oxygen ingress and redox buffering, within a Shield environment at time and space scales relevant to nuclear waste repository performance, is developed through a review of previous research. This conceptual model describes geochemical and biological processes and mechanisms materially important to understanding redox buffering capacity and radionuclide mobility in the far-field. Consistent with this model, reactive transport codes should ideally be capable of simulating the effects of changing recharge water compositions as a result of long-term climate change, and fracture-matrix interactions that may govern water-rock interaction. Other aspects influencing the suitability of reactive transport codes include the treatment of various reaction and transport time scales, the ability to apply equilibrium or kinetic formulations simultaneously, the need to capture feedback between water-rock interactions and porosity-permeability changes, and the representation of fractured crystalline rock environments as discrete fracture or dual continuum media. A review of modern multicomponent reactive transport codes indicates a relatively high level of maturity.
Within the Yucca Mountain nuclear waste disposal program, reactive transport codes of varying complexity have been applied to investigate the migration of radionuclides and the geochemical evolution of host rock around the planned disposal facility. Through appropriate near- and far-field application of dual continuum codes, this example demonstrates how reactive transport models have been applied to assist in constraining historic water infiltration rates, interpreting the sealing of flow paths due to mineral precipitation, and investigating post-closure geochemical monitoring strategies. Natural analogue modeling studies, although few in number, are also of key importance as they allow the comparison of model results with hydrogeochemical and paleohydrogeological data over geologic time scales.
Conceptual frameworks of individual work performance: a systematic review.
Koopmans, Linda; Bernaards, Claire M; Hildebrandt, Vincent H; Schaufeli, Wilmar B; de Vet, Henrica C W; van der Beek, Allard J
2011-08-01
Individual work performance is differently conceptualized and operationalized in different disciplines. The aim of the current review was twofold: (1) identifying conceptual frameworks of individual work performance and (2) integrating these to reach a heuristic conceptual framework. A systematic review was conducted in medical, psychological, and management databases. Studies were selected independently by two researchers and included when they presented a conceptual framework of individual work performance. A total of 17 generic frameworks (applying across occupations) and 18 job-specific frameworks (applying to specific occupations) were identified. Dimensions frequently used to describe individual work performance were task performance, contextual performance, counterproductive work behavior, and adaptive performance. On the basis of the literature, a heuristic conceptual framework of individual work performance was proposed. This framework can serve as a theoretical basis for future research and practice.
ERIC Educational Resources Information Center
Sack, Jacqueline J.
2013-01-01
This article explicates the development of top-view numeric coding of 3-D cube structures within a design research project focused on 3-D visualization skills for elementary grades children. It describes children's conceptual development of 3-D cube structures using concrete models, conventional 2-D pictures and abstract top-view numeric…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curca-Tivig, Florin; Merk, Stephan; Pautz, Andreas
2007-07-01
Anticipating future needs of our customers and willing to concentrate synergies and competences existing in the company for the benefit of our customers, AREVA NP decided in 2002 to develop the next generation of coupled neutronics/core thermal-hydraulic (TH) code systems for fuel assembly and core design calculations for both PWR and BWR applications. The global CONVERGENCE project was born: after a feasibility study of one year (2002) and a conceptual phase of another year (2003), development was started at the beginning of 2004. The present paper introduces the CONVERGENCE project, presents the main features of the new code system ARCADIA® and concludes on customer benefits. ARCADIA® is designed to meet AREVA NP market and customers' requirements worldwide. Besides state-of-the-art physical modeling, numerical performance and industrial functionality, the ARCADIA® system features state-of-the-art software engineering. The new code system will bring a series of benefits for our customers: e.g. improved accuracy for heterogeneous cores (MOX/UOX, Gd...), better description of nuclide chains, and access to local neutronics/thermal-hydraulics and possibly thermal-mechanical information (3D pin-by-pin full core modeling). ARCADIA is a registered trademark of AREVA NP. (authors)
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.; Olariu, Stephen
1995-01-01
The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes. This streamlines the exchange of data between programs, reducing errors and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.
Toward Right-Fidelity Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Sinsay, Jeffrey D.; Johnson, Wayne
2010-01-01
The Aviation Advanced Design Office (ADO) of the US Army Aeroflightdynamics Directorate (AMRDEC) performs conceptual design of advanced Vertical Takeoff and Landing (VTOL) concepts in support of the Army's development and acquisition of new aviation systems. In particular, ADO engages in system synthesis to assess the impact of new technologies and their application to satisfy emerging warfighter needs and requirements. Fundamental to ADO's success in accomplishing its role is the ability to evaluate a wide array of proposed air vehicle concepts, and independently synthesize new concepts to inform Army and DoD decision makers about the tradespace in which decisions will be made (Figure 1). ADO utilizes a conceptual design (CD) process in the execution of its role. Benefiting from colocation with NASA rotorcraft researchers at the Ames Research Center, ADO and NASA have engaged in a survey of current rotorcraft PD practices and begun the process of improving those capabilities to enable effective design and development of the next generation of VTOL systems. A unique aspect of CD in ADO is the fact that actual designs developed in-house are not intended to move forward in the development process. Rather, they are used as reference points in discussions about requirements development and technology impact. The ultimate products of ADO CD efforts are technology impact assessments and specifications which guide industry design activity. The fact that both the requirement and design are variables in the tradespace adds to the complexity of the CD process. A frequent need is the ability to assess the relative "cost" of variations in requirements for a diverse set of VTOL configurations. Each of these configurations may have fundamentally different response characteristics to this requirement variation, and insight into how different requirements drive different designs is critical to what ADO attempts to provide decision makers.
The processes and tools utilized are driven by the timeline in which questions must be answered. This can range from quick "back-of-the-envelope" assessments of a configuration made in an afternoon, to more detailed tradespace explorations that can take upwards of a year to complete. A variety of spreadsheet-based tools and conceptual design codes are currently in use. The in-house-developed conceptual sizing code RC (Rotorcraft) has been the preferred tool for CD activity for a number of years. Figure 2 illustrates the long-standing coupling between RC and solid modeling tools for layout, as well as a number of ad hoc interfaces with external analyses. RC contains a sizing routine that is built around the use of momentum theory for rotors, classic finite wing theory, a referred-parameter engine model, and semi-empirical weight estimation techniques. These methods lend themselves to rapid solutions, measured in seconds and minutes. The successful use of RC, however, requires careful consideration of model input parameters and judicious comparison with existing aircraft to avoid unjustified extrapolation of results. RC is in fact the legacy of a series of codes whose development started in the early 1970s, and is best suited to the study of conventional helicopters and XV-15 style tiltrotors. Other concepts have been analyzed with RC, but typically it became necessary to modify the source code and methods for each unique configuration. Recent activity has led to the development of a new code, NASA Design and Analysis of Rotorcraft (NDARC). NDARC uses a level of analytical fidelity similar to RC's, but is built on a new framework intended to improve modularity and the ability to rapidly model a wider array of concepts. Critical to achieving this capability is the decomposition of the aircraft system into a series of fundamental components which can then be assembled to form a wide array of configurations.
The paper will provide an overview of NDARC and its capabilities.
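The momentum-theory sizing mentioned above can be illustrated with a minimal hover-power estimate. This is a generic sketch of the textbook relation, not RC's or NDARC's actual routine; the default air density and figure of merit are assumed values.

```python
import math

def hover_power_kw(thrust_n, rotor_radius_m, rho=1.225, figure_of_merit=0.75):
    """Hover power from ideal momentum theory: P_ideal = T^(3/2) / sqrt(2*rho*A).

    The figure of merit (assumed 0.75 here) scales ideal induced power up to a
    rough estimate of actual rotor power. Generic sketch, not the RC/NDARC code.
    """
    disk_area = math.pi * rotor_radius_m ** 2          # rotor disk area, m^2
    p_ideal_w = thrust_n ** 1.5 / math.sqrt(2.0 * rho * disk_area)
    return p_ideal_w / figure_of_merit / 1000.0        # convert W to kW
```

For a rotor producing 53 kN of thrust with a 4.1 m radius, this gives roughly 1.4 MW, the kind of seconds-scale answer such sizing routines are built to provide.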
Young children's coding and storage of visual and verbal material.
Perlmutter, M; Myers, N A
1975-03-01
36 preschool children (mean age 4.2 years) were each tested on 3 recognition memory lists differing in test mode (visual only, verbal only, combined visual-verbal). For one-third of the children, original list presentation was visual only, for another third, presentation was verbal only, and the final third received combined visual-verbal presentation. The subjects generally performed at a high level of correct responding. Verbal-only presentation resulted in less correct recognition than did either visual-only or combined visual-verbal presentation. However, because performances under both visual-only and combined visual-verbal presentation were statistically comparable, and a high level of spontaneous labeling was observed when items were presented only visually, a dual-processing conceptualization of memory in 4-year-olds was suggested.
High Efficiency Low Cost CO2 Compression Using Supersonic Shock Wave Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, J; Aarnio, M; Grosvenor, A
2010-12-31
Development and testing results from a supersonic compressor are presented. The compressor achieved a record pressure ratio for a fully-supersonic stage and successfully demonstrated the technology potential. Several tasks were performed in compliance with the DOE award objectives. A high-pressure-ratio compressor was retrofitted to improve rotordynamic behavior and successfully tested. An outside review panel confirmed test results and the design approach. A computational fluid dynamics code used to analyze the Ramgen supersonic flowpath was extensively and successfully modified to improve use on high-performance computing platforms. A comprehensive R&D implementation plan was developed and used to lay the groundwork for a future full-scale compressor demonstration. A conceptual design for a CO2 demonstration compressor was developed and reviewed.
Conceptual design study of small long-life PWR based on thorium cycle fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Subkhi, M. Nurul; Su'ud, Zaki; Waris, Abdul
2014-09-30
The neutronic performance of a small long-life Pressurized Water Reactor (PWR) using thorium-cycle-based fuel has been investigated. The thorium cycle, which has a higher conversion ratio in the thermal region than the uranium cycle, produces a significant amount of 233U during burnup. The cell burnup calculations were performed with the PIJ SRAC code using a nuclear data library based on JENDL 3.3, while the multi-energy-group diffusion calculations were optimized for the whole core in cylindrical two-dimensional R-Z geometry with SRAC-CITATION. This study introduces a thorium nitride fuel system in which ZIRLO is the cladding material. The optimization of the 350 MWt small long-life PWR results in small excess reactivity and reduced power peaking during its operation.
ERIC Educational Resources Information Center
Lee, Jang Ho
2012-01-01
This paper concerns the conceptual and pedagogical issues that revolve around target language (TL) only instruction and teacher code-switching in the context of TL classrooms. To this end, I first examine four intertwined ideas (that is, monolingualism, naturalism, native-speakerism, and absolutism) that run through the monolingual approach to TL…
Simultaneous Semi-Distributed Model Calibration Guided by ...
Modelling approaches to transfer hydrologically-relevant information from locations with streamflow measurements to locations without such measurements continue to be an active field of research for hydrologists. The Pacific Northwest Hydrologic Landscapes (PNW HL) provide a solid conceptual classification framework based on our understanding of dominant processes. A Hydrologic Landscape code (a 5-letter descriptor based on physical and climatic properties) describes each assessment unit, and these units average 60 km2 in area. The core function of these HL codes is to relate and transfer hydrologically meaningful information between watersheds without the need for streamflow time series. We present a novel approach based on the HL framework to answer the question “How can we calibrate models across separate watersheds simultaneously, guided by our understanding of dominant processes?“. We should be able to apply the same parameterizations to assessment units of common HL codes if 1) the Hydrologic Landscapes contain hydrologic information transferable between watersheds at a sub-watershed scale and 2) we use a conceptual hydrologic model and parameters that reflect the hydrologic behavior of a watershed. This work specifically tests the ability or inability to use HL codes to inform and share model parameters across watersheds in the Pacific Northwest. EPA’s Western Ecology Division has published and is refining a framework for defining la
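The simultaneous-calibration idea described above can be sketched as a pooled objective: every assessment unit carrying the same HL code shares one parameter set, and the calibration score sums errors across all watersheds at once. All names here (`pooled_sse`, `simulate`) are illustrative, not from the study.

```python
def pooled_sse(units, params_by_hl, simulate, observed):
    """Pooled calibration objective: squared errors summed over all assessment
    units, where every unit with the same HL code shares one parameter set.

    units: {unit_id: hl_code}; simulate(unit_id, params) -> simulated series;
    observed: {unit_id: observed series}. A hypothetical sketch of the setup.
    """
    sse = 0.0
    for unit_id, hl_code in units.items():
        sim = simulate(unit_id, params_by_hl[hl_code])      # shared parameters
        sse += sum((s - o) ** 2 for s, o in zip(sim, observed[unit_id]))
    return sse
```

An optimizer minimizing this quantity over `params_by_hl` calibrates all watersheds simultaneously, which is what makes the HL codes, rather than individual streamflow records, carry the transferable information.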
What does music express? Basic emotions and beyond.
Juslin, Patrik N
2013-01-01
Numerous studies have investigated whether music can reliably convey emotions to listeners, and-if so-what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of "multiple layers" of musical expression of emotions. The "core" layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this "core" layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions-though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions.
Modelling guidelines--terminology and guiding principles
NASA Astrophysics Data System (ADS)
Refsgaard, Jens Christian; Henriksen, Hans Jørgen
2004-01-01
Some scientists argue, with reference to Popper's scientific philosophical school, that models cannot be verified or validated. Other scientists and many practitioners nevertheless use these terms, but with very different meanings. As a result of an increasing number of examples of model malpractice and mistrust of the credibility of models, several modelling guidelines have been elaborated in recent years with the aim of improving the quality of modelling studies. This gap between the views and the lack of consensus experienced in the scientific community, and the strongly perceived need for commonly agreed modelling guidelines, is constraining the optimal use and benefits of models. This paper proposes a framework for quality assurance guidelines, including a consistent terminology and a foundation for a methodology bridging the gap between scientific philosophy and pragmatic modelling. A distinction is made between the conceptual model, the model code and the site-specific model. A conceptual model is subject to confirmation or falsification like scientific theories. A model code may be verified within given ranges of applicability and ranges of accuracy, but it can never be universally verified. Similarly, a model may be validated, but only with reference to site-specific applications and to pre-specified performance (accuracy) criteria. Thus, a model's validity will always be limited in terms of space, time, boundary conditions and types of application. This implies a continuous interaction between manager and modeller in order to establish suitable accuracy criteria and predictions associated with uncertainty analysis.
Catts, Stanley V; Frost, Aaron D J; O'Toole, Brian I; Carr, Vaughan J; Lewin, Terry; Neil, Amanda L; Harris, Meredith G; Evans, Russell W; Crissman, Belinda R; Eadie, Kathy
2011-01-01
Clinical practice improvement carried out in a quality assurance framework relies on routinely collected data using clinical indicators. Herein we describe the development, minimum training requirements, and inter-rater agreement of indicators that were used in an Australian multi-site evaluation of the effectiveness of early psychosis (EP) teams. Surveys of clinician opinion and face-to-face consensus-building meetings were used to select and conceptually define indicators. Operationalization of definitions was achieved by iterative refinement until clinicians could be quickly trained to code indicators reliably. Calculation of percentage agreement with expert consensus coding was based on ratings of paper-based clinical vignettes embedded in a 2-h clinician training package. Consensually agreed upon conceptual definitions for seven clinical indicators judged most relevant to evaluating EP teams were operationalized for ease-of-training. Brief training enabled typical clinicians to code indicators with acceptable percentage agreement (60% to 86%). For indicators of suicide risk, psychosocial function, and family functioning this level of agreement was only possible with less precise 'broad range' expert consensus scores. Estimated kappa values indicated fair to good inter-rater reliability (kappa > 0.65). Inspection of contingency tables (coding category by health service) and modal scores across services suggested consistent, unbiased coding across services. Clinicians are able to agree upon what information is essential to routinely evaluate clinical practice. Simple indicators of this information can be designed and coding rules can be reliably applied to written vignettes after brief training. The real world feasibility of the indicators remains to be tested in field trials.
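The chance-corrected agreement reported above (kappa > 0.65, "fair to good") follows the standard Cohen's kappa formula: observed agreement corrected by the agreement expected from each rater's marginal frequencies. A minimal implementation of that standard formula, not the authors' specific estimation procedure:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters coding the same items with nominal codes.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is chance agreement from the raters' marginal code frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)
```

Perfect agreement yields 1.0, while agreement at exactly chance level yields 0.0, which is why a raw 60% to 86% percentage agreement and a kappa above 0.65 can both be reported for the same coding exercise.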
Conceptual Design and Analysis of Cold Mass Support of the CS3U Feeder for the ITER
NASA Astrophysics Data System (ADS)
Zhu, Yinfeng; Song, Yuntao; Zhang, Yuanbin; Wang, Zhongwei
2013-06-01
In the International Thermonuclear Experimental Reactor (ITER) project, the feeders are among the most important and critical systems. To convey the power supply and the coolant for the central solenoid (CS) magnet, 6 sets of CS feeders are employed, which consist mainly of an in-cryostat feeder (ICF), a cryostat feed-through (CFT), an S-bend box (SBB), and a coil terminal box (CTB). To compensate for the displacements of the internal components of the CS feeders during operation, sliding cold mass supports consisting of a sled plate, a cylindrical support, a thermal shield, and an external ring are developed. To check the strength of the developed cold mass supports of the CS3U feeder, electromagnetic analysis of the two superconducting busbars is performed by using the CATIA V5 and ANSYS codes based on parametric technology. Furthermore, thermal-structural coupling analysis is performed based on the obtained results; except at stress concentrations, the maximum stress intensity is lower than the allowable stress of the selected material. It is found that the conceptual design of the cold mass support can satisfy the required functions under the worst case of normal working conditions. All these activities will provide a firm technical basis for the engineering design and development of cold mass supports.
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1992-01-01
The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual-level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, resulting in streamlined data exchange between programs, reduced errors, and improved efficiency.
Evidence for multiple, distinct representations of the human body.
Schwoebel, John; Coslett, H Branch
2005-04-01
Previous data from single-case and small group studies have suggested distinctions among structural, conceptual, and online sensorimotor representations of the human body. We developed a battery of tasks to further examine the prevalence and anatomic substrates of these body representations. The battery was administered to 70 stroke patients. Fifty-one percent of the patients were impaired relative to controls on at least one body representation measure. Further, principal components analysis of the patient data as well as direct comparisons of patient and control performance suggested a triple dissociation between measures of the 3 putative body representations. Consistent with previous distinctions between the "what" and "how" pathways, lesions of the left temporal lobe were most consistently associated with impaired performance on tasks assessing knowledge of the shape or lexical-semantic information about the body, whereas lesions of the dorsolateral frontal and parietal regions resulted in impaired performance on tasks requiring on-line coding of body posture.
Leth-Steensen, Craig; Citta, Richie
2016-01-01
Performance in numerical classification tasks involving either parity or magnitude judgements is quicker when small numbers are mapped onto a left-sided response and large numbers onto a right-sided response than for the opposite mapping (i.e., the spatial-numerical association of response codes or SNARC effect). Recent research by Gevers et al. [Gevers, W., Santens, S., Dhooge, E., Chen, Q., Van den Bossche, L., Fias, W., & Verguts, T. (2010). Verbal-spatial and visuospatial coding of number-space interactions. Journal of Experimental Psychology: General, 139, 180-190] suggests that this effect also arises for vocal "left" and "right" responding, indicating that verbal-spatial coding has a role to play in determining it. Another presumably verbal-based, spatial-numerical mapping phenomenon is the linguistic markedness association of response codes (MARC) effect whereby responding in parity tasks is quicker when odd numbers are mapped onto left-sided responses and even numbers onto right-sided responses. A recent account of both the SNARC and MARC effects is based on the polarity correspondence principle [Proctor, R. W., & Cho, Y. S. (2006). Polarity correspondence: A general principle for performance of speeded binary classification tasks. Psychological Bulletin, 132, 416-442]. This account assumes that stimulus and response alternatives are coded along any number of dimensions in terms of - and + polarities with quicker responding when the polarity codes for the stimulus and the response correspond. In the present study, even-odd parity judgements were made using either "left" and "right" or "bad" and "good" vocal responses. Results indicated that a SNARC effect was indeed present for the former type of vocal responding, providing further evidence for the sufficiency of the verbal-spatial coding account for this effect. 
However, the decided lack of an analogous SNARC-like effect in the results for the latter type of vocal responding provides an important constraint on the presumed generality of the polarity correspondence account. On the other hand, the presence of robust MARC effects for "bad" and "good" but not "left" and "right" vocal responses is consistent with the view that such effects are due to conceptual associations between semantic codes for odd-even and bad-good (but not necessarily left-right).
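SNARC effects of the kind discussed above are conventionally quantified as the regression slope of the right-minus-left response-time difference on number magnitude, with a negative slope indicating the effect. A minimal sketch of that conventional analysis (the data shapes are illustrative, not the study's):

```python
def snarc_slope(numbers, rt_left_ms, rt_right_ms):
    """Unstandardized least-squares slope of dRT = RT_right - RT_left on magnitude.

    A reliably negative slope (right responses relatively faster for large
    numbers) is the classic SNARC signature. Illustrative sketch only.
    """
    drt = [r - l for l, r in zip(rt_left_ms, rt_right_ms)]
    n = len(numbers)
    mean_x = sum(numbers) / n
    mean_y = sum(drt) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(numbers, drt))
    var = sum((x - mean_x) ** 2 for x in numbers)
    return cov / var          # ms per unit of magnitude
```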
Three-dimensional modeling of flow through fractured tuff at Fran Ridge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eaton, R.R.; Ho, C.K.; Glass, RJ.
1996-09-01
Numerical studies have been made of an infiltration experiment at Fran Ridge using the TOUGH2 code to aid in the selection of computational models for performance assessment. The exercise investigates the capabilities of TOUGH2 to model transient flows through highly fractured tuff and provides a possible means of calibration. Two distinctly different conceptual models were used in the TOUGH2 code: the dual permeability model and the equivalent continuum model. The infiltration test modeled involved the infiltration of dyed ponded water for 36 minutes. The 205 gallon infiltration of water observed in the experiment was subsequently modeled using measured Fran Ridge fracture frequencies and a specified fracture aperture of 285 µm. The dual permeability formulation predicted considerable infiltration along the fracture network, which was in agreement with the experimental observations. As expected, minimal fracture penetration of the infiltrating water was calculated using the equivalent continuum model, thus demonstrating that this model is not appropriate for modeling the highly transient experiment. It is therefore recommended that the dual permeability model be given priority when computing high-flux infiltration for use in performance assessment studies.
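For context on why a 285 µm aperture dominates transient flow, the standard parallel-plate (cubic-law) idealization assigns a fracture an intrinsic permeability of b²/12, which is enormous compared with typical tuff matrix permeability. This is the textbook relation; the abstract does not state how TOUGH2 was actually parameterized.

```python
def fracture_permeability_m2(aperture_m):
    """Parallel-plate (cubic-law) intrinsic permeability of a single fracture.

    k_f = b^2 / 12 for a smooth fracture of aperture b; a standard
    idealization, not the specific TOUGH2 input used in the study.
    """
    return aperture_m ** 2 / 12.0
```

For b = 285 µm this gives about 6.8e-9 m², many orders of magnitude above welded-tuff matrix values, which is consistent with the dual permeability model's prediction of fracture-dominated infiltration.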
Design Oriented Structural Modeling for Airplane Conceptual Design Optimization
NASA Technical Reports Server (NTRS)
Livne, Eli
1999-01-01
The main goal for research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued, or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since responses to changes in geometry are essential in the conceptual design of airplanes, as well as the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACSYNT was delivered to NASA Ames.
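The statistical weight equations described above are typically power laws fit in log-log space to data from existing airplanes. A generic sketch of such a regression fit (not any specific published airframe weight equation):

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = a * x**b by linear regression in log-log space.

    This mirrors the form of statistical weight-estimation equations; the
    variables and data here are purely illustrative.
    """
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mean_x, mean_y = sum(lx) / n, sum(ly) / n
    b = (sum((u - mean_x) * (v - mean_y) for u, v in zip(lx, ly))
         / sum((u - mean_x) ** 2 for u in lx))
    a = math.exp(mean_y - b * mean_x)
    return a, b
```

The fitted exponent and coefficient only encode the structural technology of the airplanes in the database, which is precisely why such fits break down for new technologies or unconventional configurations, as the abstract argues.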
Rapid Assessment of Agility for Conceptual Design Synthesis
NASA Technical Reports Server (NTRS)
Biezad, Daniel J.
1996-01-01
This project consists of designing and implementing a real-time graphical interface for a workstation-based flight simulator. It is capable of creating a three-dimensional out-the-window scene of the aircraft's flying environment, with extensive information about the aircraft's state displayed in the form of a heads-up-display (HUD) overlay. The code, written in the C programming language, makes calls to Silicon Graphics' Graphics Library (GL) to draw the graphics primitives. Included in this report is a detailed description of the capabilities of the code, including graphical examples, as well as a printout of the code itself.
Semantic deficits in Spanish-English bilingual children with language impairment.
Sheng, Li; Peña, Elizabeth D; Bedore, Lisa M; Fiestas, Christine E
2012-02-01
To examine the nature and extent of semantic deficits in bilingual children with language impairment (LI). Thirty-seven Spanish-English bilingual children with LI (ranging from age 7;0 [years;months] to 9;10) and 37 typically developing (TD) age-matched peers generated 3 associations to 12 pairs of translation equivalents in English and Spanish. Responses were coded as paradigmatic (e.g., dinner-lunch, cena-desayuno [dinner-breakfast]), syntagmatic (e.g., delicious-pizza, delicioso-frijoles [delicious-beans]), and errors (e.g., wearing-where, vestirse-mal [to get dressed-bad]). A semantic depth score was derived in each language and conceptually by combining children's performance in both languages. The LI group achieved significantly lower semantic depth scores than the TD group after controlling for group differences in vocabulary size. Children showed higher conceptual scores than single-language scores. Both groups showed decreases in semantic depth scores across multiple elicitations. Analyses of individual performances indicated that semantic deficits (1 SD below the TD mean semantic depth score) were manifested in 65% of the children with LI and in 14% of the TD children. School-age bilingual children with and without LI demonstrated spreading activation of semantic networks. Consistent with the literature on monolingual children with LI, sparsely linked semantic networks characterize a considerable proportion of bilingual children with LI.
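The conceptual scoring described above credits a concept demonstrated in either language, which is why conceptual scores exceed single-language scores. A minimal illustration of that scoring logic (concept identifiers are hypothetical):

```python
def conceptual_score(english_correct, spanish_correct):
    """Conceptual scoring: a concept earns credit if demonstrated in EITHER language.

    Inputs are sets of language-neutral concept IDs; a sketch of the scoring
    logic described in the abstract, not the study's actual coding scheme.
    """
    return len(set(english_correct) | set(spanish_correct))
```

Because the union can only be at least as large as either set, the conceptual score is guaranteed to be greater than or equal to both single-language scores.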
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. Harrington
2004-10-25
The purpose of this model report is to provide documentation of the conceptual and mathematical model (Ashplume) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. These aspects of volcanism-related dose calculation are described in the context of the entire igneous disruptive events conceptual model in ''Characterize Framework for Igneous Activity'' (BSC 2004 [DIRS 169989], Section 6.1.1). The Ashplume conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The Ashplume mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report update the previous documentation of the Ashplume mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model. In this report, ''Ashplume'' is used when referring to the atmospheric dispersal model and ''ASHPLUME'' is used when referencing the code of that model. Two analysis and model reports provide direct inputs to this model report, namely ''Characterize Eruptive Processes at Yucca Mountain, Nevada'' and ''Number of Waste Packages Hit by Igneous Intrusion''.
This model report provides direct inputs to the TSPA, which uses the ASHPLUME software described and used in this model report. Thus, ASHPLUME software inputs are inputs to this model report for ASHPLUME runs in this model report. However, ASHPLUME software inputs are outputs of this model report for ASHPLUME runs by TSPA.
Anthony, Samantha J; Selkirk, Enid; Sung, Lillian; Klaassen, Robert J; Dix, David; Scheinemann, Katrin; Klassen, Anne F
2014-04-01
An appraisal of pediatric cancer-specific quality-of-life (QOL) instruments revealed a lack of clarity about what constitutes QOL in this population. This study addresses this concern by identifying the concepts that underpin the construct of QOL as determined by a content analysis of all patient-reported outcome (PRO) instruments used in childhood cancer research. A systematic review was performed of key databases (i.e., MEDLINE, CINAHL, PsychINFO) to identify studies of QOL in children with cancer. A content analysis process was used to code and categorize all items from generic and cancer-specific PRO instruments. Our objective was to provide clarification regarding the conceptual underpinnings of these instruments, as well as to help inform the development of theory and contribute to building a conceptual framework of QOL for children with cancer. A total of 6,013 English language articles were screened, identifying 148 studies. Ten generic and ten cancer-specific PRO instruments provided 957 items. Content analysis led to the identification of four major domains of QOL (physical, psychological, social, and general health), with 11 subdomains covering 98 different concepts. While all instruments reflected items relating to the broader domains of QOL, there was substantial heterogeneity in terms of the content and variability in the distribution of items. This systematic review and the proposed model represent a useful starting point in the critical appraisal of the conceptual underpinnings of PRO instruments used in pediatric oncology and contribute to the need to place such tools under a critical, yet reflective and analytical lens.
Sensemaking: Conceptualizing and Coding for “Good” Student Reasoning
NASA Astrophysics Data System (ADS)
Elby, Andrew; Scherr, R.; Bing, T.
2006-12-01
Physics instructors’ goals often go beyond improving students’ conceptual understanding and problem solving. Instructors also want students to engage in inquiry, become scientific/critical thinkers, understand the scientific process, and so on. We see two problems with these “non-content” goals. First, notions such as inquiry and scientific thinking are often defined vaguely or inconsistently across the literature. Second, even when like-minded instructors share a vision of what we’d love to see our students do, descriptions of that vision are often too squishy to communicate, debate, or assess: “We know it when we see it!” In this talk and poster, we address these problems by introducing sensemaking vs. answermaking, two mindsets with which students can approach physics. Our definitions of those notions benefit from a theoretical base, and our coding scheme for sensemaking vs. answermaking displays high interrater reliability and rests upon a list of specific indicators.
Conceptual Design of a 100kW Energy Integrated Type Bi-Directional Tidal Current Turbine
NASA Astrophysics Data System (ADS)
Kim, Ki Pyoung; Ahmed, M. Rafiuddin; Lee, Young Ho
2010-06-01
The development of a tidal current turbine that can extract maximum energy from the tidal current will be extremely beneficial for supplying continuous electric power. The present paper presents a conceptual design of a 100kW energy integrated type tidal current turbine for tidal power generation. The instantaneous power density of a flowing fluid incident on an underwater turbine is proportional to the cube of the current velocity, which is approximately 2.5 m/s. A cross-flow turbine, provided with a nozzle and a diffuser, is designed and analyzed. The potential advantages of ducted and diffuser-augmented turbines were taken into consideration in order to achieve higher output at a relatively low speed. This study looks at a cross-flow turbine system which is placed in an augmentation channel to generate electricity bi-directionally. The compatibility of this turbine system is verified using a commercial CFD code, ANSYS CFX. This paper presents the results of the numerical analysis in terms of pressure, streaklines, velocity vectors and performance curves for the energy integrated type bi-directional tidal current turbine (BDT) with augmentation.
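The cubic dependence of power on current velocity can be made concrete with the standard kinetic-flux relation P = Cp · ½ρAv³. The power coefficient and seawater density below are assumed values for illustration, not figures from the paper.

```python
def tidal_power_kw(area_m2, v_mps, rho=1025.0, cp=0.30):
    """Extracted power from a tidal stream: P = Cp * 0.5 * rho * A * v^3.

    rho (seawater density) and cp (power coefficient) are assumed defaults,
    not values given in the paper. Returns power in kW.
    """
    return cp * 0.5 * rho * area_m2 * v_mps ** 3 / 1000.0  # W -> kW
```

At 2.5 m/s, roughly 42 m² of capture area would deliver 100 kW under these assumptions, and halving the velocity cuts the available power by a factor of eight, which is why the cubic sensitivity dominates site selection.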
Compressor Study to Meet Large Civil Tilt Rotor Engine Requirements
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
2009-01-01
A vehicle concept study has been made to meet the requirements of the Large Civil Tilt Rotorcraft vehicle mission. A vehicle concept was determined, and a notional turboshaft engine system study was conducted. The engine study defined requirements for the major engine components, including the compressor. The compressor design-point goal was to deliver a pressure ratio of 31:1 at an inlet weight flow of 28.4 lbm/sec. A mean-line compressor flow analysis and design code was used to perform conceptual designs of two potential compressor configurations that meet the design requirement. The first configuration is an eight-stage axial compressor. One challenge of the all-axial compressor is the small blade span of the rear-block stages, 0.28 in., which results in a last-stage blade tip clearance-to-span ratio of 2.4%. The second configuration is a seven-stage axial compressor with a centrifugal stage having a 0.28-in. impeller-exit blade span. The compressors' conceptual designs helped estimate the flow path dimensions, rotor leading and trailing edge blade angles, flow conditions, and velocity triangles for each stage.
Epigenomics and the concept of degeneracy in biological systems
Mason, Paul H.; Barron, Andrew B.
2014-01-01
Researchers in the field of epigenomics are developing more nuanced understandings of biological complexity, and exploring the multiple pathways that lead to phenotypic expression. The concept of degeneracy—referring to the multiple pathways that a system recruits to achieve functional plasticity—is an important conceptual accompaniment to the growing body of knowledge in epigenomics. Distinct from degradation, redundancy and dilapidation, degeneracy refers to the plasticity of traits whose functions overlap in some environments but diverge in others. While a redundant system is composed of repeated identical elements performing the same function, a degenerate system is composed of different elements performing similar or overlapping functions. Here, we describe the degenerate structure of gene regulatory systems from the basic genetic code to flexible epigenomic modifications, and discuss how these structural features have contributed to organism complexity, robustness, plasticity and evolvability. PMID:24335757
Tremblay, Marie-Claude; Martin, Debbie H; Macaulay, Ann C; Pluye, Pierre
2017-06-01
A long-standing challenge in community-based participatory research (CBPR) has been to anchor practice and evaluation in a relevant and comprehensive theoretical framework of community change. This study describes the development of a multidimensional conceptual framework that builds on social movement theories to identify key components of CBPR processes. Framework synthesis was used as a general literature search and analysis strategy. An initial conceptual framework was developed from the theoretical literature on social movements. A literature search performed to identify illustrative CBPR projects yielded 635 potentially relevant documents, from which eight projects (corresponding to 58 publications) were retained after record and full-text screening. Framework synthesis was used to code and organize data from these projects, ultimately providing a refined framework. The final conceptual framework maps key concepts of CBPR mobilization processes, such as the pivotal role of the partnership; resources and opportunities as necessary components feeding the partnership's development; the importance of framing processes; and a tight alignment between the cause (the partnership's goal), the collective action strategy, and the system changes targeted. The revised framework provides a context-specific model to generate a new, innovative understanding of CBPR mobilization processes, drawing on existing theoretical foundations. © 2017 The Authors American Journal of Community Psychology published by Wiley Periodicals, Inc. on behalf of Society for Community Research and Action.
Comprehensive analysis of transport aircraft flight performance
NASA Astrophysics Data System (ADS)
Filippone, Antonio
2008-04-01
This paper reviews the state of the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise.
A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance, atmospheric effects, economic Mach number and noise trajectories at F.A.R. landing points.
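As an aside for readers, one of the performance metrics used in such comprehensive codes, specific air range (distance flown per unit mass of fuel burned), reduces to a simple ratio of true airspeed to fuel flow; the cruise numbers below are illustrative assumptions, not values from the study.

```python
def specific_air_range(tas_mps, fuel_flow_kgps):
    """Specific air range: distance covered per kg of fuel burned (m/kg)."""
    return tas_mps / fuel_flow_kgps

# Assumed cruise numbers, for illustration only (not from the study):
# ~250 m/s true airspeed and ~2.0 kg/s total fuel flow.
sar = specific_air_range(250.0, 2.0)
print(f"SAR = {sar:.0f} m/kg")  # prints "SAR = 125 m/kg"
```

A sensitivity analysis of the kind the abstract mentions would then perturb drag (which raises fuel flow at fixed speed) and observe the resulting change in this ratio.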
NASA Technical Reports Server (NTRS)
Martinovic, Zoran N.; Cerro, Jeffrey A.
2002-01-01
This is an interim user's manual for current procedures used in the Vehicle Analysis Branch at NASA Langley Research Center, Hampton, Virginia, for launch vehicle structural subsystem weight estimation based on finite element modeling and structural analysis. The process is intended to complement traditional methods of conceptual and early preliminary structural design such as the application of empirical weight estimation or application of classical engineering design equations and criteria on one-dimensional "line" models. Functions of two commercially available software codes are coupled together. Vehicle modeling and analysis are done using SDRC/I-DEAS, and structural sizing is performed with the Collier Research Corp. HyperSizer program.
Simulation of radiation environment for the LHeC detector
NASA Astrophysics Data System (ADS)
Nayaz, Abdullah; Piliçer, Ercan; Joya, Musa
2017-02-01
The detector response and radiation environment for the Large Hadron electron Collider (LHeC) baseline detector are simulated to predict its performance over the lifetime of the project. In this work, the geometry of the LHeC detector, as reported in the LHeC Conceptual Design Report (CDR), was built in the FLUKA Monte Carlo tool in order to simulate the detector response and radiation environment. For this purpose, electrons and protons with sufficiently high energy were sent isotropically from the interaction point of the detector. As a result, the detector response and radiation background for the LHeC detector, scored with different USRBIN quantities (ENERGY, HADGT20M, ALL-CHAR, ALL-PART) in FLUKA, are presented.
Optimization of small long-life PWR based on thorium fuel
NASA Astrophysics Data System (ADS)
Subkhi, Moh Nurul; Suud, Zaki; Waris, Abdul; Permana, Sidik
2015-09-01
A conceptual design of a small long-life Pressurized Water Reactor (PWR) using thorium fuel has been investigated from the neutronic aspect. The cell burnup calculations were performed by the PIJ routine of the SRAC code using a nuclear data library based on JENDL 3.2, while the multi-energy-group diffusion calculations were performed in three-dimensional X-Y-Z core geometry by COREBN. The excess reactivity of thorium nitride fuel with ZIRLO cladding is evaluated over 5 years of burnup without refueling. Optimizations of the 350 MWe long-life PWR based on 5% 233U & 2.8% 231Pa, 6% 233U & 2.8% 231Pa, and 7% 233U & 6% 231Pa all yield low excess reactivity.
What does music express? Basic emotions and beyond
Juslin, Patrik N.
2013-01-01
Numerous studies have investigated whether music can reliably convey emotions to listeners, and—if so—what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of “multiple layers” of musical expression of emotions. The “core” layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this “core” layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions—though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions. PMID:24046758
How collaboration in therapy becomes therapeutic: the therapeutic collaboration coding system.
Ribeiro, Eugénia; Ribeiro, António P; Gonçalves, Miguel M; Horvath, Adam O; Stiles, William B
2013-09-01
The quality and strength of the therapeutic collaboration, the core of the alliance, is reliably associated with positive therapy outcomes. The urgent challenge for clinicians and researchers is constructing a conceptual framework to integrate the dialectical work that fosters collaboration, with a model of how clients make progress in therapy. We propose a conceptual account of how collaboration in therapy becomes therapeutic. In addition, we report on the construction of a coding system - the therapeutic collaboration coding system (TCCS) - designed to analyse and track on a moment-by-moment basis the interaction between therapist and client. Preliminary evidence is presented regarding the coding system's psychometric properties. The TCCS evaluates each speaking turn and assesses whether and how therapists are working within the client's therapeutic zone of proximal development, defined as the space between the client's actual therapeutic developmental level and their potential developmental level that can be reached in collaboration with the therapist. We applied the TCCS to five cases: a good and a poor outcome case of narrative therapy, a good and a poor outcome case of cognitive-behavioural therapy, and a dropout case of narrative therapy. The TCCS offers markers that may help researchers better understand the therapeutic collaboration on a moment-to-moment basis and may help therapists better regulate the relationship. © 2012 The British Psychological Society.
Conceptual Design of a Two Spool Compressor for the NASA Large Civil Tilt Rotor Engine
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Thurman, Douglas R.
2010-01-01
This paper focuses on the conceptual design of a two-spool compressor for the NASA Large Civil Tilt Rotor engine, which has a design-point pressure ratio goal of 30:1 and an inlet weight flow of 30.0 lbm/sec. The compressor notional design requirements of pressure ratio and low-pressure compressor (LPC) and high-pressure compressor (HPC) work split were based on a previous engine system study to meet the mission requirements of the NASA Subsonic Rotary Wing Project's Large Civil Tilt Rotor vehicle concept. Three mean-line compressor design and flow analysis codes were utilized for the conceptual design of a two-spool compressor configuration. This study assesses the technical challenges of designing various compressor configuration options to meet the given engine cycle results. In the process of sizing, the technical challenges of the compressor became apparent as the aerodynamics were taken into consideration. Mechanical constraints were considered in the study, such as maximum rotor tip speeds and conceptual sizing of rotor disks and shafts. The rotor clearance-to-span ratio in the last stage of the LPC is 1.5% and in the last stage of the HPC is 2.8%. Four different configurations to meet the HPC requirements were studied, ranging from a single centrifugal stage, through two axi-centrifugal arrangements, to all-axial stages. Challenges of the HPC design include the high exit temperature (1,560 °R), which could limit the maximum allowable peripheral tip speed for centrifugals and is dependent on material selection. The mean-line design also resulted in the definition of the flow path geometry of the axial and centrifugal compressor stages, rotor and stator vane angles, velocity components, and flow conditions at the leading and trailing edges of each blade row at the hub, mean and tip.
A mean-line compressor analysis code was used to estimate the compressor performance maps at off-design speeds and to determine the required variable-geometry reset schedules of the inlet guide vane and variable stators. These schedules keep the transonic stages aerodynamically matched with high efficiency and acceptable stall margins, based on user-specified maximum levels of rotor diffusion factor and relative velocity ratio.
Turbopump Design and Analysis Approach for Nuclear Thermal Rockets
NASA Technical Reports Server (NTRS)
Chen, Shu-cheng S.; Veres, Joseph P.; Fittje, James E.
2006-01-01
A rocket propulsion system, whether it is a chemical rocket or a nuclear thermal rocket, is fairly complex in detail but rather simple in principle. Among all the interacting parts, three components stand out: the pumps, the turbines (together, the turbopumps), and the thrust chamber. To obtain an understanding of the overall rocket propulsion system characteristics, one starts from analyzing the interactions among these three components. It is therefore of utmost importance to be able to satisfactorily characterize the turbopump, level by level, at all phases of a vehicle design cycle. Here at NASA Glenn Research Center, as the starting phase of a rocket engine design, specifically a Nuclear Thermal Rocket Engine design, we adopted the approach of using a high-level system cycle analysis code (NESS) to obtain an initial analysis of the operational characteristics of the turbopump required in the propulsion system. A set of turbopump design codes (PumpDes and TurbDes) was then executed to obtain sizing and performance characteristics of the turbopump that were consistent with the mission requirements. A set of turbopump analysis codes (PUMPA and TURBA) was applied to obtain the full performance map for each of the turbopump components; a two-dimensional layout of the turbopump based on these mean-line analyses was also generated. Adequacy of the turbopump conceptual design will later be determined by further analyses and evaluation. In this paper, descriptions and discussions of the aforementioned approach are provided and future outlooks are discussed.
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification establishes whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information.
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
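The verification pattern described above, comparing numerical output against a closed-form analytical solution and checking that the error shrinks under grid refinement, can be illustrated on a toy 1-D heat conduction problem. This is a sketch only: it is not PFLOTRAN code, and the problem setup is assumed for illustration.

```python
import math

def solve_heat_explicit(nx, nt, t_end, alpha=1.0):
    """Explicit finite-difference solution of u_t = alpha * u_xx on [0, 1],
    with u = sin(pi x) initially and u = 0 at both boundaries.
    Returns the grid points and the solution at t_end."""
    dx = 1.0 / (nx - 1)
    dt = t_end / nt
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit violated"
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(nt):
        u = [0.0] + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                     for i in range(1, nx - 1)] + [0.0]
    return [i * dx for i in range(nx)], u

def analytical(x, t, alpha=1.0):
    """Exact solution: the sin(pi x) mode decays as exp(-pi^2 alpha t)."""
    return math.exp(-math.pi**2 * alpha * t) * math.sin(math.pi * x)

def max_error(nx, nt, t_end=0.05):
    """Max-norm discrepancy between numerical and analytical solutions."""
    xs, u = solve_heat_explicit(nx, nt, t_end)
    return max(abs(ui - analytical(xi, t_end)) for xi, ui in zip(xs, u))

# The essence of code verification: the error should shrink as the
# discretization is refined.
coarse, fine = max_error(11, 200), max_error(21, 800)
print(coarse, fine)  # the fine-grid error is the smaller of the two
```

A QA suite in the spirit of Oberkampf and Trucano (2007) would wrap checks like this in automated tests with documented tolerances for each benchmark.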
Performance Trades Study for Robust Airfoil Shape Optimization
NASA Technical Reports Server (NTRS)
Li, Wu; Padula, Sharon
2003-01-01
From time to time, existing aircraft need to be redesigned for new missions with modified operating conditions such as required lift or cruise speed. This research is motivated by the needs of conceptual and preliminary design teams for smooth airfoil shapes that are similar to the baseline design but have improved drag performance over a range of flight conditions. The proposed modified profile optimization method (MPOM) modifies a large number of design variables to search for nonintuitive performance improvements, while avoiding off-design performance degradation. Given a good initial design, the MPOM generates fairly smooth airfoils that are better than the baseline without making drastic shape changes. Moreover, the MPOM allows users to gain valuable information by exploring performance trades over various design conditions. Four simulation cases of airfoil optimization in transonic viscous flow are included to demonstrate the usefulness of the MPOM as a performance trades study tool. Simulation results are obtained by solving the fully turbulent Navier-Stokes equations and the corresponding discrete adjoint equations using the unstructured-grid computational fluid dynamics code FUN2D.
Modular Track System For Positioning Mobile Robots
NASA Technical Reports Server (NTRS)
Miller, Jeff
1995-01-01
Conceptual system for positioning mobile robotic manipulators on large main structure includes modular tracks and ancillary structures assembled easily along with main structure. System, called "tracked robotic location system" (TROLS), originally intended for application to platforms in outer space, but TROLS concept might also prove useful on Earth; for example, to position robots in factories and warehouses. T-cross-section rail keeps mobile robot on track. Bar codes mark locations along track. Each robot equipped with bar-code-recognizing circuitry so it quickly finds way to assigned location.
Surveyor Management of Hospital Accreditation Program: A Thematic Analysis Conducted in Iran.
Teymourzadeh, Ehsan; Ramezani, Mozhdeh; Arab, Mohammad; Rahimi Foroushani, Abbas; Akbari Sari, Ali
2016-05-01
The surveyors in a hospital accreditation program are considered the core of the program, so the reliability and validity of the accreditation program heavily depend on their performance. This study aimed to identify the dimensions and factors affecting surveyor management of hospital accreditation programs in Iran. This qualitative study used a thematic analysis method and was performed in Iran in 2014. The study participants included experts in the field of hospital accreditation, drawn from three groups: 1. policy-makers, administrators, and surveyors of the accreditation bureau, the Ministry of Health and Medical Education, and Iranian universities of medical science; 2. healthcare service providers; and 3. university professors and faculty members. The data were collected using semi-structured in-depth interviews. Following text transcription and control of compliance with the original text, MAXQDA10 software was used to code, classify, and organize the interviews in six stages. The findings from the analysis of 21 interviews were first classified in the form of 1347 semantic units, 11 themes, 17 sub-themes, and 248 codes. These were further discussed by an expert panel, which resulted in seven main themes - selection and recruitment of the surveyor team, organization of the surveyor team, planning to perform surveys, surveyor motivation and retention, surveyor training, surveyor assessment, and recommendations - as well as 27 sub-themes and 112 codes. The dimensions and variables affecting the surveyors' management were identified and classified on the basis of existing scientific methods in the form of a conceptual framework. Using the results of this study, it should be possible to take a great step toward enhancing the reliability of surveys and the quality and safety of services while effectively managing accreditation program surveyors.
NASA Astrophysics Data System (ADS)
Bianchi Janetti, Emanuela; Riva, Monica; Guadagnini, Alberto
2017-04-01
We perform a variance-based global sensitivity analysis to assess the impact of the uncertainty associated with (a) the spatial distribution of hydraulic parameters, e.g., hydraulic conductivity, and (b) the conceptual model adopted to describe the system on the characterization of a regional-scale aquifer. We do so in the context of inverse modeling of the groundwater flow system. The study aquifer lies within the provinces of Bergamo and Cremona (Italy) and covers a planar extent of approximately 785 km². Analysis of available sedimentological information allows identifying a set of main geo-materials (facies/phases) which constitute the geological makeup of the subsurface system. We parameterize the conductivity field following two diverse conceptual schemes. The first one is based on the representation of the aquifer as a Composite Medium. In this conceptualization the system is composed of distinct (five, in our case) lithological units. Hydraulic properties (such as conductivity) in each unit are assumed to be uniform. The second approach assumes that the system can be modeled as a collection of media coexisting in space to form an Overlapping Continuum. A key point in this model is that each point in the domain represents a finite volume within which each of the (five) identified lithofacies can be found with a certain volumetric percentage. Groundwater flow is simulated with the numerical code MODFLOW-2005 for each of the adopted conceptual models. We then quantify the relative contribution of the considered uncertain parameters, including boundary conditions, to the total variability of the piezometric level recorded in a set of 40 monitoring wells by relying on the variance-based Sobol indices. The latter are derived numerically for the investigated settings through the use of a model-order reduction technique based on the polynomial chaos expansion approach.
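For readers unfamiliar with variance-based Sobol indices: the first-order index of an input is the fraction of output variance explained by that input alone, S1 = Var(E[Y|X1]) / Var(Y). Below is a minimal pick-freeze Monte Carlo sketch on a toy additive model; it is unrelated to the MODFLOW setting above, and all values are assumed for illustration.

```python
import random

def sobol_first_order(f, n=100_000, seed=0):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol index of X1
    for Y = f(X1, X2) with independent U(0, 1) inputs."""
    rng = random.Random(seed)
    A = [(rng.random(), rng.random()) for _ in range(n)]
    B = [(rng.random(), rng.random()) for _ in range(n)]
    yA = [f(x1, x2) for x1, x2 in A]
    # "Freeze" X1: keep X1 from sample A, resample X2 from sample B.
    yC = [f(a[0], b[1]) for a, b in zip(A, B)]
    mA, mC = sum(yA) / n, sum(yC) / n
    var = sum((y - mA) ** 2 for y in yA) / n
    cov = sum(ya * yc for ya, yc in zip(yA, yC)) / n - mA * mC
    return cov / var  # Cov(Y_A, Y_C) estimates Var(E[Y|X1])

# Toy additive model Y = 3*X1 + X2; the analytic index is
# S1 = 9*Var(X1) / (9*Var(X1) + Var(X2)) = 0.9.
s1 = sobol_first_order(lambda x1, x2: 3 * x1 + x2)
print(round(s1, 2))  # close to the analytic value 0.9
```

Production studies replace this brute-force sampler with smarter estimators; the polynomial chaos expansion mentioned in the abstract yields the same indices analytically from the expansion coefficients, at far lower cost.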
Coherent concepts are computed in the anterior temporal lobes.
Lambon Ralph, Matthew A; Sage, Karen; Jones, Roy W; Mayberry, Emily J
2010-02-09
In his Philosophical Investigations, Wittgenstein famously noted that the formation of semantic representations requires more than a simple combination of verbal and nonverbal features to generate conceptually based similarities and differences. Classical and contemporary neuroscience has tended to focus upon how different neocortical regions contribute to conceptualization through the summation of modality-specific information. The additional yet critical step of computing coherent concepts has received little attention. Some computational models of semantic memory are able to generate such concepts by the addition of modality-invariant information coded in a multidimensional semantic space. By studying patients with semantic dementia, we demonstrate that this aspect of semantic memory becomes compromised following atrophy of the anterior temporal lobes and, as a result, the patients become increasingly influenced by superficial rather than conceptual similarities.
To master or perform? Exploring relations between achievement goals and conceptual change learning.
Ranellucci, John; Muis, Krista R; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M
2013-09-01
Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Seventy-three undergraduate students were assessed on their prior knowledge and misconceptions about Newtonian mechanics, and then reported their achievement goals and participated in think-aloud protocols while reading Newtonian physics texts. A mastery-approach goal orientation positively predicted deep processing strategies, shallow processing strategies, and conceptual change. In contrast, a performance-approach goal orientation did not predict either of the processing strategies, but negatively predicted conceptual change. A performance-avoidance goal orientation negatively predicted deep processing strategies and conceptual change. Moreover, deep and shallow processing strategies positively predicted conceptual change as well as recall. Finally, both deep and shallow processing strategies mediated relations between mastery-approach goals and conceptual change. Results provide some support for Dole and Sinatra's (1998) Cognitive Reconstruction of Knowledge Model of conceptual change but also challenge specific facets with regard to the role of depth of processing in conceptual change. © 2012 The British Psychological Society.
Nozzle Numerical Analysis Of The Scimitar Engine
NASA Astrophysics Data System (ADS)
Battista, F.; Marini, M.; Cutrone, L.
2011-05-01
This work describes part of the activities on the LAPCAT-II A2 vehicle: starting from the available conceptual vehicle design and the related pre-cooled turbo-ramjet engine called SCIMITAR, well-thought assumptions made for the performance figures of different components during the iteration process within LAPCAT-I will be assessed in more detail. This paper presents a numerical analysis aimed at the design optimization of the nozzle contour of the LAPCAT A2 SCIMITAR engine designed by Reaction Engines Ltd. (REL) (see Figure 1). In particular, the nozzle shape optimization process is presented for cruise conditions. All the computations have been carried out using the CIRA C3NS code in non-equilibrium conditions. The effect of considering detailed or reduced chemical kinetic schemes has been analyzed, with a particular focus on the production of pollutants. An analysis of engine performance parameters, such as thrust and combustion efficiency, has been carried out.
Computer assessment of interview data using latent semantic analysis.
Dam, Gregory; Kaufmann, Stefan
2008-02-01
Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed.
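The core of LSA is projecting documents into a low-rank semantic space via a truncated SVD of a term-document matrix and then comparing them by cosine similarity. A minimal sketch follows; the vocabulary and counts are invented for illustration and are not from the study.

```python
import numpy as np

# Tiny term-document matrix (rows = terms, columns = student responses).
# The vocabulary and counts are invented, not taken from the study.
X = np.array([
    [2, 1, 0, 0],   # "sun"
    [1, 2, 0, 0],   # "tilt"
    [0, 1, 0, 1],   # "distance"
    [0, 0, 2, 1],   # "seasons"
    [0, 0, 1, 2],   # "closer"
], dtype=float)

# LSA: a truncated SVD projects the documents into a k-dimensional
# semantic space, smoothing over exact word choice.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
docs = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dim vector per document

def cosine(a, b):
    """Cosine similarity between two document vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents 0 and 1 share "sun"/"tilt" vocabulary, documents 2 and 3 share
# "seasons"/"closer"; cross-group similarity should be lower.
print(cosine(docs[0], docs[1]), cosine(docs[0], docs[2]))
```

An instrument like the one described would compare each student response vector against vectors for reference explanations (correct model vs. known misconceptions) and classify by nearest neighbor in the semantic space.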
Computational Analysis of a Low-Boom Supersonic Inlet
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.
2011-01-01
A low-boom supersonic inlet was designed for use on a conceptual small supersonic aircraft that would cruise with an over-wing Mach number of 1.7. The inlet was designed to minimize external overpressures, and used a novel bypass duct to divert the highest shock losses around the engine. The Wind-US CFD code was used to predict the effects of capture ratio, struts, bypass design, and angles of attack on inlet performance. The inlet was tested in the 8-ft by 6-ft Supersonic Wind Tunnel at NASA Glenn Research Center. Test results showed that the inlet had excellent performance, with capture ratios near one, a peak core total pressure recovery of 96 percent, and a stable operating range much larger than that of an engine. Predictions generally compared very well with the experimental data, and were used to help interpret some of the experimental results.
Making Semantic Waves: A Key to Cumulative Knowledge-Building
ERIC Educational Resources Information Center
Maton, Karl
2013-01-01
The paper begins by arguing that knowledge-blindness in educational research represents a serious obstacle to understanding knowledge-building. It then offers sociological concepts from Legitimation Code Theory--"semantic gravity" and "semantic density"--that systematically conceptualize one set of organizing principles underlying knowledge…
Coding Classroom Interactions for Collective and Individual Engagement
ERIC Educational Resources Information Center
Ryu, Suna; Lombardi, Doug
2015-01-01
This article characterizes "engagement in science learning" from a sociocultural perspective and offers a mixed method approach to measuring engagement that combines critical discourse analysis (CDA) and social network analysis (SNA). Conceptualizing engagement from a sociocultural perspective, the article discusses the advantages of a…
Evolution beyond neo-Darwinism: a new conceptual framework.
Noble, Denis
2015-01-01
Experimental results in epigenetics and related fields of biological research show that the Modern Synthesis (neo-Darwinist) theory of evolution requires either extension or replacement. This article examines the conceptual framework of neo-Darwinism, including the concepts of 'gene', 'selfish', 'code', 'program', 'blueprint', 'book of life', 'replicator' and 'vehicle'. This form of representation is a barrier to extending or replacing existing theory as it confuses conceptual and empirical matters. These need to be clearly distinguished. In the case of the central concept of 'gene', the definition has moved all the way from describing a necessary cause (defined in terms of the inheritable phenotype itself) to an empirically testable hypothesis (in terms of causation by DNA sequences). Neo-Darwinism also privileges 'genes' in causation, whereas in multi-way networks of interactions there can be no privileged cause. An alternative conceptual framework is proposed that avoids these problems, and which is more favourable to an integrated systems view of evolution. © 2015. Published by The Company of Biologists Ltd.
A Conceptual Wing Flutter Analysis Tool for Systems Analysis and Parametric Design Study
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
2003-01-01
An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate flutter instability boundaries of a typical wing when detailed structural and aerodynamic data are not available. Effects of changes in key flutter parameters can also be estimated in order to guide the conceptual design. This user-friendly software was developed using MathCad and Matlab codes. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely the Regier number and the Flutter number, with normalization factors based on wing torsion stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location, and pitch-inertia radius of gyration. These parametric plots were compiled in a Chance-Vought Corporation report from a database of past experiments and wind tunnel test results. An example was presented for conceptual flutter analysis of the outer wing of a Blended-Wing-Body aircraft.
Stirling engine external heat system design with heat pipe heater
NASA Technical Reports Server (NTRS)
Godett, Ted M.; Ziph, Benjamin
1986-01-01
This final report presents the conceptual design of a liquid-fueled external heating system (EHS) and the preliminary design of a heat pipe heater for the STM4-120 Stirling cycle engine, to meet the Air Force mobile electric power (MEP) requirement for units in the range of 20 to 60 kW. The EHS design had the following constraints: (1) packaging requirements limited the overall system dimensions to about 330 mm x 250 mm x 100 mm; (2) heat flux to the sodium heat pipe evaporator was limited to an average of 100 kW/m² and a maximum of 550 kW/m², based on previous experience; and (3) the heat pipe operating temperature was specified to be 800 °C based on the heat input requirements of the STM4-120. An analysis code was developed to optimize the EHS performance parameters, and an analytical development of the sodium heat pipe heater was performed; both are presented and discussed. In addition, construction techniques were evaluated and scale-model heat pipe testing was performed.
Novais, Sónia Alexandra de Lemos; Mendes, Felismina Rosa Parreira
2016-03-01
This study explores illness representations within the Familial Amyloidotic Polyneuropathy Portuguese Association newspaper. A content analysis of the issue data was performed using provisional coding derived from the conceptual framework of the study. All dimensions of illness representation in Leventhal's Common Sense Model of illness cognitions and behaviors are present in the data and reflect the experience of living with this disease. Understanding how a person lives with a hereditary, rare, neurodegenerative illness is important for developing community nursing interventions. In conclusion, we suggest integrating common-sense knowledge with other approaches to design an intervention program centered on people living with a hereditary neurodegenerative illness, such as familial amyloidotic polyneuropathy. © 2015 Wiley Publishing Asia Pty Ltd.
Optimization of small long-life PWR based on thorium fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Subkhi, Moh Nurul, E-mail: nsubkhi@students.itb.ac.id; Physics Dept., Faculty of Science and Technology, State Islamic University of Sunan Gunung Djati Bandung Jalan A.H Nasution 105 Bandung; Suud, Zaki, E-mail: szaki@fi.itb.ac.id
2015-09-30
A conceptual design of a small long-life Pressurized Water Reactor (PWR) using thorium fuel has been investigated from a neutronics perspective. The cell burnup calculations were performed with the PIJ module of the SRAC code using a nuclear data library based on JENDL 3.2, while the multi-energy-group diffusion calculations were optimized in a three-dimensional X-Y-Z core geometry with COREBN. The excess reactivity of thorium nitride fuel with ZIRLO cladding is considered during 5 years of burnup without refueling. Optimizations of the 350 MWe long-life PWR based on 5% ²³³U & 2.8% ²³¹Pa, 6% ²³³U & 2.8% ²³¹Pa, and 7% ²³³U & 6% ²³¹Pa all give low excess reactivity.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
Decision making to protect groundwater resources requires a detailed estimation of the uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as (1) measurement and computational errors, (2) uncertainties in the conceptual model and model-parameter estimates, and (3) simplifications in model setup and in the numerical representation of governing processes, all contribute to the uncertainties in the model predictions. Because these factors combine, the sources of predictive uncertainty are generally difficult to quantify individually. Decision support for the optimal design of monitoring networks requires (1) detailed analyses of the existing uncertainties in model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency in detecting contaminants and providing early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is a newly developed optimization technique, based on coupling the Particle Swarm and Levenberg-Marquardt methods, which has proved robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS (Model Analyses for Decision Support), an object-oriented code capable of performing various types of model analyses and supporting model-based decision making. The code can be executed in different computational modes, which include (1) sensitivity analysis (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection.
The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
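The Latin hypercube sampling that underlies the Monte Carlo and uncertainty modes described above can be illustrated with a minimal sketch (this is not MADS code, and the parameter ranges below are hypothetical):

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin hypercube sample: each parameter's range is split into
    n_samples equal strata, and each stratum is sampled exactly once."""
    rng = np.random.default_rng(rng)
    dims = len(bounds)
    # One uniformly random point inside each of the n strata, per dimension.
    u = (rng.random((n_samples, dims)) + np.arange(n_samples)[:, None]) / n_samples
    # Shuffle the strata independently in each dimension.
    for d in range(dims):
        u[:, d] = u[rng.permutation(n_samples), d]
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Hypothetical parameter ranges, e.g. hydraulic conductivity and porosity.
samples = latin_hypercube(10, [(1e-5, 1e-3), (0.1, 0.4)], rng=42)
print(samples.shape)  # (10, 2)
```

Compared with plain random sampling, the stratification guarantees that every slice of each parameter's range is represented, which is why codes like MADS reach stable uncertainty estimates with fewer model runs.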
UNSAT-H Version 2. 0: Unsaturated soil water and heat flow model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fayer, M.J.; Jones, T.L.
1990-04-01
This report documents UNSAT-H Version 2.0, a model for calculating water and heat flow in unsaturated media. The documentation includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plant transpiration, and the code listing. Waste management practices at the Hanford Site have included disposal of low-level wastes by near-surface burial. Predicting the future long-term performance of any such burial site in terms of migration of contaminants requires a model capable of simulating water flow in the unsaturated soils above the buried waste. The model currently used to meet this need is UNSAT-H. This model was developed at Pacific Northwest Laboratory to assess water dynamics of near-surface, waste-disposal sites at the Hanford Site. The code is primarily used to predict deep drainage as a function of such environmental conditions as climate, soil type, and vegetation. UNSAT-H is also used to simulate the effects of various practices to enhance isolation of wastes. 66 refs., 29 figs., 7 tabs.
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2016-01-01
Simultaneously achieving the fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project requires innovative and unconventional aircraft concepts. In response, advanced hybrid wing body (HWB) aircraft concepts have been proposed and analyzed as a means of meeting these objectives. For the current study, several HWB concepts were analyzed using the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) analysis code. HCDstruct is a medium-fidelity finite element based conceptual design and structural optimization tool developed to fill the critical analysis gap existing between lower order structural sizing approaches and detailed, often finite element based sizing methods for HWB aircraft concepts. Whereas prior versions of the tool used a half-model approach in building the representative finite element model, a full wing-tip-to-wing-tip modeling capability was recently added to HCDstruct, which alleviated the symmetry constraints at the model centerline in place of a free-flying model and allowed for more realistic center body, aft body, and wing loading and trim response. The latest version of HCDstruct was applied to two ERA reference cases, including the Boeing Open Rotor Engine Integration On an HWB (OREIO) concept and the Boeing ERA-0009H1 concept, and results agreed favorably with detailed Boeing design data and related Flight Optimization System (FLOPS) analyses. Following these benchmark cases, HCDstruct was used to size NASA's ERA HWB concepts and to perform a related scaling study.
Social Information Processing Analysis (SIPA): Coding Ongoing Human Communication.
ERIC Educational Resources Information Center
Fisher, B. Aubrey; And Others
1979-01-01
The purpose of this paper is to present a new analytical system to be used in communication research. Unlike many existing systems devised ad hoc, this research tool, a system for interaction analysis, is embedded in a conceptual rationale based on modern systems theory. (Author)
Conceptualizing Poverty: A Look Inside the Indonesian Household
1999-10-01
physical assets, including: one television, one refrigerator, two radios, two bicycles, and the eggs from seven chickens. Despite the independent...Code River. Ibu Marsono worked as an assistant selling bakso — a meatball soup sold from a pushcart. Her husband owned his own pushcart and sold
A Manual for Coding Teacher's Enacted Interpersonal Understanding.
ERIC Educational Resources Information Center
DeVries, Rheta; And Others
This manual was developed in order to study the sociomoral atmospheres of three kindergarten classrooms. Previous research by Robert Selman et al. conceptualized developmental levels of interpersonal understanding in terms of two types of experiences: negotiation, where the developmental goal is identity separate from others; and shared…
ERIC Educational Resources Information Center
Harris, Frank, III
2008-01-01
Informed by the constructionist epistemological perspective, the purpose of this study was to examine socially constructed conceptualizations of masculinity and gender performance among 12 culturally diverse undergraduate men. The participants espoused seemingly productive conceptualizations of masculinity, yet their gendered behaviors were…
A High-Level Language for Modeling Algorithms and Their Properties
NASA Astrophysics Data System (ADS)
Akhtar, Sabina; Merz, Stephan; Quinson, Martin
Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the model checker TLC.
Perceived Noise Analysis for Offset Jets Applied to Commercial Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Huff, Dennis L.; Henderson, Brenda S.; Berton, Jeffrey J.; Seidel, Jonathan A.
2016-01-01
A systems analysis was performed with experimental jet noise data, engine/aircraft performance codes and aircraft noise prediction codes to assess takeoff noise levels and mission range for conceptual supersonic commercial aircraft. A parametric study was done to identify viable engine cycles that meet NASA's N+2 goals for noise and performance. Model scale data from offset jets were used as input to the aircraft noise prediction code to determine the expected sound levels for the lateral certification point where jet noise dominates over all other noise sources. The noise predictions were used to determine the optimal orientation of the offset nozzles to minimize the noise at the lateral microphone location. An alternative takeoff procedure called "programmed lapse rate" was evaluated for noise reduction benefits. Results show there are two types of engines that provide acceptable mission range performance; one is a conventional mixed-flow turbofan and the other is a three-stream variable-cycle engine. Separate flow offset nozzles reduce the noise directed toward the thicker side of the outer flow stream, but have less benefit as the core nozzle pressure ratio is reduced. At the systems level for a three-engine N+2 aircraft with full throttle takeoff, there is a 1.4 EPNdB margin to Chapter 3 noise regulations predicted for the lateral certification point (assuming jet noise dominates). With a 10% reduction in thrust just after clearing the runway, the margin increases to 5.5 EPNdB. Margins to Chapter 4 and Chapter 14 levels will depend on the cumulative split between the three certification points, but it appears that low specific thrust engines with a 10% reduction in thrust (programmed lapse rate) can come close to meeting Chapter 14 noise levels. 
Further noise reduction is possible with engine oversizing and derated takeoff, but more detailed mission studies are needed to investigate the range impacts as well as the practical limits for safety and takeoff regulations.
A conceptual framework for evaluating impairments in myasthenia gravis.
Barnett, Carolina; Bril, Vera; Kapral, Moira; Kulkarni, Abhaya; Davis, Aileen M
2014-01-01
Myasthenia gravis is characterized by weakness and fatigability of different muscle groups, including the ocular, bulbar, and limb muscles. Therefore, a measure of disease severity at the impairment level in myasthenia needs to reflect all the relevant impairments, as well as their variations with activity and fatigue. We conducted a qualitative study of patients with myasthenia to explore their experiences and related impairments, with the aim of developing a conceptual framework of disease severity at the impairment level in myasthenia gravis. Twenty patients representing the spectrum of disease participated in semi-structured interviews. Interviews were recorded and the transcripts were analyzed by content analysis using an inductive approach with line-by-line open coding. Themes were generated from these codes. Two main themes were identified: the severity of the impairments and fatigability (i.e., triggering or worsening of an impairment with activity). The impairments were further classified within body regions (ocular, bulbar and axial/limbs). Fatigability was described as a phenomenon affecting the whole body but also affecting specific impairments, and was associated with fluctuation of the symptoms. Patients were concerned that clinical examination at a single point in time might not reflect their true clinical state due to fatigability and fluctuations in severity. This conceptual framework reflects the relevance of both severity and fatigability in understanding impairment-based disease severity in myasthenia. This framework could inform the development of impairment measures in myasthenia gravis.
The Nature and Neural Correlates of Semantic Association versus Conceptual Similarity
Jackson, Rebecca L.; Hoffman, Paul; Pobric, Gorana; Lambon Ralph, Matthew A.
2015-01-01
The ability to represent concepts and the relationships between them is critical to human cognition. How does the brain code relationships between items that share basic conceptual properties (e.g., dog and wolf) while simultaneously representing associative links between dissimilar items that co-occur in particular contexts (e.g., dog and bone)? To clarify the neural bases of these semantic components in neurologically intact participants, both types of semantic relationship were investigated in an fMRI study optimized for anterior temporal lobe (ATL) coverage. The clear principal finding was that the same core semantic network (ATL, superior temporal sulcus, ventral prefrontal cortex) was equivalently engaged when participants made semantic judgments on the basis of association or conceptual similarity. Direct comparisons revealed small, weaker differences for conceptual similarity > associative decisions (e.g., inferior prefrontal cortex) and associative > conceptual similarity (e.g., ventral parietal cortex) which appear to reflect graded differences in task difficulty. Indeed, once reaction time was entered as a covariate into the analysis, no associative versus category differences remained. The paper concludes with a discussion of how categorical/feature-based and associative relationships might be represented within a single, unified semantic system. PMID:25636912
NASA Astrophysics Data System (ADS)
Kalnins, L. M.
2015-12-01
Over the last year we implemented a complete restructuring of a second-year Matlab-based course on numerical modelling of Earth processes, with changes aimed at 1) strengthening students' independence as programmers, 2) addressing student concerns about support in developing coding skills, and 3) improving key modelling skills such as choosing boundary conditions. To this end, we designed a mastery-based approach where students progress through a series of small programming projects at their own pace. As part of this, all lectures are `flipped' into short videos, allowing all contact hours to be spent on programming. The projects themselves are structured based on a `bottlenecks to learning' approach, explicitly separating out the steps of learning new commands and code structures, creating a conceptual and mathematical model of the problem, and developing more generic programming skills such as debugging, before asking the students to combine all of the above to build a numerical model of an Earth Sciences problem. Compared with the previous, traditionally taught cohort, student questionnaires show a strong improvement in overall satisfaction. Free text responses show a focus on learning for understanding, and that students particularly valued the encouragement to slow down and work towards understanding when they encountered a difficult topic, rather than being pressured by a set timetable to move on. Quantitatively, exam performance improved on key conceptual questions, such as boundary conditions and discretisation, and overall achievement also rose, with 25% of students achieving an `A+' standard of work. Many of the final projects also demonstrated programming and modelling skills that had not been directly taught, ranging from use of new commands to extension of techniques taught in 1D to the 2D case: strong confirmation of the independent skills we aimed to foster with this new approach.
Modulation of the semantic system by word imageability.
Sabsevitz, D S; Medler, D A; Seidenberg, M; Binder, J R
2005-08-01
A prevailing neurobiological theory of semantic memory proposes that part of our knowledge about concrete, highly imageable concepts is stored in the form of sensory-motor representations. While this theory predicts differential activation of the semantic system by concrete and abstract words, previous functional imaging studies employing this contrast have provided relatively little supporting evidence. We acquired event-related functional magnetic resonance imaging (fMRI) data while participants performed a semantic similarity judgment task on a large number of concrete and abstract noun triads. Task difficulty was manipulated by varying the degree to which the words in the triad were similar in meaning. Concrete nouns, relative to abstract nouns, produced greater activation in a bilateral network of multimodal and heteromodal association areas, including ventral and medial temporal, posterior-inferior parietal, dorsal prefrontal, and posterior cingulate cortex. In contrast, abstract nouns produced greater activation almost exclusively in the left hemisphere in superior temporal and inferior frontal cortex. Increasing task difficulty modulated activation mainly in attention, working memory, and response monitoring systems, with almost no effect on areas that were modulated by imageability. These data provide critical support for the hypothesis that concrete, imageable concepts activate perceptually based representations not available to abstract concepts. In contrast, processing abstract concepts makes greater demands on left perisylvian phonological and lexical retrieval systems. The findings are compatible with dual coding theory and less consistent with single-code models of conceptual representation. 
The lack of overlap between imageability and task difficulty effects suggests that once the neural representation of a concept is activated, further maintenance and manipulation of that information in working memory does not further increase neural activation in the conceptual store.
NASA Astrophysics Data System (ADS)
Shi, Xue-Ming; Peng, Xian-Jue
2016-09-01
Fusion science and technology has made progress in recent decades. However, commercialization of fusion reactors still faces challenges relating to higher fusion energy gain, irradiation-resistant material, and tritium self-sufficiency. Fusion Fission Hybrid Reactors (FFHR) can be introduced to accelerate the early application of fusion energy. Traditionally, FFHRs have been classified as either breeders or transmuters. Both need partitioning of plutonium from spent fuel, which poses nuclear proliferation risks. A conceptual design of a Fusion Fission Hybrid Reactor for Energy (FFHR-E), which can make full use of natural uranium with lower nuclear proliferation risk, is presented. The fusion core parameters are similar to those of the International Thermonuclear Experimental Reactor. An alloy of natural uranium and zirconium is adopted in the fission blanket, which is cooled by light water. In order to model blanket burnup problems, the linkage code MCORGS, which couples MCNP4B and ORIGEN-S, is developed and validated through several typical benchmarks. The average blanket energy multiplication factor and tritium breeding ratio can be maintained at 10 and 1.15, respectively, over tens of years of continuous irradiation. If simple reprocessing without separation of plutonium from uranium is adopted every few years, FFHR-E can achieve better neutronic performance. MCORGS has also been used to analyze the ultra-deep burnup model of Laser Inertial Confinement Fusion Fission Energy (LIFE) from LLNL, and a new blanket design that uses Pb instead of Be as the neutron multiplier is proposed. In addition, MCORGS has been used to simulate the fluid transmuter model of the In-Zinerator from Sandia. A brief comparison of LIFE, In-Zinerator, and FFHR-E will be given.
(I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.
van Rijnsoever, Frank J
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimal information scenario.
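The random-chance scenario described here can be sketched in a few lines (a toy version of such a simulation with invented code probabilities, not the author's actual script):

```python
import random

def sample_until_saturation(code_probs, rng, max_steps=10_000):
    """Random-chance scenario: draw information sources until every
    code has been observed at least once; return the sample size."""
    seen = set()
    for step in range(1, max_steps + 1):
        # Each sampled source exhibits code i with probability p_i.
        seen.update(i for i, p in enumerate(code_probs) if rng.random() < p)
        if len(seen) == len(code_probs):
            return step
    return max_steps

# Hypothetical population: 20 codes, each observed with probability 0.3.
rng = random.Random(0)
probs = [0.3] * 20
sizes = [sample_until_saturation(probs, rng) for _ in range(200)]
print(sum(sizes) / len(sizes))  # average sample size to reach saturation
```

Rerunning this with lower mean probabilities (rather than more codes) inflates the required sample size sharply, which mirrors the paper's central finding.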
Development of Performance Assessments in Science: Conceptual, Practical, and Logistical Issues.
ERIC Educational Resources Information Center
Solano-Flores, Guillermo; Shavelson, Richard J.
1997-01-01
Conceptual, practical, and logistical issues in the development of science performance assessments (SPAs) are discussed. The conceptual framework identifies task, response format, and scoring system as components, and conceives of SPAs as tasks that attempt to recreate conditions in which scientists work. Developing SPAs is a sophisticated effort…
Concreteness Effects and Syntactic Modification in Written Composition.
ERIC Educational Resources Information Center
Sadoski, Mark; Goetz, Ernest T.
1998-01-01
Investigates whether concreteness was related to a key characteristic of written composition--the cumulative sentence with a final modifier--which has been consistently associated with higher quality writing. Supports the conceptual-peg hypothesis of dual coding theory, with concrete verbs providing the pegs on which cumulative sentences are…
ERIC Educational Resources Information Center
Connor, Carol McDonald; Morrison, Frederick J.; Fishman, Barry J.; Ponitz, Claire Cameron; Glasney, Stephanie; Underwood, Phyllis S.; Piasta, Shayne B.; Crowe, Elizabeth Coyne; Schatschneider, Christopher
2009-01-01
The Individualizing Student Instruction (ISI) classroom observation and coding system is designed to provide a detailed picture of the classroom environment at the level of the individual student. Using a multidimensional conceptualization of the classroom environment, foundational elements (teacher warmth and responsiveness to students, classroom…
THE HYDROCARBON SPILL SCREENING MODEL (HSSM), VOLUME 2: THEORETICAL BACKGROUND AND SOURCE CODES
A screening model for subsurface release of a nonaqueous phase liquid which is less dense than water (LNAPL) is presented. The model conceptualizes the release as consisting of 1) vertical transport from near the surface to the capillary fringe, 2) radial spreading of an LNAPL l...
Uses of Computer Simulation Models in Ag-Research and Everyday Life
USDA-ARS?s Scientific Manuscript database
When the news media talks about models, they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...
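A computer simulation model in this sense can be as small as a few lines. The sketch below (a hypothetical example, not from the manuscript) imitates logistic growth of a population toward a carrying capacity:

```python
def simulate_logistic(pop0, rate, capacity, days):
    """Toy simulation model: daily logistic growth, a minimal example
    of computer code imitating the behavior of a biological system."""
    pop = pop0
    history = [pop]
    for _ in range(days):
        pop += rate * pop * (1 - pop / capacity)  # discrete logistic step
        history.append(pop)
    return history

h = simulate_logistic(pop0=10.0, rate=0.2, capacity=1000.0, days=60)
print(round(h[-1]))  # the population approaches the carrying capacity
```

Agricultural simulation models work the same way in principle, just with many more interacting state variables (soil water, nutrients, weather) driving the daily update step.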
NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2010-01-01
Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.
The Fishery Performance Indicators: A Management Tool for Triple Bottom Line Outcomes
Anderson, James L.; Anderson, Christopher M.; Chu, Jingjie; Meredith, Jennifer; Asche, Frank; Sylvia, Gil; Smith, Martin D.; Anggraeni, Dessy; Arthur, Robert; Guttormsen, Atle; McCluney, Jessica K.; Ward, Tim; Akpalu, Wisdom; Eggert, Håkan; Flores, Jimely; Freeman, Matthew A.; Holland, Daniel S.; Knapp, Gunnar; Kobayashi, Mimako; Larkin, Sherry; MacLauchlin, Kari; Schnier, Kurt; Soboil, Mark; Tveteras, Sigbjorn; Uchida, Hirotsugu; Valderrama, Diego
2015-01-01
Pursuit of the triple bottom line of economic, community and ecological sustainability has increased the complexity of fishery management; fisheries assessments require new types of data and analysis to guide science-based policy in addition to traditional biological information and modeling. We introduce the Fishery Performance Indicators (FPIs), a broadly applicable and flexible tool for assessing performance in individual fisheries, and for establishing cross-sectional links between enabling conditions, management strategies and triple bottom line outcomes. Conceptually separating measures of performance, the FPIs use 68 individual outcome metrics—coded on a 1 to 5 scale based on expert assessment to facilitate application to data poor fisheries and sectors—that can be partitioned into sector-based or triple-bottom-line sustainability-based interpretative indicators. Variation among outcomes is explained with 54 similarly structured metrics of inputs, management approaches and enabling conditions. Using 61 initial fishery case studies drawn from industrial and developing countries around the world, we demonstrate the inferential importance of tracking economic and community outcomes, in addition to resource status. PMID:25946194
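The scoring scheme described above (outcome metrics expert-coded on a 1 to 5 scale, then partitioned into interpretative indicators) can be sketched as a small aggregation routine. This is an illustrative sketch only: the metric names, the triple-bottom-line groupings, and the simple averaging rule are hypothetical stand-ins, not the actual FPI schema or scoring protocol.

```python
def aggregate_indicators(metric_scores, dimension_map):
    """Average 1-5 expert-coded metrics within each dimension,
    skipping metrics that have no score (data-poor fisheries)."""
    indicators = {}
    for dimension, metrics in dimension_map.items():
        scores = [metric_scores[m] for m in metrics
                  if metric_scores.get(m) is not None]
        indicators[dimension] = sum(scores) / len(scores) if scores else None
    return indicators

# A hypothetical fishery coded on four outcome metrics (names invented).
fishery = {"harvest_value": 4, "crew_earnings": 3,
           "stock_status": 2, "local_ownership": 5}
dimensions = {  # illustrative triple-bottom-line grouping
    "economics": ["harvest_value", "crew_earnings"],
    "ecology": ["stock_status"],
    "community": ["local_ownership"],
}
print(aggregate_indicators(fishery, dimensions))
# → {'economics': 3.5, 'ecology': 2.0, 'community': 5.0}
```

Averaging is only one possible roll-up; the point is that coded metrics and an explicit dimension map make cross-sectional comparison between fisheries mechanical.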
Porcupine: A visual pipeline tool for neuroimaging analysis
Snoek, Lukas; Knapen, Tomas
2018-01-01
The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one’s analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one’s analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461
The Role of Leadership in Safety Performance and Results
NASA Astrophysics Data System (ADS)
Caravello, Halina E.
Employee injury rates in U.S. land-based operations in the energy industry are 2 to 3 times higher than in other regions of the world. Although a rich literature exists on drivers of safety performance, no previous studies have investigated the factors behind this elevated rate. Leadership has been identified as a key contributor to safety outcomes, and this grounded theory study drew upon the full range leadership model, situational leadership, and leader-member exchange theories for its conceptual framework. Leadership aspects influencing safety performance were investigated through guided interviews of 27 study participants; data analyses included open and axial coding, and constant comparisons identified higher-level categories. Selective coding integrated the categories into a theoretical framework of idealized, transformational leader traits that motivate safe behaviors: leading by example, expressing care and concern for employees' well-being, celebrating successes, and communicating the importance of safety (other elements included visibility and commitment). Employee and supervisor participants reported similar views on the idealized leader traits, but low levels of these qualities may be driving elevated injury rates. Identifying these key elements provides the foundation for creating strategies and action plans that enable energy sector companies to prevent employee injuries and fatalities in an industry where tens of thousands of employees are subjected to significant hazards and elevated risks. Creating safer workplaces for U.S. employees by enhancing leaders' skills, building knowledge, and improving behaviors will improve employees' and their families' lives by reducing the pain and suffering resulting from injuries and fatalities.
Do prominent quality measurement surveys capture the concerns of persons with disability?
Iezzoni, Lisa I; Marsella, Sarah A; Lopinsky, Tiffany; Heaphy, Dennis; Warsett, Kimberley S
2017-04-01
Demonstration programs nationwide aim to control costs and improve care for people dually-eligible for Medicare and Medicaid, including many persons with disability. Ensuring these initiatives maintain or improve care quality requires comprehensive evaluation of quality of care. Our objective was to examine whether the common quality measures being used to evaluate the Massachusetts One Care duals demonstration program comprehensively address the concerns of persons with disability. Drawing upon existing conceptual frameworks, we developed a model of interrelationships of personal, health care, and environmental factors for achieving wellness for persons with disability. Based on this model, we specified a scheme to code individual quality measurement items and coded the items contained in 12 measures being used to assess Massachusetts One Care, which exclusively enrolls non-elderly adults with disability. Across these 12 measures, we assigned 376 codes to 302 items; some items received two codes. Taken together, the 12 measures contain items addressing most factors in our conceptual model that affect health care quality for persons with disability, including long-term services and supports. Some important gaps exist: no items examine sexual or reproductive health care, peer support, housing security, disability stigmatization, or specific services obtained outside the home such as adult day care. Certain key concepts are covered by only one or a few of the 12 quality measures. Common quality metrics cover most, although not all, health care quality concerns of persons with disability. However, multiple different quality measures are required for this comprehensive coverage, raising questions about respondent burden. Copyright © 2017 Elsevier Inc. All rights reserved.
Long non-coding RNAs in cancer metabolism.
Xiao, Zhen-Dong; Zhuang, Li; Gan, Boyi
2016-10-01
Altered cellular metabolism is an emerging hallmark of cancer. Accumulating recent evidence links long non-coding RNAs (lncRNAs), a still poorly understood class of non-coding RNAs, to cancer metabolism. Here we review the emerging findings on the functions of lncRNAs in cancer metabolism, with particular emphasis on how lncRNAs regulate glucose and glutamine metabolism in cancer cells, discuss how lncRNAs regulate various aspects of cancer metabolism through their cross-talk with other macromolecules, explore the mechanistic conceptual framework of lncRNAs in reprogramming metabolism in cancers, and highlight the challenges in this field. A more in-depth understanding of lncRNAs in cancer metabolism may enable the development of novel and effective therapeutic strategies targeting cancer metabolism. © 2016 WILEY Periodicals, Inc.
Weindling, P
2001-01-01
The Nuremberg Code has generally been seen as arising from the Nuremberg Medical Trial. This paper examines developments prior to the Trial, involving the physiologist Andrew Conway Ivy and an inter-Allied Scientific Commission on Medical War Crimes. The paper traces the formulation of the concept of a medical war crime by the physiologist John West Thompson, as part of the background to Ivy's code on human experiments of 1 August 1946. It evaluates subsequent responses by the American Medical Association, and by other war crimes experts, notably Leo Alexander, who developed Ivy's conceptual framework. Ivy's interaction with the judges at Nuremberg alerted them to the importance of formulating ethical guidelines for clinical research.
Code for ethical international recruitment practices: the CGFNS alliance case study.
Shaffer, Franklin A; Bakhshi, Mukul; Dutka, Julia To; Phillips, Janice
2016-06-30
Projections indicate a global workforce shortage of approximately 4.3 million across the health professions. The need to ensure an adequate supply of health workers worldwide has created a context for the increased global migration of these professionals. The global trend in the migration of health professionals has given rise to the international recruitment industry to facilitate the passage of health workers from source to destination countries. This is particularly the case in the United States, where the majority of immigrant health professionals have come by way of the recruiting industry. This industry is largely unregulated in the United States, as in many other countries; voluntary codes have therefore been used as a means to increase transparency of the recruitment process, shape professional conduct, and mitigate harm to foreign-educated health workers. The CGFNS Alliance case study presented herein describes a multi-stakeholder effort in the United States to promote ethical recruitment practices. Such codes not only complement the WHO Global Code of Practice but are necessary to maximize the impact of these global standards on local settings. This case study offers both a historical perspective and a conceptual framework for examining the multiplicity of factors affecting the migration of human resources for health. The lessons learned provide critical insights into the factors pertaining to the relevancy and effectiveness of the WHO Code from the perspectives of both source and destination countries. This study provides a conceptual model for examining the usefulness of the WHO Code as well as how best to ensure its viability, sustainability, relevancy, and effectiveness in the global environment.
This case study concludes with recommendations for evolving business models that need to be in place to strengthen the effectiveness of the WHO Code in the marketplace and to ensure its impact on the international recruitment industry in advancing ethical practices. These recommendations include using effective screening mechanisms to determine health professionals' readiness for migration as well as implementing certification processes to raise the practice standards for those directly involved in recruiting skilled workers and managing the migration flow.
NASA Astrophysics Data System (ADS)
Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.
2016-08-01
Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence ( r = .26, p = .03), Order ( r = .37, p = .002), and Tradeoffs ( r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.
Defining Peer-to-Peer Accountability From the Nurse's Perspective.
Lockett, Jacqueline Jansen; Barkley, Leslie; Stichler, Jaynelle; Palomo, Jeanne; Kik, Bozena; Walker, Christopher; Donnelly, Janet; Willon, Judy; Sanborn, Julie; O'Byrne, Noeleen
2015-11-01
The aim of this study was to define and create a conceptual model for peer-to-peer accountability (P to PA). Many organizations cite the importance of peer accountability (PA) as essential in ensuring patient safety. Professionalism in nursing requires self-regulation of practice and PA. Although discussed in the literature, P to PA is not conceptually defined. A grounded theory study design with constant comparative data collection and analysis was used to explore nurses' definitions of P to PA and their perceptions of motivators and barriers to engaging in P to PA. Transcripts of digital recordings of all interviews were analyzed using line-by-line coding until identified themes emerged. P to PA was defined as the act of speaking up when one observes a peer not practicing to acceptable standards. A conceptual model illustrates the antecedents, attributes, and consequences of P to PA. P to PA is the professional responsibility of every nurse and healthcare provider and is essential for safe patient care. The conceptual definition facilitates actualization of P to PA in practice.
ERIC Educational Resources Information Center
Wang, Yu-Lin; Ellinger, Andrea D.
2008-01-01
The purpose of this paper is to develop a conceptual framework and research hypotheses based upon a thorough review of the conceptual and limited published empirical research in the organizational learning and innovation performance literatures. Hypotheses indicate the relationships between organizational learning, its antecedent, perception of…
Use of a Computer Language in Teaching Dynamic Programming. Final Report.
ERIC Educational Resources Information Center
Trimble, C. J.; And Others
Most optimization problems of any degree of complexity must be solved using a computer. In the teaching of dynamic programming courses, it is often desirable to use a computer in problem solution. The solution process involves conceptual formulation and computational solution. Generalized computer codes for dynamic programming problem solution…
Innovative Moments in Grief Therapy: Reconstructing Meaning Following Perinatal Death
ERIC Educational Resources Information Center
Alves, Daniela; Mendes, Ines; Goncalves, Miguel M.; Neimeyer, Robert A.
2012-01-01
This article presents an intensive analysis of a good outcome case of constructivist grief therapy with a bereaved mother, using the Innovative Moments Coding System (IMCS). Inspired by M. White and D. Epston's narrative therapy, the IMCS conceptualizes therapeutic change as resulting from the elaboration and expansion of unique outcomes (or as we…
A Content Analysis of 10 Years of Clinical Supervision Articles in Counseling
ERIC Educational Resources Information Center
Bernard, Janine M.; Luke, Melissa
2015-01-01
This content analysis follows Borders's (2005) review of counseling supervision literature and includes 184 counselor supervision articles published over the past 10 years. Articles were coded as representing 1 of 3 research types or 1 of 3 conceptual types. Articles were then analyzed for main topics producing 11 topic categories.
Approaches Used by Faculty to Assess Critical Thinking--Implications for General Education
ERIC Educational Resources Information Center
Nicholas, Mark; Raider-Roth, Miriam
2011-01-01
This investigation focused on a group of 17 faculty drawn from disciplines in the humanities, social sciences, and natural sciences. Using in-depth interviews, focus group discussions, and qualitative coding strategies, this study examined how faculty conceptualized the term critical thinking (CT) and how they assessed for it in general education…
Consumer and Family Economics. Second Edition.
ERIC Educational Resources Information Center
Texas Tech Univ., Lubbock. Home Economics Curriculum Center.
This consumer and family economics curriculum guide was developed for use in home economics education in Texas. At the beginning is a list of the competencies and the subcompetencies that are the essential elements and the subelements prescribed in the Texas Administrative Codes for Vocational Home Economics. The conceptual outline as shown in the…
A New Internet Tool for Automatic Evaluation in Control Systems and Programming
ERIC Educational Resources Information Center
Munoz de la Pena, D.; Gomez-Estern, F.; Dormido, S.
2012-01-01
In this paper we present a web-based innovative education tool designed for automating the collection, evaluation and error detection in practical exercises assigned to computer programming and control engineering students. By using a student/instructor code-fusion architecture, the conceptual limits of multiple-choice tests are overcome by far.…
Two Aspects of Meaningful Problem Solving in Science.
ERIC Educational Resources Information Center
Stewart, James
1982-01-01
Presents a model for solving genetics problems when problem statements include information on which alleles are dominant/recessive and on what forms of a trait are coded for by the alleles. Includes procedural steps employed in a solution and conceptual knowledge of genetics/meiosis allowing students to justify what they have done. (Author/JN)
"I Keep That Hush-Hush": Male Survivors of Sexual Abuse and the Challenges of Disclosure
ERIC Educational Resources Information Center
Sorsoli, Lynn; Kia-Keating, Maryam; Grossman, Frances K.
2008-01-01
Disclosure is a prominent variable in child sexual abuse research, but little research has examined male disclosure experiences. Sixteen male survivors of childhood sexual abuse were interviewed regarding experiences of disclosure. Analytic techniques included a grounded theory approach to coding and the use of conceptually clustered matrices.…
ERIC Educational Resources Information Center
Krus, David J.; Krus, Patricia H.
1978-01-01
The conceptual differences between coded regression analysis and traditional analysis of variance are discussed. Also, a modification of several SPSS routines is proposed which allows for direct interpretation of ANOVA and ANCOVA results in a form stressing the strength and significance of scrutinized relationships. (Author)
The Effects of Prohibiting Gestures on Children's Lexical Retrieval Ability
ERIC Educational Resources Information Center
Pine, Karen J.; Bird, Hannah; Kirk, Elizabeth
2007-01-01
Two alternative accounts have been proposed to explain the role of gestures in thinking and speaking. The Information Packaging Hypothesis (Kita, 2000) claims that gestures are important for the conceptual packaging of information before it is coded into a linguistic form for speech. The Lexical Retrieval Hypothesis (Rauscher, Krauss & Chen, 1996)…
In the Rearview Mirror: Social Skill Development in Deaf Youth, 1990-2015
ERIC Educational Resources Information Center
Cawthon, Stephanie W.; Fink, Bentley; Schoffstall, Sarah; Wendel, Erica
2018-01-01
Social skills are a vehicle by which individuals negotiate important relationships. The present article presents historical data on how social skills in deaf students were conceptualized and studied empirically during the period 1990-2015. Using a structured literature review approach, the researchers coded 266 articles for theoretical frameworks…
MATLAB Stability and Control Toolbox Trim and Static Stability Module
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Crespo, Luis
2012-01-01
MATLAB Stability and Control Toolbox (MASCOT) utilizes geometric, aerodynamic, and inertial inputs to calculate air vehicle stability in a variety of critical flight conditions. The code is based on fundamental, non-linear equations of motion and is able to translate results into a qualitative, graphical scale useful to the non-expert. MASCOT was created to provide the conceptual aircraft designer with accurate predictions of air vehicle stability and control characteristics. The code takes as input mass property data in the form of an inertia tensor, aerodynamic loading data, and propulsion (i.e., thrust) loading data. Using fundamental nonlinear equations of motion, MASCOT then calculates vehicle trim and static stability data for the desired flight condition(s). Available flight conditions include six horizontal and six landing rotation conditions with varying options for engine out, crosswind, and sideslip, plus three take-off rotation conditions. Results are displayed through a unique graphical interface developed to give the conceptual design engineer, who may not be a stability and control expert, a qualitative scale indicating whether the vehicle has acceptable, marginal, or unacceptable static stability characteristics. If desired, the user can also examine the detailed, quantitative results.
Script, code, information: how to differentiate analogies in the "prehistory" of molecular biology.
Kogge, Werner
2012-01-01
The remarkable fact that twentieth-century molecular biology developed its conceptual system on the basis of sign-like terms has been the object of numerous studies and debates. Throughout these, the assumption is made that this vocabulary's emergence should be seen in the historical context of mathematical communication theory and cybernetics. This paper, in contrast, sets out the need for a more differentiated view: whereas the success of the terms "code" and "information" would probably be unthinkable outside that historical context, general semiotic and especially scriptural concepts arose far earlier in the "prehistory" of molecular biology, and in close association with biological research and phenomena. This distinction, established through a reconstruction of conceptual developments between 1870 and 1950, makes it possible to separate off a critique of the reductive implications of particular information-based concepts from the use of semiotic and scriptural concepts, which is fundamental to molecular biology. Gene-centrism and determinism are not implications of semiotic and scriptural analogies, but arose only when the vocabulary of information was superimposed upon them.
Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.
2006-01-01
This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.
Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F
1998-01-01
GALEN has developed a language independent common reference model based on a medically oriented ontology and practical tools and techniques for managing healthcare terminology including natural language processing. GALEN-IN-USE is the current phase which applied the modelling and the tools to the development or the updating of coding systems for surgical procedures in different national coding centers co-operating within the European Federation of Coding Centre (EFCC) to create a language independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools named CLAssification Manager workbench to process French professional medical language rubrics into intermediate dissections and to the Grail reference ontology model representation. From this language independent concept model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians to the French generated controlled vocabulary and to finalize the linguistic labels of the coding system in relation with the meanings of the conceptual system structure.
Intact and impaired conceptual memory processes in amnesia.
Keane, M M; Gabrieli, J D; Monti, L A; Fleischman, D A; Cantor, J M; Noland, J S
1997-01-01
To examine the status of conceptual memory processes in amnesia, a conceptual memory task with implicit or explicit task instructions was given to amnesic and control groups. After studying a list of category exemplars, participants saw category labels and were asked to generate as many exemplars as possible (an implicit memory task) or to generate exemplars that had been in the prior study list (an explicit memory task). After incidental deep or shallow encoding of exemplars, amnesic patients showed normal implicit memory performance (priming), a normal levels-of-processing effect on priming, and impaired explicit memory performance. After intentional encoding of exemplars, amnesic patients showed impaired implicit and explicit memory performance. Results suggest that although amnesic patients can show impairments on implicit and explicit conceptual memory tasks, their deficit does not generalize to all conceptual memory tasks.
Integrating Conceptual and Quantitative Knowledge
ERIC Educational Resources Information Center
Metzgar, Matthew
2013-01-01
There has been an emphasis in some science courses to focus more on teaching conceptual knowledge. Though certain innovations have been successful in increasing student conceptual knowledge, performance on quantitative problem-solving tasks often remains unaffected. Research also shows that students tend to maintain conceptual and quantitative…
NASA Technical Reports Server (NTRS)
Morris, Shelby J., Jr.; Geiselhart, Karl A.; Coen, Peter G.
1989-01-01
The performance of an advanced technology conceptual turbojet optimized for a high-speed civil aircraft is presented. This information represents an estimate of performance of a Mach 3 Brayton (gas turbine) cycle engine optimized for minimum fuel burned at supersonic cruise. This conceptual engine had no noise or environmental constraints imposed upon it. The purpose of these data is to define an upper bound on propulsion performance for a conceptual commercial Mach 3 transport design. A comparison is presented demonstrating the impact of the technology proposed for this conceptual engine on the weight and other characteristics of a proposed high-speed civil transport. This comparison indicates that the advanced technology turbojet described could reduce the gross weight of a hypothetical Mach 3 high-speed civil transport design from about 714,000 pounds to about 545,000 pounds. The aircraft with the baseline engine and the aircraft with the advanced technology engine are described.
NASA Astrophysics Data System (ADS)
Idaszak, R.; Lenhardt, W. C.; Jones, M. B.; Ahalt, S.; Schildhauer, M.; Hampton, S. E.
2014-12-01
The NSF, in an effort to support the creation of sustainable science software, funded 16 science software institute conceptualization efforts. The goal of these conceptualization efforts is to explore approaches to creating the institutional, sociological, and physical infrastructures to support sustainable science software. This paper will present the lessons learned from two of these conceptualization efforts, the Institute for Sustainable Earth and Environmental Software (ISEES - http://isees.nceas.ucsb.edu) and the Water Science Software Institute (WSSI - http://waters2i2.org). ISEES is a multi-partner effort led by National Center for Ecological Analysis and Synthesis (NCEAS). WSSI, also a multi-partner effort, is led by the Renaissance Computing Institute (RENCI). The two conceptualization efforts have been collaborating due to the complementarity of their approaches and given the potential synergies of their science focus. ISEES and WSSI have engaged in a number of activities to address the challenges of science software such as workshops, hackathons, and coding efforts. More recently, the two institutes have also collaborated on joint activities including training, proposals, and papers. In addition to presenting lessons learned, this paper will synthesize across the two efforts to project a unified vision for a science software institute.
ERIC Educational Resources Information Center
Gorecek Baybars, Meryem; Kucukozer, Huseyin
2018-01-01
The object of this study is to determine the conceptual understanding that prospective Science teachers have relating "de Broglie: Matter waves" and to investigate the effect of the instruction performed, on the conceptual understanding. This study was performed at a state university located in the western part of Turkey, with the…
Conceptual Masking: How One Picture Captures Attention from Another Picture.
ERIC Educational Resources Information Center
Loftus, Geoffrey R.; And Others
1988-01-01
Five experiments studied operations of conceptual masking--the reduction of conceptual memory performance for an initial stimulus when it is followed by a masking picture process. The subjects were 337 undergraduates at the University of Washington (Seattle). Conceptual masking is distinguished from perceptual masking. (TJH)
(I Can’t Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: “random chance,” which is based on probability sampling, “minimal information,” which yields at least one new code per sampling step, and “maximum information,” which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario. PMID:28746358
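The "random chance" scenario above (probability sampling of sources until every code in the population has been observed once) can be sketched as a small simulation. This is a minimal sketch under assumed, hypothetical population parameters (100 sources each holding 1 to 5 of 20 codes), not the paper's actual simulation design.

```python
import random

def sample_to_saturation(sources, all_codes, rng):
    """Sample sources in random order (without replacement) until every
    code in the population has been observed at least once; return the
    sample size at which theoretical saturation is reached."""
    observed = set()
    order = list(sources)
    rng.shuffle(order)
    for n, source in enumerate(order, start=1):
        observed |= source
        if observed >= all_codes:
            return n
    return len(order)  # saturation not reached within this population

# Hypothetical population: 100 sources, each holding 1-5 of 20 codes.
codes = set(range(20))
pop_rng = random.Random(0)
sources = [set(pop_rng.sample(sorted(codes), pop_rng.randint(1, 5)))
           for _ in range(100)]
runs = [sample_to_saturation(sources, codes, random.Random(seed))
        for seed in range(500)]
print(sum(runs) / len(runs))  # mean sample size under "random chance"
```

Replacing the random shuffle with a greedy pick of the source that contributes the most unseen codes would correspond to the "maximum information" scenario, and typically saturates in far fewer steps.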
Web Services Provide Access to SCEC Scientific Research Application Software
NASA Astrophysics Data System (ADS)
Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.
2003-12-01
Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. 
For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the correct API interface from within C++ and/or C/Fortran). This poster presentation will provide descriptions of the following selected web services and their origin as scientific application codes: 3D community velocity models for Southern California, geocoordinate conversions (latitude/longitude to UTM), execution of GMT graphical scripts, data format conversions (Gocad to Matlab format), and implementation of Seismic Hazard Analysis application programs that calculate hazard curve and hazard map data sets.
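The "web servlet" wrapping described above can be sketched with a thin HTTP layer around an unmodified command-line code, honoring the original executable as the abstract emphasizes. This is a minimal sketch under stated assumptions: `echo` stands in for a real legacy code (e.g., a Fortran coordinate converter), and all names are illustrative, not SCEC's actual framework.

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Hypothetical legacy code: a command-line executable wrapped unmodified.
LEGACY_COMMAND = ["echo"]  # stand-in for a real scientific code

def run_legacy_code(args):
    """Invoke the wrapped scientific code and capture its output."""
    result = subprocess.run(LEGACY_COMMAND + args, capture_output=True, text=True)
    return result.stdout.strip()

class WrapperHandler(BaseHTTPRequestHandler):
    """A "web servlet"-style wrapper: HTTP request in, code output back."""
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        output = run_legacy_code(query.get("args", []))
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(output.encode())

# To serve: HTTPServer(("", 8080), WrapperHandler).serve_forever()
```

The same `run_legacy_code` function could equally back a SOAP/WSDL endpoint; the wrapper isolates users from installing and running the code themselves.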
Fourth and eighth grade students' conceptions of energy flow through ecosystems
NASA Astrophysics Data System (ADS)
Arkwright, Ashlie Beals
This mixed methods status study examined 32 fourth grade students' conceptual understandings of energy flow through ecosystems prior to instruction and 40 eighth grade students' conceptual understandings of the same topic after five years of daily standards-based instruction in science. Specific ecological concepts assessed related to: 1) roles of organisms; 2) the sun as the original energy source for most ecosystems; and 3) interdependency of organisms. Fourth and eighth grade students were assessed using the same three-tiered forced-choice instrument, with accompanying tasks for students to defend their forced-choice selections and rate their level of confidence in making the selections. The instrument was developed for the study by a team of researchers and was based on similar tasks presented in the research literature. Distractor options were embedded in each assessment task using common non-scientific ideas also reported in the research literature. Cronbach's alpha values at or greater than .992 for each task indicated interrater consistency of task answers, and Rasch analysis was employed to establish the reliability of the instrument. Qualitative and quantitative analyses were employed to assess the data. Constant comparative methods were employed to analyze students' written responses, which were coded and grouped into emerging themes. These themes were further developed to characterize students' conceptual understandings. Student open responses also were scored and coded by a team of researchers using a rubric to identify level of scientific understanding. Quantitative analyses included Rasch analysis used to normalize survey data. Independent samples t-tests were then employed to compare students' forced-choice responses to their written responses and to the confidence ratings, as well as to compare fourth and eighth grade students' responses. 
Findings indicated that eighth grade students generally outperformed the fourth grade on both the forced-choice and written responses, but both groups demonstrated conceptual difficulties in all three topics assessed. Thus, results from the current study support the assertion that students' understanding of concepts related to energy flow in ecosystems is not at the expected level according to national science education standards and frameworks. Conceptual difficulties identified in the study are discussed along with implications and curricular recommendations.
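The grade-level comparisons above rest on independent-samples t-tests. As a sketch of that step (the scores below are invented for illustration and are not the study's data; Welch's unequal-variance form is assumed):

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's independent-samples t statistic (unequal variances assumed),
    as might be used to compare fourth- and eighth-grade scores."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    se = (va / na + vb / nb) ** 0.5   # standard error of the mean difference
    return (mean(sample_a) - mean(sample_b)) / se

fourth = [2, 3, 3, 4, 2, 3]   # hypothetical rubric scores
eighth = [4, 5, 4, 5, 5, 4]
t = welch_t(eighth, fourth)   # positive t: eighth graders outperform
```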
Writing and compiling code into biochemistry.
Shea, Adam; Fett, Brian; Riedel, Marc D; Parhi, Keshab
2010-01-01
This paper presents a methodology for translating iterative arithmetic computation, specified as high-level programming constructs, into biochemical reactions. From an input/output specification, we generate biochemical reactions that produce output quantities of proteins as a function of input quantities performing operations such as addition, subtraction, and scalar multiplication. Iterative constructs such as "while" loops and "for" loops are implemented by transferring quantities between protein types, based on a clocking mechanism. Synthesis first is performed at a conceptual level, in terms of abstract biochemical reactions - a task analogous to high-level program compilation. Then the results are mapped onto specific biochemical reactions selected from libraries - a task analogous to machine language compilation. We demonstrate our approach through the compilation of a variety of standard iterative functions: multiplication, exponentiation, discrete logarithms, raising to a power, and linear transforms on time series. The designs are validated through transient stochastic simulation of the chemical kinetics. We are exploring DNA-based computation via strand displacement as a possible experimental chassis.
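At the conceptual level the paper describes, an arithmetic operation compiles to abstract reactions that transfer quantities between protein types. The sketch below is a deterministic toy model of that idea (the reaction notation and function names are invented, not the authors' compiler, and stochastic kinetics are omitted): addition `c = a + b` becomes two transfer reactions, a → c and b → c.

```python
def react_all(state, reactions):
    """Fire each abstract reaction (reactant -> product) to completion,
    transferring the full molecule count of the reactant into the product.
    This is a conceptual-level view, analogous to high-level compilation."""
    state = dict(state)
    for reactant, product in reactions:
        state[product] = state.get(product, 0) + state.pop(reactant, 0)
        state[reactant] = 0
    return state

# "c = a + b": both input species transfer their quantities into c.
ADD = [("a", "c"), ("b", "c")]
out = react_all({"a": 3, "b": 4, "c": 0}, ADD)
```

Mapping these abstract reactions onto concrete chemistry (e.g., DNA strand displacement) corresponds to the machine-language stage of the analogy.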
Using RDF and Git to Realize a Collaborative Metadata Repository.
Stöhr, Mark R; Majeed, Raphael W; Günther, Andreas
2018-01-01
The German Center for Lung Research (DZL) is a research network with the aim of researching respiratory diseases. The participating study sites' register data differs in terms of software and coding system as well as data field coverage. To perform meaningful consortium-wide queries through one single interface, a uniform conceptual structure is required covering the DZL common data elements. No single existing terminology includes all our concepts. Potential candidates such as LOINC and SNOMED only cover specific subject areas or are not granular enough for our needs. To achieve a broadly accepted and complete ontology, we developed a platform for collaborative metadata management. The DZL data management group formulated detailed requirements regarding the metadata repository and the user interfaces for metadata editing. Our solution builds upon existing standard technologies allowing us to meet those requirements. Its key parts are RDF and the distributed version control system Git. We developed a software system to publish updated metadata automatically and immediately after performing validation tests for completeness and consistency.
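The completeness and consistency checks run before automatic publication can be sketched as below. This is a hypothetical validation step (the field names are illustrative, not the DZL schema, and the RDF/Git machinery is elided):

```python
# Required fields per metadata element; illustrative, not the DZL schema.
REQUIRED_FIELDS = {"id", "label", "definition"}

def validate_elements(elements):
    """Check each data element for completeness (required fields present)
    and consistency (no duplicate identifiers). Returns a list of errors;
    an empty list means the metadata may be published."""
    errors = []
    seen_ids = set()
    for element in elements:
        missing = REQUIRED_FIELDS - element.keys()
        if missing:
            errors.append(f"{element.get('id', '?')}: missing {sorted(missing)}")
        if element.get("id") in seen_ids:
            errors.append(f"duplicate id {element['id']}")
        seen_ids.add(element.get("id"))
    return errors

ok = validate_elements([{"id": "fev1", "label": "FEV1", "definition": "FEV1 in liters"}])
bad = validate_elements([{"id": "fev1"},
                         {"id": "fev1", "label": "FEV1", "definition": "dup"}])
```

In the described system, a commit to the Git repository would trigger such checks and publish the metadata only when no errors remain.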
Highly parallel sparse Cholesky factorization
NASA Technical Reports Server (NTRS)
Gilbert, John R.; Schreiber, Robert
1990-01-01
Several fine grained parallel algorithms were developed and compared to compute the Cholesky factorization of a sparse matrix. The experimental implementations are on the Connection Machine, a distributed memory SIMD machine whose programming model conceptually supplies one processor per data element. In contrast to special purpose algorithms in which the matrix structure conforms to the connection structure of the machine, the focus is on matrices with arbitrary sparsity structure. The most promising algorithm is one whose inner loop performs several dense factorizations simultaneously on a 2-D grid of processors. Virtually any massively parallel dense factorization algorithm can be used as the key subroutine. The sparse code attains execution rates comparable to those of the dense subroutine. Although at present architectural limitations prevent the dense factorization from realizing its potential efficiency, it is concluded that a regular data parallel architecture can be used efficiently to solve arbitrarily structured sparse problems. A performance model is also presented and it is used to analyze the algorithms.
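The "key subroutine" role played by dense factorization can be illustrated with a scalar, serial Cholesky sketch (this is the textbook algorithm, not the Connection Machine code; the data-parallel versions distribute exactly these loops):

```python
def cholesky(a):
    """Dense Cholesky factorization A = L L^T for a symmetric
    positive-definite matrix, returned as a lower-triangular list of lists."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for j in range(n):
        # Diagonal entry: subtract squares of the already-computed row.
        s = a[j][j] - sum(l[j][k] ** 2 for k in range(j))
        l[j][j] = s ** 0.5
        # Entries below the diagonal in column j.
        for i in range(j + 1, n):
            l[i][j] = (a[i][j] - sum(l[i][k] * l[j][k] for k in range(j))) / l[j][j]
    return l

A = [[4.0, 2.0], [2.0, 3.0]]
L = cholesky(A)
```

In the sparse setting, the same kernel is applied simultaneously to many dense blocks arising from the matrix's elimination structure.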
NASA Astrophysics Data System (ADS)
Masciopinto, Costantino; Volpe, Angela; Palmiotta, Domenico; Cherubini, Claudia
2010-09-01
A combination of a parallel fracture model with the PHREEQC-2 geochemical model was developed to simulate sequential flow and chemical transport with reactions in fractured media where both laminar and turbulent flows occur. The integration of non-laminar flow resistances into one model produced relevant effects on water flow velocities, thus improving the model's predictive capability for contaminant transport. The proposed conceptual model consists of 3D rock blocks separated by horizontal bedding-plane fractures with variable apertures. Particle tracking solved the transport equations for conservative compounds and provided input for PHREEQC-2. For each cluster of contaminant pathways, PHREEQC-2 determined the concentrations for mass transfer, sorption/desorption, ion exchange, mineral dissolution/precipitation, and biodegradation under kinetically controlled reactive processes of equilibrated chemical species. Field tests were performed to verify the code. As an example, the combined model was applied to a contaminated fractured aquifer in southern Italy to simulate phenol transport. The code correctly fitted the available field data and also predicted a possible rapid depletion of phenols as a result of an increased biodegradation rate induced by a simulated artificial injection of nitrates upgradient of the sources.
NASA Technical Reports Server (NTRS)
Koch, L. Danielle; Shook, Tony D.; Astler, Douglas T.; Bittinger, Samantha A.
2011-01-01
A fan tone noise prediction code has been developed at NASA Glenn Research Center that is capable of estimating duct mode sound power levels for a fan ingesting distorted inflow. This code was used to predict the circumferential and radial mode sound power levels in the inlet and exhaust duct of an axial spacecraft cabin ventilation fan. Noise predictions at the fan design rotational speed were generated. Three fan inflow conditions were studied: an undistorted inflow, a circumferentially symmetric inflow distortion pattern (cylindrical rods inserted radially into the flowpath at 15°, 135°, and 255°), and a circumferentially asymmetric inflow distortion pattern (rods located at 15°, 52°, and 173°). Noise predictions indicate that tones are produced for the distorted inflow cases that are not present when the fan operates with an undistorted inflow. Experimental data are needed to validate these acoustic predictions, as well as the aerodynamic performance predictions. Given the aerodynamic design of the spacecraft cabin ventilation fan, a mechanical and electrical conceptual design study was conducted. Design features of a fan suitable for obtaining the detailed acoustic and aerodynamic measurements needed to validate predictions are discussed.
The Low-Noise Potential of Distributed Propulsion on a Catamaran Aircraft
NASA Technical Reports Server (NTRS)
Posey, Joe W.; Tinetti, A. F.; Dunn, M. H.
2006-01-01
The noise shielding potential of an inboard-wing catamaran aircraft when coupled with distributed propulsion is examined. Here, only low-frequency jet noise from mid-wing-mounted engines is considered. Because low frequencies are the most difficult to shield, these calculations put a lower bound on the potential shielding benefit. In this proof-of-concept study, simple physical models are used to describe the 3-D scattering of jet noise by conceptualized catamaran aircraft. The Fast Scattering Code is used to predict noise levels on and about the aircraft. Shielding results are presented for several catamaran type geometries and simple noise source configurations representative of distributed propulsion radiation. Computational analyses are presented that demonstrate the shielding benefits of distributed propulsion and of increasing the width of the inboard wing. Also, sample calculations using the FSC are presented that demonstrate additional noise reduction on the aircraft fuselage by the use of acoustic liners on the inboard wing trailing edge. A full conceptual aircraft design would have to be analyzed over a complete mission to more accurately quantify community noise levels and aircraft performance, but the present shielding calculations show that a large acoustic benefit could be achieved by combining distributed propulsion and liner technology with a twin-fuselage planform.
Conceptual Core Analysis of Long Life PWR Utilizing Thorium-Uranium Fuel Cycle
NASA Astrophysics Data System (ADS)
Rouf; Su'ud, Zaki
2016-08-01
Conceptual core analysis of a long-life PWR utilizing thorium-uranium based fuel has been conducted. The purpose of this study is to evaluate the neutronic behavior of a reactor core using combined thorium and enriched uranium fuel. With this fuel composition, the reactor core has a higher conversion ratio than with conventional fuel, which allows a longer operation length. The simulation was performed using the SRAC Code System with the SRACLIB-JDL32 library. The calculation was carried out for (Th-U)O2 and (Th-U)C fuel with a uranium fraction of 30 - 40% and gadolinium (Gd2O3) as burnable poison at 0.0125%. The fuel composition was adjusted to obtain a burnup length of 10 - 15 years at a thermal power of 600 - 1000 MWt. Key properties such as uranium enrichment, fuel volume fraction, and percentage of uranium were evaluated. The core calculation in this study adopted R-Z geometry divided into 3 regions, each with a different uranium enrichment. The results show the multiplication factor at every burnup step over the 15-year operation length, the power distribution behavior, the power peaking factor, and the conversion ratio. The optimum core design was achieved at a thermal power of 600 MWt, a uranium percentage of 35%, and a U-235 enrichment of 11 - 13%, with a 14-year operation length and axial and radial power peaking factors of about 1.5 and 1.2, respectively.
ERIC Educational Resources Information Center
Djambong, Takam; Freiman, Viktor
2016-01-01
While today's schools in several countries, like Canada, are about to bring programming back to their curricula, a new conceptual angle, namely that of computational thinking, is drawing the attention of researchers. In order to understand the articulation between computational thinking tasks on one side, students' targeted skills, and the types of problems…
Towards PCC for Concurrent and Distributed Systems (Work in Progress)
NASA Technical Reports Server (NTRS)
Henriksen, Anders S.; Filinski, Andrzej
2009-01-01
We outline some conceptual challenges in extending the PCC paradigm to a concurrent and distributed setting, and sketch a generalized notion of module correctness based on viewing communication contracts as economic games. The model supports compositional reasoning about modular systems and is meant to apply not only to certification of executable code, but also of organizational workflows.
Modelling dwarf mistletoe at three scales: life history, ballistics and contagion
Donald C. E. Robinson; Brian W. Geils
2006-01-01
The epidemiology of dwarf mistletoe (Arceuthobium) is simulated for the reproduction, dispersal, and spatial patterns of these plant pathogens on conifer trees. A conceptual model for mistletoe spread and intensification is coded as sets of related subprograms that link to either of two individual-tree growth models (FVS and TASS) used by managers to develop...
ERIC Educational Resources Information Center
Marshall, Steve; Moore, Danièle
2013-01-01
In this article, the researchers employ the framework of plurilingualism and plurilingual competence in a field that has traditionally been dominated by reified conceptualizations of multilingualism that view bi/multilingualism as balanced and complete competence in discrete codes. They present data from a qualitative, longitudinal study of the…
Understanding the Common Elements of Evidence-Based Practice: Misconceptions and Clinical Examples
ERIC Educational Resources Information Center
Chorpita, Bruce F.; Becker, Kimberly D.; Daleiden, Eric L.
2007-01-01
In this article, the authors proposed a distillation and matching model (DMM) that describes how evidence-based treatment operations can be conceptualized at a lower order level of analysis than simply by their manuals. Also referred to as the "common elements" approach, this model demonstrates the feasibility of coding and identifying the…
How Are Community Interventions Conceptualized and Conducted? An Analysis of Published Accounts
ERIC Educational Resources Information Center
Trickett, Edison J.; Espino, Susan Ryerson; Hawe, Penelope
2011-01-01
Recent discussions about the conduct of community interventions suggest the importance of developing more comprehensive theorizing about their nature and effects. The present study is an effort to infer how community interventions are theorized by the way they are represented in the peer-reviewed scholarly literature. A coding of a random sample…
Intelligent Patching of Conceptual Geometry for CFD Analysis
NASA Technical Reports Server (NTRS)
Li, Wu
2010-01-01
The iPatch computer code for intelligently patching surface grids was developed to convert conceptual geometry to computational fluid dynamics (CFD) geometry (see figure). It automatically uses bicubic B-splines to extrapolate (if necessary) each surface in a conceptual geometry so that all the independently defined geometric components (such as wing and fuselage) can be intersected to form a watertight CFD geometry. The software also computes the intersection curves of surface patches at any resolution (up to 10^-4 accuracy) specified by the user, and it writes the B-spline surface patches, and the corresponding boundary points, for the watertight CFD geometry in the format that can be directly used by the grid generation tool VGRID. iPatch requires that input geometry be in PLOT3D format, where each component surface is defined by a rectangular grid {(x(i,j), y(i,j), z(i,j)) : 1 ≤ i ≤ m, 1 ≤ j ≤ n} that represents a smooth B-spline surface. All surfaces in the PLOT3D file conceptually represent a watertight geometry of components of an aircraft on the half-space y ≥ 0. Overlapping surfaces are not allowed, but can be fixed by a utility code "fixp3d". The fixp3d utility code first finds the two grid lines on the two surface grids that are closest to each other in Hausdorff distance (a metric that measures the discrepancy between two sets); it then uses one of the grid lines as the transition line, extending grid lines from one grid to the other to form a merged grid. Any two connecting surfaces shall have a "visually" common boundary curve, or can be described by an intersection relationship defined in a geometry specification file. The intersection of two surfaces can be at a conceptual level.
However, the intersection is directional (along either the i or j index direction), and each intersecting grid line (or its spline extrapolation) on the first surface should intersect the second surface. No two intersection relationships may result in a common intersection point of three surfaces. The output files of iPatch are IGES, d3m, and mapbc files that define the CFD geometry in VGRID format. The IGES file gives the NURBS definition of the outer mold line of the geometry. The d3m file defines how the outer mold line is broken into surface patches whose boundary curves are defined by points. The mapbc file specifies the boundary condition on each patch and the corresponding NURBS surface definition of each non-planar patch in the IGES file.
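The fixp3d merging step hinges on the Hausdorff distance between discrete grid lines. A minimal sketch of that metric (point lists stand in for grid lines; the function names are illustrative, not fixp3d's actual routines):

```python
def dist(p, q):
    """Euclidean distance between two 3-D points."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def hausdorff(line_a, line_b):
    """Symmetric Hausdorff distance between two discrete grid lines: the
    largest distance from any point on one line to the nearest point on
    the other, taken in both directions."""
    d_ab = max(min(dist(p, q) for q in line_b) for p in line_a)
    d_ba = max(min(dist(p, q) for q in line_a) for p in line_b)
    return max(d_ab, d_ba)

# Two nearly coincident grid lines, 0.1 apart in y.
a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(0.0, 0.1, 0.0), (1.0, 0.1, 0.0)]
```

Scanning all grid-line pairs for the minimum of this distance identifies the transition line along which the two overlapping grids are merged.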
Newman Unit 1 advanced solar repowering advanced conceptual design. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1982-04-01
The Newman Unit 1 solar repowering design is a water/steam central receiver concept supplying superheated steam. The work reported is to develop a refined baseline conceptual design that has potential for construction and operation by 1986, makes use of existing solar thermal technology, and provides the best economics for this application. Trade studies performed in the design effort are described, both for the conceptual design of the overall system and for the subsystem conceptual designs. System-level functional requirements, design, operation, performance, cost, safety, environmental, institutional, and regulatory considerations are described. Subsystems described include the collector, receiver, fossil energy, electrical power generating, and master control subsystems, as well as the site and site facilities. The conceptual design, cost, and performance of each subsystem is discussed at length. A detailed economic analysis of the repowered unit is made to realistically assess the economics of the first repowered unit, using present cost data at a limited production level for solar hardware. Finally, a development plan is given, covering design, procurement, construction, checkout, startup, performance validation, and commercial operation. (LEW)
Optimal technology investment strategies for a reusable launch vehicle
NASA Technical Reports Server (NTRS)
Moore, A. A.; Braun, R. D.; Powell, R. W.
1995-01-01
Within the present budgetary environment, developing the technology that leads to an operationally efficient space transportation system with the required performance is a challenge. The present research focuses on a methodology to determine high payoff technology investment strategies. Research has been conducted at Langley Research Center in which design codes for the conceptual analysis of space transportation systems have been integrated in a multidisciplinary design optimization approach. The current study integrates trajectory, propulsion, weights and sizing, and cost disciplines where the effect of technology maturation on the development cost of a single stage to orbit reusable launch vehicle is examined. Results show that the technology investment prior to full-scale development has a significant economic payoff. The design optimization process is used to determine strategic allocations of limited technology funding to maximize the economic payoff.
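The study's full MDO couples trajectory, propulsion, weights and sizing, and cost codes; as a toy stand-in for the idea of strategically allocating limited technology funding, a greedy fractional-knapsack sketch is shown below. All names, costs, and payoffs are invented for illustration and bear no relation to the paper's actual figures.

```python
def allocate(budget, techs):
    """Greedily fund technologies in order of payoff per unit cost until
    the budget is exhausted (fractional-knapsack heuristic). Each tech is
    (name, cost, payoff); returns {name: dollars allocated}."""
    plan = {}
    for name, cost, payoff in sorted(techs, key=lambda t: t[2] / t[1], reverse=True):
        spend = min(cost, budget)
        plan[name] = spend
        budget -= spend
        if budget <= 0:
            break
    return plan

# Hypothetical technology options: (name, cost, expected payoff), in $M.
techs = [("propulsion", 40.0, 120.0), ("structures", 30.0, 60.0), ("TPS", 50.0, 75.0)]
plan = allocate(60.0, techs)
```

For divisible allocations this greedy ordering is optimal; the real problem, with coupled disciplinary effects, requires the integrated optimization described above.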
Characterizing the Fundamental Intellectual Steps Required in the Solution of Conceptual Problems
NASA Astrophysics Data System (ADS)
Stewart, John
2010-02-01
At some level, the performance of a science class must depend on what is taught, the information content of the materials and assignments of the course. The introductory calculus-based electricity and magnetism class at the University of Arkansas is examined using a catalog of the basic reasoning steps involved in the solution of problems assigned in the class. This catalog was developed by sampling popular physics textbooks for conceptual problems. The solution to each conceptual problem was decomposed into its fundamental reasoning steps. These fundamental steps are then used to quantify the distribution of conceptual content within the course. Using this characterization technique, an exceptionally detailed picture of the information flow and structure of the class can be produced. The intellectual structure of published conceptual inventories is compared with the information presented in the class, and the dependence of conceptual performance on the details of coverage is extracted.
Dealing with conflicts on knowledge in tutorial groups.
Aarnio, Matti; Lindblom-Ylänne, Sari; Nieminen, Juha; Pyörälä, Eeva
2013-05-01
The aim of our study was to gain understanding of different types of conflicts on knowledge in the discussions of problem-based learning tutorial groups, and how such conflicts are dealt with. We examined first-year medical and dental students' (N = 33) conflicts on knowledge in four videotaped reporting phase tutorials. A coding scheme was created for analysing verbatim transcripts of 43 conflict episodes in order to find out whether the conflict episodes were about factual or conceptual knowledge and how the students elaborated the knowledge. Conflict episodes were relatively rare (taking up 7.6 % of the time) in the videotaped groups. Conflict episodes were more frequently about factual knowledge (58 %) than conceptual knowledge (42 %), but conflicts on conceptual knowledge lasted longer and were more often elaborated. Elaboration was, however, more frequently done individually than collaboratively. Conflict episodes were generally fairly short (mean duration 28 s). This was due to a lack of thorough argumentation and collaborative elaboration of conflicting ideas. The results suggest that students' skills to bring out differences in each other's conceptual thinking, the depth of argumentation and the use of questions that elicit elaboration need to be improved. Tutors' skills to facilitate the collaborative resolving of conflicts on knowledge call for further study.
NASA Technical Reports Server (NTRS)
Quirk, James J.
1992-01-01
In this paper we describe an approach for dealing with arbitrarily complex, two-dimensional geometries: the so-called cartesian boundary method. Conceptually, the cartesian boundary method is quite simple. Solid bodies blank out areas of a background cartesian mesh, and the resultant cut cells are singled out for special attention. However, several obstacles must be overcome in order to achieve a practical scheme. We present a general strategy that overcomes these obstacles, together with some details of our successful conversion of an adaptive mesh algorithm from a body-fitted code to a cartesian boundary code.
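The blanking step can be sketched by classifying each cell of a background cartesian mesh against a solid body (here a circle) by testing its corners. This is a simplified illustration of the concept, not the paper's scheme, and the names are invented:

```python
def classify_cells(nx, ny, h, cx, cy, r):
    """Classify cells of an nx-by-ny cartesian mesh (spacing h) against a
    solid circular body centered at (cx, cy) with radius r."""
    inside = lambda x, y: (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
    cells = {}
    for i in range(nx):
        for j in range(ny):
            corners = [inside((i + di) * h, (j + dj) * h)
                       for di in (0, 1) for dj in (0, 1)]
            if all(corners):
                cells[i, j] = "solid"   # blanked out by the body
            elif any(corners):
                cells[i, j] = "cut"     # singled out for special attention
            else:
                cells[i, j] = "flow"    # ordinary cartesian cell
    return cells

# Quarter-circle of radius 1.5 at the mesh origin.
cells = classify_cells(4, 4, 1.0, 0.0, 0.0, 1.5)
```

The practical obstacles the paper addresses (e.g., tiny cut-cell areas destabilizing the time step) arise precisely in the "cut" category.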
Solar dynamic power for the Space Station
NASA Technical Reports Server (NTRS)
Archer, J. S.; Diamant, E. S.
1986-01-01
This paper describes a computer code which provides a significant advance in the systems analysis capabilities for solar dynamic power modules. While the code can be used to advantage in the preliminary analysis of terrestrial solar dynamic modules, its real value lies in the adaptations which make it particularly useful for the conceptualization of optimized power modules for space applications. In particular, as illustrated in the paper, the code can be used to establish optimum values of concentrator diameter, concentrator surface roughness, concentrator rim angle, and receiver aperture corresponding to the main heat cycle options - Organic Rankine and Brayton - and for certain receiver design options. The code can also be used to establish system sizing margins to account for the loss of reflectivity in orbit or the seasonal variation of insolation. By simulating the interactions among the major components of a solar dynamic module, through simplified formulations of the major thermal-optic-thermodynamic interactions, the code adds a powerful, efficient, and economical analytical tool to the repertory of techniques available for the design of advanced space power systems.
NASA Technical Reports Server (NTRS)
Rathjen, K. A.
1977-01-01
A digital computer code, CAVE (Conduction Analysis Via Eigenvalues), which finds application in the analysis of two-dimensional transient heating of hypersonic vehicles, is described. CAVE is written in FORTRAN IV and is operational on both IBM 360-67 and CDC 6600 computers. The method of solution is a hybrid analytical-numerical technique that is inherently stable, permitting large time steps even with the best of conductors having the finest of mesh sizes. The aerodynamic heating boundary conditions are calculated by the code based on the input flight trajectory, or can optionally be calculated external to the code and then entered as input data. The code computes the network conduction and convection links, as well as capacitance values, given basic geometrical and mesh sizes, for four geometries (leading edges, cooled panels, X-24C structure, and slabs). Input and output formats are presented and explained. Sample problems are included. A brief summary of the hybrid analytical-numerical technique, which utilizes eigenvalues (thermal frequencies) and eigenvectors (thermal mode vectors), is given along with the aerodynamic heating equations that have been incorporated in the code, and flow charts.
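The eigenvalue idea behind such a hybrid technique can be illustrated on the smallest possible conduction network: two coupled nodes. Each thermal mode decays at its own eigenvalue and is advanced exactly in time, which is why the method stays stable for arbitrarily large time steps. This is a pedagogical sketch, not CAVE itself; the equations and names below are my own.

```python
from math import exp

def two_node_temperatures(t1_0, t2_0, k, t):
    """Closed-form transient solution for two coupled thermal nodes,
    dT1/dt = k (T2 - T1) and dT2/dt = k (T1 - T2), expressed in the
    system's eigenmodes: the mean (eigenvalue 0, conserved) and the
    difference (eigenvalue 2k, decaying). Each mode is advanced exactly,
    so any time step t is stable."""
    mean_mode = 0.5 * (t1_0 + t2_0)   # eigenvalue 0: conserved mean
    diff_mode = 0.5 * (t1_0 - t2_0)   # eigenvalue 2k: decaying difference
    decay = exp(-2.0 * k * t)
    return mean_mode + diff_mode * decay, mean_mode - diff_mode * decay

# One hot node, one cold node; after a long step both approach the mean.
t1, t2 = two_node_temperatures(100.0, 0.0, 0.5, 10.0)
```

A network of n nodes works the same way: diagonalize the conduction matrix once, then step every mode analytically.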
NASA Astrophysics Data System (ADS)
Steefel, C. I.
2015-12-01
Over the last 20 years, we have seen the evolution of multicomponent reactive transport modeling and the expanding range and increasing complexity of subsurface environmental applications it is being used to address. Reactive transport modeling is being asked to provide accurate assessments of engineering performance and risk for important issues with far-reaching consequences. As a result, the complexity and detail of subsurface processes, properties, and conditions that can be simulated have significantly expanded. Closed form solutions are necessary and useful, but limited to situations that are far simpler than typical applications that combine many physical and chemical processes, in many cases in coupled form. In the absence of closed form and yet realistic solutions for complex applications, numerical benchmark problems with an accepted set of results will be indispensable to qualifying codes for various environmental applications. The intent of this benchmarking exercise, now underway for more than five years, is to develop and publish a set of well-described benchmark problems that can be used to demonstrate simulator conformance with norms established by the subsurface science and engineering community. The objective is not to verify this or that specific code--the reactive transport codes play a supporting role in this regard—but rather to use the codes to verify that a common solution of the problem can be achieved. Thus, the objective of each of the manuscripts is to present an environmentally-relevant benchmark problem that tests the conceptual model capabilities, numerical implementation, process coupling, and accuracy. The benchmark problems developed to date include 1) microbially-mediated reactions, 2) isotopes, 3) multi-component diffusion, 4) uranium fate and transport, 5) metal mobility in mining affected systems, and 6) waste repositories and related aspects.
Plant, Jennifer; Li, Su-Ting T; Blankenburg, Rebecca; Bogetz, Alyssa L; Long, Michele; Butani, Lavjay
2017-11-01
To explore when and in what form pediatric faculty and residents practice reflection. From February to June 2015, the authors conducted focus groups of pediatric faculty and residents at the University of California, Davis; Stanford University; and the University of California, San Francisco, until thematic saturation occurred. Transcripts were analyzed based on Mezirow's and Schön's models of reflection, using the constant comparative method associated with grounded theory. Two investigators independently coded transcripts and reconciled codes to develop themes. All investigators reviewed the codes and developed a final list of themes through consensus. Through iterative discussions, investigators developed a conceptual model of reflection in the clinical setting. Seventeen faculty and 20 residents from three institutions participated in six focus groups. Five themes emerged: triggers of reflection, intrinsic factors, extrinsic factors, timing, and outcome of reflection. Various triggers led to reflection; whether a specific trigger led to reflection depended on intrinsic and extrinsic factors. When reflection occurred, it happened in action or on action. Under optimal conditions, this reflection was goal and action directed and became critical reflection. In other instances, this process resulted in unproductive rumination or acted as an emotional release or supportive therapy. Participants reflected in clinical settings, but did not always explicitly identify it as reflection or reflect in growth-promoting ways. Strategies to enhance critical reflection include developing knowledge and skills in reflection, providing performance data to inform reflection, creating time and space for safe reflection, and providing mentorship to guide the process.
Local coding based matching kernel method for image classification.
Song, Yan; McLoughlin, Ian Vince; Dai, Li-Rong
2014-01-01
This paper focuses on how to effectively and efficiently measure visual similarity for local-feature-based representations. Among existing methods, metrics based on Bag of Visual Words (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel-based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel-based metrics, in which a local kernel, defined between feature pairs or between features and their reconstruction, plays an important role. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution, which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel, efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structure in Hilbert space derived from local kernels. The proposed method combines the advantages of both BoV and kernel-based metrics, and achieves linear computational complexity. This enables efficient and scalable visual matching to be performed on large-scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including the 15-Scenes, Caltech101/256, and PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.
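The generic matching-kernel baseline such methods build on (sum a local kernel over all descriptor pairs of two images) can be sketched as follows. This is the quadratic-cost baseline whose expense LCMK is designed to avoid, not the LCMK method itself, and the Gaussian local kernel reflects exactly the Euclidean assumption the paper questions; the descriptors are random stand-ins for SIFT-like features.

```python
import numpy as np

def local_kernel(x, y, sigma=1.0):
    """Gaussian local kernel between two descriptors (Euclidean assumption)."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def sum_match_kernel(X, Y, sigma=1.0):
    """Average local-kernel response over all descriptor pairs of two images.

    X, Y: (n, d) arrays of local features. Cost is O(n_X * n_Y) kernel
    evaluations per image pair, which is what makes kernel metrics expensive.
    """
    total = 0.0
    for x in X:
        for y in Y:
            total += local_kernel(x, y, sigma)
    return total / (len(X) * len(Y))

X = np.random.default_rng(0).standard_normal((5, 8))  # 5 fake descriptors
self_sim = sum_match_kernel(X, X)
```

An image is more similar to itself than to a shifted copy of its descriptors, and every similarity lies in (0, 1] by construction of the Gaussian kernel.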
Schweickert, Richard; Xi, Zhuangzhuang
2010-05-01
Dream reports from 21 dreamers in which a metamorphosis of a person-like entity or animal occurred were coded for characters and animals and for inner states attributed to them (Theory of Mind). In myths and fairy tales, Kelly and Keil (1985) found that conscious beings (people, gods) tend to be transformed into entities nearby in the conceptual structure of Keil (1979). This also occurred in dream reports, but perceptual nearness seemed more important than conceptual nearness. In dream reports, most inanimate objects involved in metamorphoses with person-like entities were objects, such as statues, that ordinarily resemble people physically and, moreover, represent people. A metamorphosis of a person-like entity or animal did not lead to an increase in the amount of Theory of Mind attribution. We propose that a character-line starts when a character enters a dream; properties and Theory of Mind attributions tend to be preserved along the line, regardless of whether metamorphoses occur on it.
Perceived Noise Analysis for Offset Jets Applied to Commercial Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Huff, Dennis L.; Henderson, Brenda S.; Berton, Jeffrey J.; Seidel, Jonathan A.
2016-01-01
A systems analysis was performed with experimental jet noise data, engine/aircraft performance codes, and aircraft noise prediction codes to assess takeoff noise levels and mission range for conceptual supersonic commercial aircraft. A parametric study was done to identify viable engine cycles that meet NASA's N+2 goals for noise and performance. Model-scale data from offset jets were used as input to the aircraft noise prediction code to determine the expected sound levels for the lateral certification point, where jet noise dominates over all other noise sources. The noise predictions were used to determine the optimal orientation of the offset nozzles to minimize the noise at the lateral microphone location. An alternative takeoff procedure called programmed lapse rate was evaluated for noise reduction benefits. Results show there are two types of engines that provide acceptable range performance: one is a standard mixed-flow turbofan with a single-stage fan, and the other is a three-stream variable-cycle engine with a multi-stage fan. The engine with a single-stage fan has a lower specific thrust and is 8 to 10 EPNdB quieter for takeoff. Offset nozzles reduce the noise directed toward the thicker side of the outer flow stream, but have less benefit as the core nozzle pressure ratio is reduced and the bypass-to-core area ratio increases. At the systems level, for a three-engine N+2 aircraft with full-throttle takeoff, a 1.4 EPNdB margin to Chapter 3 noise regulations is predicted for the lateral certification point (assuming jet noise dominates). With a 10% reduction in thrust just after takeoff rotation, the margin increases to 5.5 EPNdB. Margins to Chapter 4 and Chapter 14 levels will depend on the cumulative split between the three certification points, but it appears that low-specific-thrust engines with a 10% reduction in thrust (programmed lapse rate) can come close to meeting Chapter 14 noise levels.
Further noise reduction is possible with additional reduction in takeoff thrust using programmed lapse rate, but studies are needed to investigate the practical limits for safety and takeoff regulations.
Knox, Lucy; Douglas, Jacinta M; Bigby, Christine
2017-11-01
Although adults who sustain a severe traumatic brain injury (TBI) require support to make decisions in their lives, little is known about their experience of this process. The aim of this study was to explore how participation in decision making contributes to self-conceptualization in adults with severe TBI. We used constructivist grounded theory methods. Data included 20 in-depth interviews with adults with severe TBI. Through a process of constant comparison, analysis involved open and focused coding until clear categories emerged and data saturation was achieved. Self-conceptualization emerged as a complex and multifaceted process, as individuals with TBI aimed to reestablish a sense of autonomy. We describe a recursive relationship in which decision-making participation assists the dynamic construction of self, and self-concept contributes to the experience of making decisions. The role of an individual's social support network in acting as a bridge between participation and self-conceptualization is presented. Findings emphasize that contributing to decisions about one's own goals across a range of life areas can reinforce a positive self-concept. It is vital that supporters understand that participation in decision making provides a pathway to conceptualizing self and aim to maximize the person's participation in the decision-making process. Implications for Rehabilitation: Previous research has identified that the experience of sustaining TBI has a significant impact on a person's conceptualization of self. This study identified that decision-making experiences play an important role in the ongoing process of self-conceptualization after injury. Decision-making experiences can reinforce a person's self-concept or lead them to revise (positively or negatively) their sense of self. By maximizing the person's decision-making participation, those around them can support them to develop positive self-attributes and contribute to shaping their future goals.
Coal Market Module - NEMS Documentation
2014-01-01
Documents the objectives and the conceptual and methodological approach used in the development of the National Energy Modeling System's (NEMS) Coal Market Module (CMM) used to develop the Annual Energy Outlook 2014 (AEO2014). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of CMM's two submodules. These are the Coal Production Submodule (CPS) and the Coal Distribution Submodule (CDS).
ERIC Educational Resources Information Center
Harley, Jason M.; Taub, Michelle; Azevedo, Roger; Bouchet, Francois
2018-01-01
Research on collaborative learning between humans and virtual pedagogical agents represents a necessary extension to recent research on the conceptual, theoretical, methodological, analytical, and educational issues behind co- and socially-shared regulated learning between humans. This study presents a novel coding framework that was developed and…
Graphic Novels as Great Books: A Grounded Theory Study of Faculty Teaching Graphic Novels
ERIC Educational Resources Information Center
Evans-Boniecki, Jeannie
2013-01-01
This Glaserian grounded theory study, through conceptual coding of interviews and course syllabi, aimed at exploring the motivations and aspirations university professors had when they offered courses dedicated to the study of graphic novels. As a result, the emergence of the graphic novel as a vital literary influence in 21st-century academia was…
ERIC Educational Resources Information Center
Pobric, Gorana; Jefferies, Elizabeth; Ralph, Matthew A. Lambon
2010-01-01
The key question of how the brain codes the meaning of words and pictures is the focus of vigorous debate. Is there a "semantic hub" in the temporal poles where these different inputs converge to form amodal conceptual representations? Alternatively, are there distinct neural circuits that underpin our comprehension of pictures and words?…
ERIC Educational Resources Information Center
Hsieh, Wen-Min; Tsai, Chin-Chung
2018-01-01
Using the draw-a-picture technique, the authors explored the learning conceptions held by students across grade levels. A total of 1,067 Taiwanese students in Grades 2, 4, 6, 8, 10, and 12 participated in this study. Participants were asked to use drawing to illustrate how they conceptualize learning. A coding checklist was developed to analyze…
A Proposed Conceptual Model of Military Medical Readiness
2007-05-01
...critical role in complex military operations in which technological and information demands necessitate a multi-operator environment... Within the external environment, strategic shifts, technological advancements, and changing demographics affect how the Military Health System delivers...
Counseling Students' Transformative Learning through a Study Abroad Curriculum
ERIC Educational Resources Information Center
Smith, Jayne E.; McAuliffe, Garrett; Rippard, Kelly S.
2014-01-01
Research on the impact of study abroad curricula on student learning is limited. In this study, data were collected from students the last day abroad and 2 weeks and 6 to 9 months after returning home from a study abroad program. Eight codes conceptualized a grounded theory of student learning during and after completion of the study abroad…
Critical discourse analysis of social justice in nursing's foundational documents.
Valderama-Wallace, Claire P
2017-07-01
Social inequities threaten the health of the global population. A superficial acknowledgement of social justice by nursing's foundational documents may limit the degree to which nurses view injustice as relevant to nursing practice and education. The purpose was to examine conceptualizations of social justice, and connections to broader contexts, in the most recent editions of these documents. Critical discourse analysis examines and uncovers dynamics related to power, language, and inequality; here it was applied to the American Nurses Association's Code of Ethics, Scope and Standards of Practice, and Social Policy Statement. The analysis found ongoing inconsistencies in conceptualizations of social justice. Although the Code of Ethics integrates concepts related to social justice far more than the other two documents, a tension between professionalism and social change emerges. The discourse of professionalism renders interrelated cultural, social, economic, historical, and political contexts nearly invisible. Greater consistency would provide a clearer path for nurses to mobilize and engage in the courageous work necessary to address social injustice. These findings also call for an examination of how nurses can critique and use the power and privilege of professionalism to amplify the connection between social institutions and health equity in nursing education, practice, and policy development.
Day, Adam M B; Theurer, Julie A; Dykstra, Allyson D; Doyle, Philip C
2012-01-01
This work examines the environmental factors component of the International Classification of Functioning, Disability, and Health (ICF) relative to current evidence about health-facilitating natural environmental factors. We argue that the environmental factors component warrants reconceptualization in order to offer an extended and more systematic framework for identifying and measuring health-facilitating natural environmental factors. Current evidence highlighting the potential health-facilitating benefits of natural environmental factors is synthesized and considered in the context of the ICF framework and its coding system. In its current form, the ICF's conceptual framework and coding system are inadequate for identifying and measuring natural environmental factors in individuals and groups with and/or without health conditions. The ICF provides an advanced framework for health and disability that reflects contemporary conceptualizations about health. However, given the scope of emerging evidence highlighting positive health and well-being outcomes associated with natural environmental factors, we believe the environmental factors component requires further advancement to reflect this current knowledge. Reconceptualizing the environmental factors component supports a more holistic interpretation of the continuum of environmental factors as both facilitators and barriers. In doing so, it strengthens the ICF's utility in identifying and measuring health-facilitating natural environmental factors.
Optimization of 3D Field Design
NASA Astrophysics Data System (ADS)
Logan, Nikolas; Zhu, Caoxiang
2017-10-01
Recent progress in 3D tokamak modeling is now leveraged to create a conceptual design of new external 3D field coils for the DIII-D tokamak. Using the IPEC dominant mode as a target spectrum, the Finding Optimized Coils Using Space-curves (FOCUS) code optimizes the currents and 3D geometry of multiple coils to maximize the total set's resonant coupling. The optimized coils are individually distorted in space, creating toroidal "arrays" containing a variety of shapes that often wrap around a significant poloidal extent of the machine. The generalized perturbed equilibrium code (GPEC) is used to determine optimally efficient spectra for driving total, core, and edge neoclassical toroidal viscosity (NTV) torque, and these too provide targets for the optimization of 3D coil designs. These conceptual designs represent a fundamentally new approach to 3D coil design for tokamaks targeting desired plasma physics phenomena. Optimized coil sets based on plasma response theory will be relevant to designs for future reactors or on any active machine. External coils, in particular, must be optimized for reliable and efficient fusion reactor designs. Work supported by the US Department of Energy under DE-AC02-09CH11466.
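The current-optimization half of such a coil design can be caricatured as linear least squares: given a coupling matrix of field harmonics produced per unit coil current, choose currents that best reproduce a target spectrum. The matrix and target below are random stand-ins, and FOCUS additionally optimizes the coil space-curves themselves, which is a nonlinear problem well beyond this sketch.

```python
import numpy as np

# Toy stand-in for coil-current optimization: M[i, j] is the i-th field
# harmonic driven per unit current in coil j (random here), and b is the
# target dominant-mode spectrum (also made up).
rng = np.random.default_rng(0)
n_harmonics, n_coils = 12, 6
M = rng.standard_normal((n_harmonics, n_coils))
b = rng.standard_normal(n_harmonics)

# Currents minimizing ||M @ I - b||^2; geometry is held fixed.
I, residual, rank, _ = np.linalg.lstsq(M, b, rcond=None)
```

With more target harmonics than coils the fit is overdetermined, so a residual remains; enlarging the coil set or reshaping the coils (the nonlinear part FOCUS handles) is what drives that residual down further.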
How is shared decision-making defined among African-Americans with diabetes?
Peek, Monica E; Quinn, Michael T; Gorawara-Bhat, Rita; Odoms-Young, Angela; Wilson, Shannon C; Chin, Marshall H
2008-09-01
This study investigates how shared decision-making (SDM) is defined by African-American patients with diabetes, and compares patients' conceptualization of SDM with the Charles model. We utilized race-concordant interviewers/moderators to conduct in-depth interviews and focus groups among a purposeful sample of African-American patients with diabetes. Each interview/focus group was audio-taped, transcribed verbatim and imported into Atlas.ti software. Coding was done using an iterative process and each transcription was independently coded by two members of the research team. Although the conceptual domains were similar, patient definitions of what it means to "share" in the decision-making process differed significantly from the Charles model of SDM. Patients stressed the value of being able to "tell their story and be heard" by physicians, emphasized the importance of information sharing rather than decision-making sharing, and included an acceptable role for non-adherence as a mechanism to express control and act on treatment preferences. Current instruments may not accurately measure decision-making preferences of African-American patients with diabetes. Future research should develop instruments to effectively measure decision-making preferences within this population. Emphasizing information-sharing that validates patients' experiences may be particularly meaningful to African-Americans with diabetes.
ERIC Educational Resources Information Center
Maddox, Alexia; Zhao, Linlin
2017-01-01
This case study presents a conceptual model of researcher performance developed by Deakin University Library, Australia. The model aims to organize research performance data into meaningful researcher profiles, referred to as researcher typologies, which support the demonstration of research impact and value. Three dimensions shaping researcher…
NASA Astrophysics Data System (ADS)
Parrott, Annette M.
Problem. Science teachers are charged with preparing students to become scientifically literate individuals. Teachers are given curriculum that specifies the knowledge that students should come away with; however, they are not necessarily aware of the knowledge with which the student arrives or how best to help them navigate between the two knowledge states. Educators must be aware not only of where their students are conceptually, but also of how their students move from their prior knowledge and naive theories to scientifically acceptable theories. The understanding of how students navigate this course has the potential to revolutionize educational practices. Methods. This study explored how five 9th grade biology students reconstructed their cognitive frameworks and navigated conceptual change from prior conception to consensual genetics knowledge. The research questions investigated were: (1) how do students in the process of changing their naive science theories to accepted science theories describe their journey from prior knowledge to current conception, and (2) what are the methods that students utilize to bridge the gap between alternate and consensual science conceptions to effect conceptual change. Qualitative and quantitative methods were employed to gather and analyze the data. In-depth, semi-structured interviews formed the primary data for probing the context and details of students' conceptual change experience. Primary interview data were coded using thematic analysis. Results and discussion. This study revealed information about students' perceived roles in learning, the role of articulation in the conceptual change process, and ways in which a community of learners aids conceptual change. It was ascertained that students see their role in learning primarily as repeating information until they can add that information to their knowledge.
Students are more likely to consider challenges to their conceptual frameworks and be more motivated to become active participants in constructing their knowledge when they are working collaboratively with peers instead of receiving instruction from their teacher. Articulation was found to be instrumental in aiding learners in identifying their alternate conceptions as well as in revisiting, investigating and reconstructing their conceptual frameworks. Based on the assumptions generated, suggestions were offered to inform pedagogical practice in support of the conceptual change process.
Impact of thorium based molten salt reactor on the closure of the nuclear fuel cycle
NASA Astrophysics Data System (ADS)
Jaradat, Safwan Qasim Mohammad
The molten salt reactor (MSR) is one of six reactor concepts selected by the Generation IV International Forum (GIF). The liquid fluoride thorium reactor (LFTR) is an MSR concept based on the thorium fuel cycle. LFTR uses liquid fluoride salts as nuclear fuel, with 232Th and 233U as the fertile and fissile materials, respectively. Fluoride salts of these nuclides are dissolved in a mixed carrier salt of lithium and beryllium fluorides (FLiBe). The objective of this research was to complete feasibility studies of a small commercial thermal LFTR. The focus was on neutronic calculations in order to prescribe core design parameters such as core size, fuel block pitch (p), fuel channel radius, fuel path, reflector thickness, fuel salt composition, and power. To achieve this objective, the applicability of the Monte Carlo N-Particle transport code (MCNP) to MSR modeling was first verified: MCNP was used to study the reactor physics characteristics of the FUJI-U3 reactor, and the results were compared with those obtained for the original FUJI-U3 using the reactor physics code SRAC95 and the burnup analysis code ORIPHY2. The two sets of results were comparable, so MCNP was judged a reliable code for modeling a small thermal LFTR and studying the related reactor physics characteristics. A conceptual small thermal LFTR was then prescribed, and the relevant calculations were performed with MCNP to determine the main neutronic parameters of the reactor core. The results were promising and successful in demonstrating a preliminary small commercial LFTR design: a core with a diameter/height of 280/260 cm that would operate for more than five years at a power level of 150 MWth. The fuel system 7LiF - BeF2 - ThF4 - UF4 with 233U/232Th = 2.01% was the candidate fuel for this reactor core.
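The analog Monte Carlo transport that underlies codes like MCNP can be sketched for a toy problem: one-group neutrons in a 1-D slab, sampling free flights and then absorbing or scattering at each collision. The slab thickness and absorption probability are invented, and this one-group, one-dimensional model is far simpler than an MCNP reactor model.

```python
import math
import random

def slab_transmission(thickness_mfp=2.0, absorb_prob=0.3, n=20000, seed=1):
    """Analog Monte Carlo transport through a 1-D slab (toy MCNP-style model).

    Distances are measured in mean free paths; scattering is isotropic in mu.
    Returns the fraction of source neutrons transmitted through the slab.
    """
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                            # born at left face, moving right
        while True:
            x += -math.log(1.0 - rng.random()) * mu # sample free-flight distance
            if x < 0.0:                             # leaked back out the left face
                break
            if x > thickness_mfp:                   # escaped the right face
                transmitted += 1
                break
            if rng.random() < absorb_prob:          # absorbed at this collision
                break
            mu = 2.0 * rng.random() - 1.0           # isotropic scatter
    return transmitted / n
```

As expected physically, a thinner slab transmits a larger fraction of neutrons; production codes layer continuous-energy cross sections, 3-D geometry, and variance reduction on top of exactly this random-walk kernel.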
Perceptual Processing Affects Conceptual Processing
ERIC Educational Resources Information Center
van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.
2008-01-01
According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…
Results from conceptual design study of potential early commercial MHD/steam power plants
NASA Technical Reports Server (NTRS)
Hals, F.; Kessler, R.; Swallom, D.; Westra, L.; Zar, J.; Morgan, W.; Bozzuto, C.
1981-01-01
This paper presents conceptual design information for a potential early MHD power plant developed in the second phase of a joint study of such plants. Conceptual designs of plant components and equipment, with their performance, operational characteristics, and costs, are reported. Plant economics and overall performance, including full- and part-load operation, are reviewed, as are environmental aspects and the methods incorporated in the plant design for emission control of sulfur and nitrogen oxides. Results from the reliability/availability analysis conducted are also included.
Coupling Conceptual and Quantitative Problems to Develop Expertise in Introductory Physics Students
NASA Astrophysics Data System (ADS)
Singh, Chandralekha
2008-10-01
We discuss the effect of administering conceptual and quantitative isomorphic problem pairs (CQIPP) back-to-back vs. asking students to solve only one of the problems in the CQIPP in introductory physics courses. Students who answered both questions in a CQIPP often performed better on the conceptual questions than those who answered the corresponding conceptual questions only. Although students often took advantage of the quantitative counterpart to answer a conceptual question of a CQIPP correctly, when given only the conceptual question, students seldom tried to convert it into a quantitative question, solve it, and then reason about the solution conceptually. Even in individual interviews, when students who were given only conceptual questions had difficulty and the interviewer explicitly encouraged them to convert the conceptual question into the corresponding quantitative problem by choosing appropriate variables, a majority of students were reluctant to do so and preferred to guess the answer to the conceptual question based on their gut feeling.
Torres-Montúfar, Alejandro; Borsch, Thomas; Ochoterena, Helga
2018-05-01
The conceptualization and coding of characters is a difficult issue in phylogenetic systematics, no matter which inference method is used when reconstructing phylogenetic trees or whether the characters are simply mapped onto a specific tree. Complex characters are groups of features that can be divided into simpler, hierarchical characters (reductive coding), although the implied hierarchical relational information may change depending on the type of coding (composite vs. reductive). Up to now, there is no common agreement on whether to code characters as complex or simple. Phylogeneticists have discussed which coding method is best, but have not incorporated the heuristic process of reciprocal illumination to evaluate the coding. Composite coding makes it possible to test whether (1) several characters were linked, resulting in a structure described as a complex character or trait, or (2) independently evolving characters resulted in a configuration incorrectly interpreted as a complex character. We propose that complex characters or character states should be decomposed iteratively into simpler characters when the original homology hypothesis is not corroborated by a phylogenetic analysis and the character or character state is retrieved as homoplastic. We tested this approach using the case of fruit types within subfamily Cinchonoideae (Rubiaceae). The iterative reductive coding of characters associated with drupes allowed us to unthread fruit evolution within Cinchonoideae. Our results show that drupes and berries are not homologous. As a consequence, a more precise ontology for the Cinchonoideae drupes is required.
Effects of Divided Attention at Retrieval on Conceptual Implicit Memory
Prull, Matthew W.; Lawless, Courtney; Marshall, Helen M.; Sherman, Annabella T. K.
2016-01-01
This study investigated whether conceptual implicit memory is sensitive to process-specific interference at the time of retrieval. Participants performed the implicit memory test of category exemplar generation (CEG; Experiments 1 and 3), or the matched explicit memory test of category-cued recall (Experiment 2), both of which are conceptually driven memory tasks, under one of two divided attention (DA) conditions in which participants simultaneously performed a distracting task. The distracting task was either syllable judgments (dissimilar processes), or semantic judgments (similar processes) on unrelated words. Compared to full attention (FA) in which no distracting task was performed, DA had no effect on CEG priming overall, but reduced category-cued recall similarly regardless of distractor task. Analyses of distractor task performance also revealed differences between implicit and explicit memory retrieval. The evidence suggests that, whereas explicit memory retrieval requires attentional resources and is disrupted by semantic and phonological distracting tasks, conceptual implicit memory is automatic and unaffected even when distractor and memory tasks involve similar processes.
Computer Code For Turbocompounded Adiabatic Diesel Engine
NASA Technical Reports Server (NTRS)
Assanis, D. N.; Heywood, J. B.
1988-01-01
Computer simulation developed to study advantages of increased exhaust enthalpy in adiabatic turbocompounded diesel engine. Subsystems of conceptual engine include compressor, reciprocator, turbocharger turbine, compounded turbine, ducting, and heat exchangers. Focus of simulation of total system is to define transfers of mass and energy, including release and transfer of heat and transfer of work in each subsystem, and relationships among subsystems. Written in FORTRAN IV.
Connecting to Get Things Done: A Conceptual Model of the Process Used to Respond to Bias Incidents
ERIC Educational Resources Information Center
LePeau, Lucy A.; Morgan, Demetri L.; Zimmerman, Hilary B.; Snipes, Jeremy T.; Marcotte, Beth A.
2016-01-01
In this study, we interviewed victims of bias incidents and members of a bias response team to investigate the process the team used to respond to incidents. Incidents included acts of sexism, homophobia, and racism on a large, predominantly White research university in the Midwest. Data were analyzed using a 4-stage coding process. The emergent…
DEVELOPMENT OF PERMANENT MECHANICAL REPAIR SLEEVE FOR PLASTIC PIPE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hitesh Patadia
2004-09-30
The report presents a comprehensive summary of the project status related to the development of a permanent mechanical repair fitting intended to be installed on damaged PE mains under blowing gas conditions. Specifically, the product definition has been developed taking into account relevant codes and standards and industry input. A conceptual design for the mechanical repair sleeve has been developed which meets the product definition.
Investigation of Near Shannon Limit Coding Schemes
NASA Technical Reports Server (NTRS)
Kwatra, S. C.; Kim, J.; Mo, Fan
1999-01-01
Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes; both coding schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction, which reviews fundamental knowledge about coding, block coding, and convolutional coding. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high-rate turbo codes, is provided from the simulation results. After introducing all the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver, and the puncturing pattern are examined, and a criterion for the best selection of system components is provided. The puncturing-pattern algorithm is discussed in detail, and different puncturing patterns are compared for each high rate. For most high-rate codes, the puncturing pattern does not have a significant effect on code performance if a pseudo-random interleaver is used in the system. For some special-rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance to near the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values are discussed.
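The puncturing operation discussed in this report — deleting coded bits in a fixed pattern to raise the code rate — can be illustrated with a small sketch. The pattern and bit values below are invented for illustration, not taken from the report.

```python
# Hedged sketch: puncturing a rate-1/2 encoder's output stream to raise the
# rate. The pattern here is illustrative, not one evaluated in the report.

def puncture(bits, pattern):
    """Keep bits[i] wherever pattern[i % len(pattern)] == 1; delete the rest."""
    return [b for i, b in enumerate(bits) if pattern[i % len(pattern)] == 1]

# A rate-1/2 code emits 2 coded bits per input bit. Deleting every fourth
# coded bit (pattern 1,1,1,0) turns 4 input bits -> 8 coded bits -> 6 sent
# bits, i.e. an overall rate of 4/6 = 2/3.
coded = [1, 0, 1, 1, 0, 0, 1, 0]   # invented output of a rate-1/2 encoder
sent = puncture(coded, [1, 1, 1, 0])
```

The decoder reinserts neutral (erased) values at the punctured positions before running the usual trellis decoding, which is why puncturing changes the rate without changing the encoder structure.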
Differential verbal, visual, and spatial working memory in written language production.
Raulerson, Bascom A; Donovan, Michael J; Whiteford, Alison P; Kellogg, Ronald T
2010-02-01
The contributions of verbal, visual, and spatial working memory to written language production were investigated. Participants composed definitions for nouns while concurrently performing a task that required updating, storing, and retrieving information coded verbally, visually, or spatially. The present study extended past findings by showing that the linguistic encoding of planned conceptual content makes its largest demand on verbal working memory for both low- and high-frequency nouns. Kellogg, Olive, and Piolat (2007) found that concrete nouns place substantial demands on visual working memory when the nouns' referents are imaged during planning, whereas abstract nouns make no such demand. The current study further showed that this pattern was not an artifact of visual working memory being sensitive to the manipulation of just any lexical property of the noun prompts. In contrast to past results, writing made a small but detectable demand on spatial working memory.
Stenberg, Nicola; Furness, Penny J
2017-03-01
The outcomes of self-management interventions are commonly assessed using quantitative measurement tools, and few studies ask people with long-term conditions to explain, in their own words, what aspects of the intervention they valued. In this Grounded Theory study, a Health Trainers service in the north of England was evaluated based on interviews with eight service-users. Open, focused, and theoretical coding led to the development of a preliminary model explaining participants' experiences and perceived impact of the service. The model reflects the findings that living well with a long-term condition encompassed social connectedness, changed identities, acceptance, and self-care. Health trainers performed four related roles that were perceived to contribute to these outcomes: conceptualizer, connector, coach, and champion. The evaluation contributes a grounded theoretical understanding of a personalized self-management intervention that emphasizes the benefits of a holistic approach to enable cognitive, behavioral, emotional, and social adjustments.
NASA Astrophysics Data System (ADS)
Reinartz-Estrada, Monica
Based on difficulties observed with technical-scientific conceptualization and the integration of theory and practice in learning animal physiology among students in the Animal Science program at the National University of Colombia in Medellin, this research proposes a Problem-Based Learning (PBL) strategy applied specifically to thermoregulation and physiological stress in domestic animals. In this case study, a sample of eight students was presented with a pedagogical problem during the first session, to be solved over the course. To evaluate the process, three surveys, called Level of Formulation tests (NF), were administered at different times in the trial: one before beginning the topic (NF 1), one after three theoretical classes had been given and before beginning the fieldwork (NF 2), and one at the end of the process (NF 3). Finally, individual interviews were conducted with each student to learn the students' perceptions of the method. The information obtained was subjected to qualitative analysis and categorization using the QDA Miner program, with which texts from the surveys and individual interviews were reviewed and coded, supplemented in turn by field observation, analyzing conceptual change, the theory-practice relationship, and the correlation between the established variables and categories. Among the main results, it should be noted that following the implementation of PBL in this Animal Physiology course, support for conceptual change was demonstrated and the formulated problem served as a connector between theory and practice.
Moreover, there was a fusion of prior knowledge with newly acquired knowledge, meaningful learning, improvement in the level of conceptualization, and an increase in the scientific quality of definitions; the approach also led to problem solving and to overcoming epistemological obstacles such as multidisciplinarity and nonlinearity. As a result of this research, it is recommended that this method be evaluated in other topics related to animal physiology, in other sciences, and with larger samples, and that the issue of evaluation applied directly to this method be addressed. Key words: Problem-Based Learning (PBL), conceptual change, integration of theory and practice, meaningful learning, animal physiology, thermoregulation, physiological stress.
Conceptual model for collision detection and avoidance for runway incursion prevention
NASA Astrophysics Data System (ADS)
Latimer, Bridgette A.
The Federal Aviation Administration (FAA), the National Transportation Safety Board (NTSB), the National Aeronautics and Space Administration (NASA), and numerous corporate entities and research facilities have come together to determine ways to make air travel safer and more efficient. These efforts have resulted in the development of a concept known as the Next Generation (Next Gen) of aircraft operations. The Next Gen concept promises to be a clear departure from the way in which aircraft operations are performed today. The Next Gen initiatives require modifications to the existing National Airspace System (NAS) concept of operations, system-level requirements, software (SW) and hardware (HW) requirements, and SW and HW designs and implementations. One example of these changes in the NAS is the shift away from air traffic controllers having the responsibility for separation assurance. In the proposed new scheme of free flight, each aircraft would be responsible for assuring that it is safely separated from surrounding aircraft. Free flight would allow the vertical separation minima for enroute aircraft to be reduced from 2,000 feet to 1,000 feet. Simply put, "Free Flight is a concept of air traffic management that permits pilots and controllers to share information and work together to manage air traffic from pre-flight through arrival without compromising safety [107]." The primary goal of this research project was to create a conceptual model that embodies the essential ingredients needed for a collision detection and avoidance system. This system was required to operate in two modes: the air traffic controller's perspective and the pilot's perspective. The secondary goal was to demonstrate that the technologies, procedures, and decision logic embedded in the conceptual model were able to effectively detect and avoid collision risks from both perspectives.
Embodied in the conceptual model are five distinct software modules: Data Acquisition, State Processor, Projection, Collision Detection, and Alerting and Resolution. The underlying algorithms in the Projection module are linear projection and Kalman filtering, which are used to estimate the future state of the aircraft. The Alerting and Resolution module is comprised of two algorithms: a generic alerting algorithm and the potential fields algorithm [71]. The conceptual model was created using Enterprise Architect, and MATLAB was used to code the methods and to simulate conflict scenarios.
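The linear-projection step described for the Projection module can be sketched minimally: propagate each aircraft's state forward at constant velocity and flag a conflict when two projected positions fall inside a protection radius. This Python toy stands in for the MATLAB implementation; all positions, velocities, and the radius are invented.

```python
# Minimal sketch of constant-velocity linear projection plus a distance-based
# conflict test. Values are illustrative, not from the dissertation's model.

def project(pos, vel, dt):
    """Straight-line projection of a 2-D position over a look-ahead time dt."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

def conflict(p1, p2, radius):
    """True when two projected positions are closer than the protection radius."""
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    return (dx * dx + dy * dy) ** 0.5 < radius

a = project((0.0, 0.0), (1.0, 0.0), dt=10.0)    # aircraft A, 10 units ahead
b = project((10.0, 5.0), (0.0, -0.4), dt=10.0)  # aircraft B, converging
risk = conflict(a, b, radius=3.0)
```

A Kalman filter replaces the raw `project` step when surveillance data are noisy: it keeps a running state estimate and covariance, so the look-ahead position comes with an uncertainty bound rather than a single point.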
Status report on the development of a tubular electron beam ion source
NASA Astrophysics Data System (ADS)
Donets, E. D.; Donets, E. E.; Becker, R.; Liljeby, L.; Rensfelt, K.-G.; Beebe, E. N.; Pikin, A. I.
2004-05-01
The theoretical estimations and numerical simulations of tubular electron beams in both the beam and reflex modes of source operation, as well as the off-axis ion extraction from a tubular electron beam ion source (TEBIS), are presented. Numerical simulations have been done with the IGUN and OPERA-3D codes. Simulations with the IGUN code show that the effective electron current can reach more than 100 A, with a beam current density of about 300-400 A/cm² and an electron energy in the region of several keV, with a corresponding increase in the ion output. Off-axis ion extraction from the TEBIS, being a non-axially-symmetric problem, was simulated with the OPERA-3D (SCALA) code. The conceptual design and main parameters of the new tubular sources under consideration at JINR, MSL, and BNL are based on these simulations.
Schultz, Wolfram
2004-04-01
Neurons in a small number of brain structures detect rewards and reward-predicting stimuli and are active during the expectation of predictable food and liquid rewards. These neurons code the reward information according to basic terms of various behavioural theories that seek to explain reward-directed learning, approach behaviour and decision-making. The involved brain structures include groups of dopamine neurons, the striatum including the nucleus accumbens, the orbitofrontal cortex and the amygdala. The reward information is fed to brain structures involved in decision-making and organisation of behaviour, such as the dorsolateral prefrontal cortex and possibly the parietal cortex. The neural coding of basic reward terms derived from formal theories puts the neurophysiological investigation of reward mechanisms on firm conceptual grounds and provides neural correlates for the function of rewards in learning, approach behaviour and decision-making.
Gene and genon concept: coding versus regulation
2007-01-01
We analyse here the definition of the gene in order to distinguish, on the basis of modern insight in molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach thus led to a conceptual hybrid that confused coding, regulation and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated sequence fragments at DNA level to that final mRNA then can be analysed in terms of regulation. For that purpose, we coin the new term “genon”. In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis; we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at DNA level.
Rather, it is assembled by RNA processing, including differential splicing, from various pieces, as steered by the genon. It emerges finally as an uninterrupted nucleic acid sequence at mRNA level just prior to translation, in faithful correspondence with the amino acid sequence to be produced as a polypeptide. After translation, the genon has fulfilled its role and expires. The distinction between the protein coding information as materialised in the final polypeptide and the processing information represented by the genon allows us to set up a new information theoretic scheme. The standard sequence information determined by the genetic code expresses the relation between coding sequence and product. Backward analysis asks from which coding region in the DNA a given polypeptide originates. The (more interesting) forward analysis asks in how many polypeptides of how many different types a given DNA segment is expressed. This concerns the control of the expression process for which we have introduced the genon concept. Thus, the information theoretic analysis can capture the complementary aspects of coding and regulation, of gene and genon. PMID:18087760
Development of photovoltaic array and module safety requirements
NASA Technical Reports Server (NTRS)
1982-01-01
Safety requirements for photovoltaic module and panel designs and configurations likely to be used in residential, intermediate, and large-scale applications were identified and developed. The National Electrical Code and building codes were reviewed with respect to present provisions which may be considered to affect the design of photovoltaic modules. Limited testing, primarily in the field of roof fire resistance, was conducted. Additional studies and further investigations led to the development of a proposed standard for safety for flat-plate photovoltaic modules and panels. Additional work covered the initial investigation of conceptual approaches and the temporary deployment, for concept verification purposes, of a differential dc ground-fault detection circuit suitable as part of a photovoltaic array safety system.
Maximum likelihood decoding analysis of Accumulate-Repeat-Accumulate Codes
NASA Technical Reports Server (NTRS)
Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung
2004-01-01
Repeat-Accumulate (RA) codes are the simplest turbo-like codes that achieve good performance. However, they cannot compete with turbo codes or low-density parity check (LDPC) codes as far as performance is concerned. The Accumulate-Repeat-Accumulate (ARA) codes, a subclass of LDPC codes, are obtained by adding a precoder in front of punctured RA codes, where an accumulator is chosen as the precoder. These codes are not only very simple but also achieve excellent performance with iterative decoding. In this paper, the performance of these codes under maximum likelihood (ML) decoding is analyzed and compared to random codes by means of very tight bounds. The weight distribution of some simple ARA codes is obtained, and through the existing tightest bounds we show that the ML SNR threshold of ARA codes approaches the performance of random codes very closely. We also show that the use of the precoder improves the SNR threshold, while the interleaving gain remains unchanged with respect to the punctured RA code.
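The plain repeat-accumulate structure that ARA codes extend with a precoder can be sketched in a few lines: repeat each input bit, pass the stream through an interleaver, then accumulate (a running XOR). The repetition factor, interleaver, and input below are invented for illustration.

```python
# Hedged sketch of a plain repeat-accumulate (RA) encoder. The ARA scheme in
# the paper adds an accumulator precoder and puncturing, omitted here.

def ra_encode(bits, q, interleaver):
    """Repeat each bit q times, permute via the interleaver, then accumulate."""
    repeated = [b for b in bits for _ in range(q)]   # repetition stage
    permuted = [repeated[i] for i in interleaver]    # interleaver stage
    out, acc = [], 0
    for b in permuted:                               # accumulator: y_i = y_{i-1} ^ x_i
        acc ^= b
        out.append(acc)
    return out

# Toy example: 2 input bits, repetition factor 2, a fixed 4-entry interleaver
codeword = ra_encode([1, 0], q=2, interleaver=[2, 0, 3, 1])
```

In practice the interleaver is pseudo-random and long; the accumulator is what gives the code its interleaving gain under iterative decoding.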
Kindergarten students' explanations during science learning
NASA Astrophysics Data System (ADS)
Harris, Karleah
The study examines kindergarten students' explanations during science learning. The data on children's explanations are drawn from videotaped and transcribed discourse collected from four public kindergarten science classrooms engaged in a life science inquiry unit on the life cycle of the monarch butterfly. The inquiry unit was implemented as part of a larger intervention conducted under the Scientific Literacy Project, or SLP (Mantzicopoulos, Patrick & Samarapungavan, 2005). The children's explanation data were coded and analyzed using quantitative content analysis procedures. The coding procedures involved initial "top down" explanation categories derived from the existing theoretical and empirical literature on scientific explanation and the nature of students' explanations, followed by an inductive, "bottom up" analysis that evaluated and refined the categorization scheme as needed. The analyses provide important descriptive data on the nature and frequency of children's explanations generated in classroom discourse during the inquiry unit. The study also examines how teacher discourse strategies during classroom science discourse are related to children's explanations. Teacher discourse strategies were coded and analyzed following the same procedures as the children's explanations. The results suggest that a) kindergarten students are capable of generating a variety of explanations during inquiry-based science learning; b) teachers use a variety of classroom discourse strategies to support children's explanations during inquiry-based science learning; and c) the ratio of conceptual discourse (e.g., asking for or modeling explanations, asking for clarifications) to non-conceptual discourse (e.g., classroom management discourse) is related to the ratio of explanatory to non-explanatory discourse produced by children during inquiry-based science learning.
The future of 3D and video coding in mobile and the internet
NASA Astrophysics Data System (ADS)
Bivolarski, Lazar
2013-09-01
The success of the Internet has already changed our social and economic world and continues to revolutionize information exchange. The exponential increase in the amount and types of data currently exchanged on the Internet presents a significant challenge for the design of future architectures and solutions. This paper reviews the current status of, and trends in, the design of solutions and research activities for the future Internet from the point of view of managing the growth in bandwidth requirements and the complexity of the multimedia being created and shared. It outlines the challenges facing video coding and approaches to the design of standardized media formats and protocols, while considering the expected convergence of multimedia formats and exchange interfaces. The rapid growth of connected mobile devices adds to the current and future challenges, in combination with the arrival, expected in the near future, of a multitude of connected devices. The new Internet technologies connecting the Internet of Things with wireless visual sensor networks and 3D virtual worlds require conceptually new approaches to media content handling, from acquisition to presentation, in the 3D Media Internet. Accounting for the properties of the entire transmission system and enabling real-time adaptation to context and content throughout the media processing path will be paramount in enabling the new media architectures as well as the new applications and services. The common video coding formats will need to be conceptually redesigned to allow the implementation of the necessary 3D Media Internet features.
Expanding the Andersen Model: The Role of Psychosocial Factors in Long-Term Care Use
Bradley, Elizabeth H; McGraw, Sarah A; Curry, Leslie; Buckser, Alison; King, Kinda L; Kasl, Stanislav V; Andersen, Ronald
2002-01-01
Objective To examine a prevailing conceptual model of health services use (Andersen 1995) and to suggest modifications that may enhance its explanatory power when applied to empirical studies of race/ethnicity and long-term care. Study Setting Twelve focus groups of African-American (five groups) and white (seven groups) individuals, aged 65 and older, residing in Connecticut during 2000. Study Design Using qualitative analysis, data were coded and analyzed in NUD-IST 4 software to facilitate the reporting of recurrent themes, supporting quotations, and links among the themes for developing the conceptual framework. Specific analysis was conducted to assess distinctions in common themes between African-American and white focus groups. Data Collection Data were collected using a standardized discussion guide, augmented by prompts for clarification. Audio taped sessions were transcribed and independently coded by investigators and crosschecked to enhance coding validity. An audit trail was maintained to document analytic decisions during data analysis and interpretation. Principal Findings Psychosocial factors (e.g., attitudes and knowledge, social norms, and perceived control) are identified as determinants of service use, thereby expanding the Andersen model (1995). African-American and white focus group members differed in their reported accessibility of information about long-term care, social norms concerning caregiving expectations and burden, and concerns of privacy and self-determination. Conclusions More comprehensive identification of psychosocial factors may enhance our understanding of the complex role of race/ethnicity in long-term care use as well as the effectiveness of policies and programs designed to address disparities in long-term care service use among minority and nonminority groups. PMID:12479494
Sensemaking, stakeholder discord, and long-term risk communication at a US Superfund site.
Hoover, Anna Goodman
2017-03-01
Risk communication can help reduce exposures to environmental contaminants, mitigate negative health outcomes, and inform community-based decisions about hazardous waste sites. While communication best practices have long guided such efforts, little research has examined unintended consequences arising from such guidelines. As rhetoric informs stakeholder sensemaking, the language used in and reinforced by these guidelines can challenge relationships and exacerbate stakeholder tensions. This study evaluates risk communication at a U.S. Superfund site to identify unintended consequences arising from current risk communication practices. This qualitative case study crystallizes data spanning 6 years from three sources: 1) local newspaper coverage of site-related topics; 2) focus-group transcripts from a multi-year project designed to support future visioning of site use; and 3) published blog entries authored by a local environmental activist. Constant comparative analysis provides the study's analytic foundation, with qualitative data analysis software QSR NVivo 8 supporting a three-step process: 1) provisional coding to identify broad topic categories within datasets, 2) coding occurrences of sensemaking constructs and emergent intra-dataset patterns, and 3) grouping related codes across datasets to examine the relationships among them. Existing risk communication practices at this Superfund site contribute to a dichotomous conceptualization of multiple and diverse stakeholders as members of one of only two categories: the government or the public. This conceptualization minimizes perceptions of capacity, encourages public commitment to stances aligned with a preferred group, and contributes to negative expectations that can become self-fulfilling prophecies. Findings indicate a need to re-examine and adapt risk communication guidelines to encourage more pluralistic understanding of the stakeholder landscape.
Telemetry: Summary of concept and rationale
NASA Astrophysics Data System (ADS)
1987-12-01
This report presents the concept and supporting rationale for the telemetry system developed by the Consultative Committee for Space Data Systems (CCSDS). The concepts, protocols and data formats developed for the telemetry system are designed for flight and ground data systems supporting conventional, contemporary free-flyer spacecraft. Data formats are designed with efficiency as a primary consideration, i.e., format overhead is minimized. The results reflect the consensus of experts from many space agencies. An overview of the CCSDS telemetry system introduces the notion of architectural layering to achieve transparent and reliable delivery of scientific and engineering sensor data (generated aboard space vehicles) to users located in space or on Earth. The system is broken down into two major conceptual categories: a packet telemetry concept and a telemetry channel coding concept. Packet telemetry facilitates data transmission from source to user in a standardized and highly automated manner. It provides a mechanism for implementing common data structures and protocols which can enhance the development and operation of space mission systems. Telemetry channel coding is a method by which data can be sent from a source to a destination by processing it in such a way that distinct messages are created which are easily distinguishable from one another. This allows reconstruction of the data with low error probability, thus improving performance of the channel.
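The channel-coding notion above — spacing codewords far apart so that noisy receptions remain distinguishable — can be illustrated with a toy repetition code. This is only an illustration of the principle, not the coding scheme the CCSDS recommendations actually specify.

```python
# Toy repetition code illustrating "distinguishable messages": the two
# codewords [0,0,0] and [1,1,1] differ in every position, so a single
# channel error still leaves the received word closer to its original.

def encode(bit, n=3):
    """Encode one bit as an n-fold repetition codeword."""
    return [bit] * n

def decode(received):
    """Majority vote: pick the codeword nearest in Hamming distance."""
    return 1 if sum(received) * 2 > len(received) else 0

word = encode(1)   # [1, 1, 1]
word[0] = 0        # channel flips one bit -> [0, 1, 1]
recovered = decode(word)
```

Real telemetry channel codes (convolutional, block) achieve the same separation of messages far more efficiently, but the decoding principle — map the received word to the nearest valid codeword — is the same.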
Carbohydrates and sports practice: a Twitter virtual ethnography
Rodríguez-Martín, Beatriz; Castillo, Carlos Alberto
2017-02-01
Introduction: Although carbohydrate consumption is a key factor in enhancing sport performance, intake levels seem to be questioned by some amateur athletes, leading them to develop an irrational aversion to carbohydrates known as “carbophobia”. On the other hand, food has given rise to virtual communities that serve as a source of knowledge and a way to exchange information. Despite this, very few studies have analysed the influence of social media on eating behaviours. Objectives: To understand the conceptualizations of carbohydrate intake and the eating patterns related to carbophobia expressed in amateur athletes’ Twitter accounts. Methods: Qualitative research designed following Hine’s virtual ethnography. Virtual immersion was used for data collection in open Twitter accounts, in a theoretical sample of tweets from amateur athletes. Discourse analysis of the narrative information in the tweets was carried out through an open, axial, and selective coding process and the constant comparison method. Results: Data analysis revealed four main categories that offered a picture of the conceptualizations of carbohydrates: carbohydrates as suspected or guilty of slowing down training, carbophobia as a lifestyle, carbophobia as a religion, and finally the love/hate relationship with carbohydrates. Conclusions: A low-carbohydrate diet is considered a healthy lifestyle by some amateur athletes. The results of this study show the power of virtual communication tools such as Twitter to support, promote, and maintain uncommon and not necessarily healthy eating behaviours. Future studies should focus on the context in which these practices appear.
Students' conceptual performance on synthesis physics problems with varying mathematical complexity
NASA Astrophysics Data System (ADS)
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-06-01
A body of research on physics problem solving has focused on single-concept problems. In this study we use "synthesis problems" that involve multiple concepts typically taught in different chapters. We use two types of synthesis problems, sequential and simultaneous synthesis tasks. Sequential problems require a consecutive application of fundamental principles, and simultaneous problems require a concurrent application of pertinent concepts. We explore students' conceptual performance when they solve quantitative synthesis problems with varying mathematical complexity. Conceptual performance refers to the identification, follow-up, and correct application of the pertinent concepts. Mathematical complexity is determined by the type and the number of equations to be manipulated concurrently due to the number of unknowns in each equation. Data were collected from written tasks and individual interviews administered to physics majors (N = 179) enrolled in a second year mechanics course. The results indicate that mathematical complexity does not impact students' conceptual performance on the sequential tasks. In contrast, for the simultaneous problems, mathematical complexity negatively influences the students' conceptual performance. This difference may be explained by the students' familiarity with and confidence in particular concepts coupled with the cognitive load associated with manipulating complex quantitative equations. Another explanation pertains to the type of synthesis problem, either sequential or simultaneous: the students split the situation presented in the sequential synthesis tasks into segments but treated the situation in the simultaneous synthesis tasks as a single event.
Conceptualizing Conceptual Teaching: Practical Strategies for Large Instrumental Ensembles
ERIC Educational Resources Information Center
Tan, Leonard
2016-01-01
Half a century ago, calls had already been made for instrumental ensemble directors to move beyond performance to include the teaching of musical concepts in the rehearsal hall. Relatively recent research, however, suggests that conceptual teaching remains relatively infrequent during rehearsals. Given the importance of teaching for long-term…
Conceptual Understanding of Multiplicative Properties through Endogenous Digital Game Play
ERIC Educational Resources Information Center
Denham, Andre
2012-01-01
The purpose of this study was to determine the effect of an endogenously designed instructional game on conceptual understanding of the associative and distributive properties of multiplication. Additionally, this study sought to investigate whether performance on measures of conceptual understanding taken prior to and after game play could serve as predictors of…
NASA Astrophysics Data System (ADS)
Zacharia, Zacharias C.; Lazaridou, Charalambia; Avraamidou, Lucy
2016-03-01
The purpose of this study was to examine the impact of mobile learning among young learners. Specifically, we investigated whether the use of mobile devices for data collection during field trips outside the classroom could enhance fourth graders' learning about the parts of the flower and their functions, flower pollinators and the process of pollination/fertilization, and the interrelationship between animals and plants, more than students' use of traditional means of data collection. For this purpose, we designed a pre-post experimental design study with two conditions: one in which participants used a mobile device for data collection and another using traditional means (e.g. sketching and note-taking). The sample comprised 48 fourth graders (24 in each condition), who studied the flower, its parts, and their functions. A conceptual test was administered to assess students' understanding before and after instruction. Moreover, the students' science notebooks and accompanying artifacts were used as a data source for examining students' progress during the study's intervention. The conceptual test and notebook data were analyzed statistically, whereas we used open coding for the artifacts. Findings revealed that using mobile devices for data collection enhanced students' conceptual understanding more than using traditional means of data collection.
Perceptual and conceptual information processing in schizophrenia and depression.
Dreben, E K; Fryer, J H; McNair, D M
1995-04-01
Schizophrenic patients (n = 20), depressive patients (n = 20), and normal adults (n = 20) were compared on global vs local analyses of perceptual information using tachistoscopic tasks and on top-down vs bottom-up conceptual processing using card-sort tasks. The schizophrenic group performed more poorly on tasks requiring either global analyses (counting lines when distracting circles were present) or top-down conceptual processing (rule learning) than they did on tasks requiring local analyses (counting heterogeneous lines) or bottom-up processing (attribute identification). The schizophrenic group appeared not to use conceptually guided processing. Normal adults showed the reverse pattern. The depressive group performed similarly to the schizophrenic group on perceptual tasks but closer to the normal group on conceptual tasks, thereby appearing to be less dependent on a particular information-processing strategy. These deficits in organizational strategy may be related to the use of available processing resources as well as the allocation of attention.
An empirically based conceptual framework for fostering meaningful patient engagement in research.
Hamilton, Clayon B; Hoens, Alison M; Backman, Catherine L; McKinnon, Annette M; McQuitty, Shanon; English, Kelly; Li, Linda C
2018-02-01
Patient engagement in research (PEIR) is promoted to improve the relevance and quality of health research, but has little conceptualization derived from empirical data. To address this issue, we sought to develop an empirically based conceptual framework for meaningful PEIR founded on a patient perspective. We conducted a qualitative secondary analysis of in-depth interviews with 18 patient research partners from a research centre-affiliated patient advisory board. Data analysis involved three phases: identifying the themes, developing a framework and confirming the framework. We coded and organized the data, and abstracted, illustrated, described and explored the emergent themes using thematic analysis. Directed content analysis was conducted to derive concepts from 18 publications related to PEIR to supplement, confirm or refute, and extend the emergent conceptual framework. The framework was reviewed by four patient research partners on our research team. Participants' experiences of working with researchers were generally positive. Eight themes emerged: procedural requirements, convenience, contributions, support, team interaction, research environment, feel valued and benefits. These themes were interconnected and formed a conceptual framework to explain the phenomenon of meaningful PEIR from a patient perspective. This framework, the PEIR Framework, was endorsed by the patient research partners on our team. The PEIR Framework provides guidance on aspects of PEIR to address for meaningful PEIR. It could be particularly useful when patient-researcher partnerships are led by researchers with little experience of engaging patients in research. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.
van Rensburg, Elsie S Janse; Poggenpoel, Marie; Myburgh, Chris
2015-11-25
Student nurses (SNs) experience emotional discomfort during placement in the clinical psychiatric learning environment. This may negatively influence their mental health. Limited support is available to assist both SNs working with persons with intellectual disabilities and nurse educators during clinical accompaniment. This article aims to discuss the generation of a conceptual framework to enhance student support. A theory-generative, qualitative, exploratory, descriptive, contextual design was utilised to develop the framework by applying four steps. In step 1, concept analysis identified the central concept through field work. Data were collected from 13 SNs purposively selected from a specific higher educational institution in Gauteng through two focus group interviews, reflective journals, a reflective letter, naïve sketches, drawings and field notes, and analysed with thematic coding. The central concept was identified from the results, supported by a literature review and defined by essential attributes. The central concept was classified through a survey list and demonstrated in a model case. In step 2, the identified concepts were placed into relationships with each other. The conceptual framework was described and evaluated in step 3, and guidelines for implementation were described in step 4. The focus of this article is on generating the conceptual framework. The central concept was 'the facilitation of engagement on a deeper emotional level of SNs'. The conceptual framework was described and evaluated. The conceptual framework can enhance the educational practices of nurse educators and SNs' practices of care for persons with intellectual disabilities.
Conceptual Model for Quality of Life among Adults With Congenital or Early Deafness
Kushalnagar, P; McKee, M; Smith, SR; Hopper, M; Kavin, D; Atcherson, SR
2015-01-01
Background: A conceptual model of health-related quality of life (QoL) is needed to describe key themes that impact perceived QoL in adults with congenital or early deafness. Objective: To revise University of Washington Center for Disability Policy and Research's conceptual model of health promotion and QoL, with suggestions for applying the model to improving programs or services that target deaf adults with early deafness. Methods: Purposive and theoretical sampling of 35 adults who were born or became deaf early was planned in a 1-year study. In-depth semi-structured interviews probed deaf adult participants' perceptions about quality of life as a deaf individual. Data saturation was reached at the 17th interview with 2 additional interviews for validation, resulting in a total sample of 19 deaf adults. Coding and thematic analysis were conducted to develop the conceptual model. Results: Our conceptual model delineates the relationships between health status (self-acceptance, coping with limitations), intrinsic (functional communication skills, navigating barriers/self-advocacy, resilience) and extrinsic (acceptance by others, access to information, educating others) factors in their influence on deaf adult quality of life outcomes at home, college, work, and in the community. Conclusions: Findings demonstrate the need for the programs and services to consider not only factors intrinsic to the deaf individual but also extrinsic factors in enhancing perceived quality of life outcomes among people with a range of functional hearing and language preferences, including American Sign Language. PMID:24947577
Conceptual Design Oriented Wing Structural Analysis and Optimization
NASA Technical Reports Server (NTRS)
Lau, May Yuen
1996-01-01
Airplane optimization has always been the goal of airplane designers. In the conceptual design phase, a designer must trade off between maximum structural integrity, minimum aerodynamic drag, and maximum stability and control, goals often pursued separately. Bringing all of these factors into an iterative preliminary design procedure was time consuming, tedious, and not always accurate. For example, the final weight estimate would often be based upon statistical data from past airplanes. The new design would be classified based on gross characteristics, such as number of engines, wingspan, etc., to see which airplanes of the past most closely resembled the new design. This procedure works well for conventional airplane designs, but not very well for new innovative designs. With the computing power of today, new methods are emerging for the conceptual design phase of airplanes. Using finite element methods, computational fluid dynamics, and other computer techniques, designers can make very accurate disciplinary analyses of an airplane design. These tools are computationally intensive, and when used repeatedly, they consume a great deal of computing time. In order to reduce the time required to analyze a design and still bring together all of the disciplines (such as structures, aerodynamics, and controls) into the analysis, simplified design computer analyses are linked together into one computer program. These design codes are very efficient for conceptual design. The work in this thesis is focused on a finite element based conceptual design oriented structural synthesis capability (CDOSS) tailored to be linked into ACSYNT.
Basic physical processes and reduced models for plasma detachment
NASA Astrophysics Data System (ADS)
Stangeby, P. C.
2018-04-01
The divertor of a tokamak reactor will have to satisfy a number of critical constraints, the first of which is that the divertor targets not fail due to excessive heating or sputter-erosion. This paramount constraint of target survival defines the operating window for the principal plasma properties at the divertor target, the density n_t and temperature T_t. In particular, T_et < 10 eV is shown to be required. Code and experimental studies show that the pressure–momentum loss by the plasma that occurs along flux tubes in the edge, between the divertor entrance and target, (i) correlates strongly with T_et, and (ii) begins to increase as T_et falls below 10 eV, becoming very strong by 1 eV. The transition between the high-recycling regime and the detached divertor regime has therefore been defined here to occur when T_et < 10 eV. Simple analytic models are developed (i) to relate (T_t, n_t) to the controlling conditions ‘upstream’ e.g. at the divertor entrance, and (ii) in turn to relate (T_t, n_t) to other important divertor quantities including (a) the required level of radiative cooling in the divertor, and (b) the ion flux to the target in the presence of volumetric loss of particles, momentum and power in the divertor. The 2 Point Model, 2PM, is a widely used analytic model for relating (T_t, n_t) to the controlling upstream conditions. The 2PM is derived here for various levels of complexity regarding the effects included. Analytic models of divertor detachment provide valuable insight and useful approximations, but more complete modeling requires the use of edge codes such as EDGE2D, SOLPS, SONIC, UEDGE, etc. Edge codes have grown to become quite sophisticated and now constitute, in effect, ‘code-experiments’ that—just as for actual experiments—can benefit from interpretation in terms of simple conceptual frameworks. 2 Point Model Formatting, 2PMF, of edge code output can provide such a conceptual framework.
Methods of applying 2PMF are illustrated here with some examples.
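The three balances underlying the basic 2PM (parallel heat conduction, total pressure conservation, and sheath heat transmission) can be sketched numerically. The following is an illustrative sketch only, not code from the paper; the conductivity coefficient, the sheath transmission factor gamma = 7, the assumption T_e = T_i, and the deuterium ion mass are typical textbook choices, not values taken from this abstract.

```python
import math

K0 = 2000.0          # electron parallel heat conductivity coefficient [W / (m eV^3.5)]
E = 1.602e-19        # elementary charge [J per eV]
M_I = 2 * 1.67e-27   # ion mass, deuterium assumed [kg]

def two_point_model(q_par, L, n_u, gamma=7.0, tol=1e-10, max_iter=500):
    """Basic two-point model with no volumetric losses, T_e = T_i assumed.

    q_par : parallel heat flux density entering the flux tube [W/m^2]
    L     : connection length, upstream point to target [m]
    n_u   : upstream plasma density [m^-3]
    Returns (T_u, T_t, n_t): upstream/target temperatures [eV], target density [m^-3].
    """
    T_t = 1.0  # initial guess [eV]
    for _ in range(max_iter):
        # Parallel heat conduction: T_u^3.5 = T_t^3.5 + 3.5 * q_par * L / K0
        T_u = (T_t ** 3.5 + 3.5 * q_par * L / K0) ** (2.0 / 7.0)
        # Sheath power transmission q = gamma * n_t * e * T_t * c_s(T_t),
        # combined with pressure balance n_t * T_t = n_u * T_u / 2:
        T_t_new = (M_I / (2.0 * E)) * (2.0 * q_par / (gamma * E * n_u * T_u)) ** 2
        if abs(T_t_new - T_t) < tol * T_t:
            T_t = T_t_new
            break
        T_t = 0.5 * (T_t + T_t_new)  # damped fixed-point update
    n_t = n_u * T_u / (2.0 * T_t)    # static + dynamic pressure conservation
    return T_u, T_t, n_t
```

With, for example, q_par = 2e8 W/m^2, L = 100 m and n_u = 1e20 m^-3, the model lands in the conduction-limited regime with a target temperature of a few eV, below the roughly 10 eV threshold discussed above. The damped update matters: an undamped fixed-point iteration can oscillate at low collisionality.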
Effect of Lecture Instruction on Student Performance on Qualitative Questions
ERIC Educational Resources Information Center
Heron, Paula R. L.
2015-01-01
The impact of lecture instruction on student conceptual understanding in physics has been the subject of research for several decades. Most studies have reported disappointingly small improvements in student performance on conceptual questions despite direct instruction on the relevant topics. These results have spurred a number of attempts to…
Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, J. A.
A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).
Effect of Teaching a Conceptual Hierarchy on Concept Classification Performance.
ERIC Educational Resources Information Center
Wilcox, Wayne C.; And Others
1981-01-01
In the research study described, 80 subjects in one control and four treatment groups viewed sets of slides on types of sailboats. Results indicate that making apparent the hierarchical relationships among concepts of a conceptual hierarchy enhances learner performance in classifying unencountered instances of those concepts. Thirty-eight…
Three Women's Educational Doctoral Program Experiences: A Case Study of Performances and Journeys
ERIC Educational Resources Information Center
Selmer, Sarah; Graham, Meadow; Goodykoontz, Erin
2011-01-01
Three academic women joined to write this piece to explore individual doctoral program experiences and to establish common understandings. They collectively analyzed their experiences using the conceptual approach of doctoral program performances and journeys. This case study shares their experiences within the conceptual approach through emerging…
How Academic Leaders Conceptualize the Phenomenon of Faculty Performance Appraisal Practices
ERIC Educational Resources Information Center
Soo Kim, Tatum
2016-01-01
This dissertation addresses the phenomenon of how academic leaders conceptualize faculty performance practices. Qualitative research methods were used to explore the experiences of 11 academic leaders from 4-year higher education institutions in the metropolitan area of New York, NY. Each academic leader had direct responsibility for faculty…
Validation of engineering methods for predicting hypersonic vehicle controls forces and moments
NASA Technical Reports Server (NTRS)
Maughmer, M.; Straussfogel, D.; Long, L.; Ozoroski, L.
1991-01-01
This work examines the ability of the aerodynamic analysis methods contained in an industry standard conceptual design code, the Aerodynamic Preliminary Analysis System (APAS II), to estimate the forces and moments generated through control surface deflections from low subsonic to high hypersonic speeds. Predicted control forces and moments generated by various control effectors are compared with previously published wind-tunnel and flight-test data for three vehicles: the North American X-15, a hypersonic research airplane concept, and the Space Shuttle Orbiter. Qualitative summaries of the results are given for each force and moment coefficient and each control derivative in the various speed ranges. Results show that all predictions of longitudinal stability and control derivatives are acceptable for use at the conceptual design stage.
NASA Technical Reports Server (NTRS)
Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu
2011-01-01
This paper documents the development of a conceptual-level integrated process for the design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieving this goal, a conceptual design capability is needed that gives users the ability to examine the integrated solution across all disciplines and facilitates the application of multidisciplinary design, analysis, and optimization on a scale greater than previously achieved. The described capability is both an interactive design environment and a high-powered optimization system with a unique blend of low-, mixed- and high-fidelity engineering tools combined in the software integration framework ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.
2013-01-01
Background Despite the critical role of nursing care in determining high-performing healthcare delivery, performance science in this area is still at an early stage of development and nursing’s contribution most often remains invisible to policy-makers and managers. The objectives of this study were: 1) to develop a theoretically based framework to conceptualize nursing care performance; 2) to analyze how the different components of the framework have been operationalized in the literature; and 3) to develop a pool of indicators sensitive to various aspects of nursing care that can be used as a basis for designing a performance measurement system. Methods We carried out a systematic review of published literature across three databases (MEDLINE, EMBASE and CINAHL), focusing on literature between 1990 and 2008. Screening of 2,103 papers resulted in final selection of 101 papers. A detailed template was used to extract the data. For the analysis, we used the method of interpretive synthesis, focusing first on 31 papers with theoretical or conceptual frameworks; the remaining 70 articles were used to strengthen and consolidate the findings. Results Current conceptualizations of nursing care performance mostly reflect a system perspective that builds on system theory, Donabedian’s earlier works on healthcare organization, and Parsons’ theory of social action. Drawing on these foundational works and the evidence collated, the Nursing Care Performance Framework (NCPF) we developed conceptualizes nursing care performance as resulting from three nursing subsystems that operate together to achieve three key functions: (1) acquiring, deploying and maintaining nursing resources, (2) transforming nursing resources into nursing services, and (3) producing changes in patients’ conditions. Based on the literature review, these three functions are operationalized through 14 dimensions that cover 51 variables. 
The NCPF not only specifies core aspects of nursing performance, it also provides decision-makers with a conceptual tool to serve as a common ground from which to define performance, devise a common and balanced set of performance indicators for a given sector of nursing care, and derive benchmarks for this sector. Conclusions The NCPF provides a comprehensive, integrated and theoretically based model that allows performance evaluation of both the overall nursing system and its subsystems. Such an approach widens the view of nursing performance to embrace a multidimensional perspective that encompasses the diverse aspects of nursing care. PMID:23496961
Schmidmaier, Ralf; Eiber, Stephan; Ebersbach, Rene; Schiller, Miriam; Hege, Inga; Holzer, Matthias; Fischer, Martin R
2013-02-22
Medical knowledge encompasses both conceptual (facts or "what" information) and procedural knowledge ("how" and "why" information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Domain-specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Performance in procedural knowledge tests seems independent of the degree of domain-specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience.
More attention should be paid to the interplay of individual clinical clerkship experiences and structured teaching of procedural knowledge and its assessment in medical education curricula.
The nature of dissection: Exploring student conceptions
NASA Astrophysics Data System (ADS)
York, Katharine
The model of conceptual change in science describes the process of learning as a complete restructuring of knowledge, when learners discover or are shown more plausible, intelligent alternatives to existing conceptions. Emotions have been acknowledged as part of a learner's conceptual ecology, but the effects of emotions on learning have yet to be described. This research was conducted to examine the role that emotions have on learning for thirteen high school students, as they dissected cats in a Human Anatomy and Physiology class. The project also investigated whether a student's emotional reactions may be used to develop a sense of connectedness with the nonhuman world, which is defined as ecological literacy. This study utilized a grounded theory approach, in which student responses to interviews were the primary source of data. Interviews were transcribed, and responses were coded according to a constant comparative method of analysis. Responses were compared with the four conditions necessary for conceptual change to occur, and also to five principles of ecological literacy. Students who had negative reactions to dissection participated less in the activity, and demonstrated less conceptual change. Two female students showed the strongest emotional reactions to dissection, and also the lowest amount of conceptual change. One male student also had strong negative reactions to death, and showed no conceptual change. The dissection experiences of the students in this study did not generally reflect ecological principles. The two students whose emotional reactions to dissection were the most negative demonstrated the highest degree of ecological literacy. These results provide empirical evidence of the effects that emotions have on learning, and also support the opinions of educators who do not favor dissection, because it does not teach students to respect all forms of life.
Intercomparison of 3D pore-scale flow and solute transport simulation methods
Mehmani, Yashar; Schoenherr, Martin; Pasquali, Andrea; ...
2015-09-28
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods.
This paper provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
The Oceanographic Multipurpose Software Environment (OMUSE v1.0)
NASA Astrophysics Data System (ADS)
Pelupessy, Inti; van Werkhoven, Ben; van Elteren, Arjen; Viebahn, Jan; Candy, Adam; Portegies Zwart, Simon; Dijkstra, Henk
2017-08-01
In this paper we present the Oceanographic Multipurpose Software Environment (OMUSE). OMUSE aims to provide a homogeneous environment for existing or newly developed numerical ocean simulation codes, simplifying their use and deployment. In this way, numerical experiments that combine ocean models representing different physics or spanning different ranges of physical scales can be easily designed. Rapid development of simulation models is made possible through the creation of simple high-level scripts. The low-level core of the abstraction in OMUSE is designed to deploy these simulations efficiently on heterogeneous high-performance computing resources. Cross-verification of simulation models with different codes and numerical methods is facilitated by the unified interface that OMUSE provides. Reproducibility in numerical experiments is fostered by allowing complex numerical experiments to be expressed in portable scripts that conform to a common OMUSE interface. Here, we present the design of OMUSE as well as the modules and model components currently included, which range from a simple conceptual quasi-geostrophic solver to the global circulation model POP (Parallel Ocean Program). The uniform access to the codes' simulation state and the extensive automation of data transfer and conversion operations aids the implementation of model couplings. We discuss the types of couplings that can be implemented using OMUSE. We also present example applications that demonstrate the straightforward model initialization and the concurrent use of data analysis tools on a running model. We give examples of multiscale and multiphysics simulations by embedding a regional ocean model into a global ocean model and by coupling a surface wave propagation model with a coastal circulation model.
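Among the modules listed above is a simple conceptual quasi-geostrophic solver. The sketch below does not reproduce OMUSE's actual interface or module code; it only illustrates, under standard assumptions of our own choosing (doubly periodic domain, barotropic vorticity dynamics), the core of such a solver: FFT inversion of the elliptic relation laplacian(psi) = zeta, plus one explicit step of the vorticity equation.

```python
import numpy as np

def invert_vorticity(zeta):
    """Invert laplacian(psi) = zeta on a doubly periodic [0, 2*pi)^2 grid
    via FFT; the k = 0 mode (a free gauge constant) is set to zero."""
    n = zeta.shape[0]
    k = np.fft.fftfreq(n, d=1.0 / n)            # integer wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                              # avoid 0/0 for the mean mode
    psi_hat = -np.fft.fft2(zeta) / k2
    psi_hat[0, 0] = 0.0
    return np.real(np.fft.ifft2(psi_hat))

def step(zeta, beta=0.0, dt=1e-3):
    """One forward-Euler step of the barotropic vorticity equation
    d(zeta)/dt = -J(psi, zeta) - beta * dpsi/dx, J being the Jacobian.
    np.gradient finite differences keep the sketch short."""
    dx = 2.0 * np.pi / zeta.shape[0]
    psi = invert_vorticity(zeta)
    psi_x, psi_y = np.gradient(psi, dx)
    zeta_x, zeta_y = np.gradient(zeta, dx)
    jac = psi_x * zeta_y - psi_y * zeta_x
    return zeta - dt * (jac + beta * psi_x)
```

The inversion can be checked against a single-mode case: zeta = -2 sin(x) sin(y) has streamfunction psi = sin(x) sin(y), which the FFT solve recovers to machine precision.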
Bar-Eli, Michael; Tenenbaum, Gershon; Geister, Sabine
2006-10-01
This study documents the effect of players' dismissals on team performance in professional soccer. Our aim was to determine whether the punishment meted out for unacceptable player behaviour results in reduced team performance. The official web site of the German Soccer Association was used for coding data from games played in the first Bundesliga between the 1963-64 and 2003-04 seasons (n = 41). A sample of 743 games in which at least one red card was issued was used to test hypotheses derived from crisis theory (Bar-Eli & Tenenbaum, 1989a). Players' dismissals weakened a sanctioned team in terms of the goals and final score following the punishment. The chances of a sanctioned team scoring or winning were substantially reduced following the sanction. Most cards were issued in the later stages of matches. The statistics pertaining to outcome results as a function of game standing, game location, and time phases all strongly support the view that teams can be considered conceptually similar to individuals regarding the link between stress and performance. To further develop the concept of team and individual psychological performance crisis in competition, it is recommended that reversal theory (Apter, 1982) and self-monitoring and distraction theories (Baumeister, 1984) be included in the design of future investigations pertaining to choking under pressure.
Wranik, W Dominika; Haydt, Susan M; Katz, Alan; Levy, Adrian R; Korchagina, Maryna; Edwards, Jeanette M; Bower, Ian
2017-05-15
Reliance on interdisciplinary teams in the delivery of primary care is on the rise. Funding bodies strive to design financial environments that support collaboration between providers. At present, the design of financial arrangements has been fragmented and not based on evidence. The root of the problem is a lack of systematic evidence demonstrating the superiority of any particular financial arrangement, or a solid understanding of options. In this study we develop a framework for the conceptualization and analysis of financial arrangements in interdisciplinary primary care teams. We use qualitative data from three sources: (i) interviews with 19 primary care decision makers representing 215 clinics in three Canadian provinces, (ii) a research roundtable with 14 primary care decision makers and/or researchers, and (iii) policy documents. Transcripts from interviews and the roundtable were coded thematically and a framework synthesis approach was applied. Our conceptual framework differentiates between team-level funding and provider-level remuneration, and characterizes the interplay and consonance between them. In particular, the notions of hierarchy, segregation, and dependence of provider incomes, and the link between funding and team activities, are introduced as new clarifying concepts, and their implications are explored. The framework is applied to the analysis of collaboration incentives, which appear strongest when provider incomes are interdependent, funding is linked to the team as a whole, and accountability does not have multiple lines. Emergent implementation issues discussed by respondents include: (i) centrality of budget negotiations; (ii) approaches to patient rostering; (iii) unclear funding sources for space and equipment; and (iv) challenges with community engagement. The creation of patient rosters is perceived as a surprisingly contentious issue, and the challenges of funding for space and equipment remain unresolved.
The development and application of a conceptual framework is an important step to the systematic study of the best performing financial models in the context of interdisciplinary primary care. The identification of optimal financial arrangements must be contextualized in terms of feasibility and the implementation environment. In general, financial hierarchy, both overt and covert, is considered a barrier to collaboration.
NASA Astrophysics Data System (ADS)
Franco, Gina M.
The purpose of this study was to investigate the role of epistemic beliefs and knowledge representations in cognitive and metacognitive processing and conceptual change when learning about physics concepts through text. Specifically, I manipulated the representation of physics concepts in texts about Newtonian mechanics and explored how these texts interacted with individuals' epistemic beliefs to facilitate or constrain learning. In accordance with definitions from Royce's (1983) framework of psychological epistemology, texts were developed to present Newtonian concepts in either a rational or a metaphorical format. Seventy-five undergraduate students completed questionnaires designed to measure their epistemic beliefs and their misconceptions about Newton's laws of motion. Participants then read the first of two instructional texts (in either a rational or metaphorical format), and were asked to think aloud while reading. After reading the text, participants completed a recall task and a post-test of selected items regarding Newtonian concepts. These steps were repeated with a second instructional text (in either a rational or metaphorical format, depending on which format was assigned previously). Participants' think-aloud sessions were audio-recorded, transcribed, and then blindly coded, and their recalls were scored for total number of correctly recalled ideas from the text. Changes in misconceptions were analyzed by examining changes in participants' responses to selected questions about Newtonian concepts from pretest to posttest. Results revealed that when individuals' epistemic beliefs were congruent with the knowledge representations in their assigned texts, they performed better on both online measures of learning (e.g., use of processing strategies) and offline products of learning (e.g., text recall, changes in misconceptions) than when their epistemic beliefs were incongruent with the knowledge representations. 
These results have implications for how researchers conceptualize epistemic beliefs and are in line with contemporary views regarding the context sensitivity of individuals' epistemic beliefs. Moreover, the findings from this study not only support current theory about the dynamic and interactive nature of conceptual change, but also advance empirical work in this area by identifying knowledge representations as a text characteristic that may play an important role in the change process.
ERIC Educational Resources Information Center
Hwang, SungWon; Kim, Mijung
2009-01-01
We review Brown and Kloser's article, "Conceptual continuity and the science of baseball: using informal science literacy to promote students' science learning" from a Vygotskian cultural-historical and dialectic perspective. Brown and Kloser interpret interview data with student baseball players and claim that students' conceptual understanding…
To Master or Perform? Exploring Relations between Achievement Goals and Conceptual Change Learning
ERIC Educational Resources Information Center
Ranellucci, John; Muis, Krista R.; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M.
2013-01-01
Background: Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. Aims: To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Sample and Method:…
Taylor, Randolph S; Francis, Wendy S
2017-03-01
Previous literature has demonstrated conceptual repetition priming across languages in bilinguals. This between-language priming effect is taken as evidence that translation equivalents have shared conceptual representations across languages. However, the vast majority of this research has been conducted using only concrete nouns as stimuli. The present experiment examined conceptual repetition priming within and between languages in adjectives, a part of speech not previously investigated in studies of bilingual conceptual representation. The participants were 100 Spanish-English bilinguals who had regular exposure to both languages. At encoding, participants performed a shallow-processing task and a deep-processing task on English and Spanish adjectives. At test, they performed an antonym-generation task in English, in which the target responses were either adjectives presented at encoding or control adjectives not previously presented. The measure of priming was the response time advantage for producing repeated adjectives relative to control adjectives. Significant repetition priming was observed both within and between languages under deep, but not shallow, encoding conditions. The results indicate that the conceptual representations of adjective translation equivalents are shared across languages.
Architectural design of an Algol interpreter
NASA Technical Reports Server (NTRS)
Jackson, C. K.
1971-01-01
The design of a syntax-directed interpreter for a subset of Algol is described. It is a conceptual design with sufficient detail and completeness, but with as much independence from implementation as possible. The design includes a detailed description of a scanner, an analyzer described in Floyd-Evans productions, a hash-coded symbol table, and an executor. Interpretation of sample programs is also provided to show how the interpreter functions.
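The hash-coded symbol table mentioned in the design lends itself to a compact sketch. The implementation below is illustrative, not the 1971 design itself: open hashing with chaining, plus the block-structured scoping an Algol interpreter needs, so inner declarations shadow outer ones and are discarded on block exit.

```python
# Toy hash-coded symbol table in the spirit of the interpreter described above.
# Details (bucket count, hash function, entry layout) are illustrative assumptions.

class SymbolTable:
    def __init__(self, nbuckets=32):
        self.nbuckets = nbuckets
        self.buckets = [[] for _ in range(nbuckets)]  # chains of (name, level, info)
        self.level = 0                                # current block nesting depth

    def _hash(self, name):
        h = 0
        for ch in name:                    # simple multiplicative string hash
            h = (h * 31 + ord(ch)) % self.nbuckets
        return h

    def enter_block(self):
        self.level += 1

    def exit_block(self):
        # drop every symbol declared at the departing nesting level
        for chain in self.buckets:
            chain[:] = [e for e in chain if e[1] < self.level]
        self.level -= 1

    def declare(self, name, info):
        # newest declaration goes to the front so lookup finds it first
        self.buckets[self._hash(name)].insert(0, (name, self.level, info))

    def lookup(self, name):
        for entry_name, _, info in self.buckets[self._hash(name)]:
            if entry_name == name:
                return info
        raise KeyError(name)

table = SymbolTable()
table.declare("x", "integer")
table.enter_block()
table.declare("x", "real")       # inner x shadows outer x
print(table.lookup("x"))         # -> real
table.exit_block()
print(table.lookup("x"))         # -> integer
```

Hashing on the identifier keeps lookup near constant time, while the per-entry nesting level makes block exit a single sweep over the chains.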
Role Allocation and Team Structure in Command and Control Teams
2014-06-01
organizational psychology and management sciences literature show concepts such as empowered self-management and self-regulating work teams (see Cooney, 2004...tankers (FT), search units (S) and rescue units (R). Each unit is represented on the map by a numbered icon. Each type of unit is colour-coded and...Understanding team adaptation: A conceptual analysis and model. Journal of Applied Psychology, 91, 1189-1207. Cannon-Bowers, J. A., Tannenbaum
CHIME: A Metadata-Based Distributed Software Development Environment
2005-01-01
structures by using typography, graphics, and animation. The Software Immersion in our conceptual model for CHIME can be seen as a form of Software...Even small- to medium-sized development efforts may involve hundreds of artifacts -- design documents, change requests, test cases and results, code...for managing and organizing information from all phases of the software lifecycle. CHIME is designed around an XML-based metadata architecture, in
Exploration of Configuration Options for a Large Civil Compound Helicopter
NASA Technical Reports Server (NTRS)
Russell, Carl; Johnson, Wayne
2013-01-01
Multiple compound helicopter configurations are designed using a combination of rotorcraft sizing and comprehensive analysis codes. Results from both the conceptual design phase and rotor comprehensive analysis are presented. The designs are evaluated for their suitability to a short-to-medium-haul civil transport mission carrying a payload of 90 passengers. Multiple metrics are used to determine the best configuration, with heavy emphasis placed on minimizing fuel burn.
How is Shared Decision-Making Defined among African-Americans with Diabetes?
Peek, Monica E.; Quinn, Michael T.; Gorawara-Bhat, Rita; Odoms-Young, Angela; Wilson, Shannon C.; Chin, Marshall H.
2011-01-01
Objective This study investigates how shared decision-making (SDM) is defined by African-American patients with diabetes, and compares patients’ conceptualization of SDM with the Charles model. Methods We utilized race-concordant interviewers/moderators to conduct in-depth interviews and focus groups among a purposeful sample of African-American patients with diabetes. Each interview/focus group was audio-taped, transcribed verbatim and imported into Atlas.ti software. Coding was done using an iterative process and each transcription was independently coded by two members of the research team. Results Although the conceptual domains were similar, patient definitions of what it means to “share” in the decision-making process differed significantly from the Charles model of SDM. Patients stressed the value of being able to “tell their story and be heard” by physicians, emphasized the importance of information sharing rather than decision-making sharing, and included an acceptable role for non-adherence as a mechanism to express control and act on treatment preferences. Conclusion Current instruments may not accurately measure decision-making preferences of African-American patients with diabetes. Practice Implications Future research should develop instruments to effectively measure decision-making preferences within this population. Emphasizing information-sharing that validates patients’ experiences may be particularly meaningful to African-Americans with diabetes. PMID:18684581
van den Hurk, J; Gentile, F; Jansma, B M
2011-12-01
The identification of a face comprises processing of both visual features and conceptual knowledge. Studies showing that the fusiform face area (FFA) is sensitive to face identity generally neglect this dissociation. The present study is the first that isolates conceptual face processing by using words presented in a person context instead of faces. The design consisted of 2 different conditions. In one condition, participants were presented with blocks of words related to each other at the categorical level (e.g., brands of cars, European cities). The second condition consisted of blocks of words linked to the personality features of a specific face. Both conditions were created from the same 8 × 8 word matrix, thereby controlling for visual input across conditions. Univariate statistical contrasts did not yield any significant differences between the 2 conditions in FFA. However, a machine learning classification algorithm was able to successfully learn the functional relationship between the 2 contexts and their underlying response patterns in FFA, suggesting that these activation patterns can code for different semantic contexts. These results suggest that the level of processing in FFA goes beyond facial features. This has strong implications for the debate about the role of FFA in face identification.
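The decoding step in the study above can be illustrated with a minimal pattern-analysis sketch. The study's actual algorithm and data are not specified in the abstract; the code below uses a nearest-centroid classifier with leave-one-out cross-validation on synthetic "voxel" patterns, one common way such condition decoding is done.

```python
import random

# Illustrative multivoxel pattern classification: two conditions with slightly
# different mean activation patterns, a nearest-centroid classifier, and
# leave-one-out cross-validation. All data here are synthetic.

random.seed(0)

def make_pattern(base, nvox=20, noise=0.5):
    # alternate two mean activation levels across voxels, plus Gaussian noise
    return [base[v % 2] + random.gauss(0.0, noise) for v in range(nvox)]

patterns = [("categorical", make_pattern((0.0, 1.0))) for _ in range(10)] + \
           [("personal", make_pattern((0.3, 0.7))) for _ in range(10)]

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(x, centroids):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda lab: dist2(x, centroids[lab]))

correct = 0
for i, (label, x) in enumerate(patterns):
    train = [p for j, p in enumerate(patterns) if j != i]  # leave one out
    cents = {lab: centroid([v for l, v in train if l == lab])
             for lab in ("categorical", "personal")}
    correct += classify(x, cents) == label

accuracy = correct / len(patterns)
print(accuracy)   # expected well above the 0.5 chance level for these patterns
```

Above-chance cross-validated accuracy is the evidence that the response patterns carry condition information even when the average activation does not differ, which is the logic behind the FFA result described above.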
Components for digitally controlled aircraft engines
NASA Technical Reports Server (NTRS)
Meador, J. D.
1981-01-01
Control system components suitable for use in digital electronic control systems are defined. Compressor geometry actuation concepts and fuel handling system concepts suitable for use in large high performance turbofan/turbojet engines are included. Eight conceptual system designs were formulated for the actuation of the compressor geometry. Six conceptual system designs were formulated for the engine fuel handling system. Assessment criteria and weighting factors were established, and trade studies were performed on the candidate systems to establish the relative merits of the various concepts. Fuel pumping and metering systems for small turboshaft engines were also studied. Seven conceptual designs were formulated, and trade studies performed. A simplified bypassing fuel metering scheme was selected and a preliminary design defined.
NASA Astrophysics Data System (ADS)
Maries, Alexandru; Singh, Chandralekha
2015-12-01
It has been found that activation of a stereotype, for example by indicating one's gender before a test, typically alters performance in a way consistent with the stereotype, an effect called "stereotype threat." On a standardized conceptual physics assessment, we found that asking test takers to indicate their gender right before taking the test did not deteriorate performance compared to an equivalent group who did not provide gender information. Although a statistically significant gender gap was present on the standardized test whether or not students indicated their gender, no gender gap was observed on the multiple-choice final exam students took, which included both quantitative and conceptual questions on similar topics.
The Effect of Contextual and Conceptual Rewording on Mathematical Problem-Solving Performance
ERIC Educational Resources Information Center
Haghverdi, Majid; Wiest, Lynda R.
2016-01-01
This study shows how separate and combined contextual and conceptual problem rewording can positively influence student performance in solving mathematical word problems. Participants included 80 seventh-grade Iranian students randomly assigned in groups of 20 to three experimental groups involving three types of rewording and a control group. All…
The role of exhibits in teacher workshops at science museums
NASA Astrophysics Data System (ADS)
Stein, Fred D.
Between 1986 and 1998, the Exploratorium Institute for Inquiry offered multi-week science institutes for elementary educators that involved museum exhibit use during a three-day independent investigation of light and color. Many museums tend to underutilize exhibits in their teacher education programs. This study addresses the question, "What are the contributions of exhibit use to teachers' learning of science content during a workshop at a science museum?" Data from workshops over three successive years were collected in the form of 13 case studies of participants' investigations. Pre- and post-testing of six participants showed a large (ES = 3.0 SD) and significant gain in their understanding of light and color concepts. The case studies were analyzed by coding each incident of exhibit use according to how the exhibit interaction might have helped the participant in his or her learning. Clusters of recurring themes emerged inductively from the coding process, suggesting that the exhibits conferred both logistical and conceptual benefits. Logistically, the exhibits acted as "labor-saving" devices, saving participants time because they were always set up and ready to use, and saving the workshop facilitators time because facilitators could recommend that a participant visit an exhibit rather than spend time giving them individual attention or helping them construct their own investigation apparatus. Conceptually, the exhibits supported each aspect of the Piagetian conceptual change process: accommodation, assimilation, and disequilibrium. They supported accommodation of idea structures and the development of new ones by encouraging participants to ask and answer "What would happen if...?" questions, which often generated ideas to explain the new experiences.
They supported assimilation of experiences into recently developed idea structures or schemes (supporting and consolidating them) by providing opportunities for participants to ask and answer "Will it happen if...?" questions that reinforced ideas by hypothesis testing through predicting. And they supported creating disequilibrium by presenting perplexing phenomena, provoking participants to ask and want to answer "Why did that happen?" questions. Exhibit qualities that make them particularly effective at supporting conceptual change are discussed and illustrated by examples from the cases. Recommendations for using science museum exhibits in teacher workshops are offered.
Lin, Changyu; Zou, Ding; Liu, Tao; Djordjevic, Ivan B
2016-08-08
A mutual information inspired nonbinary coded modulation design with non-uniform shaping is proposed. Instead of traditional power-of-two signal constellation sizes, we design 5-QAM, 7-QAM and 9-QAM constellations, which can be used in adaptive optical networks. The non-uniform shaping and LDPC code rate are jointly considered in the design, which results in a better-performing scheme for the same SNR values. The matched nonbinary (NB) LDPC code is used for this scheme, which further improves the coding gain and the overall performance. We analyze both coding performance and system SNR performance. We show that the proposed NB LDPC-coded 9-QAM has more than 2 dB gain in symbol SNR compared to traditional LDPC-coded star-8-QAM. On the other hand, the proposed NB LDPC-coded 5-QAM and 7-QAM have even better performance than LDPC-coded QPSK.
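The shaping idea can be made concrete with a toy constellation. The paper's actual 9-QAM geometry and shaping distribution are not given in the abstract, so the sketch below assumes a 3x3 grid and a Maxwell-Boltzmann-style probability assignment, and shows the basic trade: non-uniform shaping lowers average symbol energy at the cost of some source entropy.

```python
import math

# Hypothetical 9-QAM: a 3x3 grid at amplitudes {-1, 0, +1} per dimension,
# with symbol probabilities p(s) proportional to exp(-lam * |s|^2).
# lam = 0 recovers uniform signalling.

points = [complex(i, q) for i in (-1, 0, 1) for q in (-1, 0, 1)]

def shaped_stats(lam):
    w = [math.exp(-lam * abs(s) ** 2) for s in points]
    z = sum(w)
    p = [wi / z for wi in w]
    energy = sum(pi * abs(s) ** 2 for pi, s in zip(p, points))
    entropy = -sum(pi * math.log2(pi) for pi in p)   # bits/symbol
    return energy, entropy

e_uniform, h_uniform = shaped_stats(0.0)
e_shaped, h_shaped = shaped_stats(0.5)

# Shaping biases transmission toward low-energy symbols: lower mean symbol
# energy (an SNR benefit at fixed noise) for somewhat fewer bits per symbol.
print(round(e_uniform, 3), round(h_uniform, 3))
print(round(e_shaped, 3), round(h_shaped, 3))
```

Jointly choosing this shaping distribution and the LDPC code rate, as the abstract describes, is what lets the non-power-of-two constellations outperform conventional uniform schemes at the same SNR.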
ERIC Educational Resources Information Center
Guthormsen, Amy M.; Fisher, Kristie J.; Bassok, Miriam; Osterhout, Lee; DeWolf, Melissa; Holyoak, Keith J.
2016-01-01
Research on language processing has shown that the disruption of conceptual integration gives rise to specific patterns of event-related brain potentials (ERPs)--N400 and P600 effects. Here, we report similar ERP effects when adults performed cross-domain conceptual integration of analogous semantic and mathematical relations. In a problem-solving…
Vakil, E; Sigal, J
1997-07-01
Twenty-four closed-head-injured (CHI) and 24 control participants studied two word lists under shallow (i.e., nonsemantic) and deep (i.e., semantic) encoding conditions. They were then tested on free recall, perceptual priming (i.e., perceptual partial word identification) and conceptual priming (i.e., category production) tasks. Previous findings have demonstrated that memory in CHI is characterized by inefficient conceptual processing of information. It was thus hypothesized that the CHI participants would perform more poorly than the control participants on the explicit and on the conceptual priming tasks. On these tasks the CHI group was expected to benefit to a lesser degree from prior deep encoding, as compared to controls. The groups were not expected to significantly differ from each other on the perceptual priming task. Prior deep encoding was not expected to improve the perceptual priming performance of either group. All findings were as predicted, with the exception that a significant effect was not found between groups for deep encoding in the conceptual priming task. The results are discussed (1) in terms of their theoretical contribution in further validating the dissociation between perceptual and conceptual priming; and (2) in terms of the contribution in differentiating between amnesic and CHI patients. Conceptual priming is preserved in amnesics but not in CHI patients.
Performance and Fabrication Status of TREAT LEU Conversion Conceptual Design Concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
IJ van Rooyen; SR Morrell; AE Wright
2014-10-01
Resumption of transient testing at the TREAT facility was approved in February 2014 to meet U.S. Department of Energy (DOE) objectives. The National Nuclear Security Administration’s Global Threat Reduction Initiative Convert Program is evaluating conversion of TREAT from its existing highly enriched uranium (HEU) core to a new core containing low enriched uranium (LEU). This paper describes briefly the initial pre-conceptual designs screening decisions with more detailed discussions on current feasibility, qualification and fabrication approaches. Feasible fabrication will be shown for a LEU fuel element assembly that can meet TREAT design, performance, and safety requirements. The statement of feasibility recognizes that further development, analysis, and testing must be completed to refine the conceptual design. Engineering challenges such as cladding oxidation, high temperature material properties, and fuel block fabrication, along with neutronics performance, will be highlighted. Preliminary engineering and supply chain evaluation provided confidence that the conceptual designs can be achieved.
NASA Astrophysics Data System (ADS)
Said, Asma
Despite the advances made in various fields, women are still considered minorities in the fields of science and mathematics. There is a gender gap in women's participation and achievement in physics. Self-efficacy and attitudes and beliefs toward physics have been identified as predictors of students' performance on conceptual surveys in physics courses. The present study, conducted at a community college in California using two-way analysis of variance and multiple linear regression analyses, revealed no gender gap in achievement between male and female students in physics courses. There is, however, an achievement gap between students enrolled in algebra-based and calculus-based physics courses. The findings indicate that attitudes and beliefs scores can be used as predictors of students' performance on conceptual surveys in physics courses, whereas self-efficacy scores cannot.
ERIC Educational Resources Information Center
Fenollar, Pedro; Roman, Sergio; Cuestas, Pedro J.
2007-01-01
Background: The prediction and explanation of academic performance and the investigation of the factors relating to the academic success and persistence of students are topics of utmost importance in higher education. Aims: The main aim of the present study is to develop and test a conceptual framework in a university context, where the effects of…
ERIC Educational Resources Information Center
Svensson, Goran; Wood, Greg
2011-01-01
Purpose: The objective of this paper is to introduce and describe a conceptual framework of corporate and business ethics across organizations in terms of ethical structures, ethical processes and ethical performance. Design/methodology/approach: A framework is outlined and positioned incorporating an ethical frame of reference in the field of…
A Conceptual Framework to Help Evaluate the Quality of Institutional Performance
ERIC Educational Resources Information Center
Kettunen, Juha
2008-01-01
Purpose: This study aims to present a general conceptual framework which can be used to evaluate quality and institutional performance in higher education. Design/methodology/approach: The quality of higher education is at the heart of the setting up of the European Higher Education Area. Strategic management is widely used in higher education…
ERIC Educational Resources Information Center
Gok, Tolga; Gok, Ozge
2016-01-01
The aim of this research was to investigate the effects of peer instruction on learning strategies, problem solving performance, and conceptual understanding of college students in a general chemistry course. The research was performed with students enrolled in the experimental and control groups of a chemistry course. Students in the…
Sensemaking, Stakeholder Discord, and Long-Term Risk Communication at a U.S. Superfund Site
Hoover, Anna Goodman
2018-01-01
Introduction Risk communication can help reduce exposures to environmental contaminants, mitigate negative health outcomes, and inform community-based decisions about hazardous waste sites. While communication best practices have long guided such efforts, little research has examined unintended consequences arising from such guidelines. As rhetoric informs stakeholder sensemaking, the language used in and reinforced by these guidelines can challenge relationships and exacerbate stakeholder tensions. Objectives This study evaluates risk communication at a U.S. Superfund site to identify unintended consequences arising from current risk communication practices. Methods This qualitative case study crystallizes data spanning 6 years from three sources: 1) local newspaper coverage of site-related topics; 2) focus-group transcripts from a multi-year project designed to support future visioning of site use; and 3) published blog entries authored by a local environmental activist. Constant comparative analysis provides the study’s analytic foundation, with qualitative data analysis software QSR NVivo 8 supporting a three-step process: 1) provisional coding to identify broad topic categories within datasets, 2) coding occurrences of sensemaking constructs and emergent intra-dataset patterns, and 3) grouping related codes across datasets to examine the relationships among them. Results Existing risk communication practices at this Superfund site contribute to a dichotomous conceptualization of multiple and diverse stakeholders as members of one of only two categories: the government or the public. This conceptualization minimizes perceptions of capacity, encourages public commitment to stances aligned with a preferred group, and contributes to negative expectations that can become self-fulfilling prophecies. Conclusion Findings indicate a need to re-examine and adapt risk communication guidelines to encourage more pluralistic understanding of the stakeholder landscape. 
PMID:28282297
Numerical simulation of groundwater flow in Dar es Salaam Coastal Plain (Tanzania)
NASA Astrophysics Data System (ADS)
Luciani, Giulia; Sappa, Giuseppe; Cella, Antonella
2016-04-01
We present the results of a groundwater modeling study of the coastal aquifer of Dar es Salaam (Tanzania). Dar es Salaam is one of the fastest-growing coastal cities in Sub-Saharan Africa, with more than 4 million inhabitants and a population growth rate of about 8 per cent per year. The city faces periodic water shortages due to the lack of an adequate water supply network. Over the last ten years, these two factors have driven an increasing demand for groundwater, met by a large number of private wells drilled to satisfy human needs. A steady-state, three-dimensional groundwater model was set up with the MODFLOW code and calibrated with the UCODE code for inverse modeling. The aim of the model was to characterize the groundwater flow system in the Dar es Salaam coastal plain. Inputs to the model included the net recharge rate, calculated from a time series of precipitation data (1961-2012), estimates of average groundwater extraction, and estimates of groundwater recharge from zones outside the study area. Hydraulic conductivities were parameterized according to the main geological features of the study area, based on available literature data and information. Boundary conditions were assigned based on hydrogeological boundaries. The conceptual model was refined in subsequent steps, which added some hydrogeological features and excluded others. Calibration was performed with UCODE 2014, using 76 hydraulic head measurements taken in 2012 during the same season. Data were weighted on the basis of the expected errors. A sensitivity analysis performed during calibration identified which parameters could be estimated and which data could support parameter estimation. Calibration was evaluated based on statistical indices, maps of error distribution, and a test of independence of residuals. Further model analysis was performed after calibration to test model performance under a range of variations in the input variables.
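The calibration objective that UCODE-style inverse modeling minimizes, a weighted least-squares sum with weights derived from the expected observation errors, can be sketched in a few lines. The head values below are made-up illustrations, not data from the Dar es Salaam model.

```python
import math

# Weighted least-squares objective for head calibration: each residual is
# weighted by the inverse of its expected measurement error, so precise
# observations constrain the parameter estimates more strongly.

observed = [12.4, 9.8, 15.1, 7.2]    # observed hydraulic heads (m), illustrative
simulated = [12.9, 9.5, 14.2, 7.6]   # model-simulated heads (m), illustrative
sigma = [0.5, 0.5, 1.0, 0.25]        # expected error of each observation (m)

def weighted_sum_of_squares(obs, sim, sig):
    # weight w_i = 1 / sigma_i; objective S = sum((w_i * (obs_i - sim_i))**2)
    return sum(((o - s) / sg) ** 2 for o, s, sg in zip(obs, sim, sig))

S = weighted_sum_of_squares(observed, simulated, sigma)
# standard error of regression: sqrt(S / (n_obs - n_params)), here 2 parameters
s_reg = math.sqrt(S / (len(observed) - 2))
print(round(S, 3), round(s_reg, 3))
```

Sensitivity analysis then asks how S responds to perturbations in each parameter, which is what identifies the parameters the 76 head observations can actually constrain.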
Coding, Organization and Feedback Variables in Motor Skills.
1982-04-01
teachers) as anyone else--has been its nondirectional and incompletely conceptualized nature. Those involved in research now are being urged to avoid...functional evaluations. It constitutes more than simply a methodology; it is an ideology for studying ’how things work’ and by its nature draws on many...not necessarily dependent on the physical nature of the system. It furnishes a superstructure for interpreting and comparing input from a multitude of
Clayton, Margaret F; Latimer, Seth; Dunn, Todd W; Haas, Leonard
2011-09-01
This study evaluated variables thought to influence patients' perceptions of patient-centeredness. We also compared results from two coding schemes that purport to evaluate patient-centeredness, the Measure of Patient-Centered Communication (MPCC) and the 4 Habits Coding Scheme (4HCS). 174 videotaped family practice office visits and patient self-report measures were analyzed. Patient factors contributing to positive perceptions of patient-centeredness were successful negotiation of decision-making roles and lower post-visit uncertainty. MPCC coding found visits were on average 59% patient-centered (range 12-85%). 4HCS coding showed an average of 83 points (maximum possible 115). However, patients felt their visits were highly patient-centered (mean 3.7, range 1.9-4; maximum possible 4). There was a weak correlation between coding schemes, but no association between coding results and patient variables (number of pre-visit concerns, attainment of desired decision-making role, post-visit uncertainty, patients' perception of patient-centeredness). Coder inter-rater reliability was lower than expected; convergent and divergent validity were not supported. The 4HCS and MPCC operationalize patient-centeredness differently, illustrating a lack of conceptual clarity. The patient's perspective is important. Family practice providers can facilitate a more positive patient perception of patient-centeredness by addressing patient concerns to help reduce patient uncertainty, and by negotiating decision-making roles. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mueller, Anneliese Marie
Given the prominence of sense of place in new environmental education curricula, this study aims to strengthen the conceptual and empirical foundations of sense of place, and to determine how sense of place may be linked to environmentally responsible behavior. For this study, five commercial fishermen and five organic farmers from the New England Seacoast region participated in a series of in-depth phenomenological interviews and observations. The data were systematically coded in order to allow themes and categories to emerge. The results indicate that aspects of the existing conceptual framework of sense of place, such as place attachment, ecological knowledge, and public involvement, do in fact describe the relationship between people and place. However, the results also indicate that two conceptual elements, attention to social context and awareness of moral theory, are missing from the current conceptual framework in EE theory. These results suggest that the current framework should be expanded to emphasize the role of human and non-human communities: the development of a sense of place and the learning of environmentally responsible behavior must be situated within a social context. This study lends support to the view that for sense of place to move people to ethical action, it is crucial for them to recognize, and to participate in, a community of support and care.
NASA Technical Reports Server (NTRS)
Rajpal, Sandeep; Rhee, DoJun; Lin, Shu
1997-01-01
In this paper, we use a previously proposed construction technique to construct multidimensional trellis coded modulation (TCM) codes for both the additive white Gaussian noise (AWGN) and the fading channels. Analytical performance bounds and simulation results show that these codes perform very well and achieve significant coding gains over uncoded reference modulation systems. In addition, the proposed technique can be used to construct codes which have a performance/decoding complexity advantage over the codes listed in the literature.
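The coding gains quoted for TCM schemes are usually asymptotic gains computed from minimum squared Euclidean distances. As a minimal sketch (not taken from this paper; the distance values below are the standard textbook figures for Ungerboeck's 4-state 8-PSK TCM code versus an uncoded QPSK reference, both with unit-energy constellations):

```python
import math

def asymptotic_coding_gain(d2_coded: float, d2_uncoded: float) -> float:
    """Nominal (asymptotic) coding gain in dB of a TCM scheme over an
    uncoded reference of the same spectral efficiency, assuming equal
    average symbol energy."""
    return 10.0 * math.log10(d2_coded / d2_uncoded)

# Assumed textbook values: 4-state 8-PSK TCM has minimum squared free
# distance 4.0; uncoded QPSK has minimum squared distance 2.0.
gain = asymptotic_coding_gain(4.0, 2.0)
print(f"{gain:.2f} dB")  # → 3.01 dB
```

The 3 dB figure is the classic result for this code; larger trellises yield larger distances and hence larger gains from the same formula.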
ERIC Educational Resources Information Center
Ishimoto, Michi; Thornton, Ronald K.; Sokoloff, David R.
2014-01-01
This study assesses the Japanese translation of the Force and Motion Conceptual Evaluation (FMCE). Researchers are often interested in comparing the conceptual ideas of students with different cultural backgrounds. The FMCE has been useful in identifying the concepts of English-speaking students from different backgrounds. To identify effectively…
Eckert, Katharina G; Lange, Martin A
2015-03-14
Physical activity questionnaires (PAQ) have been extensively used to determine physical activity (PA) levels. Most PAQ are derived from an energy expenditure-based perspective and assess activities with a certain intensity level. Activities with a moderate or vigorous intensity level are predominantly used to determine a person's PA level in terms of quantity. Studies show that the time spent engaging in moderate and vigorous intensity PA does not appropriately reflect the actual PA behavior of older people because they perform more functional, everyday activities. Those functional activities are more likely to be considered low-intensity and represent an important qualitative health-promoting activity. For the elderly, functional, light-intensity activities are of special interest but are assessed differently in terms of quantity and quality. The aim was to analyze the content of PAQ for the elderly. N = 18 sufficiently validated PAQ applicable to adults (60+) were included. Each item (N = 414) was linked to the corresponding code of the International Classification of Functioning, Disability and Health (ICF) using established linking rules. Kappa statistics were calculated to determine rater agreement. Items were linked to 598 ICF codes and 62 different ICF categories. A total of 43.72% of the codes were for sports-related activities and 14.25% for walking-related activities. Only 9.18% of all codes were related to household tasks. Light-intensity, functional activities are emphasized differently and are underrepresented in most cases. Additionally, sedentary activities are underrepresented (5.55%). κ coefficients were acceptable for n = 16 questionnaires (0.48-1.00). There is large inconsistency in how PA in the elderly is understood.
Further research should focus (1) on a conceptual understanding of PA in terms of the behavior of the elderly and (2) on developing questionnaires that assess functional, light-intensity PA, as well as sedentary activities, more explicitly.
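The kappa statistic used above to quantify rater agreement corrects observed agreement for agreement expected by chance. A minimal sketch of Cohen's kappa for two raters (the ratings are hypothetical; d450 Walking, d640 Doing housework, and d920 Recreation are real ICF categories used only for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes to the
    same items: (observed - expected) agreement over (1 - expected)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of the raters' marginal frequencies.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical ICF codes assigned by two raters to ten questionnaire items.
a = ["d450", "d640", "d920", "d450", "d450", "d640", "d920", "d450", "d640", "d450"]
b = ["d450", "d640", "d920", "d450", "d640", "d640", "d920", "d450", "d640", "d450"]
print(round(cohens_kappa(a, b), 2))  # → 0.84
```

Values in the 0.48-1.00 range reported above span Landis and Koch's conventional "moderate" to "almost perfect" agreement bands.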
SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARRO CA
2010-03-09
This control decision addresses the Knock-Out Pot (KOP) Disposition KOP Processing System (KPS) conceptual design. The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 μm in size and low density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.
Miceli, G; Capasso, R; Daniele, A; Esposito, T; Magarelli, M; Tomaiuolo, F
2000-09-01
As a consequence of a head trauma, APA presented with selective anomia for the names of familiar people, in the absence of comparable disorders for common names and other proper names. Face recognition was normal, and naming performance was unaffected by stimulus and response types. Selective proper name anomia was not due to effects of frequency of usage or of age of acquisition, or to selective memory/learning deficits for the names of people. Even though APA was able to provide at least some information on many celebrities whom she failed to name, she was clearly impaired in all tasks that required full conceptual information on the same people (but she performed flawlessly in similar tasks that involved common names). This pattern of performance indicates that in our subject the inability to name familiar persons results from damage to conceptual information. It is argued that detailed analyses of conceptual knowledge are necessary before it is concluded that a subject with proper name anomia suffers from a purely output disorder, as opposed to a conceptual disorder. The behaviour observed in APA is consistent with the domain-specific hypothesis of conceptual organisation (Caramazza & Shelton, 1998), and in this framework can be explained by assuming selective damage to knowledge of conspecifics. The anatomo-clinical correlates of our subject's disorder are discussed with reference to recent hypotheses on the neural structures representing knowledge of familiar people.
The adaption and use of research codes for performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebetrau, A.M.
1987-05-01
Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.
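The repeated-simulation pattern described for PA codes can be sketched as a driver loop that samples uncertain inputs and calls the underlying model, building up an output distribution. Everything below is a hypothetical stand-in (the toy transport model, its parameters, and the lognormal input distribution are illustrative, not taken from the report):

```python
import random
import statistics

def transport_model(diffusivity, years):
    """Stand-in for an expensive research (RS) code: returns a
    hypothetical contaminant travel distance (arbitrary units)."""
    return diffusivity * years ** 0.5

def performance_assessment(n_runs, seed=0):
    """PA-style driver: repeatedly sample uncertain inputs and run the
    underlying model, producing a distribution of outcomes."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        diffusivity = rng.lognormvariate(mu=0.0, sigma=0.5)  # uncertain input
        results.append(transport_model(diffusivity, years=1000.0))
    return results

distances = performance_assessment(n_runs=1000)
print(round(statistics.median(distances), 1))
```

In a real PA code the inner call would be a full RS simulation, which is why so much effort goes into making each run cheap enough to repeat thousands of times.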
ERIC Educational Resources Information Center
Adeleke, M. A.
2007-01-01
The paper examined the possibility of finding out if improvements in students' problem solving performance in simultaneous linear equation will be recorded with the use of procedural and conceptual learning strategies and in addition to find out which of the strategies will be more effective. The study adopted a pretest, post test control group…
2013-01-01
Background Medical knowledge encompasses both conceptual (facts or “what” information) and procedural knowledge (“how” and “why” information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Methods Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Conclusions Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. 
More attention should be paid to the interplay of individual clinical clerkship experiences and structured teaching of procedural knowledge and its assessment in medical education curricula. PMID:23433202
Clinician roles and responsibilities during care transitions of older adults.
Schoenborn, Nancy L; Arbaje, Alicia I; Eubank, Kathryn J; Maynor, Kenric; Carrese, Joseph A
2013-02-01
To identify the perceived roles and responsibilities of clinicians during care transitions of older adults. Qualitative study involving 1-hour in-depth semistructured interviews. Audiotapes of interviews were transcribed, coded, and analyzed, and themes and subthemes were generated. An acute care hospital, a skilled nursing facility, two community-based outpatient practices, and one home healthcare agency. Forty healthcare professionals directly involved in care transitions of older adults (18 physicians, 11 home healthcare administrative and field staff, four social workers, three nurse practitioners, three physician assistants, and one hospital case manager). Perspectives of healthcare professionals regarding clinicians' roles and responsibilities during care transitions were examined and described. Content analysis revealed several themes: components of clinicians' roles during care transitions; congruence between self- and others' perceived ideal roles but incongruence between ideal and routine roles; ambiguity in accountability in the postdischarge period; factors prompting clinicians to act closer to ideal roles; and barriers to performing ideal roles. A conceptual framework was created to summarize clinicians' roles during care transitions. This study reports differences between what healthcare professionals perceive as ideal roles of clinicians during care transitions and what clinicians actually do routinely. Certain patient and clinician factors prompt clinicians to act closer to the ideal roles. Multiple barriers interfere with consistent practice of ideal roles. Future investigations could evaluate interventions targeting various components of the conceptual framework and relevant outcomes. © 2013, Copyright the Authors Journal compilation © 2013, The American Geriatrics Society.
Mars orbiter conceptual systems design study
NASA Technical Reports Server (NTRS)
Dixon, W.; Vogl, J.
1982-01-01
Spacecraft system and subsystem designs were developed at the conceptual level to perform either of two Mars Orbiter missions: a Climatology Mission and an Aeronomy Mission. The objectives of these missions are to obtain and return data.
Creative Writing and Learning in a Conceptual Astrophysics Course
NASA Astrophysics Data System (ADS)
Berenson, R.
2012-08-01
Creative writing assignments in a conceptual astrophysics course for liberal arts students can reduce student anxiety. This study demonstrates that such assignments also can aid learning as demonstrated by significantly improved performance on exams.
Conceptual design study of potential early commercial MHD powerplant. Report of task 2 results
NASA Astrophysics Data System (ADS)
Hals, F. A.
1981-03-01
The conceptual design of one of the potential early commercial MHD power plants was studied. The plant employs oxygen enrichment of the combustion air and preheating of this oxygen enriched air to an intermediate temperature of 1200 F attainable with a tubular type recuperative heat exchanger. Conceptual designs of plant components and equipment with performance, operational characteristics, and costs are reported. Plant economics and overall performance including full and part load operation are reviewed. The projected performance and estimated cost of this early MHD plant are compared to conventional power plants, although it does not offer the same high efficiency and low costs as the mature MHD power plant. Environmental aspects and the methods incorporated in plant design for emission control of sulfur and nitrogen are reviewed.
Zavalkoff, Anne
2002-10-01
This paper presents a conceptual tool designed to help teacher education students think critically about the roots and consequences of personal, parental, community, and institutional resistance to diverse sexual identities and behaviours. To explore the roots of sexualized and gendered prejudice and ground the conceptual tool theoretically, it begins with a careful examination of Judith Butler's work on performativity. The paper then describes and illustrates the conceptual tool. The Continuum of (Subversive) Drag Performance helps stimulate critical thinking about the power implications of people's sexed and gendered performances through its six ranges: Radical, Stealth, Commercial, Passing, Mainstream, and Privileged. Because these ranges are independent of common considerations of "normalcy," they offer teacher education students a relatively unthreatening framework for analyzing conceptions of sexuality and gender that, left unexamined, can contribute to sexism, heterosexism, and homophobia.
Conceptual design study of potential early commercial MHD powerplant. Report of task 2 results
NASA Technical Reports Server (NTRS)
Hals, F. A.
1981-01-01
The conceptual design of one of the potential early commercial MHD power plants was studied. The plant employs oxygen enrichment of the combustion air and preheating of this oxygen enriched air to an intermediate temperature of 1200 F attainable with a tubular type recuperative heat exchanger. Conceptual designs of plant components and equipment with performance, operational characteristics, and costs are reported. Plant economics and overall performance including full and part load operation are reviewed. The projected performance and estimated cost of this early MHD plant are compared to conventional power plants, although it does not offer the same high efficiency and low costs as the mature MHD power plant. Environmental aspects and the methods incorporated in plant design for emission control of sulfur and nitrogen are reviewed.
ERIC Educational Resources Information Center
Seker, Burcu Sezginsoy; Erdem, Aliye
2017-01-01
Students learning a given subject can only perform by learning to think in terms of the concepts that form that subject. Otherwise, students may move away from the scientific meaning of concepts and may fall into conceptual errors. Students' conceptual errors affect their subsequent learning and cause them to resist change. It is possible to prevent this…
Lebeau, Jean-Pierre; Cadwallader, Jean-Sébastien; Aubin-Auger, Isabelle; Mercier, Alain; Pasquet, Thomas; Rusch, Emmanuel; Hendrickx, Kristin; Vermeire, Etienne
2014-07-02
Therapeutic inertia has been defined as the failure of a health-care provider to initiate or intensify therapy when therapeutic goals are not reached. It is regarded as a major cause of uncontrolled hypertension. The exploration of its causes and the interventions to reduce it are plagued by unclear conceptualizations and hypothesized mechanisms. We therefore systematically searched the literature for definitions and discussions of the concept of therapeutic inertia in hypertension in primary care, to try to form an operational definition. A systematic review of all types of publications related to clinical inertia in hypertension was performed. Medline, EMbase, PsycInfo, the Cochrane library and databases, BDSP, CRD and NGC were searched from the start of their databases to June 2013. Articles were selected independently by two authors on the basis of their conceptual content, without other eligibility criteria or formal quality appraisal. Qualitative data were extracted independently by two teams of authors. Data were analyzed using a constant comparative qualitative method. The final selection included 89 articles. A total of 112 codes were grouped into 4 categories: terms and definitions (semantics), "who" (physician, patient or system), "how and why" (mechanisms and reasons), and "appropriateness". Regarding each of these categories, a number of contradictory assertions were found, most of them relying on little or no empirical data. Overall, the limits of what should be considered as inertia were not clear. A number of authors insisted that what was considered deleterious inertia might in fact be appropriate care, depending on the situation. Our data analysis revealed a major lack of conceptualization of therapeutic inertia in hypertension and important discrepancies regarding its possible causes, mechanisms and outcomes. The concept should be split into two parts: appropriate inaction and inappropriate inertia.
The development of consensual and operational definitions relying on empirical data and the exploration of the intimate mechanisms that underlie these behaviors are now needed.
2014-01-01
Background Therapeutic inertia has been defined as the failure of a health-care provider to initiate or intensify therapy when therapeutic goals are not reached. It is regarded as a major cause of uncontrolled hypertension. The exploration of its causes and the interventions to reduce it are plagued by unclear conceptualizations and hypothesized mechanisms. We therefore systematically searched the literature for definitions and discussions of the concept of therapeutic inertia in hypertension in primary care, to try to form an operational definition. Methods A systematic review of all types of publications related to clinical inertia in hypertension was performed. Medline, EMbase, PsycInfo, the Cochrane library and databases, BDSP, CRD and NGC were searched from the start of their databases to June 2013. Articles were selected independently by two authors on the basis of their conceptual content, without other eligibility criteria or formal quality appraisal. Qualitative data were extracted independently by two teams of authors. Data were analyzed using a constant comparative qualitative method. Results The final selection included 89 articles. A total of 112 codes were grouped into 4 categories: terms and definitions (semantics), “who” (physician, patient or system), “how and why” (mechanisms and reasons), and “appropriateness”. Regarding each of these categories, a number of contradictory assertions were found, most of them relying on little or no empirical data. Overall, the limits of what should be considered as inertia were not clear. A number of authors insisted that what was considered deleterious inertia might in fact be appropriate care, depending on the situation. Conclusions Our data analysis revealed a major lack of conceptualization of therapeutic inertia in hypertension and important discrepancies regarding its possible causes, mechanisms and outcomes. The concept should be split into two parts: appropriate inaction and inappropriate inertia.
The development of consensual and operational definitions relying on empirical data and the exploration of the intimate mechanisms that underlie these behaviors are now needed. PMID:24989986
NASA Technical Reports Server (NTRS)
Hinds, Erold W. (Principal Investigator)
1996-01-01
This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
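The concatenated structure described here (an inner modulation block code whose residual errors are cleaned up by an outer code) can be illustrated with a deliberately tiny stand-in: a repetition-3 inner code over a binary symmetric channel, with a single parity check playing the role of the outer code. This is only a sketch of the architecture; the actual system uses the interleaved RS(255,223) outer code, and all parameters below are illustrative:

```python
import random

def inner_encode(bits):
    """Inner code: repeat each bit three times (toy stand-in for a
    modulation block code)."""
    return [b for bit in bits for b in (bit, bit, bit)]

def inner_decode(chips):
    """Majority vote over each group of three received chips."""
    return [int(sum(chips[i:i + 3]) >= 2) for i in range(0, len(chips), 3)]

def outer_encode(bits):
    """Outer code: toy single parity check appended to the block."""
    return bits + [sum(bits) % 2]

def outer_check(bits):
    """Return (data, parity_ok): flags any odd number of residual errors."""
    return bits[:-1], sum(bits) % 2 == 0

def transmit(chips, flip_prob, rng):
    """Binary symmetric channel: flip each chip independently."""
    return [c ^ (rng.random() < flip_prob) for c in chips]

rng = random.Random(42)
data = [1, 0, 1, 1, 0, 1, 0, 0]
received = transmit(inner_encode(outer_encode(data)), flip_prob=0.05, rng=rng)
decoded, ok = outer_check(inner_decode(received))
print(decoded == data, ok)
```

The design point of concatenation is visible even in this toy: the inner code turns a noisy channel into a much cleaner one, and the outer code handles the rare residual errors that slip through.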
A conceptual framework for evaluation of public health and primary care system performance in iran.
Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Akbari Sari, Ali; Mesdaghinia, Alireza
2015-01-26
The main objective of this study was to design a conceptual framework, according to the policies and priorities of the ministry of health, to evaluate provincial public health and primary care performance and to assess their share in the overall health impacts on the community. We used several tools and techniques, including systems thinking, a literature review to identify relevant attributes of health system performance frameworks, and interviews with key stakeholders. PubMed, Scopus, Web of Science, Google Scholar and two specialized databases of Persian language literature (IranMedex and SID) were searched using main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among the 602 obtained indicators in a four-stage process for monitoring and evaluation of Health Deputies. We proposed a conceptual framework by identifying the performance area for Health Deputies among other determinants of health, as well as introducing a chain of results for performance consisting of input, process, output and outcome indicators. We also proposed 5 dimensions for measuring the performance of Health Deputies, consisting of efficiency, effectiveness, equity, access and improvement of health status. The proposed conceptual framework clearly illustrates the Health Deputies' success in achieving the best health results and outcomes in the country. The commitment of the ministry of health and the Health Deputies at the universities of medical sciences is essential for full implementation of this framework and for providing the annual performance report.
In the Rearview Mirror: Social Skill Development in Deaf Youth, 1990-2015.
Cawthon, Stephanie W; Fink, Bentley; Schoffstall, Sarah; Wendel, Erica
2018-01-01
Social skills are a vehicle by which individuals negotiate important relationships. The present article presents historical data on how social skills in deaf students were conceptualized and studied empirically during the period 1990-2015. Using a structured literature review approach, the researchers coded 266 articles for theoretical frameworks used and constructs studied. The vast majority of articles did not explicitly align with a specific theoretical framework. Of the 37 that did, most focused on socioemotional and cognitive frameworks, while a minority drew from frameworks focusing on attitudes, developmental theories, or ecological systems theory. In addition, 315 social-skill constructs were coded across the data set; the majority focused on socioemotional functioning. Trends in findings across the past quarter century and implications for research and practice are examined.
Development of Methodology for Programming Autonomous Agents
NASA Technical Reports Server (NTRS)
Erol, Kutluhan; Levy, Renato; Lang, Lun
2004-01-01
A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to describe such interfaces unambiguously.
Perceptual processing affects conceptual processing.
Van Dantzig, Saskia; Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W
2008-04-05
According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems. 2008 Cognitive Science Society, Inc.
ERIC Educational Resources Information Center
Leary, Mark R.; Jongman-Sereno, Katrina P.
2014-01-01
The authors begin their commentary by saying that Sommer, Leuschner, and Scheithauer (2014) did an admirable job of reviewing and integrating research on school shootings across a broad array of studies that relied on varied conceptualizations, operational definitions, coding strategies, and samples of shootings. The authors agree with Sommer et…
2014-08-01
Dynamics” (Project Code: 10az01). The Socio-Cognitive Systems Section (SCSS) at Defence Research and Development Canada (DRDC), Toronto Research Centre...these actors to its core strategic-level factors. It serves as a cognitive model—or “primer”—on this class of irregular adversary as well as a...1.1 Background The Socio-Cognitive Systems Section (SCSS) at Defence Research and Development Canada (DRDC), Toronto Research Centre has completed a
Data Quality and Reliability Analysis of U.S. Marine Corps Ground Vehicle Maintenance Records
2015-06-01
Corporation conducted a study on data quality issues present in U.S. Army logistics data (Galway & Hanks, 1996). The study breaks data issues into three...categories: operational, conceptual, and organizational problems (Galway & Hanks, 1996). Operational data problems relate to the number of missing or...codes (EIC) are left blank (Galway & Hanks, 1996, p. 26). Missing entries are attributed to an assumed lack of significance of the EIC. The issue is
2016-01-05
is we made progress, ex post facto, in terms of methodology development towards what DARPA recently called for, namely “improve[ing] our understanding...for their bases of activity. Prior quantitative research, by the P.I. and by others, has shown that tribalism is an important incubator of Islamist...undergraduate students partook in the research, and were involved in conceptualization of problems, quantitative and qualitative research, coding of
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1995-01-01
This report focuses on the results obtained during the PI's recent sabbatical leave at the Swiss Federal Institute of Technology (ETH) in Zurich, Switzerland, from January 1, 1995 through June 30, 1995. Two projects investigated various properties of TURBO codes, a new form of concatenated coding that achieves near channel capacity performance at moderate bit error rates. The performance of TURBO codes is explained in terms of the code's distance spectrum. These results explain both the near capacity performance of the TURBO codes and the observed 'error floor' for moderate and high signal-to-noise ratios (SNR's). A semester project, entitled 'The Realization of the Turbo-Coding System,' involved a thorough simulation study of the performance of TURBO codes and verified the results claimed by previous authors. A copy of the final report for this project is included as Appendix A. A diploma project, entitled 'On the Free Distance of Turbo Codes and Related Product Codes,' includes an analysis of TURBO codes and an explanation for their remarkable performance. A copy of the final report for this project is included as Appendix B.
Wilson, Stephen M; Isenberg, Anna Lisette; Hickok, Gregory
2009-11-01
Word production is a complex multistage process linking conceptual representations, lexical entries, phonological forms and articulation. Previous studies have revealed a network of predominantly left-lateralized brain regions supporting this process, but many details regarding the precise functions of different nodes in this network remain unclear. To better delineate the functions of regions involved in word production, we used event-related functional magnetic resonance imaging (fMRI) to identify brain areas where blood oxygen level-dependent (BOLD) responses to overt picture naming were modulated by three psycholinguistic variables: concept familiarity, word frequency, and word length, and one behavioral variable: reaction time. Each of these variables has been suggested by prior studies to be associated with different aspects of word production. Processing of less familiar concepts was associated with greater BOLD responses in bilateral occipitotemporal regions, reflecting visual processing and conceptual preparation. Lower frequency words produced greater BOLD signal in left inferior temporal cortex and the left temporoparietal junction, suggesting involvement of these regions in lexical selection and retrieval and encoding of phonological codes. Word length was positively correlated with signal intensity in Heschl's gyrus bilaterally, extending into the mid-superior temporal gyrus (STG) and sulcus (STS) in the left hemisphere. The left mid-STS site was also modulated by reaction time, suggesting a role in the storage of lexical phonological codes.
NASA Astrophysics Data System (ADS)
Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald
2017-09-01
In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for a large range of radiation transport problems. However, the inherent drawbacks of these codes - laborious input file creation and long computation times - contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach an efficiency similar to that of other mature engineering disciplines, such as finite element analysis (e.g., structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been achieved. Computation times are drastically reduced compared to a few years ago thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams can now deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.
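Hybrid variance-reduction techniques are central to the speedups described. Below is a minimal sketch of one elementary technique, implicit capture, for a pure-absorber slab where the answer is known analytically; the cross-section and geometry are invented, and production codes such as MCNP or TRIPOLI are vastly more general.

```python
import math
import random

random.seed(1)
SIGMA, L, N = 1.0, 5.0, 20000   # absorption cross-section (1/cm), slab depth (cm), histories
exact = math.exp(-SIGMA * L)    # analytic transmission through a pure absorber

# Analog Monte Carlo: sample a free path; score 1 if the particle crosses the slab.
analog = sum(1 for _ in range(N) if random.expovariate(SIGMA) > L) / N

# Implicit capture: instead of killing absorbed particles, each history carries
# its survival probability as a statistical weight. For a pure absorber the
# weight is deterministic, so the estimator's variance collapses to zero --
# an extreme illustration of why variance reduction pays off.
weighted = sum(math.exp(-SIGMA * L) for _ in range(N)) / N

print(exact, analog, weighted)
```

With a transmission probability near 0.007, the analog tally needs many histories for a stable answer, while the weighted estimator is exact here; realistic hybrid schemes (weight windows seeded by deterministic adjoint solutions) generalize this idea.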
NASA Astrophysics Data System (ADS)
Thacker, Beth
2017-01-01
Large-scale assessment data from Texas Tech University yielded evidence that most students taught traditionally in large lecture classes with online homework and predominantly multiple choice question exams, when asked to answer free-response (FR) questions, did not support their answers with logical arguments grounded in physics concepts. In addition to a lack of conceptual understanding, incorrect and partially correct answers lacked evidence of the ability to apply even lower level reasoning skills in order to solve a problem. Correct answers, however, did show evidence of at least lower level thinking skills as coded using a rubric based on Bloom's taxonomy. With the introduction of evidence-based instruction into the labs and recitations of the large courses and in a small, completely laboratory-based, hands-on course, the percentage of correct answers with correct explanations increased. The FR format, unlike other assessment formats, allowed assessment of both conceptual understanding and the application of thinking skills, clearly pointing out weaknesses not revealed by other assessment instruments, and providing data on skills beyond conceptual understanding for course and program assessment. Supported by National Institutes of Health (NIH) Challenge grant #1RC1GM090897-01.
Nayak, Shalini G; Pai, Mamatha Shivananda; George, Linu Sara
2018-01-01
Conceptual models developed through qualitative research are based on the unique experiences of suffering and the individual adaptations of each participant. A wide array of problems is faced by head-and-neck cancer (HNC) patients due to disease pathology and treatment modalities, which are sufficient to influence quality of life (QOL). Men possess greater self-acceptance and are better equipped with intrapersonal strength to cope with stress compared to women. A qualitative phenomenology study was conducted among seven women suffering from HNC, with the objective of understanding their experiences of suffering and describing the phenomenon. Data were collected by face-to-face, in-depth, open-ended interviews. Data were analyzed using Open Code software (OPC 4.0), following the steps of the Colaizzi process. The phenomenon that emerged from the lived experiences of the HNC women was "Personified as a paragon of suffering... optimistic being of achieving normalcy," with five major themes and 13 subthemes. The conceptual model developed with the phenomenological approach is very specific to women suffering from HNC and will contribute to developing strategies to improve their QOL.
NASA Astrophysics Data System (ADS)
Kaya, Ebru
2017-11-01
In this review essay I respond to issues raised in Mijung Kim and Wolff-Michael Roth's paper titled "Dialogical argumentation in elementary science classrooms", which presents a study dealing with dialogical argumentation in early elementary school classrooms. Since there is very limited research on lower primary school students' argumentation in school science, their paper makes a contribution to research on children's argumentation skills. In this response, I focus on two main issues to extend the discussion in Kim and Roth's paper: (a) methodological issues, including conducting a quantitative study on children's argumentation levels and focusing on children's written argumentation in addition to their dialogical argumentation, and (b) investigating children's conceptual understanding along with their argumentation levels. Kim and Roth emphasize the difficulty of determining the level of children's argumentation through Toulmin's Argument Pattern and the lack of high-level arguments by children due to their difficulties in writing texts. Regarding these methodological issues, I suggest designing quantitative research on coding children's argument levels, because such research could potentially provide important findings on children's argumentation. Furthermore, I discuss alternative written products, including posters, figures, or pictures generated by children, as means of tracing children's arguments, and, finally, the articulation of children's argumentation and conceptual understanding.
Hammond, Flora M; Davis, Christine; Cook, James R; Philbrick, Peggy; Hirsch, Mark A
2016-01-01
Individuals with a history of traumatic brain injury (TBI) may have chronic problems with irritability, which can negatively affect their lives. (1) To describe the experience (thoughts and feelings) of irritability from the perspectives of multiple people living with or affected by the problem, and (2) to develop a conceptual model of irritability. Qualitative, participatory research. Forty-four stakeholders (individuals with a history of TBI, family members, community professionals, healthcare providers, and researchers) divided into 5 focus groups. Each group met 10 times to discuss the experience of irritability following TBI. Data were coded using grounded theory to develop themes, metacodes, and theories. Not applicable. A conceptual model emerged in which irritability has 5 dimensions: affective (related to moods and feelings); behavioral (especially in areas of self-regulation, impulse control, and time management); cognitive-perceptual (self-talk and ways of seeing the world); relational issues (interpersonal and family dynamics); and environmental (including environmental stimuli, change, disruptions in routine, and cultural expectations). This multidimensional model provides a framework for assessment, treatment, and future research aimed at better understanding irritability, as well as the development of assessment tools and treatment interventions.
Revising a conceptual model of partnership and sustainability in global health.
Upvall, Michele J; Leffers, Jeanne M
2018-05-01
Models to guide global health partnerships are rare in the nursing literature. The Conceptual Model for Partnership and Sustainability in Global Health, while significant, was based on Western perspectives. The purpose of this study was to revise the model to include the voices of nurses from low- and middle-resource countries. Grounded theory was used to maintain fidelity with the design of the original model. A purposive sample of 15 participants from a variety of countries in Africa, the Caribbean, and Southeast Asia, all with extensive experience in global health partnerships, was interviewed. Skype and in-person interviews were recorded, using the same questions as the original study. Theoretical coding and a comparison of results with the original study were completed independently by the researchers. The process of global health partnerships was expanded from the original model to include engagement processes and processes for ongoing partnership development. The new concepts of Transparency, Expanded World View, and Accompaniment were included, as well as three broad themes: Geopolitical Influence, Power Differential/Inequities, and Collegial Friendships. The revised conceptual model embodies a more comprehensive model of global health partnerships, with representation of nurses from low- and middle-resource countries. © 2018 Wiley Periodicals, Inc.
Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2015-01-01
HCDstruct is a Matlab®-based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g., created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran's® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased-fidelity conceptual-level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing tip-to-wing tip model for asymmetric analyses like engine-out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.
The Transformative Experience in Engineering Education
NASA Astrophysics Data System (ADS)
Goodman, Katherine Ann
This research evaluates the usefulness of transformative experience (TE) in engineering education. With TE, students 1) apply ideas from coursework to everyday experiences without prompting (motivated use); 2) see everyday situations through the lens of course content (expanded perception); and 3) value course content in new ways because it enriches everyday affective experience (affective value). In a three-part study, we examine how engineering educators can promote student progress toward TE and reliably measure that progress. For the first study, we select a mechanical engineering technical elective, Flow Visualization, that had evidence of promoting expanded perception of fluid physics. Through student surveys and interviews, we compare this elective to the required Fluid Mechanics course. We found student interest in fluids fell into four categories: complexity, application, ubiquity, and aesthetics. Fluid Mechanics promotes interest from application, while Flow Visualization promotes interest based in ubiquity and aesthetics. Coding for expanded perception, we found it associated with students' engineering identity, rather than a specific course. In our second study, we replicate atypical teaching methods from Flow Visualization in a new design course: Aesthetics of Design. Coding of surveys and interviews reveals that open-ended assignments and supportive teams lead to increased ownership of projects, which fuels risk-taking, and produces increased confidence as an engineer. The third study seeks to establish parallels between expanded perception and measurable perceptual expertise. Our visual expertise experiment uses fluid flow images with both novices and experts (students who had passed fluid mechanics). After training, subjects sort images into laminar and turbulent categories. The results demonstrate that novices learned to sort the flow stimuli in ways similar to subjects in prior perceptual expertise studies. 
In contrast, the experts' significantly better results suggest they are accessing conceptual fluids knowledge to perform this new, visual task. The ability to map concepts onto visual information is likely a necessary step toward expanded perception. Our findings suggest that open-ended aesthetic experiences with engineering content unexpectedly support engineering identity development, and that visual tasks could be developed to measure conceptual understanding, promoting expanded perception. Overall, we find TE a productive theoretical framework for engineering education research.
Design for Safety - The Ares Launch Vehicles Paradigm Change
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Maggio, Gaspare
2010-01-01
The lessons learned from S&MA's early involvement in the Ares I launch vehicle design phases proved that performing an in-line function jointly with engineering is critical for S&MA to have an effective role in supporting system, element, and component design. These lessons learned were used to effectively support the Ares V conceptual design phase and planning for post-conceptual design phases. The top-level conceptual loss-of-mission (LOM) assessment for Ares V, performed by the S&MA community jointly with engineering's Advanced Concepts Office (ACO), was influential in the final selection of the Ares V system configuration. Beyond the conceptual phase, an extensive reliability effort should be planned to support future Heavy Lift Launch Vehicle (HLLV) designs. In-depth reliability analysis involving the design, manufacturing, and systems engineering communities is critical to understanding design and process uncertainties and system-integrated failures.
ERIC Educational Resources Information Center
Kapli, Natalia V.
2010-01-01
The study investigated the effects of non-segmented multimedia worked examples (NS-MWE), segmented multimedia worked examples (S-MWE), and segmented multimedia worked examples enhanced with self-explanation prompts (S-MWE-SE) on acquisition of conceptual knowledge and problem solving performance in an undergraduate engineering course. In addition,…
Modular biowaste monitoring system
NASA Technical Reports Server (NTRS)
Fogal, G. L.
1975-01-01
The objective of the Modular Biowaste Monitoring System Program was to generate and evaluate hardware for supporting shuttle life science experimental and diagnostic programs. An initial conceptual design effort established requirements and defined an overall modular system for the collection, measurement, sampling and storage of urine and feces biowastes. This conceptual design effort was followed by the design, fabrication and performance evaluation of a flight prototype model urine collection, volume measurement and sampling capability. No operational or performance deficiencies were uncovered as a result of the performance evaluation tests.
CFD Validation with Experiment and Verification with Physics of a Propellant Damping Device
NASA Technical Reports Server (NTRS)
Yang, H. Q.; Peugeot, John
2011-01-01
This paper will document our effort to validate a coupled fluid-structure interaction CFD tool in predicting the performance of a damping device under laboratory conditions. Consistently good comparisons of "blind" CFD predictions against experimental data under various operating conditions, design parameters, and cryogenic environments will be presented. The power of the coupled CFD-structure interaction code in explaining some unexpected phenomena of the device observed during technology development will be illustrated. The evolution of the damper device design inside the LOX tank will be used to demonstrate the contribution of the tool to the understanding, optimization, and implementation of the LOX damper in the Ares I vehicle. It is due to the present validation effort that the LOX damper technology has matured to TRL 5. The present effort has also contributed to the transition of the technology from an early conceptual observation to the baseline design for thrust oscillation mitigation on the Ares I within a 10-month period.
A Web Terminology Server Using UMLS for the Description of Medical Procedures
Burgun, Anita; Denier, Patrick; Bodenreider, Olivier; Botti, Geneviève; Delamarre, Denis; Pouliquen, Bruno; Oberlin, Philippe; Lévéque, Jean M.; Lukacs, Bertrand; Kohler, François; Fieschi, Marius; Le Beux, Pierre
1997-01-01
The Model for Assistance in the Orientation of a User within Coding Systems (MAOUSSC) project has been designed to provide a representation of medical and surgical procedures that allows several applications to be developed from several viewpoints. It is based on a conceptual model, a controlled set of terms, and Web server development. The design includes the UMLS knowledge sources associated with additional knowledge about medico-surgical procedures. The model was implemented using a relational database. The authors developed a complete interface for the Web presentation, with the intermediary layer written in Perl. The server has been used for the representation of medico-surgical procedures that occur in the discharge summaries of the national survey of hospital activities performed by the French Health Statistics Agency to produce inpatient profiles. The authors describe the current status of the MAOUSSC server and discuss their interest in using such a server to assist in the coordination of terminology tasks and in the sharing of controlled terminologies. PMID:9292841
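The combination of a conceptual model, a controlled terminology, and a relational database can be caricatured in a few lines of SQL. The schema and all rows below are invented for illustration and bear no relation to the actual MAOUSSC or UMLS tables; they only show the term-to-concept-to-code resolution a terminology server performs.

```python
import sqlite3

# Toy terminology-server schema: concepts, synonymous terms, and a mapping
# from local procedure codes to concepts. All names and codes are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE concept  (cui TEXT PRIMARY KEY, preferred_name TEXT);
CREATE TABLE term     (term TEXT, cui TEXT REFERENCES concept(cui));
CREATE TABLE code_map (local_code TEXT, cui TEXT REFERENCES concept(cui));
""")
con.executemany("INSERT INTO concept VALUES (?, ?)",
                [("C0001", "Appendectomy"), ("C0002", "Cholecystectomy")])
con.executemany("INSERT INTO term VALUES (?, ?)",
                [("appendectomy", "C0001"), ("removal of appendix", "C0001")])
con.executemany("INSERT INTO code_map VALUES (?, ?)", [("H123", "C0001")])

# Resolve a free-text term to its concept, then to the local procedure code.
row = con.execute("""
    SELECT c.preferred_name, m.local_code
    FROM term t
    JOIN concept c  ON t.cui = c.cui
    JOIN code_map m ON m.cui = c.cui
    WHERE t.term = ?""", ("removal of appendix",)).fetchone()
print(row)
```

The point of routing every synonym through a concept identifier is exactly the coordination-of-terminologies role the abstract describes: applications agree on concepts even when their local codes and preferred strings differ.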
A simplified analysis of propulsion installation losses for computerized aircraft design
NASA Technical Reports Server (NTRS)
Morris, S. J., Jr.; Nelms, W. P., Jr.; Bailey, R. O.
1976-01-01
A simplified method is presented for computing the installation losses of aircraft gas turbine propulsion systems. The method has been programmed for use in computer aided conceptual aircraft design studies that cover a broad range of Mach numbers and altitudes. The items computed are: inlet size, pressure recovery, additive drag, subsonic spillage drag, bleed and bypass drags, auxiliary air systems drag, boundary-layer diverter drag, nozzle boattail drag, and the interference drag on the region adjacent to multiple nozzle installations. The methods for computing each of these installation effects are described and computer codes for the calculation of these effects are furnished. The results of these methods are compared with selected data for the F-5A and other aircraft. The computer program can be used with uninstalled engine performance information which is currently supplied by a cycle analysis program. The program, including comments, is about 600 FORTRAN statements long, and uses both theoretical and empirical techniques.
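As one concrete example of the installation-loss bookkeeping described, the sketch below implements the widely used MIL-E-5008B inlet total-pressure recovery schedule, a standard conceptual-design correlation; whether the report uses this exact correlation is an assumption.

```python
def inlet_pressure_recovery(mach):
    """MIL-E-5008B standard inlet total-pressure recovery schedule.

    A common conceptual-design correlation: full recovery subsonically,
    1 - 0.075*(M - 1)**1.35 for 1 < M < 5, and 800/(M**4 + 935) above.
    The report's own inlet method may differ in detail.
    """
    if mach <= 1.0:
        return 1.0
    if mach < 5.0:
        return 1.0 - 0.075 * (mach - 1.0) ** 1.35
    return 800.0 / (mach ** 4 + 935.0)

for m in (0.8, 1.5, 2.0, 3.0):
    print(m, round(inlet_pressure_recovery(m), 4))
```

Multiplying the uninstalled engine thrust data by such a recovery-driven correction, alongside spillage, bleed, and boattail terms, is the pattern of installed-performance estimation the abstract outlines.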
Recent developments in deployment analysis simulation using a multi-body computer code
NASA Technical Reports Server (NTRS)
Housner, Jerrold M.
1989-01-01
Deployment is a candidate mode for construction of structural space system components. By its very nature, deployment is a dynamic event, often involving large-angle unfolding of flexible beam members. Validation of proposed designs and conceptual deployment mechanisms is enhanced through analysis. Analysis may be used to determine member loads, thus helping to establish deployment rates and deployment control requirements for a given concept. Furthermore, member flexibility, joint free-play, manufacturing tolerances, and imperfections can affect the reliability of deployment. Analyses which include these effects can aid in reducing risks associated with a particular concept. Ground tests, which can play a role similar to that of analyses, are difficult and expensive to perform. Suspension systems just for vibration ground tests of large space structures in a 1 g environment present many challenges. Suspension of a structure which spatially expands is even more challenging. Analysis validation through experimental confirmation on relatively small, simple models would permit analytical extrapolation to larger, more complex space structures.
Decay Heat Removal in GEN IV Gas-Cooled Fast Reactors
Cheng, Lap-Yan; Wei, Thomas Y. C.
2009-01-01
The safety goal of the current designs of advanced high-temperature thermal gas-cooled reactors (HTRs) is that no core meltdown would occur in a depressurization event with a combination of concurrent safety system failures. This study focused on the analysis of passive decay heat removal (DHR) in a GEN IV direct-cycle gas-cooled fast reactor (GFR), which is based on the technology developments of the HTRs. Given the different criteria and design characteristics of the GFR, an approach different from that taken for the HTRs for passive DHR would have to be explored. Different design options based on maintaining core flow were evaluated by performing transient analysis of a depressurization accident using the system code RELAP5-3D. The study also reviewed the conceptual design of autonomous systems for shutdown decay heat removal and recommends that future work in this area should be focused on the potential for Brayton cycle DHRs.
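For scale, the decay-heat load such a DHR system must reject can be roughed out with a classic Way-Wigner-type correlation. This is a back-of-envelope sketch only: the coefficient is a textbook approximation, the operating time is invented, and system codes such as RELAP5-3D draw on far more detailed decay-heat standards (e.g., ANS-5.1).

```python
def decay_heat_fraction(t_s, t_op_s):
    """Rough Way-Wigner-style decay-heat ratio P/P0 at t_s seconds after
    shutdown, following t_op_s seconds of full-power operation.
    A textbook approximation, not a licensing-grade decay-heat standard."""
    return 0.066 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

t_op = 2 * 365 * 24 * 3600.0  # hypothetical: two years at full power
for t in (1.0, 60.0, 3600.0, 86400.0):
    print(t, decay_heat_fraction(t, t_op))
```

The fraction starts near 6-7% of full power at shutdown and decays slowly, which is why a depressurized GFR, lacking the large graphite heat capacity of an HTR, needs either maintained core flow or a dedicated autonomous DHR loop.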
Structural concept studies for a horizontal cylindrical lunar habitat and a lunar guyed tower
NASA Technical Reports Server (NTRS)
Yin, Paul K.
1990-01-01
A conceptual structural design of a horizontal cylindrical lunar habitat is presented. The design includes the interior floor framing, the exterior support structure, the foundation mat, and the radiation shielding. Particular attention was given to efficiency in shipping and field erection and to the selection of structural materials. Also presented is a conceptual design of a 2000-foot lunar guyed tower. A special field erection scheme is implemented in the design. In order to analyze the overall column buckling of the mast, whose axial compression includes its own body weight, a simple numerical procedure is formulated in a form ready for coding in FORTRAN. Selection of structural materials, the effect of temperature variations, the dynamic response of the tower to moonquakes, and the guy anchoring system are discussed. Proposed field erection concepts for the habitat and for the guyed tower are described.
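The self-weight buckling problem mentioned for the mast has a classical analytic benchmark, Greenhill's critical height for a uniform vertical cantilever, which is useful for checking any numerical procedure of the kind the report formulates. The stiffness and mass values below are hypothetical, and the real mast also carries guy-cable compression, which this sketch ignores.

```python
def greenhill_critical_height(EI, w):
    """Greenhill's classical result for a uniform vertical cantilever
    buckling under its own weight: w * L**3 / EI = 7.8373 at the critical
    height, so L_cr = (7.8373 * EI / w) ** (1/3).
    EI: bending stiffness (N*m^2); w: weight per unit length (N/m)."""
    return (7.8373 * EI / w) ** (1.0 / 3.0)

# Illustrative mast: 100 kg/m, EI = 1e9 N*m^2 (both hypothetical).
m_per_len = 100.0
EI = 1.0e9
for g, site in ((9.81, "Earth"), (1.62, "Moon")):
    print(site, round(greenhill_critical_height(EI, m_per_len * g), 1), "m")
```

Because the critical height scales as g**(-1/3), lunar gravity buys a factor of (9.81/1.62)**(1/3), roughly 1.8, over the terrestrial case before guy loads are even considered.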
Bikkur Holim: the origins of Jewish pastoral care.
Sheer, Charles
2008-01-01
This paper surveys classical Jewish texts--from the Hebrew Bible through Medieval codes--regarding the concept and practice of Bikkur Holim, literally, "the sick visit." How does this literature understand this ethical, religious act; who are the practitioners; what are their objectives? Although the Hebrew Bible does not contain a biblical precedent or legal mandate for Bikkur Holim, various categories of pastoral actions are traced in midrashic and talmudic texts. Their nuances are examined closely and a conceptualization of Jewish pastoral care is identified in a work by thirteenth century rabbi, jurist and physician, Nahmanides. Ezekiel 34 is proposed as the source for the rabbinic term, Bikkur Holim, as well as the conceptual understanding of Jewish pastoral care. Finally, the author posits various questions regarding the implication of his findings on the conduct of Jewish pastoral care, the value of spiritual assessment, and the nature of chaplaincy work in our various religious traditions.
Developing child autonomy in pediatric healthcare: towards an ethical model.
Martakis, Kyriakos; Brand, Helmut; Schröder-Bäck, Peter
2018-06-01
The changes initiated by the new National Civil and Commercial Code in Argentina underline the pediatric task of empowering children's and adolescents' developing autonomy. In this paper, we have framed a model describing autonomy in child healthcare. We carried out a literature review focusing on i) the concept of autonomy referring to the absolute value of the autonomous individual, and ii) the age-driven process of competent decision-making development. We summarized our findings by developing a conceptual model that includes the child, the pediatrician and the parents. The pediatrician-child relationship is based on different forms of guidance and cooperation, resulting in varying levels of activity and passivity. Parental authority influences the extent of autonomy, based on the level of respect for the child's moral equality. Contextual, existential, conceptual, and social-ethical conditions shall be considered when applying the model to facilitate dialogue between pediatricians, children, parents and other actors. Sociedad Argentina de Pediatría.
Using a theory-driven conceptual framework in qualitative health research.
Macfarlane, Anne; O'Reilly-de Brún, Mary
2012-05-01
The role and merits of highly inductive research designs in qualitative health research are well established, and there has been a powerful proliferation of grounded theory method in the field. However, tight qualitative research designs informed by social theory can be useful to sensitize researchers to concepts and processes that they might not necessarily identify through inductive processes. In this article, we provide a reflexive account of our experience of using a theory-driven conceptual framework, the Normalization Process Model, in a qualitative evaluation of general practitioners' uptake of a free, pilot, language interpreting service in the Republic of Ireland. We reflect on our decisions about whether or not to use the Model, and describe our actual use of it to inform research questions, sampling, coding, and data analysis. We conclude with reflections on the added value that the Model and tight design brought to our research.
Exploring the gender gap in the conceptual survey of electricity and magnetism
NASA Astrophysics Data System (ADS)
Henderson, Rachel; Stewart, Gay; Stewart, John; Michaluk, Lynnette; Traxler, Adrienne
2017-12-01
The "gender gap" on various physics conceptual evaluations has been extensively studied. Men's average pretest scores on the Force Concept Inventory and Force and Motion Conceptual Evaluation are 13% higher than women's, and post-test scores are on average 12% higher than women's. This study analyzed the gender differences within the Conceptual Survey of Electricity and Magnetism (CSEM), in which the gender gap has been less well studied and is less consistent. In the current study, data collected from 1407 students (77% men, 23% women) in a calculus-based physics course over ten semesters showed that male students outperformed female students on the CSEM pretest (5%) and post-test (6%). Separate analyses were conducted for qualitative and quantitative problems on lab quizzes and course exams and showed that male students outperformed female students by 3% on qualitative quiz and exam problems. Male and female students performed equally on the quantitative course exam problems. The gender gaps within CSEM post-test scores, qualitative lab quiz scores, and qualitative exam scores were insignificant for students with a CSEM pretest score of 25% or less but grew as pretest scores increased. Structural equation modeling demonstrated that a latent variable, called Conceptual Physics Performance/Non-Quantitative (CPP/NonQnt) and orthogonal to quantitative test performance, was useful in explaining the differences observed in qualitative performance; this variable was most strongly related to CSEM post-test scores. The CPP/NonQnt of male students was 0.44 standard deviations higher than that of female students. The CSEM pretest measured CPP/NonQnt much less accurately for women (R² = 4%) than for men (R² = 17%). The failure to detect a gender gap for students scoring 25% or less on the pretest suggests that the CSEM instrument itself is not gender biased.
The failure to find a performance difference in quantitative test performance while detecting a gap in qualitative performance suggests the qualitative differences do not result from psychological factors such as science anxiety or stereotype threat.
Conceptual design optimization study
NASA Technical Reports Server (NTRS)
Hollowell, S. J.; Beeman, E. R., II; Hiyama, R. M.
1990-01-01
The feasibility of applying multilevel functional decomposition and optimization techniques to conceptual design of advanced fighter aircraft was investigated. Applying the functional decomposition techniques to the conceptual design phase appears to be feasible. The initial implementation of the modified design process will optimize wing design variables. A hybrid approach, combining functional decomposition techniques for generation of aerodynamic and mass properties linear sensitivity derivatives with existing techniques for sizing mission performance and optimization, is proposed.
Curing and caring competences in the skills training of physiotherapy students.
Dahl-Michelsen, Tone
2015-01-01
This article explores the significance of curing and caring competences in physiotherapy education, as well as how curing and caring competences intersect within the professional training of physiotherapy students. The empirical data include participant observations and interviews with students attending skills training in the first year of a bachelor's degree program in Norway. Curing and caring are conceptualized as gender-coded competences. That is, curing and caring are viewed as historical and cultural constructions of masculinities and femininities within the physiotherapy profession, as well as performative actions. The findings illuminate the complexity of curing and caring competences in the skills training of physiotherapy students. Curing and caring are both binary and intertwined competences; however, whereas binary competences are mostly concerned with contextual frames, intertwined competences are mostly concerned with performative aspects. The findings also point to how female and male students attend to curing and caring competences in similar ways; thus, the possibilities of transcending traditional gender norms turn out to be significant in this context. The findings suggest that, although curing somehow remains hegemonic to caring, the future generation of physiotherapists seemingly will be able to use their skills for both caring and curing.
Wave Augmented Diffusers for Centrifugal Compressors
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Skoch, Gary J.
1998-01-01
A conceptual device is introduced which would utilize unsteady wave motion to slow and turn flows in the diffuser section of a centrifugal compressor. The envisioned device would substantially reduce the size of conventional centrifugal diffusers by eliminating the relatively large ninety-degree bend needed to turn the flow from the radial/tangential to the axial direction. The bend would be replaced by a wall, and the flow would instead exit through a series of rotating ports located on a disk, adjacent to the diffuser hub, and fixed to the impeller shaft. The ports would generate both expansion and compression waves which would rapidly transition from the hub/shroud (axial) direction to the radial/tangential direction. The waves would in turn induce radial/tangential and axial flow. This paper presents a detailed description of the device. Simplified cycle analysis and performance results are presented which were obtained using a time-accurate, quasi-one-dimensional CFD code with models for turning, port flow conditions, and losses due to wall shear stress. The results indicate that a periodic wave system can be established which yields diffuser performance comparable to that of a conventional diffuser. Discussion concerning feasibility, accuracy, and integration follows.
Lidar and radar measurements of the melting layer: observations of dark and bright band phenomena
NASA Astrophysics Data System (ADS)
Di Girolamo, P.; Summa, D.; Cacciani, M.; Norton, E. G.; Peters, G.; Dufournet, Y.
2012-05-01
Multi-wavelength lidar measurements in the melting layer revealing the presence of dark and bright bands were performed by the University of BASILicata Raman lidar system (BASIL) during a stratiform rain event. Simultaneously, radar measurements were also performed from the same site by the University of Hamburg cloud radar MIRA 36 (35.5 GHz), the University of Hamburg dual-polarization micro rain radar (24.15 GHz), and the University of Manchester UHF wind profiler (1.29 GHz). Measurements from BASIL and the radars are illustrated and discussed in this paper for a specific case study on 23 July 2007 during the Convective and Orographically-induced Precipitation Study (COPS). Simulations of the lidar dark and bright bands based on the application of concentric/eccentric-sphere Lorentz-Mie codes and a melting layer model are also provided. Lidar and radar measurements and model results are compared with measurements from a disdrometer on the ground and a two-dimensional cloud (2DC) probe on board the ATR42 SAFIRE. Measurements and model results are found to confirm and support the conceptual microphysical/scattering model elaborated by Sassen et al. (2005).
On the Performance of Alternate Conceptual Ecohydrological Models for Streamflow Prediction
NASA Astrophysics Data System (ADS)
Naseem, Bushra; Ajami, Hoori; Cordery, Ian; Sharma, Ashish
2016-04-01
A merging of a lumped conceptual hydrological model with two conceptual dynamic vegetation models is presented to assess the performance of these models for simultaneous simulations of streamflow and leaf area index (LAI). Two conceptual dynamic vegetation models with differing representations of ecological processes are merged with a lumped conceptual hydrological model (HYMOD) to predict catchment-scale streamflow and LAI. The merged RR-LAI-I model computes relative leaf biomass based on transpiration rates, while the RR-LAI-II model computes above-ground green and dead biomass based on net primary productivity and water-use efficiency in response to soil moisture dynamics. To assess the performance of these models, daily discharge and the 8-day MODIS LAI product for 27 catchments of 90-1600 km² located in the Murray-Darling Basin in Australia are used. Our results illustrate that when single-objective optimisation focused on maximising the objective function for streamflow or LAI, the other, un-calibrated predicted outcome (LAI if streamflow is the focus) was consistently compromised. Thus, single-objective optimisation cannot take into account the essence of all processes in the conceptual ecohydrological models. However, multi-objective optimisation showed great strength for streamflow and LAI predictions. Both response outputs were better simulated by RR-LAI-II than RR-LAI-I due to the better representation of physical processes such as net primary productivity (NPP) in RR-LAI-II. Our results highlight that simultaneous calibration of streamflow and LAI using a multi-objective algorithm proves to be an attractive tool for improved streamflow predictions.
Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling
NASA Astrophysics Data System (ADS)
March, Salvatore T.; Allen, Gove N.
Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.
Leffler, Daniel A; Acaster, Sarah; Gallop, Katy; Dennis, Melinda; Kelly, Ciarán P; Adelman, Daniel C
2017-04-01
Celiac disease is a chronic inflammatory condition with wide-ranging effects on individuals' lives, caused by a combination of symptoms and the burden of adhering to a gluten-free diet (GFD). The aims were to further understand patients' experience of celiac disease and its impact on health-related quality of life (HRQOL), and to develop a conceptual model describing this impact. Adults with celiac disease on a GFD reporting symptoms within the previous 3 months were included; patients with refractory celiac disease and confounding medical conditions were excluded. A semistructured discussion guide was developed exploring celiac disease symptoms and their impact on patients' HRQOL. An experienced interviewer conducted in-depth interviews. The data set was coded and analyzed using thematic analysis to identify concepts, themes, and the inter-relationships between them. Data saturation was monitored, and the concepts identified formed the basis of the conceptual model. Twenty-one participants were recruited; 32 distinct gluten-related symptoms were reported, and data saturation was reached. Analysis identified several themes impacting patients' HRQOL: fears and anxiety, day-to-day management of celiac disease, physical functioning, sleep, daily activities, social activities, emotional functioning, and relationships. The conceptual model highlights the main areas of impact and the relationships between concepts. Both symptoms and maintaining a GFD have a substantial impact on patient functioning and HRQOL in adults with celiac disease. The conceptual model derived from these data may help in designing future patient-reported outcomes as well as interventions to improve quality of life in individuals with celiac disease. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Brunton, Ginny; Thomas, James; O'Mara-Eves, Alison; Jamal, Farah; Oliver, Sandy; Kavanagh, Josephine
2017-12-11
Government policy increasingly supports engaging communities to promote health. It is critical to consider whether such strategies are effective, for whom, and under what circumstances. However, 'community engagement' is defined in diverse ways and employed for different reasons. Considering theory and context, we developed a conceptual framework which informs understanding about what makes an effective (or ineffective) community engagement intervention. We conducted a systematic review of community engagement in public health interventions using: stakeholder involvement; searching, screening, appraisal and coding of the research literature; and iterative thematic syntheses and meta-analysis. A conceptual framework of community engagement was refined following interactions between the framework and each review stage. From 335 included reports, three products emerged: (1) two strong theoretical 'meta-narratives', one concerning the theory and practice of empowerment/engagement as an independent objective, and another, more utilitarian, perspective on optimally configuring health services to achieve defined outcomes. These informed (2) models that were operationalized in subsequent meta-analysis. Both refined (3) the final conceptual framework. This identified multiple dimensions by which community engagement interventions may differ. Diverse combinations of intervention purpose, theory and implementation were noted, including: ways of defining communities and health needs; initial motivations for community engagement; types of participation; conditions and actions necessary for engagement; and potential issues influencing impact. Some dimensions consistently co-occurred, leading to three overarching models of effective engagement which either: utilised peer-led delivery; employed varying degrees of collaboration between communities and health services; or built on empowerment philosophies.
Our conceptual framework and models are useful tools for considering appropriate and effective approaches to community engagement. These should be tested and adapted to facilitate intervention design and evaluation. Using this framework may disentangle the relative effectiveness of different models of community engagement, promoting effective, sustainable and appropriate initiatives.
Slaughter, Susan E; Bampton, Erin; Erin, Daniel F; Ickert, Carla; Jones, C Allyson; Estabrooks, Carole A
2017-06-01
Innovative approaches are required to facilitate the adoption and sustainability of evidence-based care practices. We propose a novel implementation strategy, a peer reminder role, which involves offering a brief formal reminder to peers during structured unit meetings. This study aims to (a) identify healthcare aide (HCA) perceptions of a peer reminder role for HCAs, and (b) develop a conceptual framework for the role based on these perceptions. In 2013, a qualitative focus group study was conducted in five purposively sampled residential care facilities in western Canada. A convenience sample of 24 HCAs agreed to participate in five focus groups. Concurrent with data collection, two researchers coded the transcripts and identified themes by consensus. They jointly determined when saturation was achieved and took steps to optimize the trustworthiness of the findings. Five HCAs from the original focus groups commented on the resulting conceptual framework. HCAs were cautious about accepting a role that might alienate them from their co-workers. They emphasized feeling comfortable with the peer reminder role and identified circumstances that would optimize their comfort including: effective implementation strategies, perceptions of the role, role credibility and a supportive context. These intersecting themes formed a peer reminder conceptual framework. We identified HCAs' perspectives of a new peer reminder role designed specifically for them. Based on their perceptions, a conceptual framework was developed to guide the implementation of a peer reminder role for HCAs. This role may be a strategic implementation strategy to optimize the sustainability of new practices in residential care settings, and the related framework could offer guidance on how to implement this role. © 2017 Sigma Theta Tau International.
MASCOT - MATLAB Stability and Control Toolbox
NASA Technical Reports Server (NTRS)
Kenny, Sean; Crespo, Luis
2011-01-01
MASCOT software was created to provide the conceptual aircraft designer accurate predictions of air vehicle stability and control characteristics. The code takes as input mass property data in the form of an inertia tensor, aerodynamic loading data, and propulsion (i.e., thrust) loading data. Using fundamental non-linear equations of motion, MASCOT then calculates vehicle trim and static stability data for any desired flight condition. Common predefined flight conditions are included: six horizontal and six landing rotation conditions with varying options for engine out, crosswind and sideslip, plus three takeoff rotation conditions. Results are displayed through a unique graphical interface developed to provide stability and control information to conceptual design engineers, using a qualitative scale indicating whether the vehicle has acceptable, marginal, or unacceptable static stability characteristics. This software allows the user to prescribe the vehicle's CG location, mass, and inertia tensor so that any loading configuration between empty weight and maximum take-off weight can be analyzed. The required geometric and aerodynamic data as well as mass and inertia properties may be entered directly, passed through data files, or come from external programs such as Vehicle Sketch Pad (VSP). The current version of MASCOT has been tested with VSP used to compute the required data, which is then passed directly into the program. In VSP, the vehicle geometry is created and manipulated. The aerodynamic coefficients and the stability and control derivatives are calculated using VorLax, which is now available directly within VSP. MASCOT has been written exclusively in the technical computing language MATLAB. This innovation is able to bridge the gap between low-fidelity conceptual design and higher-fidelity stability and control analysis.
This new tool enables the conceptual design engineer to include detailed static stability and trim constraints in the conceptual design loop. The unique graphical interface developed for this tool presents the stability data in a format that is understandable by the conceptual designer, yet also provides the detailed quantitative results if desired.
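As an illustration of the qualitative scale, longitudinal static stability is often summarized by the static margin (distance from the CG to the neutral point, normalized by the mean aerodynamic chord). The Python sketch below buckets a static margin into the three verdicts; the thresholds and function name are hypothetical, chosen only for illustration, and are not MASCOT's actual criteria:

```python
def classify_static_margin(x_np, x_cg, mac):
    """Qualitative longitudinal static stability from the static margin
    SM = (x_np - x_cg) / MAC (positive when the CG lies ahead of the
    neutral point). Thresholds are illustrative, not MASCOT's."""
    sm = (x_np - x_cg) / mac
    if sm >= 0.05:
        return "acceptable"      # comfortably stable
    if sm >= 0.0:
        return "marginal"        # neutrally to weakly stable
    return "unacceptable"        # statically unstable

# Example: CG at 0.45 MAC, neutral point at 0.60 MAC
verdict = classify_static_margin(x_np=0.60, x_cg=0.45, mac=1.0)
```

A real tool would derive the neutral-point location from the aerodynamic derivatives (e.g., those computed by VorLax); the classifier only shows how a continuous stability metric maps onto a designer-facing qualitative scale.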
Liu, Charles; Kayima, Peter; Riesel, Johanna; Situma, Martin; Chang, David; Firth, Paul
2017-11-01
The lack of a classification system for surgical procedures in resource-limited settings hinders outcomes measurement and reporting. Existing procedure coding systems are prohibitively large and expensive to implement. We describe the creation and prospective validation of 3 brief procedure code lists applicable in low-resource settings, based on analysis of surgical procedures performed at Mbarara Regional Referral Hospital, Uganda's second largest public hospital. We reviewed operating room logbooks to identify all surgical operations performed at Mbarara Regional Referral Hospital during 2014. Based on the documented indication for surgery and procedure(s) performed, we assigned each operation up to 4 procedure codes from the International Classification of Diseases, 9th Revision, Clinical Modification. Coding of procedures was performed by 2 investigators, and a random 20% of procedures were coded by both investigators. These codes were aggregated to generate procedure code lists. During 2014, 6,464 surgical procedures were performed at Mbarara Regional Referral Hospital, to which we assigned 435 unique procedure codes. Substantial inter-rater reliability was achieved (κ = 0.7037). The 111 most common procedure codes accounted for 90% of all codes assigned, 180 accounted for 95%, and 278 accounted for 98%. We considered these sets of codes as 3 procedure code lists. In a prospective validation, we found that these lists described 83.2%, 89.2%, and 92.6% of surgical procedures performed at Mbarara Regional Referral Hospital during August to September of 2015, respectively. Empirically generated brief procedure code lists based on International Classification of Diseases, 9th Revision, Clinical Modification can be used to classify almost all surgical procedures performed at a Ugandan referral hospital. 
Such a standardized procedure coding system may enable better surgical data collection for administration, research, and quality improvement in resource-limited settings. Copyright © 2017 Elsevier Inc. All rights reserved.
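The aggregation described, taking the most frequent codes until they account for 90%, 95%, or 98% of all assignments, is straightforward to express. A minimal Python sketch with made-up example data follows; the function name and frequencies are illustrative, not the study's:

```python
from collections import Counter

def coverage_lists(assigned_codes, thresholds=(0.90, 0.95, 0.98)):
    """For each threshold, return the shortest list of the most frequent
    codes whose occurrences reach that share of all code assignments
    (the mechanism behind the 111/180/278-code lists)."""
    counts = Counter(assigned_codes).most_common()
    total = sum(c for _, c in counts)
    lists = {}
    for t in thresholds:
        covered, chosen = 0, []
        for code, c in counts:
            if covered / total >= t:
                break
            chosen.append(code)
            covered += c
        lists[t] = chosen
    return lists

# Hypothetical frequency profile: one dominant code plus a short tail
example = ['a'] * 90 + ['b'] * 5 + ['c'] * 3 + ['d'] * 2
lists = coverage_lists(example)
```

In this toy profile the 90% list needs a single code while the 98% list needs three, mirroring how a skewed procedure distribution lets short lists cover most operations.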
Conceptual design study for a teleoperator visual system, phase 1
NASA Technical Reports Server (NTRS)
Adams, D.; Grant, C.; Johnson, C.; Meirick, R.; Polhemus, C.; Ray, A.; Rittenhouse, D.; Skidmore, R.
1972-01-01
Results are reported for work performed during the first phase of the conceptual design study for a teleoperator visual system. This phase consists of four tasks: general requirements, concept development, subsystem requirements and analysis, and concept evaluation.
Performance measurement for people with multiple chronic conditions: conceptual model.
Giovannetti, Erin R; Dy, Sydney; Leff, Bruce; Weston, Christine; Adams, Karen; Valuck, Tom B; Pittman, Aisha T; Blaum, Caroline S; McCann, Barbara A; Boyd, Cynthia M
2013-10-01
Improving quality of care for people with multiple chronic conditions (MCCs) requires performance measures reflecting the heterogeneity and scope of their care. Since most existing measures are disease specific, performance measures must be refined and new measures must be developed to address the complexity of care for those with MCCs. To describe development of the Performance Measurement for People with Multiple Chronic Conditions (PM-MCC) conceptual model. Framework development and a national stakeholder panel. We used reviews of existing conceptual frameworks of performance measurement, review of the literature on MCCs, input from experts in the multistakeholder Steering Committee, and public comment. The resulting model centers on the patient and family goals and preferences for care in the context of multiple care sites and providers, the type of care they are receiving, and the national priority domains for healthcare quality measurement. This model organizes measures into a comprehensive framework and identifies areas where measures are lacking. In this context, performance measures can be prioritized and implemented at different levels, in the context of patients' overall healthcare needs.
Performance Analysis of New Binary User Codes for DS-CDMA Communication
NASA Astrophysics Data System (ADS)
Usha, Kamle; Jaya Sankar, Kottareddygari
2016-03-01
This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over an additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes: an n-bit Gray code is appended with its n-bit inverse Gray code to construct a 2n-length binary user code. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and its even multiples are available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work, the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. The performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over the AWGN channel is also discussed. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
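The construction invites a short sketch. The Python snippet below builds 2n-chip bipolar user codes under one plausible reading of the abstract (the k-th code is the n-bit Gray codeword of k followed by its n-bit Gray decoding, with bits mapped to ±1); the function names and the correlation check are illustrative, not taken from the paper:

```python
def gray(i):
    """Gray code of integer i."""
    return i ^ (i >> 1)

def inverse_gray(g):
    """Integer whose Gray code is g (Gray decoding)."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def user_code(k, n):
    """2n-chip bipolar code: n Gray bits followed by n inverse-Gray bits."""
    bits = [(gray(k) >> j) & 1 for j in reversed(range(n))]
    bits += [(inverse_gray(k) >> j) & 1 for j in reversed(range(n))]
    return [1 if b else -1 for b in bits]

def cross_corr(a, b):
    """Correlation at zero lag (inner product of chip sequences)."""
    return sum(x * y for x, y in zip(a, b))

n = 3
codes = [user_code(k, n) for k in range(2 ** n)]
```

For n = 3 this yields eight codes of length 6; whether this particular pairing of Gray codeword and decoding reproduces the paper's length-6 sets is an assumption, but it shows how the appended structure fixes the code length at twice the Gray-code width.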
Preserved conceptual priming in Alzheimer's disease.
Martins, Carla A R; Lloyd-Jones, Toby J
2006-10-01
We assessed Alzheimer's disease (AD) and healthy older adult control (HC) group performance on: (1) a conceptual priming task, in which participants had to make a semantic decision as to whether a degraded picture of an object encountered previously belonged to the category of living or non-living things; and (2) a recognition memory task. The AD group showed a dissociation between impaired performance on the recognition task and preserved priming for semantic decisions to degraded pictures. We argue that it is not whether priming is conceptual or perceptual that is important for the observation of priming in AD; rather, it is the nature of the response that is required (cf. Gabrieli et al., 1999).
NASA Astrophysics Data System (ADS)
Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf
2016-11-01
This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by referring to the bit error rate (BER), signal-to-noise ratio (SNR), and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10⁻⁹, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
2011-01-01
Background The National Institute for Health Research (NIHR) was established in 2006 with the aim of creating an applied health research system embedded within the English National Health Service (NHS). NIHR sought to implement an approach for monitoring its performance that effectively linked early indicators of performance with longer-term research impacts. We attempted to develop and apply a conceptual framework for defining appropriate key performance indicators for NIHR. Method Following a review of relevant literature, a conceptual framework for defining performance indicators for NIHR was developed, based on a hybridisation of the logic model and balanced scorecard approaches. This framework was validated through interviews with key NIHR stakeholders and a pilot in one division of NIHR, before being refined and applied more widely. Indicators were then selected and aggregated to create a basket of indicators aligned to NIHR's strategic goals, which could be reported to NIHR's leadership team on a quarterly basis via an oversight dashboard. Results Senior health research system managers and practitioners endorsed the conceptual framework developed and reported satisfaction with the breadth and balance of indicators selected for reporting. Conclusions The use of the hybrid conceptual framework provides a pragmatic approach to defining performance indicators that are aligned to the strategic aims of a health research system. The particular strength of this framework is its capacity to provide an empirical link, over time, between upstream activities of a health research system and its long-term strategic objectives. PMID:21435265
Eton, David T; Ramalho de Oliveira, Djenane; Egginton, Jason S; Ridgeway, Jennifer L; Odell, Laura; May, Carl R; Montori, Victor M
2012-01-01
Burden of treatment refers to the workload of health care as well as its impact on patient functioning and well-being. We set out to build a conceptual framework of issues descriptive of burden of treatment from the perspective of the complex patient, as a first step in the development of a new patient-reported measure. We conducted semistructured interviews with patients seeking medication therapy management services at a large, academic medical center. All patients had a complex regimen of self-care (including polypharmacy), and were coping with one or more chronic health conditions. We used framework analysis to identify and code themes and subthemes. A conceptual framework of burden of treatment was outlined from emergent themes and subthemes. Thirty-two patients (20 female, 12 male, age 26-85 years) were interviewed. Three broad themes of burden of treatment emerged including: the work patients must do to care for their health; problem-focused strategies and tools to facilitate the work of self-care; and factors that exacerbate the burden felt. The latter theme encompasses six subthemes including challenges with taking medication, emotional problems with others, role and activity limitations, financial challenges, confusion about medical information, and health care delivery obstacles. We identified several key domains and issues of burden of treatment amenable to future measurement and organized them into a conceptual framework. Further development work on this conceptual framework will inform the derivation of a patient-reported measure of burden of treatment.
NASA Astrophysics Data System (ADS)
Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria
2017-07-01
In this paper, a new family of two-dimensional spectral/spatial codes, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for spectral and spatial coding. The proposed system can fully eliminate multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise, and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D extended enhanced double weight (2D-Extended-EDW), and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that the proposed codes have the best performance. By keeping the same code length and increasing the spatial code length, the performance of the 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.
An analogue conceptual rainfall-runoff model for educational purposes
NASA Astrophysics Data System (ADS)
Herrnegger, Mathew; Riedl, Michael; Schulz, Karsten
2016-04-01
Conceptual rainfall-runoff models, in which runoff processes are modelled with a series of connected linear and non-linear reservoirs, remain widely applied tools in science and practice. Additionally, the concept is appreciated in teaching because of its relative simplicity in explaining and exploring hydrological processes of catchments. However, when a series of reservoirs is used, the model system becomes highly parametrized and complex, and the traceability of the model results becomes more difficult to explain to an audience not accustomed to numerical modelling. Since the simulations are normally performed with invisible digital code, the results are also not easily comprehensible. This contribution therefore presents a liquid analogue model, in which a conceptual rainfall-runoff model is reproduced by a physical model. It consists of different acrylic glass containers representing different storage components within a catchment, e.g. soil water or groundwater storage. The containers are equipped and connected with pipes, in which water movement represents different flow processes, e.g. surface runoff, percolation or base flow. Water from a storage container is pumped to the upper part of the model and represents effective rainfall input. The water then flows by gravity through the different pipes and storages. Valves are used to control the flows within the analogue model, comparable to the parameterization procedure in numerical models. Additionally, an inexpensive microcontroller-based board and sensors are used to measure storage water levels, with online visualization of the states as time series, building a bridge between the analogue and digital worlds. The ability to physically witness the different flows and water levels in the storages makes the analogue model attractive to the audience.
Hands-on experiments can be performed with students, in which different scenarios or catchment types can be simulated, not only with the analogue model but also in parallel with the digital model, thereby connecting the real world with science. The effects of different parameterization setups, which are important not only in the hydrological sciences, can be shown in a tangible way. The use of the analogue model in the context of "children meet University" events seems an attractive approach for showing a younger audience the basic ideas of catchment modelling concepts, which would otherwise not be possible.
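The digital twin of such an analogue model can itself be written in a few lines. Below is a minimal, hypothetical two-store sketch in Python (soil and groundwater as linear reservoirs); the rate constants play the role of the valves and are invented for illustration, not calibrated values:

```python
def simulate(rain, k_fast=0.3, k_perc=0.1, k_base=0.05):
    """Two linear reservoirs: rain fills a soil store that releases
    fast runoff and percolates to a groundwater store, which in turn
    releases base flow. Rate constants mimic the analogue valves."""
    s_soil = s_gw = 0.0
    flows = []
    for p in rain:
        s_soil += p                   # effective rainfall input
        q_fast = k_fast * s_soil      # surface runoff
        q_perc = k_perc * s_soil      # percolation to groundwater
        s_soil -= q_fast + q_perc
        s_gw += q_perc
        q_base = k_base * s_gw        # base flow
        s_gw -= q_base
        flows.append(q_fast + q_base)
    return flows

# A single rain pulse followed by a dry spell produces a recession curve
hydrograph = simulate([10.0] + [0.0] * 20)
```

Running the same pulse with different valve settings (the k_* constants) gives a tangible parallel to turning the physical valves: larger k_fast sharpens the peak, larger k_base sustains the recession.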
Lim-Dunham, Jennifer E; Ensminger, David C; McNulty, John A; Hoyt, Amy E; Chandrasekhar, Arcot J
2016-02-01
The principles of Collins' cognitive apprenticeship model were used to design a radiology curriculum in which medical students practice radiological skills using online case-based modules. The modules are embedded within clinical third-year clerkships, and students are provided with personalized feedback from the instructors. We describe the development of the vertical online radiology curriculum and evaluate its impact on student achievement and learning process using a mixed-methods approach. The curriculum was developed over a 2-year period. Student participation was voluntary in the first year and mandatory in the second year. For quantitative curriculum evaluation, student metrics for the voluntary versus mandatory groups were assessed using independent-sample t tests and variable-entry-method regression analysis. For qualitative analysis, responses from a survey of students about the value of the curriculum were organized into defined themes using consensus coding. Mandatory participation significantly improved (p = .001) the mean radiology examination score (82%) compared to the voluntary group (73%), suggesting that mandatory participation had a beneficial effect on student performance. Potential preexisting differences in underlying general academic performance were accounted for by including mean basic science grades as the first variable in the regression model. The significant increase in R² from .16 to .28 when the number of radiology cases completed was added to the original model, and the greater value of the standardized beta for this variable, suggest that the curriculum made a significant contribution to students' radiology examination scores beyond their baseline academic performance. Five dominant themes about curricular characteristics that enhanced student learning and beneficial outcomes emerged from consensus coding.
These themes were (1) self-paced design, (2) receiving feedback from faculty, (3) clinical relevance of cases, (4) gaining confidence in interpreting radiological images, and (5) transfer of conceptual knowledge to actual practice. The vertically integrated online radiology curriculum can positively impact student performance and learning process in the context of the cognitive apprenticeship model. Copyright © 2015 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
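The hierarchical-regression step, adding the number of radiology cases completed to a baseline model and inspecting the change in R², can be sketched with synthetic data. All variable names, effect sizes, and the sample below are invented; only the ΔR² procedure mirrors the analysis described:

```python
import numpy as np

def r_squared(X, y):
    """R² of an ordinary-least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 200
grades = rng.normal(size=n)             # baseline academic performance
cases = rng.normal(size=n)              # radiology cases completed
exam = 0.4 * grades + 0.35 * cases + rng.normal(scale=0.8, size=n)

r2_base = r_squared(grades.reshape(-1, 1), exam)             # model 1
r2_full = r_squared(np.column_stack([grades, cases]), exam)  # model 2
delta_r2 = r2_full - r2_base            # analogous to .28 - .16 in the study
```

Because the models are nested, r2_full can never fall below r2_base; a meaningful ΔR² for the added predictor is what licenses the claim that the curriculum contributed beyond baseline performance.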
Performance and structure of single-mode bosonic codes
NASA Astrophysics Data System (ADS)
Albert, Victor V.; Noh, Kyungjoo; Duivenvoorden, Kasper; Young, Dylan J.; Brierley, R. T.; Reinhold, Philip; Vuillot, Christophe; Li, Linshu; Shen, Chao; Girvin, S. M.; Terhal, Barbara M.; Jiang, Liang
2018-03-01
The early Gottesman, Kitaev, and Preskill (GKP) proposal for encoding a qubit in an oscillator has recently been followed by cat- and binomial-code proposals. Numerically optimized codes have also been proposed, and we introduce codes of this type here. These codes have yet to be compared using the same error model; we provide such a comparison by determining the entanglement fidelity of all codes with respect to the bosonic pure-loss channel (i.e., photon loss) after the optimal recovery operation. We then compare achievable communication rates of the combined encoding-error-recovery channel by calculating the channel's hashing bound for each code. Cat and binomial codes perform similarly, with binomial codes outperforming cat codes at small loss rates. Despite not being designed to protect against the pure-loss channel, GKP codes significantly outperform all other codes for most values of the loss rate. We show that the performance of GKP and some binomial codes increases monotonically with increasing average photon number of the codes. In order to corroborate our numerical evidence of the cat-binomial-GKP order of performance occurring at small loss rates, we analytically evaluate the quantum error-correction conditions of those codes. For GKP codes, we find an essential singularity in the entanglement fidelity in the limit of vanishing loss rate. In addition to comparing the codes, we draw parallels between binomial codes and discrete-variable systems. First, we characterize one- and two-mode binomial as well as multiqubit permutation-invariant codes in terms of spin-coherent states. Such a characterization allows us to introduce check operators and error-correction procedures for binomial codes. Second, we introduce a generalization of spin-coherent states, extending our characterization to qudit binomial codes and yielding a multiqudit code.
Cognitive components of picture naming.
Johnson, C J; Paivio, A; Clark, J M
1996-07-01
A substantial research literature documents the effects of diverse item attributes, task conditions, and participant characteristics on the ease of picture naming. The authors review what the research has revealed about 3 generally accepted stages of naming a pictured object: object identification, name activation, and response generation. They also show that dual coding theory gives a coherent and plausible account of these findings without positing amodal conceptual representations, and they identify issues and methods that may further advance the understanding of picture naming and related cognitive tasks.
Neutronics Assessments for a RIA Fragmentation Line Beam Dump Concept
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boles, J L; Reyes, S; Ahle, L E
Heavy ion and radiation transport calculations are in progress for conceptual beam dump designs for the fragmentation line of the proposed Rare Isotope Accelerator (RIA). Using the computer code PHITS, a preliminary design of a motor-driven rotating wheel beam dump and adjacent downstream multipole has been modeled. Selected results of these calculations are given, including neutron and proton flux in the wheel, absorbed dose and displacements per atom in the hub materials, and heating from prompt radiation and from decay heat in the multipole.
Neighborhood Effects on Health: Concentrated Advantage and Disadvantage
Finch, Brian K.; Do, D. Phuong; Heron, Melonie; Bird, Chloe; Seeman, Teresa; Lurie, Nicole
2010-01-01
We investigate an alternative conceptualization of neighborhood context and its association with health. Using an index that measures a continuum of concentrated advantage and disadvantage, we examine whether the relationship between neighborhood conditions and health varies by socio-economic status. Using NHANES III data geo-coded to census tracts, we find that while largely uneducated neighborhoods are universally deleterious, individuals with more education benefit from living in highly educated neighborhoods to a greater degree than individuals with lower levels of education. PMID:20627796
Advanced Data Collection for Inventory Management
NASA Technical Reports Server (NTRS)
Opresko, G. A.; Leet, J. H.; Mcgrath, D. F.; Eidson, J.
1987-01-01
Bar-coding, radio-frequency, and voice-operated systems selected. Report discusses study of state-of-the-art in automated collection of data for management of large inventories. Study included comprehensive search of literature on data collection and inventory management, visits to existing automated inventory systems, and tours of selected supply and transportation facilities at Kennedy Space Center. Information collected analyzed in view of needs of conceptual inventory-management systems for Kennedy Space Center and for manned space station and other future space projects.
Error Control Coding Techniques for Space and Satellite Communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Takeshita, Oscar Y.; Cabral, Hermano A.
1998-01-01
It is well known that the BER performance of a parallel concatenated turbo-code improves roughly as 1/N, where N is the information block length. However, it has been observed by Benedetto and Montorsi that for most parallel concatenated turbo-codes, the FER performance does not improve monotonically with N. In this report, we study the FER of turbo-codes, and the effects of their concatenation with an outer code. Two methods of concatenation are investigated: across several frames and within each frame. Some asymmetric codes are shown to have excellent FER performance with an information block length of 16384. We also show that the proposed outer coding schemes can improve the BER performance as well by eliminating pathological frames generated by the iterative MAP decoding process.
When pretesting fails to enhance learning concepts from reading texts.
Hausman, Hannah; Rhodes, Matthew G
2018-05-03
Prior research suggests that people can learn more from reading a text when they attempt to answer pretest questions first. Specifically, pretests on factual information explicitly stated in a text increase the likelihood that participants can answer identical questions after reading, compared with not answering pretest questions. Yet, a central goal of education is to develop deep conceptual understanding. The present experiments investigated whether conceptual pretests facilitate learning concepts from reading texts. In Experiment 1, participants were given factual or conceptual pretest questions; a control group was not given a pretest. Participants then read a passage and took a final test consisting of both factual and conceptual questions. Some of the final test questions were repeated from the pretest and some were new. Although factual pretesting improved learning for identical factual questions, conceptual pretesting did not enhance conceptual learning. Conceptual pretest errors were significantly more likely to be repeated on the final test than factual pretest errors. Providing correct answers (Experiment 2) or correct/incorrect feedback (Experiment 3) following pretest questions enhanced performance on repeated conceptual test items, although these benefits likely reflect memorization and not conceptual understanding. Thus, pretesting appears to provide little benefit for learning conceptual information. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Los Alamos radiation transport code system on desktop computing platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.
A Conceptual Framework for Evaluation of Public Health and Primary Care System Performance in Iran
Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Sari, Ali Akbari; Mesdaghinia, Alireza
2015-01-01
Introduction: The main objective of this study was to design a conceptual framework, according to the policies and priorities of the Ministry of Health, to evaluate provincial public health and primary care performance and to assess their share in the overall health impacts of the community. Methods: We used several tools and techniques, including systems thinking, a literature review to identify relevant attributes of health system performance frameworks, and interviews with the key stakeholders. The PubMed, Scopus, Web of Science, and Google Scholar databases and two specialized databases of Persian-language literature (IranMedex and SID) were searched using main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among 602 obtained indicators in a four-stage process, for monitoring and evaluation of Health Deputies. Results: We proposed a conceptual framework by identifying the performance area for Health Deputies among other determinants of health, as well as introducing a chain of results for performance, consisting of input, process, output, and outcome indicators. We also proposed 5 dimensions for measuring the performance of Health Deputies: efficiency, effectiveness, equity, access, and improvement of health status. Conclusion: The proposed conceptual framework clearly illustrates the Health Deputies' success in achieving the best health results and outcomes in the country. The commitment of the Ministry of Health and the Health Deputies at the University of Medical Sciences is essential for full implementation of this framework and for providing the annual performance report. PMID:25946937
Predicting the Performance of an Axial-Flow Compressor
NASA Technical Reports Server (NTRS)
Steinke, R. J.
1986-01-01
Stage-stacking computer code (STGSTK) developed for predicting off-design performance of multistage axial-flow compressors. Code uses meanline stage-stacking method. Stage and cumulative compressor performance calculated from representative meanline velocity diagrams located at rotor inlet and outlet meanline radii. Numerous options available within code. Code developed so users can modify correlations to suit their needs.
Conceptualisations of infinity by primary pre-service teachers
NASA Astrophysics Data System (ADS)
Date-Huxtable, Elizabeth; Cavanagh, Michael; Coady, Carmel; Easey, Michael
2018-05-01
As part of the Opening Real Science: Authentic Mathematics and Science Education for Australia project, an online mathematics learning module embedding conceptual thinking about infinity in science-based contexts, was designed and trialled with a cohort of 22 pre-service teachers during 1 week of intensive study. This research addressed the question: "How do pre-service teachers conceptualise infinity mathematically?" Participants argued the existence of infinity in a summative reflective task, using mathematical and empirical arguments that were coded according to five themes: definition, examples, application, philosophy and teaching; and 17 codes. Participants' reflections were differentiated as to whether infinity was referred to as an abstract (A) or a real (R) concept or whether both (B) codes were used. Principal component analysis of the reflections, using frequency of codings, revealed that A and R codes occurred at different frequencies in three groups of reflections. Distinct methods of argument were associated with each group of reflections: mathematical numerical examples and empirical measurement comparisons characterised arguments for infinity as an abstract concept, geometric and empirical dynamic examples and belief statements characterised arguments for infinity as a real concept and empirical measurement and mathematical examples and belief statements characterised arguments for infinity as both an abstract and a real concept. An implication of the results is that connections between mathematical and empirical applications of infinity may assist pre-service teachers to contrast finite with infinite models of the world.
NASA Astrophysics Data System (ADS)
Wang, Yayong
2010-06-01
A large number of buildings were seriously damaged or collapsed in the “5.12” Wenchuan earthquake. Based on field surveys and studies of damage to different types of buildings, seismic design codes have been updated. This paper briefly summarizes some of the major revisions that have been incorporated into the “Standard for classification of seismic protection of building constructions GB50223-2008” and “Code for Seismic Design of Buildings GB50011-2001.” The definition of seismic fortification class for buildings has been revisited, and as a result, the seismic classifications for schools, hospitals and other buildings that hold large populations such as evacuation shelters and information centers have been upgraded in the GB50223-2008 Code. The main aspects of the revised GB50011-2001 code include: (a) modification of the seismic intensity specified for the Provinces of Sichuan, Shanxi and Gansu; (b) basic conceptual design for retaining walls and building foundations in mountainous areas; (c) regularity of building configuration; (d) integration of masonry structures and pre-cast RC floors; (e) requirements for calculating and detailing stair shafts; and (f) limiting the use of single-bay RC frame structures. Some significant examples of damage in the epicenter areas are provided as a reference in the discussion on the consequences of collapse, the importance of duplicate structural systems, and the integration of RC and masonry structures.
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without the requirement for the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
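The linear-algebraic view of graph computation described in this abstract can be sketched in a few lines. The toy example below (plain numpy standing in for GraphBLAS-style sparse kernels; graph and function names are mine) expresses breadth-first search as repeated vector-matrix products over an adjacency matrix:

```python
import numpy as np

# Toy directed graph as an adjacency matrix: edges 0->1, 0->2, 1->3, 2->3, 3->4.
A = np.zeros((5, 5))
for u, v in [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]:
    A[u, v] = 1

def bfs_levels(A, source):
    """Breadth-first search expressed as repeated vector-matrix products."""
    n = A.shape[0]
    level = np.full(n, -1)           # -1 marks unreachable vertices
    frontier = np.zeros(n)
    frontier[source] = 1
    level[source] = 0
    depth = 0
    while frontier.any():
        depth += 1
        reached = (frontier @ A) > 0  # vertices reachable in one hop
        new = reached & (level < 0)   # keep only unvisited vertices
        level[new] = depth
        frontier = new.astype(float)
    return level

print(bfs_levels(A, 0))  # [0 1 1 2 3]
```

A production GraphBLAS implementation replaces the dense product with a sparse matrix-vector multiply over a boolean semiring, but the traversal pattern is the same.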
Do evidence-based active-engagement courses reduce the gender gap in introductory physics?
NASA Astrophysics Data System (ADS)
Karim, Nafis I.; Maries, Alexandru; Singh, Chandralekha
2018-03-01
Prior research suggests that using evidence-based pedagogies can not only improve learning for all students, it can also reduce the gender gap. We describe the impact of physics education research-based pedagogical techniques in flipped and active-engagement non-flipped courses on the gender gap observed with validated conceptual surveys. We compare male and female students’ performance in courses which make significant use of evidence-based active-engagement (EBAE) strategies with courses that primarily use lecture-based (LB) instruction. All courses had large enrolment and often had more than 100 students. The analysis of data for validated conceptual surveys presented here includes data from two-semester sequences of algebra-based and calculus-based introductory physics courses. The conceptual surveys used to assess student learning in the first and second semester courses were the force concept inventory and the conceptual survey of electricity and magnetism, respectively. In the research discussed here, the performance of male and female students in EBAE courses at a particular level is compared with LB courses in two situations: (I) the same instructor taught two courses, one of which was an EBAE course and the other an LB course, while the homework, recitations and final exams were kept the same; (II) student performance in all of the EBAE courses taught by different instructors was averaged and compared with LB courses of the same type also averaged over different instructors. In all cases, on conceptual surveys we find that students in courses which make significant use of active-engagement strategies, on average, outperformed students in courses of the same type using primarily lecture-based instruction even though there was no statistically significant difference on the pre-test before instruction. However, the gender gap persisted even in courses using EBAE methods. 
We also discuss correlations between the performance of male and female students on the validated conceptual surveys and the final exam, which had a heavy weight on quantitative problem solving.
Quantum Algorithms for Scientific Computing and Approximate Optimization
NASA Astrophysics Data System (ADS)
Hadfield, Stuart Andrew
Diversity and inclusion have been a concern for the physics community for nearly 50 years. Despite significant efforts, including the American Physical Society (APS) Conferences for Undergraduate Women in Physics (CUWiP) and the APS Bridge Program, women, African Americans, and Hispanics continue to be substantially underrepresented in the physics profession. Similar efforts within the field of engineering, whose students make up the majority of students in the introductory calculus-based physics courses, have also met with limited success. With the introduction of research-based instruments such as the Force Concept Inventory (FCI), the Force and Motion Conceptual Evaluation (FMCE), and the Conceptual Survey of Electricity and Magnetism (CSEM), differences in performance by gender began to be reported. Researchers have yet to come to an agreement as to why these "gender gaps" exist in the conceptual inventories that are widely used in physics education research and/or how to reduce the gaps. The "gender gap" has been extensively studied; on average, for the mechanics conceptual inventories, male students outperform female students by 13% on the pretest and by 12% post instruction. While much of the gender gap research has been geared toward the mechanics conceptual inventories, there have been few studies exploring the gender gap in the electricity and magnetism conceptual inventories. Overall, male students outperform female students by 3.7% on the pretest and 8.5% on the post-test; however, these studies have much more variation, including one study showing female students outperforming male students on the CSEM. Many factors have been proposed that may influence the gender gap, from differences in background and preparation to various psychological and sociocultural effects. A parallel but largely disconnected set of research has identified gender-biased questions within the FCI. This research has produced sporadic results and has only been performed on the FCI. 
The work performed in this manuscript will seek to synthesize these strands and use large datasets and deep demographic data to understand the persistent differences in male and female performance.
Wong, Alex W K; Lau, Stephen C L; Fong, Mandy W M; Cella, David; Lai, Jin-Shei; Heinemann, Allen W
2018-04-03
To determine the extent to which the content of the Quality of Life in Neurological Disorders (Neuro-QoL) covers the International Classification of Functioning, Disability and Health (ICF) Core Sets for multiple sclerosis (MS), stroke, spinal cord injury (SCI), and traumatic brain injury (TBI) using summary linkage indicators. Content analysis by linking content of the Neuro-QoL to corresponding ICF codes of each Core Set for MS, stroke, SCI, and TBI. Three academic centers. None. None. Four summary linkage indicators proposed by MacDermid et al were estimated to compare the content coverage between Neuro-QoL and the ICF codes of Core Sets for MS, stroke, SCI, and TBI. Neuro-QoL represented 20% to 30% of Core Set codes for the different conditions, in which more codes in Core Sets for MS (29%), stroke (28%), and TBI (28%) were covered than those for SCI in the long-term (20%) and early postacute (19%) contexts. Neuro-QoL represented nearly half of the unique Activity and Participation codes (43%-49%) and less than one third of the unique Body Function codes (12%-32%). It represented fewer Environmental Factors codes (2%-6%) and no Body Structures codes. Absolute linkage indicators found that at least 60% of Neuro-QoL items were linked to Core Set codes (63%-95%), but many items covered the same codes as revealed by unique linkage indicators (7%-13%), suggesting high concept redundancy among items. The Neuro-QoL links more closely to ICF Core Sets for stroke, MS, and TBI than to those for SCI, and primarily covers activity and participation ICF domains. Other instruments are needed to address concepts not measured by the Neuro-QoL when a comprehensive health assessment is needed. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Brian; Gutowska, Izabela; Chiger, Howard
Computer simulations of nuclear reactor thermal-hydraulic phenomena are often used in the design and licensing of nuclear reactor systems. In order to assess the accuracy of these computer simulations, computer codes and methods are often validated against experimental data. This experimental data must be of sufficiently high quality in order to conduct a robust validation exercise. In addition, this experimental data is generally collected at experimental facilities that are of a smaller scale than the reactor systems being simulated, due to cost considerations. Therefore, smaller-scale test facilities must be designed and constructed in such a fashion as to ensure that the prototypical behavior of a particular nuclear reactor system is preserved. The work completed through this project has resulted in scaling analyses and conceptual design development for a test facility capable of collecting code validation data for the following high temperature gas reactor systems and events: (1) passive natural circulation core cooling system, (2) pebble bed gas reactor concept, (3) General Atomics Energy Multiplier Module reactor, and (4) prismatic block design steam-water ingress event. In the event that code validation data for these systems or events is needed in the future, significant progress in the design of an appropriate integral-type test facility has already been completed as a result of this project. Where applicable, the next step would be to begin detailed design development and material procurement. As part of this project, applicable scaling analyses were completed and test facility design requirements developed. Conceptual designs were developed for the implementation of these design requirements at the Oregon State University (OSU) High Temperature Test Facility (HTTF). 
The original HTTF is based on a ¼-scale model of a high temperature gas reactor concept with the capability for both forced and natural circulation flow through a prismatic core with an electrical heat source. The peak core region temperature capability is 1400°C. As part of this project, an inventory of test facilities that could be used for these experimental programs was completed. Several of these facilities showed some promise; however, upon further investigation it became clear that only the OSU HTTF had the power and/or peak temperature limits that would allow for the experimental programs envisioned herein. Thus the conceptual design and feasibility study development focused on examining the feasibility of configuring the current HTTF to collect validation data for these experimental programs. In addition to the scaling analyses and conceptual design development, a test plan was developed for the envisioned modified test facility. This test plan included a discussion of an appropriate shakedown test program as well as the specific matrix tests. Finally, a feasibility study was completed to determine the cost and schedule considerations that would be important to any test program developed to investigate these designs and events.
Numerical predictions of EML (electromagnetic launcher) system performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnurr, N.M.; Kerrisk, J.F.; Davidson, R.F.
1987-01-01
The performance of an electromagnetic launcher (EML) depends on a large number of parameters, including the characteristics of the power supply, rail geometry, rail and insulator material properties, injection velocity, and projectile mass. EML system performance is frequently limited by structural or thermal effects in the launcher (railgun). A series of computer codes has been developed at the Los Alamos National Laboratory to predict EML system performance and to determine the structural and thermal constraints on barrel design. These codes include FLD, a two-dimensional electrostatic code used to calculate the high-frequency inductance gradient and surface current density distribution for the rails; TOPAZRG, a two-dimensional finite-element code that simultaneously analyzes thermal and electromagnetic diffusion in the rails; and LARGE, a code that predicts the performance of the entire EML system. The NIKE2D code, developed at the Lawrence Livermore National Laboratory, is used to perform structural analyses of the rails. These codes have been instrumental in the design of the Lethality Test System (LTS) at Los Alamos, which has an ultimate goal of accelerating a 30-g projectile to a velocity of 15 km/s. The capabilities of the individual codes and the coupling of these codes to perform a comprehensive analysis are discussed in relation to the LTS design. Numerical predictions are compared with experimental data and presented for the LTS prototype tests.
NASA Technical Reports Server (NTRS)
Lin, Shu; Rhee, Dojun; Rajpal, Sandeep
1993-01-01
This report presents a low-complexity and high performance concatenated coding scheme for high-speed satellite communications. In this proposed scheme, the NASA Standard Reed-Solomon (RS) code over GF(2(exp 8)) is used as the outer code and the second-order Reed-Muller (RM) code of Hamming distance 8 is used as the inner code. The RM inner code has a very simple trellis structure and is decoded with the soft-decision Viterbi decoding algorithm. It is shown that the proposed concatenated coding scheme achieves an error performance which is comparable to that of the NASA TDRS concatenated coding scheme in which the NASA Standard rate-1/2 convolutional code of constraint length 7 and d sub free = 10 is used as the inner code. However, the proposed RM inner code has much smaller decoding complexity, less decoding delay, and much higher decoding speed. Consequently, the proposed concatenated coding scheme is suitable for reliable high-speed satellite communications, and it may be considered as an alternate coding scheme for the NASA TDRS system.
Coding performance of the Probe-Orbiter-Earth communication link
NASA Technical Reports Server (NTRS)
Divsalar, D.; Dolinar, S.; Pollara, F.
1993-01-01
The coding performance of the Probe-Orbiter-Earth communication link is analyzed and compared for several cases. It is assumed that the coding system consists of a convolutional code at the Probe, a quantizer and another convolutional code at the Orbiter, and two cascaded Viterbi decoders or a combined decoder on the ground.
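As background for the cascaded-coding setup described here, a rate-1/2 convolutional encoder of the kind assumed at the Probe can be sketched as follows. This is a generic constraint-length-3 example with the common (7, 5) octal generators, not the specific code analyzed in the report:

```python
def conv_encode(bits, gens=(0b111, 0b101)):
    """Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111            # shift new bit into 3-bit register
        for g in gens:
            out.append(bin(state & g).count("1") % 2)  # output = parity of tapped positions
    return out

print(conv_encode([1, 0, 1, 1]))  # [1, 1, 1, 0, 0, 0, 0, 1]
```

Each information bit produces two coded bits, one per generator polynomial; a Viterbi decoder then searches the resulting trellis for the maximum-likelihood input sequence.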
Evaluation of Aeroelastically Tailored Small Wind Turbine Blades Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffin, Dayton A.
2005-09-29
Evaluation of Aeroelastically Tailored Small Wind Turbine Blades Final Report. Global Energy Concepts, LLC (GEC) has performed a conceptual design study concerning aeroelastic tailoring of small wind turbine blades. The primary objectives were to evaluate ways that blade/rotor geometry could be used to enable cost-of-energy reductions by enhancing energy capture while constraining or mitigating blade costs, system loads, and related component costs. This work builds on insights developed in ongoing adaptive-blade programs but with a focus on application to small turbine systems with isotropic blade material properties and with combined blade sweep and pre-bending/pre-curving to achieve the desired twist coupling. Specific goals of this project are to: (A) Evaluate and quantify the extent to which rotor geometry can be used to realize load-mitigating small wind turbine rotors. Primary aspects of the load mitigation are: (1) improved overspeed safety achieved by blades twisting toward stall in response to speed increases, and (2) reduced fatigue loading achieved by blades twisting toward feather in response to turbulent gusts. (B) Illustrate trade-offs and design sensitivities for this concept. (C) Provide the technical basis for small wind turbine manufacturers to evaluate this concept and commercialize it if the technology appears favorable. The SolidWorks code was used to rapidly develop solid models of blades with varying shapes and material properties. Finite element analyses (FEA) were performed using the COSMOS code, modeling tip loads and centripetal accelerations. This tool set was used to investigate the potential for aeroelastic tailoring with combined planform sweep and pre-curve. An extensive matrix of design variables was investigated, including aerodynamic design, magnitude and shape of planform sweep, magnitude and shape of blade pre-curve, material stiffness, and rotor diameter. 
The FEA simulations resulted in substantial insights into the structural response of these blades. The trends were used to identify geometries and rotor configurations that showed the greatest promise for achieving beneficial aeroelastic response. The ADAMS code was used to perform complete aeroelastic simulations of selected rotor configurations; however, the results of these simulations were not satisfactory. This report documents the challenges encountered with the ADAMS simulations and presents recommendations for further development of this concept for aeroelastically tailored small wind turbine blades.
NASA Technical Reports Server (NTRS)
Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)
2008-01-01
An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver, and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular repeat-accumulate (IRA) codes.
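The repeat-interleave-accumulate chain that underlies the RA family (before the precoder that turns it into an ARA code) can be sketched in a few lines. The permutation, rate, and names below are illustrative choices of mine, not those of the patent:

```python
import random

def ra_encode(bits, q=3, seed=0):
    """Toy repeat-accumulate encoder: repeat q times, interleave, then accumulate."""
    repeated = [b for b in bits for _ in range(q)]  # rate-1/q repetition
    perm = list(range(len(repeated)))
    random.Random(seed).shuffle(perm)               # fixed pseudorandom interleaver
    acc, out = 0, []
    for i in perm:
        acc ^= repeated[i]                          # accumulator: running XOR
        out.append(acc)
    return out

cw = ra_encode([1, 0, 1])
print(len(cw))  # 9 coded bits for 3 information bits (rate 1/3)
```

The accumulator's 1/(1+D) structure is what gives these codes their simple iterative decoding; the ARA construction adds a precoder in front of this chain to improve the decoding threshold.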
Block 2 Solid Rocket Motor (SRM) conceptual design study, volume 1
NASA Technical Reports Server (NTRS)
1986-01-01
Segmented and monolithic Solid Rocket Motor (SRM) design concepts were evaluated with emphasis on joints and seals. Particular attention was directed to eliminating deficiencies in the SRM High Performance Motor (HPM). The selected conceptual design is described and discussed.
Low-density parity-check codes for volume holographic memory systems.
Pishro-Nik, Hossein; Rahnavard, Nazanin; Ha, Jeongseok; Fekri, Faramarz; Adibi, Ali
2003-02-10
We investigate the application of low-density parity-check (LDPC) codes in volume holographic memory (VHM) systems. We show that a carefully designed irregular LDPC code has a very good performance in VHM systems. We optimize high-rate LDPC codes for the nonuniform error pattern in holographic memories to reduce the bit error rate extensively. The prior knowledge of noise distribution is used for designing as well as decoding the LDPC codes. We show that these codes have a superior performance to that of Reed-Solomon (RS) codes and regular LDPC counterparts. Our simulation shows that we can increase the maximum storage capacity of holographic memories by more than 50 percent if we use irregular LDPC codes with soft-decision decoding instead of conventionally employed RS codes with hard-decision decoding. The performance of these LDPC codes is close to the information theoretic capacity.
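As a minimal illustration of the hard-decision parity-check decoding that the authors compare against, the sketch below corrects a single bit flip by matching the syndrome to a column of the parity-check matrix. A tiny Hamming(7,4) matrix stands in for the large irregular LDPC matrices optimized in the paper; this is my own toy example, not the paper's construction:

```python
import numpy as np

# Parity-check matrix of a Hamming(7,4) code; all columns are distinct and
# nonzero, so the syndrome of a single error identifies the flipped position.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def correct_single_error(word):
    s = H @ word % 2                      # syndrome
    if not s.any():
        return word                       # already a valid codeword
    for j in range(H.shape[1]):
        if np.array_equal(H[:, j], s):    # syndrome matches column j
            fixed = word.copy()
            fixed[j] ^= 1                 # flip the offending bit back
            return fixed
    return word

codeword = np.array([1, 0, 0, 0, 1, 1, 0])  # satisfies H @ c = 0 (mod 2)
received = codeword.copy()
received[2] ^= 1                            # channel flips one bit
print(np.array_equal(correct_single_error(received), codeword))  # True
```

Soft-decision LDPC decoding replaces this table lookup with iterative belief propagation over the bipartite graph of H, which is where the capacity-approaching gains reported above come from.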
Analysis of the TREAT LEU Conceptual Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Kontogeorgakos, D. C.; Papadias, D. D.
2016-03-01
Analyses were performed to evaluate the performance of the low enriched uranium (LEU) conceptual design fuel for the conversion of the Transient Reactor Test Facility (TREAT) from its current highly enriched uranium (HEU) fuel. TREAT is an experimental nuclear reactor designed to produce high neutron flux transients for the testing of reactor fuels and other materials. TREAT is currently in non-operational standby, but is being restarted under the U.S. Department of Energy’s Resumption of Transient Testing Program. The conversion of TREAT is being pursued in keeping with the mission of the Department of Energy National Nuclear Security Administration’s Material Management and Minimization (M3) Reactor Conversion Program. The focus of this study was to demonstrate that the converted LEU core is capable of maintaining the performance of the existing HEU core, while continuing to operate safely. Neutronic and thermal hydraulic simulations have been performed to evaluate the performance of the LEU conceptual-design core under both steady-state and transient conditions, for both normal operation and reactivity insertion accident scenarios. In addition, ancillary safety analyses which were performed for previous LEU design concepts have been reviewed and updated as needed, in order to evaluate whether the converted LEU core will function safely with all existing facility systems. Simulations were also performed to evaluate the detailed behavior of the UO2-graphite fuel, to support future fuel manufacturing decisions regarding particle size specifications. The results of these analyses will be used in conjunction with work being performed at Idaho National Laboratory and Los Alamos National Laboratory, in order to develop the Conceptual Design Report project deliverable.
Parametric study of a canard-configured transport using conceptual design optimization
NASA Technical Reports Server (NTRS)
Arbuckle, P. D.; Sliwa, S. M.
1985-01-01
Constrained-parameter optimization is used to perform optimal conceptual design of both canard and conventional configurations of a medium-range transport. A number of design constants and design constraints are systematically varied to compare the sensitivities of canard and conventional configurations to a variety of technology assumptions. Main-landing-gear location and canard surface high-lift performance are identified as critical design parameters for a statically stable, subsonic, canard-configured transport.
Concepts Within Reach: Action Performance Predicts Action Language Processing in Stroke
Desai, Rutvik H.; Herter, Troy; Riccardi, Nicholas; Rorden, Chris; Fridriksson, Julius
2015-01-01
The relationship between the brain’s conceptual or semantic and sensory-motor systems remains controversial. Here, we tested manual and conceptual abilities of 41 chronic stroke patients in order to examine their relationship. Manual abilities were assessed through a reaching task using an exoskeleton robot. Semantic abilities were assessed with implicit as well as explicit semantic tasks, for both verbs and nouns. The results show that the degree of selective impairment for action word processing was predicted by the degree of impairment in reaching performance. Moreover, the implicit semantic measures showed a correlation with a global reaching parameter, while the explicit semantic similarity judgment task predicted performance in action initiation. These results indicate that action concepts are dynamically grounded through motoric simulations, and that more details are simulated for more explicit semantic tasks. This is evidence for a close and causal relationship between the sensory-motor and conceptual systems of the brain. PMID:25858602
Gligorović, Milica; Buha, Nataša
2013-06-01
The ability to generate and flexibly change concepts is of great importance for the development of academic and adaptive skills. This paper analyses the conceptual reasoning ability of children with mild intellectual disability (MID) by their achievements on the Wisconsin Card Sorting Test (WCST). The sample consisted of 95 children with MID aged between 10 years and 13 years 11 months. The following variables from the WCST were analysed: number of categories completed, initial conceptualisation, total number of errors, non-perseverative errors, perseverative errors, number of perseverative responses, and failures to maintain set. The observed WCST predictive variables account for 79% of the variability in the number of categories completed (p < .001). The total number of errors was the most significant predictor of performance on the WCST. We can conclude that there is significant progress in conceptual abilities between the ages of 13 years and 13 years 11 months, compared to the other assessed age groups. The results of our research suggest that the development of mental set flexibility is the basis of progress in conceptual abilities, thus intervention programs should offer specially designed activities that vary in their attentional demands, content, conceptual patterns, and actions required.
Maximum likelihood decoding analysis of accumulate-repeat-accumulate codes
NASA Technical Reports Server (NTRS)
Abbasfar, A.; Divsalar, D.; Yao, K.
2004-01-01
In this paper, the performance of accumulate-repeat-accumulate codes with maximum likelihood (ML) decoding is analyzed and compared to that of random codes by means of very tight bounds. Some simple codes are shown to perform very close to the Shannon limit under maximum likelihood decoding.
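Maximum likelihood decoding, the benchmark analyzed here, can be illustrated by exhaustive search over all messages. This sketch is demonstrated with a rate-1/3 repetition code rather than the paper's ARA codes (an assumption for tractability); it picks the codeword at minimum Hamming distance from the received word:

```python
import itertools
import numpy as np

def ml_decode(received, encode, k):
    """Brute-force maximum likelihood decoding: return the k-bit message
    whose codeword has minimum Hamming distance to the received word.
    Exponential in k, so usable only as a small-scale benchmark."""
    best, best_dist = None, float("inf")
    for msg in itertools.product([0, 1], repeat=k):
        dist = int(np.sum(encode(np.array(msg)) != received))
        if dist < best_dist:
            best, best_dist = np.array(msg), dist
    return best

rep3 = lambda m: np.repeat(m, 3)            # rate-1/3 repetition code
decoded = ml_decode(np.array([1, 1, 0, 0, 0, 1]), rep3, 2)
```

The ML bounds in the paper estimate the error probability of exactly this decision rule analytically, since running it on codes of practical block length is infeasible.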
NASA Technical Reports Server (NTRS)
Sager, R. E.; Cox, D. W.
1983-01-01
Various conceptual designs for the secondary mirror actuator system to be used in the Shuttle Infrared Telescope Facility (SIRTF) were evaluated. In addition, a set of design concepts was developed to assist in the solution of problems crucial for optimum performance of the secondary mirror actuator system. A specific conceptual approach was presented along with a plan for developing that approach and identifying issues of critical importance in the developmental effort.
NASA Technical Reports Server (NTRS)
Welstead, Jason; Crouse, Gilbert L., Jr.
2014-01-01
Empirical sizing guidelines such as tail volume coefficients have long been used in the early aircraft design phases for sizing stabilizers, resulting in conservatively stable aircraft. While successful, this results in increased empty weight, reduced performance, and greater procurement and operational cost relative to an aircraft with optimally sized surfaces. Including flight dynamics in the conceptual design process allows the design to move away from empirical methods while implementing modern control techniques. A challenge of flight dynamics and control is the numerous design variables, changing fluidly throughout the conceptual design process, that are required to evaluate the system response to a disturbance. This research addresses that challenge not by implementing higher-order tools, such as computational fluid dynamics, but by linking the lower-order tools typically used within the conceptual design process so that each discipline feeds into the others. In this research, flight dynamics and control was incorporated into the conceptual design process along with the traditional disciplines of vehicle sizing, weight estimation, aerodynamics, and performance. For the controller, a linear quadratic regulator structure with constant gains has been specified to reduce the user input. Coupling all the disciplines in the conceptual design phase allows the aircraft designer to explore larger design spaces where stabilizers are sized according to dynamic-response constraints rather than historical static margin and volume coefficient guidelines.
Structural Analysis in a Conceptual Design Framework
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.
2012-01-01
Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.
Senin, Tatjana; Meyer, Thorsten
2018-01-22
The aim was to gather theoretical knowledge about self-determination and to develop a conceptual model for medical rehabilitation, which serves as a basis for discussion. We performed a literature search in electronic databases. Various theories and research results were adopted and transferred to the context of medical rehabilitation and into a conceptual model. The conceptual model of self-determination reflects, on a continuum, which forms of self-determination may be present in situations of medical rehabilitation treatment. The location on the continuum depends theoretically on the manifestation of certain internal and external factors that may influence each other. The model provides a first conceptualization of self-determination focusing on medical rehabilitation, which should be further refined and tested empirically. © Georg Thieme Verlag KG Stuttgart · New York.
Iterative demodulation and decoding of coded non-square QAM
NASA Technical Reports Server (NTRS)
Li, L.; Divsalar, D.; Dolinar, S.
2003-01-01
Simulation results show that, with iterative demodulation and decoding, coded NS-8QAM performs 0.5 dB better than standard 8QAM and 0.7 dB better than 8PSK at BER = 10^-5, when the FEC code is the (15, 11) Hamming code concatenated with a rate-1 accumulator code, while coded NS-32QAM performs 0.25 dB better than standard 32QAM.
NASA Technical Reports Server (NTRS)
Lin, C. H.; Meyer, M. S.
1983-01-01
The systems engineering aspects of developing a conceptual design of the Space Station Environmental Control and Life Support System (ECLSS) are discussed. Topics covered include defining system requirements and groundrules for approach, formulating possible cycle closure options, and establishing a system-level mass balance on the essential materials processed in oxygen and water cycles. Consideration is also given to the performance of a system trade-off study to determine the best degree of cycle closure for the ECLSS, and the construction of a conceptual design of the ECLSS with subsystem performance specifications and candidate concepts. For the optimum balance between development costs, technological risks, and resupply penalties, a partially closed cycle ECLSS option is suggested.
Advanced coding and modulation schemes for TDRSS
NASA Technical Reports Server (NTRS)
Harrell, Linda; Kaplan, Ted; Berman, Ted; Chang, Susan
1993-01-01
This paper describes the performance of the Ungerboeck and pragmatic 8-Phase Shift Key (PSK) Trellis Code Modulation (TCM) coding techniques with and without a (255,223) Reed-Solomon outer code as they are used for Tracking Data and Relay Satellite System (TDRSS) S-Band and Ku-Band return services. The performance of these codes at high data rates is compared to uncoded Quadrature PSK (QPSK) and rate 1/2 convolutionally coded QPSK in the presence of Radio Frequency Interference (RFI), self-interference, and hardware distortions. This paper shows that the outer Reed-Solomon code is necessary to achieve a 10^-5 Bit Error Rate (BER) with an acceptable level of degradation in the presence of RFI. This paper also shows that the TCM codes with or without the Reed-Solomon outer code do not perform well in the presence of self-interference. In fact, the uncoded QPSK signal performs better than the TCM coded signal in the self-interference situation considered in this analysis. Finally, this paper shows that the E_b/N_0 degradation due to TDRSS hardware distortions is approximately 1.3 dB with a TCM coded signal or a rate 1/2 convolutionally coded QPSK signal and is 3.2 dB with an uncoded QPSK signal.
Software for Collaborative Engineering of Launch Rockets
NASA Technical Reports Server (NTRS)
Stanley, Thomas Troy
2003-01-01
The Rocket Evaluation and Cost Integration for Propulsion and Engineering (RECIPE) software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.
Individual differences in children's understanding of inversion and arithmetical skill.
Gilmore, Camilla K; Bryant, Peter
2006-06-01
Background and aims: In order to develop arithmetic expertise, children must understand arithmetic principles, such as the inverse relationship between addition and subtraction, in addition to learning calculation skills. We report two experiments that investigate children's understanding of the principle of inversion and the relationship between their conceptual understanding and arithmetical skills. A group of 127 children from primary schools took part in the study. The children were from 2 age groups (6-7 and 8-9 years). Children's accuracy on inverse and control problems in a variety of presentation formats and in canonical and non-canonical forms was measured. Tests of general arithmetic ability were also administered. Children consistently performed better on inverse than control problems, which indicates that they could make use of the inverse principle. Presentation format affected performance: picture presentation allowed children to apply their conceptual understanding flexibly regardless of the problem type, while word problems restricted their ability to use their conceptual knowledge. Cluster analyses revealed three subgroups with different profiles of conceptual understanding and arithmetical skill. Children in the 'high ability' and 'low ability' groups showed conceptual understanding that was in line with their arithmetical skill, whilst a third group of children had more advanced conceptual understanding than arithmetical skill. The three subgroups may represent different points along a single developmental path or distinct developmental paths. The discovery of the existence of the three groups has important consequences for education. It demonstrates the importance of considering the pattern of individual children's conceptual understanding and problem-solving skills.
Toward performance portability of the Albany finite element analysis code using the Kokkos library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demeshko, Irina; Watkins, Jerry; Tezaur, Irina K.
Performance portability on heterogeneous high-performance computing (HPC) systems is a major challenge faced today by code developers: parallel code needs to be executed correctly as well as with high performance on machines with different architectures, operating systems, and software libraries. The finite element method (FEM) is a popular and flexible method for discretizing partial differential equations arising in a wide variety of scientific, engineering, and industrial applications that require HPC. This paper presents some preliminary results pertaining to our development of a performance portable implementation of the FEM-based Albany code. Performance portability is achieved using the Kokkos library. We present performance results for the Aeras global atmosphere dynamical core module in Albany. Finally, numerical experiments show that our single code implementation gives reasonable performance across three multicore/many-core architectures: NVIDIA Graphics Processing Units (GPUs), Intel Xeon Phis, and multicore CPUs.
A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful to the exploration of various "what-if" scenarios regarding the cache performance impact for alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
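The trace-driven idea can be illustrated with a toy direct-mapped cache model (the cache parameters and word-addressed trace are illustrative assumptions, not the paper's tool); it reproduces the familiar row-major versus column-major miss behavior for a 64x64 array:

```python
def simulate_cache(addresses, line_size=8, num_lines=64):
    """Count hits and misses of a direct-mapped cache for an address trace."""
    tags = [None] * num_lines
    hits = misses = 0
    for addr in addresses:
        block = addr // line_size          # which memory block the word lies in
        idx = block % num_lines            # direct-mapped placement
        if tags[idx] == block:
            hits += 1
        else:
            misses += 1
            tags[idx] = block              # evict and fill on a miss
    return hits, misses

n = 64  # traverse a 64x64 array stored in row-major order
row_major = [i * n + j for i in range(n) for j in range(n)]
col_major = [i * n + j for j in range(n) for i in range(n)]
```

With these parameters, the row-major walk misses only once per cache line (512 of 4096 accesses), while the column-major walk thrashes the cache and misses on every access — exactly the kind of source-level effect the methodology is meant to predict.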
Vakil, Eli; Lev-Ran Galon, Carmit
2014-01-01
Existing literature presents a complex and inconsistent picture of the specific deficiencies involved in skill learning following traumatic brain injury (TBI). In an attempt to address this difficulty, individuals with moderate to severe TBI (n = 29) and a control group (n = 29) were tested with two different skill-learning tasks: conceptual (i.e., Tower of Hanoi Puzzle, TOHP) and perceptual (i.e., mirror reading, MR). Based on previous studies of the effect of divided attention on these tasks and findings regarding the effect of TBI on conceptual and perceptual priming tasks, it was predicted that the group with TBI would show impaired baseline performance compared to controls in the TOHP task though their learning rate would be maintained, while both baseline performance and learning rate on the MR task would be maintained. Consistent with our predictions, overall baseline performance of the group with TBI was impaired in the TOHP test, while the learning rate was not. The learning rate on the MR task was preserved but, contrary to our prediction, response time of the group with TBI was slower than that of controls. The pattern of results observed in the present study was interpreted to possibly reflect an impairment of both the frontal lobes as well as that of diffuse axonal injury, which is well documented as being affected by TBI. The former impairment affects baseline performance of the conceptual learning skill, while the latter affects the overall slower performance of the perceptual learning skill.
Some partial-unit-memory convolutional codes
NASA Technical Reports Server (NTRS)
Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.
1991-01-01
The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes are compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementational complexity over current coding systems.
The neural career of sensory-motor metaphors.
Desai, Rutvik H; Binder, Jeffrey R; Conant, Lisa L; Mano, Quintino R; Seidenberg, Mark S
2011-09-01
The role of sensory-motor systems in conceptual understanding has been controversial. It has been proposed that many abstract concepts are understood metaphorically through concrete sensory-motor domains such as actions. Using fMRI, we compared neural responses with literal action (Lit; The daughter grasped the flowers), metaphoric action (Met; The public grasped the idea), and abstract (Abs; The public understood the idea) sentences of varying familiarity. Both Lit and Met sentences activated the left anterior inferior parietal lobule, an area involved in action planning, with Met sentences also activating a homologous area in the right hemisphere, relative to Abs sentences. Both Met and Abs sentences activated the left superior temporal regions associated with abstract language. Importantly, activation in primary motor and biological motion perception regions was inversely correlated with Lit and Met familiarity. These results support the view that the understanding of metaphoric action retains a link to sensory-motor systems involved in action performance. However, the involvement of sensory-motor systems in metaphor understanding changes through a gradual abstraction process whereby relatively detailed simulations are used for understanding unfamiliar metaphors, and these simulations become less detailed and involve only secondary motor regions as familiarity increases. Consistent with these data, we propose that anterior inferior parietal lobule serves as an interface between sensory-motor and conceptual systems and plays an important role in both domains. The similarity of abstract and metaphoric sentences in the activation of left superior temporal regions suggests that action metaphor understanding is not completely based on sensory-motor simulations but relies also on abstract lexical-semantic codes.
NASA Astrophysics Data System (ADS)
Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.
2010-04-01
An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work is contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
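The Walsh-Hadamard spreading sequences discussed here can be generated by the standard Sylvester recursion; the following small sketch (independent of the paper's BER machinery) builds the matrix and relies on the defining property that its rows are mutually orthogonal:

```python
import numpy as np

def walsh_hadamard(n):
    """Sylvester construction of the n x n Walsh-Hadamard matrix
    (n must be a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])   # H_{2n} = [[H, H], [H, -H]]
    return H

H = walsh_hadamard(8)
# Each row is a +/-1 spreading sequence; orthogonality gives H @ H.T = n * I,
# which is what suppresses multi-user interference in synchronous CDMA.
```

Asynchronous operation and multipath fading break this perfect orthogonality, which is why the paper's analysis of cross-correlation-induced BER is needed.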
A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.
2004-12-01
The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. 
A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as a part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve, and the overall conceptual framework is refined. The development of the conceptual framework becomes an on-going process. We will describe the current state of this framework and the open questions that have to be addressed in the future.
NASA Astrophysics Data System (ADS)
Nitadori, Keigo; Makino, Junichiro; Hut, Piet
2006-12-01
The main performance bottleneck of gravitational N-body codes is the force calculation between two particles. We have succeeded in speeding up this pair-wise force calculation by factors between 2 and 10, depending on the code and the processor on which the code is run. These speed-ups were obtained by writing highly fine-tuned code for x86_64 microprocessors. Any existing N-body code, running on these chips, can easily incorporate our assembly code programs. In the current paper, we present an outline of our overall approach, which we illustrate with one specific example: the use of a Hermite scheme for a direct N^2-type integration on a single 2.0 GHz Athlon 64 processor, for which we obtain an effective performance of 4.05 Gflops, for double-precision accuracy. In subsequent papers, we will discuss other variations, including the combinations of N log N codes, single-precision implementations, and performance on other microprocessors.
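The pair-wise force loop that dominates the cost can be sketched as a direct O(N^2) summation in plain Python; the Plummer softening parameter `eps` is an assumed, common convention not specified in the abstract, and the tuned assembly kernels described above optimize exactly this inner loop:

```python
import numpy as np

def accelerations(pos, mass, G=1.0, eps=1.0e-3):
    """Direct-summation gravitational accelerations: the O(N^2) pair-wise
    loop, with Plummer softening eps to avoid the singularity at r = 0."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = pos[j] - pos[i]                 # separation vector
            r2 = d @ d + eps * eps              # softened squared distance
            acc[i] += G * mass[j] * d / r2**1.5 # contribution of particle j
    return acc

pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
mass = np.array([1.0, 1.0])
acc = accelerations(pos, mass)
```

In a Hermite integrator the same loop is also evaluated for the time derivative of the force (the "jerk"), which roughly doubles the work of this kernel per step.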
NASA Technical Reports Server (NTRS)
Ellis, J. R.; Sandlass, G. S.; Bayyari, M.
2001-01-01
A design study was undertaken to investigate the feasibility of using simple specimen designs and reusable fixturing for in-plane biaxial tests planned for advanced aeropropulsion materials. Materials of interest in this work include: advanced metallics, polymeric matrix composites, metal and intermetallic matrix composites, and ceramic matrix composites. Early experience with advanced metallics showed that the cruciform specimen design typically used in this type of testing was impractical for these materials, primarily because of concerns regarding complexity and cost. The objective of this research was to develop specimen designs, fixturing, and procedures which would allow in-plane biaxial tests to be conducted on a wide range of aeropropulsion materials while at the same time keeping costs within acceptable limits. With this goal in mind, a conceptual design was developed centered on a specimen incorporating a relatively simple arrangement of slots and fingers for attachment and loading purposes. The ANSYS finite element code was used to demonstrate the feasibility of the approach and also to develop a number of optimized specimen designs. The same computer code was used to develop the reusable fixturing needed to position and grip the specimens in the load frame. The design adopted uses an assembly of slotted fingers which can be reconfigured as necessary to obtain optimum biaxial stress states in the specimen gage area. Most recently, prototype fixturing was manufactured and is being evaluated over a range of uniaxial and biaxial loading conditions.
The predictive mind and the experience of visual art work
Kesner, Ladislav
2014-01-01
Among the main challenges of the predictive brain/mind concept is how to link prediction at the neural level to prediction at the cognitive-psychological level and finding conceptually robust and empirically verifiable ways to harness this theoretical framework toward explaining higher-order mental and cognitive phenomena, including the subjective experience of aesthetic and symbolic forms. Building on the tentative prediction error account of visual art, this article extends the application of the predictive coding framework to the visual arts. It does so by linking this theoretical discussion to a subjective, phenomenological account of how a work of art is experienced. In order to engage more deeply with a work of art, viewers must be able to tune or adapt their prediction mechanism to recognize art as a specific class of objects whose ontological nature defies predictability, and they must be able to sustain a productive flow of predictions from low-level sensory, recognitional to abstract semantic, conceptual, and affective inferences. The affective component of the process of predictive error optimization that occurs when a viewer enters into dialog with a painting is constituted both by activating the affective affordances within the image and by the affective consequences of prediction error minimization itself. The predictive coding framework also has implications for the problem of the culturality of vision. A person’s mindset, which determines what top–down expectations and predictions are generated, is co-constituted by culture-relative skills and knowledge, which form hyperpriors that operate in the perception of art. PMID:25566111
Insertion of operation-and-indicate instructions for optimized SIMD code
Eichenberger, Alexander E; Gara, Alan; Gschwind, Michael K
2013-06-04
Mechanisms are provided for inserting operation-and-indicate instructions for tracking and indicating exceptions in the execution of vectorized code. A portion of first code is received for compilation. The portion of first code is analyzed to identify non-speculative instructions performing designated non-speculative operations in the first code that are candidates for replacement by replacement operation-and-indicate instructions that perform the designated non-speculative operations and further perform an indication operation for indicating any exception conditions corresponding to special exception values present in vector register inputs to the replacement operation-and-indicate instructions. The replacement is performed and second code is generated based on the replacement of the at least one non-speculative instruction. The data processing system executing the compiled code is configured to store special exception values in vector output registers, in response to a speculative instruction generating an exception condition, without initiating exception handling.
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization,3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.
Klempova, Bibiana; Liepelt, Roman
2016-07-01
Recent findings suggest that a Simon effect (SE) can be induced in individual go/nogo tasks when responding next to an event-producing object salient enough to provide a reference for the spatial coding of one's own action. However, there is skepticism about referential coding for the joint Simon effect (JSE) by proponents of task co-representation. In the present study, we tested assumptions of task co-representation and referential coding by introducing unexpected double response events in a joint go/nogo and a joint independent go/nogo task (Experiment 1a). In Experiment 1b, we tested whether task representations are functionally similar in joint and standard Simon tasks. In Experiment 2, we tested sequential updating of task co-representation after unexpected single response events in the joint independent go/nogo task. Results showed increased JSEs following unexpected events in the joint go/nogo and joint independent go/nogo task (Experiment 1a). While the former finding is in line with the assumptions made by both accounts (task co-representation and referential coding), the latter finding supports referential coding. In contrast to Experiment 1a, we found a decreased SE after unexpected events in the standard Simon task (Experiment 1b), providing evidence against the functional equivalence assumption between joint and two-choice Simon tasks of the task co-representation account. Finally, we found an increased JSE also following unexpected single response events (Experiment 2), ruling out that the findings of the joint independent go/nogo task in Experiment 1a were due to a re-conceptualization of the task situation. In conclusion, our findings support referential coding also for the joint Simon effect.
Types and patterns of safety concerns in home care: client and family caregiver perspectives
Tong, Catherine E.; Sims-Gould, Joanie; Martin-Matthews, Anne
2016-01-01
Objective: Drawing on interviews with home care clients and their family caregivers, we sought to understand how these individuals conceptualize safety in the provision and receipt of home care, how they promote safety in the home space and how their safety concerns differ from those of home support workers. Design: In-depth, semi-structured interviews were conducted with clients and family caregivers. The analysis included topic and analytical coding of participants' verbatim accounts. Setting: Interviews were completed in British Columbia, Canada. Participants: In total, 82 clients and 55 caregivers participated. Results: Clients and family caregivers identified three types of safety concerns: physical, spatial and interpersonal. These concerns are largely multi-dimensional and intersectional. We present a conceptual model of client and caregiver safety concerns. We also examine the factors that intensify and mitigate safety concerns in the home. Conclusions: In spite of safety concerns, clients and family caregivers overwhelmingly prefer to receive care in the home setting. Spatial and physical concerns are the most salient. The financial burden of creating a safe care space should not be the client's alone to bear. The conceptualization and promotion of safety in home care must recognize the roles, responsibilities and perspectives of all of the actors involved, including workers, clients and their caregivers. PMID:26832159
"At My Age … ": Defining Sexual Wellness in Mid- and Later Life.
Syme, Maggie L; Cohn, Tracy J; Stoffregen, Sydney; Kaempfe, Hanna; Schippers, Desiree
2018-04-18
Sexual wellness is integral to quality of life across the life span, despite ageist stereotypes suggesting sexual expression ends at midlife. However, conceptualizing sexual wellness in mid- and later life is complicated by a dysfunction-based narrative, lack of a sex-positive aging framework, and existing measures that are age irrelevant and limited in scope. This study aimed to address these limitations by providing a conceptualization of sexual wellness grounded in definitions from midlife and older adults. A sample of 373 midlife and older adults (M = 60, SD = 5.84) in the United States provided a definition of sexual wellness. Using thematic analysis, multiple researchers coded qualitative responses, and results suggested a biopsychosocial-cultural framework. Findings reflect that midlife and older adults provide multifaceted definitions inclusive of various behavioral experiences, including disengaging from sex. They are also keenly aware of physical and psychological limitations and strengths, and emphasize mutual experiences and synchronicity. Midlife and older adults also reflect on age, drawing comparisons to different phases of life and often displaying adaptability in adjusting expectations. When conceptualizing sexual wellness in this population it is imperative to capture this multidimensionality, include those who are not actively engaging in sex, and be aware of the influence of ageist and dys/function narratives.
Error-Rate Bounds for Coded PPM on a Poisson Channel
NASA Technical Reports Server (NTRS)
Moision, Bruce; Hamkins, Jon
2009-01-01
Equations for computing tight bounds on error rates for coded pulse-position modulation (PPM) on a Poisson channel at high signal-to-noise ratio have been derived. These equations and elements of the underlying theory are expected to be especially useful in designing codes for PPM optical communication systems. The equations and the underlying theory apply, more specifically, to a case in which a) At the transmitter, a linear outer code is concatenated with an inner code that includes an accumulator and a bit-to-PPM-symbol mapping (see figure) [this concatenation is known in the art as "accumulate-PPM" (abbreviated "APPM")]; b) The transmitted signal propagates on a memoryless binary-input Poisson channel; and c) At the receiver, near-maximum-likelihood (ML) decoding is effected through an iterative process. Such a coding/modulation/decoding scheme is a variation on the concept of turbo codes, which have complex structures, such that an exact analytical expression for the performance of a particular code is intractable. However, techniques for accurately estimating the performances of turbo codes have been developed. The performance of a typical turbo code includes (1) a "waterfall" region consisting of a steep decrease of error rate with increasing signal-to-noise ratio (SNR) at low to moderate SNR, and (2) an "error floor" region with a less steep decrease of error rate with increasing SNR at moderate to high SNR. The techniques used heretofore for estimating performance in the waterfall region have differed from those used for estimating performance in the error-floor region. For coded PPM, prior to the present derivations, equations for accurate prediction of the performance of coded PPM at high SNR did not exist, so that it was necessary to resort to time-consuming simulations in order to make such predictions. The present derivation makes it unnecessary to perform such time-consuming simulations.
Bounds on Block Error Probability for Multilevel Concatenated Codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Moorthy, Hari T.; Stojanovic, Diana
1996-01-01
Maximum likelihood decoding of long block codes is not feasible due to large complexity. Some classes of codes are shown to be decomposable into multilevel concatenated codes (MLCC). For these codes, multistage decoding provides a good trade-off between performance and complexity. In this paper, we derive an upper bound on the probability of block error for MLCC. We use this bound to evaluate differences in performance for different decompositions of some codes. Examples given show that a significant reduction in complexity can be achieved by increasing the number of decoding stages. The resulting performance degradation varies for different decompositions. A guideline is given for finding good m-level decompositions.
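For orientation, bounds of this kind generally build on the classical union bound for maximum-likelihood soft-decision decoding of a binary linear code on an AWGN channel (shown here in its generic textbook form; the paper's MLCC bound additionally accounts for multistage decoding and the chosen decomposition):

```latex
P_B \;\le\; \sum_{w=d_{\min}}^{n} A_w \, Q\!\left(\sqrt{\frac{2\,w\,R\,E_b}{N_0}}\right)
```

where $A_w$ is the number of codewords of Hamming weight $w$, $R$ is the code rate, and $Q(\cdot)$ is the Gaussian tail function.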
In service inspection and repair of sodium cooled ASTRID prototype
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baque, F.; Jadot, F.; Marlier, R.
2015-07-01
In the frame of the large R and D program performed for the future ASTRID sodium-cooled prototype, In-Service Inspection and Repair (ISI and R) has been identified as a major issue to be taken into account in order to enhance plant safety, consolidate its availability and protect the associated investment. After the first part of the pre-conceptual design phase (2008-2012), the ongoing second part (2013-2015) has made it possible to improve the ISI and R tools for the immersed sodium structures of ASTRID, at about 200 deg. C, on the basis of consolidated specifications and thanks to their qualification through increasingly realistic laboratory tests and simulation with the CIVA code. ISI and R items are being developed and qualified in a multi-year program which mainly addresses the reactor block structures, the primary components and circuit, and the Power Conversion System. It ensures a strong connection between the reactor designers and inspection specialists, as the optimization of inspectability and repairability is pursued: this has already led to specific design rules, intended to shorten and ease ISI and R operations, which have been merged into the RCC-MRx rules. In the frame of increasing technology readiness levels with corresponding performance demonstrations, this paper presents R and D dealing with the ISI and R items: it highlights sensor development (both ultrasonic and electromagnetic concepts, compatible with sodium at 200 deg. C) and their applications to ASTRID structure control (under-sodium telemetry, imaging and NDE). Activity on repair is also presented (a single laser tool for sodium sweeping, machining and welding), along with the effort on associated robotics (a generic program for ASTRID applications, specific technological tools for the sodium medium, and a tight immersed bell). The main results of testing and simulation are given for telemetry, vision, NDE applications, laser process repair and under-sodium sealing. (authors)
Campbell, David J T; Manns, Braden J; Hemmelgarn, Brenda R; Sanmartin, Claudia; King-Shier, Kathryn M
2016-01-01
Patients with cardiovascular-related chronic diseases may face financial barriers to accessing health care, even in Canada, where universal health care insurance is in place. No current theory or framework is adequate for understanding the impact of financial barriers to care on these patients or how they experience financial barriers. The overall objective of this study is to develop a framework for understanding the role of financial barriers to care in the lives of patients with cardiovascular-related chronic diseases and the impact of such barriers on their health. We will perform an inductive qualitative grounded theory study to develop a framework to understand the effect of financial barriers to care on patients with cardiovascular-related chronic diseases. We will use semistructured interviews (face-to-face and telephone) with a purposive sample of adult patients from Alberta with at least 1 of hypertension, diabetes, heart disease or stroke. We will analyze interview transcripts in triplicate using grounded theory coding techniques, including open, focused and axial coding, following the principle of constant comparison. Interviews and analysis will be done iteratively to theoretical saturation. Member checking will be used to enhance rigour. A comprehensive framework for understanding financial barriers to accessing health care is instrumental for both researchers and clinicians who care for patients with chronic diseases. Such a framework would enable a better understanding of patient behaviour and nonadherence to recommended medical therapies and lifestyle modifications.
Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers
NASA Astrophysics Data System (ADS)
Ogino, T.
High Performance Fortran (HPF) is one of the modern, common techniques for achieving high-performance parallel computation. We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code proved almost comparable to that of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, an efficiency of 76.5% of the VPP5000/56 catalog value in vector and parallel computation. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.
NASA Technical Reports Server (NTRS)
Tsuchiya, T.; Murthy, S. N. B.
1982-01-01
A computer code is presented for the prediction of off-design axial-flow compressor performance with water ingestion. Four processes were considered to account for the aero-thermo-mechanical interactions during operation with air-water droplet mixture flow: (1) blade performance change, (2) centrifuging of water droplets, (3) heat and mass transfer between the gaseous and the liquid phases and (4) droplet size redistribution due to break-up. Stage and compressor performance are obtained by a stage stacking procedure using representative velocity diagrams at rotor inlet and outlet mean radii. The code has options for performance estimation with (1) mixtures of gases and (2) gas-water droplet mixtures, and therefore can take into account the humidity present in ambient conditions. A test case illustrates the method of using the code. The code follows closely the methodology and architecture of the NASA STGSTK code for the estimation of axial-flow compressor performance with air flow.
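The stage-stacking idea referred to can be illustrated with a minimal dry-air sketch. This is hypothetical Python, not the NASA code; it deliberately omits the water-droplet interactions that are the paper's subject and simply accumulates pressure ratio and temperature stage by stage.

```python
def stack_stages(stage_pr, stage_eta, T_in=288.15, gamma=1.4):
    """Toy stage-stacking sketch for a multistage compressor (dry air only).

    stage_pr  : per-stage total pressure ratios
    stage_eta : per-stage isentropic efficiencies
    Returns the overall pressure ratio and exit total temperature (K)."""
    pr_total, T = 1.0, T_in
    for pr, eta in zip(stage_pr, stage_eta):
        pr_total *= pr
        # actual temperature rise = isentropic rise / stage efficiency
        T += T * (pr ** ((gamma - 1.0) / gamma) - 1.0) / eta
    return pr_total, T
```

With an ideal stage (efficiency 1), the temperature ratio reduces to the isentropic relation pr^((gamma-1)/gamma), which serves as a quick sanity check.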
Binary weight distributions of some Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Pollara, F.; Arnold, S.
1992-01-01
The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-decoding algorithms presently under development.
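The MacWilliams computation itself is mechanical once a code's weight enumerator is known. A small illustrative sketch follows, using a [7,4] Hamming code as a stand-in (the RS binary images in the abstract are larger) and the Krawtchouk-polynomial form of the identity; none of this is taken from the paper itself.

```python
from itertools import product
from math import comb

def span(gens, n):
    """All codewords generated (mod 2) by the rows of a binary generator matrix."""
    words = set()
    for coeffs in product([0, 1], repeat=len(gens)):
        words.add(tuple(sum(a * g[i] for a, g in zip(coeffs, gens)) % 2
                        for i in range(n)))
    return words

def weight_enum(codewords, n):
    """Coefficients A_w of the code's weight enumerator."""
    A = [0] * (n + 1)
    for c in codewords:
        A[sum(c)] += 1
    return A

def krawtchouk(w, i, n):
    """Binary Krawtchouk polynomial K_w(i) used in the MacWilliams identity."""
    return sum((-1) ** j * comb(i, j) * comb(n - i, w - j) for j in range(w + 1))

# [7,4] Hamming code as a small stand-in for the RS binary images
G = [(1, 0, 0, 0, 0, 1, 1),
     (0, 1, 0, 0, 1, 0, 1),
     (0, 0, 1, 0, 1, 1, 0),
     (0, 0, 0, 1, 1, 1, 1)]
n = 7
C = span(G, n)
A = weight_enum(C, n)   # weight distribution of the code itself
# MacWilliams identity: B_w = (1/|C|) * sum_i A_i * K_w(i) gives the dual's distribution
B = [sum(A[i] * krawtchouk(w, i, n) for i in range(n + 1)) // len(C)
     for w in range(n + 1)]
```

Here `A` comes out as the familiar Hamming distribution 1 + 7z^3 + 7z^4 + z^7, and `B` is the distribution of the dual [7,3] simplex code, whose nonzero words all have weight 4.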
NASA Astrophysics Data System (ADS)
Jacques, Diederik
2017-04-01
As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models of these interacting processes are needed. Coupled reactive transport models are a typical example of such tools, mainly focusing on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). The mathematical and numerical complexity of both the tool itself and the specific conceptual model can increase rapidly. Therefore, numerical verification of such models is a prerequisite for guaranteeing reliability and confidence and for qualifying simulation tools and approaches for any further model application. In 2011, a first SeSBench (Subsurface Environmental Simulation Benchmarking) workshop was held in Berkeley (USA), followed by four others. The objective is to benchmark subsurface environmental simulation models and methods, with a current focus on reactive transport processes. The final outcome was a special issue in Computational Geosciences (2015, issue 3, Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks, proposed by the participants of the workshops, should be relevant for environmental or geo-engineering applications; the latter were mostly related to radioactive waste disposal issues, excluding benchmarks defined for purely mathematical reasons. Another important feature is the tiered approach within a benchmark, with the definition of a single principal problem and different subproblems. The latter typically benchmark individual or simplified processes (e.g. inert solute transport, a simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, three codes should be involved in a benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes. Furthermore, it illustrates the use of this type of model for different environmental and geo-engineering applications. SeSBench will organize new workshops to add new benchmarks in a new special issue. Steefel, C. I., et al. (2015). "Reactive transport codes for subsurface environmental simulation." Computational Geosciences 19: 445-478.
Long distance quantum communication with quantum Reed-Solomon codes
NASA Astrophysics Data System (ADS)
Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang; Jianggroup Team
We study the construction of quantum Reed-Solomon codes from classical Reed-Solomon codes and show that they achieve the capacity of the quantum erasure channel for multi-level quantum systems. We extend the application of quantum Reed-Solomon codes to long distance quantum communication, investigate the local resource overhead needed for the functioning of one-way quantum repeaters with these codes, and numerically identify the parameter regime where these codes perform better than the known quantum polynomial codes and quantum parity codes. Finally, we discuss the implementation of these codes into time-bin photonic states of qubits and qudits, respectively, and optimize the performance for one-way quantum repeaters.
NASA Astrophysics Data System (ADS)
Askay, S.
2009-12-01
Published on Memorial Day 2009, Map the Fallen is a Google Earth visualization of the 5500+ US and international soldiers who have died in Iraq and Afghanistan since 2001. In addition to providing photos, stories and links for each soldier, the time-animated map visually connects hometowns to places of death. This novel way of representing casualty data brings the geographic reach and magnitude of the issue into focus together with the very personal nature of individual stories. Innovative visualization techniques were used to illustrate the spatio-temporal nature of this information and to show the global reach and interconnectivity of this issue. Several of the advanced KML techniques employed to create this engaging and performance-conscious map will be discussed during this session. These include: 1) the use of HTML iframes and javascript to minimize the KML size, and extensive cross-linking throughout content; 2) the creation of a time-animated, on-screen casualty counter; 3) the use of parabolic arcs to connect each hometown to its place of death; 4) the use of concentric spirals to represent chronological data; and 5) numerous performance optimizations to ensure the 23K placemarks, 2500 screen overlays and nearly 250K line vertices performed well in Google Earth. This session will include a demonstration of the map, conceptual discussions of the techniques used, and some in-depth technical explanation of the KML code.
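The parabolic-arc technique (item 3) can be approximated with a short script that generates the coordinate string for a KML `<LineString>`. This is a hypothetical sketch, not the actual Map the Fallen implementation: latitude and longitude are interpolated linearly, and altitude follows a parabola that is zero at both endpoints.

```python
def parabolic_arc(lat1, lon1, lat2, lon2, peak_alt_m=200000.0, steps=50):
    """Build a KML <coordinates> string for a parabolic arc between two points.

    Linear lat/lon interpolation; altitude is a parabola that is zero at
    both endpoints and peaks at peak_alt_m midway. KML coordinates are
    lon,lat,alt triples separated by spaces."""
    coords = []
    for i in range(steps + 1):
        t = i / steps
        lat = lat1 + t * (lat2 - lat1)
        lon = lon1 + t * (lon2 - lon1)
        alt = 4.0 * peak_alt_m * t * (1.0 - t)   # parabola: 0 at t=0 and t=1
        coords.append(f"{lon:.5f},{lat:.5f},{alt:.1f}")
    return " ".join(coords)
```

The resulting string can be dropped inside `<LineString><coordinates>...</coordinates></LineString>` with `altitudeMode` set to `absolute` so the arc rises above the terrain.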
Philosophy and conceptual framework: collectively structuring nursing care systematization.
Schmitz, Eudinéia Luz; Gelbcke, Francine Lima; Bruggmann, Mario Sérgio; Luz, Susian Cássia Liz
2017-03-30
Objective: To build the Nursing Philosophy and Conceptual Framework that will support the Nursing Care Systematization in a hospital in southern Brazil, with the active participation of the institution's nurses. Method: Convergent Care Research. Data collection took place from July to October 2014, through two workshops and four meetings, with 42 nurses. As a result, the nursing philosophy and conceptual framework were created and the theory was chosen. Data analysis was performed based on Morse and Field. Results: The philosophy involves the following beliefs: team nursing; teamwork; holistic care; service excellence; leadership/coordination; interdisciplinary team commitment. The conceptual framework brings concepts such as: human being; nursing; nursing care; safe care. The nursing theory defined was that of Wanda de Aguiar Horta. Conclusion: As a contribution, the study brought the construction of the institution's nursing philosophy and conceptual framework, and the definition of a nursing theory.
Parametric System Model for a Stirling Radioisotope Generator
NASA Technical Reports Server (NTRS)
Schmitz, Paul C.
2015-01-01
A Parametric System Model (PSM) was created in order to explore conceptual designs, the impact of component changes and power level on the performance of the Stirling Radioisotope Generator (SRG). Using the General Purpose Heat Source (GPHS approximately 250 Wth) modules as the thermal building block from which a SRG is conceptualized, trade studies are performed to understand the importance of individual component scaling on isotope usage. Mathematical relationships based on heat and power throughput, temperature, mass, and volume were developed for each of the required subsystems. The PSM uses these relationships to perform component- and system-level trades.
Parametric System Model for a Stirling Radioisotope Generator
NASA Technical Reports Server (NTRS)
Schmitz, Paul C.
2014-01-01
A Parametric System Model (PSM) was created in order to explore conceptual designs, the impact of component changes and power level on the performance of the Stirling Radioisotope Generator (SRG). Using the General Purpose Heat Source (GPHS, approximately 250 watts thermal) modules as the thermal building block around which a SRG is conceptualized, trade studies are performed to understand the importance of individual component scaling on isotope usage. Mathematical relationships based on heat and power throughput, temperature, mass and volume were developed for each of the required subsystems. The PSM uses these relationships to perform component- and system-level trades.
Understanding Eating Disorders in Elite Gymnastics: Ethical and Conceptual Challenges.
Tan, Jacinta Oon Ai; Calitri, Raff; Bloodworth, Andrew; McNamee, Michael J
2016-04-01
Eating disorders and disordered eating are more common in high performance sports than the general population, and particularly so in high performance aesthetic sports. This paper presents some of the conceptual difficulties in understanding and diagnosing eating disorders in high performance gymnasts. It presents qualitative and quantitative data from a study designed to ascertain the pattern of eating disorder symptoms, depressive symptoms and levels of self-esteem among national and international level gymnasts from the UK in the gymnastic disciplines of sport acrobatics, tumbling, and rhythmic gymnastics.
One-way quantum repeaters with quantum Reed-Solomon codes
NASA Astrophysics Data System (ADS)
Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang
2018-05-01
We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity of the quantum erasure channel of d-level systems for large dimension d. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generations of quantum repeaters using quantum Reed-Solomon codes and identify parameter regimes where each generation performs the best.
Optimized atom position and coefficient coding for matching pursuit-based image compression.
Shoa, Alireza; Shirani, Shahram
2009-12-01
In this paper, we propose a new encoding algorithm for matching pursuit image coding. We show that coding performance is improved when correlations between atom positions and atom coefficients are both used in encoding. We find the optimum tradeoff between efficient atom position coding and efficient atom coefficient coding and optimize the encoder parameters. Our proposed algorithm outperforms the existing coding algorithms designed for matching pursuit image coding. Additionally, we show that our algorithm results in better rate distortion performance than JPEG 2000 at low bit rates.
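The matching pursuit decomposition underlying such an encoder produces exactly the two streams the abstract discusses: atom positions and atom coefficients. A minimal greedy sketch follows, assuming a dictionary whose columns are unit-norm atoms; it is a generic illustration of matching pursuit, not the authors' encoder.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=5):
    """Greedy matching pursuit over a dictionary of unit-norm column atoms.

    Repeatedly picks the atom most correlated with the current residual,
    records its position and coefficient, and subtracts its contribution.
    Returns (positions, coefficients), the two streams a matching pursuit
    image coder must then encode."""
    residual = signal.astype(float).copy()
    positions, coeffs = [], []
    for _ in range(n_atoms):
        corr = dictionary.T @ residual       # correlation with every atom
        k = int(np.argmax(np.abs(corr)))     # best atom position
        c = corr[k]                          # its coefficient
        residual -= c * dictionary[:, k]
        positions.append(k)
        coeffs.append(c)
    return positions, coeffs
```

With an orthonormal dictionary the procedure simply picks off the largest-magnitude transform coefficients in order, which makes its behavior easy to verify.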
Scaffolding software: How does it influence student conceptual understanding and motivation?
NASA Astrophysics Data System (ADS)
Butler, Kyle A.
The purpose of this study was to determine the influence of scaffolding software on student conceptual understanding and motivation. This study also provides insight on how students use the scaffolding features found in Artemis and the extent to which features show a relationship to student conceptual understanding and motivation. A Randomized Solomon Four Group Design was used in this study. As students worked through a project based unit over photosynthesis, the students performed information seeking activities that were based on their own inquiry. For this purpose, the students in the experimental group used an example of scaffolding software called Artemis, while the students in the control group used a search engine of their choice. To measure conceptual understanding, the researcher analyzed student generated concept maps on photosynthesis using three different methods (quantitative, qualitative, hierarchical). To measure motivation, the researcher used a survey that measured motivation on five different indicators: intrinsic goal orientation, extrinsic goal orientation, task value, control of learning beliefs, self-efficacy for learning and performance. Finally, the researcher looked at the relationship and influence of the scaffolding features on two student performance scores at the end of the unit. This created a total of ten dependent variables in relationship to the treatment. Overall, the students used the collaborative features 25% of the time, the maintenance features 0.84% of the time, the organizational features 16% of the time, the saving/viewing features 7% of the time and the searching features 51% of the time. 
There were significant correlations between the saving/viewing features hits and the students' task value (r = .499, p < .05), the searching features hits and the students' self-efficacy for learning and performance (r = .553, p < .01), the collaborative features hits and the students' essay performance scores (r = .519, p < .05) and the maintenance features time and the qualitative analysis of the concept maps (r = .576, p < .01). Finally, the results indicated that the scaffolding features in Artemis did not influence student conceptual understanding and motivation.
Using Conceptual Categories of Questions To Measure Differences in Retrieval Performance.
ERIC Educational Resources Information Center
Keyes, John G.
1996-01-01
To investigate the relationship between the retrieval mechanism and the level of question elaboration, this study divided 100 questions from the cystic fibrosis database into five conceptual categories based on their semantic representations. Two retrieval methods were chosen to investigate potential differences in outcomes across conceptual…
The Effect of Multimedia-Based Learning on the Concept Learning Levels and Attitudes of Students
ERIC Educational Resources Information Center
Beydogan, H. Ömer; Hayran, Zeynel
2015-01-01
Problem Statement: Rich stimuli received by sensory organs such as vision, hearing, and touch are important elements that affect an individual's perception, identification, classification, and conceptualization of the external world. In primary education, since students perform conceptual abstraction based upon concrete characteristics, when they…
Orchestration in Learning Technology Research: Evaluation of a Conceptual Framework
ERIC Educational Resources Information Center
Prieto, Luis P.; Dimitriadis, Yannis; Asensio-Pérez, Juan I.; Looi, Chee-Kit
2015-01-01
The term "orchestrating learning" is being used increasingly often, referring to the coordination activities performed while applying learning technologies to authentic settings. However, there is little consensus about how this notion should be conceptualised, and what aspects it entails. In this paper, a conceptual framework for…
The Conceptual Framework for the Development of a Mathematics Performance Assessment Instrument.
ERIC Educational Resources Information Center
Lane, Suzanne
1993-01-01
A conceptual framework is presented for the development of the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Cognitive Assessment Instrument (QCAI) that focuses on the ability of middle-school students to problem solve, reason, and communicate mathematically. The instrument will provide programmatic rather than…
Conceptual Scoring and Classification Accuracy of Vocabulary Testing in Bilingual Children
ERIC Educational Resources Information Center
Anaya, Jissel B.; Peña, Elizabeth D.; Bedore, Lisa M.
2018-01-01
Purpose: This study examined the effects of single-language and conceptual scoring on the vocabulary performance of bilingual children with and without specific language impairment. We assessed classification accuracy across 3 scoring methods. Method: Participants included Spanish-English bilingual children (N = 247) aged 5;1 (years;months) to…
Symptom outcomes important to women with anal incontinence: a conceptual framework.
Sung, Vivian W; Rogers, Rebecca G; Bann, Carla M; Arya, Lily; Barber, Matthew D; Lowder, Jerry; Lukacz, Emily S; Markland, Alayne; Siddiqui, Nazema; Wilmot, Amanda; Meikle, Susan F
2014-05-01
To develop a framework that describes the most important symptom outcomes for anal incontinence treatment from the patient perspective. A conceptual framework was developed by the Pelvic Floor Disorders Network based on four semistructured focus groups and confirmed in two sets of 10 cognitive interviews including women with anal incontinence. We explored: 1) patient-preferred terminology for describing anal incontinence symptoms; 2) patient definitions of treatment "success"; 3) importance of symptoms and outcomes in the framework; and 4) conceptual gaps (defined as outcomes not previously identified as important). Sessions were conducted according to grounded theory, then transcribed, coded, and qualitatively and quantitatively analyzed to identify relevant themes. Content and face validity of the framework were further assessed using cognitive interviews. Thirty-four women participated in focus groups and 20 in cognitive interviews. Overall, 29 (54%) were aged 60 years or older, 42 (78%) were white, and 10 (19%) had a high school degree or less. Two overarching outcome themes were identified: "primary bowel leakage symptoms" and "ancillary bowel symptoms." Subdomains important in primary bowel leakage symptoms included leakage characteristics (symptom frequency, amount of leakage, symptom bother) and conditions when bowel leakage occurs (predictability, awareness, urgency). Subdomains important under ancillary bowel symptoms included emptying disorders (constipation, obstructed defecation, and wiping issues) and discomfort (pain, burning). New outcomes identified included predictability, awareness, wiping issues, and discomfort. Women with anal incontinence desire a wide range of symptom outcomes after treatment. These are captured in our conceptual framework, which can aid clinicians and researchers in assessing anal incontinence. LEVEL OF EVIDENCE: II.
Conceptual framework for patient-important treatment outcomes for pelvic organ prolapse.
Sung, Vivian W; Rogers, Rebecca G; Barber, Matthew D; Clark, Melissa A
2014-04-01
To develop a comprehensive conceptual framework representing the most important outcomes for women seeking treatment for pelvic organ prolapse (POP). Twenty-five women with POP were recruited and participated in four semi-structured focus groups to refine and assess the content validity of a conceptual framework representing patient-important outcomes for POP. Specifically, the focus groups addressed the following three aims: (1) to evaluate the content and appropriateness of domains in our framework; (2) to identify gaps in the framework; and (3) to determine the relative importance of our framework domains from the patient perspective. Sessions were transcribed, coded, and qualitatively and quantitatively analyzed using analytic induction and deductive analysis to identify themes and domains relevant to women with POP. Our focus groups confirmed the importance of vaginal bulge symptoms (discomfort, bother, and adaptation), and the overarching domains and subdomains of physical (physical function and participation), social (social function, relationships, and sexual function), and mental health (emotional distress, preoccupation, and body image). Patients ranked outcomes in the following order of importance: (1) resolution of vaginal bulge symptoms; (2) improvement in physical function; (3) improvement in sexual function; (4) improvement in body image perception; and (5) improvement in social function. We developed a conceptual framework for patient-important outcomes of women seeking treatment for POP. This framework can improve the transparency and interpretation of POP study findings from the patient perspective. Vaginal bulge and its associated discomfort are most important for the definition of POP treatment success from the patient perspective. © 2013 Wiley Periodicals, Inc.
Commonality between Reduced Gravity and Microgravity Habitats for Long Duration Missions
NASA Technical Reports Server (NTRS)
Howard, Robert
2014-01-01
Many conceptual studies for long duration missions beyond Earth orbit have assumed unique habitat designs for each destination and for transit habitation. This may not be the most effective approach. A variable gravity habitat, one designed for use in microgravity, lunar, Martian, and terrestrial environments may provide savings that offset the loss of environment-specific optimization. However, a brief analysis of selected flown spacecraft and Constellation-era conceptual habitat designs suggests that one cannot simply lift a habitat from one environment and place it in another that it was not designed for without incurring significant human performance compromises. By comparison, a conceptual habitat based on the Skylab II framework but designed specifically to accommodate variable gravity environments can be shown to yield significant advantages while incurring only minimal human performance compromises.
Unaligned instruction relocation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.
In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
Unaligned instruction relocation
Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.; Sura, Zehra N.
2018-01-23
In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
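The relocation step at the core of both linking passes resolves symbolic references once final addresses are known. The toy sketch below illustrates that general mechanism only; the instruction names, symbol table, and layout are all hypothetical, and this is not the patented two-pass aligned/unaligned method:

```python
def link(code, symbol_table):
    """Toy relocation pass: replace each symbolic operand of the form
    ('ref', name) with the absolute address recorded for that symbol
    in symbol_table, leaving all other operands untouched."""
    linked = []
    for op, operand in code:
        if isinstance(operand, tuple) and operand[0] == "ref":
            operand = symbol_table[operand[1]]
        linked.append((op, operand))
    return linked

# Accelerator code referring to a processor-side symbol: the reference
# cannot be resolved until the host-side addresses are fixed, which is
# why a separate linking pass over the accelerator code is needed.
accel_code = [("load", ("ref", "host_buffer")), ("add", 1)]
symbols = {"host_buffer": 0x1000}
linked = link(accel_code, symbols)
print(linked)  # [('load', 4096), ('add', 1)]
```

In the patented scheme this resolution happens twice, first over the aligned assembled code and then again over the unaligned form, because the two encodings place the relocation targets at different offsets.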