Cao, Xiaolong; Jiang, Haobo
2015-01-01
The genome sequence of Manduca sexta was recently determined using 454 technology. Cufflinks and MAKER2 were used to establish gene models in the genome assembly based on the RNA-Seq data and other species' sequences. Aided by the extensive RNA-Seq data from 50 tissue samples at various life stages, annotators around the world (including the present authors) have manually confirmed and improved a small percentage of the models after spending months of effort. While such collaborative efforts are highly commendable, many of the predicted genes still have problems that may hamper future research on this insect species. As a biochemical model representing lepidopteran pests, M. sexta has been used extensively to study insect physiological processes for over five decades. In this work, we assembled Manduca datasets Cufflinks 3.0, Trinity 4.0, and Oases 4.0 to assist the manual annotation efforts and development of Official Gene Set (OGS) 2.0. To further improve annotation quality, we developed methods to evaluate gene models in the MAKER2, Cufflinks, Oases, and Trinity assemblies and selected the best ones to constitute MCOT 1.0 after thorough crosschecking. MCOT 1.0 has 18,089 genes encoding 31,666 proteins: 32.8% match OGS 2.0 models perfectly or near perfectly, 11,747 differ considerably, and 29.5% are absent in OGS 2.0. Future automation of this process is anticipated to greatly reduce the human effort needed to generate comprehensive, reliable models of structural genes in other genome projects where extensive RNA-Seq data are available. PMID:25612938
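The crosschecking step described above, evaluating competing MAKER2, Cufflinks, Oases, and Trinity models for each gene and keeping the best, can be sketched in miniature. The loci, source labels, and single "completeness" score below are hypothetical illustrations, not the authors' actual evaluation criteria.

```python
# Hypothetical sketch of per-locus model selection in the spirit of MCOT:
# group candidate gene models by locus and keep the highest-scoring one.
# Loci, sources, and scores are illustrative, not the authors' actual data.
from collections import defaultdict

candidates = [
    # (locus, source_assembly, completeness_score)
    ("gene001", "MAKER2",    0.92),
    ("gene001", "Cufflinks", 0.97),
    ("gene001", "Trinity",   0.95),
    ("gene002", "Oases",     0.88),
    ("gene002", "MAKER2",    0.80),
]

def best_models(candidates):
    """Return the best-scoring (score, source) model for each locus."""
    by_locus = defaultdict(list)
    for locus, source, score in candidates:
        by_locus[locus].append((score, source))
    return {locus: max(models) for locus, models in by_locus.items()}

print(best_models(candidates))
# {'gene001': (0.97, 'Cufflinks'), 'gene002': (0.88, 'Oases')}
```

In practice such a score would combine multiple quality checks (ORF completeness, RNA-Seq support, homology), but the grouping-and-selection structure is the same.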
The Effort Paradox: Effort Is Both Costly and Valued.
Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y
2018-04-01
According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time. Copyright © 2018 Elsevier Ltd. All rights reserved.
Economic Concepts Guiding Minnesota Extension's New Regional and County Delivery Model
ERIC Educational Resources Information Center
Morse, George W.; Klein, Thomas K.
2006-01-01
In response to a state budget deficit, the University of Minnesota Extension restructured its field staff, establishing a new regional and county delivery system, shifting all supervision of field staff to campus faculty, and encouraging greater field staff specialization, program focus, and entrepreneurial efforts. Nine economic concepts and…
Enhancing Extension and Research Activities through the Use of Web GIS
ERIC Educational Resources Information Center
Estwick, Noel M.; Griffin, Richard W.; James, Annette A.; Roberson, Samuel G.
2016-01-01
There have been numerous efforts aimed at improving geographic literacy in order to address societal challenges. Extension educators can use geographic information system (GIS) technology to help their clients cultivate spatial thinking skills and solve problems. Researchers can use it to model relationships and better answer questions. A program…
Applying Coaching Strategies to Support Youth- and Family-Focused Extension Programming
ERIC Educational Resources Information Center
Olson, Jonathan R.; Hawkey, Kyle R.; Smith, Burgess; Perkins, Daniel F.; Borden, Lynne M.
2016-01-01
In this article, we describe how a peer-coaching model has been applied to support community-based Extension programming through the Children, Youth, and Families at Risk (CYFAR) initiative. We describe the general approaches to coaching that have been used to help with CYFAR program implementation, evaluation, and sustainability efforts; we…
Accommodating complexity and human behaviors in decision analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backus, George A.; Siirola, John Daniel; Schoenwald, David Alan
2007-11-01
This is the final report for an LDRD effort to address human behavior in decision support systems. One sister LDRD effort reports the extension of this work to include actual human choices and additional simulation analyses. Another provides the background for this effort and the programmatic directions for future work. This specific effort considered the feasibility of five aspects of model development required for analysis viability. To avoid the use of classified information, healthcare decisions and the system embedding them became the illustrative example for assessment.
ERIC Educational Resources Information Center
Collings, Mary L., Ed.
The document reports a 10-year seminar effort to consider a framework for a series of inquiries valuable to all institutions engaged in curriculum development or programming. The work reported includes the study of the professional role of the extension adult educator and identifies concepts, propositions, procedures, and a model for use in…
Reduction of Tunnel Dynamics at the National Transonic Facility (Invited)
NASA Technical Reports Server (NTRS)
Kilgore, W. A.; Balakrishna, S.; Butler, D. H.
2001-01-01
This paper describes the results of recent efforts to reduce the tunnel dynamics at the National Transonic Facility. The results presented describe the findings of an extensive data analysis, the proposed solutions to reduce dynamics and the results of implementing these solutions. These results show a 90% reduction in the dynamics around the model support structure and a small impact on reducing model dynamics. Also presented are several continuing efforts to further reduce dynamics.
40 CFR Appendix W to Part 51 - Guideline on Air Quality Models
Code of Federal Regulations, 2013 CFR
2013-07-01
... in the Guideline. The third activity is the extensive on-going research efforts by EPA and others in... addition, findings from ongoing research programs, new model development, or results from model evaluations... shown that the model is not biased toward underestimates; and v. A protocol on methods and procedures to...
40 CFR Appendix W to Part 51 - Guideline on Air Quality Models
Code of Federal Regulations, 2012 CFR
2012-07-01
... in the Guideline. The third activity is the extensive on-going research efforts by EPA and others in... addition, findings from ongoing research programs, new model development, or results from model evaluations... shown that the model is not biased toward underestimates; and v. A protocol on methods and procedures to...
40 CFR Appendix W to Part 51 - Guideline on Air Quality Models
Code of Federal Regulations, 2014 CFR
2014-07-01
... in the Guideline. The third activity is the extensive on-going research efforts by EPA and others in... addition, findings from ongoing research programs, new model development, or results from model evaluations... shown that the model is not biased toward underestimates; and v. A protocol on methods and procedures to...
Overview of the Gems Model of Volunteer Administration (Generate, Educate, Mobilize and Sustain)
ERIC Educational Resources Information Center
Culp, Ken, III
2012-01-01
To organize and coordinate the efforts of many volunteers, a framework for volunteer engagement is needed. The "GEMS" Model of volunteer administration was developed to assist Extension professionals and volunteer coordinators to effectively administer volunteer programs without delivering the program themselves. The GEMS Model is…
Cooperative Extension as a Framework for Health Extension: The Michigan State University Model.
Dwyer, Jeffrey W; Contreras, Dawn; Eschbach, Cheryl L; Tiret, Holly; Newkirk, Cathy; Carter, Erin; Cronk, Linda
2017-10-01
The Affordable Care Act charged the Agency for Healthcare Research and Quality to create the Primary Care Extension Program, but did not fund this effort. The idea to work through health extension agents to support health care delivery systems was based on the nationally known Cooperative Extension System (CES). Instead of creating new infrastructure in health care, the CES is an ideal vehicle for increasing health-related research and primary care delivery. The CES, a long-standing component of the land-grant university system, features a sustained infrastructure for providing education to communities. The Michigan State University (MSU) Model of Health Extension offers another means of developing a National Primary Care Extension Program that is replicable in part because of the presence of the CES throughout the United States. A partnership between the MSU College of Human Medicine and MSU Extension formed in 2014, emphasizing the promotion and support of human health research. The MSU Model of Health Extension includes the following strategies: building partnerships, preparing MSU Extension educators for participation in research, increasing primary care patient referrals and enrollment in health programs, and exploring innovative funding. Since the formation of the MSU Model of Health Extension, researchers and extension professionals have made 200+ connections, and grants have afforded savings in salary costs. The MSU College of Human Medicine and MSU Extension partnership can serve as a model to promote health partnerships nationwide between CES services within land-grant universities and academic health centers or community-based medical schools.
User interface for ground-water modeling: Arcview extension
Tsou, Ming‐shu; Whittemore, Donald O.
2001-01-01
Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.
Achieving Common Expectations for Overall Goals amid Diversity among Cooperative Extension Faculty.
ERIC Educational Resources Information Center
Taylor, Barbara
As a part of the initial phase of a strategic planning effort for the development of Florida's 1988 through 1991 long-range cooperative extension program, an effort was initiated to achieve common expectations for overall organizational mission and purpose among diverse cooperative extension faculty. The unification effort included the following…
Note: The full function test explosive generator.
Reisman, D B; Javedani, J B; Griffith, L V; Ellsworth, G F; Kuklo, R M; Goerz, D A; White, A D; Tallerico, L J; Gidding, D A; Murphy, M J; Chase, J B
2010-03-01
We have conducted three tests of a new pulsed power device called the full function test. These tests represented the culmination of an effort to establish a high energy pulsed power capability based on high explosive pulsed power (HEPP) technology. This involved an extensive computational modeling, engineering, fabrication, and fielding effort. The experiments were highly successful and a new U.S. record for magnetic energy was obtained.
Wang, Ophelia; Zachmann, Luke J; Sesnie, Steven E; Olsson, Aaryn D; Dickson, Brett G
2014-01-01
Prioritizing areas for management of non-native invasive plants is critical, as invasive plants can negatively impact plant community structure. Extensive and multi-jurisdictional inventories are essential to prioritize actions aimed at mitigating the impact of invasions and changes in disturbance regimes. However, previous work devoted little effort to devising sampling methods sufficient to assess the scope of multi-jurisdictional invasion over extensive areas. Here we describe a large-scale sampling design that used species occurrence data, habitat suitability models, and iterative and targeted sampling efforts to sample five species and satisfy two key management objectives: 1) detecting non-native invasive plants across previously unsampled gradients, and 2) characterizing the distribution of non-native invasive plants at landscape to regional scales. Habitat suitability models of five species were based on occurrence records and predictor variables derived from topography, precipitation, and remotely sensed data. We stratified and established field sampling locations according to predicted habitat suitability and phenological, substrate, and logistical constraints. Across previously unvisited areas, we detected at least one of our focal species on 77% of plots. In turn, we used detections from 2011 to improve habitat suitability models and sampling efforts in 2012, as well as additional spatial constraints to increase detections. These modifications resulted in a 96% detection rate at plots. The range of habitat suitability values that identified highly and less suitable habitats and their environmental conditions corresponded to field detections with mixed levels of agreement. Our study demonstrated that an iterative and targeted sampling framework can address sampling bias, reduce time costs, and increase detections. Other studies can extend the sampling framework to develop methods that provide detection data in other ecosystems. The sampling methods implemented here provide a meaningful tool where understanding the potential distribution and habitat of species over multi-jurisdictional and extensive areas is needed to achieve management objectives. PMID:25019621
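The core allocation idea in this design, weighting field plots toward cells the habitat model predicts as suitable, can be sketched as follows. The suitability thresholds, the 60/30/10 effort split, and the simulated suitability surface are hypothetical, not the study's actual design.

```python
# Illustrative sketch of suitability-stratified plot allocation: bin grid
# cells by modeled habitat suitability, then draw more sampling plots from
# higher-suitability strata. Thresholds and quotas are hypothetical.
import random

random.seed(0)
cells = [(i, random.random()) for i in range(1000)]  # (cell_id, suitability)

def allocate_plots(cells, n_plots=30):
    """Stratify cells by predicted suitability and sample plots per stratum."""
    strata = {"high": [], "medium": [], "low": []}
    for cid, s in cells:
        if s >= 0.7:
            strata["high"].append(cid)
        elif s >= 0.4:
            strata["medium"].append(cid)
        else:
            strata["low"].append(cid)
    # Weight effort toward predicted suitable habitat: 60/30/10 split.
    quota = {"high": int(n_plots * 0.6),
             "medium": int(n_plots * 0.3),
             "low": n_plots - int(n_plots * 0.6) - int(n_plots * 0.3)}
    return {name: random.sample(ids, min(quota[name], len(ids)))
            for name, ids in strata.items()}

plots = allocate_plots(cells)
print({name: len(ids) for name, ids in plots.items()})
# {'high': 18, 'medium': 9, 'low': 3}
```

The iterative part of the study would then refit the suitability model with each season's detections and rerun the allocation, shifting plots toward strata where detections were made.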
Impact of remote sensing upon the planning, management, and development of water resources
NASA Technical Reports Server (NTRS)
Loats, H. L.; Fowler, T. R.; Frech, S. L.
1974-01-01
A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.
Promoting Reflection in Teacher Preparation Programs: A Multilevel Model
ERIC Educational Resources Information Center
Etscheidt, Susan; Curran, Christina M.; Sawyer, Candace M.
2012-01-01
Teacher reflection has been promoted as a necessary tool for educators to sustain responsive instructional practices. A variety of approaches for integrating inquiry into teaching and reflection in practice emerged from extensive and intensive efforts to reform teacher preparation programs. Based on those conceptualizations, a three-level model of…
SURFACE WATER FLOW IN LANDSCAPE MODELS: 1. EVERGLADES CASE STUDY. (R824766)
Many landscape models require extensive computational effort using a large array of grid cells that represent the landscape. The number of spatial cells may be in the thousands or millions, while the ecological component runs in each of the cells to account for landscape dynamics...
Description of Data Acquisition Efforts
DOT National Transportation Integrated Search
1999-09-01
As part of the overall strategy of refining and improving the existing transportation and air-quality modeling framework, the current project focuses extensively on acquiring disaggregate and reliable data for analysis. In this report, we discuss the...
A Model for Projection of Instructional Activity in a Multi-Campus University.
ERIC Educational Resources Information Center
Tallman, B. M.; Newton, R. D.
This report is concerned with the development of a model for projecting instructional activity and its application within The Pennsylvania State University. Inasmuch as models of this type have been developed at a number of institutions of higher education, the effort described in this report does not constitute an extension of fundamental…
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
A Model Description Document for the Emulation Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from test. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.
Broadband Scattering from Sand and Sand/Mud Sediments with Extensive Environmental Characterization
2017-01-30
experiment, extensive environmental characterization was also performed to support data/model comparisons for both experimental efforts. The site...mechanisms, potentially addressing questions left unresolved from the previous sediment acoustics experiments, SAX99 and SAX04. This work was also to provide...environmental characterization to support the analysis of data collected during the Target and Reverberation Experiment in 2013 (TREX13) as well as
Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.
Vassena, Eliana; Holroyd, Clay B; Alexander, William H
2017-01-01
In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
A user's manual for the Emulation Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware - SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from the test. In addition, slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system, the second a potential Space Station air revitalization system.
Comparison of numerical model simulations and SFO wake vortex windline measurements
DOT National Transportation Integrated Search
2003-06-23
To provide quantitative support for the Simultaneous Offset Instrument Approach (SOIA) procedure, an extensive data collection effort was undertaken at San Francisco International Airport by the Federal Aviation Administration (FAA, U.S. Dept. of Tra...
Applying a Service-Oriented Architecture to Operational Flight Program Development
2007-09-01
using two Java 2 Enterprise Edition (J2EE) Web servers. The weapon models were accessed using a SUN Microsystems Java Web Services Development Pack...tion, and Spring/Hibernate to provide the data access...tion since a major coding effort was avoided. The majority of the effort was tweaking pre-existing Java source code and editing of eXtensible Markup
Aerothermal modeling program. Phase 2, element B: Flow interaction experiment
NASA Technical Reports Server (NTRS)
Nikjooy, M.; Mongia, H. C.; Murthy, S. N. B.; Sullivan, J. P.
1987-01-01
NASA has instituted an extensive effort to improve the design process and data base for the hot section components of gas turbine engines. The purpose of element B is to establish a benchmark quality data set that consists of measurements of the interaction of circular jets with swirling flow. Such flows are typical of those that occur in the primary zone of modern annular combustion liners. Extensive computations of the swirling flows are to be compared with the measurements for the purpose of assessing the accuracy of current physical models used to predict such flows.
Fitting species-accumulation functions and assessing regional land use impacts on avian diversity
Curtis H. Flather
1996-01-01
As one samples species from a particular assemblage, the initial rapid rate with which new species are encountered declines with increasing effort. Nine candidate models to characterize species-accumulation functions were compared in a search for a model that consistently fit geographically extensive avian survey data from a wide range of environmental conditions....
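One common candidate in such comparisons is the Michaelis-Menten (Clench) form S(n) = Smax·n/(B + n), which can be fitted via its linearization 1/S = (B/Smax)·(1/n) + 1/Smax. The sketch below fits only this one form to invented survey data; the study itself compared nine candidate functions, and these data and parameter values are purely illustrative.

```python
# Hedged sketch: fit one species-accumulation candidate, the Michaelis-Menten
# (Clench) curve S(n) = Smax*n/(B + n), to hypothetical survey data using
# ordinary least squares on the reciprocal transform
#   1/S = (B/Smax)*(1/n) + 1/Smax.
# Data are invented; only one of many candidate forms is shown.
import numpy as np

effort = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)       # sampling effort
species = np.array([12, 20, 30, 40, 48, 53, 56], dtype=float)  # species observed

# Linear fit on the transformed data.
slope, intercept = np.polyfit(1.0 / effort, 1.0 / species, 1)
smax = 1.0 / intercept   # asymptotic (total) species richness
b = slope * smax         # effort at which half of Smax is reached

print(f"Smax ≈ {smax:.1f} species, B ≈ {b:.2f} effort units")
```

The reciprocal transform is convenient but weights small samples heavily; a direct nonlinear least-squares fit of S(n) would be the more careful choice when comparing candidate functions.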
A Proposal for Public and Private Partnership in Extension.
Krell, Rayda K; Fisher, Marc L; Steffey, Kevin L
2016-01-01
Public funding for Extension in the United States has been decreasing for many years, but farmers' need for robust information on which to make management decisions has not diminished. The current Extension funding challenges provide motivation to explore a different model for developing and delivering extension. The private sector has partnered with the public sector to fund and conduct agricultural research, but partnering on extension delivery has occurred far less frequently. The fundamental academic strength and established Extension network of the public sector combined with the ability of the private sector to encourage and deliver practical, implementable solutions has the potential to provide measurable benefits to farmers. This paper describes the current Extension climate, presents data from a survey about Extension and industry relationships, presents case studies of successful public- and private-sector extension partnerships, and proposes a framework for evaluating the state of effective partnerships. Synergistic public-private extension efforts could ensure that farmers receive the most current and balanced information available to help with their management decisions.
NASA Technical Reports Server (NTRS)
Ferzali, Wassim; Zacharakis, Vassilis; Upadhyay, Triveni; Weed, Dennis; Burke, Gregory
1995-01-01
The ICAO Aeronautical Mobile Communications Panel (AMCP) completed the drafting of the Aeronautical Mobile Satellite Service (AMSS) Standards and Recommended Practices (SARP's) and the associated Guidance Material and submitted these documents to the ICAO Air Navigation Commission (ANC) for ratification in May 1994. This effort encompassed an extensive, multi-national SARP's validation. As part of this activity, the US Federal Aviation Administration (FAA) sponsored an effort to validate the SARP's via computer simulation. This paper provides a description of this effort. Specifically, it describes: (1) the approach selected for the creation of a high-fidelity AMSS computer model; (2) the test traffic generation scenarios; and (3) the resultant AMSS performance assessment. More recently, the AMSS computer model was also used to provide AMSS performance statistics in support of the RTCA standardization activities. This paper describes this effort as well.
Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.
1993-03-01
A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960s and early 1970s and are unique to that design, or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner.
The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.
Impact of an Extension Social Media Tool Kit on Audience Engagement
ERIC Educational Resources Information Center
Garcia, Aileen S.; Dev, Dipti; McGinnis, Colin M.; Thomas, Tyler
2018-01-01
Extension professionals can improve their use of social media as channels for extending programmatic efforts by maximizing target audience reach and engagement. We describe how implementation of a tool kit highlighting best practices for using social media improved Extension professionals' efforts to engage target audience members via social…
A Year-Long Comparison of GPS TEC and Global Ionosphere-Thermosphere Models
NASA Astrophysics Data System (ADS)
Perlongo, N. J.; Ridley, A. J.; Cnossen, I.; Wu, C.
2018-02-01
The prevalence of GPS total electron content (TEC) observations has provided an opportunity for extensive global ionosphere-thermosphere model validation efforts. This study presents a year-long data-model comparison using the Global Ionosphere-Thermosphere Model (GITM) and the Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM). For the entire year of 2010, each model was run and compared to GPS TEC observations. The results were binned according to season, latitude, local time, and magnetic local time. GITM was found to overestimate the TEC everywhere, except on the midlatitude nightside, due to high O/N2 ratios. TIE-GCM produced much less TEC and had lower O/N2 ratios and neutral wind speeds. Seasonal and regional biases in the models are discussed along with ideas for model improvements and further validation efforts.
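Year-long, binned data-model comparisons of this kind are straightforward to prototype. A minimal sketch of the per-latitude-bin mean bias computation (array names and bin edges are hypothetical, not the study's actual pipeline):

```python
import numpy as np

def binned_bias(model_tec, obs_tec, lat, lat_edges):
    """Mean model-minus-observation TEC bias within each latitude bin."""
    idx = np.digitize(lat, lat_edges) - 1          # bin index for each sample
    bias = np.full(len(lat_edges) - 1, np.nan)
    for b in range(len(bias)):
        in_bin = idx == b
        if in_bin.any():
            bias[b] = np.mean(model_tec[in_bin] - obs_tec[in_bin])
    return bias

# toy check: one sample per hemisphere
bias = binned_bias(np.array([10.0, 20.0]), np.array([8.0, 25.0]),
                   np.array([-45.0, 45.0]), np.array([-90.0, 0.0, 90.0]))
```

The same binning generalizes to season, local time, or magnetic local time by swapping the binning variable.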
Crop Characteristics Research: Growth and Reflectance Analysis
NASA Technical Reports Server (NTRS)
Badhwar, G. D. (Principal Investigator)
1985-01-01
Much of the early research in remote sensing centered on developing spectral signatures of cover types. It was found, however, that a signature from an unknown cover class could not always be matched to a catalog value for a known cover class. This approach was abandoned, and supervised classification schemes followed. These were not efficient and required extensive training. It was obvious that data acquired at a single time could not separate cover types. A large portion of the proposed research has concentrated on modeling the temporal behavior of agricultural crops and on removing the need for any training data in remote sensing surveys, the key to which is the solution of the so-called 'signature extension' problem. A clear need to develop spectral estimators of crop ontogenic stages and yield has existed even though various correlations have been developed. Considerable effort was devoted to developing techniques to estimate these variables. The need to accurately evaluate existing canopy reflectance model(s), improve these models, use them to understand crop signatures, and estimate leaf area index was the third objective of the proposed work. A synopsis of this research effort is discussed.
Use of hydrologic and hydrodynamic modeling for ecosystem restoration
Obeysekera, J.; Kuebler, L.; Ahmed, S.; Chang, M.-L.; Engel, V.; Langevin, C.; Swain, E.; Wan, Y.
2011-01-01
Planning and implementation of unprecedented projects for restoring the greater Everglades ecosystem are underway, and the hydrologic and hydrodynamic modeling of restoration alternatives has become essential for success of restoration efforts. In view of the complex nature of the South Florida water resources system, regional-scale (system-wide) hydrologic models have been developed and used extensively for the development of the Comprehensive Everglades Restoration Plan. In addition, numerous subregional-scale hydrologic and hydrodynamic models have been developed and are being used for evaluating project-scale water management plans associated with urban, agricultural, and inland coastal ecosystems. The authors provide a comprehensive summary of models of all scales, as well as the next generation models under development to meet the future needs of ecosystem restoration efforts in South Florida. The multiagency efforts to develop and apply models have allowed the agencies to understand the complex hydrologic interactions, quantify appropriate performance measures, and use new technologies in simulation algorithms, software development, and GIS/database techniques to meet the future modeling needs of the ecosystem restoration programs. Copyright © 2011 Taylor & Francis Group, LLC.
Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?
NASA Technical Reports Server (NTRS)
Lum, Karen; Hihn, Jairus; Menzies, Tim
2006-01-01
While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance inherent in cost data and because far more effort multipliers are included than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.
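The calibration pitfalls described here (high-variance cost data, more effort multipliers than the data support) can be illustrated with a toy log-linear, COCOMO-style effort model and a backward-elimination rejection rule. The function names and tolerance below are hypothetical, a sketch rather than the paper's actual method:

```python
import numpy as np

def fit_loglinear(X, y):
    """Least-squares fit of log(effort) = b0 + b1*log(size) + sum(ci * xi),
    the usual log-transformed COCOMO-style regression."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def backward_prune(X, y, names, tol=1e-3):
    """Rejection rule: repeatedly drop the effort multiplier whose removal
    least degrades the fit, while the degradation stays below tol."""
    def sse(cols):
        A = np.column_stack([np.ones(len(X)), X[:, cols]])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        r = y - A @ coef
        return float(r @ r)

    keep = list(range(X.shape[1]))
    base = sse(keep)
    while len(keep) > 1:
        best_sse, drop = min((sse([c for c in keep if c != j]), j) for j in keep)
        if best_sse - base > tol * max(base, 1e-12):
            break                                # removal hurts too much: stop
        keep.remove(drop)
        base = best_sse
    return [names[j] for j in keep]
```

On data where effort truly depends on size alone, the rule prunes the spurious multipliers and keeps only the supported variable.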
Rasmussen, Victoria; Turnell, Adrienne; Butow, Phyllis; Juraskova, Ilona; Kirsten, Laura; Wiener, Lori; Patenaude, Andrea; Hoekstra-Weebers, Josette; Grassi, Luigi
2016-01-01
Objectives Burnout is a significant problem among healthcare professionals working within the oncology setting. This study aimed to investigate predictors of emotional exhaustion (EE) and depersonalisation (DP) in psychosocial oncologists, through the application of the effort–reward imbalance (ERI) model with an additional focus on the role of meaningful work in the burnout process. Methods Psychosocial oncology clinicians (n = 417) in direct patient contact who were proficient in English were recruited from 10 international psychosocial oncology societies. Participants completed an online questionnaire, which included measures of demographic and work characteristics, EE and DP subscales of the Maslach Burnout Inventory-Human Services Survey, the Short Version ERI Questionnaire and the Work and Meaning Inventory. Results Higher effort and lower reward were both significantly associated with greater EE, although not DP. The interaction of higher effort and lower reward did not predict greater EE or DP. Overcommitment predicted both EE and DP but did not moderate the impact of effort and reward on burnout. Overall, the ERI model accounted for 33% of the variance in EE. Meaningful work significantly predicted both EE and DP but accounted for only 2% more of the variance in EE above and beyond the ERI model. Conclusions The ERI was only partially supported as a useful framework for investigating burnout in psychosocial oncology professionals. Meaningful work may be a viable extension of the ERI model. Burnout among health professionals may be reduced by interventions aimed at increasing self-efficacy and changes to the supportive work environment. PMID:26239424
Increasing Understanding of Public Problems and Policies--1990.
ERIC Educational Resources Information Center
Farm Foundation, Chicago, IL.
This collection of papers aims to improve the policy education efforts of extension workers responsible for public affairs programs. The first section, "An Evolving Public Policy Education," examines the history of public education; addresses current issues such as leadership models, ethics in policy formation, and policy education; and…
Regulation of Motivation: Contextual and Social Aspects
ERIC Educational Resources Information Center
Wolters, Christopher A.
2011-01-01
Background: Models of self-regulated learning have been used extensively as a way of understanding how students understand, monitor, and manage their own academic functioning. The regulation of motivation is a facet of self-regulated learning that describes students' efforts to control their own motivation or motivational processing. The…
Validation of Model Simulations of Anvil Cirrus Properties During TWP-ICE: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zipser, Edward J.
2013-05-20
This 3-year grant, with two extensions, resulted in a successful 5-year effort, led by Ph.D. student Adam Varble, to compare cloud resolving model (CRM) simulations with the excellent database obtained during the TWP-ICE field campaign. The objective, largely achieved, was to undertake these comparisons comprehensively and quantitatively, informing the community in ways that go beyond pointing out errors in the models by pointing out ways to improve both cloud dynamics and microphysics parameterizations in future modeling efforts. Under DOE support, Adam Varble, with considerable assistance from Dr. Ann Fridlind and others, engaged scientists who ran some 10 different CRMs and 4 different limited area models (LAMs) using a variety of microphysics parameterizations, to ensure that the conclusions of the study will have considerable generality.
Baele, Guy; Lemey, Philippe; Vansteelandt, Stijn
2013-03-06
Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model's marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. We here assess the original 'model-switch' path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples to construct a direct path between two competing models, thereby eliminating the need to calculate each model's marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison.
Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation.
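As a rough illustration of the stepping-stone idea discussed above (not the authors' phylogenetic implementation), the estimator can be demonstrated on a toy normal-normal model whose power posteriors and true marginal likelihood are available in closed form:

```python
import numpy as np

# Toy model: one observation x ~ N(theta, 1), prior theta ~ N(0, 1).
# The power posterior at inverse temperature b is N(b*x/(1+b), 1/(1+b)),
# so each "stone" can be sampled exactly here instead of via MCMC.
rng = np.random.default_rng(0)
x = 1.3
K = 32                                        # number of stepping stones
betas = np.linspace(0.0, 1.0, K + 1) ** 3     # betas clustered near the prior

def loglik(theta):
    return -0.5 * np.log(2 * np.pi) - 0.5 * (x - theta) ** 2

log_z = 0.0
for k in range(K):
    b = betas[k]
    theta = rng.normal(b * x / (1 + b), np.sqrt(1.0 / (1 + b)), size=20000)
    # each stone estimates log(Z_{b_{k+1}} / Z_{b_k}) via a log-mean-exp
    w = (betas[k + 1] - b) * loglik(theta)
    log_z += np.log(np.mean(np.exp(w - w.max()))) + w.max()

# exact answer for comparison: marginally, x ~ N(0, 2)
true_log_z = -0.5 * np.log(2 * np.pi * 2) - x ** 2 / 4
```

A (log) Bayes factor would be the difference of two such log marginal likelihoods, or, as the abstract proposes, could be estimated directly along a single path connecting the two models.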
Deep Recurrent Neural Network-Based Autoencoders for Acoustic Novelty Detection
Vesperini, Fabio; Schuller, Björn
2017-01-01
In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short-term frame are predicted from the previous frames by means of Long Short-Term Memory recurrent denoising autoencoders. The reconstruction error between the input and the output of the autoencoder is used as an activation signal to detect novel events. There is no evidence of studies focused on comparing previous efforts to automatically recognize novel events from audio signals and giving a broad and in-depth evaluation of recurrent neural network-based autoencoders. The present contribution aims to consistently evaluate our recent novel approaches to fill this gap in the literature and provide insight by extensive evaluations carried out on three databases: A3Novelty, PASCAL CHiME, and PROMETHEUS. Besides providing an extensive analysis of novel and state-of-the-art methods, the article shows how RNN-based autoencoders outperform statistical approaches by up to an absolute improvement of 16.4% average F-measure over the three databases. PMID:28182121
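The detection principle, flag frames whose reconstruction error exceeds a threshold fitted on normal material, can be sketched with a linear (PCA) autoencoder standing in for the LSTM denoising autoencoder; all data here is synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic "normal" frames: 10-D features lying near a 2-D subspace
basis = rng.normal(size=(10, 2))
train = rng.normal(size=(500, 2)) @ basis.T + 0.05 * rng.normal(size=(500, 10))

# linear autoencoder via PCA: the top-2 principal axes act as the bottleneck
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
W = vt[:2].T                                   # 10x2 encoder/decoder weights

def recon_error(frames):
    z = (frames - mean) @ W                    # encode
    xhat = z @ W.T + mean                      # decode
    return np.linalg.norm(frames - xhat, axis=1)

# threshold = a high quantile of reconstruction error on normal material
thresh = np.quantile(recon_error(train), 0.99)

normal = rng.normal(size=(100, 2)) @ basis.T + 0.05 * rng.normal(size=(100, 10))
novel = rng.normal(size=(100, 10))             # off-subspace frames = novel events
```

Frames drawn off the learned subspace reconstruct poorly and are flagged, while fresh normal frames mostly fall below the threshold.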
Multifaceted Approach to Designing an Online Masters Program.
ERIC Educational Resources Information Center
McNeil, Sara G.; Chernish, William N.; DeFranco, Agnes L.
At the Conrad N. Hilton College of Hotel and Restaurant Management at the University of Houston (Texas), the faculty and administrators made a conscious effort to take a broad, extensive approach to designing and implementing a fully online masters program. This approach was centered on a comprehensive needs assessment model and sought input from…
Optimized Temporal Monitors for SystemC
NASA Technical Reports Server (NTRS)
Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.
2012-01-01
SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
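A runtime monitor of the kind generated from temporal properties can be sketched as a small explicit automaton; the property and class below are illustrative only, not output of the authors' monitor generator:

```python
from enum import Enum

class Verdict(Enum):
    PENDING = 0
    VIOLATED = 1

class RequestGrantMonitor:
    """Runtime monitor for the illustrative safety property
    'every grant must be preceded by an outstanding request'
    (roughly a PSL always(grant -> once(req))), encoded as a
    two-state automaton stepped once per observed clock cycle."""

    def __init__(self):
        self.requested = False
        self.verdict = Verdict.PENDING

    def step(self, req: bool, grant: bool) -> Verdict:
        if self.verdict is Verdict.VIOLATED:
            return self.verdict                  # violations are irrevocable
        if grant and not (self.requested or req):
            self.verdict = Verdict.VIOLATED      # grant with no prior request
        elif req:
            self.requested = True
        if grant:
            self.requested = False               # a grant consumes the request
        return self.verdict
```

Feeding the trace (req, no grant) then (no req, grant) keeps the verdict PENDING, while a grant on the very first cycle violates immediately; the runtime overhead per cycle is a few boolean operations, the quantity the paper sets out to minimize.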
Modeling a constant power load for nickel-hydrogen battery testing using SPICE
NASA Technical Reports Server (NTRS)
Bearden, Douglas B.; Lollar, Louis F.; Nelms, R. M.
1990-01-01
The effort to design and model a constant power load for the HST (Hubble Space Telescope) nickel-hydrogen battery tests is described. The constant power load was designed for three different simulations on the batteries: life cycling, reconditioning, and capacity testing. A dc-dc boost converter was designed to act as this constant power load. A boost converter design was chosen because of the low test battery voltage (4 to 6 VDC) generated and the relatively high power requirement of 60 to 70 W. The SPICE model was shown to consistently predict variations in the actual circuit as various designs were attempted. It is concluded that the confidence established in the SPICE model of the constant power load ensures its extensive utilization in future efforts to improve performance in the actual load circuit.
Policy Implications of Air Quality Research
NASA Astrophysics Data System (ADS)
Sheinbaum, C.
2004-12-01
While an integrated assessment approach will be required to achieve and sustain improvements in the air quality of the Mexico City Metropolitan Area (MCMA), policy strategies must be based on a solid understanding of the pollutant emissions and atmospheric processes that lead to unacceptable levels of air pollution. The required level of understanding can only be achieved by comprehensive atmospheric measurements followed by a coordinated atmospheric modeling program. The innovative, two-phase atmospheric measurement program, which was a collaborative effort between the Massachusetts Institute of Technology and the Mexican Metropolitan Environmental Commission, with exploratory measurements in February 2002 and extensive measurements from late March through early May of 2003, was an important step towards meeting these requirements. Although the extensive data sets from the two measurement programs are still being analyzed by the investigators, their preliminary analysis efforts have yielded important insights into the nature and extent of the air pollution problem in the MCMA, which in turn will have important policy implications.
Robotics-Centered Outreach Activities: An Integrated Approach
ERIC Educational Resources Information Center
Ruiz-del-Solar, Javier
2010-01-01
Nowadays, universities are making extensive efforts to attract prospective students to the fields of electrical, electronic, and computer engineering. Thus, outreach is becoming increasingly important, and activities with schoolchildren are being extensively carried out as part of this effort. In this context, robotics is a very attractive and…
Upper Stage Engine Composite Nozzle Extensions
NASA Technical Reports Server (NTRS)
Valentine, Peter G.; Allen, Lee R.; Gradl, Paul R.; Greene, Sandra E.; Sullivan, Brian J.; Weller, Leslie J.; Koenig, John R.; Cuneo, Jacques C.; Thompson, James; Brown, Aaron;
2015-01-01
Carbon-carbon (C-C) composite nozzle extensions are of interest for use on a variety of launch vehicle upper stage engines and in-space propulsion systems. The C-C nozzle extension technology and test capabilities being developed are intended to support National Aeronautics and Space Administration (NASA) and United States Air Force (USAF) requirements, as well as broader industry needs. Recent and on-going efforts at the Marshall Space Flight Center (MSFC) are aimed at both (a) further developing the technology and databases for nozzle extensions fabricated from specific C-C materials, and (b) developing and demonstrating low-cost capabilities for testing composite nozzle extensions. At present, materials development work is concentrating on developing a database for lyocell-based C-C that can be used for upper stage engine nozzle extension design, modeling, and analysis efforts. Lyocell-based C-C behaves in a manner similar to rayon-based C-C, but does not have the environmental issues associated with the use of rayon. Future work will also further investigate technology and database gaps and needs for more-established polyacrylonitrile- (PAN-) based C-Cs. As a low-cost means of being able to rapidly test and screen nozzle extension materials and structures, MSFC has recently established and demonstrated a test rig at MSFC's Test Stand (TS) 115 for testing subscale nozzle extensions with 3.5-inch inside diameters at the attachment plane. Test durations of up to 120 seconds have been demonstrated using oxygen/hydrogen propellants. Other propellant combinations, including the use of hydrocarbon fuels, can be used if desired. Another test capability being developed will allow the testing of larger nozzle extensions (13.5-inch inside diameters at the attachment plane) in environments more similar to those of actual oxygen/hydrogen upper stage engines.
Two C-C nozzle extensions (one lyocell-based, one PAN-based) have been fabricated for testing with the larger-scale facility.
Quantitative computational models of molecular self-assembly in systems biology
Thomas, Marcus; Schwartz, Russell
2017-01-01
Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally. PMID:28535149
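The stochastic kinetics underlying such self-assembly models can be sketched with a Gillespie simulation of the simplest assembly reaction, dimerization (2A -> A2); the rate constant and counts below are arbitrary:

```python
import numpy as np

def gillespie_dimerization(n_monomers, k_on, t_end, seed=0):
    """Gillespie stochastic simulation of the simplest self-assembly
    reaction, dimerization 2A -> A2, tracking the monomer count."""
    rng = np.random.default_rng(seed)
    t, a = 0.0, n_monomers
    times, counts = [0.0], [a]
    while t < t_end and a >= 2:
        propensity = k_on * a * (a - 1) / 2     # number of distinct A pairs
        t += rng.exponential(1.0 / propensity)  # waiting time to next event
        a -= 2                                  # one dimerization fires
        times.append(t)
        counts.append(a)
    return np.array(times), np.array(counts)
```

Real assembly systems chain many such reactions into large networks, which is exactly where the combinatorial challenges described in the review arise.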
ERIC Educational Resources Information Center
Oloruntoba, Abayomi; Adegbite, Dorcas A.
2006-01-01
University outreach is an educational and research-based information source enabling farmers to make decisions that improve the quality of their lives. This paper explores how collaborative efforts between the university and farmers have directly impacted in albeit Striga ("noxious witch weed") ravaged maize farms in rainforest farming…
Representing Hydrologic Models as HydroShare Resources to Facilitate Model Sharing and Collaboration
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Goodall, J. L.; Mbewe, P.
2013-12-01
The CUAHSI HydroShare project is a collaborative effort that aims to provide software for sharing data and models within the hydrologic science community. One of the early focuses of this work has been establishing metadata standards for describing models and model-related data as HydroShare resources. By leveraging this metadata definition, a prototype extension has been developed to create model resources that can be shared within the community using the HydroShare system. The extension uses a general model metadata definition to create resource objects, and was designed so that model-specific parsing routines can extract and populate metadata fields from model input and output files. The long term goal is to establish a library of supported models where, for each model, the system has the ability to extract key metadata fields automatically, thereby establishing standardized model metadata that will serve as the foundation for model sharing and collaboration within HydroShare. The Soil and Water Assessment Tool (SWAT) is used to demonstrate this concept through a case study application.
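The design described above — a general metadata definition populated by model-specific parsing routines — can be sketched as follows. This is an illustrative sketch only, not the actual HydroShare API; the class, function, and file-format details are assumptions for demonstration.

```python
# Illustrative sketch (not the actual HydroShare API): a general model-resource
# metadata definition plus a hypothetical model-specific parser that populates it.
from dataclasses import dataclass, field

@dataclass
class ModelResourceMetadata:
    """General metadata fields shared by all model resources (assumed schema)."""
    title: str = ""
    model_program: str = ""
    parameters: dict = field(default_factory=dict)

def parse_swat_cio(text: str) -> ModelResourceMetadata:
    """Hypothetical SWAT-specific routine: extract key fields from a
    simplified file.cio-style control file (format assumed for illustration)."""
    meta = ModelResourceMetadata(model_program="SWAT")
    for line in text.splitlines():
        # file.cio-style lines pair a value with a "NAME : description" label.
        if "|" in line:
            value, key = (part.strip() for part in line.split("|", 1))
            meta.parameters[key] = value
    return meta

sample = """2   | NBYR : number of simulation years
1993 | IYR : beginning year of simulation"""
meta = parse_swat_cio(sample)
print(meta.parameters["NBYR : number of simulation years"])  # -> 2
```

A library of supported models would then amount to one such parser per model program, all emitting the same general metadata object.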
The Computational Infrastructure for Geodynamics as a Community of Practice
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2016-12-01
Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software are impossible to sustain, duplicate effort, and make it difficult for scientists to adopt state-of-the-art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices, formally through webinar series, workshops, and tutorials, and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.
The application of SMA spring actuators to a lightweight modular compliant surface bioinspired robot
NASA Astrophysics Data System (ADS)
Stone, David L.; Cranney, John; Liang, Robert; Taya, Minoru
2004-07-01
The DARPA-sponsored Compliant Surface Robotics (CSR) program pursues development of a high-mobility, lightweight, modular, morphable robot for military forces in the field and for other industrial uses. The USTLAB and University of Washington Center for Intelligent Materials and Systems (CIMS) effort builds on USTLAB proof-of-concept feasibility studies and demonstration of a 4-, 6-, or 8-wheeled modular vehicle with articulated leg-wheel assemblies. A collaborative effort between USTLAB and UW-CIMS explored the application of Shape Memory Alloy nickel-titanium springs to a leg extension actuator capable of actuating with 4.5 Newton force over a 50 mm stroke. At the completion of Phase II, we have completed the mechanical and electronics engineering design and achieved conventional actuation, which currently enables active articulation and autonomous reconfiguration for a wide variety of terrains, including upside-down operation (in case of flip-over). We have developed a leg extension actuator demonstration model and positioned our team to pursue a small vehicle with leg extension actuators in follow-on work. The CSR vehicle's modular spider-like configuration facilitates adaptation to many uses and compliance over rugged terrain. The developmental process, actuator and vehicle characteristics will be discussed.
Transit and lifespan in neutrophil production: implications for drug intervention.
Câmara De Souza, Daniel; Craig, Morgan; Cassidy, Tyler; Li, Jun; Nekka, Fahima; Bélair, Jacques; Humphries, Antony R
2018-02-01
A comparison of the transit compartment ordinary differential equation modelling approach to distributed and discrete delay differential equation models is presented, focusing on Quartino's extension to the Friberg transit compartment model of myelosuppression, widely relied upon in the pharmaceutical sciences to predict the neutrophil response after chemotherapy, and on a QSP delay differential equation model of granulopoiesis. An extension to the Quartino model is provided by considering a general number of transit compartments and introducing an extra parameter that allows for the decoupling of the maturation time from the production rate of cells. An overview of the well-established linear chain technique, used to reformulate transit compartment models with constant transit rates as distributed delay differential equations (DDEs), is then given. A state-dependent time rescaling of the Quartino model is performed to apply the linear chain technique and rewrite the Quartino model as a distributed DDE, yielding a discrete DDE model in a certain parameter limit. Next, stability and bifurcation analyses are undertaken in an effort to situate such studies in a mathematical pharmacology context. We show that both the original Friberg and the Quartino extension models incorrectly define the mean maturation time, essentially treating the proliferative pool as an additional maturation compartment. This misspecification can have far-reaching consequences on the development of future models of myelosuppression in PK/PD.
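The linear chain technique mentioned in this abstract is a standard result, which can be sketched in generic notation (not the paper's specific model) as follows:

```latex
% Linear chain technique (standard form; notation here is generic).
% Transit chain with input I(t) and constant transfer rate k:
\frac{dx_1}{dt} = I(t) - k\,x_1, \qquad
\frac{dx_i}{dt} = k\,x_{i-1} - k\,x_i, \quad i = 2,\dots,n.
% The outflow k x_n(t) equals a distributed delay of the input with an
% Erlang (gamma) kernel of mean delay n/k:
k\,x_n(t) = \int_0^{\infty} g(u)\, I(t-u)\, du, \qquad
g(u) = \frac{k^n u^{\,n-1} e^{-k u}}{(n-1)!}.
% As n -> infinity with tau = n/k held fixed, g(u) -> delta(u - tau), and the
% distributed DDE reduces to a discrete delay I(t - tau).
```

The "certain parameter limit" in which the Quartino model yields a discrete DDE corresponds to this narrowing of the Erlang kernel as the number of compartments grows.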
ERIC Educational Resources Information Center
Chen, Ouhao; Castro-Alonso, Juan C.; Paas, Fred; Sweller, John
2018-01-01
Depletion of limited working memory resources may occur following extensive mental effort, resulting in decreased performance compared to conditions requiring less mental effort. This "depletion effect" can be incorporated into cognitive load theory, which is concerned with using the properties of human cognitive architecture,…
Agriculture and Health Sectors Collaborate in Addressing Population Health
Kaufman, Arthur; Boren, Jon; Koukel, Sonja; Ronquillo, Francisco; Davies, Cindy; Nkouaga, Carolina
2017-01-01
PURPOSE Population health is of growing importance in the changing health care environment. The Cooperative Extension Service, housed in each state’s land grant university, has a major impact on population health through its many community-based efforts, including the Supplemental Nutrition Assistance Program – Education (SNAP-Ed) nutrition programs, 4-H youth engagement, health and wellness education, and community development. Can the agricultural and health sectors, which usually operate in parallel, mostly unknown to each other, collaborate to address population health? We set out to provide an overview of the collaboration between the Cooperative Extension Service and the health sector in various states and describe a case study of 1 model as it developed in New Mexico. METHODS We conducted a literature review and personally contacted states in which the Cooperative Extension Service is collaborating on a “Health Extension” model with academic health centers or their health systems. We surveyed 6 states in which Health Extension models are being piloted as to their different approaches. For a case study of collaboration in New Mexico, we drew on interviews with the leadership of New Mexico State University’s Cooperative Extension Service in the College of Agricultural, Consumer and Environmental Sciences; the University of New Mexico (UNM) Health Science Center’s Office for Community Health; and the personal experiences of frontline Cooperative Extension agents and UNM Health Extension officers who collaborated on community projects. RESULTS A growing number of states are linking the agricultural Cooperative Extension Service with academic health centers and with the health care system. In New Mexico, the UNM academic health center has created “Health Extension Rural Offices” based on principles of the Cooperative Extension model. Today, these 2 systems are working collaboratively to address unmet population health needs in their communities. 
Nationally, the Cooperative Extension Service has formed a steering committee to guide its movement into the health arena. CONCLUSION Resources of the agricultural and health sectors offer communities complementary expertise and resources to address adverse population health outcomes. The collaboration between Cooperative Extension and the health sector is 1 manifestation of this emerging collaboration model termed Health Extension. Initial skepticism and protection of funding sources and leadership roles can be overcome with shared funding from new sources, shared priority setting and decision making, and the initiation of practical, collaborative projects that build personal relationships and trust. PMID:28893819
Memory mechanisms supporting syntactic comprehension.
Caplan, David; Waters, Gloria
2013-04-01
Efforts to characterize the memory system that supports sentence comprehension have historically drawn extensively on short-term memory as a source of mechanisms that might apply to sentences. The focus of these efforts has changed significantly in the past decade. As a result of changes in models of short-term working memory (ST-WM) and developments in models of sentence comprehension, the effort to relate entire components of an ST-WM system, such as those in the model developed by Baddeley (Nature Reviews Neuroscience 4: 829-839, 2003) to sentence comprehension has largely been replaced by an effort to relate more specific mechanisms found in modern models of ST-WM to memory processes that support one aspect of sentence comprehension--the assignment of syntactic structure (parsing) and its use in determining sentence meaning (interpretation) during sentence comprehension. In this article, we present the historical background to recent studies of the memory mechanisms that support parsing and interpretation and review recent research into this relation. We argue that the results of this research do not converge on a set of mechanisms derived from ST-WM that apply to parsing and interpretation. We argue that the memory mechanisms supporting parsing and interpretation have features that characterize another memory system that has been postulated to account for skilled performance-long-term working memory. We propose a model of the relation of different aspects of parsing and interpretation to ST-WM and long-term working memory.
2003-01-01
Over the past several years, the Energy Information Administration (EIA) has extensively studied the relationships between wholesale and retail markets for transportation fuels. This article represents a return to gasoline markets, where EIA first performed this type of analysis and modeling in 1997. The current effort takes advantage of improvements and enhancements to our approach over the intervening years, resulting in more detailed and accurate results.
Optical modeling of agricultural fields and rough-textured rock and mineral surfaces
NASA Technical Reports Server (NTRS)
Suits, G. H.; Vincent, R. K.; Horwitz, H. M.; Erickson, J. D.
1973-01-01
Past models for describing the reflectance and/or emittance properties of agricultural/forestry and geological targets were reviewed in an effort to select the best theoretical models. An extension of the six-parameter Allen-Gayle-Richardson model was chosen as the agricultural plant canopy model. The model is used to predict the bidirectional reflectance of a field crop from known laboratory spectra of crop components and approximate plant geometry. The selected geological model is based on Mie theory and radiative transfer equations, and will assess the effect of textural variations on the spectral emittance of natural rock surfaces.
Agricultural Extension: Farm Extension Services in Australia, Britain and the United States.
ERIC Educational Resources Information Center
Williams, Donald B.
By analyzing the scope and structure of agricultural extension services in Australia, Great Britain, and the United States, this work attempts to set guidelines for measuring progress and guiding extension efforts. Extension training, agricultural policy, and activities of national, international, state, and provincial bodies are examined. The…
Improved Multi-Axial, Temperature and Time Dependent (MATT) Failure Model
NASA Technical Reports Server (NTRS)
Richardson, D. E.; Anderson, G. L.; Macon, D. J.
2002-01-01
An extensive effort has recently been completed by the Space Shuttle's Reusable Solid Rocket Motor (RSRM) nozzle program to completely characterize the effects of multi-axial loading, temperature and time on the failure characteristics of three filled epoxy adhesives (TIGA 321, EA913NA, EA946). As part of this effort, a single general failure criterion was developed that accounted for these effects simultaneously. This model was named the Multi-Axial, Temperature, and Time Dependent or MATT failure criterion. Due to the intricate nature of the failure criterion, some parameters were required to be calculated using complex equations or numerical methods. This paper documents some simple but accurate modifications to the failure criterion to allow for calculations of failure conditions without complex equations or numerical techniques.
NASA Technical Reports Server (NTRS)
Cheung, Kar-Ming; Tung, Ramona H.; Lee, Charles H.
2003-01-01
In this paper, we describe the development roadmap and discuss the various challenges of an evolvable and extensible multi-mission telecom planning and analysis framework. Our long-term goal is to develop a set of powerful, flexible telecommunications analysis tools that can be easily adapted to different missions while maintaining the common Deep Space Communication requirements. The ability to reuse the DSN ground models and the common software utilities in our adaptations has contributed significantly to our development efforts measured in terms of consistency, accuracy, and minimal effort redundancy, which can translate into shorter development times and major cost savings for the individual missions. In our roadmap, we address the design principles, technical achievements and the associated challenges for the following telecom analysis tools: (i) Telecom Forecaster Predictor - TFP; (ii) Unified Telecom Predictor - UTP; (iii) Generalized Telecom Predictor - GTP; (iv) Generic TFP; (v) Web-based TFP; (vi) Application Program Interface - API; (vii) Mars Relay Network Planning Tool - MRNPT.
Toward a Predictive Understanding of Earth’s Microbiomes to Address 21st Century Challenges
Blaser, Martin J.; Cardon, Zoe G.; Cho, Mildred K.; Dangl, Jeffrey L.; Green, Jessica L.; Knight, Rob; Maxon, Mary E.; Northen, Trent R.; Pollard, Katherine S.
2016-01-01
Microorganisms have shaped our planet and its inhabitants for over 3.5 billion years. Humankind has had a profound influence on the biosphere, manifested as global climate and land use changes, and extensive urbanization in response to a growing population. The challenges we face to supply food, energy, and clean water while maintaining and improving the health of our population and ecosystems are significant. Given the extensive influence of microorganisms across our biosphere, we propose that a coordinated, cross-disciplinary effort is required to understand, predict, and harness microbiome function. From the parallelization of gene function testing to precision manipulation of genes, communities, and model ecosystems and development of novel analytical and simulation approaches, we outline strategies to move microbiome research into an era of causality. These efforts will improve prediction of ecosystem response and enable the development of new, responsible, microbiome-based solutions to significant challenges of our time. PMID:27178263
Toward a Predictive Understanding of Earth's Microbiomes to Address 21st Century Challenges.
Blaser, Martin J; Cardon, Zoe G; Cho, Mildred K; Dangl, Jeffrey L; Donohue, Timothy J; Green, Jessica L; Knight, Rob; Maxon, Mary E; Northen, Trent R; Pollard, Katherine S; Brodie, Eoin L
2016-05-13
Reddington, C. L.; Carslaw, K. S.; Stier, P.; ...
2017-09-01
The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by combining new methodologies for quantifying model uncertainty, to create an extensive global dataset of aerosol in situ microphysical and chemical measurements, and to develop new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.
Life Modeling and Design Analysis for Ceramic Matrix Composite Materials
NASA Technical Reports Server (NTRS)
2005-01-01
The primary research efforts focused on characterizing and modeling static failure, environmental durability, and creep-rupture behavior of two classes of ceramic matrix composites (CMC), silicon carbide fibers in a silicon carbide matrix (SiC/SiC) and carbon fibers in a silicon carbide matrix (C/SiC). An engineering life prediction model (Probabilistic Residual Strength model) has been developed specifically for CMCs. The model uses residual strength as the damage metric for evaluating remaining life and is posed probabilistically in order to account for the stochastic nature of the material's response. In support of the modeling effort, extensive testing of C/SiC in partial pressures of oxygen has been performed. This includes creep testing, tensile testing, half-life and residual tensile strength testing. C/SiC is proposed for airframe and propulsion applications in advanced reusable launch vehicles. Figures 1 and 2 illustrate the model's predictive capabilities as well as the manner in which experimental tests are being selected so as to ensure sufficient data are available to aid in model validation.
Fatigue life prediction modeling for turbine hot section materials
NASA Technical Reports Server (NTRS)
Halford, G. R.; Meyer, T. G.; Nelson, R. S.; Nissley, D. M.; Swanson, G. A.
1989-01-01
A major objective of the fatigue and fracture efforts under the NASA Hot Section Technology (HOST) program was to significantly improve the analytic life prediction tools used by the aeronautical gas turbine engine industry. This was achieved in the areas of high-temperature thermal and mechanical fatigue of bare and coated high-temperature superalloys. The cyclic crack initiation and propagation resistance of nominally isotropic polycrystalline and highly anisotropic single crystal alloys were addressed. Life prediction modeling efforts were devoted to creep-fatigue interaction, oxidation, coatings interactions, multiaxiality of stress-strain states, mean stress effects, cumulative damage, and thermomechanical fatigue. The fatigue crack initiation life models developed to date include the Cyclic Damage Accumulation (CDA) and the Total Strain Version of Strainrange Partitioning (TS-SRP) for nominally isotropic materials, and the Tensile Hysteretic Energy Model for anisotropic superalloys. A fatigue model is being developed based upon the concepts of Path-Independent Integrals (PII) for describing cyclic crack growth under complex nonlinear response at the crack tip due to thermomechanical loading conditions. A micromechanistic oxidation crack extension model was derived. The models are described and discussed.
Heritage House Maintenance Using 3D City Model Application Domain Extension Approach
NASA Astrophysics Data System (ADS)
Mohd, Z. H.; Ujang, U.; Liat Choon, T.
2017-11-01
Heritage houses are part of the architectural heritage of Malaysia that is highly valued. The Department of Heritage has made many efforts to preserve these heritage houses, such as monitoring damage problems. Damage to a heritage house may be caused by wood decay, roof leakage and exfoliation of walls. One of the initiatives for maintaining and documenting these heritage houses is through three-dimensional (3D) technology. 3D city models are now widely used by researchers for management and analysis. CityGML is a standard commonly used by researchers to exchange, store and manage virtual 3D city models, including both geometric and semantic information. Moreover, it represents 3D models at multiple scales in five levels of detail (LoDs), with each level serving distinct functions. The extension of CityGML was recently introduced and can be used for monitoring problems and recording the number of inhabitants of a house.
Memory mechanisms supporting syntactic comprehension
Waters, Gloria
2013-01-01
PMID:23319178
The PDS4 Data Dictionary Tool - Metadata Design for Data Preparers
NASA Astrophysics Data System (ADS)
Raugh, A.; Hughes, J. S.
2017-12-01
One of the major design goals of the PDS4 development effort was to create an extendable Information Model (IM) for the archive, and to allow mission data designers/preparers to create extensions for metadata definitions specific to their own contexts. This capability is critical for the Planetary Data System - an archive that deals with a data collection that is diverse along virtually every conceivable axis. Amid such diversity in the data itself, it is in the best interests of the PDS archive and its users that all extensions to the IM follow the same design techniques, conventions, and restrictions as the core implementation itself. But it is unrealistic to expect mission data designers to acquire expertise in information modeling, model-driven design, ontology, schema formulation, and PDS4 design conventions and philosophy in order to define their own metadata. To bridge that expertise gap and bring the power of information modeling to the data label designer, the PDS Engineering Node has developed the data dictionary creation tool known as "LDDTool". This tool incorporates the same software used to maintain and extend the core IM, packaged with an interface that enables a developer to create an extension to the IM using the same standards-based metadata framework PDS itself uses. Through this interface, the novice dictionary developer has immediate access to the common set of data types and unit classes for defining attributes, and a straightforward method for constructing classes. The more experienced developer, using the same tool, has access to more sophisticated modeling methods like abstraction and extension, and can define context-specific validation rules. We present the key features of the PDS Local Data Dictionary Tool, which both supports the development of extensions to the PDS4 IM and ensures their compatibility with the IM.
A National Perspective on the Current Evaluation Activities in Extension
ERIC Educational Resources Information Center
Lamm, Alexa J.; Israel, Glenn D.; Diehl, David
2013-01-01
In order to enhance Extension evaluation efforts it is important to understand current practices. The study reported here researched the evaluation behaviors of county-based Extension professionals. Extension professionals from eight states (n = 1,173) responded to a survey regarding their evaluation data collection, analysis, and reporting…
Designing Fault-Injection Experiments for the Reliability of Embedded Systems
NASA Technical Reports Server (NTRS)
White, Allan L.
2012-01-01
This paper considers the long-standing problem of conducting fault-injection experiments to establish the ultra-reliability of embedded systems. There have been extensive efforts in fault injection, and this paper offers a partial summary of them, but previous efforts have focused on realism and efficiency. Fault injections have been used to examine diagnostics and to test algorithms, but the literature does not contain any framework that says how to conduct fault-injection experiments to establish ultra-reliability. A solution to this problem integrates field data, arguments from design, and fault injection into a seamless whole. The solution in this paper is to derive a model-reduction theorem for a class of semi-Markov models suitable for describing ultra-reliable embedded systems. The derivation shows that a tight upper bound on the probability of system failure can be obtained using only the means of system-recovery times, thus reducing the experimental effort to estimating a reasonable number of easily observed parameters. The paper includes an example of a system subject to both permanent and transient faults. There is a discussion of integrating fault injection with field data and arguments from design.
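The intuition behind the mean-recovery-time result can be sketched with a toy Monte Carlo (the fault rate, the recovery-time distributions, and the near-coincident-fault failure criterion below are illustrative assumptions, not taken from the paper): a system fails if a second fault arrives while recovery from the first is still underway, so for rare faults the failure probability is approximately the fault rate times the mean recovery time, whatever the shape of the recovery-time distribution.

```python
import random

def failure_probability(fault_rate, draw_recovery, n_faults=200_000, seed=1):
    """Estimate P(a second fault arrives before recovery completes).

    For rare faults this is approximately fault_rate * E[recovery time],
    independent of the recovery-time distribution's shape.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_faults):
        recovery = draw_recovery(rng)
        # Time to the next independent fault is exponential(fault_rate).
        next_fault = rng.expovariate(fault_rate)
        if next_fault < recovery:
            failures += 1
    return failures / n_faults

rate = 1e-3  # faults per hour (hypothetical)
# Two recovery-time distributions with the same mean (1.0 hour):
p_fixed = failure_probability(rate, lambda r: 1.0)
p_exp = failure_probability(rate, lambda r: r.expovariate(1.0))
# Both estimates land near rate * mean_recovery = 1e-3.
```

Swapping a fixed recovery time for an exponential one with the same mean leaves the estimate essentially unchanged, which is the practical point of the bound: only the mean recovery time needs to be measured.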
Variance in binary stellar population synthesis
NASA Astrophysics Data System (ADS)
Breivik, Katelyn; Larson, Shane L.
2016-03-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations in less than a week, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
Studying Variance in the Galactic Ultra-compact Binary Population
NASA Astrophysics Data System (ADS)
Larson, Shane L.; Breivik, Katelyn
2017-01-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations on week-long timescales, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
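The realization-to-realization variance the authors explore can be illustrated with a deliberately crude sketch (the mass and period distributions and the "compact" cut below are placeholders, not the binary-evolution model used in the work): drawing many independent population realizations from the same underlying distributions and comparing a summary statistic across them exposes the scatter a single high-fidelity simulation cannot reveal.

```python
import random
import statistics

def simulate_population(n_binaries, rng):
    """Draw one toy binary population and count 'compact' systems.

    The distributions and the survival criterion are placeholders,
    not the binary stellar evolution model used by the authors.
    """
    compact = 0
    for _ in range(n_binaries):
        m1 = rng.uniform(0.8, 8.0)     # primary mass, solar masses
        log_p = rng.uniform(0.0, 4.0)  # log10 orbital period, days
        # Toy criterion for ending up as a compact binary.
        if m1 > 4.0 and log_p < 1.0:
            compact += 1
    return compact

rng = random.Random(42)
counts = [simulate_population(5000, rng) for _ in range(200)]
mean_n = statistics.mean(counts)
std_n = statistics.stdev(counts)
# std_n quantifies the simulation-to-simulation variance that the
# rapid Monte Carlo technique is designed to map out.
```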
NASA Technical Reports Server (NTRS)
Santanello, Joseph
2011-01-01
NASA's Land Information System (LIS; lis.gsfc.nasa.gov) is a flexible land surface modeling and data assimilation framework developed over the past decade with the goal of integrating satellite- and ground-based observational data products and advanced land surface modeling techniques to produce optimal fields of land surface states and fluxes. LIS features a high-performance, flexible design, and operates on an ensemble of land surface models for extension over user-specified regional or global domains. The extensible interfaces of LIS allow the incorporation of new domains, land surface models (LSMs), land surface parameters, meteorological inputs, and data assimilation and optimization algorithms. In addition, LIS has been demonstrated for parameter estimation and uncertainty estimation, and has been coupled to the Weather Research and Forecasting (WRF) mesoscale model. A visiting fellowship is currently underway to implement JULES in LIS and to undertake some fundamental science on the feedbacks between the land surface and the atmosphere. An overview of the LIS system, its features, and sample results will be presented in an effort to engage the community in the potential advantages of LIS-JULES for a range of applications. Ongoing efforts to develop a framework for diagnosing land-atmosphere coupling will also be presented, using the suite of LSM and PBL schemes available in LIS and WRF along with observations from the U.S. Southern Great Plains. This methodology provides a potential pathway to study factors controlling local land-atmosphere coupling (LoCo) using the LIS-WRF system, which will serve as a testbed for future experiments to evaluate coupling diagnostics within the community.
Terminology development towards harmonizing multiple clinical neuroimaging research repositories.
Turner, Jessica A; Pasquerello, Danielle; Turner, Matthew D; Keator, David B; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D; Potkin, Steven G; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei
2015-07-01
Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories.
Terminology development towards harmonizing multiple clinical neuroimaging research repositories
Turner, Jessica A.; Pasquerello, Danielle; Turner, Matthew D.; Keator, David B.; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D.; Potkin, Steven G.; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei
2015-01-01
Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories. PMID:26688838
Development and validation of a blade-element mathematical model for the AH-64A Apache helicopter
NASA Technical Reports Server (NTRS)
Mansur, M. Hossein
1995-01-01
A high-fidelity blade-element mathematical model for the AH-64A Apache Advanced Attack Helicopter has been developed by the Aeroflightdynamics Directorate of the U.S. Army's Aviation and Troop Command (ATCOM) at Ames Research Center. The model is based on the McDonnell Douglas Helicopter Systems' (MDHS) Fly Real Time (FLYRT) model of the AH-64A (acquired under contract) which was modified in-house and augmented with a blade-element-type main-rotor module. This report describes, in detail, the development of the rotor module, and presents some results of an extensive validation effort.
Generalized fractional diffusion equations for subdiffusion in arbitrarily growing domains
NASA Astrophysics Data System (ADS)
Angstmann, C. N.; Henry, B. I.; McGann, A. V.
2017-10-01
The ubiquity of subdiffusive transport in physical and biological systems has led to intensive efforts to provide robust theoretical models for this phenomena. These models often involve fractional derivatives. The important physical extension of this work to processes occurring in growing materials has proven highly nontrivial. Here we derive evolution equations for modeling subdiffusive transport in a growing medium. The derivation is based on a continuous-time random walk. The concise formulation of these evolution equations requires the introduction of a new, comoving, fractional derivative. The implementation of the evolution equation is illustrated with a simple model of subdiffusing proteins in a growing membrane.
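Subdiffusion of the kind these evolution equations describe emerges from a continuous-time random walk with heavy-tailed waiting times; a minimal fixed-domain sketch (domain growth is omitted, and the Pareto exponent is chosen purely for illustration) shows the mean-squared displacement growing like t^alpha rather than t.

```python
import random

def ctrw_positions(t_max, alpha, n_walkers=2000, seed=0):
    """Simulate CTRW walkers with Pareto(alpha) waiting times, unit steps.

    Heavy-tailed waits (0 < alpha < 1) give subdiffusion: the MSD grows
    like t**alpha instead of linearly in t.
    """
    rng = random.Random(seed)
    positions = []
    for _ in range(n_walkers):
        t, x = 0.0, 0
        while True:
            # Pareto waiting time: P(w > s) = s**(-alpha) for s >= 1.
            wait = rng.random() ** (-1.0 / alpha)
            if t + wait > t_max:
                break
            t += wait
            x += rng.choice((-1, 1))
        positions.append(x)
    return positions

def msd(positions):
    return sum(x * x for x in positions) / len(positions)

alpha = 0.6
msd_t1 = msd(ctrw_positions(1e3, alpha))
msd_t2 = msd(ctrw_positions(1e4, alpha, seed=1))
# For subdiffusion, msd_t2 / msd_t1 sits near 10**alpha (about 4),
# well below the factor of 10 that ordinary diffusion would give.
```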
Lange, Bernd Markus; Rios-Estepa, Rigoberto
2014-01-01
The integration of mathematical modeling with analytical experimentation in an iterative fashion is a powerful approach to advance our understanding of the architecture and regulation of metabolic networks. Ultimately, such knowledge is highly valuable to support efforts aimed at modulating flux through target pathways by molecular breeding and/or metabolic engineering. In this article we describe a kinetic mathematical model of peppermint essential oil biosynthesis, a pathway that has been studied extensively for more than two decades. Modeling assumptions and approximations are described in detail. We provide step-by-step instructions on how to run simulations of dynamic changes in pathway metabolite concentrations.
NASA Technical Reports Server (NTRS)
Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.
2013-01-01
NextGen operations are associated with a variety of changes to the national airspace system (NAS), including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, and (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment, and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research that could guide future research opportunities.
This research effort is intended to help the FAA evaluate pilot modeling efforts and select the appropriate tools for future modeling efforts to predict pilot performance in NextGen operations.
2013-03-01
within systems of UAVs and between UAVs and the operators that use them. The next step for small UAVs in this direction is for one operator to be able...Team’s testing efforts, both in the planning and execution stages. The flight tests would never have taken place without the tremendous assistance...
ERIC Educational Resources Information Center
Gansemer, Lawrence P.; Bealer, Robert C.
Using data generated from the records of 460 rural-reared Pennsylvania males contacted initially as sophomores in 1947 and again in 1957 and 1971, an effort was made to replicate the tradition of path analytic, causal modeling of status attainment in American society and to assess the empirical efficacy of certain family input variables not…
NASA Technical Reports Server (NTRS)
Ayres, T. R.; Brown, A.
2000-01-01
Our LTSA (Long Term Space Astrophysics) research has utilized current NASA and ESA spacecraft, supporting ground-based IR, radio, and sub-mm telescopes, and the extensive archives of HST (Hubble Space Telescope), IUE (International Ultraviolet Explorer), ROSAT, EUVE (Extreme Ultraviolet Explorer), and other missions. Our research effort has included observational work (with a non-negligible ground-based component), specialized processing techniques for imaging and spectral data, and semiempirical modelling, ranging from optically thin emission measure studies to simulations of optically thick resonance lines. In our previous LTSA efforts, we have had a number of major successes, including most recently: organizing and carrying out an extensive cool-star UV survey in HST cycle eight; obtaining observing time with new instruments, such as Chandra and XMM (X-ray Multi-Mirror Mission), in their first cycles; and collaborating with the Chandra GTO program and participating with the Chandra Emission Line Project on multi-wavelength observations of HR 1099 and Capella. These are the main broad-brush themes of our previous investigation: a) Where do Coronae Occur in the Hertzsprung-Russell Diagram? b) Winds of Coronal and Noncoronal Stars; c) Activity, Age, Rotation Relations; d) Atmospheric Inhomogeneities; e) Heating Mechanisms, Subcoronal Flows, and Flares; f) Development of Analysis and Modelling Tools.
An Analysis of the Priority Needs of Cooperative Extension at the County Level
ERIC Educational Resources Information Center
Harder, Amy; Lamm, Alexa; Strong, Robert
2009-01-01
Cooperative Extension's role as a relevant provider of nonformal education is dependent upon its ability to improve and adjust in response to internal and external pressures. Periodically conducting needs assessments focused on the Extension organization can aid in Extension's efforts to deliver quality educational programs by pinpointing priority…
Utilizing Evaluation To Develop a Marketing Strategy in the Louisiana Cooperative Extension Service.
ERIC Educational Resources Information Center
Coreil, Paul D.; Verma, Satish
Marketing has become a popular strategic initiative among state extension services to meet the growing demand for program accountability. The Louisiana Cooperative Extension Service (LCES) began a formative evaluation of its marketing efforts as a step toward a comprehensive marketing plan. All extension faculty were surveyed to determine their…
A generic open-source software framework supporting scenario simulations in bioterrorist crises.
Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie
2013-09-01
Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This has led internationally to increased research efforts to improve knowledge of, and approaches to, protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, and environmental conditions (e.g., rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of the desired resolution. STEM also supports collaborative and joint efforts in crisis situations through extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.
Code of Federal Regulations, 2013 CFR
2013-01-01
... OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING EXTENSION PARTNERSHIP; ENVIRONMENTAL PROJECTS... information, NIST manufacturing extension efforts, EPA regulation and guidance, and state requirements. The... addition, consultants providing services to those businesses, the NIST Manufacturing Extension Centers, and...
Code of Federal Regulations, 2012 CFR
2012-01-01
... OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING EXTENSION PARTNERSHIP; ENVIRONMENTAL PROJECTS... information, NIST manufacturing extension efforts, EPA regulation and guidance, and state requirements. The... addition, consultants providing services to those businesses, the NIST Manufacturing Extension Centers, and...
Code of Federal Regulations, 2014 CFR
2014-01-01
... OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING EXTENSION PARTNERSHIP; ENVIRONMENTAL PROJECTS... information, NIST manufacturing extension efforts, EPA regulation and guidance, and state requirements. The... addition, consultants providing services to those businesses, the NIST Manufacturing Extension Centers, and...
FuGEFlow: data model and markup language for flow cytometry.
Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R
2009-06-16
Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer-reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) model formalizes common aspects of comprehensive and high-throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines, and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt-compliant experiment description. The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets.
Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experiences the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange including public flow cytometry repositories currently under development.
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
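Analogy-based estimation of the kind described here predicts a new project's effort from its k most similar completed projects; a minimal sketch follows (the feature set, normalization, and history data are invented for illustration, not the NASA dataset or the published model):

```python
import math

def knn_effort(projects, query, k=3):
    """Estimate effort as the mean of the k nearest completed projects.

    `projects` is a list of (features, effort) pairs; features should be
    normalized beforehand so no single attribute dominates the distance.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(projects, key=lambda p: dist(p[0], query))[:k]
    return sum(effort for _, effort in nearest) / k

# Hypothetical normalized features: (size, complexity, team_experience)
history = [
    ((0.20, 0.50, 0.80), 120.0),
    ((0.30, 0.60, 0.70), 150.0),
    ((0.80, 0.90, 0.30), 610.0),
    ((0.70, 0.80, 0.40), 540.0),
    ((0.25, 0.55, 0.75), 135.0),
]
estimate = knn_effort(history, (0.28, 0.58, 0.72))
# The three nearest analogues are the small projects, so the estimate
# falls at their mean effort (135.0 person-months here).
```

Clustering-based variants group the historical projects first and estimate from the query's cluster; the nearest-neighbor step above is the common core of both approaches.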
NASA Technical Reports Server (NTRS)
Fried, Alan; Drummond, James
2003-01-01
This final report summarizes the progress achieved over the entire 3-year proposal period, including two extensions spanning 1 year. These activities include: 1) preparation for and participation in the NASA 2001 TRACE-P campaign using our airborne tunable diode laser system to acquire measurements of formaldehyde (CH2O); 2) comprehensive data analysis and data submittal to the NASA archive; 3) follow-up data interpretation, working with NASA modelers to place our ambient CH2O measurements into a broader photochemical context; 4) publication of numerous JGR papers using these data; 5) extensive follow-up laboratory tests on the selectivity and efficiency of our CH2O scrubbing system; and 6) an extensive follow-up effort to assess and study the mechanical stability of our entire optical system, particularly the multipass absorption cell, with aircraft changes in cabin pressure.
Sperm competition games when males invest in paternal care.
Requena, Gustavo S; Alonzo, Suzanne H
2017-08-16
Sperm competition games investigate how males partition limited resources between pre- and post-copulatory competition. Although extensive research has explored how various aspects of mating systems affect this allocation, male allocation between mating, fertilization and parental effort has not previously been considered. Yet, paternal care can be energetically expensive and males are generally predicted to adjust their parental effort in response to expected paternity. Here, we incorporate parental effort into sperm competition games, particularly exploring how the relationship between paternal care and offspring survival affects sperm competition and the relationship between paternity and paternal care. Our results support existing expectations that (i) fertilization effort should increase with female promiscuity and (ii) paternal care should increase with expected paternity. However, our analyses also reveal that the cost of male care can drive the strength of these patterns. When paternal behaviour is energetically costly, increased allocation to parental effort constrains allocation to fertilization effort. As paternal care becomes less costly, the association between paternity and paternal care weakens and may even be absent. By explicitly considering variation in sperm competition and the cost of male care, our model provides an integrative framework for predicting the interaction between paternal care and patterns of paternity. © 2017 The Author(s).
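The allocation trade-off this model analyzes can be illustrated with a toy budget model (the contest and survival functions below are invented stand-ins, not the published model): a male splits a unit budget between fertilization effort s and parental effort c = 1 - s, paternity follows a fair-raffle contest against a fixed rival, offspring survival saturates with care scaled by its cost, and a grid search locates the optimal split.

```python
def fitness(s, c, rival_s=0.5, care_cost=1.0):
    """Toy fitness: paternity share times offspring survival.

    Paternity is a fair raffle against a fixed rival's effort; survival
    saturates with effective care c / care_cost. These functional forms
    are illustrative, not those of the published model.
    """
    paternity = s / (s + rival_s) if s + rival_s > 0 else 0.0
    survival = (c / care_cost) / (1.0 + c / care_cost)
    return paternity * survival

def best_allocation(care_cost, steps=200):
    """Grid-search the split of a unit budget between s and c = 1 - s."""
    best = max(
        (fitness(i / steps, 1.0 - i / steps, care_cost=care_cost), i / steps)
        for i in range(steps + 1)
    )
    return best[1]  # optimal fertilization effort s*

s_cheap_care = best_allocation(care_cost=0.2)
s_costly_care = best_allocation(care_cost=2.0)
# Costlier care pulls the optimum toward parental effort, i.e. the
# optimal fertilization effort s* drops.
```

Raising the cost of care shifts the optimum toward parental effort and away from fertilization effort, echoing the qualitative constraint described in the abstract; the toy makes no claim about the published model's quantitative predictions.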
Experimental Characterization and Micromechanical Modeling of Woven Carbon/Copper Composites
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Pauly, Christopher C.; Pindera, Marek-Jerzy
1997-01-01
The results of an extensive experimental characterization and a preliminary analytical modeling effort for the elastoplastic mechanical behavior of 8-harness satin weave carbon/copper (C/Cu) composites are presented. Previous experimental and modeling investigations of woven composites are discussed, as is the evolution of, and motivation for, the continuing research on C/Cu composites. Experimental results of monotonic and cyclic tension, compression, and Iosipescu shear tests, and combined tension-compression tests, are presented. With regard to the test results, emphasis is placed on the effect of strain gauge size and placement, the effect of alloying the copper matrix to improve fiber-matrix bonding, yield surface characterization, and failure mechanisms. The analytical methodology used in this investigation consists of an extension of the three-dimensional generalized method of cells (GMC-3D) micromechanics model, developed by Aboudi (1994), to include inhomogeneity and plasticity effects on the subcell level. The extension of the model allows prediction of the elastoplastic mechanical response of woven composites, as represented by a true repeating unit cell for the woven composite. The model is used to examine the effects of refining the representative geometry of the composite, altering the composite overall fiber volume fraction, changing the size and placement of the strain gauge with respect to the composite's reinforcement weave, and including porosity within the infiltrated fiber yarns on the in-plane elastoplastic tensile, compressive, and shear response of 8-harness satin C/Cu. The model predictions are also compared with the appropriate monotonic experimental results.
Advances in Time-Distance Helioseismology
NASA Technical Reports Server (NTRS)
Duvall, Thomas L., Jr.; Beck, John G.; Gizon, Laurent; Kosovichev, Alexander F.; Oegerle, William (Technical Monitor)
2002-01-01
Time-distance helioseismology is a way to measure travel times between surface locations for waves traversing the solar interior. Coupling the travel times with an extensive modeling effort has proven to be a powerful tool for measuring flows and other wave-speed inhomogeneities in the solar interior. Problems receiving current attention include studying the time variation of the meridional circulation and torsional oscillation, and active-region emergence and evolution; current results on these topics will be presented.
Biscarini, Andrea; Benvenuti, Paolo; Botti, Fabio M; Brunetti, Antonella; Brunetti, Orazio; Pettorossi, Vito E
2014-09-01
A number of research studies provide evidence that hamstring cocontraction during open kinetic chain knee extension exercises enhances tibiofemoral (TF) stability and reduces the strain on the anterior cruciate ligament. To determine the possible increase in hamstring muscle coactivation caused by a voluntary cocontraction effort during open kinetic chain leg-extension exercises, and to assess whether an intentional hamstring cocontraction can completely suppress the anterior TF shear force during these exercises. Descriptive laboratory study. Knee kinematics as well as electromyographic activity in the semitendinosus (ST), semimembranosus (SM), biceps femoris (BF), and quadriceps femoris muscles were measured in 20 healthy men during isotonic leg extension exercises with resistance (R) ranging from 10% to 80% of the 1-repetition maximum (1RM). The same exercises were also performed while the participants attempted to enhance hamstring coactivation through a voluntary cocontraction effort. The data served as input parameters for a model to calculate the shear and compressive TF forces in leg extension exercises for any set of coactivation patterns of the different hamstring muscles. For R ≤ 40% 1RM, the peak coactivation levels obtained with intentional cocontraction (l) were significantly higher (P < 10^-3) than those obtained without intentional cocontraction (l0). For each hamstring muscle, the maximum level l was reached at R = 30% 1RM, corresponding to 9.2%, 10.5%, and 24.5% of maximum voluntary isometric contraction (MVIC) for the BF, ST, and SM, respectively, whereas the ratio l/l0 reached its maximum at R = 20% 1RM and was approximately 2, 3, and 4 for the BF, SM, and ST, respectively. The voluntarily enhanced coactivation level l obtained for R ≤ 30% 1RM completely suppressed the anterior TF shear force developed by the quadriceps during the exercise.
In leg extension exercises with resistance R ≤ 40% 1RM, coactivation of the BF, SM, and ST can be significantly enhanced (up to 2, 3, and 4 times, respectively) by a voluntary hamstring cocontraction effort. The enhanced coactivation levels obtained for R ≤ 30% 1RM can completely suppress the anterior TF shear force developed by the quadriceps during the exercise. This laboratory study suggests that leg extension exercise with intentional hamstring cocontraction may have the potential to be a safe and effective quadriceps-strengthening intervention in the early stages of rehabilitation programs for anterior cruciate ligament injury or reconstruction recovery. Further studies, including clinical trials, are needed to investigate the relevance of this therapeutic exercise in clinical practice. © 2014 The Author(s).
Spatially-explicit models of global tree density.
Glick, Henry B; Bettigole, Charlie; Maynard, Daniel S; Covey, Kristofer R; Smith, Jeffrey R; Crowther, Thomas W
2016-08-16
Remote sensing and geographic analysis of woody vegetation provide means of evaluating the distribution of natural resources, patterns of biodiversity and ecosystem structure, and socio-economic drivers of resource utilization. While these methods bring geographic datasets with global coverage into our day-to-day analytic spheres, many of the studies that rely on these strategies do not capitalize on the extensive collection of existing field data. We present the methods and maps associated with the first spatially-explicit models of global tree density, which relied on over 420,000 forest inventory field plots from around the world. This research is the result of a collaborative effort engaging over 20 scientists and institutions, and capitalizes on an array of analytical strategies. Our spatial data products offer precise estimates of the number of trees at global and biome scales, but should not be used for local-level estimation. At larger scales, these datasets can contribute valuable insight into resource management, ecological modelling efforts, and the quantification of ecosystem services.
Evidence of erosive burning in shuttle solid rocket motor
NASA Technical Reports Server (NTRS)
Martin, C. L.
1983-01-01
Known models of Shuttle Solid Rocket Motor (SRM) performance have failed to produce pressure-time traces that accurately match actual motor performance, especially during the first 5 seconds after ignition and during the last quarter of web burn time. Efforts to compensate for these differences between model reconstruction and actual performance resulted in the use of a Burning Anomaly Rate Function (BARF). Propellant erosive burning was suspected to be primarily responsible for the deviation of the model from actual results. The three-dimensional Hercules Grain Design and Internal Ballistics Evaluation Program was made operational and slightly modified, and an extensive trial-and-error effort was begun to test the hypothesis of erosive burning as an explanation of the burning anomaly. It was found that introducing erosive burning (using Green's erosive burning equation) over portions of the aft segment grain and above a threshold gas Mach number did, in fact, give excellent agreement with the actual motor trace.
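The augmentation described, a base burn rate boosted above a threshold gas Mach number, can be sketched generically. The functional form and every coefficient below are illustrative placeholders, not Green's actual equation or SRM propellant values:

```python
def burn_rate(p, mach, a=0.05, n=0.35, k=2.0, mach_th=0.2):
    """Solid-propellant burn rate with a threshold erosive augmentation.

    The base rate follows Saint-Robert's law r = a * p**n; above a
    threshold gas Mach number the rate is augmented in proportion to
    the excess Mach number. All coefficients are illustrative only.
    """
    base = a * p ** n
    erosive_factor = 1.0 + k * max(0.0, mach - mach_th)
    return base * erosive_factor
```

In a ballistics reconstruction, a correction of this kind would be applied only over the aft-segment grain cells where the local Mach number exceeds the threshold, which is how the study localized the effect.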
Benedict, Stephen T.; Conrads, Paul; Feaster, Toby D.; Journey, Celeste A.; Golden, Heather E.; Knightes, Christopher D.; Davis, Gary M.; Bradley, Paul M.
2012-01-01
The McTier Creek watershed is located in the headwaters of the Edisto River Basin, which is in the Coastal Plain region of South Carolina. The Edisto ecosystem has some of the highest recorded fish-tissue mercury concentrations in the United States. In an effort to advance the understanding of the fate and transport of mercury in stream ecosystems, the U.S. Geological Survey, as part of its National Water-Quality Assessment Program, initiated a field investigation of mercury in the McTier Creek watershed in 2006. The initial efforts of the investigation included the collection of extensive hydrologic and water-quality field data, along with the development of several hydrologic and water-quality models. This series of measured and modeled data forms the primary source of information for this investigation to assess the fate and transport of mercury within the McTier Creek watershed.
Comparison of UWCC MOX fuel measurements to MCNP-REN calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, M.; Baker, M.; Jie, R.
1998-12-31
The development of neutron coincidence counting has greatly improved the accuracy and versatility of neutron-based techniques to assay fissile materials. Today, the shift register analyzer connected to either a passive or active neutron detector is widely used by both domestic and international safeguards organizations. The continued development of these techniques and detectors makes extensive use of predictions of detector response obtained through Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model, as it is currently used, fails to accurately predict detector response in highly multiplying media such as mixed-oxide (MOX) light water reactor fuel assemblies. For this reason, efforts have been made to modify the currently used Monte Carlo codes and to develop new analytical methods so that this model is not required to predict detector response. The authors describe their efforts to modify a widely used Monte Carlo code for this purpose and also compare calculational results with experimental measurements.
Fretting Fatigue of Single Crystal/Polycrystalline Nickel Subjected to Blade/Disk Contact Loading
NASA Astrophysics Data System (ADS)
Matlik, J. F.; Murthy, H.; Farris, T. N.
2002-01-01
Fretting fatigue describes the formation and growth of cracks at the edge-of-contact of nominally clamped components subjected to cyclic loading. Components that are known to be subject to fretting fatigue include riveted lap joints and blade/disk contacts in launch vehicle turbomachinery. Recent efforts have shown that conventional mechanics tools, both fatigue and fracture based, can be used to model fretting fatigue experiments leading to successful life predictions. In particular, experiments involving contact load configurations similar to those that occur in the blade/disk connection of gas turbine engines have been performed extensively. Predictions of fretting fatigue life have been compared favorably to experimental observations [1]. Recent efforts are aimed at performing experiments at higher temperatures, as shown in the photograph below along with a sample fracture surface. The talk will describe the status of these experiments as well as model developments relevant to the single crystal material properties.
Use of Demonstration Gardens in Extension: Challenges and Benefits
ERIC Educational Resources Information Center
Glen, Charlotte D.; Moore, Gary E.; Jayaratne, K. S. U.; Bradley, Lucy K.
2014-01-01
Extension agents' use of demonstration gardens was studied to determine how gardens are employed in horticultural programming, perceived benefits and challenges of using gardens for Extension programming, and desired competencies. Gardens are primarily used to enhance educational efforts by providing hands-on learning experiences. Greatest…
An adapted yield criterion for the evolution of subsequent yield surfaces
NASA Astrophysics Data System (ADS)
Küsters, N.; Brosius, A.
2017-09-01
In numerical analysis of sheet metal forming processes, the anisotropic material behaviour is often modelled with isotropic work hardening and an average Lankford coefficient. In contrast, experimental observations show an evolution of the Lankford coefficients, which can be associated with a yield surface change due to kinematic and distortional hardening. Commonly, extensive efforts are carried out to describe these phenomena. In this paper an isotropic material model based on the Yld2000-2d criterion is adapted with an evolving yield exponent in order to change the yield surface shape. The yield exponent is linked to the accumulative plastic strain. This change has the effect of a rotating yield surface normal. As the normal is directly related to the Lankford coefficient, the change can be used to model the evolution of the Lankford coefficient during yielding. The paper will focus on the numerical implementation of the adapted material model for the FE-code LS-Dyna, mpi-version R7.1.2-d. A recently introduced identification scheme [1] is used to obtain the parameters for the evolving yield surface and will be briefly described for the proposed model. The suitability for numerical analysis will be discussed for deep drawing processes in general. Efforts for material characterization and modelling will be compared to other common yield surface descriptions. Besides experimental efforts and achieved accuracy, the potential of flexibility in material models and the risk of ambiguity during identification are of major interest in this paper.
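The central mechanism, that changing only the yield exponent rotates the yield-surface normal and therefore changes the Lankford coefficient, can be demonstrated numerically. The anisotropic plane-stress criterion below is a toy stand-in assumption, not Yld2000-2d; the associated flow rule then gives r from the numerically differentiated normal:

```python
def phi(sx, sy, a):
    """Toy anisotropic plane-stress yield function (NOT Yld2000-2d):
    two linear stress combinations raised to a common exponent a."""
    return (abs(sx) ** a + abs(0.5 * sx - sy) ** a) ** (1.0 / a)

def lankford(a, h=1e-6):
    """Lankford coefficient for uniaxial tension along x via the
    associated flow rule, r = -(dphi/dsy) / (dphi/dsx + dphi/dsy),
    with the yield-surface normal differentiated numerically at
    the uniaxial stress point (sx, sy) = (1, 0)."""
    dphidx = (phi(1.0 + h, 0.0, a) - phi(1.0 - h, 0.0, a)) / (2 * h)
    dphidy = (phi(1.0, h, a) - phi(1.0, -h, a)) / (2 * h)
    return -dphidy / (dphidx + dphidy)
```

Raising the exponent rotates the normal at the uniaxial point and lowers r; tying the exponent to accumulated plastic strain, as the paper does with Yld2000-2d, then makes the Lankford coefficient evolve during yielding.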
Modeling strength data for CREW CHIEF
NASA Technical Reports Server (NTRS)
Mcdaniel, Joe W.
1990-01-01
The Air Force has developed CREW CHIEF, a computer-aided design (CAD) tool for simulating and evaluating aircraft maintenance to determine if the required activities are feasible. CREW CHIEF gives the designer the ability to simulate maintenance activities with respect to reach, accessibility, strength, hand tool operation, and materials handling. While developing CREW CHIEF, extensive research was performed to describe workers' strength capabilities for using hand tools and manual handling of objects. More than 100,000 strength measures were collected and modeled for CREW CHIEF. These measures involved both male and female subjects in the 12 maintenance postures included in CREW CHIEF. The data collection and modeling effort are described.
Validated Predictions of Metabolic Energy Consumption for Submaximal Effort Movement
Tsianos, George A.; MacFadden, Lisa N.
2016-01-01
Physical performance emerges from complex interactions among many physiological systems that are largely driven by the metabolic energy demanded. Quantifying metabolic demand is an essential step for revealing the many mechanisms of physical performance decrement, but accurate predictive models do not exist. The goal of this study was to investigate if a recently developed model of muscle energetics and force could be extended to reproduce the kinematics, kinetics, and metabolic demand of submaximal effort movement. Upright dynamic knee extension against various levels of ergometer load was simulated. Task energetics were estimated by combining the model of muscle contraction with validated models of lower limb musculotendon paths and segment dynamics. A genetic algorithm was used to compute the muscle excitations that reproduced the movement with the lowest energetic cost, which was determined to be an appropriate criterion for this task. Model predictions of oxygen uptake rate (VO2) were well within experimental variability for the range over which the model parameters were confidently known. The model's accurate estimates of metabolic demand make it useful for assessing the likelihood and severity of physical performance decrement for a given task as well as investigating underlying physiologic mechanisms. PMID:27248429
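The optimization step, searching for muscle excitations that reproduce the task at minimum energetic cost, can be sketched with a small genetic algorithm. The cost function, muscle strengths, and GA settings below are illustrative assumptions, not the paper's validated muscle energetics model:

```python
import random

def metabolic_cost(excitations, target=1.0, strengths=(1.0, 0.6, 0.3)):
    """Toy cost: energetic cost grows with squared excitation, and a
    quadratic penalty enforces that the muscles jointly produce a
    target torque. Values are illustrative, not physiological."""
    torque = sum(e * s for e, s in zip(excitations, strengths))
    energy = sum(e ** 2 for e in excitations)
    return energy + 100.0 * (torque - target) ** 2

def genetic_minimize(cost, n_genes=3, pop=40, gens=200, seed=1):
    """Minimal GA: truncation selection, averaging crossover, and
    Gaussian mutation of one gene, clamped to [0, 1]."""
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_genes)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=cost)
        parents = population[: pop // 2]
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]
            i = rng.randrange(n_genes)
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0.0, 0.1)))
            children.append(child)
        population = parents + children
    return min(population, key=cost)

best = genetic_minimize(metabolic_cost)
```

Because the stronger muscle can produce the target torque with less total squared excitation, the search settles on excitations roughly proportional to muscle strength, mirroring the lowest-energy criterion the study found appropriate for this task.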
Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R
2017-01-01
Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
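A minimal synchronous Boolean network analysis of the kind described can be sketched in a few lines. The two-gene mutual-inhibition network at the end is a hypothetical toy, not a specific plant gene regulatory network:

```python
from itertools import product

def step(state, rules):
    """Synchronous update: every node applies its Boolean rule at once."""
    return tuple(rule(state) for rule in rules)

def attractors(rules, n):
    """Find all attractors (fixed points and cycles) by following the
    update map from each of the 2**n states until a state repeats."""
    found = set()
    for state in product((0, 1), repeat=n):
        seen = []
        while state not in seen:
            seen.append(state)
            state = step(state, rules)
        cycle = seen[seen.index(state):]   # periodic part of the orbit
        k = cycle.index(min(cycle))        # canonical rotation of the cycle
        found.add(tuple(cycle[k:] + cycle[:k]))
    return found

# Toy network: two mutually inhibiting genes.
rules = (lambda s: 1 - s[1], lambda s: 1 - s[0])
```

For this toy network the two fixed points, (0, 1) and (1, 0), are the biologically interpretable "cell fates" of mutual inhibition, plus one oscillating cycle; in developmental applications, attractors like these are compared against observed gene expression profiles.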
A Synopsis of Global Mapping of Freshwater Habitats and Biodiversity: Implications for Conservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A.; Griffiths, Natalie A.; DeRolph, Christopher R.
Accurately mapping freshwater habitats and biodiversity at high resolution across the globe is essential for assessing the vulnerability and threats to freshwater organisms and prioritizing conservation efforts. Since the 2000s, extensive efforts have been devoted to mapping global freshwater habitats (rivers, lakes, and wetlands), the spatial representation of which has changed dramatically over time with new geospatial data products and improved remote sensing technologies. Some of these mapping efforts, however, are still coarse representations of actual conditions. Likewise, the resolution and scope of global freshwater biodiversity compilation efforts have also increased, but have yet to mirror the spatial resolution and fidelity of mapped freshwater environments. In our synopsis, we find that efforts to map freshwater habitats have been conducted independently of those for freshwater biodiversity; subsequently, there is little congruence in the spatial representation and resolution of the two efforts. We suggest that global species distribution models are needed to fill this information gap; however, limited data on habitat characteristics at scales that complement freshwater habitats have prohibited global high-resolution biogeography efforts. Emerging research trends, such as mapping habitat alteration in freshwater ecosystems and trait biogeography, show great promise in mechanistically linking global anthropogenic stressors to freshwater biodiversity decline and extinction risk.
Community Leadership Development: Implications for Extension.
ERIC Educational Resources Information Center
Northeast Regional Center for Rural Development, University Park, PA.
Designed for extension personnel who are involved in community leadership (CL) programs, this publication summarizes recent national efforts that could be useful in developing and conducting CL programs, and current leadership theory and literature. Part 1 reports the results of the national survey, initiated in April 1985, of extension staff…
Expanding the Graduate Education Experience at Scripps Institution of Oceanography, UC San Diego
NASA Astrophysics Data System (ADS)
Peach, C. L.; Kilb, D. L.; Zmarzly, D.; Abeyta, E.
2016-02-01
Emerging career pathways for graduate students in earth, ocean and climate sciences increasingly require skills in teaching and communication. This is true of academic careers, in which demonstrated teaching skills make applicants for faculty positions far more competitive, and traditionally less conventional careers outside of academia that require cross-disciplinary collaboration and/or communication to audiences not directly involved in science research (e.g. policy makers, educators, the public). Yet most graduate education programs provide little to no opportunity or incentive for young investigators to develop and hone these skills, and graduate students are often discouraged from deviating from the traditional "research apprenticeship" model during their graduate education. At Scripps, the Birch Aquarium at Scripps, and UC San Diego Extension, we are developing new ways to integrate teaching, communication, and outreach into our graduate education program, thus broadening the scope of graduate training and better serving the needs and evolving career aspirations of our graduate students. This effort is an integral part of our overall outreach strategy at Scripps, in which we seek to combine high quality STEM outreach and teaching with opportunities for Scripps graduate students to put their teaching and communications training into practice. The overall effort is a "win-win" both for our students and for the highly diverse K-16 community in San Diego County. In this talk we will summarize the programmatic efforts currently underway at Scripps, our strategic collaboration with UCSD Extension, which is expanding the capacity and reach of our integrated program, and our plans for sustaining these efforts for the long term.
Comparing Emerging XML Based Formats from a Multi-discipline Perspective
NASA Astrophysics Data System (ADS)
Sawyer, D. M.; Reich, L. I.; Nikhinson, S.
2002-12-01
This paper analyzes the similarity and differences among several examples of an emerging generation of Scientific Data Formats that are based on XML technologies. Some of the factors evaluated include the goals of these efforts, the data models and XML technologies used, and the maturity of currently available software. This paper then investigates the practicality of developing a single set of structural data objects and basic scientific concepts, such as units, that could be used across discipline boundaries and extended by disciplines and missions to create Scientific Data Formats for their communities. This analysis is partly based on an effort sponsored by the ESDIS office at GSFC to compare the Earth Science Markup Language (ESML) and the eXtensible Data Format (XDF), two members of this new generation of XML based Data Description Languages that have been developed by NASA funded efforts in recent years. This paper adds FITSML and potentially CDFML to the list of XML based Scientific Data Formats discussed. This paper draws heavily on a Formats Evolution Process Committee (http://ssdoo.gsfc.nasa.gov/nost/fep/) draft white paper primarily developed by Lou Reich, Mike Folk and Don Sawyer to assist the Space Science community in understanding Scientific Data Formats. One of the primary conclusions of that paper is that a scientific data format object model should be examined along two basic axes. The first is the complexity of the computer/mathematical data types supported and the second is the level of scientific domain specialization incorporated. This paper also discusses several of the issues that affect the decision on whether to implement a discipline or project specific Scientific Data Format as a formal extension of a general purpose Scientific Data Format or to implement the APIs independently.
Efforts to make and apply humanized yeast
Laurent, Jon M.; Young, Jonathan H.; Kachroo, Aashiq H.
2016-01-01
Despite a billion years of divergent evolution, the baker’s yeast Saccharomyces cerevisiae has long proven to be an invaluable model organism for studying human biology. Given its tractability and ease of genetic manipulation, along with extensive genetic conservation with humans, it is perhaps no surprise that researchers have been able to expand its utility by expressing human proteins in yeast, or by humanizing specific yeast amino acids, proteins or even entire pathways. These methods are increasingly being scaled in throughput, further enabling the detailed investigation of human biology and disease-specific variations of human genes in a simplified model organism. PMID:26462863
Kidnapping model: an extension of Selten's game.
Iqbal, Azhar; Masson, Virginie; Abbott, Derek
2017-12-01
Selten's game is a kidnapping model where the probability of capturing the kidnapper is independent of whether the hostage has been released or executed. Most often, in view of the elevated sensitivities involved, authorities put greater effort and resources into capturing the kidnapper if the hostage has been executed, in contrast with the case when a ransom is paid to secure the hostage's release. In this paper, we study the asymmetric game when the probability of capturing the kidnapper depends on whether the hostage has been executed or not and find a new uniquely determined perfect equilibrium point in Selten's game.
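The decision asymmetry can be illustrated with a minimal expected-payoff comparison at the kidnapper's final node. All numbers below are illustrative assumptions, not parameters from Selten's model or this paper:

```python
def kidnapper_payoff(release, ransom=1.0, p_capture_release=0.4,
                     p_capture_execute=0.7, punishment=5.0,
                     punishment_execute=10.0):
    """Kidnapper's expected payoff at the release/execute node.
    The asymmetry p_capture_execute > p_capture_release reflects the
    greater pursuit effort after an execution; all values are toy
    illustrations chosen for this sketch."""
    if release:
        # collect the ransom, risk the (lighter) kidnapping sentence
        return ransom - p_capture_release * punishment
    # no ransom, higher capture probability, heavier punishment
    return -p_capture_execute * punishment_execute
```

With these values the release branch dominates; raising p_capture_release toward p_capture_execute erodes that margin, which is exactly the dependence the paper builds into its equilibrium analysis.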
Progress towards extreme attitude testing with Magnetic Suspension and Balance Systems
NASA Technical Reports Server (NTRS)
Britcher, Colin P.; Parker, David H.
1988-01-01
Progress is reported in a research effort aimed towards demonstration of the feasibility of suspension and aerodynamic testing of models at high angles of attack in wind tunnel Magnetic Suspension and Balance Systems. Extensive modifications, described in this paper, have been made to the Southampton University suspension system in order to facilitate this work. They include revision of electromagnet configuration, installation of all-new position sensors and expansion of control system programs. An angle of attack range of 0 to 90 deg is expected for axisymmetric models. To date, suspension up to 80 deg angle of attack has been achieved.
Hormone purification by isoelectric focusing in space
NASA Technical Reports Server (NTRS)
Bier, M.
1982-01-01
The performance of a ground-prototype of an apparatus for recycling isoelectric focusing was evaluated in an effort to provide technology for large scale purification of peptide hormones, proteins, and other biologicals. Special emphasis was given to the effects of gravity on the function of the apparatus and to the determination of potential advantages derivable from its use in a microgravity environment. A theoretical model of isoelectric focusing using chemically defined buffer systems for the establishment of the pH gradients was developed. The model was transformed to a form suitable for computer simulations and was used extensively for the design of experimental buffers.
Non-formal educator use of evaluation results.
Baughman, Sarah; Boyd, Heather H; Franz, Nancy K
2012-08-01
Increasing demands for accountability in educational programming have resulted in increasing calls for program evaluation in educational organizations. Many organizations include conducting program evaluations as part of the job responsibilities of program staff. Cooperative Extension is a complex organization offering non-formal educational programs through land grant universities. Many Extension services require that non-formal educational program evaluations be conducted by field-based Extension educators. Evaluation research has focused primarily on the efforts of professional, external evaluators; the work of program staff with many responsibilities including program evaluation has received little attention. This study examined how field-based Extension educators (i.e., program staff) in four Extension services use the results of evaluations of programs that they have conducted themselves. Four types of evaluation use are measured and explored: instrumental use, conceptual use, persuasive use, and process use. Results indicate that there are few programmatic changes as a result of evaluation findings among the non-formal educators surveyed in this study. Extension educators tend to use evaluation results to persuade others about the value of their programs and learn from the evaluation process. Evaluation use is driven by accountability measures with very little program improvement use as measured in this study. Practical implications include delineating accountability and program improvement tasks within complex organizations in order to align evaluation efforts and to improve the results of both. There is some evidence that evaluation capacity building efforts may be increasing instrumental use by educators evaluating their own programs. Copyright © 2011 Elsevier Ltd. All rights reserved.
Grant, Evan H. Campbell; Zipkin, Elise; Scott, Sillett T.; Chandler, Richard; Royle, J. Andrew
2014-01-01
Wildlife populations consist of individuals that contribute disproportionately to growth and viability. Understanding a population's spatial and temporal dynamics requires estimates of abundance and demographic rates that account for this heterogeneity. Estimating these quantities can be difficult, requiring years of intensive data collection. Often, this is accomplished through the capture and recapture of individual animals, which is generally only feasible at a limited number of locations. In contrast, N-mixture models allow for the estimation of abundance, and spatial variation in abundance, from count data alone. We extend recently developed multistate, open population N-mixture models, which can additionally estimate demographic rates based on an organism's life history characteristics. In our extension, we develop an approach to account for the case where not all individuals can be assigned to a state during sampling. Using only state-specific count data, we show how our model can be used to estimate local population abundance, as well as density-dependent recruitment rates and state-specific survival. We apply our model to a population of black-throated blue warblers (Setophaga caerulescens) that have been surveyed for 25 years on their breeding grounds at the Hubbard Brook Experimental Forest in New Hampshire, USA. The intensive data collection efforts allow us to compare our estimates to estimates derived from capture–recapture data. Our model performed well in estimating population abundance and density-dependent rates of annual recruitment/immigration. Estimates of local carrying capacity and per capita recruitment of yearlings were consistent with those published in other studies. However, our model moderately underestimated annual survival probability of yearling and adult females and severely underestimated survival probabilities for both of these male stages.
The most accurate and precise estimates will necessarily require some amount of intensive data collection efforts (such as capture–recapture). Integrated population models that combine data from both intensive and extensive sources are likely to be the most efficient approach for estimating demographic rates at large spatial and temporal scales.
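The core idea, estimating abundance from count data alone by marginalizing over the latent abundance, can be sketched for the basic closed-population N-mixture model. This is the simple single-state form, not the multistate open-population extension developed here:

```python
from math import comb, exp, lgamma, log

def nmixture_likelihood(counts, lam, p, n_max=200):
    """Likelihood of repeated counts at one site under a basic
    N-mixture model: latent abundance N ~ Poisson(lam), each survey
    count y ~ Binomial(N, p). The latent N is marginalized out by
    summing up to a truncation bound n_max. The Poisson term is
    computed in log space to avoid overflow for large n."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        log_pois = -lam + n * log(lam) - lgamma(n + 1)
        like = exp(log_pois)
        for y in counts:
            like *= comb(n, y) * p ** y * (1 - p) ** (n - y)
        total += like
    return total
```

Maximizing this likelihood over lam and p (for example on a grid, jointly across sites) recovers abundance without ever marking individuals, which is what lets count data substitute for capture–recapture at many locations.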
1890 Institutions' Extension Program and Rural Development.
ERIC Educational Resources Information Center
Brown, Adell, Jr.
The extension role of Tuskegee Institute and the 16 black land grant colleges established by the Morrill Act of 1890 has been to diffuse among the non-university citizens of America useful and practical information on agriculture, home economics, and related areas. Tuskegee's extension efforts began in 1880 and flourished under the leadership of…
Cooperative Extension Answers the Call to Action to Support Breastfeeding
ERIC Educational Resources Information Center
Brill, Michelle F.
2016-01-01
Extension has many opportunities to promote breastfeeding, one of the most highly effective preventive measures a mother can take to protect the health of her infant and herself. This article describes how and why Cooperative Extension can and should partner with federal and state efforts to promote breastfeeding. Members of Rutgers' Family and…
From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models
Zhu, Hao
2017-01-01
Quantitative Structure Activity Relationship (QSAR) is the most frequently used modeling approach to explore the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to their low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, novel modeling protocols have been pursued by computational toxicologists based on the rapidly increasing toxicity testing data of recent years. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the field of computational drug discovery and environmental chemical risk assessment. PMID:23086837
Development and Testing of Carbon-Carbon Nozzle Extensions for Upper Stage Liquid Rocket Engines
NASA Technical Reports Server (NTRS)
Valentine, Peter G.; Gradl, Paul R.; Greene, Sandra E.
2017-01-01
Carbon-carbon (C-C) composite nozzle extensions are of interest for use on a variety of launch vehicle upper stage engines and in-space propulsion systems. The C-C nozzle extension technology and test capabilities being developed are intended to support National Aeronautics and Space Administration (NASA) and Department of Defense (DOD) requirements, as well as those of the broader Commercial Space industry. For NASA, C-C nozzle extension technology development primarily supports the NASA Space Launch System (SLS) and NASA's Commercial Space partners. Marshall Space Flight Center (MSFC) efforts are aimed at both (a) further developing the technology and databases needed to enable the use of composite nozzle extensions on cryogenic upper stage engines, and (b) developing and demonstrating low-cost capabilities for testing and qualifying composite nozzle extensions. Recent, on-going, and potential future work supporting NASA, DOD, and Commercial Space needs will be discussed. Information to be presented will include (a) recent and on-going mechanical, thermal, and hot-fire testing, as well as (b) potential future efforts to further develop and qualify domestic C-C nozzle extension solutions for the various upper stage engines under development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bons, Jeffrey; Ameri, Ali
2016-01-08
The objective of this research effort was to develop a validated computational modeling capability for the characterization of the effects of hot streaks and particulate deposition on the heat load of modern gas turbines. This was accomplished with a multi-faceted approach including analytical, experimental, and computational components. A 1-year no cost extension request was approved for this effort, so the total duration was 4 years. The research effort succeeded in its ultimate objective by leveraging extensive experimental deposition studies complemented by computational modeling. Experiments were conducted with hot streaks, vane cooling, and combinations of hot streaks with vane cooling. These studies contributed to a significant body of corporate knowledge of deposition, in combination with particle rebound and deposition studies funded by other agencies, to provide suitable conditions for the development of a new model. The model includes the following physical phenomena: elastic deformation, plastic deformation, adhesion, and shear removal. It also incorporates material property sensitivity to temperature and tangential-normal velocity rebound cross-dependencies observed in experiments. The model is well-suited for incorporation in CFD simulations of complex gas turbine flows due to its algebraic (explicit) formulation. This report contains model predictions compared to coefficient of restitution data available in the open literature as well as deposition results from two different high temperature turbine deposition facilities. While the model comparisons with experiments are in many cases promising, several key aspects of particle deposition remain elusive. The simple phenomenological nature of the model allows for parametric dependencies to be evaluated in a straightforward manner. This effort also included the first-ever full turbine stage deposition model published in the open literature.
The simulations included hot streaks and simulated vane cooling. The new deposition model was implemented into the CFD model as a wall boundary condition, with various particle sizes investigated in the simulation. Simulations utilizing a steady mixing plane formulation and an unsteady sliding mesh were conducted and the flow solution of each was validated against experimental data. Results from each of these simulations, including impact and capture distributions and efficiencies, were compared and potential reasons for differences discussed in detail. The inclusion of a large range of particle sizes allowed investigation of trends with particle size, such as increased radial migration and reduced sticking efficiency at the larger particle sizes. The unsteady simulation predicted lower sticking efficiencies on the rotor blades than the mixing plane simulation for the majority of particle sizes. This is postulated to be due to the preservation of the hot streak and cool vane wake through the vane-rotor interface (which are smeared out circumferentially in the mixing-plane simulation). The results reported here represent the successful implementation of a novel deposition model into validated vane-rotor flow solutions that include a non-uniform inlet temperature profile and simulated vane cooling.
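A critical-velocity style stick/rebound rule gives the flavor of such an algebraic wall boundary condition. The functional form and every constant below are illustrative assumptions, not the model developed in this effort:

```python
def particle_outcome(v_impact, temperature, v_crit_ref=2.0,
                     t_ref=1400.0, softening=0.002, e_ref=0.8):
    """Generic stick/rebound decision for an impacting particle.
    Below a temperature-dependent critical velocity the particle is
    captured (deposits); above it, the particle rebounds with a
    coefficient of restitution that drops as impacts get harder
    (more plastic deformation). All constants are illustrative."""
    # higher temperature softens the particle, so capture gets easier
    v_crit = v_crit_ref * (1.0 + softening * (temperature - t_ref))
    if v_impact <= v_crit:
        return "stick", 0.0
    cor = e_ref * v_crit / v_impact
    return "rebound", cor * v_impact
```

Because the rule is algebraic and evaluated per impact, it can be applied at every wall face of a CFD simulation without any iterative sub-model, which is the practical advantage the abstract attributes to an explicit formulation.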
Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J
2001-08-01
The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that contrary to prediction strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.
A framework for modeling and optimizing dynamic systems under uncertainty
Nicholson, Bethany; Siirola, John
2017-11-11
Algebraic modeling languages (AMLs) have drastically simplified the implementation of algebraic optimization problems. However, there are still many classes of optimization problems that are not easily represented in most AMLs. These classes of problems are typically reformulated before implementation, which requires significant effort and time from the modeler and obscures the original problem structure or context. In this work we demonstrate how the Pyomo AML can be used to represent complex optimization problems using high-level modeling constructs. We focus on the operation of dynamic systems under uncertainty and demonstrate the combination of Pyomo extensions for dynamic optimization and stochastic programming. We use a dynamic semibatch reactor model and a large-scale bubbling fluidized bed adsorber model as test cases.
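The abstract combines Pyomo's dynamic-optimization and stochastic-programming extensions. As a minimal, library-free illustration of the stochastic-programming side only (a hypothetical two-stage expected-cost problem solved by brute-force grid search; the cost coefficients, demand scenarios, and shortfall-penalty recourse are all invented for illustration, and Pyomo's actual API is not shown):

```python
# Two-stage stochastic-programming sketch (illustrative only; an AML such as
# Pyomo would express this declaratively and hand it to a real solver).
# First stage: choose a production level x before demand is known.
# Second stage: pay a recourse (shortfall) penalty once demand is realized.

def expected_cost(x, scenarios):
    """Expected total cost of first-stage decision x over (prob, demand) scenarios."""
    production_cost = 2.0 * x  # hypothetical unit production cost
    recourse = sum(p * 5.0 * max(d - x, 0.0)  # hypothetical shortfall penalty
                   for p, d in scenarios)
    return production_cost + recourse

# Hypothetical demand scenarios: (probability, demand)
scenarios = [(0.3, 10.0), (0.5, 14.0), (0.2, 20.0)]

# Pick the best x over a coarse integer grid (stand-in for a solver call).
best_x = min(range(0, 25), key=lambda x: expected_cost(float(x), scenarios))
```

The point of the sketch is structural: the objective averages the second-stage cost over scenarios, which is exactly the kind of construct a stochastic-programming extension lets the modeler state without manual reformulation.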
Evaluation of a Multi-Axial, Temperature, and Time Dependent (MATT) Failure Model
NASA Technical Reports Server (NTRS)
Richardson, D. E.; Anderson, G. L.; Macon, D. J.; Rudolphi, Michael (Technical Monitor)
2002-01-01
To obtain a better understanding of the response of the structural adhesives used in the Space Shuttle's Reusable Solid Rocket Motor (RSRM) nozzle, an extensive effort has been conducted to characterize in detail the failure properties of these adhesives. This effort involved the development of a failure model that includes the effects of multi-axial loading, temperature, and time. An understanding of the effects of these parameters on the failure of the adhesive is crucial to the understanding and prediction of the safety of the RSRM nozzle. This paper documents the use of this newly developed multi-axial, temperature, and time (MATT) dependent failure model for modeling failure of the adhesives TIGA 321, EA913NA, and EA946. The development of the mathematical failure model using constant-load-rate normal and shear test data is presented. Verification of the accuracy of the failure model is shown through comparisons between predictions and measured creep and multi-axial failure data. The verification indicates that the failure model performs well for a wide range of conditions (loading, temperature, and time) for the three adhesives. The failure criterion is shown to be accurate through the glass transition for the adhesive EA946. Though this failure model has been developed and evaluated with adhesives, the concepts are applicable to other isotropic materials.
Fixed gain and adaptive techniques for rotorcraft vibration control
NASA Technical Reports Server (NTRS)
Roy, R. H.; Saberi, H. A.; Walker, R. A.
1985-01-01
The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed gain and adaptive control designs based on linear second-order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.
ELM control with RMP: plasma response models and the role of edge peeling response
NASA Astrophysics Data System (ADS)
Liu, Yueqiang; Ham, C. J.; Kirk, A.; Li, Li; Loarte, A.; Ryan, D. A.; Sun, Youwen; Suttrop, W.; Yang, Xu; Zhou, Lina
2016-11-01
Resonant magnetic perturbations (RMP) have been extensively demonstrated as a plausible technique for mitigating or suppressing large edge localized modes (ELMs). Associated with this has been a substantial amount of theory and modelling effort during recent years. Various models describing the plasma response to the RMP fields have been proposed in the literature and are briefly reviewed in this work. Despite their simplicity, linear response models can provide alternative criteria to the vacuum-field-based ones for guiding the choice of the coil configurations to achieve the best control of ELMs. The role of the edge peeling response to the RMP fields is illustrated as a key indicator for ELM mitigation in low-collisionality plasmas in various tokamak devices.
NASA Astrophysics Data System (ADS)
Hobbs, J.; Turmon, M.; David, C. H.; Reager, J. T., II; Famiglietti, J. S.
2017-12-01
NASA's Western States Water Mission (WSWM) combines remote sensing of the terrestrial water cycle with hydrological models to provide high-resolution state estimates for multiple variables. The effort includes both land surface and river routing models that are subject to several sources of uncertainty, including errors in the model forcing and model structural uncertainty. Computational and storage constraints prohibit extensive ensemble simulations, so this work outlines efficient but flexible approaches for estimating and reporting uncertainty. Calibrating against remote sensing and in situ data where available, we illustrate the application of these techniques in producing state estimates with associated uncertainties at kilometer-scale resolution for key variables such as soil moisture, groundwater, and streamflow.
Terrain and refractivity effects on non-optical paths
NASA Astrophysics Data System (ADS)
Barrios, Amalia E.
1994-07-01
The split-step parabolic equation (SSPE) has been used extensively to model tropospheric propagation over the sea, but recent efforts have extended this method to propagation over arbitrary terrain. At the Naval Command, Control and Ocean Surveillance Center (NCCOSC), Research, Development, Test and Evaluation Division, a split-step Terrain Parabolic Equation Model (TPEM) has been developed that takes into account variable terrain and range-dependent refractivity profiles. While TPEM has been previously shown to compare favorably with measured data and other existing terrain models, two alternative methods to model radiowave propagation over terrain, implemented within TPEM, will be presented that give a two to ten-fold decrease in execution time. These two methods are also shown to agree well with measured data.
FuGEFlow: data model and markup language for flow cytometry
Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R
2009-01-01
Background Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. Methods We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt compliant experiment description. Results The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets.
Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. Conclusion We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experiences the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange including public flow cytometry repositories currently under development. PMID:19531228
Helping Hands; Giving Volunteer Leaders a Place in the Extension Program.
ERIC Educational Resources Information Center
Strow, Helen A.
The document is a guide for extension workers, to aid them in identifying and training local volunteer leaders, thereby adding a broader dimension to the extension worker's efforts and enabling him to increase by many times the number of families he is able to reach. Leadership is defined, the importance of leaders explained, and methods for…
The Role of Evaluation in Determining the Public Value of Extension
ERIC Educational Resources Information Center
Franz, Nancy; Arnold, Mary; Baughman, Sarah
2014-01-01
Extension has developed a strong evaluation culture across the system for the last 15 years. Yet measures are still limited to the private value of programs, looking at problems in a linear way and at isolated efforts. Across the country, Extension evaluators and administrators need to step up to help answer the "so what?" question about…
Payne, Philip R.O.; Greaves, Andrew W.; Kipps, Thomas J.
2003-01-01
The Chronic Lymphocytic Leukemia (CLL) Research Consortium (CRC) consists of 9 geographically distributed sites conducting a program of research including both basic science and clinical components. To enable the CRC’s clinical research efforts, a system providing for real-time collaboration was required. CTMS provides such functionality, and demonstrates that the use of novel data modeling, web-application platforms, and management strategies provides for the deployment of an extensible, cost effective solution in such an environment. PMID:14728471
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olson, D.
2012-09-01
Organic-based solar cells offer the potential for low-cost, scalable conversion of solar energy. This project will seek to utilize the extensive organic synthetic capabilities of ConocoPhillips to produce novel acceptor and donor materials, as well as, potentially, interface modifiers, to yield improved OPV devices with greater efficiency and stability. The synthetic effort will be based on the knowledge base and modeling being done at NREL to identify new candidate materials.
Development and application of structural dynamics analysis capabilities
NASA Technical Reports Server (NTRS)
Heinemann, Klaus W.; Hozaki, Shig
1994-01-01
Extensive research activities were performed in the area of multidisciplinary modeling and simulation of aerospace vehicles that are relevant to NASA Dryden Flight Research Facility. The efforts involved theoretical development, computer coding, and debugging of the STARS code. New solution procedures were developed in such areas as structures, CFD, and graphics, among others. Furthermore, systems-oriented codes were developed for rendering the code truly multidisciplinary and rather automated in nature. Also, work was performed in pre- and post-processing of engineering analysis data.
NASA Technical Reports Server (NTRS)
Bishop, James
1991-01-01
Extensive capabilities were developed in the analysis of ultraviolet spectrometer (UVS) absorptive lightcurves. The application of these capabilities to the Voyager UVS data sets from Uranus and Neptune has provided significant findings regarding the stratospheres of these planets. In particular, the direct comparison between photochemical models and UVS measurements accomplished by these efforts is unique, and it helps to guarantee that the information returned by the Voyager 2 spacecraft is being used to the fullest extent possible.
Developing the Precision Magnetic Field for the E989 Muon g-2 Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Matthias W.
The experimental value of $(g-2)_\mu$ historically has been and contemporarily remains an important probe into the Standard Model and proposed extensions. Previous measurements of $(g-2)_\mu$ exhibit a persistent statistical tension with calculations using the Standard Model, implying that the theory may be incomplete and constraining possible extensions. The Fermilab Muon g-2 experiment, E989, endeavors to increase the precision over previous experiments by a factor of four and probe more deeply into the tension with the Standard Model. The $(g-2)_\mu$ experimental implementation measures two spin precession frequencies defined by the magnetic field: proton precession and muon precession. The value of $(g-2)_\mu$ is derived from a relationship between the two frequencies. The precision of magnetic field measurements and the overall magnetic field uniformity achieved over the muon storage volume are then two undeniably important aspects of the experiment in minimizing uncertainty. The current thesis details the methods employed to achieve magnetic field goals and the results of the effort.
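The abstract notes that $(g-2)_\mu$ is derived from a relationship between the muon and proton precession frequencies. One standard form of that relation, sketched here with an approximate literature value for the muon-to-proton magnetic moment ratio (the value and the simplified form are not taken from the thesis itself):

```python
# With R = omega_a / omega_p (ratio of the muon anomalous precession
# frequency to the proton NMR frequency in the same field) and
# lambda = mu_mu / mu_p (muon-to-proton magnetic moment ratio, measured
# independently), the anomaly is a_mu = R / (lambda - R).

LAMBDA = 3.183345  # approximate muon-to-proton magnetic moment ratio

def anomaly_from_ratio(R, lam=LAMBDA):
    """Muon anomaly a_mu from the measured frequency ratio R."""
    return R / (lam - R)

def ratio_from_anomaly(a, lam=LAMBDA):
    """Inverse relation: R = a*lambda/(1+a); useful as a consistency check."""
    return a * lam / (1.0 + a)
```

This makes concrete why the field (proton precession) and muon precession measurements carry equal weight in the error budget: both frequencies enter the ratio directly.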
Insights into mammalian biology from the wild house mouse Mus musculus
Phifer-Rixey, Megan; Nachman, Michael W
2015-01-01
The house mouse, Mus musculus, was established in the early 1900s as one of the first genetic model organisms owing to its short generation time, comparatively large litters, ease of husbandry, and visible phenotypic variants. For these reasons and because they are mammals, house mice are well suited to serve as models for human phenotypes and disease. House mice in the wild consist of at least three distinct subspecies and harbor extensive genetic and phenotypic variation both within and between these subspecies. Wild mice have been used to study a wide range of biological processes, including immunity, cancer, male sterility, adaptive evolution, and non-Mendelian inheritance. Despite the extensive variation that exists among wild mice, classical laboratory strains are derived from a limited set of founders and thus contain only a small subset of this variation. Continued efforts to study wild house mice and to create new inbred strains from wild populations have the potential to strengthen house mice as a model system. DOI: http://dx.doi.org/10.7554/eLife.05959.001 PMID:25875302
76 FR 45621 - Employment and Training Administration
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-29
... DEPARTMENT OF LABOR Employment and Training Administration Comment Request for Extension of... Administration. ACTION: Notice. SUMMARY: The Department of Labor (Department), as part of its continuing effort..., the Employment and Training Administration (ETA) is soliciting comments concerning the extension of...
ERIC Educational Resources Information Center
Paulston, Rolland G.
Socioeconomic development in China between the two world wars included an extensive mass education effort by reform-minded young Chinese. Focusing on adult literacy and rural reconstruction, Chinese educators employed various strategies to stimulate and assist self-help efforts in Chinese villages. Paramount among these educational techniques were…
Struben, Jeroen; Chan, Derek; Dubé, Laurette
2014-12-01
This paper presents a system dynamics policy model of nutritional food market transformation, tracing over-time interactions between the nutritional quality of supply, consumer food choice, population health, and governmental policy. Applied to the Canadian context and with body mass index as the primary outcome, we examine policy portfolios for obesity prevention, including (1) industry self-regulation efforts, (2) health- and nutrition-sensitive governmental policy, and (3) efforts to foster health- and nutrition-sensitive innovation. This work provides novel theoretical and practical insights on drivers of nutritional market transformations, highlighting the importance of integrative policy portfolios to simultaneously shift food demand and supply for successful and self-sustaining nutrition and health sensitivity. We discuss model extensions for deeper and more comprehensive linkages of nutritional food market transformation with supply, demand, and policy in agrifood and health/health care. These aim toward system design and policy that can proactively, and with greater impact, scale, and resilience, address single as well as double malnutrition in varying country settings. © 2014 New York Academy of Sciences.
ATTDES: An Expert System for Satellite Attitude Determination and Control. 2
NASA Technical Reports Server (NTRS)
Mackison, Donald L.; Gifford, Kevin
1996-01-01
The design, analysis, and flight operations of satellite attitude determination and attitude control systems require extensive mathematical formulations, optimization studies, and computer simulation. This is best done by an analyst with extensive education and experience. The development of programs such as ATTDES permits the use of advanced techniques by those with less experience. Typical tasks include the mission analysis to select stabilization and damping schemes, attitude determination sensors and algorithms, and control system designs to meet program requirements. ATTDES is a system that includes all of these activities, including high-fidelity orbit environment models that can be used for preliminary analysis, parameter selection, stabilization schemes, the development of estimators, covariance analyses, and optimization, and it can support ongoing orbit activities. The modification of existing simulations to model new configurations for these purposes can be an expensive, time-consuming activity that becomes a pacing item in the development and operation of such new systems. The use of an integrated tool such as ATTDES significantly reduces the effort and time required for these tasks.
A Holistic Approach to Systems Development
NASA Technical Reports Server (NTRS)
Wong, Douglas T.
2008-01-01
Introduces a holistic and iterative design process: a continuous process that can be loosely divided into four stages, with more effort spent early in the design. The approach is human-centered and multidisciplinary, emphasizes life-cycle cost, and makes extensive use of modeling, simulation, mockups, human subjects, and proven technologies. Human-centered design doesn't mean the human factors discipline is the most important; many disciplines should be involved in the design: subsystem vendors, configuration management, operations research, manufacturing engineering, simulation/modeling, cost engineering, hardware engineering, software engineering, test and evaluation, human factors, electromagnetic compatibility, integrated logistics support, reliability/maintainability/availability, safety engineering, test equipment, training systems, design-to-cost, life-cycle cost, application engineering, etc.
Direct hydrocarbon identification using AVO analysis in the Malay Basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lye, Y.C.; Yaacob, M.R.; Birkett, N.E.
1994-07-01
Esso Production Malaysia Inc. and Petronas Carigali Sdn. Bhd. have been conducting AVO (amplitude versus offset) processing and interpretation since April 1991 in an attempt to identify hydrocarbon fluids predrill. The major part of this effort was in the PM-5, PM-8, and PM-9 contract areas where an extensive exploration program is underway. To date, more than 1000 km of seismic data have been analyzed using the AVO technique, and the results were used to support the drilling of more than 50 exploration and delineation wells. Gather modeling of well data was used to calibrate and predict the presence of hydrocarbon in proposed well locations. In order to generate accurate gather models, a geophysical properties and AVO database was needed, and great effort was spent in producing an accurate and complete database. This database is continuously being updated so that an experience file can be built to further improve the reliability of the AVO prediction.
Comparative Approaches to Understanding the Relation Between Aging and Physical Function
Cesari, Matteo; Seals, Douglas R.; Shively, Carol A.; Carter, Christy S.
2016-01-01
Despite dedicated efforts to identify interventions to delay aging, most promising interventions yielding dramatic life-span extension in animal models of aging are often ineffective when translated to clinical trials. This may be due to differences in primary outcomes between species and difficulties in determining the optimal clinical trial paradigms for translation. Measures of physical function, including brief standardized testing batteries, are currently being proposed as biomarkers of aging in humans, are predictive of adverse health events, disability, and mortality, and are commonly used as functional outcomes for clinical trials. Motor outcomes are now being incorporated into preclinical testing, a positive step toward enhancing our ability to translate aging interventions to clinical trials. To further these efforts, we begin a discussion of physical function and disability assessment across species, with special emphasis on mice, rats, monkeys, and man. By understanding how physical function is assessed in humans, we can tailor measurements in animals to better model those outcomes to establish effective, standardized translational functional assessments with aging. PMID:25910845
Haptic communication between humans is tuned by the hard or soft mechanics of interaction
Usai, Francesco; Ganesh, Gowrishankar; Sanguineti, Vittorio; Burdet, Etienne
2018-01-01
To move a hard table together, humans may coordinate by following the dominant partner’s motion [1–4], but this strategy is unsuitable for a soft mattress where the perceived forces are small. How do partners readily coordinate in such differing interaction dynamics? To address this, we investigated how pairs tracked a target using flexion-extension of their wrists, which were coupled by a hard, medium or soft virtual elastic band. Tracking performance monotonically increased with a stiffer band for the worse partner, who had higher tracking error, at the cost of the skilled partner’s muscular effort. This suggests that the worse partner followed the skilled one’s lead, but simulations show that the results are better explained by a model where partners share movement goals through the forces, whilst the coupling dynamics determine the capacity of communicable information. This model elucidates the versatile mechanism by which humans can coordinate during both hard and soft physical interactions to ensure maximum performance with minimal effort. PMID:29565966
Link Connectivity and Coverage of Underwater Cognitive Acoustic Networks under Spectrum Constraint
Wang, Qiu; Cheang, Chak Fong
2017-01-01
Extensive attention has been given to the use of cognitive radio technology in underwater acoustic networks since the acoustic spectrum became scarce due to the proliferation of human aquatic activities. Most of the recent studies on underwater cognitive acoustic networks (UCANs) mainly focus on spectrum management or protocol design. Few efforts have addressed the quality-of-service (QoS) of UCANs. In UCANs, secondary users (SUs) have lower priority to use acoustic spectrum than primary users (PUs) with higher priority to access spectrum. As a result, the QoS of SUs is difficult to ensure in UCANs. This paper proposes an analytical model to investigate the link connectivity and the probability of coverage of SUs in UCANs. In particular, this model takes both topological connectivity and spectrum availability into account, though spectrum availability has been ignored in most recent studies. We conduct extensive simulations to evaluate the effectiveness and the accuracy of our proposed model. Simulation results show that our proposed model is quite accurate. Besides, our results also imply that the link connectivity and the probability of coverage of SUs heavily depend on both the underwater acoustic channel conditions and the activities of PUs. PMID:29215561
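The abstract's central point is that SU link quality couples topological connectivity with spectrum availability. As a deliberately simple toy (this is not the paper's analytical model; the uniform-placement and independence assumptions are hypothetical), the two factors can be sketched as a product of probabilities:

```python
# Toy UCAN link model: an SU link is usable only if the receiver is within
# acoustic range AND the primary user leaves the band free. Assumes the
# receiver is placed uniformly at random in a disk of radius R around the
# sender, and PU activity is independent with a given duty cycle.

def p_neighbor_in_range(r, R):
    """P(random node lies within communication range r), uniform in disk of radius R."""
    return min(r / R, 1.0) ** 2  # area ratio pi*r^2 / pi*R^2

def p_su_link(r, R, pu_duty_cycle):
    """Joint probability: topological connectivity x spectrum availability."""
    return p_neighbor_in_range(r, R) * (1.0 - pu_duty_cycle)
```

Even this caricature reproduces the abstract's qualitative claim: SU coverage degrades with both shrinking acoustic range (channel conditions) and rising PU activity, and ignoring the spectrum factor overestimates connectivity.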
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundararaman, Ravishankar; Gunceler, Deniz; Arias, T. A.
2014-10-07
Continuum solvation models enable efficient first principles calculations of chemical reactions in solution, but require extensive parametrization and fitting for each solvent and class of solute systems. Here, we examine the assumptions of continuum solvation models in detail and replace empirical terms with physical models in order to construct a minimally-empirical solvation model. Specifically, we derive solvent radii from the nonlocal dielectric response of the solvent from ab initio calculations, construct a closed-form and parameter-free weighted-density approximation for the free energy of the cavity formation, and employ a pair-potential approximation for the dispersion energy. We show that the resulting model with a single solvent-independent parameter, the electron density threshold (n_c), and a single solvent-dependent parameter, the dispersion scale factor (s_6), reproduces solvation energies of organic molecules in water, chloroform, and carbon tetrachloride with RMS errors of 1.1, 0.6 and 0.5 kcal/mol, respectively. We additionally show that fitting the solvent-dependent s_6 parameter to the solvation energy of a single non-polar molecule does not substantially increase these errors. Parametrization of this model for other solvents, therefore, requires minimal effort and is possible without extensive databases of experimental solvation free energies.
Benchmark Simulation Model No 2: finalisation of plant layout and default control strategy.
Nopens, I; Benedetti, L; Jeppsson, U; Pons, M-N; Alex, J; Copp, J B; Gernaey, K V; Rosen, C; Steyer, J-P; Vanrolleghem, P A
2010-01-01
The COST/IWA Benchmark Simulation Model No 1 (BSM1) has been available for almost a decade. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the research work related to the benchmark simulation models has resulted in more than 300 publications worldwide demonstrates the interest in and need of such tools within the research community. Recent efforts within the IWA Task Group on "Benchmarking of control strategies for WWTPs" have focused on an extension of the benchmark simulation model. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently, includes both pretreatment of wastewater as well as the processes describing sludge treatment. The motivation for the extension is the increasing interest and need to operate and control wastewater treatment systems not only at an individual process level but also on a plant-wide basis. To facilitate the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one week BSM1 evaluation period. In this paper, the finalised plant layout is summarised and, as was done for BSM1, a default control strategy is proposed. A demonstration of how BSM2 can be used to evaluate control strategies is also given.
Weak data do not make a free lunch, only a cheap meal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Zhipu; Rajashankar, Kanagalaghatta; Dauter, Zbigniew
2014-01-17
Four data sets were processed at resolutions significantly exceeding the criteria traditionally used for estimating the diffraction data resolution limit. The analysis of these data and the corresponding model-quality indicators suggests that the criteria of resolution limits widely adopted in the past may be somewhat conservative. Various parameters, such as R_merge and I/σ(I), optical resolution and the correlation coefficients CC_1/2 and CC*, can be used for judging the internal data quality, whereas the reliability factors R and R_free as well as the maximum-likelihood target values and real-space map correlation coefficients can be used to estimate the agreement between the data and the refined model. However, none of these criteria provide a reliable estimate of the data resolution cutoff limit. The analysis suggests that extension of the maximum resolution by about 0.2 Å beyond the currently adopted limit where the I/σ(I) value drops to 2.0 does not degrade the quality of the refined structural models, but may sometimes be advantageous. Such an extension may be particularly beneficial for significantly anisotropic diffraction. Extension of the maximum resolution at the stage of data collection and structure refinement is cheap in terms of the required effort and is definitely more advisable than accepting a too conservative resolution cutoff, which is unfortunately quite frequent among the crystal structures deposited in the Protein Data Bank.
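The CC_1/2 and CC* statistics mentioned above are conventionally related by the Karplus-Diederichs formula CC* = sqrt(2·CC_1/2 / (1 + CC_1/2)), which estimates the correlation of the merged data with the underlying true signal. A minimal sketch:

```python
import math

def cc_star(cc_half):
    """Estimate CC* (correlation of merged data with the true signal)
    from CC_1/2 (half-dataset correlation), per the standard
    Karplus-Diederichs relation: CC* = sqrt(2*CC_1/2 / (1 + CC_1/2))."""
    return math.sqrt(2.0 * cc_half / (1.0 + cc_half))
```

For example, a weak high-resolution shell with CC_1/2 = 0.3 still implies CC* of roughly 0.68, one quantitative argument for extending the cutoff beyond the point where I/σ(I) drops to 2.0.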
EXTENSION EDUCATION SYMPOSIUM: reinventing extension as a resource--what does the future hold?
Mirando, M A; Bewley, J M; Blue, J; Amaral-Phillips, D M; Corriher, V A; Whittet, K M; Arthur, N; Patterson, D J
2012-10-01
The mission of the Cooperative Extension Service, as a component of the land-grant university system, is to disseminate new knowledge and to foster its application and use. Opportunities and challenges facing animal agriculture in the United States have changed dramatically over the past few decades and require the use of new approaches and emerging technologies that are available to extension professionals. Increased federal competitive grant funding for extension, the creation of eXtension, the development of smartphone and related electronic technologies, and the rapidly increasing popularity of social media created new opportunities for extension educators to disseminate knowledge to a variety of audiences and engage these audiences in electronic discussions. Competitive grant funding opportunities for extension efforts to advance animal agriculture became available from the USDA National Institute of Food and Agriculture (NIFA) and have increased dramatically in recent years. The majority of NIFA funding opportunities require extension efforts to be integrated with research, and NIFA encourages the use of eXtension and other cutting-edge approaches to extend research to traditional clientele and nontraditional audiences. A case study is presented to illustrate how research and extension were integrated to improve the adoption of AI by beef producers. Those in agriculture increasingly turn to social media venues such as Facebook, YouTube, LinkedIn, and Twitter to access information required to support their enterprises. Use of these various approaches by extension educators requires appreciation of the technology and an understanding of how the target audiences access information available on social media.
Technology to deliver information is changing rapidly, and Cooperative Extension Service professionals will need to continuously evaluate digital technology and social media tools to appropriately integrate them into learning and educational opportunities.
Hindcasting of Storm Surges, Currents, and Waves at Lower Delaware Bay during Hurricane Isabel
NASA Astrophysics Data System (ADS)
Salehi, M.
2017-12-01
Hurricanes are a major threat to coastal communities and infrastructures, including nuclear power plants located in low-lying coastal zones. In response, their sensitive elements should be protected by smart design to withstand the drastic impact of such natural phenomena. An accurate and reliable estimate of hurricane attributes is the first step in that effort. Numerical models have advanced greatly over the past few years and are effective tools for modeling large-scale natural events such as hurricanes. The impact of low probability hurricanes on the lower Delaware Bay is investigated using dynamically coupled meteorological, hydrodynamic, and wave components of Delft3D software. Efforts are made to significantly reduce the computational burden of performing such analysis for the industry, yet keeping the same level of accuracy at the area of study (AOS). The model is comprised of overall and nested domains. The overall model domain includes portions of the Atlantic Ocean, Delaware, and Chesapeake bays. The nested model domain includes Delaware Bay, its floodplain, and a portion of the continental shelf. This study is part of a larger modeling effort to study the impact of low probability hurricanes on sensitive infrastructures located at coastal zones prone to hurricane activity. The AOS is located on the east bank of Delaware Bay almost 16 miles upstream of its mouth. Model-generated wind speed, significant wave height, water surface elevation, and current are calibrated for hurricane Isabel (2003). The model calibration results agreed reasonably well with field observations. Furthermore, sensitivity of surge and wave responses to various hurricane parameters was tested. In line with findings from other researchers, accuracy of the wind field played a major role in hindcasting the hurricane attributes.
Two-stage fuzzy-stochastic robust programming: a hybrid model for regional air quality management.
Li, Yongping; Huang, Guo H; Veawab, Amornvadee; Nie, Xianghui; Liu, Lei
2006-08-01
In this study, a hybrid two-stage fuzzy-stochastic robust programming (TFSRP) model is developed and applied to the planning of an air-quality management system. As an extension of existing fuzzy-robust programming and two-stage stochastic programming methods, the TFSRP can explicitly address complexities and uncertainties of the study system without unrealistic simplifications. Uncertain parameters can be expressed as probability density and/or fuzzy membership functions, such that robustness of the optimization efforts can be enhanced. Moreover, economic penalties as corrective measures against any infeasibilities arising from the uncertainties are taken into account. This method can, thus, provide a linkage to predefined policies determined by authorities that have to be respected when a modeling effort is undertaken. In its solution algorithm, the fuzzy decision space can be delimited through specification of the uncertainties using dimensional enlargement of the original fuzzy constraints. The developed model is applied to a case study of regional air quality management. The results indicate that reasonable solutions have been obtained. The solutions can be used for further generating pollution-mitigation alternatives with minimized system costs and for providing a more solid support for sound environmental decisions.
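The two-stage structure described here (a first-stage decision, then economic penalties as recourse against infeasibilities revealed by uncertain outcomes) can be illustrated with a deliberately tiny numeric sketch; the scenario values, costs, and grid search below are hypothetical and are not the TFSRP model itself:

```python
import numpy as np

# First-stage decision: treatment capacity x (cost c1 per unit).
# Scenarios: uncertain pollutant load w with probability p.
# Second-stage recourse: shortfall max(w - x, 0) incurs penalty c2 > c1.
c1, c2 = 10.0, 30.0
loads = np.array([40.0, 60.0, 90.0])   # hypothetical scenario loads
probs = np.array([0.3, 0.5, 0.2])      # scenario probabilities

def expected_cost(x):
    """First-stage cost plus expected second-stage penalty."""
    shortfall = np.maximum(loads - x, 0.0)
    return c1 * x + probs @ (c2 * shortfall)

# Crude grid search over the first-stage decision.
xs = np.linspace(0.0, 100.0, 1001)
costs = np.array([expected_cost(x) for x in xs])
x_best = xs[np.argmin(costs)]
print(f"optimal capacity ~ {x_best:.1f}, expected cost ~ {costs.min():.1f}")
```

The optimum sits where the marginal capacity cost balances the expected marginal penalty, the same trade-off the full model resolves under fuzzy and stochastic uncertainty simultaneously.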
Observational Constraints for Modeling Diffuse Molecular Clouds
NASA Astrophysics Data System (ADS)
Federman, S. R.
2014-02-01
Ground-based and space-borne observations of diffuse molecular clouds suggest a number of areas where further improvements to modeling efforts are warranted. I will highlight those that have the widest applicability. The range in CO fractionation caused by selective isotope photodissociation, in particular the large 12C16O/13C16O ratios observed toward stars in Ophiuchus, is not reproduced well by current models. Our ongoing laboratory measurements of oscillator strengths and predissociation rates for Rydberg transitions in CO isotopologues may help clarify the situation. The CH+ abundance continues to draw attention. Small scale structure seen toward ζ Per may provide additional constraints on the possible synthesis routes. The connection between results from optical transitions and those from radio and sub-millimeter wave transitions requires further effort. A study of OH+ and OH toward background stars reveals that these species favor different environments. This brings to focus the need to model each cloud along the line of sight separately, and to allow the physical conditions to vary within an individual cloud, in order to gain further insight into the chemistry. Now that an extensive set of data on molecular excitation is available, the models should seek to reproduce these data to place further constraints on the modeling results.
An MBSE Approach to Space Suit Development
NASA Technical Reports Server (NTRS)
Cordova, Lauren; Kovich, Christine; Sargusingh, Miriam
2012-01-01
The EVA/Space Suit Development Office (ESSD) Systems Engineering and Integration (SE&I) team has utilized MBSE in multiple programs. After developing operational and architectural models, the MBSE framework was expanded to link the requirements space to the system models through functional analysis and interfaces definitions. By documenting all the connections within the technical baseline, ESSD experienced significant efficiency improvements in analysis and identification of change impacts. One of the biggest challenges presented to the MBSE structure was a program transition and restructuring effort, which was completed successfully in 4 months culminating in the approval of a new EVA Technical Baseline. During this time three requirements sets spanning multiple DRMs were streamlined into one NASA-owned Systems Requirement Document (SRD) that successfully identified requirements relevant to the current hardware development effort while remaining extensible to support future hardware developments. A capability-based hierarchy was established to provide a more flexible framework for future space suit development that can support multiple programs with minimal rework of basic EVA/Space Suit requirements. This MBSE approach was most recently applied for generation of an EMU Demonstrator technical baseline being developed for an ISS DTO. The relatively quick turnaround of operational concepts, architecture definition, and requirements for this new suit development has allowed us to test and evolve the MBSE process and framework in an extremely different setting while still offering extensibility and traceability throughout ESSD projects. The ESSD MBSE framework continues to be evolved in order to support integration of all products associated with the SE&I engine.
Effective Regional Community Development
ERIC Educational Resources Information Center
Nesbitt, Rebecca; Merkowitz, Rose Fisher
2014-01-01
Times are changing, and so are Extension programs. These changes affect every aspect of the educational effort, including program development, project funding, educational delivery, partnership building, marketing, sharing impacts, and revenue generation. This article is not about how Extension is restructuring to adapt to changes; instead, it…
78 FR 70584 - Extension of Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... establishes arrangements to protect the rights of affected transit employees. Federal law requires such... DEPARTMENT OF LABOR Office of Labor-Management Standards Extension of Information Collection; Comment Request ACTION: Notice. SUMMARY: The Department of Labor, as part of its continuing effort to...
Towards Systematic Benchmarking of Climate Model Performance
NASA Astrophysics Data System (ADS)
Gleckler, P. J.
2014-12-01
The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research.
Making the results from routine performance tests readily accessible will help advance a more transparent model evaluation process.
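A routine benchmark of the kind described typically reduces a model field to a few summary statistics against a reference dataset. The sketch below computes an area-weighted global-mean bias and centered RMSE on a synthetic temperature field; the grid, values, and "model" are illustrative stand-ins, not CMIP output:

```python
import numpy as np

# Area-weighted bias and centered RMSE of a model field vs a reference
# field on a 36 x 72 lat-lon grid (all values synthetic).
lat = np.linspace(-87.5, 87.5, 36)
w = np.cos(np.radians(lat))[:, None] * np.ones((36, 72))
w /= w.sum()                                    # normalized area weights

rng = np.random.default_rng(42)
obs = 288.0 + 30.0 * np.cos(np.radians(lat))[:, None] * np.ones((36, 72))
model = obs + rng.normal(0.5, 1.0, obs.shape)   # synthetic model: +0.5 K bias

bias = float(np.sum(w * (model - obs)))
centered_rmse = float(np.sqrt(np.sum(w * ((model - obs) - bias) ** 2)))
print(f"bias = {bias:.2f} K, centered RMSE = {centered_rmse:.2f} K")
```

Separating the bias from the centered error is a common convention in model benchmarking (e.g. Taylor diagrams), since the two diagnose different kinds of model error.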
Robotics-based synthesis of human motion.
Khatib, O; Demircan, E; De Sapio, V; Sentis, L; Besier, T; Delp, S
2009-01-01
The synthesis of human motion is a complex procedure that involves accurate reconstruction of movement sequences, modeling of musculoskeletal kinematics, dynamics and actuation, and characterization of reliable performance criteria. Many of these processes have much in common with the problems found in robotics research. Task-based methods used in robotics may be leveraged to provide novel musculoskeletal modeling methods and physiologically accurate performance predictions. In this paper, we present (i) a new method for the real-time reconstruction of human motion trajectories using direct marker tracking, (ii) a task-driven muscular effort minimization criterion and (iii) new human performance metrics for dynamic characterization of athletic skills. Dynamic motion reconstruction is achieved through the control of a simulated human model to follow the captured marker trajectories in real-time. The operational space control and real-time simulation provide human dynamics at any configuration of the performance. A new criterion of muscular effort minimization has been introduced to analyze human static postures. Extensive motion capture experiments were conducted to validate the new minimization criterion. Finally, new human performance metrics were introduced to study an athletic skill in detail. These metrics include the effort expenditure and the feasible set of operational space accelerations during the performance of the skill. The dynamic characterization takes into account skeletal kinematics as well as muscle routing kinematics and force generating capacities. The developments draw upon an advanced musculoskeletal modeling platform and a task-oriented framework for the effective integration of biomechanics and robotics methods.
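The muscle-redundancy problem behind an effort-minimization criterion can be sketched very compactly. The toy model below (moment arms, torque, and the squared-force criterion are all illustrative assumptions, not the paper's actual formulation) resolves an underdetermined torque equation by the minimum-norm solution:

```python
import numpy as np

# Toy single-joint model: three muscles with moment arms r (m) must
# jointly produce a net torque tau (N*m). The system r @ f = tau is
# underdetermined; the Moore-Penrose pseudoinverse gives the unique
# solution minimizing the sum of squared muscle forces, a common
# "least effort" criterion (values here are illustrative only).
r = np.array([[0.05, 0.03, 0.02]])   # 1 joint x 3 muscles
tau = np.array([2.0])                # required joint torque

f_min = np.linalg.pinv(r) @ tau      # minimum-effort force distribution
print("muscle forces (N):", np.round(f_min, 1))
```

Any other force combination satisfying the torque (e.g. loading only the first muscle) has a strictly larger effort norm, which is what makes the minimum-norm distribution a natural static-posture criterion.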
ERIC Educational Resources Information Center
MANN, OPAL H.
A study was made of the need for extension work with low income families in eastern Kentucky (Appalachia) and of the problems and training needs of home demonstration extension agents who work with these families. The agents felt they had a responsibility to help low income families in budgeting time, effort, and resources to meet minimum…
ERIC Educational Resources Information Center
Harder, Amy; Moore, Austen; Mazurkewicz, Melissa; Benge, Matt
2013-01-01
Needs assessments are an important tool for informing organizational development efforts in Extension. The purpose of the study reported here was to identify problems faced by county units within UF/IFAS Extension during county program reviews. The findings were drawn from the reports created after five county units experienced program reviews in…
Conditioning of MVM '73 radio-tracking data
NASA Technical Reports Server (NTRS)
Koch, R. E.; Chao, C. C.; Winn, F. B.; Yip, K. W.
1974-01-01
An extensive effort was undertaken to edit Mariner 10 radiometric tracking data. Interactive computer graphics were used for the first time by an interplanetary mission to detect blunder points and spurious signatures in the data. Interactive graphics improved the former process time by a factor of 10 to 20, while increasing reliability. S/X dual Doppler data was used for the first time to calibrate charged particles in the tracking medium. Application of the charged particle calibrations decreased the orbit determination error for a short data arc following the 16 March 1974 maneuver by about 80%. A new model was developed to calibrate the earth's troposphere with seasonal adjustments. The new model has a 2% accuracy and is 5 times better than the old model.
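The seasonal troposphere calibration mentioned here can be illustrated with a minimal functional form: a mean zenith delay plus an annual sinusoid. The numbers and the function itself are hypothetical stand-ins, not the actual Mariner 10 model:

```python
import math

# Hypothetical sketch (not the actual JPL model): zenith tropospheric
# delay as a mean term plus an annual "seasonal adjustment".
def zenith_delay(day_of_year, mean_m=2.3, amp_m=0.05, phase_day=28):
    """Zenith tropospheric delay in meters for a given day of year."""
    return mean_m + amp_m * math.cos(
        2.0 * math.pi * (day_of_year - phase_day) / 365.25)

# Delay peaks near the phase day and is smallest about half a year later.
```

A mapping function would then scale this zenith value to the actual elevation angle of the tracking pass.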
Perceptual asymmetry in texture perception.
Williams, D; Julesz, B
1992-07-15
A fundamental property of human visual perception is our ability to distinguish between textures. A concerted effort has been made to account for texture segregation in terms of linear spatial filter models and their nonlinear extensions. However, for certain texture pairs the ease of discrimination changes when the roles of figure and ground are reversed. This asymmetry poses a problem for both linear and nonlinear models. We have isolated a property of texture perception that can account for this asymmetry in discrimination: subjective closure. This property, which is also responsible for visual illusions, appears to be explainable by early visual processes alone. Our results force a reexamination of the process of human texture segregation and of some recent models that were introduced to explain it.
The Politics of Extension Water Programming: Determining if Affiliation Impacts Participation
ERIC Educational Resources Information Center
Owens, Courtney T.; Lamm, Alexa J.
2017-01-01
Research has found levels of engagement in environmental behaviors and participation in Extension programming around environmental issues are directly associated with political affiliation. Democrat and Independent parties encourage members to vote for stricter environmental regulations, such as water conservation efforts, while Republicans…
Representing annotation compositionality and provenance for the Semantic Web
2013-01-01
Background Though the annotation of digital artifacts with metadata has a long history, the bulk of that work focuses on the association of single terms or concepts to single targets. As annotation efforts expand to capture more complex information, annotations will need to be able to refer to knowledge structures formally defined in terms of more atomic knowledge structures. Existing provenance efforts in the Semantic Web domain primarily focus on tracking provenance at the level of whole triples and do not provide enough detail to track how individual triple elements of annotations were derived from triple elements of other annotations. Results We present a task- and domain-independent ontological model for capturing annotations and their linkage to their denoted knowledge representations, which can be singular concepts or more complex sets of assertions. We have implemented this model as an extension of the Information Artifact Ontology in OWL and made it freely available, and we show how it can be integrated with several prominent annotation and provenance models. We present several application areas for the model, ranging from linguistic annotation of text to the annotation of disease-associations in genome sequences. Conclusions With this model, progressively more complex annotations can be composed from other annotations, and the provenance of compositional annotations can be represented at the annotation level or at the level of individual elements of the RDF triples composing the annotations. This in turn allows for progressively richer annotations to be constructed from previous annotation efforts, the precise provenance recording of which facilitates evidence-based inference and error tracking. PMID:24268021
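The element-level provenance idea described in the abstract, where individual triple elements of a compositional annotation record which earlier annotations they were derived from, can be sketched with plain data structures. The class names, identifiers, and example triples below are illustrative inventions, not the ontology's actual terms:

```python
from dataclasses import dataclass, field

# Hypothetical miniature: each annotation is a set of triples, and each
# triple element can record the earlier annotation(s) it derives from.

@dataclass(frozen=True)
class Element:
    value: str
    derived_from: tuple = ()       # ids of source annotations

@dataclass
class Annotation:
    ann_id: str
    triples: list = field(default_factory=list)   # (subj, pred, obj) Elements

# A base annotation asserting a gene-disease association.
a1 = Annotation("ann:1", [(Element("gene:BRCA1"),
                           Element("rel:associated_with"),
                           Element("disease:breast_cancer"))])

# A compositional annotation reusing the subject element of ann:1.
a2 = Annotation("ann:2", [(Element("gene:BRCA1", derived_from=("ann:1",)),
                           Element("rel:located_on"),
                           Element("chr:17"))])

def provenance(annotation):
    """Map each derived triple element to the annotations it came from."""
    return {e.value: e.derived_from
            for t in annotation.triples for e in t if e.derived_from}

print(provenance(a2))
```

This is finer-grained than whole-triple provenance: here only the subject of ann:2 carries a derivation link, which is exactly the distinction the paper draws against triple-level Semantic Web provenance models.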
Cognitive engineering models in space systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1993-01-01
NASA space systems, including mission operations on the ground and in space, are complex, dynamic, predominantly automated systems in which the human operator is a supervisory controller. Models of cognitive functions in complex systems are needed to describe human performance and form the theoretical basis of operator workstation design, including displays, controls, and decision aids. Currently, there are several candidate modeling methodologies. They include the Rasmussen abstraction/aggregation hierarchy and decision ladder, the goal-means network, the problem behavior graph, and the operator function model. The research conducted under the sponsorship of this grant focuses on the extension of the theoretical structure of the operator function model and its application to NASA Johnson mission operations and space station applications. The initial portion of this research consists of two parts. The first is a series of technical exchanges between NASA Johnson and Georgia Tech researchers. The purpose is to identify candidate applications for the current operator function model; prospects include mission operations and the Data Management System Testbed. The second portion will address extensions of the operator function model to tailor it to the specific needs of Johnson applications. At this point, we have accomplished two things. During a series of conversations with JSC researchers, we have defined the technical goal of the research supported by this grant to be the structural definition of the operator function model and its computer implementation, OFMspert. Both the OFM and OFMspert have matured to the point that they require infrastructure to facilitate use by researchers not involved in the evolution of the tools. The second accomplishment this year was the identification of the Payload Deployment and Retrieval System (PDRS) as a candidate system for the case study.
In conjunction with government and contractor personnel in the Human-Computer Interaction Lab, the PDRS was identified as the most accessible system for the demonstration. Pursuant to this a PDRS simulation was obtained from the HCIL and an initial knowledge engineering effort was conducted to understand the operator's tasks in the PDRS application. The preliminary results of the knowledge engineering effort and an initial formulation of an operator function model (OFM) are contained in the appendices.
Powers, W; Cockett, N; Lardy, G
2017-04-01
Managing the demands of an academic appointment in extension can be a challenging task. Demands from constituent groups, expectations of supervisors, and rigors of promotion and tenure processes can create pressures that young faculty did not expect. Throw in spousal and family duties and you have created a situation that many will find hard to navigate. However, there are ways to cope and, even better news, there are ways to excel in meeting the demands of an academic appointment and enjoying life. Because many new extension faculty members do not have prior experience in extension, best practices in documenting programs and extension scholarship over the pretenure period are provided in this paper. Appointments that include both research and extension are quite common at many land grant universities. The advantages of joint appointments are numerous and include the fact that more and more grant agencies are seeking integrated research, teaching, and/or extension projects. However, the time demands of joint appointments can be challenging. Joint appointments can be designed to help faculty members conduct important translational research and have it be applied in a production setting. By seeking commonalities in research and extension efforts, joint appointments can be very synergistic. Development of highly successful programs requires planning on the front end with an emphasis on an in-depth needs assessment to determine stakeholder needs for both research and extension. Impact assessment should be part of this planning effort. Performing as a successful extension faculty member while maintaining relationships outside of work is challenging and requires deliberate effort on the part of employees and supervisors to realize there is more to life than work. Some authors have referred to this as work-life balance, but it may be more helpful to think of it as work-life effectiveness. 
To do this, one needs to 1) define what success looks like, 2) set boundaries and maintain control including control of your schedule, and 3) find time to ensure your physical, emotional, and spiritual well-being are nurtured in addition to your professional development. In summary, extension careers can be challenging at times as demands and expectations of stakeholders, supervisors, and rigors of the tenure system create formidable obstacles. However, by keeping a focus on the priorities of the position and looking for synergy in research and extension work, they can actually be quite enjoyable and very rewarding.
Mechanical Analysis of W78/88-1 Life Extension Program Warhead Design Options
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Nathan
2014-09-01
Life Extension Program (LEP) is a program to repair/replace components of nuclear weapons to ensure the ability to meet military requirements. The W78/88-1 LEP encompasses the modernization of two major nuclear weapon reentry systems into an interoperable warhead. Several design concepts exist to provide different options for robust safety and security themes, maximum non-nuclear commonality, and cost. Simulation is one capability used to evaluate the mechanical performance of the designs in various operational environments, plan for system and component qualification efforts, and provide insight into the survivability of the warhead in environments that are not currently testable. The simulation efforts use several Sandia-developed tools through the Advanced Simulation and Computing program, including Cubit for mesh generation, the DART Model Manager, SIERRA codes running on the HPC TLCC2 platforms, DAKOTA, and ParaView. Several programmatic objectives were met using the simulation capability, including: (1) providing early environmental specification estimates that may be used by component designers to understand the severity of the loads their components will need to survive, (2) providing guidance for load levels and configurations for subassembly tests intended to represent operational environments, and (3) recommending design options including modified geometry and material properties. These objectives were accomplished through regular interactions with component, system, and test engineers while using the laboratory's computational infrastructure to effectively perform ensembles of simulations. Because NNSA has decided to defer the LEP program, simulation results are being documented and models are being archived for future reference. However, some advanced and exploratory efforts will continue to mature key technologies, using the results from these and ongoing simulations for design insights, test planning, and model validation.
Basic Modeling of the Solar Atmosphere and Spectrum
NASA Technical Reports Server (NTRS)
Avrett, Eugene H.; Wagner, William J. (Technical Monitor)
2000-01-01
During the last three years we have continued the development of extensive computer programs for constructing realistic models of the solar atmosphere and for calculating detailed spectra to use in the interpretation of solar observations. This research involves two major interrelated efforts: work by Avrett and Loeser on the Pandora computer program for optically thick non-LTE modeling of the solar atmosphere including a wide range of physical processes, and work by Kurucz on the detailed high-resolution synthesis of the solar spectrum using data for over 58 million atomic and molecular lines. Our objective is to construct atmospheric models from which the calculated spectra agree as well as possible with high-and low-resolution observations over a wide wavelength range. Such modeling leads to an improved understanding of the physical processes responsible for the structure and behavior of the atmosphere.
DOE Office of Scientific and Technical Information (OSTI.GOV)
English, Shawn Allen; Nelson, Stacy Michelle; Briggs, Timothy
Presented is a model verification and validation effort using low-velocity impact (LVI) experiments on carbon fiber reinforced polymer laminates. A flat cylindrical indenter impacts the laminate with enough energy to produce delamination, matrix cracks and fiber breaks. Included in the experimental efforts are ultrasonic scans of the damage for qualitative validation of the models. However, the primary quantitative metrics of validation are the force time history measured through the instrumented indenter and the initial and final velocities. The simulations, which are run on Sandia's Sierra finite element codes, consist of all physics and material parameters of importance as determined by a sensitivity analysis conducted on the LVI simulation. A novel orthotropic damage and failure constitutive model that is capable of predicting progressive composite damage and failure is described in detail, and material properties are measured, estimated from micromechanics or optimized through calibration. A thorough verification and calibration to the accompanying experiments are presented. Special emphasis is given to the four-point bend experiment. For all simulations of interest, the mesh and material behavior is verified through extensive convergence studies. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output. The result is a quantifiable confidence in material characterization and model physics when simulating this phenomenon in structures of interest.
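The ensemble approach in the last step, sampling uncertain model parameters, running the model for each sample, and comparing the predicted response distribution to a measurement, can be sketched generically. The surrogate "simulation" and all numbers below are invented for illustration and have nothing to do with the actual Sierra constitutive model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "simulation": peak contact force of an elastic impact as a
# function of uncertain stiffness k and impactor mass m (toy surrogate,
# F_peak = v * sqrt(k * m); not the actual finite element model).
v = 3.0  # impact velocity, m/s

def simulate(k, m):
    return v * np.sqrt(k * m)

# Sample model parameters from assumed uncertainty distributions.
k = rng.normal(2.0e6, 2.0e5, size=500)   # stiffness N/m, ~10% scatter
m = rng.normal(0.5, 0.02, size=500)      # mass kg, ~4% scatter
ensemble = simulate(k, m)

# Compare the predicted distribution to a (hypothetical) measurement.
lo, hi = np.percentile(ensemble, [2.5, 97.5])
measured_peak = 3.1e3  # hypothetical experimental peak force, N
print(f"95% prediction interval: [{lo:.0f}, {hi:.0f}] N; "
      f"measured {measured_peak:.0f} N")
```

A measurement falling inside the prediction interval is the kind of quantitative evidence of validation the abstract refers to; a measurement well outside it would flag a model-form or calibration problem.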
Irma 5.2 multi-sensor signature prediction model
NASA Astrophysics Data System (ADS)
Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles
2007-04-01
The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after an extensive verification and validation of an upgraded and reengineered active channel. Since 2005, the reengineering effort has focused on the Irma passive channel. Field measurements for the validation effort include the unpolarized data collection. Irma 5.2 is scheduled for release in the summer of 2007. This paper will report the validation test results of the Irma passive models and discuss the new features in Irma 5.2.
Shopping for Courses on the Mall.
ERIC Educational Resources Information Center
Gallagher, John C.
1980-01-01
Describes an extension program conducted by Suffolk County Community College, New York, at an area shopping mall. Discusses program offerings, including a series of general interest lectures, regular credit courses, and a set of noncredit minicourses for mall employees. Examines the public relations value of the extension efforts. (JP)
Aerodynamic design trends for commercial aircraft
NASA Technical Reports Server (NTRS)
Hilbig, R.; Koerner, H.
1986-01-01
Recent research on advanced-configuration commercial aircraft at DFVLR is surveyed, with a focus on aerodynamic approaches to improved performance. Topics examined include transonic wings with variable camber or shock/boundary-layer control, wings with reduced friction drag or laminarized flow, prop-fan propulsion, and unusual configurations or wing profiles. Drawings, diagrams, and graphs of predicted performance are provided, and the need for extensive development efforts using powerful computer facilities, high-speed and low-speed wind tunnels, and flight tests of models (mounted on specially designed carrier aircraft) is indicated.
HydroGrid: Technologies for Global Water Quality and Sustainability
NASA Astrophysics Data System (ADS)
Yeghiazarian, L.
2017-12-01
Humans have been transforming planet Earth for millennia. We have recently come to understand that the collective impact of our decisions and actions has brought about severe water quality problems, which are likely to worsen in light of rapid population growth to the projected nine billion by 2050. Sustainably managing our global water resources and possibly reversing these effects requires efforts in real-time monitoring of water contamination, analysis of monitoring data, and control of the state of water contamination. We develop technologies to address all three areas: monitoring, analysis and control. These efforts are carried out in the conceptual framework of the HydroGrid, an interconnected water system, which is (1) firmly rooted in the fundamental understanding of processes that govern microbial dynamics on multiple scales; and (2) used to develop watershed-specific management strategies. In the area of monitoring we are developing mobile autonomous sensors to detect surface water contamination, an effort supported by extensive materials research to provide multifunctional materials. We analyze environmental data within a stochastic modeling paradigm that bridges microscopic particle interactions to macroscopic manifestations of microbial population behavior in time and space in entire watersheds. These models are supported with laboratory and field experiments. Finally, we combine control and graph theories to derive controllability metrics of natural watersheds.
NASA Astrophysics Data System (ADS)
Dawson, A.; Trachsel, M.; Goring, S. J.; Paciorek, C. J.; McLachlan, J. S.; Jackson, S. T.; Williams, J. W.
2017-12-01
Pollen records have been extensively used to reconstruct past changes in vegetation and to study the underlying processes. However, developing the statistical techniques needed to accurately represent both data and process uncertainties is a formidable challenge. Recent advances in paleoecoinformatics (e.g. the Neotoma Paleoecology Database and the European Pollen Database), Bayesian age-depth models, process-based pollen-vegetation models, and Bayesian hierarchical modeling have pushed paleovegetation reconstructions forward to a point where multiple sources of uncertainty can be incorporated into reconstructions, which in turn enables new hypotheses to be tested and more rigorous integration of paleovegetation data with earth system models and terrestrial ecosystem models. Several kinds of pollen-vegetation models (PVMs) have been developed, notably LOVE/REVEALS, STEPPS, and classical transfer functions such as the modern analog technique. LOVE/REVEALS has been adopted as the standard method for the LandCover6k effort to develop quantitative reconstructions of land cover for the Holocene, while STEPPS has been developed recently as part of the PalEON project and applied to reconstruct, with uncertainty, shifts in forest composition in New England and the upper Midwest during the late Holocene. Each PVM has different assumptions and structure and uses different input data, but few comparisons among approaches yet exist. Here, we present new reconstructions of land cover change in northern North America during the Holocene based on LOVE/REVEALS and data drawn from the Neotoma database and compare STEPPS-based reconstructions to those from LOVE/REVEALS. These parallel developments with LOVE/REVEALS provide an opportunity to compare and contrast models, and to begin to generate continental-scale reconstructions, with explicit uncertainties, that can provide a base for interdisciplinary research within the biogeosciences.
We show how STEPPS provides an important benchmark for past land-cover reconstruction, and how the LandCover 6k effort in North America advances our understanding of the past by allowing cross-continent comparisons using standardized methods and quantifying the impact of humans in the early Anthropocene.
Zipkin, Elise F; Sillett, T Scott; Grant, Evan H Campbell; Chandler, Richard B; Royle, J Andrew
2014-01-01
Wildlife populations consist of individuals that contribute disproportionately to growth and viability. Understanding a population's spatial and temporal dynamics requires estimates of abundance and demographic rates that account for this heterogeneity. Estimating these quantities can be difficult, requiring years of intensive data collection. Often, this is accomplished through the capture and recapture of individual animals, which is generally only feasible at a limited number of locations. In contrast, N-mixture models allow for the estimation of abundance, and spatial variation in abundance, from count data alone. We extend recently developed multistate, open population N-mixture models, which can additionally estimate demographic rates based on an organism's life history characteristics. In our extension, we develop an approach to account for the case where not all individuals can be assigned to a state during sampling. Using only state-specific count data, we show how our model can be used to estimate local population abundance, as well as density-dependent recruitment rates and state-specific survival. We apply our model to a population of black-throated blue warblers (Setophaga caerulescens) that have been surveyed for 25 years on their breeding grounds at the Hubbard Brook Experimental Forest in New Hampshire, USA. The intensive data collection efforts allow us to compare our estimates to estimates derived from capture–recapture data. Our model performed well in estimating population abundance and density-dependent rates of annual recruitment/immigration. Estimates of local carrying capacity and per capita recruitment of yearlings were consistent with those published in other studies. However, our model moderately underestimated annual survival probability of yearling and adult females and severely underestimated survival probabilities for both of these male stages.
The most accurate and precise estimates will necessarily require some amount of intensive data collection efforts (such as capture–recapture). Integrated population models that combine data from both intensive and extensive sources are likely to be the most efficient approach for estimating demographic rates at large spatial and temporal scales. PMID:24634726
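The N-mixture idea above can be made concrete with a small numerical sketch. The code below implements only the basic single-state, closed-population binomial N-mixture likelihood (in the spirit of Royle's original model), not the multistate open-population extension the authors develop; all function and variable names are illustrative.

```python
import numpy as np
from scipy import stats

def nmix_loglik(counts, lam, p, n_max=200):
    """Log-likelihood of a basic binomial N-mixture model:
    site abundance N_i ~ Poisson(lam), repeated counts y_ij ~ Binomial(N_i, p).
    The latent N_i are marginalized by summing over 0..n_max."""
    n = np.arange(n_max + 1)
    log_prior = stats.poisson.logpmf(n, lam)             # log P(N = n)
    ll = 0.0
    for y in counts:                                     # y: replicate counts at one site
        y = np.asarray(y)
        # log P(y | N = n) for every candidate abundance n (impossible n give -inf)
        log_obs = stats.binom.logpmf(y[:, None], n[None, :], p).sum(axis=0)
        ll += np.logaddexp.reduce(log_prior + log_obs)   # marginalize N_i
    return ll
```

Maximizing this function over (lam, p) recovers abundance and detection probability from count data alone, which is why such models need no capture-recapture effort.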
Often Difficult--But Worth It. Collaboration among Professionals.
ERIC Educational Resources Information Center
Walker, Joyce A.
1988-01-01
A joint effort between the Minnesota Extension Service and University of Minnesota School of Medicine produced a community-based research and educational program on stress, depression, and suicide prevention. The Teens in Distress program represents a successful collaborative effort and illustrates the potential problems when Extension…
Perception of Muscular Effort During Dynamic Elbow Extension in Multiple Sclerosis.
Heller, Mario; Retzl, Irene; Kiselka, Anita; Greisberger, Andrea
2016-02-01
To investigate the perception of muscular effort in individuals with multiple sclerosis (MS) and healthy controls during dynamic contractions. Case-control study. MS day care center. Individuals with MS (n=28) and controls (n=28) (N=56). Not applicable. Perceived muscular effort during dynamic elbow extensions was rated at 9 different weight intensities (10%-90% of 1-repetition maximum) in a single-blind, randomized order using the OMNI-Resistance Exercise Scale. Muscle activity of the triceps brachii muscle (lateral head) was measured via surface electromyography and normalized to maximal voluntary excitation. According to OMNI-level ratings, significant main effects were found for the diagnostic condition (F=27.33, P<.001, η²=.11), indicating 0.7 (95% confidence interval [CI], 0.3-1.1) lower mean OMNI-level ratings for MS, and for the intensity level (F=46.81, P<.001, η²=.46), showing increased OMNI-level ratings for increased intensity levels for both groups. Furthermore, significant main effects were found for the diagnostic condition (F=16.52, P<.001, η²=.07), indicating 7.1% (95% CI, -8.6 to 22.8) higher maximal voluntary excitation values for MS, and for the intensity level (F=33.09, P<.001, η²=.36), showing higher relative muscle activities for increasing intensity levels in both groups. Similar to controls, individuals with MS were able to differentiate between different intensities of weight during dynamic elbow extensions when provided in a single-blind, randomized order. Therefore, perceived muscular effort might be considered to control resistance training intensities in individuals with MS. However, training intensity for individuals with MS should be chosen at approximately 1 OMNI level lower than recommended, at least for dynamic elbow extension exercises. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Development of a Power Electronics Controller for the Advanced Stirling Radioisotope Generator
NASA Technical Reports Server (NTRS)
Leland, Douglas K.; Priest, Joel F.; Keiter, Douglas E.; Schreiber, Jeffrey G.
2008-01-01
Under a U.S. Department of Energy program for radioisotope power systems, Lockheed Martin is developing an Engineering Unit of the Advanced Stirling Radioisotope Generator (ASRG). This is an advanced version of the previously reported SRG110 generator. The ASRG uses Advanced Stirling Convertors (ASCs) developed by Sunpower Incorporated under a NASA Research Announcement contract. The ASRG makes use of a Stirling controller based on power electronics that eliminates the tuning capacitors. The power electronics controller synchronizes dual-opposed convertors and maintains a fixed frequency operating point. The controller is single-fault tolerant and uses high-frequency pulse width modulation to create the sinusoidal currents that are nearly in phase with the piston velocity, eliminating the need for large series tuning capacitors. Sunpower supports this effort through an extension of their controller development intended for other applications. Glenn Research Center (GRC) supports this effort through system dynamic modeling, analysis and test support. The ASRG design arrived at a new baseline based on a system-level trade study and extensive feedback from mission planners on the necessity of single-fault tolerance. This paper presents the baseline design with an emphasis on the power electronics controller detailed design concept that will meet space mission requirements including single fault tolerance.
Calculation of the recirculating compressible flow downstream a sudden axisymmetric expansion
NASA Technical Reports Server (NTRS)
Vandromme, D.; Haminh, H.; Brunet, H.
1988-01-01
Significant progress has been made during the last five years to adapt conventional Navier-Stokes solvers for handling nonconservative equations. A primary type of application is the use of transport-equation turbulence models, but the extension is also possible for describing the transport of nonpassive scalars, such as in reactive media. Among others, combustion and gas dissociation phenomena are topics needing a considerable research effort. An implicit two-step scheme based on the well-known MacCormack scheme has been modified to treat compressible turbulent flows on complex geometries. Implicit treatment of nonconservative equations (in the present case a two-equation turbulence model) opens the way to the coupled solution of thermochemical transport equations.
Requirements and design aspects of a data model for a data dictionary in paediatric oncology.
Merzweiler, A; Knaup, P; Creutzig, U; Ehlerding, H; Haux, R; Mludek, V; Schilling, F H; Weber, R; Wiedemann, T
2000-01-01
German children suffering from cancer are mostly treated within the framework of multicentre clinical trials. An important task of conducting these trials is an extensive information and knowledge exchange, which has to be based on a standardised documentation. To support this effort, it is the aim of a nationwide project to define a standardised terminology that should be used by clinical trials for therapy documentation. In order to support terminology maintenance we are currently developing a data dictionary. In this paper we describe requirements and design aspects of the data model used for the data dictionary as first results of our research. We compare it with other terminology systems.
A Unified Model of Geostrophic Adjustment and Frontogenesis
NASA Astrophysics Data System (ADS)
Taylor, John; Shakespeare, Callum
2013-11-01
Fronts, or regions with strong horizontal density gradients, are ubiquitous and dynamically important features of the ocean and atmosphere. In the ocean, fronts are associated with enhanced air-sea fluxes, turbulence, and biological productivity, while atmospheric fronts are associated with some of the most extreme weather events. Here, we describe a new mathematical framework for describing the formation of fronts, or frontogenesis. This framework unifies two classical problems in geophysical fluid dynamics, geostrophic adjustment and strain-driven frontogenesis, and provides a number of important extensions beyond previous efforts. The model solutions closely match numerical simulations during the early stages of frontogenesis, and provide a means to describe the development of turbulence at mature fronts.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-20
... DEPARTMENT OF LABOR Office of Workers' Compensation Programs Division of Coal Mine Workers' Compensation; Proposed Extension of Information Collection; Comment Request ACTION: Notice. SUMMARY: The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a...
Informing Extension Program Development through Audience Segmentation: Targeting High Water Users
ERIC Educational Resources Information Center
Huang, Pei-wen; Lamm, Alexa J.; Dukes, Michael D.
2016-01-01
Human reliance on water has led to water issues globally. Although extension professionals have made efforts successfully to educate the general public about water conservation to enhance water resource sustainability, difficulty has been found in reaching high water users, defined as residents irrigating excessively to their landscape irrigation…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
... DEPARTMENT OF LABOR Comment Request for Agency Information Collection Activities: Extension of a..., Department of Labor. ACTION: 60-day notice of information collection under review: Form ETA- 9033...-0309. SUMMARY: The Department of Labor, as part of its continuing effort to reduce paperwork and...
Emergency Food Programs: Untapped Opportunities for Extension?
ERIC Educational Resources Information Center
Mobley, Amy R.
2012-01-01
This article reports results from a questionnaire that assessed the frequency and type of nutrition questions asked at emergency food programs to determine if Extension professionals need to increase direct outreach efforts. Emergency food program workers (n = 460) were recruited via mail to complete a self-administered survey. More than one-third…
Untied Efforts: The Challenges for Improved Research, Extension and Education Linkages
ERIC Educational Resources Information Center
Eneyew, Adugna
2013-01-01
Ethiopian agriculture is characterized by smallholders farming whose access to modern technology and basic education is very limited. Research, extension, education and farmers are the main pillars of agricultural knowledge systems and their effectiveness largely depends on strong linkage among each other. However, the existing…
Developing a Successful Asynchronous Online Extension Program for Forest Landowners
ERIC Educational Resources Information Center
Zobrist, Kevin W.
2014-01-01
Asynchronous online Extension classes can reach a wide audience, are convenient for the learner, and minimize ongoing demands on instructor time. However, producing such classes takes significant effort up front. Advance planning and good communication with contributors are essential to success. Considerations include delivery platforms, content…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-09
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Administration on Aging Agency Information Collection Activities; Submission for OMB Review; Comment Request; Extension of Certification of Maintenance of Effort for the Title III and Minor Revisions to the Certification of Long-Term Care Ombudsman Program...
A Journey in Standard Development: The Core Manufacturing Simulation Data (CMSD) Information Model.
Lee, Yung-Tsun Tina
2015-01-01
This report documents a journey "from research to an approved standard" of a NIST-led standard development activity. That standard, Core Manufacturing Simulation Data (CMSD) information model, provides neutral structures for the efficient exchange of manufacturing data in a simulation environment. The model was standardized under the auspices of the international Simulation Interoperability Standards Organization (SISO). NIST started the research in 2001 and initiated the standardization effort in 2004. The CMSD standard was published in two SISO Products. In the first Product, the information model was defined in the Unified Modeling Language (UML) and published in 2010 as SISO-STD-008-2010. In the second Product, the information model was defined in Extensible Markup Language (XML) and published in 2013 as SISO-STD-008-01-2012. Both SISO-STD-008-2010 and SISO-STD-008-01-2012 are intended to be used together.
The ASAC Air Carrier Investment Model (Third Generation)
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Gaier, Eric M.; Santmire, Tara E.
1998-01-01
To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. To accomplish this, NASA is building an Aviation System Analysis Capability (ASAC). The ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. To link the economics of flight with the technology of flight, ASAC requires a parametrically based model with extensions that link airline operations and investments in aircraft with aircraft characteristics. This model also must provide a mechanism for incorporating air travel demand and profitability factors into the airlines' investment decisions. Finally, the model must be flexible and capable of being incorporated into a wide-ranging suite of economic and technical models that are envisioned for ASAC.
A Hybrid Fuzzy Model for Lean Product Development Performance Measurement
NASA Astrophysics Data System (ADS)
Osezua Aikhuele, Daniel; Mohd Turan, Faiz
2016-02-01
In the effort by manufacturing companies to meet emerging consumer demands for mass-customized products, many are turning to the application of lean in their product development process, and this is gradually moving from being a competitive advantage to a necessity. However, owing to a lack of clear understanding of lean performance measurements, many of these companies are unable to implement and fully integrate the lean principle into their product development process. The extensive literature shows that only a few studies have focused systematically on lean product development performance (LPDP) evaluation. In order to fill this gap, the study therefore proposes a novel hybrid model based on the Fuzzy Reasoning Approach (FRA) and extensions of the Fuzzy-AHP and Fuzzy-TOPSIS methods for the assessment of LPDP. Unlike existing methods, the model considers the importance weight of each of the decision makers (experts), since the performance criteria/attributes must be rated and these experts have different levels of expertise. The rating is done using a new fuzzy Likert rating scale (membership-scale), designed to address problems resulting from information loss/distortion due to closed-form scaling and the ordinal nature of the existing Likert scale.
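For orientation, the core of a TOPSIS-style evaluation is ranking alternatives by closeness to an ideal point. The sketch below shows only the crisp (non-fuzzy) TOPSIS step, assuming all criteria are benefit-type; the paper's fuzzy arithmetic, AHP weighting, and expert importance weights are omitted, and all names are illustrative.

```python
import numpy as np

def topsis(ratings, weights):
    """Rank alternatives by relative closeness to the ideal solution.
    ratings: (alternatives x criteria) matrix, all benefit criteria.
    weights: criterion weights summing to 1."""
    norm = ratings / np.linalg.norm(ratings, axis=0)  # vector-normalize each criterion
    v = norm * weights                                # weighted normalized matrix
    ideal, anti = v.max(axis=0), v.min(axis=0)        # ideal / anti-ideal points
    d_pos = np.linalg.norm(v - ideal, axis=1)         # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)          # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                    # closeness coefficient in [0, 1]
```

An alternative that dominates on every criterion gets closeness 1; the fuzzy variants replace the crisp ratings with triangular fuzzy numbers but keep this same distance-to-ideal logic.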
Extensive Training Is Insufficient to Produce The Work-Ethic Effect In Pigeons
Vasconcelos, Marco; Urcuioli, Peter J
2009-01-01
Zentall and Singer (2007a) hypothesized that our failure to replicate the work-ethic effect in pigeons (Vasconcelos, Urcuioli, & Lionello-DeNolf, 2007) was due to insufficient overtraining following acquisition of the high- and low-effort discriminations. We tested this hypothesis using the original work-ethic procedure (Experiment 1) and one similar to that used with starlings (Experiment 2) by providing at least 60 overtraining sessions. Despite this extensive overtraining, neither experiment revealed a significant preference for stimuli obtained after high effort. Together with other findings, these data support our contention that pigeons do not reliably show a work-ethic effect. PMID:19230517
NASA Astrophysics Data System (ADS)
He, R.; Zong, H.; Xue, Z. G.; Fennel, K.; Tian, H.; Cai, W. J.; Lohrenz, S. E.
2017-12-01
An integrated terrestrial-ocean ecosystem modeling system is developed and used to investigate marine physical-biogeochemical variabilities in the Gulf of Mexico and southeastern US shelf sea. Such variabilities stem from variations in the shelf circulation, boundary current dynamics, impacts of climate variability, as well as growing population and associated land use practices on transport of carbon and nutrients within terrestrial systems and their delivery to the coastal ocean. We will report our efforts in evaluating the performance of the coupled modeling system via extensive model and data comparisons, as well as findings from a suite of case studies and scenario simulations. Long-term model simulation results are used to quantify regional ocean circulation dynamics, nitrogen budget and carbon fluxes. Their corresponding sub-regional differences are also characterized and contrasted.
Schainker, Lisa M.; Redmond, Cleve; Ralston, Ekaterina; Yeh, Hsiu-Chen; Perkins, Daniel F.
2015-01-01
An emerging literature highlights the potential for broader dissemination of evidence-based prevention programs in communities through existing state systems, such as the land grant university Extension outreach system and departments of public education and health (DOE–DPH). This exploratory study entailed surveying representatives of the national Extension system and DOE–DPH to evaluate dissemination readiness factors, as part of a larger project on an evidence-based program delivery model called PROSPER. In addition to assessing systems’ readiness factors, differences among US regions and comparative levels of readiness between state systems were evaluated. The Extension web-based survey sample N was 958 and the DOE–DPH telephone survey N was 338, with response rates of 23 and 79 %, respectively. Extension survey results suggested only a moderate level of overall readiness nationally, with relatively higher perceived need for collaborative efforts and relatively lower perceived resource availability. There were significant regional differences on all factors, generally favoring the Northeast. Results from DOE–DPH surveys showed significantly higher levels for all readiness factors, compared with Extension systems. Overall, the findings present a mixed picture. Although there were clear challenges related to measuring readiness in complex systems, addressing currently limited dissemination resources, and devising strategies for optimizing readiness, all systems showed some readiness-related strengths. PMID:25791916
Porting marine ecosystem model spin-up using transport matrices to GPUs
NASA Astrophysics Data System (ADS)
Siewertsen, E.; Piwonski, J.; Slawig, T.
2013-01-01
We have ported an implementation of the spin-up for marine ecosystem models based on transport matrices to graphics processing units (GPUs). The original implementation was designed for distributed-memory architectures and uses the Portable, Extensible Toolkit for Scientific Computation (PETSc) library that is based on the Message Passing Interface (MPI) standard. The spin-up computes a steady seasonal cycle of ecosystem tracers with climatological ocean circulation data as forcing. Since the transport is linear with respect to the tracers, the resulting operator is represented by matrices. Each iteration of the spin-up involves two matrix-vector multiplications and the evaluation of the used biogeochemical model. The original code was written in C and Fortran. On the GPU, we use the Compute Unified Device Architecture (CUDA) standard, a customized version of PETSc and a commercial CUDA Fortran compiler. We describe the extensions to PETSc and the modifications of the original C and Fortran codes that had to be done. Here we make use of freely available libraries for the GPU. We analyze the computational effort of the main parts of the spin-up for two exemplar ecosystem models and compare the overall computational time to those necessary on different CPUs. The results show that a consumer GPU can compete with a significant number of cluster CPUs without further code optimization.
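The iteration structure described above, two sparse matrix-vector products plus a biogeochemical source evaluation per step, repeated until the seasonal cycle is steady, can be sketched in a few lines. This is a deliberately simplified stand-in with hypothetical interfaces, far from the PETSc/CUDA implementation (which interpolates monthly transport matrices and runs thousands of steps per model year).

```python
import numpy as np
import scipy.sparse as sp

def spin_up(A_imp, A_exp, bgc, y0, n_steps, n_years, tol=1e-8):
    """Fixed-point spin-up of a tracer vector: march through repeated model
    years until the state stops changing between years (simplified sketch).
    A_imp, A_exp: implicit/explicit transport matrices; bgc: source term."""
    y = y0.copy()
    for year in range(n_years):
        y_start = y.copy()
        for step in range(n_steps):
            # two sparse matrix-vector products plus the biogeochemical model
            y = A_imp @ (A_exp @ y + bgc(y, step))
        # steady seasonal cycle reached when the year-to-year change is tiny
        if np.linalg.norm(y - y_start) < tol * np.linalg.norm(y_start):
            break
    return y, year + 1
```

On a GPU, the two matrix-vector products and the pointwise `bgc` evaluation are exactly the operations that map well to thousands of parallel threads, which is why the port pays off.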
The Planetary Data System (PDS) Data Dictionary Tool (LDDTool)
NASA Astrophysics Data System (ADS)
Raugh, Anne C.; Hughes, John S.
2017-10-01
One of the major design goals of the PDS4 development effort was to provide an avenue for discipline specialists and large data preparers such as mission archivists to extend the core PDS4 Information Model (IM) to include metadata definitions specific to their own contexts. This capability is critical for the Planetary Data System - an archive that deals with a data collection that is diverse along virtually every conceivable axis. Amid such diversity, it is in the best interests of the PDS archive and its users that all extensions to the core IM follow the same design techniques, conventions, and restrictions as the core implementation itself. Notwithstanding, expecting all mission and discipline archivists seeking to define metadata for a new context to acquire expertise in information modeling, model-driven design, ontology, schema formulation, and PDS4 design conventions and philosophy is unrealistic, to say the least. To bridge that expertise gap, the PDS Engineering Node has developed the data dictionary creation tool known as “LDDTool”. This tool incorporates the same software used to maintain and extend the core IM, packaged with an interface that enables a developer to create his contextual information model using the same, open standards-based metadata framework PDS itself uses. Through this interface, the novice dictionary developer has immediate access to the common set of data types and unit classes for defining attributes, and a straightforward method for constructing classes. The more experienced developer, using the same tool, has access to more sophisticated modeling methods like abstraction and extension, and can define very sophisticated validation rules. We present the key features of the PDS Local Data Dictionary Tool, which both supports the development of extensions to the PDS4 IM and ensures their compatibility with the IM.
NASA Astrophysics Data System (ADS)
Parkhill, John A.; Head-Gordon, Martin
2010-07-01
We present the next stage in a hierarchy of local approximations to the complete active space self-consistent field (CASSCF) model in an active space of one active orbital per active electron, based on the valence orbital-optimized coupled-cluster (VOO-CC) formalism. Following the perfect pairing (PP) model, which is exact for a single electron pair and extensive, and the perfect quadruples (PQ) model, which is exact for two pairs, we introduce the perfect hextuples (PH) model, which is exact for three pairs. PH is an approximation to the VOO-CC method truncated at hextuples, containing all correlations between three electron pairs. While VOO-CCDTQ56 requires computational effort scaling with the 14th power of molecular size, PH requires only sixth-power effort. Our implementation also introduces some techniques which reduce the scaling to fifth order and has been applied to active spaces roughly twice the size of the CASSCF limit without any symmetry. Because PH explicitly correlates up to six electrons at a time, it can faithfully model the static correlations of molecules with up to triple bonds in a size-consistent fashion and for organic reactions usually reproduces CASSCF with chemical accuracy. The convergence of the PP, PQ, and PH hierarchy is demonstrated on a variety of examples including symmetry breaking in benzene, the Cope rearrangement, the Bergman reaction, and the dissociation of fluorine.
Ultra High Mode Mix in NIF NIC Implosions
NASA Astrophysics Data System (ADS)
Scott, Robbie; Garbett, Warren
2017-10-01
This work re-examines a sub-set of the low adiabat implosions from the National Ignition Campaign in an effort to better understand potential phenomenological sources of `excess' mix observed experimentally. An extensive effort has been made to match both shock-timing and backlit radiography (Con-A) implosion data in an effort to reproduce the experimental conditions as accurately as possible. Notably a 30% reduction in ablation pressure at peak drive is required to match the experimental data. The reduced ablation pressure required to match the experimental data allows the ablator to decompress, in turn causing the DT ice-ablator interface to go Rayleigh-Taylor unstable early in the implosion acceleration phase. Post-processing the runs with various mix models indicates high-mode mix from the DT ice-ablator interface may penetrate deep into the hotspot. This work offers a potential explanation of why these low-adiabat implosions exhibited significantly higher levels of mix than expected from high-fidelity multi-dimensional simulations. Through this new understanding, a possible route forward for low-adiabat implosions on NIF is suggested.
A telescopic jib for continuous adjustment
NASA Technical Reports Server (NTRS)
Etzler, C. C.
1979-01-01
For special space applications, e.g. for experiments distant from any orbital platform or manipulator, a new kind of jib with extreme extension capacity has to be designed. Considering the requirements, the telescopic principle is found to be the most promising. For the choice of the stiff structure, design criteria are evaluated. Particular effort was devoted to the drive system; an electromechanical system can satisfy the requirements. First results of the development of such a drive are presented. The most significant features are: a telescopic assembly of tubes which can be mutually moved by a short spindle in the center of the package, and an elastically suspended screw located at the bottom of each tube. For the jib extension, these screws are linked with the spindle. The control of their sequence and the adjustment of the tubes in their mutual end positions are performed by latches. A functional model proved the basic idea.
Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.
2012-01-01
An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
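Conceptually, the core task that codes like PEST and PEST++ automate is minimizing weighted model-to-observation residuals, typically with a Levenberg-Marquardt scheme. The toy calibration below illustrates that loop with SciPy; it is only a conceptual stand-in (PEST++ treats the simulator as an external program and adds regularization, parallel run management, and much more), and the forward model and names are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

def forward(params, x):
    """Toy forward model standing in for an environmental simulator run."""
    k, s = params
    return k * np.exp(-s * x)

def calibrate(x, observed, p0):
    """Estimate parameters by Levenberg-Marquardt minimization of residuals,
    the basic operation a parameter estimation code performs each iteration."""
    residuals = lambda p: forward(p, x) - observed
    result = least_squares(residuals, p0, method="lm")
    return result.x
```

In a real PEST-style workflow the residual function would write a model input file, launch the simulator, and parse its outputs, which is exactly where an extensible object-oriented run manager earns its keep.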
Minimizing Dispersion in FDTD Methods with CFL Limit Extension
NASA Astrophysics Data System (ADS)
Sun, Chen
The CFL extension in FDTD methods is receiving considerable attention as a way to reduce computational effort and save simulation time. One of the major issues in CFL-extension methods is the increased dispersion. We formulate a decomposition of the FDTD equations to study the behaviour of the dispersion. A compensation scheme to reduce the dispersion under CFL extension is proposed. We further study the CFL extension in an FDTD subgridding case, where we improve the accuracy by acting only on the FDTD equations of the fine grid. Numerical results confirm the efficiency of the proposed method for minimising dispersion.
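The dispersion issue can be seen directly from the standard 1D Yee-scheme dispersion relation, sin(ωΔt/2) = S·sin(kΔx/2), where S = cΔt/Δx is the Courant number (stable for S ≤ 1 in 1D). A short sketch — illustrative numbers only, not the paper's scheme — shows why changing the time step relative to the CFL limit changes the numerical phase velocity:

```python
import numpy as np

# Numerical dispersion of the standard 1D Yee FDTD scheme (illustrative).
# Exact relation: omega = c*k.  Numerical: sin(omega*dt/2) = S*sin(k*dx/2),
# with Courant number S = c*dt/dx (stable for S <= 1 in 1D).
c, dx = 3e8, 1e-3
k = 2 * np.pi / (20 * dx)           # wave resolved by 20 cells

def phase_velocity_error(S):
    dt = S * dx / c
    omega = (2.0 / dt) * np.arcsin(S * np.sin(k * dx / 2.0))
    return abs(omega / k - c) / c   # relative numerical-dispersion error

# At the 1D "magic time step" (S = 1) the scheme is dispersionless; moving
# dt away from the stability limit increases the dispersion error, which is
# the effect CFL-extension schemes must compensate for.
for S in (1.0, 0.5, 0.25):
    print(f"S = {S:4.2f}  phase-velocity error = {phase_velocity_error(S):.2e}")
```

In 2D and 3D no such magic time step exists, and implicit or filtered schemes that push Δt beyond the CFL limit trade stability for exactly this kind of phase error, hence the need for compensation.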
Thermal Performance of a Cryogenic Fluid Management Cubesat Mission
NASA Technical Reports Server (NTRS)
Berg, J. J.; Oliveira, J. M.; Congiardo, J. F.; Walls, L. K.; Putman, P. T.; Haberbusch, M. S.
2013-01-01
Development for an in-space demonstration of a CubeSat as a Cryogenic Fluid Management (CFM) test bed is currently underway. The favorable economics of CubeSats make them appealing for technology development activity. While their size limits testing to smaller scales, many of the regimes relevant to CFM can still be achieved. The first demo flight of this concept, CryoCube®-1, will focus on oxygen liquefaction and low-gravity level sensing using Reduced Gravity CryoTracker®. An extensive thermal modeling effort has been underway to both demonstrate concept feasibility and drive the prototype design. The satellite will utilize both a sun- and earth-shield to passively cool its experimental tank below 115 K. An on-board gas generator will create high pressure gaseous oxygen, which will be throttled into a bottle in the experimental node and condensed. The resulting liquid will be used to perform various experiments related to level sensing. Modeling efforts have focused on the spacecraft thermal performance and its effects on condensation in the experimental node. Parametric analyses for both optimal and suboptimal conditions have been considered and are presented herein.
Building Extension Partnerships with Government to Further Water Conservation Efforts
ERIC Educational Resources Information Center
McKee, Brandon; Huang, Pei-wen; Lamm, Alexa
2017-01-01
Extension, being a local, state and federally funded program, has a natural partnership with government agencies at all three levels; however, these partnerships could be built upon and targeted at specific audiences for greater effect if more were known about how government influences public perception. The government has recognized the need for…
What Extension Professionals Say about Teaching Health Insurance: Results from a Nationwide Survey
ERIC Educational Resources Information Center
Brown, Virginia; Koonce, Joan C.; Martin, Ken; Kiss, Elizabeth; Katras, Mary Jo; Wise, Dena
2017-01-01
The Extension Committees on Organization and Policy adopted a new Health and Wellness Framework with six priority areas. A health insurance literacy team was appointed to assess current system efforts and develop research, programs, and professional development opportunities. Survey results show that finance educators were the most likely…
Fundamental Dimensions and Essential Elements of Exemplary Local Extension Units
ERIC Educational Resources Information Center
Terry, Bryan D.; Osborne, Edward
2015-01-01
Collaborative efforts between federal, state, and local government agencies enable local Extension units to deliver a high level of educational opportunities to local citizens. These units represent land-grant institutions by delivering non-formal education that aims to address local, regional, and state concerns. The purpose of this study was to…
Dare To Be You: A Diversion Program for First Time Juvenile Offenders.
ERIC Educational Resources Information Center
Vail, Ann; Nest, Judy
This document notes that community-based organizations such as the Cooperative Extension Service have joined the efforts to reduce juvenile delinquency through juvenile diversion programs. It then describes the "Dare to be You" program that was developed by the Colorado Cooperative Extension System. The six objectives of the program delineated in…
ERIC Educational Resources Information Center
Chipman, Kristi; Litchfield, Ruth
2012-01-01
The Affordable Care Act provides impetus for Extension efforts in worksite wellness. The study reported here examined the influence of two worksite wellness interventions, newsletters and individual counseling. Surveys examined dietary and physical activity behaviors of participants pre- and post-intervention (N = 157). Descriptive statistics,…
Missouri Extension Provides Tax Assistance to Rural Families
ERIC Educational Resources Information Center
Huston, Sandra J.; Procter, Brenda
2006-01-01
Financial education is one pathway to improving the human condition. Family financial educators in University Extension programs lead in their efforts to provide individuals and families with the skills they need to manage their financial resources effectively. Offering these opportunities at a time when families have money to manage is a key…
Development and validation of a 10-year-old child ligamentous cervical spine finite element model.
Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H
2013-12-01
Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties based on review of literature in conjunction with scaling were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and to predict soft tissue failures in tension.
Lane, Andrew M.; Terry, Peter C.; Devonport, Tracey J.; Friesen, Andrew P.; Totterdell, Peter A.
2017-01-01
The present study tested and extended Lane and Terry's (2000) conceptual model of mood-performance relationships using a large dataset from an online experiment. Methodological and theoretical advances included testing a more balanced model of pleasant and unpleasant emotions, and evaluating relationships among emotion regulation traits, states and beliefs, psychological skills use, perceptions of performance, mental preparation, and effort exerted during competition. Participants (N = 73,588) completed measures of trait emotion regulation, emotion regulation beliefs, regulation efficacy, and use of psychological skills, and rated their anger, anxiety, dejection, excitement, energy, and happiness before completing a competitive concentration task. Post-competition, participants completed measures of effort exerted, beliefs about the quality of mental preparation, and subjective performance. Results showed that dejection was associated with worse performance, with the no-dejection group performing 3.2% better. Dejection was associated with higher anxiety and anger scores and lower energy, excitement, and happiness scores. The proposed moderating effect of dejection was supported for the anxiety-performance relationship but not the anger-performance relationship. In the no-dejection group, participants who reported moderate or high anxiety outperformed those reporting low anxiety by about 1.6%. Overall, results showed partial support for Lane and Terry's model. In terms of extending the model, results showed that dejection was associated with greater use of suppression, less frequent use of re-appraisal and psychological skills, lower emotion regulation beliefs, and lower emotion regulation efficacy. Further, dejection was associated with greater effort during performance, beliefs that pre-competition emotions did not assist goal achievement, and low subjective performance. Future research is required to investigate the role of intense emotions in emotion regulation and performance.
PMID:28458641
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, C. J.; Chathoth, S. M., E-mail: smavilac@cityu.edu.hk; Podlesnyak, A.
2015-09-28
Extensive efforts have been made to develop metallic glasses with large casting diameters. Such efforts have been hindered by the poor understanding of glass formation mechanisms and of the origin of the glass-forming ability (GFA) in metallic glass-forming systems. In this work, we have investigated the relaxation dynamics of a model bulk glass-forming alloy system that shows first enhanced and then diminished GFA with an increasing percentage of micro-alloying. The micro-alloying did not have any significant impact on the thermodynamic properties, and the increase in GFA upon micro-alloying in this system cannot be explained by present theoretical knowledge. Our results indicate that atomic caging is the primary factor that influences the GFA. The composition dependence of the atomic caging time, or residence time, is found to be well correlated with the GFA of the system.
Theoretical research program to study chemical reactions in AOTV bow shock tubes
NASA Technical Reports Server (NTRS)
Taylor, Peter
1992-01-01
Effort continued through this period to refine and expand the SIRIUS/ABACUS program package for CASSCF and RASSCF second derivatives. A new approach to computing the Gaussian integral derivatives that require much of the time in gradient and Hessian calculations was devised. Several different studies were undertaken in the area of application calculations. These include a study of proton transfer in the HF trimer, which provides an analog of rearrangement reactions, and the extension of our previous work on Be and Mg clusters to Ca clusters. In addition, a very accurate investigation of the lowest-lying potential curves of the O2 molecule was completed. These curves are essential for evaluating different models of the terrestrial atmosphere nightglow. Most of the effort this year was devoted to a large-scale investigation of stationary points on the C4H4 surface, and the thermochemistry of the acetylene/acetylene reaction.
Chen, C. J.; Podlesnyak, A.; Mamontov, E.; ...
2015-09-28
We have made extensive efforts to develop metallic glasses with large casting diameters. Such efforts have been hindered by the poor understanding of glass formation mechanisms and of the origin of the glass-forming ability (GFA) in metallic glass-forming systems. We have investigated the relaxation dynamics of a model bulk glass-forming alloy system that shows first enhanced and then diminished GFA with an increasing percentage of micro-alloying. The micro-alloying did not have any significant impact on the thermodynamic properties, and the increase in GFA upon micro-alloying in this system cannot be explained by present theoretical knowledge. Finally, our results indicate that atomic caging is the primary factor that influences the GFA. The composition dependence of the atomic caging time, or residence time, is found to be well correlated with the GFA of the system.
DOT National Transportation Integrated Search
1974-01-01
In response to its own research and observations in the early 1960's the Virginia Department of Highways mounted an intensive and extensive effort to improve the performance of concrete in bridge decks. Major elements of this effort included (1) a tr...
Derrida's Right to Philosophy, Then and Now
ERIC Educational Resources Information Center
Willinsky, John
2009-01-01
In this essay, a tribute to Jacques Derrida's educational efforts at expanding access to current work in philosophy, John Willinsky examines his efforts as both a public right and an element of academic freedom that bear on the open access movement today. Willinsky covers Derrida's extension and outreach work with the Groupe de Recherches pour…
Innovative and Coordinated Solutions for the Hardwood Industry: The Hardwood Utilization Consortium
Philip A. Araman; Cynthia West
1996-01-01
Many varied efforts have been underway to help the hardwood industry make more effective and efficient use of the hardwood resource while meeting market needs. The efforts range from training, extension, utilization and marketing research and development, to educational activities. The activities are disjointed and unorganized and individually struggling to maintain...
ERIC Educational Resources Information Center
Gutierez, Sally Baricaua
2015-01-01
In the Philippines, inquiry-based teaching has been promoted and implemented together with recently instigated curriculum reforms. Serious teacher professional development efforts are being used extensively to properly orient teachers and present the benefits of inquiry-based teaching. Despite these efforts, there still exists a big gap in the effective…
Integrating Reliability Analysis with a Performance Tool
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael
1995-01-01
A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
NASA Technical Reports Server (NTRS)
Foster, John V.; Hartman, David C.
2017-01-01
The NASA Unmanned Aircraft System (UAS) Traffic Management (UTM) project is conducting research to enable civilian low-altitude airspace and UAS operations. A goal of this project is to develop probabilistic methods to quantify risk during failures and off-nominal flight conditions. An important part of this effort is the reliable prediction of feasible trajectories during off-nominal events such as control failure, atmospheric upsets, or navigation anomalies that can cause large deviations from the intended flight path or extreme vehicle upsets beyond the normal flight envelope. Few examples of high-fidelity modeling and prediction of off-nominal behavior for small UAS (sUAS) vehicles exist, and modeling requirements for accurately predicting flight dynamics for out-of-envelope or failure conditions are essentially undefined. In addition, the broad range of sUAS aircraft configurations already being fielded presents a significant modeling challenge, as these vehicles are often very different from one another and are likely to possess dramatically different flight dynamics and resultant trajectories and may require different modeling approaches to capture off-nominal behavior. NASA has undertaken an extensive research effort to define sUAS flight dynamics modeling requirements and develop preliminary high fidelity six degree-of-freedom (6-DOF) simulations capable of more closely predicting off-nominal flight dynamics and trajectories. This research has included a literature review of existing sUAS modeling and simulation work as well as development of experimental testing methods to measure and model key components of propulsion, airframe and control characteristics. The ultimate objective of these efforts is to develop tools to support UTM risk analyses and for the real-time prediction of off-nominal trajectories for use in the UTM Risk Assessment Framework (URAF).
This paper focuses on modeling and simulation efforts for a generic quad-rotor configuration typical of many commercial vehicles in use today. An overview of relevant off-nominal multi-rotor behaviors will be presented to define modeling goals and to identify the prediction capability lacking in simplified models of multi-rotor performance. A description of recent NASA wind tunnel testing of multi-rotor propulsion and airframe components will be presented illustrating important experimental and data acquisition methods, and a description of preliminary propulsion and airframe models will be presented. Lastly, examples of predicted off-nominal flight dynamics and trajectories from the simulation will be presented.
Olstad, Bjørn Harald; Vaz, João Rocha; Zinner, Christoph; Cabri, Jan M H; Kjendlie, Per-Ludvik
2017-06-01
The aims of this study were to describe muscular activation patterns and kinematic variables during the complete stroke cycle (SC) and the different phases of breaststroke swimming at submaximal and maximal efforts. Surface electromyography (sEMG) was collected from eight muscles in nine elite swimmers; five females (age 20.3 ± 5.4 years; Fédération Internationale de Natation [FINA] points 815 ± 160) and four males (27.7 ± 7.1 years; FINA points 879 ± 151). Underwater cameras were used for 3D kinematic analysis with automatic motion tracking. The participants swam 25 m of breaststroke at 60%, 80% and 100% effort and each SC was divided into three phases: knee extension, knee extended and knee flexion. With increasing effort, the swimmers decreased their SC distance and increased their velocity and stroke rate. A decrease during the different phases was found for duration during knee extended and knee flexion, distance during knee extended and knee angle at the beginning of knee extension with increasing effort. Velocity increased for all phases. The mean activation pattern remained similar across the different effort levels, but the muscles showed longer activation periods relative to the SC and increased integrated sEMG (except trapezius) with increasing effort. The muscle activation patterns, muscular participation and kinematics assessed in this study with elite breaststroke swimmers contribute to a better understanding of the stroke and what occurs at different effort levels. This could be used as a reference for optimising breaststroke training to improve performance.
An, Gary; Hunt, C. Anthony; Clermont, Gilles; Neugebauer, Edmund; Vodovotz, Yoram
2007-01-01
Introduction Translational systems biology approaches can be distinguished from mainstream systems biology in that their goal is to drive novel therapies and streamline clinical trials in critical illness. One systems biology approach, dynamic mathematical modeling (DMM), is increasingly used in dealing with the complexity of the inflammatory response and organ dysfunction. The use of DMM often requires a broadening of research methods and a multidisciplinary team approach that includes bioscientists, mathematicians, engineers, and computer scientists. However, the development of these groups must overcome domain-specific barriers to communication and understanding. Methods We present four case studies of successful translational, interdisciplinary systems biology efforts, which differ by organizational level from an individual to an entire research community. Results Case 1 is a single investigator involved in DMM of the acute inflammatory response at Cook County Hospital, in which extensive translational progress was made using agent-based models of inflammation and organ damage. Case 2 is a community-level effort from the University of Witten-Herdecke in Cologne, whose efforts have led to the formation of the Society for Complexity in Acute Illness. Case 3 is an institution-based group, the Biosystems Group at the University of California, San Francisco, whose work has included a focus on a common lexicon for DMM. Case 4 is an institution-based, trans-disciplinary research group (the Center for Inflammation and Regenerative Modeling at the University of Pittsburgh, whose modeling work has led to internal education efforts, grant support, and commercialization. Conclusion A transdisciplinary approach, which involves team interaction in an iterative fashion to address ambiguity and is supported by educational initiatives, is likely to be necessary for DMM in acute illness. 
Community-wide organizations such as the Society of Complexity in Acute Illness (SCAI) must strive to facilitate the implementation of DMM in sepsis/trauma research into the research community as a whole. PMID:17548029
Rayne, Sierra; Forest, Kaya
2016-09-18
The air-water partition coefficients (Kaw) for 86 large polycyclic aromatic hydrocarbons and their unsaturated relatives were estimated using high-level G4(MP2) gas and aqueous phase calculations with the SMD, IEFPCM-UFF, and CPCM solvation models. An extensive method validation effort was undertaken which involved confirming that, via comparisons to experimental enthalpies of formation, gas-phase energies at the G4(MP2) level for the compounds of interest were at or near thermochemical accuracy. Investigations of the three solvation models using a range of neutral and ionic compounds suggested that while no clear preferential solvation model could be chosen in advance for accurate Kaw estimates of the target compounds, the employment of increasingly higher levels of theory would result in lower Kaw errors. Subsequent calculations on the polycyclic aromatic and unsaturated hydrocarbons at the G4(MP2) level revealed excellent agreement for the IEFPCM-UFF and CPCM models against limited available experimental data. The IEFPCM-UFF-G4(MP2) and CPCM-G4(MP2) solvation energy calculation approaches are anticipated to give Kaw estimates within typical experimental ranges, each having general Kaw errors of less than 0.5 log10 units. When applied to other large organic compounds, the method should allow development of a broad and reliable Kaw database for multimedia environmental modeling efforts on various contaminants.
Analysis of Bioprocesses. Dynamic Modeling is a Must.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramkrishna, Doraiswami; Song, Hyun-Seob
2016-01-01
The goal of this paper is to report on the performance of a promising dynamic framework based on the cybernetic concepts which have evolved over three decades. We present case studies of successful dynamic simulations of wild-type strains as well as specific KO mutants on bacteria and yeast. An extensive metabolic engineering effort, including genome scale networks, is called for to secure the methodology and realize its full potential. Towards this end, the software AUMIC is under active further development to enable speedy applications. Its wide use will be enabled by a publication that is shortly due.
The initial impact of a workplace lead-poisoning prevention project.
Bellows, J; Rudolph, L
1993-01-01
The California Department of Health Services began an occupational lead poisoning prevention project in cooperation with 275 radiator service companies. The agency developed and marketed resources to facilitate companies' own efforts, tracked the progress of each company, and urged the companies to conduct blood lead testing. Testing by participating employers increased from 9% to 95%, and 10 times as many companies with likely overexposures were identified as had been reported to the state's lead registry in the previous year. The success of this project indicates that the model should be applied more extensively. PMID:8438981
Rapid Monte Carlo Simulation of Gravitational Wave Galaxies
NASA Astrophysics Data System (ADS)
Breivik, Katelyn; Larson, Shane L.
2015-01-01
With the detection of gravitational waves on the horizon, astrophysical catalogs produced by gravitational wave observatories can be used to characterize the populations of sources and validate different galactic population models. Efforts to simulate gravitational wave catalogs and source populations generally focus on population synthesis models that require extensive time and computational power to produce a single simulated galaxy. Monte Carlo simulations of gravitational wave source populations can also be used to generate observation catalogs. Monte Carlo simulations have the advantages of flexibility and speed, enabling rapid galactic realizations as a function of galactic binary parameters with less time and fewer computational resources required. We present a Monte Carlo method for rapid galactic simulations of gravitational wave binary populations.
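The speed advantage of the Monte Carlo approach comes from drawing each binary's parameters independently from assumed distributions instead of evolving it through population synthesis. A toy sketch of one such galactic realization — all distributions below are illustrative placeholders, not the authors' actual model:

```python
import numpy as np

rng = np.random.default_rng(42)

def draw_binary_population(n):
    """Monte Carlo realization of a toy Galactic compact-binary population.

    The distributions are illustrative assumptions, chosen only to show
    the sampling pattern, not the paper's actual model.
    """
    # Orbital period: log-uniform between 100 s and 1e5 s
    period = 10 ** rng.uniform(2, 5, n)            # s
    # Component masses: uniform 0.2-1.0 solar masses
    m1 = rng.uniform(0.2, 1.0, n)
    m2 = rng.uniform(0.2, 1.0, n)
    # Distance: exponential Galactic-disk profile, 2.5 kpc scale length
    dist = rng.exponential(2.5, n)                 # kpc
    # Gravitational-wave frequency of a circular binary: f_gw = 2/P
    f_gw = 2.0 / period                            # Hz
    return {"period": period, "m1": m1, "m2": m2,
            "distance": dist, "f_gw": f_gw}

pop = draw_binary_population(100_000)
# e.g. fraction of sources radiating above 1 mHz in a detector band
in_band = float(np.mean(pop["f_gw"] > 1e-3))
```

Because each realization is just vectorized random draws, many galaxies can be generated per second, which is what makes parameter scans over galactic binary populations tractable compared with full population synthesis.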
Riviere, Guillaume; Klopp, Christophe; Ibouniyamine, Nabihoudine; Huvet, Arnaud; Boudry, Pierre; Favrel, Pascal
2015-12-02
The Pacific oyster, Crassostrea gigas, is one of the most important aquaculture shellfish resources worldwide. Important efforts have been undertaken towards a better knowledge of its genome and transcriptome, which is now making C. gigas a model organism among lophotrochozoans, the under-described sister clade of ecdysozoans within the protostomes. These massive sequencing efforts offer the opportunity to assemble gene expression data and make such a resource accessible and exploitable for the scientific community. We therefore undertook this assembly into an up-to-date, publicly available transcriptome database: the GigaTON (Gigas TranscriptOme pipeliNe) database. We assembled 2204 million sequences obtained from 114 publicly available RNA-seq libraries covering all embryo-larval development stages, adult organs, and different environmental stressors including heavy metals, temperature, salinity and exposure to air, most of which were generated as part of the Crassostrea gigas genome project. These data were analyzed in silico, resulting in 56621 newly assembled contigs that were deposited into a publicly available database, the GigaTON database. This database also provides powerful and user-friendly query tools to browse and retrieve information about annotation, expression level, UTRs, splicing and polymorphism, and gene ontology associated with all the contigs, within each library and across all libraries. The GigaTON database provides a convenient, potent and versatile interface to browse, retrieve, confront and compare massive transcriptomic information across an extensive range of conditions, tissues and developmental stages in Crassostrea gigas. To our knowledge, the GigaTON database constitutes the most extensive transcriptomic database to date in marine invertebrates, and thereby a new reference transcriptome for the oyster, a highly valuable resource for physiologists and evolutionary biologists.
EVALUATING THE UNIT APPROACH--FARM AND HOME DEVELOPMENT. (TITLE SUPPLIED).
ERIC Educational Resources Information Center
MAYER, RALPH E.; RIECK, ROBERT E.
IN AN EFFORT TO RESOLVE THE DEBATE OF STAFF-TO-FAMILY VS STAFF-TO-MASS RELATIONSHIPS IN FARM EXTENSION WORK, THE 1954 FEDERAL EXTENSION APPROPRIATION BILL AUTHORIZED INCEPTION OF A PERSONAL CONTACT, FAMILY UNIT APPROACH CALLED FARM AND HOME DEVELOPMENT (FHD). THE FHD AGENT WORKED WITH THE FARMER AND HIS WIFE IN AN EDUCATIONAL PROGRAM WHICH…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-05
...] Dipping and Coating Operations (Dip Tanks) Standard; Extension of the Office of Management and Budget's... Standard on Dipping and Coating Operations (Dip Tanks) (29 CFR 1910.126(g)(4)). DATES: Comments must be... of efforts in obtaining information (29 U.S.C. 657). The Standard on Dipping and Coating Operations...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-26
...] Access to Employee Exposure and Medical Records; Extension of the Office of Management and Budget's (OMB... Regulation on Access to Employee Exposure and Medical Records (29 CFR 1910.1020). DATES: Comments must be... operating small businesses, and to reduce to the maximum extent feasible unnecessary duplication of efforts...
Local Foods in Maryland Schools and Implications for Extension: Findings from Schools and Farmers
ERIC Educational Resources Information Center
Oberholtzer, Lydia; Hanson, James C.; Brust, Gerald; Dimitri, Carolyn; Richman, Nessa
2012-01-01
This article describes results from a study examining the supply chain for local foods in Maryland school meals, the barriers and opportunities for increasing local foods in schools, and the development of Extension efforts to meet the needs identified. Interviews and surveys were administered with stakeholders, including farmers and food service…
The History, Status, Gaps, and Future Directions of Neurotoxicology in China.
Cai, Tongjian; Luo, Wenjing; Ruan, Diyun; Wu, Yi-Jun; Fox, Donald A; Chen, Jingyuan
2016-06-01
Rapid economic development in China has produced serious ecological, environmental, and health problems. Neurotoxicity has been recognized as a major public health problem. The Chinese government, research institutes, and scientists conducted extensive studies concerning the source, characteristics, and mechanisms of neurotoxicants. This paper presents, for the first time, a comprehensive history and review of major sources of neurotoxicants, national bodies/legislation engaged, and major neurotoxicology research in China. Peer-reviewed research and pollution studies by Chinese scientists from 1991 to 2015 were examined. PubMed, Web of Science and Chinese National Knowledge Infrastructure (CNKI) were the major search tools. The central problem is an increased exposure to neurotoxicants from air and water, food contamination, e-waste recycling, and manufacturing of household products. China formulated an institutional framework and standards system for management of major neurotoxicants. Basic and applied research was initiated, and international cooperation was achieved. The annual number of peer-reviewed neurotoxicology papers from Chinese authors increased almost 30-fold since 2001. Despite extensive efforts, neurotoxicity remains a significant public health problem. This provides great challenges and opportunities. We identified 10 significant areas that require major educational, environmental, governmental, and research efforts, as well as attention to public awareness. For example, there is a need to increase efforts to utilize new in vivo and in vitro models, determine the potential neurotoxicity and mechanisms involved in newly emerging pollutants, and examine the effects and mechanisms of mixtures. In the future, we anticipate working with scientists worldwide to accomplish these goals and eliminate, prevent and treat neurotoxicity. Cai T, Luo W, Ruan D, Wu YJ, Fox DA, Chen J. 2016. 
The history, status, gaps, and future directions of neurotoxicology in China. Environ Health Perspect 124:722-732; http://dx.doi.org/10.1289/ehp.1409566.
NASA Technical Reports Server (NTRS)
Jenkins, R. M.
1983-01-01
The present effort represents an extension of previous work in which a calculation model for performing rapid pitchline optimization of axial gas turbine geometry, including blade profiles, was developed. The model requires no specification of geometric constraints. Output includes aerodynamic performance (adiabatic efficiency), hub-tip flow-path geometry, blade chords, and estimates of blade shape. Presented herein is a verification of the aerodynamic performance portion of the model, whereby detailed turbine test-rig data, including rig geometry, are input to the model to determine whether tested performance can be predicted. An array of seven NASA single-stage axial gas turbine configurations is investigated, ranging in size from 0.6 kg/s to 63.8 kg/s mass flow and in specific work output from 153 J/g to 558 J/g at design (hot) conditions; the stage loading factor ranges from 1.15 to 4.66.
De-Arabization of the Bedouin: A Study of an Inevitable Failure
ERIC Educational Resources Information Center
Yonah, Yossi; Abu-Saad, Ismael; Kaplan, Avi
2004-01-01
This paper offers an assessment of the efforts to de-Arabize the Bedouin Arab youth of the Negev. We show that despite the extensive efforts to achieve this goal, they have become pronouncedly alienated from the State of Israel, and are increasingly perceiving themselves as an integral part of Israel's Palestinian Arab national minority. The…
Program Standards and Expectations: Providing Clarity, Consistency, and Focus
ERIC Educational Resources Information Center
Diem, Keith G.
2016-01-01
The effort described in this article resulted from requests for clarity and consistency from new and existing Extension/4-H educators as well as from recommendations by university auditors. The primary purpose of the effort was to clarify standards for effective county-based 4-H youth development programs and to help focus the roles of 4-H…
ERIC Educational Resources Information Center
Jackson-Smith, Douglas B.; McEvoy, Jamie P.
2011-01-01
We assess the long-term effectiveness of outreach and education efforts associated with a water quality improvement project in a watershed located in northern Utah, USA. Conducted 15 years after the original project began, our research examines the lasting impacts of different extension activities on landowners' motivations to participate and…
ERIC Educational Resources Information Center
Cohen, John M.; Marshall, Terry
One of a series designed to aid community leaders, cooperative extension agents, local government officials, and others in their efforts to gain external resources needed to support local efforts in rural development, this handbook addresses three basic problem areas: gathering information on rural development needs of a community; locating…
Introduction - regional monitoring programs
Richard L. Hutto; C. John Ralph
2005-01-01
There is increasing interest in the initiation of regional or statewide monitoring programs that are less extensive than national efforts such as the Breeding Bird Survey. A number of regional programs have been in existence for a decade or more, so the papers in this section represented an effort to bring together the collective experience of the people who had...
77 FR 6080 - Taking and Importing Marine Mammals; U.S. Navy's Atlantic Fleet Active Sonar Training
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-07
... area occurred from August 2, 2010 to August 1, 2011. Visual Surveys The majority of monitoring effort...) Operating Area (OPAREA), with an extension of survey effort off Cape Hatteras. These locations serve as the... of year-round multi- disciplinary monitoring through the use of shipboard and aerial visual surveys...
NASA Astrophysics Data System (ADS)
Thornton, P. E.; Nacp Site Synthesis Participants
2010-12-01
The North American Carbon Program (NACP) synthesis effort includes an extensive intercomparison of modeled and observed ecosystem states and fluxes performed with multiple models across multiple sites. The participating models span a range of complexity and intended application, while the participating sites cover a broad range of natural and managed ecosystems in North America, from the subtropics to arctic tundra, and coastal to interior climates. A unique characteristic of this collaborative effort is that multiple independent observations are available at all sites: fluxes are measured with the eddy covariance technique, and standard biometric and field sampling methods provide estimates of standing stock and annual production in multiple categories. In addition, multiple modeling approaches are employed to make predictions at each site, varying, for example, in the use of diagnostic vs. prognostic leaf area index. Given multiple independent observational constraints and multiple classes of model, we evaluate the internal consistency of observations at each site, and use this information to extend previously derived estimates of uncertainty in the flux observations. Model results are then compared with all available observations, and models are ranked according to their consistency with each type of observation (high-frequency flux measurement, carbon stock, annual production). We demonstrate a range of internal consistency across the sites, and show that some models which perform well against one observational metric perform poorly against others. We use this analysis to construct a hypothesis for combining eddy covariance, biometrics, and other standard physiological and ecological measurements which, as data collection proceeded over several years, would present an increasingly challenging target for next-generation models.
Ball, James W.; Nordstrom, D. Kirk; Jenne, Everett A.
1980-01-01
A computerized chemical model, WATEQ2, has resulted from extensive additions to and revision of the WATEQ model of Truesdell and Jones (Truesdell, A. H., and Jones, B. F., 1974, WATEQ, a computer program for calculating chemical equilibria of natural waters: J. Res. U.S. Geol. Survey, v. 2, p. 233-274). The model building effort has necessitated searching the literature and selecting thermochemical data pertinent to the reactions added to the model. This supplementary report makes available the details of the reactions added to the model together with the selected thermochemical data and their sources. Also listed are details of program operation and a brief description of the output of the model. Appendices contain a glossary of identifiers used in the PL/1 computer code, the complete PL/1 listing, and sample output from three water analyses used as test cases.
Engineering and fabrication cost considerations for cryogenic wind tunnel models
NASA Technical Reports Server (NTRS)
Boykin, R. M., Jr.; Davenport, J. B., Jr.
1983-01-01
Design and fabrication cost drivers for cryogenic transonic wind tunnel models are defined. The major cost factors for wind tunnel models are model complexity, tolerances, surface finishes, materials, material validation, and model inspection. The cryogenic temperatures require the use of materials with relatively high fracture toughness but at the same time high strength. Some of these materials are very difficult to machine, requiring extensive machine hours which can add significantly to the manufacturing costs. Some additional engineering costs are incurred to certify the materials through mechanical tests and nondestructive evaluation techniques, which are not normally required with conventional models. When instrumentation such as accelerometers and electronically scanned pressure modules is required, temperature control of these devices needs to be incorporated into the design, which requires added effort. Additional thermal analyses and subsystem tests may be necessary, which also adds to the design costs. The largest driver of the design costs is potentially the additional static and dynamic analyses required to ensure structural integrity of the model and support system.
GCSS/WGNE Pacific Cross-section Intercomparison: Tropical and Subtropical Cloud Transitions
NASA Astrophysics Data System (ADS)
Teixeira, J.
2008-12-01
In this presentation I will discuss the role of the GEWEX Cloud Systems Study (GCSS) working groups in paving the way for substantial improvements in cloud parameterization in weather and climate models. The GCSS/WGNE Pacific Cross-section Intercomparison (GPCI) is an extension of GCSS and is a different type of model evaluation where climate models are analyzed along a Pacific Ocean transect from California to the equator. This approach aims at complementing the more traditional efforts in GCSS by providing a simple framework for the evaluation of models that encompasses several fundamental cloud regimes such as stratocumulus, shallow cumulus and deep cumulus, as well as the transitions between them. Currently twenty four climate and weather prediction models are participating in GPCI. We will present results of the comparison between models and recent satellite data. In particular, we will explore in detail the potential of the Atmospheric Infrared Sounder (AIRS) and CloudSat data for the evaluation of the representation of clouds and convection in climate models.
Preclinical models used for immunogenicity prediction of therapeutic proteins.
Brinks, Vera; Weinbuch, Daniel; Baker, Matthew; Dean, Yann; Stas, Philippe; Kostense, Stefan; Rup, Bonita; Jiskoot, Wim
2013-07-01
All therapeutic proteins are potentially immunogenic. Antibodies formed against these drugs can decrease efficacy, leading to drastically increased therapeutic costs and in rare cases to serious and sometimes life threatening side-effects. Many efforts are therefore undertaken to develop therapeutic proteins with minimal immunogenicity. For this, immunogenicity prediction of candidate drugs during early drug development is essential. Several in silico, in vitro and in vivo models are used to predict immunogenicity of drug leads, to modify potentially immunogenic properties and to continue development of drug candidates with expected low immunogenicity. Despite the extensive use of these predictive models, their actual predictive value varies. Important reasons for this uncertainty are the limited/insufficient knowledge on the immune mechanisms underlying immunogenicity of therapeutic proteins, the fact that different predictive models explore different components of the immune system and the lack of an integrated clinical validation. In this review, we discuss the predictive models in use, summarize aspects of immunogenicity that these models predict and explore the merits and the limitations of each of the models.
Biscarini, Andrea; Contemori, Samuele; Busti, Daniele; Botti, Fabio M; Pettorossi, Vito E
2016-12-08
Quadriceps strengthening exercises designed for the early phase of anterior cruciate ligament (ACL) rehabilitation should limit the anterior tibial translation developed by quadriceps contraction near full knee extension, in order to avoid excessive strain on the healing tissue. We hypothesize that knee-flexion exercises with simultaneous voluntary contraction of quadriceps (voluntary quadriceps cocontraction) can yield considerable levels of quadriceps activation while preventing the tibia from translating forward relative to the femur. Electromyographic activity in quadriceps and hamstring muscles was measured in 20 healthy males during isometric knee-flexion exercises executed near full knee extension with maximal voluntary effort of quadriceps cocontraction and external resistance (R) ranging from 0% to 60% of the 1-repetition maximum (1RM). Biomechanical modeling was applied to derive the shear (anterior/posterior) tibiofemoral force developed in each exercise condition. Isometric knee-flexion exercises with small external resistance (R=10% 1RM) and maximal voluntary effort of quadriceps cocontraction yielded a net posterior (ACL-unloading) tibial pull (P=0.005) and levels of activation of 32%, 50%, and 45% of maximum voluntary isometric contraction, for the rectus femoris, vastus medialis, and vastus lateralis, respectively. This exercise might potentially rank as one of the most appropriate quadriceps strengthening interventions in the early phase of ACL rehabilitation. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pytlak, E.
2014-12-01
This presentation will outline ongoing, multi-year hydroclimate change research between the Columbia River Management Joint Operating Committee (RMJOC), the University of Washington, Portland State University, and their many regional research partners and stakeholders. Climate change in the Columbia River Basin is of particular concern to the Bonneville Power Administration (BPA) and many Federal, Tribal and regional stakeholders. BPA, the U.S. Army Corps of Engineers, and the U.S. Bureau of Reclamation, which comprise the RMJOC, conducted an extensive study in 2009-11 using climate change streamflows produced by the University of Washington Climate Impacts Group (CIG). The study reconfirmed that as more winter precipitation in the Columbia Basin falls as rain rather than snow by mid-century, particularly on the U.S. portion of the basin, increased winter runoff is likely, followed by an earlier spring snowmelt peak and lower summer flows as seasonal snowmelt diminishes earlier in the water year. Since that initial effort, both global and regional climate change modeling has advanced. To take advantage of the new outputs from the Fifth Coupled Model Intercomparison Project (CMIP-5), the RMJOC, through BPA support, is sponsoring new hydroclimate research which considers not only the most recent information from the GCMs, but also the uncertainties introduced by the hydroclimate modeling process itself. Historical streamflows, which are used to calibrate hydrologic models and ascertain their reliability, are subject to both measurement and modeling uncertainties. Downscaling GCMs to a hydrologically useful spatial and temporal resolution introduces uncertainty, depending on the downscaling methods. Hydrologic modeling introduces uncertainties from calibration and geophysical states, some of which, like land surface characteristics, are likely to also change with time. In the upper Columbia Basin, glacier processes introduce yet another source of uncertainty.
The latest joint effort attempts to ascertain the relative contributions of these uncertainties in comparison to the uncertainties brought by changing climate itself.
Hood, Heather M.; Ocasio, Linda R.; Sachs, Matthew S.; Galagan, James E.
2013-01-01
The filamentous fungus Neurospora crassa played a central role in the development of twentieth-century genetics, biochemistry and molecular biology, and continues to serve as a model organism for eukaryotic biology. Here, we have reconstructed a genome-scale model of its metabolism. This model consists of 836 metabolic genes, 257 pathways, 6 cellular compartments, and is supported by extensive manual curation of 491 literature citations. To aid our reconstruction, we developed three optimization-based algorithms, which together comprise Fast Automated Reconstruction of Metabolism (FARM). These algorithms are: LInear MEtabolite Dilution Flux Balance Analysis (limed-FBA), which predicts flux while linearly accounting for metabolite dilution; One-step functional Pruning (OnePrune), which removes blocked reactions with a single compact linear program; and Consistent Reproduction Of growth/no-growth Phenotype (CROP), which reconciles differences between in silico and experimental gene essentiality faster than previous approaches. Against an independent test set of more than 300 essential/non-essential genes that were not used to train the model, the model displays 93% sensitivity and specificity. We also used the model to simulate the biochemical genetics experiments originally performed on Neurospora by comprehensively predicting nutrient rescue of essential genes and synthetic lethal interactions, and we provide detailed pathway-based mechanistic explanations of our predictions. Our model provides a reliable computational framework for the integration and interpretation of ongoing experimental efforts in Neurospora, and we anticipate that our methods will substantially reduce the manual effort required to develop high-quality genome-scale metabolic models for other organisms. PMID:23935467
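The flux-balance analysis that limed-FBA builds on can be illustrated with a toy linear program. Everything below (the three-reaction network, the species names, and the bound of 10 on uptake) is invented for illustration and is not part of the published FARM model; only the steady-state constraint S·v = 0 and the linear objective reflect the standard FBA formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network: v1 (uptake -> A), v2 (A -> B), v3 (B -> biomass).
# Rows of the stoichiometric matrix S are the internal metabolites A and B.
S = np.array([
    [1, -1,  0],   # A: produced by v1, consumed by v2
    [0,  1, -1],   # B: produced by v2, consumed by v3
])
bounds = [(0, 10), (0, None), (0, None)]  # uptake flux capped at 10 units

# FBA: maximize the biomass flux v3 (linprog minimizes, hence c = [0, 0, -1])
# subject to the steady-state constraint S @ v = 0.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds,
              method="highs")
fluxes = res.x  # optimal flux distribution; all flux routes through uptake
```

Because the objective can only draw flux through the capped uptake reaction, the optimum saturates that bound, a behavior gene-deletion screens like CROP exploit by re-solving the program with individual fluxes forced to zero.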
The Application of Commercial Advertising Methods to University Extension. Bulletin, 1919, No. 51
ERIC Educational Resources Information Center
Orvis, Mary Burchard
1919-01-01
For many years, colleges, universities, and State departments of education have become more and more conscious of the importance of extension education, and of the obligation resting upon them to promote it in every way practicable. This is especially true of the State university, which, in many States, is now making an honest effort to extend the…
ERIC Educational Resources Information Center
Blauch, Lloyd E.
1933-01-01
During the past quarter of a century there have been rather continuous and persistent efforts for Federal aid to education. Twenty-one years ago the Congress of the United States enacted the Smith-Lever Agricultural Extension Act, and 3 years later it passed the Smith-Hughes Vocational Education Act. Under the Smith-Lever Act and subsequent…
ERIC Educational Resources Information Center
Mason, Makena; Aihara-Sasaki, Maria; Grace, J. Kenneth
2013-01-01
The efficacy of Educate to Eradicate, a K-12 service-learning science curricula developed as part of a statewide, community-based Extension effort for termite prevention, was evaluated. The curricula use termite biology and control as the basis for science education and have been implemented in over 350 Hawaii public school classrooms with more…
IITET and shadow TT: an innovative approach to training at the point of need
NASA Astrophysics Data System (ADS)
Gross, Andrew; Lopez, Favio; Dirkse, James; Anderson, Darran; Berglie, Stephen; May, Christopher; Harkrider, Susan
2014-06-01
The Image Intensification and Thermal Equipment Training (IITET) project is a joint effort between Night Vision and Electronics Sensors Directorate (NVESD) Modeling and Simulation Division (MSD) and the Army Research Institute (ARI) Fort Benning Research Unit. The IITET effort develops a reusable and extensible training architecture that supports the Army Learning Model and trains Manned-Unmanned Teaming (MUM-T) concepts to Shadow Unmanned Aerial Systems (UAS) payload operators. The training challenge of MUM-T during aviation operations is that UAS payload operators traditionally learn few of the scout-reconnaissance skills and coordination appropriate to MUM-T at the schoolhouse. The IITET effort leveraged the simulation experience and capabilities at NVESD and ARI's research to develop a novel payload operator training approach consistent with the Army Learning Model. Based on the training and system requirements, the team researched and identified candidate capabilities in several distinct technology areas. The training capability will support a variety of training missions as well as a full campaign. Data from these missions will be captured in a fully integrated AAR capability, which will provide objective feedback to the user in near-real-time. IITET will be delivered via a combination of browser and video streaming technologies, eliminating the requirement for a client download and reducing user computer system requirements. The result is a novel UAS Payload Operator training capability, nested within an architecture capable of supporting a wide variety of training needs for air and ground tactical platforms and sensors, and potentially several other areas requiring vignette-based serious games training.
Status and threats analysis for the Florida manatee (Trichechus manatus latirostris), 2012
Runge, Michael C.; Langtimm, Catherine A.; Martin, Julien; Fonnesbeck, Christopher J.
2015-01-01
The endangered West Indian manatee (Trichechus manatus), especially the Florida subspecies (T. m. latirostris), has been the focus of conservation efforts and extensive research since its listing under the Endangered Species Act. On the basis of the best information available as of December 2012, the threats facing the Florida manatee were determined to be less severe than previously thought, either because the conservation efforts have been successful, or because our knowledge of the demographic effects of those threats is increased, or both. Using the manatee Core Biological Model, we estimated the probability of the Florida manatee population on either the Atlantic or Gulf coast falling below 500 adults in the next 150 years to be 0.92 percent. The primary threats remain watercraft-related mortality and long-term loss of warm-water habitat. Since 2009, however, there have been a number of unusual events that have not yet been incorporated into this analysis, including several severely cold winters, a severe red-tide die off, and substantial loss of seagrass habitat in Brevard County, Fla. Further, the version of the Core Biological Model used in 2012 makes a number of assumptions that are under investigation. A revision of the Core Biological Model and an update of this quantitative threats analysis are underway as of 2015.
The Induction of Chaos in Electronic Circuits Final Report-October 1, 2001
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.M.Wheat, Jr.
2003-04-01
This project, now known by the name "Chaos in Electronic Circuits," was originally tasked as a two-year project to examine various "fault" or "non-normal" operational states of common electronic circuits, with some focus on determining the feasibility of exploiting these states. Efforts over the two-year duration of this project have been dominated by the study of the chaotic behavior of electronic circuits. These efforts have included setting up laboratory space and hardware for conducting laboratory tests and experiments, acquiring and developing computer simulation and analysis capabilities, conducting literature surveys, developing test circuitry and computer models to exercise and test our capabilities, and experimenting with and studying the use of RF injection as a means of inducing chaotic behavior in electronics. An extensive array of nonlinear time series analysis tools has been developed and integrated into a package named "After Acquisition" (AA), including capabilities such as Delayed Coordinate Embedding Mapping (DCEM), Time Resolved (3-D) Fourier Transform, and several other phase space re-creation methods. Many computer models have been developed for Spice and for the ATP (Alternative Transients Program), modeling the several working circuits that have been developed for use in the laboratory. And finally, methods of induction of chaos in electronic circuits have been explored.
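Delayed Coordinate Embedding Mapping, one of the capabilities named above, is conventionally built on Takens-style delay embedding. The sketch below is a minimal, generic implementation of that reconstruction; it is not code from the AA package, whose internals are not described in the report.

```python
def delay_embed(series, dim, tau):
    """Reconstruct a phase space from a scalar time series using delay
    coordinates: each point is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this dim/tau")
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]

# Example: embed a short series in 3 dimensions with unit delay.
points = delay_embed([0, 1, 2, 3, 4, 5], dim=3, tau=1)
```

In practice the embedding dimension and delay are chosen from the data (e.g. via false nearest neighbors and the first minimum of mutual information) before attractor geometry is examined.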
ERIC Educational Resources Information Center
Scheer, Scott D.; Cochran, Graham R.; Harder, Amy; Place, Nick T.
2011-01-01
The purpose of this study was to compare and contrast an academic extension education model with an Extension human resource management model. The academic model of 19 competencies was similar across the 22 competencies of the Extension human resource management model. There were seven unique competencies for the human resource management model.…
ERIC Educational Resources Information Center
Boger, Robert P.; Cunningham, Jo Lynn
An extensive longitudinal research effort conducted through the Early Childhood Research Center at Michigan State University focused on understanding the forces leading to positive social and emotional development during the preschool years. Because of the rather limited base which was available from other studies for launching such an effort,…
NASA Technical Reports Server (NTRS)
Westra, Doug G.; West, Jeffrey S.; Richardson, Brian R.
2015-01-01
Historically, the analysis and design of liquid rocket engines (LREs) has relied on full-scale testing and one-dimensional empirical tools. The testing is extremely expensive, and the one-dimensional tools are not designed to capture the highly complex, multi-dimensional features that are inherent to LREs. Recent advances in computational fluid dynamics (CFD) tools have made it possible to predict liquid rocket engine performance and stability, to assess the effect of complex flow features, and to evaluate injector-driven thermal environments, mitigating the cost of testing. Extensive efforts to verify and validate these CFD tools have been conducted, to provide confidence for using them during the design cycle. Previous validation efforts have documented comparisons of predicted heat flux thermal environments with test data for a single-element gaseous oxygen (GO2) and gaseous hydrogen (GH2) injector. The most notable was a comprehensive validation effort conducted by Tucker et al. [1], in which a number of different groups modeled the GO2/GH2 single-element configuration tested by Pal et al. [2]. The tools used for this validation comparison employed a range of algorithms, from both steady and unsteady Reynolds-Averaged Navier-Stokes (U/RANS) calculations, large-eddy simulations (LES), detached-eddy simulations (DES), and various combinations. A more recent effort by Thakur et al. [3] focused on using a state-of-the-art CFD simulation tool, Loci/STREAM, on a two-dimensional grid. Loci/STREAM was chosen because it has a unique, very efficient flamelet parameterization of combustion reactions that are too computationally expensive to simulate with conventional finite-rate chemistry calculations. The current effort focuses on further advancement of these validation efforts, again using the Loci/STREAM tool with the flamelet parameterization, but this time with a three-dimensional grid. Comparisons to the Pal et al. heat flux data will be made for both RANS and hybrid RANS/LES detached-eddy simulations. Computational costs will be reported, along with a comparison of accuracy and cost to much less expensive two-dimensional RANS simulations of the same geometry.
Measurement Requirements for Improved Modeling of Arcjet Facility Flows
NASA Technical Reports Server (NTRS)
Fletcher, Douglas G.
2000-01-01
Current efforts to develop new reusable launch vehicles and to pursue low-cost robotic planetary missions have led to a renewed interest in understanding arc-jet flows. Part of this renewed interest is concerned with improving the understanding of arc-jet test results and the potential use of available computational-fluid-dynamic (CFD) codes to aid in this effort. These CFD codes have been extensively developed and tested for application to nonequilibrium, hypersonic flow modeling. It is envisioned, perhaps naively, that the application of these CFD codes to the simulation of arc-jet flows would serve two purposes: first, the codes would help to characterize the nonequilibrium nature of the arc-jet flows; and second, arc-jet experiments could potentially be used to validate the flow models. These two objectives are, to some extent, mutually exclusive. However, the purpose of the present discussion is to address what role CFD codes can play in the current arc-jet flow characterization effort, and whether or not the simulation of arc-jet facility tests can be used to evaluate some of the modeling that is used to formulate these codes. This presentation is organized into several sections. In the introductory section, the development of large-scale, constricted-arc test facilities within NASA is reviewed, and the current state of flow diagnostics using conventional instrumentation is summarized. The motivation for using CFD to simulate arc-jet flows is addressed in the next section, and the basic requirements for CFD models that would be used for these simulations are briefly discussed. This section is followed by a more detailed description of experimental measurements that are needed to initiate credible simulations and to evaluate their fidelity in the different flow regions of an arc-jet facility.
Observations from a recent combined computational and experimental investigation of shock-layer flows in a large-scale arc-jet facility are then used to illustrate the current state of development of diagnostic instrumentation, CFD simulations, and general knowledge in the field of arc-jet characterization. Finally, the main points are summarized and recommendations for future efforts are given.
NAME Modeling and Climate Process Team
NASA Astrophysics Data System (ADS)
Schemm, J. E.; Williams, L. N.; Gutzler, D. S.
2007-05-01
NAME Climate Process and Modeling Team (CPT) has been established to address the need of linking climate process research to model development and testing activities for warm season climate prediction. The project builds on two existing NAME-related modeling efforts. One major component of this project is the organization and implementation of a second phase of NAMAP, based on the 2004 season. NAMAP2 will re-examine the metrics proposed by NAMAP, extend the NAMAP analysis to transient variability, exploit the extensive observational database provided by NAME 2004 to analyze simulation targets of special interest, and expand participation. Vertical column analysis will bring local NAME observations and model outputs together in a context where key physical processes in the models can be evaluated and improved. The second component builds on the current NAME-related modeling effort focused on the diurnal cycle of precipitation in several global models, including those implemented at NCEP, NASA and GFDL. Our activities will focus on the ability of the operational NCEP Global Forecast System (GFS) to simulate the diurnal and seasonal evolution of warm season precipitation during the NAME 2004 EOP, and on changes to the treatment of deep convection in the complicated terrain of the NAMS domain that are necessary to improve the simulations, and ultimately the predictions, of warm season precipitation. These activities will be strongly tied to NAMAP2 to ensure technology transfer from research to operations. Results based on experiments conducted with the NCEP CFS GCM will be reported at the conference with emphasis on the impact of horizontal resolution in predicting warm season precipitation over North America.
Supersonics Project - Airport Noise Tech Challenge
NASA Technical Reports Server (NTRS)
Bridges, James
2010-01-01
The Airport Noise Tech Challenge research effort under the Supersonics Project is reviewed. While the goal of "Improved supersonic jet noise models validated on innovative nozzle concepts" remains the same, the success of the research effort has caused its thrust to be modified going forward. The main activities from FY06-10 focused on the development and validation of jet noise prediction codes. This required innovative diagnostic techniques to be developed and deployed, extensive jet noise and flow databases to be created, and computational tools to be developed and validated. Furthermore, in FY09-10 systems studies commissioned by the Supersonics Project showed that viable supersonic aircraft were within reach using variable cycle engine architectures if exhaust nozzle technology could provide 3-5 dB of suppression. The Project then began to focus on integrating the technologies being developed in its Tech Challenge areas to bring about successful system designs. Consequently, the Airport Noise Tech Challenge area has shifted efforts from developing jet noise prediction codes to using them to develop low-noise nozzle concepts for integration into supersonic aircraft. The new plan of research is briefly presented by technology and timelines.
Programming biological models in Python using PySB.
Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K
2013-01-01
Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
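The macro-based, programmatic style described above can be illustrated with a toy sketch in plain Python (a hypothetical mini-API for illustration only, not PySB's actual one): a high-level action such as `bind` expands into elementary mass-action reactions, which are then compiled into ODEs and integrated.

```python
# Toy sketch of the "macro" idea (hypothetical API, NOT the real PySB one):
# a high-level action like bind() expands into elementary mass-action
# reactions, which are then compiled into ODEs and integrated.

reactions = []  # each entry: (reactant names, product names, rate constant)

def bind(a, b, ab, kf, kr):
    """Macro: reversible binding a + b <-> ab as two elementary reactions."""
    reactions.append(((a, b), (ab,), kf))
    reactions.append(((ab,), (a, b), kr))

def derivatives(conc):
    """Mass-action right-hand side generated from the reaction list."""
    d = dict.fromkeys(conc, 0.0)
    for reactants, products, k in reactions:
        flux = k
        for r in reactants:
            flux *= conc[r]
        for r in reactants:
            d[r] -= flux
        for p in products:
            d[p] += flux
    return d

bind('E', 'S', 'ES', kf=0.1, kr=0.1)
conc = {'E': 1.0, 'S': 2.0, 'ES': 0.0}
dt = 0.01
for _ in range(20000):  # forward Euler to (near) equilibrium, t = 200
    d = derivatives(conc)
    for s in conc:
        conc[s] += dt * d[s]
```

With kf = kr the equilibrium satisfies [E][S] = [ES], so with these totals ES settles at 2 − √2 ≈ 0.586 while E + ES and S + ES are conserved; real PySB delegates this compilation and integration to established numerical tools.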
Overview: Parity Violation and Fundamental Symmetries
NASA Astrophysics Data System (ADS)
Carlini, Roger
2017-09-01
The fields of nuclear and particle physics have undertaken extensive programs of research to search for evidence of new phenomena via the precision measurement of observables that are well predicted within the standard model of electroweak interactions. It is already known that the standard model is incomplete, as it does not include gravity or dark matter/energy, and it is therefore likely the low-energy approximation of a more complex theory. This talk will be an overview of the motivation, experimental methods, and status of some of these efforts (past and future) related to precision indirect searches that are complementary to the direct searches underway at the Large Hadron Collider. This abstract is for the invited talk associated with the mini-symposium titled "Electro-weak Physics and Fundamental Symmetries" organized by Julie Roche.
Magnetic Suspension Technology Development
NASA Technical Reports Server (NTRS)
Britcher, Colin
1998-01-01
This Cooperative Agreement, intended to support focused research efforts in the area of magnetic suspension systems, was initiated between NASA Langley Research Center (LaRC) and Old Dominion University (ODU) starting January 1, 1997. The original proposal called for a three-year effort, but funding for the second year proved to be unavailable, leading to termination of the agreement following a 5-month no-cost extension. This report covers work completed during the entire 17-month period of the award. This research built on work that had taken place over recent years involving both NASA LaRC and the Principal Investigator (PI). The research was of a rather fundamental nature, although specific applications were kept in mind at all times, such as wind tunnel Magnetic Suspension and Balance Systems (MSBS), space payload pointing and vibration isolation systems, magnetic bearings for unconventional applications, magnetically levitated ground transportation and electromagnetic launch systems. Fundamental work was undertaken in areas such as the development of optimized magnetic configurations, analysis and modelling of eddy current effects, control strategies for magnetically levitated wind tunnel models and system calibration procedures. Despite the termination of this Cooperative Agreement, several aspects of the research work are currently continuing with alternative forms of support.
NASA Technical Reports Server (NTRS)
Pulkkinen, A.; Mahmood, S.; Ngwira, C.; Balch, C.; Lordan, R.; Fugate, D.; Jacobs, W.; Honkonen, I.
2015-01-01
A NASA Goddard Space Flight Center Heliophysics Science Division-led team that includes the NOAA Space Weather Prediction Center, the Catholic University of America, the Electric Power Research Institute (EPRI), and Electric Research and Management, Inc., recently partnered with the Department of Homeland Security (DHS) Science and Technology Directorate (S&T) to better understand the impact of Geomagnetically Induced Currents (GIC) on the electric power industry. This effort builds on a previous NASA-sponsored Applied Sciences Program for predicting GIC, known as Solar Shield. The focus of the new DHS S&T-funded effort is to revise and extend the existing Solar Shield system to enhance its forecasting capability and provide tailored, timely, actionable information for electric utility decision makers. To enhance the forecasting capabilities of the new Solar Shield, a key undertaking is to extend the prediction system coverage across the Contiguous United States (CONUS), as the previous version was only applicable to high latitudes. The team also leverages the latest enhancements in space weather modeling capacity residing at the Community Coordinated Modeling Center to increase the Technology Readiness Level, or Applications Readiness Level, of the system (http://www.nasa.gov/sites/default/files/files/ExpandedARLDefinitions4813.pdf).
Comparative Approaches to Understanding the Relation Between Aging and Physical Function.
Justice, Jamie N; Cesari, Matteo; Seals, Douglas R; Shively, Carol A; Carter, Christy S
2016-10-01
Despite dedicated efforts to identify interventions to delay aging, most promising interventions yielding dramatic life-span extension in animal models of aging are often ineffective when translated to clinical trials. This may be due to differences in primary outcomes between species and difficulties in determining the optimal clinical trial paradigms for translation. Measures of physical function, including brief standardized testing batteries, are currently being proposed as biomarkers of aging in humans, are predictive of adverse health events, disability, and mortality, and are commonly used as functional outcomes for clinical trials. Motor outcomes are now being incorporated into preclinical testing, a positive step toward enhancing our ability to translate aging interventions to clinical trials. To further these efforts, we begin a discussion of physical function and disability assessment across species, with special emphasis on mice, rats, monkeys, and man. By understanding how physical function is assessed in humans, we can tailor measurements in animals to better model those outcomes to establish effective, standardized translational functional assessments with aging. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
4D Dynamic Required Navigation Performance Final Report
NASA Technical Reports Server (NTRS)
Finkelsztein, Daniel M.; Sturdy, James L.; Alaverdi, Omeed; Hochwarth, Joachim K.
2011-01-01
New advanced four dimensional trajectory (4DT) procedures under consideration for the Next Generation Air Transportation System (NextGen) require an aircraft to precisely navigate relative to a moving reference such as another aircraft. Examples are Self-Separation for enroute operations and Interval Management for in-trail and merging operations. The current construct of Required Navigation Performance (RNP), defined for fixed-reference-frame navigation, is not sufficiently specified to be applicable to defining performance levels of such air-to-air procedures. An extension of RNP to air-to-air navigation would enable these advanced procedures to be implemented with a specified level of performance. The objective of this research effort was to propose new 4D Dynamic RNP constructs that account for the dynamic spatial and temporal nature of Interval Management and Self-Separation, develop mathematical models of the Dynamic RNP constructs, "Required Self-Separation Performance" and "Required Interval Management Performance," and to analyze the performance characteristics of these air-to-air procedures using the newly developed models. This final report summarizes the activities led by Raytheon, in collaboration with GE Aviation and SAIC, and presents the results from this research effort to expand the RNP concept to a dynamic 4D frame of reference.
ERIC Educational Resources Information Center
Baller, Robert D.; Shin, Dong-Joon; Richardson, Kelly K.
2005-01-01
In an effort to explain the spatial patterning of violence, we expanded Sutherland's (1947) concept of differential social organization to include the level of deviance exhibited by neighboring areas. To test the value of this extension, the geographic clustering of Japanese suicide and homicide rates is assessed using 1985 and 1995 data for…
Step up to the bar: avoiding discrimination in professional licensure.
Appelbaum, Paul S
2015-04-01
In their efforts to protect the public from impaired professionals, licensure boards often have created special rules for applicants with mental disorders. The authorities in charge of admission to the Louisiana bar required extensive disclosure of mental health status, even if an applicant's professional functioning was not impaired. After the U.S. Department of Justice found that Louisiana's practices violated applicants' rights under the Americans with Disabilities Act, the state agreed to focus on applicants' functional impairment rather than on mental disorders. This settlement may provide a model for licensure boards in other states and for other professions, including the health professions.
Sociocultural Behavior Influence Modelling & Assessment: Current Work and Research Frontiers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis
A common problem associated with the effort to better assess the potential behaviors of various individuals within different countries is the sheer difficulty of comprehending the dynamic nature of populations, particularly over time and considering feedback effects. This paper discusses a theory-based analytical capability designed to enable analysts to better assess the influence of events on individuals interacting within a country or region. These events can include changes in policy, man-made or natural disasters, migration, war, or other changes in environmental/economic conditions. In addition, this paper describes potential extensions of this type of research to enable more timely and accurate assessments.
NASA Technical Reports Server (NTRS)
Buchanan, H. J.
1983-01-01
Work performed in Large Space Structures Controls research and development program at Marshall Space Flight Center is described. Studies to develop a multilevel control approach which supports a modular or building block approach to the buildup of space platforms are discussed. A concept has been developed and tested in three-axis computer simulation utilizing a five-body model of a basic space platform module. Analytical efforts have continued to focus on extension of the basic theory and subsequent application. Consideration is also given to specifications to evaluate several algorithms for controlling the shape of Large Space Structures.
Continuum and discrete approach in modeling biofilm development and structure: a review.
Mattei, M R; Frunzo, L; D'Acunto, B; Pechaud, Y; Pirozzi, F; Esposito, G
2018-03-01
The scientific community has recognized that almost 99% of the microbial life on earth is represented by biofilms. Considering the impacts of their sessile lifestyle on both natural and human activities, extensive experimental activity has been carried out to understand how biofilms grow and interact with the environment. Many mathematical models have also been developed to simulate and elucidate the main processes characterizing the biofilm growth. Two main mathematical approaches for biomass representation can be distinguished: continuum and discrete. This review is aimed at exploring the main characteristics of each approach. Continuum models can simulate the biofilm processes in a quantitative and deterministic way. However, they require a multidimensional formulation to take into account the biofilm spatial heterogeneity, which makes the models quite complicated, requiring significant computational effort. Discrete models are more recent and can represent the typical multidimensional structural heterogeneity of biofilm reflecting the experimental expectations, but they generate computational results including elements of randomness and introduce stochastic effects into the solutions.
An improved model for the Earth's gravity field
NASA Technical Reports Server (NTRS)
Tapley, B. D.; Shum, C. K.; Yuan, D. N.; Ries, J. C.; Schutz, B. E.
1989-01-01
An improved model for the Earth's gravity field, TEG-1, was determined using data sets from fourteen satellites, spanning the inclination ranges from 15 to 115 deg, and global surface gravity anomaly data. The satellite measurements include laser ranging data, Doppler range-rate data, and satellite-to-ocean radar altimeter data measurements, which include the direct height measurement and the differenced measurements at ground track crossings (crossover measurements). Also determined was another gravity field model, TEG-1S, which included all the data sets in TEG-1 with the exception of direct altimeter data. The effort has included an intense scrutiny of the gravity field solution methodology. The estimated parameters included geopotential coefficients complete to degree and order 50 with selected higher order coefficients, ocean and solid Earth tide parameters, Doppler tracking station coordinates and the quasi-stationary sea surface topography. Extensive error analysis and calibration of the formal covariance matrix indicate that the gravity field model is a significant improvement over previous models and can be used for general applications in geodesy.
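Those geopotential coefficients enter through a spherical-harmonic expansion; a minimal sketch keeping only the dominant zonal term J2 (textbook form, rounded constants) shows how a single coefficient makes the potential latitude-dependent:

```python
import math

MU = 3.986004418e14  # Earth's GM, m^3/s^2 (rounded standard value)
RE = 6378137.0       # equatorial radius, m
J2 = 1.08263e-3      # dominant zonal coefficient (Earth's oblateness)

def potential(r, lat):
    """Geopotential truncated after the J2 zonal term; a full field model
    such as TEG-1 sums terms like this to degree and order 50."""
    p2 = 0.5 * (3.0 * math.sin(lat) ** 2 - 1.0)  # Legendre P2(sin lat)
    return -MU / r * (1.0 - J2 * (RE / r) ** 2 * p2)

# Oblateness makes the potential latitude-dependent at fixed radius:
r_orbit = RE + 500e3  # a 500 km altitude sample point
u_eq = potential(r_orbit, 0.0)
u_pole = potential(r_orbit, math.pi / 2)
```

At this radius the equator-to-pole difference is a fractional effect of order J2, roughly one part in a thousand, which is exactly the size of signal such gravity solutions must resolve from tracking data.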
Implementing ADM1 for plant-wide benchmark simulations in Matlab/Simulink.
Rosen, C; Vrecko, D; Gernaey, K V; Pons, M N; Jeppsson, U
2006-01-01
The IWA Anaerobic Digestion Model No.1 (ADM1) was presented in 2002 and is expected to represent the state-of-the-art model within this field in the future. Due to its complexity the implementation of the model is not a simple task and several computational aspects need to be considered, in particular if the ADM1 is to be included in dynamic simulations of plant-wide or even integrated systems. In this paper, the experiences gained from a Matlab/Simulink implementation of ADM1 into the extended COST/IWA Benchmark Simulation Model (BSM2) are presented. Aspects related to system stiffness, model interfacing with the ASM family, mass balances, acid-base equilibrium and algebraic solvers for pH and other troublesome state variables, numerical solvers and simulation time are discussed. The main conclusion is that if implemented properly, the ADM1 will also produce high-quality results in dynamic plant-wide simulations including noise, discrete sub-systems, etc. without imposing any major restrictions due to extensive computational efforts.
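The "algebraic solvers for pH" mentioned above boil down to root-finding on a charge balance. As a minimal sketch (a single weak acid plus water autoionization, with illustrative constants, not the BSM2 implementation, which balances many more ion pairs):

```python
import math

def solve_ph(acid_total, ka, kw=1e-14):
    """Solve the charge balance H+ = OH- + A- for one weak acid by
    bisection on [H+]; the residual is monotone increasing in H+."""
    def residual(h):
        oh = kw / h
        a_minus = acid_total * ka / (ka + h)  # dissociated acid fraction
        return h - oh - a_minus
    lo, hi = 1e-14, 1.0
    for _ in range(200):
        mid = math.sqrt(lo * hi)  # geometric midpoint: pH is logarithmic
        if residual(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return -math.log10(math.sqrt(lo * hi))

print(solve_ph(0.0, 1.75e-5))  # pure water -> pH ~ 7
print(solve_ph(0.1, 1.75e-5))  # 0.1 M weak acid (Ka ~ acetic) -> pH ~ 2.9
```

Solving the pH algebraically at every time step, rather than integrating the very fast acid-base states, is one standard way to tame the stiffness the paper discusses.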
Modeling Physiological Systems in the Human Body as Networks of Quasi-1D Fluid Flows
NASA Astrophysics Data System (ADS)
Staples, Anne
2008-11-01
Extensive research has been done on modeling human physiology. Most of this work has been aimed at developing detailed, three-dimensional models of specific components of physiological systems, such as a cell, a vein, a molecule, or a heart valve. While efforts such as these are invaluable to our understanding of human biology, if we were to construct a global model of human physiology with this level of detail, computing even a nanosecond in this computational being's life would certainly be prohibitively expensive. With this in mind, we derive the Pulsed Flow Equations, a set of coupled one-dimensional partial differential equations, specifically designed to capture two-dimensional viscous, transport, and other effects, and aimed at providing accurate and fast-to-compute global models for physiological systems represented as networks of quasi one-dimensional fluid flows. Our goal is to be able to perform faster-than-real time simulations of global processes in the human body on desktop computers.
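A steady-state caricature of such a network (not the Pulsed Flow Equations themselves, which are unsteady and carry viscous corrections) treats each quasi-1D segment as a Poiseuille resistance; mass conservation at each junction then yields a linear system for the nodal pressures, exactly as in a resistor network. All values below are illustrative:

```python
def solve_linear(a, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# Segments: (node_i, node_j, conductance G ~ pi*r^4/(8*mu*L), toy units)
segments = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 1.0), (2, 4, 1.0), (3, 4, 1.0)]
fixed = {0: 100.0, 4: 0.0}  # boundary pressures (inlet / outlet)
free = [1, 2, 3]

# Mass conservation at each free node: sum over segments of G*(P_i - P_j) = 0
a = [[0.0] * len(free) for _ in free]
b = [0.0] * len(free)
idx = {n: k for k, n in enumerate(free)}
for i, j, g in segments:
    for u, v in ((i, j), (j, i)):
        if u in idx:
            a[idx[u]][idx[u]] += g
            if v in idx:
                a[idx[u]][idx[v]] -= g
            else:
                b[idx[u]] += g * fixed[v]

p = dict(fixed)
p.update(zip(free, solve_linear(a, b)))
```

With these conductances the interior pressures come out to 200/3 and 100/3 in the chosen units, and the inflow at the source balances the outflow at the sink, which is the kind of global-network consistency a quasi-1D physiological model must maintain cheaply.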
Bending strength model for internal spur gear teeth
NASA Technical Reports Server (NTRS)
Savage, Michael; Rubadeux, K. L.; Coe, H. H.
1995-01-01
Internal spur gear teeth are normally stronger than pinion teeth of the same pitch and face width since external teeth are smaller at the base. However, ring gears which are narrower, have an unequal addendum, or are made of a material with a lower strength than that of the meshing pinion may be loaded more critically in bending. In this study, a model for the bending strength of an internal gear tooth as a function of the applied load pressure angle is presented which is based on the inscribed Lewis constant-strength parabolic beam. The bending model includes a stress concentration factor and an axial compression term which are extensions of the model for an external gear tooth. The geometry of the Lewis factor determination is presented, the iteration to determine the factor is described, and the bending strength J factor is compared to that of an external gear tooth. This strength model will assist optimal design efforts for unequal addendum gears and gears of mixed materials.
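The flavor of such a root-stress model can be sketched as follows (one common textbook form with assumed, illustrative numbers; the paper's internal-tooth model refines the section height, thickness, and concentration factor via the inscribed constant-strength parabola):

```python
import math

def root_bending_stress(w_n, phi_l, h, t, face_width, kf):
    """Stress at the critical root section of a gear tooth: cantilever
    bending from the tangential load component W*cos(phi) acting at
    height h above a rectangular section of thickness t, reduced by the
    axial compression from W*sin(phi), and scaled by a stress
    concentration factor kf. SI units throughout."""
    bending = 6.0 * w_n * math.cos(phi_l) * h / (face_width * t ** 2)
    axial = w_n * math.sin(phi_l) / (face_width * t)
    return kf * (bending - axial)

# Illustrative (assumed) numbers: 1 kN normal load at a 20 deg load angle
stress = root_bending_stress(w_n=1000.0, phi_l=math.radians(20.0),
                             h=6.0e-3, t=4.0e-3, face_width=25.0e-3, kf=1.5)
# stress ~ 1.2e8 Pa; the compressive term lowers the net tensile stress
```

Note the sign structure: the axial term subtracts, which is why the load pressure angle matters for where and how critically the root is stressed.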
Safety behavior: Job demands, job resources, and perceived management commitment to safety.
Hansez, Isabelle; Chmiel, Nik
2010-07-01
The job demands-resources model posits that job demands and resources influence outcomes through job strain and work engagement processes. We test whether the model can be extended to effort-related "routine" safety violations and "situational" safety violations provoked by the organization. In addition we test more directly the involvement of job strain than previous studies which have used burnout measures. Structural equation modeling provided, for the first time, evidence of predicted relationships between job strain and "routine" violations and work engagement with "routine" and "situational" violations, thereby supporting the extension of the job demands-resources model to safety behaviors. In addition our results showed that a key safety-specific construct 'perceived management commitment to safety' added to the explanatory power of the job demands-resources model. A predicted path from job resources to perceived management commitment to safety was highly significant, supporting the view that job resources can influence safety behavior through both general motivational involvement in work (work engagement) and through safety-specific processes.
Fuels and Lubrication Researcher at the Aircraft Engine Research Laboratory
1943-08-21
A researcher at the National Advisory Committee for Aeronautics (NACA) Aircraft Engine Research Laboratory studies the fuel ignition process. Improved fuels and lubrication was an area of particular emphasis at the laboratory during World War II. The military sought to use existing types of piston engines in order to get large numbers of aircraft into the air as quickly as possible. To accomplish its goals, however, the military needed to increase the performance of these engines without having to wait for new models or extensive redesigns. The Aircraft Engine Research Laboratory was called on to lead this effort. The use of superchargers successfully enhanced engine performance, but the resulting heat increased engine knock [fuel detonation] and structural wear. These effects could be offset with improved cooling, lubrication, and fuel mixtures. The NACA researchers in the Fuels and Lubrication Division concentrated on new synthetic fuels, higher octane fuels, and fuel-injection systems. The laboratory studied 16 different types of fuel blends during the war, including extensive investigations of triptane and xylidine.
Madhusoodhanan, C G; Sreeja, K G; Eldho, T I
2016-10-01
Climate change is a major concern in the twenty-first century and its assessments are associated with multiple uncertainties, exacerbated and confounded in the regions where human interventions are prevalent. The present study explores the challenges for climate change impact assessment on the water resources of India, one of the world's largest human-modified systems. The extensive human interventions in the Energy-Land-Water-Climate (ELWC) nexus significantly impact the water resources of the country. The direct human interventions in the landscape may surpass/amplify/mask the impacts of climate change and in the process also affect climate change itself. Uncertainties in climate and resource assessments add to the challenge. Formulating coherent resource and climate change policies in India would therefore require an integrated approach that would assess the multiple interlinkages in the ELWC nexus and distinguish the impacts of global climate change from that of regional human interventions. Concerted research efforts are also needed to incorporate the prominent linkages in the ELWC nexus in climate/earth system modelling.
VO-Dance: an IVOA tool to easily publish data into the VO, and its extension to planetology requests
NASA Astrophysics Data System (ADS)
Smareglia, R.; Capria, M. T.; Molinaro, M.
2012-09-01
Data publishing through self-standing portals can be joined to VO resource publishing, i.e. astronomical resources deployed through VO-compliant services. Since the IVOA (International Virtual Observatory Alliance) provides many protocols and standards for the various data flavors (images, spectra, catalogues …), and since the data center has the goal of growing the number of archives it hosts and services it provides, the idea arose to find a way to easily deploy and maintain VO resources. VO-Dance is a Java web application developed at IA2 that addresses this idea by creating, in a dynamical way, VO resources out of database tables or views. It is structured to be potentially DBMS- and platform-independent and consists of three main components, including an internal DB to store resource descriptions and model metadata information and a RESTful web application to deploy the resources to the VO community. Its extension to planetology requests is under study to make the best use of INAF software development effort and archive efficiency.
Stability and Bifurcation of a Fishery Model with Crowley-Martin Functional Response
NASA Astrophysics Data System (ADS)
Maiti, Atasi Patra; Dubey, B.
To understand the dynamics of a fishery system, a nonlinear mathematical model is proposed and analyzed. In an aquatic environment, we considered two populations: one prey and one predator. Here both fish populations grow logistically, and the interaction between them is modeled by a Crowley-Martin functional response. It is assumed that both populations are harvested, the harvesting effort is a dynamical variable, and tax is considered as a control variable. The existence of equilibrium points and their local stability are examined. The existence, stability, and direction of Hopf-bifurcation are also analyzed with the help of the Center Manifold theorem and normal form theory. The global stability behavior of the positive equilibrium point is also discussed. In order to find the value of the optimal tax, an optimal harvesting policy is used. To verify our analytical findings, an extensive numerical simulation is carried out for this model system.
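An illustrative system of this type can be simulated directly (the functional form below follows the general description, but all parameter values and the exact effort-revenue law are assumed, not the paper's): the Crowley-Martin response divides the encounter rate by factors in both prey and predator density, and effort grows or decays with after-tax net revenue.

```python
def crowley_martin(x, y, a, b, c):
    """Crowley-Martin functional response: interference terms in both
    prey density (via b) and predator density (via c)."""
    return a * x * y / ((1.0 + b * x) * (1.0 + c * y))

def step(x, y, e, dt, p):
    """One forward-Euler step of an illustrative harvested prey-predator
    system with dynamic effort e and tax as a control parameter."""
    f = crowley_martin(x, y, p['a'], p['b'], p['c'])
    dx = p['r1'] * x * (1 - x / p['k1']) - f - p['q1'] * e * x
    dy = p['r2'] * y * (1 - y / p['k2']) + p['e1'] * f - p['q2'] * e * y
    # Effort tracks net revenue after tax: (price - tax) * catch - cost
    de = p['lam'] * ((p['price'] - p['tax']) * (p['q1'] * x + p['q2'] * y)
                     - p['cost']) * e
    return x + dt * dx, y + dt * dy, e + dt * de

params = dict(r1=1.0, k1=1.0, r2=0.5, k2=1.0, a=0.5, b=1.0, c=1.0,
              e1=0.3, q1=0.1, q2=0.1, lam=0.5, price=2.0, tax=0.5, cost=0.1)
x, y, e = 0.5, 0.5, 0.2
for _ in range(5000):  # integrate to t = 50
    x, y, e = step(x, y, e, 0.01, params)
```

With these tame parameters the trajectory stays positive and bounded; sweeping the tax in such a simulation is one quick way to probe the optimal-tax question numerically before attempting the analytical treatment.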
COINSTAC: Decentralizing the future of brain imaging analysis
Ming, Jing; Verner, Eric; Sarwate, Anand; Kelly, Ross; Reed, Cory; Kahleck, Torran; Silva, Rogers; Panta, Sandeep; Turner, Jessica; Plis, Sergey; Calhoun, Vince
2017-01-01
In the era of Big Data, sharing neuroimaging data across multiple sites has become increasingly important. However, researchers who want to engage in centralized, large-scale data sharing and analysis must often contend with problems such as high database cost, long data transfer time, extensive manual effort, and privacy issues for sensitive data. To remove these barriers to enable easier data sharing and analysis, we introduced a new, decentralized, privacy-enabled infrastructure model for brain imaging data called COINSTAC in 2016. We have continued development of COINSTAC since this model was first introduced. One of the challenges with such a model is adapting the required algorithms to function within a decentralized framework. In this paper, we report on how we are solving this problem, along with our progress on several fronts, including additional decentralized algorithms implementation, user interface enhancement, decentralized regression statistic calculation, and complete pipeline specifications. PMID:29123643
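A core trick behind decentralized regression of this kind can be sketched as follows (an idealized one-shot scheme, not COINSTAC's actual implementation): each site shares only the sufficient statistics XᵀX and Xᵀy, and aggregating them reproduces the pooled least-squares fit without raw data ever leaving a site.

```python
def local_stats(xs, ys):
    """Per-site sufficient statistics for least squares, X'X and X'y,
    with an intercept column; raw (x, y) pairs stay at the site."""
    xtx = [[0.0, 0.0], [0.0, 0.0]]
    xty = [0.0, 0.0]
    for x, y in zip(xs, ys):
        row = (1.0, x)
        for i in range(2):
            xty[i] += row[i] * y
            for j in range(2):
                xtx[i][j] += row[i] * row[j]
    return xtx, xty

def aggregate(stats):
    """Sum the site statistics and solve the 2x2 normal equations."""
    xtx = [[0.0, 0.0], [0.0, 0.0]]
    xty = [0.0, 0.0]
    for sxtx, sxty in stats:
        for i in range(2):
            xty[i] += sxty[i]
            for j in range(2):
                xtx[i][j] += sxtx[i][j]
    (a, b), (c, d) = xtx
    det = a * d - b * c
    return ((d * xty[0] - b * xty[1]) / det,
            (-c * xty[0] + a * xty[1]) / det)

# Two "sites" whose data lie exactly on y = 1 + 2x
site1 = local_stats([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
site2 = local_stats([3.0, 4.0], [7.0, 9.0])
intercept, slope = aggregate([site1, site2])  # recovers (1.0, 2.0)
```

Real decentralized pipelines add iteration, regularization, and privacy mechanisms on top of this idea, but the exactness of the one-shot aggregate is what makes the approach competitive with centralized analysis.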
The Friction Force Determination of Large-Sized Composite Rods in Pultrusion
NASA Astrophysics Data System (ADS)
Grigoriev, S. N.; Krasnovskii, A. N.; Kazakov, I. A.
2014-08-01
Nowadays, the simple pull-force models of the pultrusion process are not suitable for large-sized rods because they do not consider the chemical shrinkage and thermal expansion acting in the cured material inside the die. Yet the pulling force on the resin-impregnated fibers as they travel through the heated die is an essential factor in the pultrusion process. In order to minimize the number of trial-and-error experiments, a new mathematical approach to determining the frictional force is presented. The governing equations of the model are stated in general terms and various simplifications are implemented in order to obtain solutions without extensive numerical effort. The influence of different pultrusion parameters on the frictional force is investigated. The results obtained by the model can establish a foundation by which process control parameters are selected to achieve an appropriate pull-force, and can be used for optimization of the pultrusion process.
Extension of HCDstruct for Transonic Aeroservoelastic Analysis of Unconventional Aircraft Concepts
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2017-01-01
A substantial effort has been made to implement an enhanced aerodynamic modeling capability in the Higher-fidelity Conceptual Design and structural optimization (HCDstruct) tool. This additional capability is needed for a rapid, physics-based method of modeling advanced aircraft concepts at risk of structural failure due to dynamic aeroelastic instabilities. To adequately predict these instabilities, in particular for transonic applications, a generalized aerodynamic matching algorithm was implemented to correct the doublet-lattice model available in Nastran using solution data from a priori computational fluid dynamics analysis. This new capability is demonstrated for two tube-and-wing aircraft configurations, including a Boeing 737-200 for implementation validation and the NASA D8 as a first use case. Results validate the current implementation of the aerodynamic matching utility and demonstrate the importance of using such a method for aircraft configurations featuring fuselage-wing aerodynamic interaction.
Semantic Support for Complex Ecosystem Research Environments
NASA Astrophysics Data System (ADS)
Klawonn, M.; McGuinness, D. L.; Pinheiro, P.; Santos, H. O.; Chastain, K.
2015-12-01
As ecosystems come under increasing stresses from diverse sources, there is growing interest in research efforts aimed at monitoring, modeling, and improving understanding of ecosystems and protection options. We aimed to provide a semantic infrastructure capable of representing data initially related to one large aquatic ecosystem research effort - the Jefferson project at Lake George. This effort includes significant historical observational data, extensive sensor-based monitoring data, experimental data, as well as model and simulation data covering topics including lake circulation, watershed runoff, lake biome food webs, etc. The initial measurement representation has been centered on monitoring data and related provenance. We developed a human-aware sensor network ontology (HASNetO) that leverages existing ontologies (PROV-O, OBOE, VSTO*) in support of measurement annotations. We explicitly support the human-aware aspects of human sensor deployment and collection activity to help capture key provenance that often is lacking. Our foundational ontology has since been generalized into a family of ontologies and used to create our human-aware data collection infrastructure that now supports the integration of measurement data along with simulation data. Interestingly, we have also utilized the same infrastructure to work with partners who have some more specific needs for specifying the environmental conditions where measurements occur, for example, knowing that an air temperature is not an external air temperature, but of the air temperature when windows are shut and curtains are open. We have also leveraged the same infrastructure to work with partners more interested in modeling smart cities with data feeds more related to people, mobility, environment, and living. We will introduce our human-aware data collection infrastructure, and demonstrate how it uses HASNetO and its supporting SOLR-based search platform to support data integration and semantic browsing. 
Further we will present learnings from its use in three relatively diverse large ecosystem research efforts and highlight some benefits and challenges related to our semantically-enhanced foundation.
Contrast and the justification of effort.
Klein, Emily D; Bhatt, Ramesh S; Zentall, Thomas R
2005-04-01
When humans are asked to evaluate rewards or outcomes that follow unpleasant (e.g., high-effort) events, they often assign higher value to that reward. This phenomenon has been referred to as cognitive dissonance or justification of effort. There is now evidence that a similar phenomenon can be found in nonhuman animals. When demonstrated in animals, however, it has been attributed to contrast between the unpleasant high effort and the conditioned stimulus for food. In the present experiment, we asked whether an analogous effect could be found in humans under conditions similar to those found in animals. Adult humans were trained to discriminate between shapes that followed a high-effort versus a low-effort response. In test, participants were found to prefer shapes that followed the high-effort response in training. These results suggest the possibility that contrast effects of the sort extensively studied in animals may play a role in cognitive dissonance and other related phenomena in humans.
Task Prioritization in Dual-Tasking: Instructions versus Preferences
Jansen, Reinier J.; van Egmond, René; de Ridder, Huib
2016-01-01
The role of task prioritization in performance tradeoffs during multi-tasking has received widespread attention. However, little is known about whether people have preferences regarding tasks, and if so, whether these preferences conflict with priority instructions. Three experiments were conducted with a high-speed driving game and an auditory memory task. In Experiment 1, participants did not receive priority instructions. Participants performed different sequences of single-task and dual-task conditions. Task performance was evaluated according to participants’ retrospective accounts of their preferences. These preferences were reformulated as priority instructions in Experiments 2 and 3. The results showed that people differ in their preferences regarding task prioritization in an experimental setting, which can be overruled by priority instructions, but only after increased dual-task exposure. Additional measures of mental effort showed that performance tradeoffs had an impact on mental effort. The interpretation of these findings was used to explore an extension of Threaded Cognition Theory with Hockey’s Compensatory Control Model. PMID:27391779
Semantic-gap-oriented active learning for multilabel image annotation.
Tang, Jinhui; Zha, Zheng-Jun; Tao, Dacheng; Chua, Tat-Seng
2012-04-01
User interaction is an effective way to handle the semantic gap problem in image annotation. To minimize user effort in these interactions, many active learning methods have been proposed. These methods treat the semantic concepts individually or correlatively. However, they still neglect the key motivation of user feedback: to tackle the semantic gap. The size of the semantic gap for each concept is an important factor that affects the performance of user feedback. Users should devote more effort to concepts with large semantic gaps, and vice versa. In this paper, we propose a semantic-gap-oriented active learning method, which incorporates the semantic gap measure into the information-minimization-based sample selection strategy. The basic learning model used in the active learning framework is an extended multilabel version of the sparse-graph-based semisupervised learning method that incorporates the semantic correlation. Extensive experiments conducted on two benchmark image data sets demonstrated the importance of bringing the semantic gap measure into the active learning process.
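As an illustration of the kind of selection strategy described above, the sketch below scores unlabeled samples by prediction uncertainty weighted by a per-concept semantic gap measure. The names (`uncertainty`, `select_batch`) and the scoring form are hypothetical simplifications, not the paper's actual algorithm.

```python
import numpy as np

# Hypothetical sketch: semantic-gap-weighted sample selection for
# multilabel active learning. Names and the scoring rule are
# illustrative, not taken from the paper.

def uncertainty(prob):
    """Binary entropy of predicted label probabilities (n_samples x n_labels)."""
    p = np.clip(prob, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def select_batch(prob, semantic_gap, k=2):
    """Weight each label's uncertainty by its concept's semantic gap,
    sum over labels, and return the k highest-scoring sample indices."""
    scores = uncertainty(prob) * semantic_gap      # broadcast gap over samples
    sample_scores = scores.sum(axis=1)
    return np.argsort(sample_scores)[::-1][:k]

prob = np.array([[0.5, 0.9],    # sample 0: very uncertain on concept 0
                 [0.95, 0.05],  # sample 1: confident on both concepts
                 [0.6, 0.4]])   # sample 2: uncertain on both concepts
gap = np.array([1.0, 0.2])      # concept 0 carries the larger semantic gap
picked = select_batch(prob, gap, k=2)
```

Concepts with larger gaps dominate the score, so user feedback concentrates where the semantic gap is widest, which is the intuition the abstract describes.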
The Impact of Attention on Judgments of Frequency and Duration
Winkler, Isabell; Glauer, Madlen; Betsch, Tilmann; Sedlmeier, Peter
2015-01-01
Previous studies that examined human judgments of frequency and duration found an asymmetrical relationship: While frequency judgments were quite accurate and independent of stimulus duration, duration judgments were highly dependent upon stimulus frequency. A potential explanation for these findings is that the asymmetry is moderated by the amount of attention directed to the stimuli. In the current experiment, participants' attention was manipulated in two ways: (a) intrinsically, by varying the type and arousal potential of the stimuli (names, low-arousal and high-arousal pictures), and (b) extrinsically, by varying the physical effort participants expended during the stimulus presentation (by lifting a dumbbell vs. relaxing the arm). Participants processed stimuli with varying presentation frequencies and durations and were subsequently asked to estimate the frequency and duration of each stimulus. Sensitivity to duration increased for pictures in general, especially when processed under physical effort. A large effect of stimulus frequency on duration judgments was obtained for all experimental conditions, but a similar large effect of presentation duration on frequency judgments emerged only in the conditions that could be expected to draw high amounts of attention to the stimuli: when pictures were judged under high physical effort. Almost no difference in the mutual impact of frequency and duration was obtained for low-arousal or high-arousal pictures. The mechanisms underlying the simultaneous processing of frequency and duration are discussed with respect to existing models derived from animal research. Options for the extension of such models to human processing of frequency and duration are suggested. PMID:26000712
Lattice Boltzmann Methods to Address Fundamental Boiling and Two-Phase Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uddin, Rizwan
2012-01-01
This report presents the progress made during the fourth (no cost extension) year of this three-year grant aimed at the development of a consistent Lattice Boltzmann formulation for boiling and two-phase flows. During the first year, a consistent LBM formulation for the simulation of a two-phase water-steam system was developed. Results of initial model validation in a range of thermo-dynamic conditions typical for Boiling Water Reactors (BWRs) were shown. Progress was made on several fronts during the second year. Most important of these included the simulation of the coalescence of two bubbles including the surface tension effects. Work during the third year focused on the development of a new lattice Boltzmann model, called the artificial interface lattice Boltzmann model (AILB model), for the simulation of two-phase dynamics. The model is based on the principle of free energy minimization and invokes the Gibbs-Duhem equation in the formulation of the non-ideal forcing function. This was reported in detail in the last progress report. Part of the efforts during the last (no-cost extension) year were focused on developing a parallel capability for the 2D as well as for the 3D codes developed in this project. This will be reported in the final report. Here we report the work carried out on testing the AILB model for conditions including thermal effects. A simplified thermal LB model, based on the thermal energy distribution approach, was developed. The simplifications are made after neglecting the viscous heat dissipation and the work done by pressure in the original thermal energy distribution model. Details of the model are presented here, followed by a discussion of the boundary conditions, and then results for some two-phase thermal problems.
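For readers unfamiliar with the lattice Boltzmann method underlying this work, the sketch below shows a minimal single-phase D1Q3 BGK collide-and-stream kernel on a periodic domain. It is an illustrative toy only, not the two-phase AILB formulation described in the report.

```python
import numpy as np

# Toy single-phase D1Q3 lattice Boltzmann (BGK) kernel on a periodic
# domain -- an illustrative sketch, not the two-phase AILB model.

nx, tau, steps = 64, 0.8, 100
c = np.array([0, 1, -1])          # lattice velocities
w = np.array([2/3, 1/6, 1/6])     # lattice weights

def feq(rho, u):
    """Second-order equilibrium distribution."""
    cu = c[:, None] * u[None, :]
    return w[:, None] * rho[None, :] * (1 + 3*cu + 4.5*cu**2 - 1.5*u[None, :]**2)

rho = np.ones(nx)
rho[nx // 2] = 1.1                # small density bump
f = feq(rho, np.zeros(nx))        # start at rest, at equilibrium

for _ in range(steps):
    rho = f.sum(axis=0)
    u = (c[:, None] * f).sum(axis=0) / rho
    f += (feq(rho, u) - f) / tau  # BGK collision step
    for i in range(3):            # streaming step (periodic)
        f[i] = np.roll(f[i], c[i])

mass = f.sum()                    # collision and streaming conserve mass
```

Both the collision step (since the equilibrium moments match the local density and momentum) and the periodic streaming step conserve mass exactly, which is a useful sanity check for any LBM implementation.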
SSME structural dynamic model development, phase 2
NASA Technical Reports Server (NTRS)
Foley, M. J.; Wilson, V. L.
1985-01-01
A set of test correlated mathematical models of the SSME High Pressure Oxygen Turbopump (HPOTP) housing and rotor assembly was produced. New analysis methods within the EISI/EAL and SPAR systems were investigated and runstreams for future use were developed. The LOX pump models have undergone extensive modification since the first phase of this effort was completed. The rotor assembly from the original model was abandoned and a new, more detailed model constructed. A description of the new rotor math model is presented. Also, the pump housing model was continually modified as additional test data have become available. This model is documented along with measured test results. Many of the more advanced features of the EAL/SPAR finite element analysis system were exercised. These included the cyclic symmetry option, the macro-element procedures, and the fluid analysis capability. In addition, a new tool was developed that allows an automated analysis of a disjoint structure in terms of its component modes. A complete description of the implementation of the Craig-Bampton method is given along with two worked examples.
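For context, the Craig-Bampton reduction mentioned above has the following standard textbook form (the generic method, not the specific SSME runstream implementation): the structure's degrees of freedom are partitioned into boundary (b) and interior (i) sets, and the interior response is approximated by static constraint modes plus a truncated set of fixed-interface normal modes.

```latex
\begin{bmatrix} u_b \\ u_i \end{bmatrix} \approx
\begin{bmatrix} I & 0 \\ \Psi_c & \Phi_n \end{bmatrix}
\begin{bmatrix} u_b \\ q \end{bmatrix},
\qquad
\Psi_c = -K_{ii}^{-1} K_{ib},
\qquad
\left( K_{ii} - \omega^2 M_{ii} \right)\phi = 0,
```

where q are modal coordinates of the retained fixed-interface modes Φ_n. Substituting this transformation into the equations of motion yields reduced component matrices that can then be assembled across substructures, which is what makes the method attractive for disjoint-structure analysis.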
NASA Astrophysics Data System (ADS)
Moonen, P.; Gromke, C.; Dorer, V.
2013-08-01
The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that end, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting. Different crown porosities are considered. The model performance is assessed in several steps, ranging from a qualitative comparison to measured concentrations, over statistical data analysis by means of scatter plots and box plots, up to the calculation of objective validation metrics. The extensive validation effort highlights and quantifies notable features and shortcomings of the model, which would otherwise remain unnoticed. The model performance is found to be spatially non-uniform. Closer agreement with measurement data is achieved near the canyon ends than for the central part of the canyon, and typical model acceptance criteria are satisfied more easily for the leeward than for the windward canyon wall. This demonstrates the need for rigorous model evaluation. Only quality-assured models can be used with confidence to support assessment, planning and implementation of pollutant mitigation strategies.
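Objective validation metrics of the kind mentioned above are commonly taken to be fractional bias (FB), normalized mean square error (NMSE), and the fraction of predictions within a factor of two of observations (FAC2). The sketch below is a generic implementation of that assumed metric set; the paper's exact metrics are not listed in this abstract.

```python
import numpy as np

# Generic dispersion-model validation metrics (an assumed set; the
# paper's exact metrics are not specified in this abstract).

def fb(obs, pred):
    """Fractional bias: 0 is unbiased; bounded by [-2, 2]."""
    return 2 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def nmse(obs, pred):
    """Normalized mean square error: 0 is a perfect match."""
    return ((obs - pred) ** 2).mean() / (obs.mean() * pred.mean())

def fac2(obs, pred):
    """Fraction of predictions within a factor of 2 of observations."""
    r = pred / obs
    return ((r >= 0.5) & (r <= 2.0)).mean()

# toy paired concentrations at four receptor positions
obs = np.array([1.0, 2.0, 4.0, 8.0])
pred = np.array([1.2, 1.8, 3.0, 9.0])
```

Commonly cited acceptance criteria for urban dispersion modeling are roughly |FB| < 0.67, NMSE < 6, and FAC2 ≥ 0.3, which is the kind of threshold the abstract refers to as "model acceptance criteria."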
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David
A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases.
This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.
Climate Reanalysis: Progress and Future Prospects
NASA Technical Reports Server (NTRS)
Gelaro, Ron
2018-01-01
Reanalysis is the process whereby an unchanging data assimilation system is used to provide a consistent reprocessing of observations, typically spanning an extended segment of the historical data record. The process relies on an underlying model to combine often-disparate observations in a physically consistent manner, enabling production of gridded data sets for a broad range of applications including the study of historical weather events, preparation of climatologies, business sector development and, more recently, climate monitoring. Over the last few decades, several generations of reanalyses of the global atmosphere have been produced by various operational and research centers, focusing more or less on the period of regular conventional and satellite observations beginning in the mid to late twentieth century. There have also been successful efforts to extend atmospheric reanalyses back to the late nineteenth and early twentieth centuries, using mostly surface observations. Much progress has resulted from (and contributed to) advancements in numerical weather prediction, especially improved models and data assimilation techniques, increased computing capacity, the availability of new observation types and efforts to recover and improve the quality of historical ones. The recent extension of forecast systems that allow integrated modeling of meteorological, oceanic, land surface, and chemical variables provides the basic elements for coupled data assimilation. This has opened the door to the development of a new generation of coupled reanalyses of the Earth system, or integrated Earth system analyses (IESA). Evidence so far suggests that this approach can improve the analysis of currently uncoupled components of the Earth system, especially at their interface, and lead to increased predictability. However, extensive analysis coupling as envisioned for IESA, while progressing, still presents significant challenges.
These include model biases that can be exacerbated when coupled, component systems with different physical characteristics and different spatial and temporal scales, and component observations in different media with different spatial and temporal frequencies and different latencies. Quantification of uncertainty in reanalyses is also a critical challenge and is important for expanding their utility as a tool for climate change assessment. This talk provides a brief overview of the progress of reanalysis development during recent decades, and describes remaining challenges in the progression toward coupled Earth system reanalyses.
CORSAIR Solar Energetic Particle Model
NASA Astrophysics Data System (ADS)
Sandroos, A.
2013-05-01
Acceleration of particles in coronal mass ejection (CME) driven shock waves is the most commonly accepted and best developed theory of the genesis of gradual solar energetic particle (SEP) events. The underlying acceleration mechanism is diffusive shock acceleration (DSA). According to DSA, particles scatter from fluctuations present in the ambient magnetic field, which causes some particles to encounter the shock front repeatedly and to gain energy during each crossing. Currently STEREO and near-Earth spacecraft are providing valuable multi-point information on how SEP properties, such as composition and energy spectra, vary in longitude. Initial results have shown that longitude distributions of large CME-associated SEP events are much wider than reported in earlier studies. These findings have important consequences for SEP modeling. It is important to extend the present models into two or three spatial coordinates to properly take into account the effects of coronal and interplanetary (IP) magnetic geometry, and evolution of the CME and the associated shock, on the acceleration and transport of SEPs. We give a status update on the CORSAIR project, which is an effort to develop a new self-consistent (total energy conserving) DSA acceleration model that is capable of modeling energetic particle acceleration and transport in IP space in two or three spatial dimensions. In the new model, particles are propagated using the guiding center approximation. Waves are modeled as (Lagrangian) wave packets propagating (anti)parallel to the ambient magnetic field. Diffusion coefficients related to scattering from the waves are calculated using quasilinear theory. The state of the ambient plasma is obtained from an MHD simulation or by using idealized analytic models. CORSAIR is an extension of our earlier efforts to model the effects of magnetic geometry on SEP acceleration (Sandroos & Vainio, 2007, 2009).
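For reference, the standard steady-state DSA result (a textbook relation, not specific to CORSAIR) ties the accelerated particle spectrum at the shock to the compression ratio r = u1/u2 of the upstream and downstream flow speeds:

```latex
f(p) \;\propto\; p^{-q}, \qquad q = \frac{3r}{r-1},
```

so a strong shock with r → 4 produces f(p) ∝ p^{-4}, with the spectrum hardening as the shock strengthens.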
The JPEG XT suite of standards: status and future plans
NASA Astrophysics Data System (ADS)
Richter, Thomas; Bruylants, Tim; Schelkens, Peter; Ebrahimi, Touradj
2015-09-01
The JPEG standard has seen enormous market adoption. Daily, billions of pictures are created, stored and exchanged in this format. The JPEG committee acknowledges this success and continues to invest effort in maintaining and expanding the standard specifications. JPEG XT is a standardization effort targeting the extension of the JPEG features by enabling support for high dynamic range imaging, lossless and near-lossless coding, and alpha channel coding, while also guaranteeing backward and forward compatibility with the JPEG legacy format. This paper gives an overview of the current status of the JPEG XT standards suite. It discusses the JPEG legacy specification, and details how higher dynamic range support is facilitated both for integer and floating-point color representations. The paper shows how JPEG XT's support for lossless and near-lossless coding of low and high dynamic range images is achieved in combination with backward compatibility to JPEG legacy. In addition, the extensible boxed-based JPEG XT file format on which all following and future extensions of JPEG will be based is introduced. This paper also details how the lossy and lossless representations of alpha channels are supported to allow coding transparency information and arbitrarily shaped images. Finally, we conclude by giving prospects on the upcoming JPEG standardization initiative JPEG Privacy & Security, and a number of other possible extensions in JPEG XT.
Bed Bug Information Clearinghouse
Its purpose is to help states, communities, and consumers in efforts to prevent and control bed bug infestations. It currently includes only reviewed material from federal/state/local government agencies, extension services, and universities.
Extensions to the time lag models for practical application to rocket engine stability design
NASA Astrophysics Data System (ADS)
Casiano, Matthew J.
The combustion instability problem in liquid-propellant rocket engines (LREs) has remained a tremendous challenge since their discovery in the 1930s. Improvements are usually made in solving the combustion instability problem primarily using computational fluid dynamics (CFD) and also by testing demonstrator engines. Another approach is to use analytical models. Analytical models can be used such that design, redesign, or improvement of an engine system is feasible in a relatively short period of time. Improvements to the analytical models can greatly aid in design efforts. A thorough literature review is first conducted on liquid-propellant rocket engine (LRE) throttling. Throttling is usually studied in terms of vehicle descent or ballistic missile control; however, there are many other cases where throttling is important. It was found that combustion instabilities are one of a few major issues that occur during deep throttling (other major issues are heat transfer concerns, performance loss, and pump dynamics). In the past and again recently, gas injected into liquid propellants has been shown to be a viable solution to throttle engines and to eliminate some forms of combustion instability. This review uncovered a clever solution that was used to eliminate a chug instability in the Common Extensible Cryogenic Engine (CECE), a modified RL10 engine. A separate review was also conducted on classic time lag combustion instability models. Several new stability models are developed by incorporating important features into the classic and contemporary models, which are commonly used in the aerospace rocket industry. The first two models are extensions of the original Crocco and Cheng concentrated combustion model with feed system contributions. A third new model is an extension to the Wenzel and Szuch double-time lag model, also with feed system contributions.
The first new model incorporates the appropriate injector acoustic boundary condition which is neglected in contemporary models. This new feature shows that the injector boundary can play a significant role for combustion stability, especially for gaseous injection systems or a system with an injector orifice on the order of the size of the chamber. The second new model additionally accounts for resistive effects. Advanced signal analysis techniques are used to extract frequency-dependent damping from a gas generator component data set. The damping values are then used in the new stability model to more accurately represent the chamber response of the component. The results show a more realistic representation of stability margin by incorporating the appropriate damping effects into the chamber response from data. The original Crocco model, a contemporary model, and the two new models are all compared and contrasted to a marginally stable test case showing their applicability. The model that incorporates resistive aspects shows the best comparison to the test data. Parametrics are also examined to show the influence of the new features and their applicability. The new features allow a more accurate representation of stability margin to be obtained. The third new model is an extension to the Wenzel and Szuch double-time lag chug model. The feed system chug model is extended to account for generic propellant flow rates. This model is also extended to incorporate aspects due to oxygen boiling and helium injection in the feed system. The solutions to the classic models, for the single-time lag and the double-time lag models, are often plotted on a practical engine operating map, however the models have presented some difficulties for numerical algorithms for several reasons. Closed-form solutions for use on these practical operating maps are formulated and developed. 
These models are incorporated in a graphical user interface tool and the new model is compared to an extensive data set. It correctly predicts the stability behavior at various operating conditions incorporating the influence of injected helium and boiling oxygen in the feed system.
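To illustrate how a time-lag stability model of this general family is typically evaluated, the sketch below finds a root of a generic single-time-lag characteristic equation by Newton iteration in the complex plane. The equation form and parameter values are textbook-style placeholders, not the specific Crocco/Cheng or Wenzel/Szuch formulations developed in this work.

```python
import numpy as np

# Hypothetical single-time-lag ("n-tau") chug characteristic equation,
#   theta*s + 1 - n*(1 - exp(-s*tau)) = 0,
# solved by Newton iteration in the complex plane. The form and
# parameter values are illustrative placeholders only.

def char_eq(s, theta, n, tau):
    return theta * s + 1 - n * (1 - np.exp(-s * tau))

def newton_root(s0, theta, n, tau, iters=50):
    """Newton's method; a converged root with Re(s) > 0 indicates instability."""
    s = s0
    for _ in range(iters):
        f = char_eq(s, theta, n, tau)
        df = theta - n * tau * np.exp(-s * tau)   # analytic derivative
        s = s - f / df
    return s

# with an interaction index n > 1 and these lags, a root sits in the
# right half-plane, i.e. the operating point is chug-unstable
root = newton_root(0.4 + 1.6j, theta=0.5, n=1.2, tau=1.0)
```

Plotting the locus of such roots while sweeping operating parameters is how stability maps of the kind described above are usually constructed, and it also shows why closed-form solutions are valuable: transcendental characteristic equations like this one can trip up naive numerical root finders.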
Taking Innovation To Scale In Primary Care Practices: The Functions Of Health Care Extension.
Ono, Sarah S; Crabtree, Benjamin F; Hemler, Jennifer R; Balasubramanian, Bijal A; Edwards, Samuel T; Green, Larry A; Kaufman, Arthur; Solberg, Leif I; Miller, William L; Woodson, Tanisha Tate; Sweeney, Shannon M; Cohen, Deborah J
2018-02-01
Health care extension is an approach to providing external support to primary care practices with the aim of diffusing innovation. EvidenceNOW was launched to rapidly disseminate and implement evidence-based guidelines for cardiovascular preventive care in the primary care setting. Seven regional grantee cooperatives provided the foundational elements of health care extension (technological and quality improvement support, practice capacity building, and linking with community resources) to more than two hundred primary care practices in each region. This article describes how the cooperatives varied in their approaches to extension and provides early empirical evidence that health care extension is a feasible and potentially useful approach for providing quality improvement support to primary care practices. With investment, health care extension may be an effective platform for federal and state quality improvement efforts to create economies of scale and provide practices with more robust and coordinated support services.
Evaluating Satellite-based Rainfall Estimates for Basin-scale Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Yilmaz, K. K.; Hogue, T. S.; Hsu, K.; Gupta, H. V.; Mahani, S. E.; Sorooshian, S.
2003-12-01
The reliability of any hydrologic simulation and basin outflow prediction effort depends primarily on the rainfall estimates. The problem of estimating rainfall becomes more obvious in basins with scarce or no rain gauges. We present an evaluation of satellite-based rainfall estimates for basin-scale hydrologic modeling with particular interest in ungauged basins. The initial phase of this study focuses on comparison of mean areal rainfall estimates from a ground-based rain gauge network, NEXRAD radar Stage-III, and satellite-based PERSIANN (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks) and their influence on hydrologic model simulations over several basins in the U.S. Six-hourly accumulations of the above competing mean areal rainfall estimates are used as input to the Sacramento Soil Moisture Accounting Model. Preliminary experiments for the Leaf River Basin in Mississippi, for the period of March 2000 - June 2002, reveal that seasonality plays an important role in the comparison. There is an overestimation during the summer and underestimation during the winter in satellite-based rainfall with respect to the competing rainfall estimates. The consequence of this result on the hydrologic model is that simulated discharge underestimates the major observed peak discharges during early spring for the basin under study. Future research will entail developing correction procedures, which depend on different factors such as seasonality, geographic location and basin size, for satellite-based rainfall estimates over basins with dense rain gauge network and/or radar coverage. Extension of these correction procedures to satellite-based rainfall estimates over ungauged basins with similar characteristics has the potential for reducing the input uncertainty in ungauged basin modeling efforts.
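One simple form such a correction procedure could take is a per-season multiplicative adjustment that matches satellite accumulations to gauge/radar totals over a calibration period. The sketch below is a hypothetical illustration with invented names and toy data, not the study's actual method.

```python
import numpy as np

# Hypothetical seasonal multiplicative bias correction for satellite
# rainfall: scale each season's satellite accumulation to match a
# gauge/radar reference over a calibration period. Names are illustrative.

def seasonal_factors(sat, ref, season):
    """Per-season ratio of reference to satellite accumulation."""
    return {s: ref[season == s].sum() / sat[season == s].sum()
            for s in np.unique(season)}

def correct(sat, season, factors):
    """Apply the per-season factor to each satellite estimate."""
    return np.array([r * factors[s] for r, s in zip(sat, season)])

# toy data reflecting the pattern in the abstract:
# summer overestimation, winter underestimation by the satellite product
season = np.array(["win", "win", "sum", "sum"])
sat = np.array([2.0, 2.0, 6.0, 6.0])     # satellite accumulations
ref = np.array([3.0, 3.0, 4.0, 4.0])     # gauge/radar reference
factors = seasonal_factors(sat, ref, season)
adjusted = correct(sat, season, factors)
```

In practice the factors would be estimated over gauged or radar-covered basins and then transferred to ungauged basins with similar seasonality, geography, and size, as the abstract proposes.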
Lewis Research Center earth resources program
NASA Technical Reports Server (NTRS)
Mark, H.
1972-01-01
The Lewis Research Center earth resources program efforts are in the areas of: (1) monitoring and rapid evaluation of water quality; (2) determining ice-type and ice coverage distribution to aid operations in a possible extension of the Great Lakes ice navigation and shipping season; (3) monitoring spread of crop viruses; and (4) extent of damage to strip mined areas as well as success of efforts to rehabilitate such areas for agriculture.
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three-dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high-performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background to the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data for both steady-in-time and unsteady-in-time metrics.
Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for future investigations into the complexities of coupled atmospheric and wildland fire behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aponte, C.I.
F and H Tank Farms generate supernate- and sludge-contaminated Low-Level Waste. The waste is collected, characterized, and packaged for disposal. Before the waste can be disposed of, however, it must be properly characterized. Since the radionuclide distribution in typical supernate is well known, its characterization is relatively straightforward and requires minimal effort. Non-routine waste, including potentially sludge-contaminated waste, requires much more effort to characterize effectively. The radionuclide distribution must be determined. In some cases the waste can be contaminated by various sludge transfers with unique radionuclide distributions. In these cases, the characterization can require an extensive effort. Even after an extensive characterization effort, the container must still be prepared for shipping. Therefore a significant amount of time may elapse from the time the waste is generated until the time of disposal. During this time it is possible for a tornado or high-wind scenario to occur. The purpose of this report is to determine the effect of a tornado on potentially sludge-contaminated waste, or Transuranic (TRU) waste in B-25s [large storage containers], to evaluate the potential impact on F and H Tank Farms, and to help establish a B-25 control program for tornado events.
NASA Technical Reports Server (NTRS)
Hall, Callie; Arnone, Robert
2006-01-01
The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). The contents of this report detail the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state, and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational multiyear effort. This continuous real-time capability brings decision makers a new ability to monitor both normal and anomalous coastal ocean conditions with a steady flow of satellite and ocean model conditions. Furthermore, as the baseline data sets are used more extensively and the customer list increases, customer feedback is obtained and additional customized products are developed and provided to decision makers.
Continual customer feedback, and responses to it with new, improved products, are required between the researcher and the customer. This document details the methods by which these coastal ocean products are produced, including the data flow, distribution, and verification. Product applications and the degree to which these products are used successfully within NOAA and coordinated with the Mississippi Department of Marine Resources (MDMR) are benchmarked.
A comparative study of the constitutive models for silicon carbide
NASA Astrophysics Data System (ADS)
Ding, Jow-Lian; Dwivedi, Sunil; Gupta, Yogendra
2001-06-01
Most of the constitutive models for polycrystalline silicon carbide were developed and evaluated using data from either normal plate impact or Hopkinson bar experiments. At ISP, extensive efforts have been made to gain detailed insight into the shocked state of silicon carbide (SiC) using innovative experimental methods, viz., lateral stress measurements, in-material unloading measurements, and combined compression-shear experiments. The data obtained from these experiments provide some unique information for both developing and evaluating material models. In this study, these data for SiC were first used to evaluate some of the existing models to identify their strengths and possible deficiencies. Motivated by both the results of this comparative study and the experimental observations, an improved phenomenological model was developed. The model incorporates pressure dependence of strength, rate sensitivity, damage evolution under both tension and compression, pressure confinement effect on damage evolution, stiffness degradation due to damage, and pressure dependence of stiffness. The developed model captures most of the material features observed experimentally, but more work is needed to better match the experimental data quantitatively.
Lum, Kristian; Swarup, Samarth; Eubank, Stephen; Hawdon, James
2014-09-06
We build an agent-based model of incarceration based on the susceptible-infected-susceptible (SIS) model of infectious disease propagation. Our central hypothesis is that the observed racial disparities in incarceration rates between Black and White Americans can be explained as the result of differential sentencing between the two demographic groups. We demonstrate that if incarceration can be spread through a social influence network, then even relatively small differences in sentencing can result in large disparities in incarceration rates. Controlling for effects of transmissibility, susceptibility and influence network structure, our model reproduces the observed large disparities in incarceration rates given the differences in sentence lengths for White and Black drug offenders in the USA without extensive parameter tuning. We further establish the suitability of the SIS model as applied to incarceration by demonstrating that the observed structural patterns of recidivism are an emergent property of the model. In fact, our model shows a remarkably close correspondence with California incarceration data. This work advances efforts to combine the theories and methods of epidemiology and criminology.
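The contagion mechanism described above can be sketched in a few lines: incarceration "transmits" along social ties with some per-step probability, and an agent is "released" after a fixed sentence length, so longer sentences raise steady-state prevalence. The sketch below is a minimal illustrative toy, not the authors' published model; the ring network, transmission probability, and sentence lengths are all hypothetical.

```python
import random

def simulate_sis(contacts, p_transmit, sentence_len, steps, seed=1):
    """Toy SIS-style incarceration dynamics on a contact network.

    Each step, every incarcerated agent may 'transmit' incarceration to
    each free contact with probability p_transmit; agents are released
    after sentence_len steps. Returns incarceration counts per step.
    """
    rng = random.Random(seed)
    incarcerated = {0: sentence_len}  # agent id -> remaining sentence
    history = []
    for _ in range(steps):
        nxt = {}
        for agent, remaining in incarcerated.items():
            for nbr in contacts[agent]:
                if nbr not in incarcerated and rng.random() < p_transmit:
                    nxt[nbr] = sentence_len
            if remaining > 1:  # keep serving the rest of the sentence
                nxt[agent] = remaining - 1
        incarcerated = nxt
        history.append(len(incarcerated))
    return history

# Hypothetical example: a 5-agent ring network.
ring = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
```

With `p_transmit = 0` the single seeded agent simply serves out the sentence and prevalence decays to zero; raising `p_transmit` or `sentence_len` raises steady-state prevalence, which is the qualitative effect the paper attributes to differential sentencing.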
Nonlinear Unsteady Aerodynamic Modeling Using Wind Tunnel and Computational Data
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.
2016-01-01
Extensions to conventional aircraft aerodynamic models are required to adequately predict responses when nonlinear unsteady flight regimes are encountered, especially at high incidence angles and under maneuvering conditions. For a number of reasons, such as loss of control, both military and civilian aircraft may extend beyond normal and benign aerodynamic flight conditions. In addition, military applications may require controlled flight beyond the normal envelope, and civilian flight may require adequate recovery or prevention methods from these adverse conditions. These requirements have led to the development of more general aerodynamic modeling methods and provided impetus for researchers to improve both techniques and the degree of collaboration between analytical and experimental research efforts. In addition to more general mathematical model structures, dynamic test methods have been designed to provide sufficient information to allow model identification. This paper summarizes research to develop a modeling methodology appropriate for modeling aircraft aerodynamics that include nonlinear unsteady behaviors using both experimental and computational test methods. This work was done at Langley Research Center, primarily under the NASA Aviation Safety Program, to address aircraft loss of control, prevention, and recovery aerodynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler; Shi, Ying; Santhanagopalan, Shriram
Predictive models of Li-ion battery lifetime must consider a multiplicity of electrochemical, thermal, and mechanical degradation modes experienced by batteries in application environments. To complicate matters, Li-ion batteries can experience different degradation trajectories that depend on storage and cycling history of the application environment. Rates of degradation are controlled by factors such as temperature history, electrochemical operating window, and charge/discharge rate. We present a generalized battery life prognostic model framework for battery systems design and control. The model framework consists of trial functions that are statistically regressed to Li-ion cell life datasets wherein the cells have been aged under different levels of stress. Degradation mechanisms and rate laws dependent on temperature, storage, and cycling condition are regressed to the data, with multiple model hypotheses evaluated and the best model down-selected based on statistics. The resulting life prognostic model, implemented in state variable form, is extensible to arbitrary real-world scenarios. The model is applicable in real-time control algorithms to maximize battery life and performance. We discuss efforts to reduce lifetime prediction error and accommodate its inevitable impact in controller design.
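As one concrete example of the kind of trial function such a framework might regress, calendar capacity fade is often modeled as an Arrhenius-in-temperature, square-root-in-time law, Q_loss = A·exp(−Ea/RT)·√t. The sketch below fits that single trial function by ordinary least squares in log space; it is an illustrative stand-in, not the authors' actual model or data, and the functional form, A, and Ea in the example are assumptions.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def fit_arrhenius_sqrt_t(data):
    """Fit Q_loss = A * exp(-Ea/(R*T)) * sqrt(t) by least squares in log
    space: ln(Q) - 0.5*ln(t) is linear in 1/T. data: iterable of
    (T_kelvin, t_days, q_loss) tuples. Returns (A, Ea in J/mol)."""
    xs = [1.0 / T for T, t, q in data]
    ys = [math.log(q) - 0.5 * math.log(t) for T, t, q in data]
    n = float(len(xs))
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope * R  # A, Ea
```

A real framework would regress several competing rate laws this way and down-select by fit statistics, as the abstract describes.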
Campbell, Norm R C; Sheldon, Tobe
2010-07-01
To indicate the key elements of current Canadian programs to treat and control hypertension. In the early 1990s Canada had a hypertension treatment and control rate of 13%. A Canadian strategy to prevent and control hypertension was developed and a coalition of national organizations and volunteers formed to develop increasingly extensive programs. The Canadian effort was largely based on annually updated hypertension management recommendations, an integrated and extensive hypertension knowledge translation program and an increasingly comprehensive outcomes assessment program. After the start of the annual process in 1999, there were very large increases in diagnosis and hypertension treatment coupled with dropping rates of cardiovascular disease. More recent initiatives include an extensive education program for the public and people with hypertension, a program to reduce dietary salt and a funded leadership position. The treatment and control rate increased to 66% when last assessed (2007-2009). The study describes important aspects of the Canadian hypertension management programs to aid those wishing to develop similar programs. Many of the programs could be fully or partially implemented by other countries.
2012-01-01
Background Extensive recruitment effort at baseline increases representativeness of study populations by decreasing non-response and associated bias. First, it is not known to what extent increased attrition occurs during subsequent measurement waves among subjects who were hard-to-recruit at baseline and what characteristics the hard-to-recruit dropouts have compared to the hard-to-recruit retainers. Second, it is unknown whether characteristics of hard-to-recruit responders in a prospective population based cohort study are similar across age group and survey method. Methods First, we compared first wave (T1) easy-to-recruit with hard-to-recruit responders of the TRacking Adolescents’ Individual Lives Survey (TRAILS), a prospective population based cohort study of Dutch (pre)adolescents (at first wave: n = 2230, mean age = 11.09 (SD 0.56), 50.8% girls), with regard to response rates at subsequent measurement waves. Second, easy-to-recruit and hard-to-recruit participants at the fourth TRAILS measurement wave (n = 1881, mean age = 19.1 (SD 0.60), 52.3% girls) were compared with fourth wave non-responders and earlier stage drop-outs on family composition, socioeconomic position (SEP), intelligence (IQ), education, sociometric status, substance use, and psychopathology. Results First, over 60% of the hard-to-recruit responders at the first wave were retained in the sample eight years later at the fourth measurement wave. Hard-to-recruit dropouts did not differ from hard-to-recruit retainers. Second, extensive recruitment efforts for the web based survey convinced a population of nineteen year olds with similar characteristics as the hard-to-recruit eleven year olds that were persuaded to participate in a school-based survey. Some characteristics associated with being hard-to-recruit (as compared to being easy-to-recruit) were more pronounced among non-responders, resembling the baseline situation (De Winter et al. 2005).
Conclusions First, extensive recruitment effort at the first assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods. PMID:22747967
System cost/performance analysis (study 2.3). Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Kazangey, T.
1973-01-01
The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass-expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements on the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.
NASA Astrophysics Data System (ADS)
Cao, M.; Xiao, J.
2008-02-01
Bearing excitation is one of the most important mechanical sources of vibration and noise generation in machine systems across a broad range of industries. Although extensively investigated, accurately predicting the vibration/acoustic behavior of bearings remains a challenging task because of their complicated nonlinear behaviors. While some groundwork has been laid on the single-row deep-grooved ball (DGB) bearing, a comprehensive modeling effort on the spherical roller bearing (SRB) has yet to be carried out. This is mainly due to the facts that an SRB system carries one extra degree of freedom (DOF) on the moving race (which could be either the inner or outer race) and in general has more rolling elements compared with a DGB. In this study, a comprehensive SRB excitation source model is developed. In addition to the vertical and horizontal displacements considered in previous investigations, the impacts of axial displacement/load are addressed by introducing the DOF in the axial shaft direction. Hence, instead of being treated as pre-assumed constants, the roller-inner/outer race contact angles are formulated as functions of the axial displacement of the moving race to reflect their dependence on the axial movement. The approach presented in this paper accounts for the point contacts between rollers and inner/outer races, as well as line contacts when the loads on individual rollers exceed the limit for point contact. A detailed contact-damping model reflecting the influences of the surface profiles and the speeds of both contacting elements is developed and applied in the SRB model. Waviness of all the contact surfaces (including inner race, outer race, and rollers) is included and compared in this analysis. Extensive case studies are carried out to reveal the impacts of surface waviness, radial clearance, surface defects, and loading conditions on the force and displacement responses of the SRB system. System design guidelines are recommended based on the simulation results.
This model is also applicable to bearing health monitoring, as demonstrated by the numerical case studies showing the frequency response of the system with moderate-to-large point defects on both inner and outer races, as well as on the rollers. Comparisons between the simulation results and commonly accepted conclusions available in the open literature serve as a first-hand partial validation of the developed model. Future validation efforts and further improvement directions are also provided. The comprehensive model developed in this investigation is a useful tool for machine system design, optimization, and performance evaluation.
NASA Astrophysics Data System (ADS)
Mollica, N. R.; Guo, W.; Cohen, A. L.; Huang, K. F.; Foster, G. L.; Donald, H.; Solow, A.
2017-12-01
Carbonate skeletons of scleractinian corals are important archives of ocean climate and environmental change. However, corals do not accrete their skeletons directly from ambient seawater, but from a calcifying fluid whose composition is strongly regulated. There is mounting evidence that the carbonate chemistry of this calcifying fluid significantly impacts the amount of carbonate the coral can precipitate, which in turn affects the geochemical composition of the skeleton produced. However, the mechanistic link between calcifying fluid (cf) chemistry, particularly the up-regulation of pHcf and thereby aragonite saturation state (Ωcf), and coral calcification is not well understood. We explored this link by combining boron isotope measurements with in situ measurements of seawater temperature, salinity, and DIC to estimate Ωcf of nine Porites corals from four Pacific reefs. Associated calcification rates were quantified for each core via CT scanning. We do not observe a relationship between calcification rates and Ωcf or Ωsw. Instead, when we deconvolve calcification into linear extension and skeletal density, a significant correlation is observed between density and Ωcf, and also Ωsw, while extension does not correlate with either. These observations are consistent with the two-step model of coral calcification, in which skeleton is secreted in two distinct phases: vertical extension creating new skeletal elements, followed by lateral thickening of existing elements that are covered by living tissue. We developed a numerical model of Porites skeletal growth that builds on this two-step model and links skeletal density with the external seawater environment via its influence on the chemistry of the coral calcifying fluid. We validated the model using existing coral skeletal datasets from six Porites species collected across five reef sites, and quantified the effects of each seawater parameter (e.g. temperature, pH, DIC) on skeletal density.
Our findings illustrate the sensitivity of the second phase of coral calcification to the carbonate chemistry of the calcifying fluid, and support previous coral proxy system modelling efforts by validating the two-step growth model on annual and seasonal scales.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miao, Yinbin; Mo, Kun; Jamison, Laura M.
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next-generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize the experimental efforts in FY16, including the following important experiments: (1) in-situ grain growth measurement of nano-grained UO2; (2) investigation of surface morphology in micro-grained UO2; (3) nano-indentation experiments on nano- and micro-grained UO2. The highlight of this year is that we have successfully demonstrated our capability to measure grain size development in situ while maintaining the stoichiometry of nano-grained UO2 materials; this experiment uses synchrotron X-ray diffraction, for the first time, to measure the grain growth behavior of UO2 in situ.
Aerothermal modeling program, phase 2. Element B: Flow interaction experiment
NASA Technical Reports Server (NTRS)
Nikjooy, M.; Mongia, H. C.; Murthy, S. N. B.; Sullivan, J. P.
1986-01-01
The design process was improved, and the efficiency, life, and maintenance costs of the turbine engine hot section were enhanced. Recently, there has been much emphasis on the need for improved numerical codes for the design of efficient combustors. For the development of improved computational codes, there is a need for an experimentally obtained database to be used as test cases for assessing the accuracy of the computations. The purpose of Element B is to establish benchmark-quality velocity and scalar measurements of the flow interaction of circular jets with swirling flow typical of that in the dome region of an annular combustor. In addition to the detailed experimental effort, extensive computations of the swirling flows are to be compared with the measurements for the purpose of assessing the accuracy of current and advanced turbulence and scalar transport models.
Decoherence and dissipation for a quantum system coupled to a local environment
NASA Technical Reports Server (NTRS)
Gallis, Michael R.
1994-01-01
Decoherence and dissipation in quantum systems has been studied extensively in the context of Quantum Brownian Motion. Effective decoherence in coarse grained quantum systems has been a central issue in recent efforts by Zurek and by Hartle and Gell-Mann to address the Quantum Measurement Problem. Although these models can yield very general classical phenomenology, they are incapable of reproducing relevant characteristics expected of a local environment on a quantum system, such as the characteristic dependence of decoherence on environment spatial correlations. I discuss the characteristics of Quantum Brownian Motion in a local environment by examining aspects of first principle calculations and by the construction of phenomenological models. Effective quantum Langevin equations and master equations are presented in a variety of representations. Comparisons are made with standard results such as the Caldeira-Leggett master equation.
Hints for an extension of the early exercise premium formula for American options
NASA Astrophysics Data System (ADS)
Bermin, Hans-Peter; Kohatsu-Higa, Arturo; Perelló, Josep
2005-09-01
The formula for the American put option price is not closed-form, and non-trivial computations are required to solve it. Strong efforts have been made to propose efficient numerical techniques, but few have strong mathematical reasoning to ascertain why they work well. We present an extension of the American put price aiming to catch weaknesses of the numerical methods based on their non-fulfillment of the smooth pasting condition.
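For context, the standard non-closed representation alluded to here is the early exercise premium decomposition (Kim; Carr-Jarrow-Myneni): for a non-dividend-paying asset, the American put equals the European put plus a premium integral over the optimal exercise boundary B_u. The LaTeX below states that decomposition as usually written; it is background, not the extension proposed by the authors.

```latex
P^{A}(S_t,t) = P^{E}(S_t,t)
  + \int_{t}^{T} r K\, e^{-r(u-t)}\,
      \Phi\!\left(-d_2(S_t, B_u, u-t)\right) du,
\qquad
d_2(S,B,\tau) = \frac{\ln(S/B) + \left(r - \tfrac{1}{2}\sigma^{2}\right)\tau}{\sigma\sqrt{\tau}}
```

The formula is non-closed because the boundary $B_u$ is itself unknown: it is determined jointly with the price via value matching, $P^{A}(B_u,u) = K - B_u$, and the smooth pasting condition $\partial P^{A}/\partial S = -1$ at $B_u$, the very condition whose non-fulfillment by numerical methods the authors exploit.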
Expanding the role of reactive transport models in critical zone processes
Li, Li; Maher, Kate; Navarre-Sitchler, Alexis; Druhan, Jennifer; Meile, Christof; Lawrence, Corey; Moore, Joel; Perdrial, Julia; Sullivan, Pamela; Thompson, Aaron; Jin, Lixin; Bolton, Edward W.; Brantley, Susan L.; Dietrich, William E.; Mayer, K. Ulrich; Steefel, Carl; Valocchi, Albert J.; Zachara, John M.; Kocar, Benjamin D.; McIntosh, Jennifer; Tutolo, Benjamin M.; Kumar, Mukesh; Sonnenthal, Eric; Bao, Chen; Beisman, Joe
2017-01-01
Models test our understanding of processes and can reach beyond the spatial and temporal scales of measurements. Multi-component Reactive Transport Models (RTMs), initially developed more than three decades ago, have been used extensively to explore the interactions of geothermal, hydrologic, geochemical, and geobiological processes in subsurface systems. Driven by extensive data sets now available from intensive measurement efforts, there is a pressing need to couple RTMs with other community models to explore non-linear interactions among the atmosphere, hydrosphere, biosphere, and geosphere. Here we briefly review the history of RTM development, summarize the current state of RTM approaches, and identify new research directions, opportunities, and infrastructure needs to broaden the use of RTMs. In particular, we envision the expanded use of RTMs in advancing process understanding in the Critical Zone, the veneer of the Earth that extends from the top of vegetation to the bottom of groundwater. We argue that, although parsimonious models are essential at larger scales, process-based models offer tools to explore the highly nonlinear coupling that characterizes natural systems. We present seven testable hypotheses that emphasize the unique capabilities of process-based RTMs for (1) elucidating chemical weathering and its physical and biogeochemical drivers; (2) understanding the interactions among roots, micro-organisms, carbon, water, and minerals in the rhizosphere; (3) assessing the effects of heterogeneity across spatial and temporal scales; and (4) integrating the vast quantity of novel data, including “omics” data (genomics, transcriptomics, proteomics, metabolomics), elemental concentration and speciation data, and isotope data into our understanding of complex earth surface systems. 
With strong support from data-driven sciences, we are now in an exciting era where integration of RTM framework into other community models will facilitate process understanding across disciplines and across scales.
Cheyne, J
1989-01-01
During the typical 12- to 18-month voyage of a vaccine from manufacturer to immunization site, many situations arise in which the cold chain may be interrupted. Extensive efforts have been made in the 1980s to ensure an uninterrupted cold chain through the use of improved equipment and better training of personnel. One important advance is the vaccine cold-chain monitor, which identifies weak spots in the cold chain and prevents the use of heat-damaged vaccine. Further improvements will require efforts by the recipient countries (e.g., better use of the private sector for transport and equipment management), by donor agencies (e.g., greater consideration of the operational and maintenance costs of the equipment selected and resolution of fuel shortages), and by industry (e.g., more appropriate packaging and pricing of vaccine, extension of the expiration period, and increased heat stability).
Liquids and homemade explosive detection
NASA Astrophysics Data System (ADS)
Ellenbogen, Michael; Bijjani, Richard
2009-05-01
Excerpt from the US Transportation Security Administration website: "The ban on liquids, aerosols and gels was implemented on August 10 after a terrorist plot was foiled. Since then, experts from around the government, including the FBI and our national labs have analyzed the information we now have and have conducted extensive explosives testing to get a better understanding of this specific threat." In order to lift the ban and ease the burden on the flying public, Reveal began an extensive effort, in close collaboration with the US and several other governments, to help identify these threats. This effort resulted in the successful development and testing of an automated explosive detection system capable of resolving these threats with a high probability of detection and a low false alarm rate. We will present here some of the methodology and approach we took to address this problem.
Development and Demonstration of a Magnesium-Intensive Vehicle Front-End Substructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Logan, Stephen D.; Forsmark, Joy H.; Osborne, Richard
2016-07-01
This project is the final phase (designated Phase III) of an extensive, nine-year effort with the objectives of developing a knowledge base and enabling technologies for the design, fabrication, and performance evaluation of magnesium-intensive automotive front-end substructures intended to partially or completely replace all-steel comparators, providing a weight savings approaching 50% of the baseline. Benefits of extensive vehicle weight reduction in terms of fuel economy increase, extended vehicle range, vehicle performance, and commensurate reductions in greenhouse gas emissions are well known. An exemplary vehicle substructure considered by the project is illustrated in Figure 1, along with the exterior vehicle appearance. This unibody front-end “substructure” is one physical objective of the ultimate design and engineering aspects established at the outset of the larger collective effort.
Structural modeling of G-protein coupled receptors: An overview on automatic web-servers.
Busato, Mirko; Giorgetti, Alejandro
2016-08-01
Despite the significant efforts and discoveries during the last few years in G protein-coupled receptor (GPCR) expression and crystallization, the receptors with known structures to date represent only a small fraction of human GPCRs. The lack of experimental three-dimensional structures of the receptors represents a strong limitation that hampers a deep understanding of their function. Computational techniques are thus a valid alternative strategy to model three-dimensional structures. Indeed, recent advances in the field, together with extraordinary developments in crystallography, in particular due to its ability to capture GPCRs in different activation states, have led to encouraging results in the generation of accurate models. This prompted the community of modelers to render their methods publicly available through dedicated databases and web-servers. Here, we present an extensive overview of these services, focusing on their advantages, drawbacks, and their role in successful applications. Future challenges in the field of GPCR modeling, such as the prediction of long loop regions and the modeling of receptor activation states, are presented as well. Copyright © 2016 Elsevier Ltd. All rights reserved.
Du-Cuny, Lei; Chen, Lu; Zhang, Shuxing
2014-01-01
Blockade of the hERG channel prolongs the duration of the cardiac action potential and is a common reason for drug failure in preclinical safety trials. Therefore, it is of great importance to develop robust in silico tools to predict potential hERG blockers in the early stages of drug discovery and development. Herein we describe comprehensive approaches to assess the discrimination of hERG-active and -inactive compounds by combining QSAR modeling, pharmacophore analysis, and molecular docking. Our consensus models demonstrated high predictive capacity and improved enrichment, and they could correctly classify 91.8% of 147 hERG blockers from 351 inactives. To further enhance our modeling effort, hERG homology models were constructed and molecular docking studies were conducted, resulting in high correlations (R2 = 0.81) between predicted and experimental binding affinities. We expect our unique models can be applied to efficient screening for hERG blockade, and our extensive understanding of the hERG-inhibitor interactions will facilitate the rational design of drugs devoid of hERG channel activity and hence with reduced cardiac toxicities. PMID:21902220
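A consensus model of the kind described can be as simple as majority voting across the component predictors (QSAR, pharmacophore, docking). The snippet below is a generic illustration of that voting step, not the authors' published pipeline; the model names and per-compound calls are hypothetical.

```python
def consensus_classify(predictions, threshold=0.5):
    """Majority-vote consensus over several binary classifiers.

    predictions: dict mapping model name -> list of 0/1 calls per compound.
    A compound is flagged active (1) when the fraction of models voting 1
    meets the threshold.
    """
    models = list(predictions.values())
    n_models = len(models)
    out = []
    for i in range(len(models[0])):
        votes = sum(m[i] for m in models)
        out.append(1 if votes / n_models >= threshold else 0)
    return out

# Hypothetical calls from three component models for three compounds:
calls = {
    "qsar": [1, 0, 1],
    "pharmacophore": [1, 1, 0],
    "docking": [1, 0, 0],
}
```

Weighted voting (e.g., scaling each model's vote by its cross-validated accuracy) is a common refinement of this scheme.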
Customizing WRF-Hydro for the Laurentian Great Lakes Basin
NASA Astrophysics Data System (ADS)
Gronewold, A.; Pei, L.; Gochis, D.; Mason, L.; Sampson, K. M.; Dugger, A. L.; Read, L.; McCreight, J. L.; Xiao, C.; Lofgren, B. M.; Anderson, E. J.; Chu, P. Y.
2017-12-01
To advance the state of the art in regional hydrological forecasting, and to align with operational deployment of the National Water Model, a team of scientists has been customizing WRF-Hydro (the Weather Research and Forecasting model - Hydrological modeling extension package) to the entirety (including binational land and lake surfaces) of the Laurentian Great Lakes basin. Objectives of this customization project include operational simulation and forecasting of the Great Lakes water balance and, in the short-term, research-oriented insights into modeling one- and two-way coupled lake-atmosphere and near-shore processes. Initial steps in this project have focused on overcoming inconsistencies in land surface hydrographic datasets between the United States and Canada. Improvements in the model's current representation of lake physics and stream routing are also critical components of this effort. Here, we present an update on the status of this project, including a synthesis of offline tests with WRF-Hydro based on the newly developed Great Lakes hydrographic data, and an assessment of the model's ability to simulate seasonal and multi-decadal hydrological response across the Great Lakes.
NASA Astrophysics Data System (ADS)
Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang
2018-02-01
Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10^5-10^6 s^-1) on the surface of target materials. Although LSP processes have been extensively studied by experiments, few efforts have been made to elucidate the underlying process mechanisms through a physics-based process model. In particular, development of a first-principles model is critical for process optimization and novel process design. This work aims at introducing such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. This model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated by experimental data.
SLS Model Based Design: A Navigation Perspective
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Park, Thomas; Geohagan, Kevin
2018-01-01
The SLS Program has implemented a Model-based Design (MBD) and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team is responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1B design, the additional GPS Receiver hardware model is managed as a DMM at the vehicle design level. This paper describes the models, and discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components.
Experimental investigation of nozzle/plume aerodynamics at hypersonic speeds
NASA Technical Reports Server (NTRS)
Bogdanoff, David W.; Cambier, Jean-Luc; Papadopoulos, Perikles
1994-01-01
Much of the work involved the Ames 16-Inch Shock Tunnel facility. The facility was reactivated and upgraded, a data acquisition system was configured and upgraded several times, several facility calibrations were performed, and test entries were conducted with a wedge model and with a full scramjet combustor model, both with hydrogen injection. Extensive CFD modeling of the flow in the facility was done, including modeling of the unsteady flow in the driver and driven tubes and steady-flow modeling of the nozzle flow. Other modeling efforts include simulations of non-equilibrium flows and turbulence, plasmas, light gas guns, and the use of non-ideal gas equations of state. New experimental techniques to improve the performance of gas guns, shock tubes and tunnels, and scramjet combustors were conceived and studied computationally. Ways to improve scramjet engine performance using steady and pulsed detonation waves were also studied computationally. A number of studies were performed on the operation of the ram accelerator, including investigations of in-tube gasdynamic heating and the use of high explosives to raise the velocity capability of the device.
Planning for bird conservation: a tale of two models
Johnson, Douglas H.; Winter, Maiken
2005-01-01
Planning for bird conservation has become increasingly reliant on remote sensing, geographical information systems, and, especially, models used to predict the occurrence of bird species as well as their density and demographics. We address the role of such tools by contrasting two models used in bird conservation. One, the Mallard (Anas platyrhynchos) productivity model, is very detailed, mechanistic, and based on an enormous body of research. The Mallard model has been extensively used with success to guide management efforts for Mallards and certain other species of ducks. The other model, the concept of Bird Conservation Areas, is simpler, less mechanistic, and less well-grounded in research. This concept proposes that large patches of suitable habitat in a proper landscape will be adequate to maintain populations of birds. The Bird Conservation Area concept recently has been evaluated in the northern tallgrass prairie, where its fundamental assumptions have been found not to hold consistently. We argue that a more comprehensive understanding of the biology of individual species, and how they respond to habitat features, will be essential before we can use remotely sensed information and geographic information system products with confidence.
Comparison of different models for non-invasive FFR estimation
NASA Astrophysics Data System (ADS)
Mirramezani, Mehran; Shadden, Shawn
2017-11-01
Coronary artery disease is a leading cause of death worldwide. Fractional flow reserve (FFR), derived from invasively measuring the pressure drop across a stenosis, is considered the gold standard to diagnose disease severity and need for treatment. Non-invasive estimation of FFR has gained recent attention for its potential to reduce patient risk and procedural cost versus invasive FFR measurement. Non-invasive FFR can be obtained by using image-based computational fluid dynamics to simulate blood flow and pressure in a patient-specific coronary model. However, 3D simulations require extensive effort for model construction and numerical computation, which limits their routine use. In this study we compare (ordered by increasing computational cost/complexity): reduced-order algebraic models of pressure drop across a stenosis; 1D, 2D (multiring) and 3D CFD models; as well as 3D FSI for the computation of FFR in idealized and patient-specific stenosis geometries. We demonstrate the ability of an appropriate reduced order algebraic model to closely predict FFR when compared to FFR from a full 3D simulation. This work was supported by the NIH, Grant No. R01-HL103419.
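The reduced-order algebraic approach compared above can be sketched with a generic two-term stenosis model, ΔP = aQ + bQ², where the linear term captures viscous losses and the quadratic term flow-separation losses. The coefficients and hemodynamic values below are illustrative placeholders, not the specific model or patient data from this study.

```python
def stenosis_pressure_drop(q, a, b):
    """Algebraic pressure drop across a stenosis (mmHg) for flow q (mL/s):
    a*q models viscous losses, b*q**2 models flow-separation losses."""
    return a * q + b * q ** 2

def ffr(p_aortic, q, a, b):
    """FFR = distal pressure / aortic pressure under hyperemic flow."""
    p_distal = p_aortic - stenosis_pressure_drop(q, a, b)
    return p_distal / p_aortic

# Illustrative numbers only: 90 mmHg aortic pressure, hyperemic flow of
# 3 mL/s, and loss coefficients chosen for demonstration.
value = ffr(p_aortic=90.0, q=3.0, a=1.5, b=0.8)
print(round(value, 3))
```

Because the model reduces to evaluating a polynomial, it costs essentially nothing compared to a 3D CFD or FSI simulation, which is why close agreement with full 3D results is attractive for routine use.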
Rebaudo, François; Dangles, Olivier
2011-10-01
Worldwide, the theory and practice of agricultural extension systems have been dominated for almost half a century by Rogers' "diffusion of innovation theory". In particular, the success of integrated pest management (IPM) extension programs depends on the effectiveness of IPM information diffusion from trained farmers to other farmers, an important assumption which underpins funding from development organizations. Here we developed an innovative approach through an agent-based model (ABM) combining social (diffusion theory) and biological (pest population dynamics) models to study the role of cooperation among small-scale farmers in sharing IPM information for controlling an invasive pest. The model was implemented with field data, including learning processes and control efficiency, from large-scale surveys in the Ecuadorian Andes. Our results predict that although cooperation had short-term costs for individual farmers, it paid in the long run as it decreased pest infestation at the community scale. However, the slow learning process placed restrictions on the knowledge that could be generated within farmer communities over time, giving rise to natural lags in IPM diffusion and application. We further showed that if individuals learn from others about the benefits of early prevention of new pests, then educational effort may have a sustainable long-run impact. Consistent with models of information diffusion theory, our results demonstrate how an integrated approach combining ecological and social systems would help better predict the success of IPM programs. This approach has potential beyond pest management as it could be applied to any resource management program seeking to spread innovations across populations.
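The coupling of information diffusion to pest dynamics described above can be caricatured in a few lines: trained farmers apply IPM on their plots, and each season knowledge spreads stochastically to other farmers. All agents, rates, and the infestation rule below are toy assumptions for illustration, not the fitted field model.

```python
import random

def simulate(n_farmers=100, trained=10, share_prob=0.1, seasons=20, seed=1):
    """Toy agent-based sketch: 'knows' marks IPM-trained farmers; each season
    a knowledgeable farmer may teach one randomly chosen farmer, and
    community-level infestation is taken as the untrained fraction.
    All parameters are illustrative, not survey-derived values."""
    random.seed(seed)
    knows = [i < trained for i in range(n_farmers)]
    infestation = []
    for _ in range(seasons):
        # diffusion step: knowledge only spreads, it is never lost
        for i in range(n_farmers):
            if knows[i] and random.random() < share_prob:
                knows[random.randrange(n_farmers)] = True
        # infestation declines as IPM adoption spreads through the community
        infestation.append(1.0 - sum(knows) / n_farmers)
    return infestation

result = simulate()
print(result[0], result[-1])  # infestation in the first vs. last season
```

Even this caricature reproduces the qualitative result: diffusion is slow at first (few teachers), so community-scale benefits lag behind the initial training effort.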
Thomas-Vaslin, Véronique; Six, Adrien; Ganascia, Jean-Gabriel; Bersini, Hugues
2013-01-01
Dynamic modeling of lymphocyte behavior has primarily been based on population-based differential equations or on cellular agents moving in space and interacting with each other. The final steps of this modeling effort are expressed in code written in a programming language. Owing to the complete lack of standardization across these steps, communication and sharing between experimentalists, theoreticians and programmers remain poor. The adoption of a diagrammatic visual computer language should, however, greatly help immunologists to communicate better, to more easily identify similarities between models, and to facilitate the reuse and extension of existing software models. Since immunologists often conceptualize the dynamical evolution of immune systems in terms of “state-transitions” of biological objects, we promote the use of the unified modeling language (UML) state-transition diagram. To demonstrate the feasibility of this approach, we present a UML refactoring of two published models of thymocyte differentiation. Originally built with different modeling strategies, a mathematical ordinary differential equation-based model and a cellular automata model, the two models are now in the same visual formalism and can be compared. PMID:24101919
Solar thermal storage applications program
NASA Astrophysics Data System (ADS)
Peila, W. C.
1982-12-01
The efforts of the Storage Applications Program are reviewed. The program concentrated on the investigation of storage media and the evaluation of storage methods. Extensive effort was given to experimental and analytical investigations of nitrate salts. Two major tasks were the preliminary design of a 1200 MW(th) system and the design, construction, operation, and evaluation of a subsystem research experiment that utilized the same design. Some preliminary conclusions drawn from the subsystem research experiment are given.
Constructing and decoding unconventional ubiquitin chains.
Behrends, Christian; Harper, J Wade
2011-05-01
One of the most notable discoveries in the ubiquitin system during the past decade is the extensive use of diverse chain linkages to control signaling networks. Although the utility of Lys48- and Lys63-linked chains in protein turnover and molecular assembly, respectively, are well known, we are only beginning to understand how unconventional chain linkages are formed on target proteins and how such linkages are decoded by specific binding proteins. In this review, we summarize recent efforts to elucidate the machinery and mechanisms controlling assembly of Lys11-linked and linear (or Met1-linked) ubiquitin chains, and describe current models for how these chain types function in immune signaling and cell-cycle control.
An MDA Based Ontology Platform: AIR
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
In the past few years, software engineering has witnessed two major shifts: model-driven engineering has entered the mainstream, and some leading development tools have become open and extensible. AI has always been a spring of new ideas that have been adopted in software engineering, but most of its gems have stayed buried in laboratories, available only to a limited number of AI practitioners. Should AI tools be integrated into mainstream tools, and could it be done? We think that it is feasible, and that both communities can benefit from this integration. In fact, some efforts in this direction have already been made, both by major industrial standardization bodies such as the OMG, and by academic laboratories.
Numerical aerodynamic simulation facility. Preliminary study extension
NASA Technical Reports Server (NTRS)
1978-01-01
The primary objective of this report was the production of an optimized design of key elements of the candidate facility. This was accomplished through the following tasks: (1) further develop, optimize, and describe the functional description of the custom hardware; (2) delineate trade-off areas between performance, reliability, availability, serviceability, and programmability; (3) develop metrics and models for validation of the candidate system's performance; (4) conduct a functional simulation of the system design; (5) perform a reliability analysis of the system design; and (6) develop the software specifications, including a user-level high-level programming language and a correspondence between the programming language and the instruction set, and outline the operating system requirements.
Rheological properties, shape oscillations, and coalescence of liquid drops with surfactants
NASA Technical Reports Server (NTRS)
Apfel, R. E.; Holt, R. G.
1990-01-01
A method was developed to deduce dynamic interfacial properties of liquid drops. The method involves measuring the frequency and damping of free quadrupole oscillations of an acoustically levitated drop. Experimental results from pure liquid-liquid systems agree well with theoretical predictions. Additionally, the effects of surfactants are considered. Extension of these results to a proposed microgravity experiment on the drop physics module (DPM) in USML-1 is discussed. Efforts are also underway to model the time history of the thickness of the fluid layer between two pre-coalescence drops, and to measure the film thickness experimentally. Preliminary results will be reported, along with plans for coalescence experiments proposed for USML-1.
NASA Astrophysics Data System (ADS)
Ferguson, D. B.; Guido, Z. S.; Buizer, J.; Roy, M.
2010-12-01
Bringing climate change issues into focus for decision makers is a growing challenge. Decision makers are often confronted with unique informational needs, a lack of useable information, and needs for customized climate change training, among other issues. Despite significant progress in improving climate literacy among certain stakeholders such as water managers, recent reports have highlighted the growing demand for climate-change information in regions and sectors across the US. In recent years many ventures have sprung up to address these gaps and have predominantly focused on K-12 education and resource management agencies such as the National Park Service and National Weather Service. However, two groups that are critical for integrating climate information into actions have received less attention: (1) policy makers and (2) outreach experts, such as Cooperative Extension agents. Climate Change Boot Camps (CCBC) is a joint effort between the Climate Assessment for the Southwest (CLIMAS)—a NOAA Regional Integrated Sciences and Assessments (RISA) program—and researchers at Arizona State University to diagnose climate literacy and training gaps in Arizona and develop a process that converts these deficiencies into actionable knowledge among the two aforementioned groups. This presentation will highlight the initial phases of the CCBC process, which has as its outcomes the identification of effective strategies for reaching legislators, climate literacy and training needs for both policy makers and trainers, and effective metrics to evaluate the success of these efforts. Specific attention is given to evaluating the process from initial needs assessment to the effectiveness of the workshops. Web curricula and training models made available on the internet will also be developed, drawing on extensive existing Web resources from other training efforts and adapting them to meet the needs of these two groups.
CCBC will also leverage CLIMAS' long history of engaging with stakeholders in the Southwest to facilitate the use of climate information in the decision process.
The History, Status, Gaps, and Future Directions of Neurotoxicology in China
Cai, Tongjian; Luo, Wenjing; Ruan, Diyun; Wu, Yi-Jun; Fox, Donald A.; Chen, Jingyuan
2016-01-01
Background: Rapid economic development in China has produced serious ecological, environmental, and health problems. Neurotoxicity has been recognized as a major public health problem. The Chinese government, research institutes, and scientists conducted extensive studies concerning the source, characteristics, and mechanisms of neurotoxicants. Objectives: This paper presents, for the first time, a comprehensive history and review of major sources of neurotoxicants, national bodies/legislation engaged, and major neurotoxicology research in China. Methods: Peer-reviewed research and pollution studies by Chinese scientists from 1991 to 2015 were examined. PubMed, Web of Science and Chinese National Knowledge Infrastructure (CNKI) were the major search tools. Results: The central problem is an increased exposure to neurotoxicants from air and water, food contamination, e-waste recycling, and manufacturing of household products. China formulated an institutional framework and standards system for management of major neurotoxicants. Basic and applied research was initiated, and international cooperation was achieved. The annual number of peer-reviewed neurotoxicology papers from Chinese authors increased almost 30-fold since 2001. Conclusions: Despite extensive efforts, neurotoxicity remains a significant public health problem. This provides great challenges and opportunities. We identified 10 significant areas that require major educational, environmental, governmental, and research efforts, as well as attention to public awareness. For example, there is a need to increase efforts to utilize new in vivo and in vitro models, determine the potential neurotoxicity and mechanisms involved in newly emerging pollutants, and examine the effects and mechanisms of mixtures. In the future, we anticipate working with scientists worldwide to accomplish these goals and eliminate, prevent and treat neurotoxicity. Citation: Cai T, Luo W, Ruan D, Wu YJ, Fox DA, Chen J. 2016. 
The history, status, gaps, and future directions of neurotoxicology in China. Environ Health Perspect 124:722–732; http://dx.doi.org/10.1289/ehp.1409566 PMID:26824332
Modeling Atmospheric CO2 Processes to Constrain the Missing Sink
NASA Technical Reports Server (NTRS)
Kawa, S. R.; Denning, A. S.; Erickson, D. J.; Collatz, J. C.; Pawson, S.
2005-01-01
We report on a NASA supported modeling effort to reduce uncertainty in carbon cycle processes that create the so-called missing sink of atmospheric CO2. Our overall objective is to improve characterization of CO2 source/sink processes globally with improved formulations for atmospheric transport, terrestrial uptake and release, biomass and fossil fuel burning, and observational data analysis. The motivation for this study follows from the perspective that progress in determining CO2 sources and sinks beyond the current state of the art will rely on utilization of more extensive and intensive CO2 and related observations including those from satellite remote sensing. The major components of this effort are: 1) Continued development of the chemistry and transport model using analyzed meteorological fields from the Goddard Global Modeling and Assimilation Office, with comparison to real time data in both forward and inverse modes; 2) An advanced biosphere model, constrained by remote sensing data, coupled to the global transport model to produce distributions of CO2 fluxes and concentrations that are consistent with actual meteorological variability; 3) Improved remote sensing estimates for biomass burning emission fluxes to better characterize interannual variability in the atmospheric CO2 budget and to better constrain the land use change source; 4) Evaluating the impact of temporally resolved fossil fuel emission distributions on atmospheric CO2 gradients and variability; 5) Testing the impact of existing and planned remote sensing data sources (e.g., AIRS, MODIS, OCO) on inference of CO2 sources and sinks, and use of the model to help establish measurement requirements for future remote sensing instruments. The results will help to prepare for the use of OCO and other satellite data in a multi-disciplinary carbon data assimilation system for analysis and prediction of carbon cycle changes and carbon-climate interactions.
Extensive dispersal of Roanoke logperch (Percina rex) inferred from genetic marker data
Roberts, James H.; Angermeier, Paul; Hallerman, Eric M.
2016-01-01
The dispersal ecology of most stream fishes is poorly characterised, complicating conservation efforts for these species. We used microsatellite DNA marker data to characterise dispersal patterns and effective population size (Ne) for a population of Roanoke logperch Percina rex, an endangered darter (Percidae). Juveniles and candidate parents were sampled for 2 years at sites throughout the Roanoke River watershed. Dispersal was inferred via genetic assignment tests (ATs), pedigree reconstruction (PR) and estimation of lifetime dispersal distance under a genetic isolation-by-distance model. Estimates of Ne varied from 105 to 1218 individuals, depending on the estimation method. Based on PR, polygamy was frequent in parents of both sexes, with individuals spawning with an average of 2.4 mates. The sample contained 61 half-sibling pairs, but only one parent–offspring pair and no full-sib pairs, which limited our ability to discriminate natal dispersal of juveniles from breeding dispersal of their parents between spawning events. Nonetheless, all methods indicated extensive dispersal. The AT indicated unrestricted dispersal among sites ≤15 km apart, while siblings inferred by the PR were captured an average of 14 km and up to 55 km apart. Model-based estimates of median lifetime dispersal distance (6–24 km, depending on assumptions) bracketed AT and PR estimates, indicating that widely dispersed individuals do, on average, contribute to gene flow. Extensive dispersal of P. rex suggests that darters and other small benthic stream fishes may be unexpectedly mobile. Monitoring and management activities for such populations should encompass entire watersheds to fully capture population dynamics.
Lewis, Dale; Bell, Samuel D.; Fay, John; Bothi, Kim L.; Gatere, Lydiah; Kabila, Makando; Mukamba, Mwangala; Matokwani, Edwin; Mushimbalume, Matthews; Moraru, Carmen I.; Lehmann, Johannes; Lassoie, James; Wolfe, David; Lee, David R.; Buck, Louise; Travis, Alexander J.
2011-01-01
In the Luangwa Valley, Zambia, persistent poverty and hunger present linked challenges to rural development and biodiversity conservation. Both household coping strategies and larger-scale economic development efforts have caused severe natural resource degradation that limits future economic opportunities and endangers ecosystem services. A model based on a business infrastructure has been developed to promote and maintain sustainable agricultural and natural resource management practices, leading to direct and indirect conservation outcomes. The Community Markets for Conservation (COMACO) model operates primarily with communities surrounding national parks, strengthening conservation benefits produced by these protected areas. COMACO first identifies the least food-secure households and trains them in sustainable agricultural practices that minimize threats to natural resources while meeting household needs. In addition, COMACO identifies people responsible for severe natural resource depletion and trains them to generate alternative income sources. In an effort to maintain compliance with these practices, COMACO provides extension support and access to high-value markets that would otherwise be inaccessible to participants. Because the model is continually evolving via adaptive management, success or failure of the model as a whole is difficult to quantify at this early stage. We therefore test specific hypotheses and present data documenting the stabilization of previously declining wildlife populations; the meeting of thresholds of productivity that give COMACO access to stable, high-value markets and progress toward economic self-sufficiency; and the adoption of sustainable agricultural practices by participants and other community members. Together, these findings describe a unique, business-oriented model for poverty alleviation, food production, and biodiversity conservation. PMID:21873184
Hartman, Sarah; Widaman, Keith F; Belsky, Jay
2015-08-01
Manuck, Craig, Flory, Halder, and Ferrell (2011) reported that a theoretically anticipated effect of family rearing on girls' menarcheal age was genetically moderated by two single nucleotide polymorphisms (SNPs) of the estrogen receptor-α gene. We sought to replicate and extend these findings, studying 210 White females followed from birth. The replication was a conceptual one, because a different measure of the rearing environment was used in this inquiry (i.e., maternal sensitivity) than in the prior one (i.e., family cohesion). Extensions of the work included prospective rather than retrospective measurement of the rearing environment, reports of first menstruation within a year of its occurrence rather than decades later, accounting for some heritability of menarcheal age by controlling for maternal age of menarche, and using a new model-fitting approach to competitively compare diathesis-stress versus differential-susceptibility models of Gene × Environment interaction. The replication/extension effort proved successful in the case of both estrogen receptor-α SNPs, with the Gene × Environment interactions principally reflecting diathesis-stress: lower levels of maternal sensitivity predicted earlier age of menarche for girls homozygous for the minor alleles of either SNP, but not for girls carrying other genotypes. Results are discussed in light of the new analytic methods adopted.
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. 
Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
Cichosz, Simon Lebech; Johansen, Mette Dencker; Hejlesen, Ole
2015-10-14
Diabetes is one of the top priorities in medical science and health care management, and an abundance of data and information is available on these patients. Whether data stem from statistical models or complex pattern recognition models, they may be fused into predictive models that combine patient information and prognostic outcome results. Such knowledge could be used in clinical decision support, disease surveillance, and public health management to improve patient care. Our aim was to review the literature and give an introduction to predictive models in screening for and the management of prevalent short- and long-term complications in diabetes. Predictive models have been developed for management of diabetes and its complications, and the number of publications on such models has been growing over the past decade. Often, multiple logistic regression or a similar linear model is used for prediction model development, possibly owing to its transparent functionality. Ultimately, for prediction models to prove useful, they must demonstrate impact, namely, their use must generate better patient outcomes. Although extensive effort has been put into building these predictive models, there is a remarkable scarcity of impact studies. © 2015 Diabetes Technology Society.
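The logistic-regression style of risk model the review describes can be sketched in a few lines (synthetic data and coefficients; this is an illustration of the model class, not a clinical model):

```python
# Minimal sketch of a logistic-regression risk model of the kind the
# review describes, fit by gradient descent on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical standardized predictors (e.g. HbA1c, diabetes duration).
X = rng.normal(size=(200, 2))
true_w = np.array([1.5, -0.8])  # illustrative "true" effects
y = (rng.random(200) < 1 / (1 + np.exp(-(X @ true_w)))).astype(float)

# Fit by simple gradient descent on the log-loss.
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w)))  # predicted risk
    w -= 0.1 * X.T @ (p - y) / len(y)

print("estimated coefficients:", w)  # should approximate true_w
```

The transparency the review mentions is visible here: each coefficient directly states how a one-unit change in a predictor shifts the log-odds of the complication.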
Quality assurance of the gene ontology using abstraction networks.
Ochs, Christopher; Perl, Yehoshua; Halper, Michael; Geller, James; Lomax, Jane
2016-06-01
The gene ontology (GO) is used extensively in the field of genomics. Like other large and complex ontologies, quality assurance (QA) efforts for GO's content can be laborious and time consuming. Abstraction networks (AbNs) are summarization networks that reveal and highlight high-level structural and hierarchical aggregation patterns in an ontology. They have been shown to successfully support QA work in the context of various ontologies. Two kinds of AbNs, called the area taxonomy and the partial-area taxonomy, are developed for GO hierarchies and derived specifically for the biological process (BP) hierarchy. Within this framework, several QA heuristics, based on the identification of groups of anomalous terms which exhibit certain taxonomy-defined characteristics, are introduced. Such groups are expected to have higher error rates when compared to other terms. Thus, by focusing QA efforts on anomalous terms one would expect to find relatively more erroneous content. By automatically identifying these potential problem areas within an ontology, time and effort will be saved during manual reviews of GO's content. BP is used as a testbed, with samples of three kinds of anomalous BP terms chosen for a taxonomy-based QA review. Additional heuristics for QA are demonstrated. From the results of this QA effort, it is observed that different kinds of inconsistencies in the modeling of GO can be exposed with the use of the proposed heuristics. For comparison, the results of QA work on a sample of terms chosen from GO's general population are presented.
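The area-taxonomy idea can be sketched minimally: terms are grouped into "areas" by the exact set of relationship types they bear, and small, structurally unusual groups become QA candidates (the terms and relationships below are illustrative, not real GO content):

```python
# Minimal sketch (hypothetical terms/relationships) of the area-taxonomy
# grouping: terms with identical sets of relationship types form an area;
# tiny areas flag structurally anomalous terms for manual QA review.
from collections import defaultdict

# term -> set of relationship types (illustrative only)
terms = {
    "cell division":      {"is_a", "part_of"},
    "mitotic cell cycle": {"is_a", "part_of"},
    "apoptotic process":  {"is_a"},
    "response to stress": {"is_a"},
    "odd term":           {"is_a", "part_of", "regulates"},
}

areas = defaultdict(list)
for term, rels in terms.items():
    areas[frozenset(rels)].append(term)

# Singleton areas are candidate anomalies worth reviewer attention.
for rels, members in areas.items():
    if len(members) == 1:
        print("anomalous:", members[0], sorted(rels))
```

This is only the grouping step; the paper's heuristics additionally exploit hierarchical aggregation patterns within and across such areas.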
During the past decade we have extensively studied coastal ecosystems in the Great Lakes. Some research efforts have linked coastal receiving systems to conditions in their contributing watersheds; others have focused on developing invasive species detection and monitoring strat...
Sex Trafficking, Violence Victimization, and Condom Non-Use Among Prostituted Women in Nicaragua
Decker, Michele R.; Mack, Katelyn P.; Barrows, Jeffery J.; Silverman, Jay G.
2013-01-01
Synopsis: Prostituted women report disempowerment-related barriers to condom use, extensive violence victimization and trafficking experiences; findings indicate that disempowerment must be addressed within STI/HIV prevention efforts. PMID:19577234
DOT National Transportation Integrated Search
1991-05-01
This study is part of an effort by the National Highway Traffic Safety : Administration (NHTSA) to determine the accuracy of the VASCAR-plus speed measurement device. VASCAR-plus is used extensively for speed law enforcement by state and local police...
NASA Astrophysics Data System (ADS)
Dronova, I.; Taddeo, S.; Foster, K.
2017-12-01
Projecting ecosystem responses to global change relies on the accurate understanding of properties governing their functions in different environments. An important variable in models of ecosystem function is canopy leaf area index (LAI; leaf area per unit ground area) declared as one of the Essential Climate Variables in the Global Climate Observing System and extensively measured in terrestrial landscapes. However, wetlands have been largely under-represented in these efforts, which globally limits understanding of their contribution to carbon sequestration, climate regulation and resilience to natural and anthropogenic disturbances. This study provides a global synthesis of >350 wetland-specific LAI observations from 182 studies and compares LAI among wetland ecosystem and vegetation types, biomes and measurement approaches. Results indicate that most wetland types and even individual locations show a substantial local dispersion of LAI values (average coefficient of variation 65%) due to heterogeneity of environmental properties and vegetation composition. Such variation indicates that mean LAI values may not sufficiently represent complex wetland environments, and the use of this index in ecosystem function models needs to incorporate within-site variation in canopy properties. Mean LAI did not significantly differ between direct and indirect measurement methods on a pooled global sample; however, within some of the specific biomes and wetland types significant contrasts between these approaches were detected. These contrasts highlight unique aspects of wetland vegetation physiology and canopy structure affecting measurement principles that need to be considered in generalizing canopy properties in ecosystem models. Finally, efforts to assess wetland LAI using remote sensing strongly indicate the promise of this technology for cost-effective regional-scale modeling of canopy properties similar to terrestrial systems. 
However, such efforts urgently require more rigorous corrections for three-dimensional contributions of non-canopy material and non-vegetated surfaces to wetland canopy reflectance.
Baker, Matthew R; Schindler, Daniel E; Essington, Timothy E; Hilborn, Ray
2014-01-01
Few studies have considered the management implications of mortality to target fish stocks caused by non-retention in commercial harvest gear (escape mortality). We demonstrate the magnitude of this previously unquantified source of mortality and its implications for the population dynamics of exploited stocks, biological metrics, stock productivity, and optimal management. Non-retention in commercial gillnet fisheries for Pacific salmon (Oncorhynchus spp.) is common and often leads to delayed mortality in spawning populations. This represents losses, not only to fishery harvest, but also in future recruitment to exploited stocks. We estimated incidence of non-retention in Alaskan gillnet fisheries for sockeye salmon (O. nerka) and found disentanglement injuries to be extensive and highly variable between years. Injuries related to non-retention were noted in all spawning populations, and incidence of injury ranged from 6% to 44% of escaped salmon across nine river systems over five years. We also demonstrate that non-retention rates strongly correlate with fishing effort. We applied maximum likelihood and Bayesian approaches to stock-recruitment analyses, discounting estimates of spawning salmon to account for fishery-related mortality in escaped fish. Discounting spawning stock estimates as a function of annual fishing effort improved model fits to historical stock-recruitment data in most modeled systems. This suggests the productivity of exploited stocks has been systematically underestimated. It also suggests that indices of fishing effort may be used to predict escape mortality and correct for losses. Our results illustrate how explicitly accounting for collateral effects of fishery extraction may improve estimates of productivity and better inform management metrics derived from estimates of stock-recruitment analyses.
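The effort-discounting idea in the abstract can be sketched with a linearized Ricker stock-recruitment fit (all numbers below are synthetic, and the escape-mortality rate is a hypothetical assumption, not the authors' estimate):

```python
# Minimal sketch (synthetic numbers): discount spawner counts by
# effort-related escape mortality before fitting a linearized Ricker
# model, ln(R/S) = a - b*S.
import numpy as np

S_obs  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # observed escapement
R      = np.array([2.6, 4.1, 4.8, 4.9, 4.6])   # recruits, synthetic
effort = np.array([0.2, 0.4, 0.5, 0.6, 0.8])   # standardized fishing effort
m = 0.2  # hypothetical escape-mortality rate per unit effort

S_eff = S_obs * (1 - m * effort)  # effective spawners after escape mortality

def ricker_fit(S, R):
    """Least-squares fit of ln(R/S) = a - b*S; returns (a, b)."""
    slope, intercept = np.polyfit(S, np.log(R / S), 1)
    return intercept, -slope

a_raw, b_raw = ricker_fit(S_obs, R)
a_adj, b_adj = ricker_fit(S_eff, R)
print(f"productivity: raw a={a_raw:.2f}, effort-adjusted a={a_adj:.2f}")
```

Discounting spawners raises the fitted productivity intercept, matching the paper's conclusion that ignoring escape mortality systematically underestimates stock productivity.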
NASA Technical Reports Server (NTRS)
Gradl, Paul R.; Valentine, Peter G.
2017-01-01
Upper stage and in-space liquid rocket engines are optimized for performance through the use of high area ratio nozzles to fully expand combustion gases to low exit pressures, increasing exhaust velocities. Due to the large size of such nozzles, and the related engine performance requirements, carbon-carbon (C-C) composite nozzle extensions are being considered to reduce weight impacts. Currently, the state of the art is represented by the metallic and foreign-made composite nozzle extensions, limited to approximately 2000 degrees F, used on the Atlas V, Delta IV, Falcon 9, and Ariane 5 launch vehicles. NASA and industry partners are working towards advancing the domestic supply chain for C-C composite nozzle extensions. These development efforts are primarily being conducted through the NASA Small Business Innovation Research (SBIR) program in addition to other low-level internal research efforts. This has allowed for the initial material development and characterization, subscale hardware fabrication, and completion of hot-fire testing in relevant environments. NASA and industry partners have designed, fabricated and hot-fire tested several subscale domestically produced C-C extensions to advance the material and coatings fabrication technology for use with a variety of liquid rocket and scramjet engines. Testing at NASA's Marshall Space Flight Center (MSFC) evaluated heritage and state-of-the-art C-C materials and coatings, demonstrating the initial capabilities of the high temperature materials and their fabrication methods. This paper discusses the initial material development, design and fabrication of the subscale carbon-carbon nozzle extensions, provides an overview of the test campaign, presents results of the hot fire testing, and discusses potential follow-on development work.
The follow on work includes the fabrication of ultra-high temperature materials, larger C-C nozzle extensions, material characterization, sub-element testing and hot-fire testing at larger scale.
Taking Innovation To Scale In Primary Care Practices: The Functions Of Health Care Extension
Ono, Sarah S.; Crabtree, Benjamin F.; Hemler, Jennifer R.; Balasubramanian, Bijal A.; Edwards, Samuel T.; Green, Larry A.; Kaufman, Arthur; Solberg, Leif I.; Miller, William L.; Woodson, Tanisha Tate; Sweeney, Shannon M.; Cohen, Deborah J.
2018-01-01
Health care extension is an approach to providing external support to primary care practices with the aim of diffusing innovation. EvidenceNOW was launched to rapidly disseminate and implement evidence-based guidelines for cardiovascular preventive care in the primary care setting. Seven regional grantee cooperatives provided the foundational elements of health care extension—technological and quality improvement support, practice capacity building, and linking with community resources—to more than two hundred primary care practices in each region. This article describes how the cooperatives varied in their approaches to extension and provides early empirical evidence that health care extension is a feasible and potentially useful approach for providing quality improvement support to primary care practices. With investment, health care extension may be an effective platform for federal and state quality improvement efforts to create economies of scale and provide practices with more robust and coordinated support services. PMID:29401016
Biomass power for rural development. Technical progress report, January 1--March 31, 1998
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neuhauser, E.
Brief progress reports are presented on the following tasks: design packages for retrofits at the Dunkirk Station; fuel supply and site development plans; major equipment guarantees and project risk sharing; power production commitment; power plant site plan, construction and environmental permits; and experimental strategies for system evaluation. The paper then discusses in more detail the following: feedstock development efforts; clone-site testing and genetic studies; and efforts at outreach, extension and technology transfer.
ERIC Educational Resources Information Center
Wagner, Mary; Newman, Lynn; Cameto, Renee
2004-01-01
Background: Since the early 1980s there have been extensive federal, state, and local efforts to improve schools for all students, including broad policy initiatives intended to change the school experiences of students with disabilities. These efforts have had significant impacts on policy and practice at all levels of the education system,…
Domain Wall Evolution in Phase Transforming Oxides
2015-01-14
configurations under driving forces (e.g. changes in temperature and electric fields) in an effort to: 1) understand the underlying linkage between... Related publication: "Extensive domain wall motion and deaging resistance in morphotropic 0.55Bi(Ni1/2Ti1/2)O3–0.45PbTiO3 polycrystalline ferroelectrics," Applied Physics Letters
[Sincerity of effort: isokinetic evaluation of knee extension].
Colombo, R; Demaiti, G; Sartorio, F; Orlandini, D; Vercelli, S; Ferriero, G
2008-01-01
The aim of this study was to find a reliable method to evaluate the sincerity of the muscular maximal effort performed in a dynamometric isokinetic test of knee flexion-extension. The coefficient of variation of the peak torque (CV) and 3 new indices were analysed: (1) the average coefficient of variation calculated on the complete peak torque curve (CVM); (2) the slope of the regression line in an endurance test (PRR); (3) the correlation coefficient of the peak torques in the same endurance test (CCR). Twenty healthy subjects underwent assessment in two different trials, maximal (MX) and 50% submaximal (SMX), with 20 minutes of rest between trials. Each trial consisted of 4 tests, each of 3 repetitions, at angular speeds of 30, 180, 30, and 180 degrees/s, respectively, and 1 test of 15 repetitions at 240 degrees/s. Our findings confirmed the ability of CV to detect a high percentage of sincere efforts: at 30 degrees/s Sensitivity (Sns)=100% and Specificity (Spc)=70%; at 180 degrees/s Sns=75%, Spc=95%. The 3 new indices proposed here showed high sensitivity and specificity, generally better than those of CV. CVM showed Sns=90% and Spc=100% at 180 degrees/s, and Sns=90%, Spc=75% at 30 degrees/s. PRR was the best index, identifying all efforts except one (Sns=100%, Spc=95%). The CCR coefficient showed Sns and Spc values both of 90%.
NASA Astrophysics Data System (ADS)
Gan, Zhixing; Xu, Hao; Hao, Yanling
2016-04-01
Luminescent nanomaterials, with wide applications in biosensing, bioimaging, illumination and display techniques, have been consistently garnering enormous research attention. In particular, those with wavelength-controllable emissions could be highly beneficial. Carbon nanostructures, including graphene quantum dots (GQDs) and other graphene oxide derivatives (GODs), exhibit excitation-dependent photoluminescence (PL), meaning their fluorescence color can be tuned simply by changing the excitation wavelength, and have attracted considerable interest. However, the intrinsic mechanism of the excitation-dependent PL remains obscure and fiercely debated. In this review, we attempt to summarize the latest efforts to explore the mechanism, including the quantum confinement effect, surface traps model, giant red-edge effect, edge states model and electronegativity of heteroatom model, as well as the newly developed synergistic model, to seek some clues to unravel the mechanism. Meanwhile, the unresolved difficulties of each model are further discussed. In addition, the challenges and the potential influence of synthetic methodology on the development of these materials are discussed extensively to elicit further thought and constructive attempts toward their application.
Expanding Health Care Access Through Education: Dissemination and Implementation of the ECHO Model.
Katzman, Joanna G; Galloway, Kevin; Olivas, Cynthia; McCoy-Stafford, Kimberly; Duhigg, Daniel; Comerci, George; Kalishman, Summers; Buckenmaier, Chester C; McGhee, Laura; Joltes, Kristin; Bradford, Andrea; Shelley, Brian; Hernandez, Jessica; Arora, Sanjeev
2016-03-01
Project ECHO (Extension for Community Healthcare Outcomes) is an evidence-based model that provides high-quality medical education for common and complex diseases through telementoring and comanagement of patients with primary care clinicians. In a one-to-many knowledge network, the ECHO model helps to bridge the gap between primary care clinicians and specialists by enhancing the knowledge, skills, confidence, and practice of primary care clinicians in their local communities. As a result, patients in rural and urban underserved areas are able to receive best practice care without long waits or having to travel long distances. The ECHO model has been replicated in 43 university hubs in the United States and five other countries. A new replication tool was developed by the Project ECHO Pain team and U.S. Army Medical Command to ensure a high-fidelity replication of the model. The adoption of the tool led to successful replication of ECHO in the Army Pain initiative. This replication tool has the potential to improve the fidelity of ECHO replication efforts around the world. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
Equivalent Relaxations of Optimal Power Flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bose, S; Low, SH; Teeraratkul, T
2015-03-01
Several convex relaxations of the optimal power flow (OPF) problem have recently been developed using both bus injection models and branch flow models. In this paper, we prove relations among three convex relaxations: a semidefinite relaxation that computes a full matrix, a chordal relaxation based on a chordal extension of the network graph, and a second-order cone relaxation that computes the smallest partial matrix. We prove a bijection between the feasible sets of the OPF in the bus injection model and the branch flow model, establishing the equivalence of these two models and their second-order cone relaxations. Our results imply that, for radial networks, all these relaxations are equivalent and one should always solve the second-order cone relaxation. For mesh networks, the semidefinite relaxation and the chordal relaxation are equally tight and both are strictly tighter than the second-order cone relaxation. Therefore, for mesh networks, one should either solve the chordal relaxation or the SOCP relaxation, trading off tightness and the required computational effort. Simulations are used to illustrate these results.
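The single-line building block of the radial-network result can be checked numerically: for a 2x2 Hermitian partial matrix, the second-order cone condition |Wij|² ≤ Wii·Wjj coincides with positive semidefiniteness (this is a toy check of one ingredient, not the paper's proof):

```python
# Tiny numerical check: for one line (a 2x2 Hermitian partial matrix with
# nonnegative diagonal), the SOC condition |Wij|^2 <= Wii*Wjj is
# equivalent to positive semidefiniteness.
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    wii, wjj = rng.uniform(0, 2, size=2)
    wij = rng.uniform(-2, 2) + 1j * rng.uniform(-2, 2)
    W = np.array([[wii, wij], [np.conj(wij), wjj]])
    soc = abs(wij) ** 2 <= wii * wjj + 1e-9        # SOC feasibility
    psd = np.linalg.eigvalsh(W).min() >= -1e-9     # PSD feasibility
    assert soc == psd
print("SOC condition matches PSD for all 1000 random 2x2 matrices")
```

On a tree, the full-matrix PSD condition decomposes into exactly such per-line 2x2 conditions, which is the intuition behind the equivalence of the relaxations on radial networks.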
Modeling Alzheimer’s disease in transgenic rats
2013-01-01
Alzheimer’s disease (AD) is the most common form of dementia. At the diagnostic stage, the AD brain is characterized by the accumulation of extracellular amyloid plaques, intracellular neurofibrillary tangles and neuronal loss. Despite the large variety of therapeutic approaches, this condition remains incurable, since at the time of clinical diagnosis, the brain has already suffered irreversible and extensive damage. In recent years, it has become evident that AD starts decades prior to its clinical presentation. In this regard, transgenic animal models can shed much light on the mechanisms underlying this “pre-clinical” stage, enabling the identification and validation of new therapeutic targets. This paper summarizes the formidable efforts to create models mimicking the various aspects of AD pathology in the rat. Transgenic rat models offer distinctive advantages over mice. Rats are physiologically, genetically and morphologically closer to humans. More importantly, the rat has a well-characterized, rich behavioral display. Consequently, rat models of AD should allow a more sophisticated and accurate assessment of the impact of pathology and novel therapeutics on cognitive outcomes. PMID:24161192
Major advances in extension education programs in dairy production.
Chase, L E; Ely, L O; Hutjens, M F
2006-04-01
The dairy industry has seen structural changes in the last 25 yr that have an impact on extension programming. The number of cows in the United States has decreased by 17%, whereas the number of dairy farms has decreased by 74%. The average milk production per cow has increased from 5,394 to 8,599 kg/lactation. Even though there are fewer farms, dairy farm managers are asking for more specific and targeted information. The extension resources available have also decreased during this period. Because of these changes, shifts have taken place in extension programming and staffing. A key change has been a shift to subject matter-targeted programs and workshops. Extension has also incorporated and expanded use of the Internet. Discussion groups, subject matter courses, and searchable databases are examples of Internet use. There will be continuing shifts in the demographics of the US dairy industry that will influence future extension efforts. It is also probable that fewer extension professionals will be available to provide programming due to changes in funding sources at national, state, and local levels. Future shifts in extension programming will be needed to provide the information needs of the industry with a smaller number of extension workers.
Allocating HIV prevention funds in the United States: recommendations from an optimization model.
Lasry, Arielle; Sansom, Stephanie L; Hicks, Katherine A; Uzunangelov, Vladislav
2012-01-01
The Centers for Disease Control and Prevention (CDC) had an annual budget of approximately $327 million to fund health departments and community-based organizations for core HIV testing and prevention programs domestically between 2001 and 2006. Annual HIV incidence has been relatively stable since the year 2000 and was estimated at 48,600 cases in 2006 and 48,100 in 2009. Using estimates on HIV incidence, prevalence, prevention program costs and benefits, and current spending, we created an HIV resource allocation model that can generate a mathematically optimal allocation of the Division of HIV/AIDS Prevention's extramural budget for HIV testing, and counseling and education programs. The model's data inputs and methods were reviewed by subject matter experts internal and external to the CDC via an extensive validation process. The model projects the HIV epidemic for the United States under different allocation strategies under a fixed budget. Our objective is to support national HIV prevention planning efforts and inform the decision-making process for HIV resource allocation. Model results can be summarized into three main recommendations. First, more funds should be allocated to testing and these should further target men who have sex with men and injecting drug users. Second, counseling and education interventions ought to provide a greater focus on HIV positive persons who are aware of their status. And lastly, interventions should target those at high risk for transmitting or acquiring HIV, rather than lower-risk members of the general population. The main conclusions of the HIV resource allocation model have played a role in the introduction of new programs and provide valuable guidance to target resources and improve the impact of HIV prevention efforts in the United States.
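The allocation logic can be sketched with a toy greedy model under a linear effectiveness assumption (all costs, impacts, and the budget below are hypothetical illustrations, not the CDC model's inputs or its mathematical-programming formulation):

```python
# Minimal sketch (hypothetical numbers, not the CDC model): allocate a
# fixed budget greedily by infections averted per dollar, illustrating
# why high-risk groups are funded first. Partial funding scales linearly.
budget = 100.0  # hypothetical budget, in $ millions

# intervention -> (cost in $ millions, infections averted if fully funded)
options = {
    "testing, MSM":             (30.0, 900.0),
    "testing, IDU":             (20.0, 500.0),
    "counseling, HIV-positive": (25.0, 450.0),
    "outreach, general pop.":   (50.0, 300.0),
}

allocation, averted = {}, 0.0
for name, (cost, impact) in sorted(options.items(),
                                   key=lambda kv: kv[1][1] / kv[1][0],
                                   reverse=True):
    spend = min(cost, budget)
    allocation[name] = spend
    averted += impact * spend / cost
    budget -= spend

print(allocation)
print(f"infections averted: {averted:.0f}")
```

Under these toy numbers the lowest-yield option (general-population outreach) is only partially funded, mirroring the model's recommendation to shift resources toward high-risk groups.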
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates using a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the second for the demonstration cases and target HPC systems employed.
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi
2015-01-01
Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
NASA Astrophysics Data System (ADS)
Ayres, Thomas
2009-07-01
This is a Calibration Archival proposal to develop, implement, and test enhancements to the pipeline wavelength scales of STIS echelle spectra, to take full advantage of the extremely high performance of which the instrument is capable. The motivation is a recent extensive investigation--The Deep Lamp Project--which identified systematic wavelength distortions in all 44 primary and secondary settings of the four STIS echelle modes: E140M, E140H, E230M, and E230H. The method was to process deep exposures of the onboard Pt/Cr-Ne calibration source as if they were science images, and measure deviations of the lamp lines from their laboratory wavelengths. An approach has been developed to correct the distortions post facto, but it would be preferable to implement a more robust dispersion model in the pipeline itself. The proposed study will examine a more extensive set of WAVECALs than in the exploratory Deep Lamp effort, and will benefit from a new laboratory line list specifically for the STIS lamps. Ironing out the wrinkles in the STIS wavelength scales will impact many diverse science investigations, especially the Legacy Archival project "StarCAT."
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi
2015-04-01
Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
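Because BDML is XML-based, any tool with a standard XML parser can read it, which is the interoperability point the abstract makes. A minimal sketch follows (the element and attribute names are hypothetical illustrations; the real schema is at http://ssbd.qbic.riken.jp/bdml/):

```python
# Hypothetical BDML-like fragment (element names are illustrative, not the
# real BDML schema) parsed with the standard library, showing why an
# XML-based unified format is machine-readable by any tool.
import xml.etree.ElementTree as ET

fragment = """
<bdml version="0.2">
  <object name="nucleus">
    <measurement t="0.0" x="1.2" y="3.4" z="0.5"/>
    <measurement t="1.0" x="1.3" y="3.3" z="0.6"/>
  </object>
</bdml>
"""

root = ET.fromstring(fragment)
track = [(float(m.get("t")), float(m.get("x")))
         for m in root.iter("measurement")]
print(track)  # [(0.0, 1.2), (1.0, 1.3)]
```

The same parsing code works regardless of which imaging or simulation tool produced the file, which is the benefit of agreeing on one schema.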
NASA Technical Reports Server (NTRS)
Beeson, Harold D.; Davis, Dennis D.; Ross, William L., Sr.; Tapphorn, Ralph M.
2002-01-01
This document represents efforts accomplished at the NASA Johnson Space Center White Sands Test Facility (WSTF) in support of the Enhanced Technology for Composite Overwrapped Pressure Vessels (COPV) Program, a joint research and technology effort among the U.S. Air Force, NASA, and the Aerospace Corporation. WSTF performed testing for several facets of the program. Testing that contributed to the Task 3.0 COPV database extension objective included baseline structural strength, failure mode and safe-life, impact damage tolerance, sustained load/impact effect, and materials compatibility. WSTF was also responsible for establishing impact protection and control requirements under Task 8.0 of the program. This included developing a methodology for establishing an impact control plan. Seven test reports detail the work done at WSTF. As such, this document contributes to the database of information regarding COPV behavior that will ensure performance benefits and safety are maintained throughout vessel service life.
Flexible Rover Architecture for Science Instrument Integration and Testing
NASA Technical Reports Server (NTRS)
Bualat, Maria G.; Kobayashi, Linda; Lee, Susan Y.; Park, Eric
2006-01-01
At NASA Ames Research Center, the Intelligent Robotics Group (IRG) fields the K9 and K10 class rovers. Both use a mobile robot hardware architecture designed for extensibility and reconfigurability that allows for rapid changes in instrumentation and provides a high degree of modularity. Over the past several years, we have worked with instrument developers at NASA centers, universities, and national laboratories to integrate or partially integrate their instruments onboard the K9 and K10 rovers. Early efforts required considerable interaction to work through integration issues such as power, data protocol and mechanical mounting. These interactions informed the design of our current avionics architecture, and have simplified more recent integration projects. In this paper, we will describe the IRG extensible avionics and software architecture and the effect it has had on our recent instrument integration efforts, including integration of four Mars Instrument Development Program devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, C.A.; Conant, R.A.; Golich, G.M.
1995-12-31
This paper summarizes the (preliminary) findings from extensive field studies of hydraulic fracture orientation in diatomite waterfloods and related efforts to monitor the induced surface subsidence. Included are case studies from the Belridge and Lost Hills diatomite reservoirs. The primary purpose of the paper is to document a large volume of tiltmeter hydraulic fracture orientation data that demonstrates waterflood-induced fracture reorientation--a phenomenon not previously considered in waterflood development planning. Also included is a brief overview of three possible mechanisms for the observed waterflood fracture reorientation. A discussion section details efforts to isolate the operative mechanism(s) from the most extensive case study, as well as suggesting a possible strategy for detecting and possibly mitigating some of the adverse effects of production/injection induced reservoir stress changes--reservoir compaction and surface subsidence as well as fracture reorientation.
Status of commercial fuel cell powerplant system development
NASA Technical Reports Server (NTRS)
Warshay, Marvin
1987-01-01
The primary focus is on the development of commercial Phosphoric Acid Fuel Cell (PAFC) powerplant systems because the PAFC, which has undergone extensive development, is currently the closest fuel cell system to commercialization. Shorter discussions are included on the high temperature fuel cell systems which are not as mature in their development, such as the Molten Carbonate Fuel Cell (MCFC) and the Solid Oxide Fuel Cell (SOFC). The alkaline and the Solid Polymer Electrolyte (SPE) fuel cell systems are also included, but their discussions are limited to their prospects for commercial development. Currently, although the alkaline fuel cell continues to be used for important space applications, there are no commercial development programs of significant size in the USA and only small efforts outside. The market place for fuel cells and the status of fuel cell programs in the USA receive extensive treatment. The fuel cell efforts outside the USA, especially the large Japanese programs, are also discussed.
Detection and Production of Methane Hydrate
DOE Office of Scientific and Technical Information (OSTI.GOV)
George Hirasaki; Walter Chapman; Gerald Dickens
This project seeks to understand regional differences in gas hydrate systems from the perspectives of gas hydrates as an energy resource, geohazard, and long-term climate influence. Specifically, the effort will: (1) collect data and conceptual models that target causes of gas hydrate variance, (2) construct numerical models that explain and predict regional-scale gas hydrate differences in 2-dimensions with minimal 'free parameters', (3) simulate hydrocarbon production from various gas hydrate systems to establish promising resource characteristics, (4) perturb different gas hydrate systems to assess potential impacts of hot fluids on seafloor stability and well stability, and (5) develop geophysical approaches that enable remote quantification of gas hydrate heterogeneities so that they can be characterized with minimal costly drilling. Our integrated program takes advantage of the fact that we have a close working team composed of experts in distinct disciplines. The expected outcomes of this project are improved exploration and production technology for production of natural gas from methane hydrates and improved safety through understanding of seafloor and well bore stability in the presence of hydrates. The scope of this project was to more fully characterize, understand, and appreciate fundamental differences in the amount and distribution of gas hydrate and how this would affect the production potential of a hydrate accumulation in the marine environment. The effort combines existing information from locations in the ocean that are dominated by low permeability sediments with small amounts of high permeability sediments, one permafrost location where extensive hydrates exist in reservoir quality rocks and other locations deemed by mutual agreement of DOE and Rice to be appropriate. The initial ocean locations were Blake Ridge, Hydrate Ridge, Peru Margin and GOM. The permafrost location was Mallik.
Although the ultimate goal of the project was to understand processes that control production potential of hydrates in marine settings, Mallik was included because of the extensive data collected in a producible hydrate accumulation. To date, such a location had not been studied in the oceanic environment. The project worked closely with ongoing projects (e.g. GOM JIP and offshore India) that are actively investigating potentially economic hydrate accumulations in marine settings. The overall approach was fivefold: (1) collect key data concerning hydrocarbon fluxes, which are currently missing at all locations to be included in the study, (2) use this and existing data to build numerical models that can explain gas hydrate variance at all four locations, (3) simulate how natural gas could be produced from each location with different production strategies, (4) collect new sediment property data at these locations that are required for constraining fluxes, production simulations and assessing sediment stability, and (5) develop a method for remotely quantifying heterogeneities in gas hydrate and free gas distributions. While we generally restricted our efforts to the locations where key parameters can be measured or constrained, our ultimate aim was to make our efforts universally applicable to any hydrate accumulation.
LEGO-MM: LEarning structured model by probabilistic loGic Ontology tree for MultiMedia.
Tang, Jinhui; Chang, Shiyu; Qi, Guo-Jun; Tian, Qi; Rui, Yong; Huang, Thomas S
2016-09-22
Recent advances in Multimedia ontology have resulted in a number of concept models, e.g., LSCOM and Mediamill 101, which are accessible and public to other researchers. However, most current research effort still focuses on building new concepts from scratch; few works explore an appropriate method to construct new concepts upon the existing models already in the warehouse. To address this issue, we propose a new framework in this paper, termed LEGO-MM, which can seamlessly integrate both the new target training examples and the existing primitive concept models to infer the more complex concept models. LEGO-MM treats the primitive concept models as Lego bricks with which to construct a potentially unlimited vocabulary of new concepts. Specifically, we first formulate the logic operations to be the Lego connectors that combine existing concept models hierarchically in probabilistic logic ontology trees. Then, we incorporate new target training information simultaneously to efficiently disambiguate the underlying logic tree and correct the error propagation. Extensive experiments are conducted on a large vehicle domain data set from ImageNet. The results demonstrate that LEGO-MM has significantly superior performance over existing state-of-the-art methods, which build new concept models from scratch.
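The abstract describes combining primitive concept models through probabilistic logic operations. A minimal sketch of that idea (not the authors' code; the concept names and scores below are hypothetical, and independence of the primitive outputs is assumed):

```python
# Probabilistic logic connectors, assuming independent primitive scores:
# AND -> product, OR -> inclusion-exclusion, NOT -> complement.

def p_and(p1, p2):
    return p1 * p2

def p_or(p1, p2):
    return p1 + p2 - p1 * p2

def p_not(p):
    return 1.0 - p

# Hypothetical primitive concept scores for one image.
p_wheels, p_wings = 0.9, 0.2

# Composite concept "ground vehicle" = wheels AND (NOT wings),
# i.e. one internal node of a logic ontology tree.
p_ground_vehicle = p_and(p_wheels, p_not(p_wings))
print(round(p_ground_vehicle, 3))  # 0.9 * 0.8 = 0.72
```

In the paper's framework such trees are additionally re-tuned with target training examples; the sketch only illustrates the logic-connector composition step.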
Coupling the System Analysis Module with SAS4A/SASSYS-1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fanning, T. H.; Hu, R.
2016-09-30
SAS4A/SASSYS-1 is a simulation tool used to perform deterministic analysis of anticipated events as well as design basis and beyond design basis accidents for advanced reactors, with an emphasis on sodium fast reactors. SAS4A/SASSYS-1 has been under development and in active use for nearly forty-five years, and is currently maintained by the U.S. Department of Energy under the Office of Advanced Reactor Technology. Although SAS4A/SASSYS-1 contains a very capable primary and intermediate system modeling component, PRIMAR-4, it also has some shortcomings: outdated data management and code structure make extension of the PRIMAR-4 module somewhat difficult. The user input format for PRIMAR-4 also limits the number of volumes and segments that can be used to describe a given system. The System Analysis Module (SAM) is a fairly new code development effort being carried out under the U.S. DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM is being developed with advanced physical models, numerical methods, and software engineering practices; however, it is currently somewhat limited in the system components and phenomena that can be represented. For example, component models for electromagnetic pumps and multi-layer stratified volumes have not yet been developed. Nor is there support for a balance of plant model. Similarly, system-level phenomena such as control-rod driveline expansion and vessel elongation are not represented. This report documents fiscal year 2016 work that was carried out to couple the transient safety analysis capabilities of SAS4A/SASSYS-1 with the system modeling capabilities of SAM under the joint support of the ART and NEAMS programs. The coupling effort was successful and is demonstrated by evaluating an unprotected loss of flow transient for the Advanced Burner Test Reactor (ABTR) design.
There are differences between the stand-alone SAS4A/SASSYS-1 simulations and the coupled SAS/SAM simulations, but these are mainly attributed to the limited maturity of the SAM development effort. The severe accident modeling capabilities in SAS4A/SASSYS-1 (sodium boiling, fuel melting and relocation) will continue to play a vital role for a long time. Therefore, the SAS4A/SASSYS-1 modernization effort should remain a high priority task under the ART program to ensure continued participation in domestic and international SFR safety collaborations and design optimizations. On the other hand, SAM provides an advanced system analysis tool, with improved numerical solution schemes, data management, code flexibility, and accuracy. SAM is still in early stages of development and will require continued support from NEAMS to fulfill its potential and to mature into a production tool for advanced reactor safety analysis. The effort to couple SAS4A/SASSYS-1 and SAM is the first step toward the integration of these modeling capabilities.
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr. (Principal Investigator)
1996-01-01
The goal of this research project is to develop assumed-stress hybrid elements with rotational degrees of freedom for analyzing composite structures. During the first year of the three-year activity, the effort was directed to further assess the AQ4 shell element and its extensions to buckling and free vibration problems. In addition, the development of a compatible 2-node beam element was to be accomplished. The extensions and new developments were implemented in the Computational Structural Mechanics Testbed COMET. An assessment was performed to verify the implementation and to assess the performance of these elements in terms of accuracy. During the second and third years, extensions to geometrically nonlinear problems were developed and tested. This effort involved working with the nonlinear solution strategy as well as the nonlinear formulation for the elements. This research has resulted in the development and implementation of two additional element processors (ES22 for the beam element and ES24 for the shell elements) in COMET. The software was developed using a SUN workstation and has been ported to the NASA Langley Convex named blackbird. Both element processors are now part of the baseline version of COMET.
Extending Our Understanding of Compliant Thermal Barrier Performance
NASA Technical Reports Server (NTRS)
Demange, Jeffrey J.; Finkbeiner, Joshua R.; Dunlap, Patrick H.
2014-01-01
Thermal barriers and seals are integral components in the thermal protection systems (TPS) of nearly all aerospace vehicles. They are used to minimize the flow of hot gases through interfaces and protect underlying temperature-sensitive components and systems. Although thermal barriers have been used extensively on many aerospace vehicles, the factors affecting their thermal and mechanical performance are not well-understood. Because of this, vehicle TPS designers are often left with little guidance on how to properly design and optimize these barriers. An ongoing effort to better understand thermal barrier performance and develop models and design tools is in progress at the NASA Glenn Research Center. Testing has been conducted to understand the degree to which insulation density influences structural performance and permeability. In addition, the development of both thermal and mechanical models is ongoing with the goal of providing an improved ability to design and implement these critical TPS components.
Post-Genomics and Vaccine Improvement for Leishmania
Seyed, Negar; Taheri, Tahereh; Rafati, Sima
2016-01-01
Leishmaniasis is a parasitic disease that primarily affects Asia, Africa, South America, and the Mediterranean basin. Despite extensive efforts to develop an effective prophylactic vaccine, no promising vaccine is available yet. However, recent advancements in computational vaccinology on the one hand and genome sequencing approaches on the other have generated new hopes in vaccine development. Computational genome mining for new vaccine candidates is known as reverse vaccinology and is believed to further extend the current list of Leishmania vaccine candidates. Reverse vaccinology can also reduce the intrinsic risks associated with live attenuated vaccines. Individual epitopes arranged in tandem as polytopes are also a possible outcome of reverse genome mining. Here, we will briefly compare reverse vaccinology with conventional vaccinology in respect to Leishmania vaccine, and we will discuss how it influences the aforementioned topics. We will also introduce new in vivo models that will bridge the gap between human and laboratory animal models in future studies. PMID:27092123
Maan, Martine E.; Sefc, Kristina M.
2013-01-01
Cichlid fishes constitute one of the most species-rich families of vertebrates. In addition to complex social behaviour and morphological versatility, they are characterised by extensive diversity in colouration, both within and between species. Here, we review the cellular and molecular mechanisms underlying colour variation in this group and the selective pressures responsible for the observed variation. We specifically address the evidence for the hypothesis that divergence in colouration is associated with the evolution of reproductive isolation between lineages. While we conclude that cichlid colours are excellent models for understanding the role of animal communication in species divergence, we also identify taxonomic and methodological biases in the current research effort. We suggest that the integration of genomic approaches with ecological and behavioural studies, across the entire cichlid family and beyond it, will contribute to the utility of the cichlid model system for understanding the evolution of biological diversity. PMID:23665150
Hucl, Tomas; Gallmeier, Eike; Kern, Scott E
2007-06-01
Single therapeutic agents very often fail in unselected patients. It is therefore commonplace to combine an agent specifically with a selected patient subgroup or with another agent. To support such efforts, it is useful to clarify the distinctions between the terms and the mathematical models used in analyzing combinations. To incorporate molecular disease classifications, the familiar concept of the therapeutic window is modified to define a pharmacogenetic window, which is an unambiguous numerical measure of the magnitude of interaction produced by a combination, and to define a test of pharmacogenetic synergy. In contrast, certain common comparative methods, such as vertical windows (comparing effects at a given dose) and animal models of mutational targets may be dominated by undesirable features. Although this discussion is oriented towards cancer therapy, an extension of these concepts to other comparative biologic assays is feasible and advisable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lovill, J.E.; Sullivan, T.J.; Weichel, R.L.
A total ozone retrieval model has been developed to process radiance data gathered by a satellite-mounted multichannel filter radiometer (MFR). Extensive effort went into theoretical radiative transfer modeling, a retrieval scheme was developed, and the technique was applied to the MFR radiance measurements. The high quality of the total ozone retrieval results was determined through comparisons with Dobson measurements. Included in the report are global total ozone maps for 20 days between May 12 and July 5, 1977. A comparison of MFR results for 13 days in June 1977 with Dobson spectrophotometer measurements of ozone for the same period showed good agreement: there was a root-mean-square difference of 6.2% (equivalent to 20.2 m.atm.cm). The estimated global total ozone value for June 1977 (296 m.atm.cm) was in good agreement with satellite backscatter ultraviolet data for June 1970 (304 m.atm.cm) and June 1971 (preliminary data--299 m.atm.cm).
Turan, Nil; Kalko, Susana; Stincone, Anna; Clarke, Kim; Sabah, Ayesha; Howlett, Katherine; Curnow, S John; Rodriguez, Diego A; Cascante, Marta; O'Neill, Laura; Egginton, Stuart; Roca, Josep; Falciani, Francesco
2011-09-01
Chronic Obstructive Pulmonary Disease (COPD) is an inflammatory process of the lung inducing persistent airflow limitation. Extensive systemic effects, such as skeletal muscle dysfunction, often characterize these patients and severely limit life expectancy. Despite considerable research efforts, the molecular basis of muscle degeneration in COPD is still a matter of intense debate. In this study, we have applied a network biology approach to model the relationship between muscle molecular and physiological response to training and systemic inflammatory mediators. Our model shows that failure to co-ordinately activate expression of several tissue remodelling and bioenergetics pathways is a specific landmark of COPD diseased muscles. Our findings also suggest that this phenomenon may be linked to an abnormal expression of a number of histone modifiers, which we discovered correlate with oxygen utilization. These observations raised the interesting possibility that cell hypoxia may be a key factor driving skeletal muscle degeneration in COPD patients.
NASA Astrophysics Data System (ADS)
Spicer, Graham L. C.; Azarin, Samira M.; Yi, Ji; Young, Scott T.; Ellis, Ronald; Bauer, Greta M.; Shea, Lonnie D.; Backman, Vadim
2016-10-01
In cancer biology, there has been a recent effort to understand tumor formation in the context of the tissue microenvironment. In particular, recent progress has explored the mechanisms behind how changes in the cell-extracellular matrix ensemble influence progression of the disease. The extensive use of in vitro tissue culture models in simulant matrix has proven effective at studying such interactions, but modalities for non-invasively quantifying aspects of these systems are scant. We present the novel application of an imaging technique, Inverse Spectroscopic Optical Coherence Tomography, for the non-destructive measurement of in vitro biological samples during matrix remodeling. Our findings indicate that the nanoscale-sensitive mass density correlation shape factor D of cancer cells increases in response to a more crosslinked matrix. We present a facile technique for the non-invasive, quantitative study of the micro- and nano-scale structure of the extracellular matrix and its host cells.
An inhibitor of oxidative phosphorylation exploits cancer vulnerability.
Molina, Jennifer R; Sun, Yuting; Protopopova, Marina; Gera, Sonal; Bandi, Madhavi; Bristow, Christopher; McAfoos, Timothy; Morlacchi, Pietro; Ackroyd, Jeffrey; Agip, Ahmed-Noor A; Al-Atrash, Gheath; Asara, John; Bardenhagen, Jennifer; Carrillo, Caroline C; Carroll, Christopher; Chang, Edward; Ciurea, Stefan; Cross, Jason B; Czako, Barbara; Deem, Angela; Daver, Naval; de Groot, John Frederick; Dong, Jian-Wen; Feng, Ningping; Gao, Guang; Gay, Jason; Do, Mary Geck; Greer, Jennifer; Giuliani, Virginia; Han, Jing; Han, Lina; Henry, Verlene K; Hirst, Judy; Huang, Sha; Jiang, Yongying; Kang, Zhijun; Khor, Tin; Konoplev, Sergej; Lin, Yu-Hsi; Liu, Gang; Lodi, Alessia; Lofton, Timothy; Ma, Helen; Mahendra, Mikhila; Matre, Polina; Mullinax, Robert; Peoples, Michael; Petrocchi, Alessia; Rodriguez-Canale, Jaime; Serreli, Riccardo; Shi, Thomas; Smith, Melinda; Tabe, Yoko; Theroff, Jay; Tiziani, Stefano; Xu, Quanyun; Zhang, Qi; Muller, Florian; DePinho, Ronald A; Toniatti, Carlo; Draetta, Giulio F; Heffernan, Timothy P; Konopleva, Marina; Jones, Philip; Di Francesco, M Emilia; Marszalek, Joseph R
2018-06-11
Metabolic reprograming is an emerging hallmark of tumor biology and an actively pursued opportunity in discovery of oncology drugs. Extensive efforts have focused on therapeutic targeting of glycolysis, whereas drugging mitochondrial oxidative phosphorylation (OXPHOS) has remained largely unexplored, partly owing to an incomplete understanding of tumor contexts in which OXPHOS is essential. Here, we report the discovery of IACS-010759, a clinical-grade small-molecule inhibitor of complex I of the mitochondrial electron transport chain. Treatment with IACS-010759 robustly inhibited proliferation and induced apoptosis in models of brain cancer and acute myeloid leukemia (AML) reliant on OXPHOS, likely owing to a combination of energy depletion and reduced aspartate production that leads to impaired nucleotide biosynthesis. In models of brain cancer and AML, tumor growth was potently inhibited in vivo following IACS-010759 treatment at well-tolerated doses. IACS-010759 is currently being evaluated in phase 1 clinical trials in relapsed/refractory AML and solid tumors.
Observations and Modelling of the Zodiacal Light
NASA Astrophysics Data System (ADS)
Kelsall, T.
1994-12-01
The DIRBE instrument on the COBE satellite performed a full-sky survey in ten bands covering the spectral range from 1.25 to 240 microns, and made measurements of the polarization from 1.25 to 3.5 microns. These observations provide a wealth of data on the radiations from the interplanetary dust cloud (IPD). The presentation covers the observations, the model-independent findings, and the results from the extensive efforts of the DIRBE team to model the IPD. Emphasis is placed on describing the importance of correctly accounting for the IPD contribution to the observed-sky signal for the purpose of detecting the cosmic infrared background. (*) The NASA/Goddard Space Flight Center (GSFC) is responsible for the design, development, and operation of the COBE mission. GSFC is also responsible for the development of the analysis software and for the production of the mission data sets. Scientific guidance is provided by the COBE Science Working Group. The COBE program is supported by the Astrophysics Division of NASA's Office of Space Science.
Asymmetry in Signal Oscillations Contributes to Efficiency of Periodic Systems.
Bae, Seul-A; Acevedo, Alison; Androulakis, Ioannis P
2016-01-01
Oscillations are an important feature of cellular signaling that result from complex combinations of positive- and negative-feedback loops. The encoding and decoding mechanisms of oscillations based on amplitude and frequency have been extensively discussed in the literature in the context of intercellular and intracellular signaling. However, the fundamental questions of whether and how oscillatory signals offer any competitive advantages-and, if so, what-have not been fully answered. We investigated established oscillatory mechanisms and designed a study to analyze the oscillatory characteristics of signaling molecules and system output in an effort to answer these questions. Two classic oscillators, Goodwin and PER, were selected as the model systems, and corresponding no-feedback models were created for each oscillator to discover the advantage of oscillating signals. Through simulating the original oscillators and the matching no-feedback models, we show that oscillating systems have the capability to achieve better resource-to-output efficiency, and we identify oscillatory characteristics that lead to improved efficiency.
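The Goodwin oscillator named in the abstract is a classic three-stage negative-feedback loop in which the final species represses production of the first. A minimal Euler-integration sketch is below; the parameter values are illustrative, not those used in the study (the standard result is that with three identical linear stages the loop oscillates only for Hill coefficients above about 8, hence n = 10 here):

```python
# Minimal Goodwin oscillator: Z represses synthesis of X; X drives Y;
# Y drives Z, closing the negative-feedback loop. Illustrative parameters.

def goodwin(n=10.0, dt=0.01, steps=50000):
    x, y, z = 0.1, 0.2, 0.3          # arbitrary initial concentrations
    xs = []
    for _ in range(steps):
        dx = 1.0 / (1.0 + z ** n) - 0.1 * x   # repressed synthesis, linear decay
        dy = 0.1 * x - 0.1 * y                # Y produced from X
        dz = 0.1 * y - 0.1 * z                # Z produced from Y
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs.append(x)
    return xs

xs = goodwin()   # trajectory of X over 500 time units
```

The matching "no-feedback" control described in the abstract would replace the repression term `1.0 / (1.0 + z ** n)` with a constant production rate, removing the loop.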
Self-consistent modeling of CFETR baseline scenarios for steady-state operation
NASA Astrophysics Data System (ADS)
Chen, Jiale; Jian, Xiang; Chan, Vincent S.; Li, Zeyu; Deng, Zhao; Li, Guoqiang; Guo, Wenfeng; Shi, Nan; Chen, Xi; CFETR Physics Team
2017-07-01
Integrated modeling for core plasma is performed to increase confidence in the proposed baseline scenario in the 0D analysis for the China Fusion Engineering Test Reactor (CFETR). The steady-state scenarios are obtained through the consistent iterative calculation of equilibrium, transport, auxiliary heating and current drives (H&CD). Three combinations of H&CD schemes (NB + EC, NB + EC + LH, and EC + LH) are used to sustain the scenarios with q_min > 2 and fusion power of ~70-150 MW. The predicted power is within the target range for CFETR Phase I, although the confinement based on physics models is lower than that assumed in 0D analysis. Ideal MHD stability analysis shows that the scenarios are stable against n = 1-10 ideal modes, where n is the toroidal mode number. Optimization of RF current drive for the RF-only scenario is also presented. The simulation workflow for core plasma in this work provides a solid basis for a more extensive research and development effort for the physics design of CFETR.
Research in millimeter wave techniques
NASA Technical Reports Server (NTRS)
Mcmillan, R. W.
1978-01-01
During the past six months, efforts on this project have been devoted to: (1) continuation of construction and testing of a 6 GHz subharmonic mixer model with extension of the pumping frequency of this mixer to ω_s/4, (2) construction of a 183 GHz subharmonic mixer based on the results of tests on this 6 GHz model, (3) ground-based radiometric measurements at 183 GHz, (4) fabrication and testing of wire grid interferometers, (5) calculations of reflected and lost power in these interferometers, and (6) calculations of the antenna temperature due to water vapor to be expected in down-looking radiometry as a function of frequency. Significant events during the past six months include: (1) receipt of a 183 GHz single-ended fundamental mixer, (2) attainment of 6 dB single sideband conversion loss with the 6 GHz subharmonic mixer model by using a 1.5 GHz (ω_s/4) pump frequency, (3) additional ground-based radiometric measurements and (4) derivation of equations for reflection and loss for wire grid interferometers.
Meeting report from the fourth meeting of the Computational Modeling in Biology Network (COMBINE)
Waltemath, Dagmar; Bergmann, Frank T.; Chaouiya, Claudine; Czauderna, Tobias; Gleeson, Padraig; Goble, Carole; Golebiewski, Martin; Hucka, Michael; Juty, Nick; Krebs, Olga; Le Novère, Nicolas; Mi, Huaiyu; Moraru, Ion I.; Myers, Chris J.; Nickerson, David; Olivier, Brett G.; Rodriguez, Nicolas; Schreiber, Falk; Smith, Lucian; Zhang, Fengkai; Bonnet, Eric
2014-01-01
The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of community standards and formats in computational systems biology and related fields. This report summarizes the topics and activities of the fourth edition of the annual COMBINE meeting, held in Paris during September 16-20 2013, and attended by a total of 96 people. This edition pioneered a first day devoted to modeling approaches in biology, which attracted a broad audience of scientists thanks to a panel of renowned speakers. During subsequent days, discussions were held on many subjects including the introduction of new features in the various COMBINE standards, new software tools that use the standards, and outreach efforts. Significant emphasis went into work on extensions of the SBML format, and also into community-building. This year’s edition once again demonstrated that the COMBINE community is thriving, and still manages to help coordinate activities between different standards in computational systems biology.
Adding kinetics and hydrodynamics to the CHEETAH thermochemical code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fried, L.E., Howard, W.M., Souers, P.C.
1997-01-15
In FY96 we released CHEETAH 1.40, which made extensive improvements on the stability and user friendliness of the code. CHEETAH now has over 175 users in government, academia, and industry. Efforts have also been focused on adding new advanced features to CHEETAH 2.0, which is scheduled for release in FY97. We have added a new chemical kinetics capability to CHEETAH. In the past, CHEETAH assumed complete thermodynamic equilibrium and independence of time. The addition of a chemical kinetic framework will allow for modeling of time-dependent phenomena, such as partial combustion and detonation in composite explosives with large reaction zones. We have implemented a Wood-Kirkwood detonation framework in CHEETAH, which allows for the treatment of nonideal detonations and explosive failure. A second major effort in the project this year has been linking CHEETAH to hydrodynamic codes to yield an improved HE product equation of state. We have linked CHEETAH to 1- and 2-D hydrodynamic codes, and have compared the code to experimental data. 15 refs., 13 figs., 1 tab.
NASA Technical Reports Server (NTRS)
Wingate, Robert J.
2012-01-01
After the launch scrub of Space Shuttle mission STS-133 on November 5, 2010, large cracks were discovered in two of the External Tank intertank stringers. The NASA Marshall Space Flight Center, as managing center for the External Tank Project, coordinated the ensuing failure investigation and repair activities with several organizations, including the manufacturer, Lockheed Martin. To support the investigation, the Marshall Space Flight Center formed an ad-hoc stress analysis team to complement the efforts of Lockheed Martin. The team undertook six major efforts to analyze or test the structural behavior of the stringers. Extensive finite element modeling was performed to characterize the local stresses in the stringers near the region of failure. Data from a full-scale tanking test and from several subcomponent static load tests were used to confirm the analytical conclusions. The analysis and test activities of the team are summarized. The root cause of the stringer failures and the flight readiness rationale for the repairs that were implemented are discussed.
Synthesis of User Needs for Arctic Sea Ice Predictions
NASA Astrophysics Data System (ADS)
Wiggins, H. V.; Turner-Bogren, E. J.; Sheffield Guy, L.
2017-12-01
Forecasting Arctic sea ice on sub-seasonal to seasonal scales in a changing Arctic is of interest to a diverse range of stakeholders. However, sea ice forecasting is still challenging due to high variability in weather and ocean conditions and limits to prediction capabilities; the science needs for observations and modeling are extensive. At a time of challenged science funding, one way to prioritize sea ice prediction efforts is to examine the information needs of various stakeholder groups. This poster will present a summary and synthesis of existing surveys, reports, and other literature that examines user needs for sea ice predictions. The synthesis will include lessons learned from the Sea Ice Prediction Network (a collaborative, multi-agency-funded project focused on seasonal Arctic sea ice predictions), the Sea Ice for Walrus Outlook (a resource for Alaska Native subsistence hunters and coastal communities, that provides reports on weather and sea ice conditions), and other efforts. The poster will specifically compare the scales and variables of sea ice forecasts currently available, as compared to what information is requested by various user groups.
NASA Astrophysics Data System (ADS)
Tubiana, Jerome; Kass, Alex J.; Newman, Maya Y.; Levitz, David
2015-07-01
Detecting pre-cancer in epithelial tissues such as the cervix is a challenging task in low-resource settings. In an effort to achieve a low-cost cervical cancer screening and diagnostic method for use in low-resource settings, mobile colposcopes that use a smartphone as their engine have been developed. Designing image analysis software suited for this task requires proper modeling of light propagation from the abnormalities inside tissues to the smartphone camera. Different simulation methods have been developed in the past, by solving light diffusion equations, or running Monte Carlo simulations. Several algorithms exist for the latter, including MCML and the recently developed MCX. For imaging purposes, the observable parameter of interest is the reflectance profile of a tissue under some specific pattern of illumination and optical setup. Extensions of the MCX algorithm to simulate this observable under these conditions were developed. These extensions were validated against MCML and diffusion theory for the simple case of contact measurements, and reflectance profiles under colposcopy imaging geometry were also simulated. To validate this model, the diffuse reflectance profiles of tissue phantoms were measured with a spectrometer under several illumination and optical settings for various homogeneous tissue phantoms. The measured reflectance profiles showed a non-trivial deviation across the spectrum. Measurements of an added-absorber experiment on a series of phantoms showed that absorption of dye scales linearly when fit to both MCX and diffusion models. More work is needed to integrate a pupil into the experiment.
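The Monte Carlo photon-transport approach mentioned above (MCML, MCX) can be illustrated with a deliberately simplified sketch: isotropic scattering in a semi-infinite medium with a matched boundary, absorption handled by weight attenuation. This is a toy version for intuition only, far simpler than MCML/MCX, and all parameter values are illustrative:

```python
import math
import random

def diffuse_reflectance(mu_a=0.1, mu_s=10.0, n_photons=5000, seed=1):
    """Fraction of launched photon weight that escapes back through
    the surface of a semi-infinite, isotropically scattering medium."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s          # total interaction coefficient (1/mm)
    albedo = mu_s / mu_t        # scattering probability per interaction
    refl = 0.0
    for _ in range(n_photons):
        z, w, uz = 0.0, 1.0, 1.0        # depth, weight, direction cosine
        while w > 1e-4:
            step = -math.log(rng.random()) / mu_t   # free path length
            z += uz * step
            if z < 0.0:                 # crossed the surface: escaped
                refl += w
                break
            w *= albedo                 # deposit absorbed fraction
            uz = 2.0 * rng.random() - 1.0   # isotropic re-scattering
        # photons terminated by the weight cutoff count as absorbed
    return refl / n_photons

R = diffuse_reflectance()
```

Real codes add anisotropic (Henyey-Greenstein) phase functions, Fresnel boundaries, layered geometry, and Russian-roulette termination; the sketch only shows the random-walk core and that higher absorption (larger `mu_a`) lowers the diffuse reflectance.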
NASA Astrophysics Data System (ADS)
Karimzadeh, Shaghayegh; Askan, Aysegul
2018-04-01
Located within a basin structure, at the conjunction of the North East Anatolian, North Anatolian, and Ovacik Faults, the Erzincan city center (Turkey) is one of the most hazardous regions in the world. The combination of the seismotectonic and geological settings of the region has resulted in a series of significant seismic events, including the 1939 (Ms 7.8) and 1992 (Mw 6.6) earthquakes. The devastating 1939 earthquake occurred in the pre-instrumental era in the region, with no available local seismograms; thus, a limited number of studies exist on that earthquake. The 1992 event, however, despite the sparse local network at that time, has been studied extensively. This study aims to simulate the 1939 Erzincan earthquake using available regional seismic and geological parameters. Despite several uncertainties involved, such an effort to quantitatively model the 1939 earthquake is promising, given the historical reports of extensive damage and fatalities in the area. The results of this study are expressed in terms of anticipated acceleration time histories at certain locations, the spatial distribution of selected ground motion parameters, and felt intensity maps in the region. Simulated motions are first compared against empirical ground motion prediction equations derived from both local and global datasets. Next, anticipated intensity maps of the 1939 earthquake are obtained using local correlations between peak ground motion parameters and felt intensity values. Comparisons of the estimated intensity distributions with the corresponding observed intensities indicate a reasonable modeling of the 1939 earthquake.
Extensive degeneracy, Coulomb phase and magnetic monopoles in artificial square ice.
Perrin, Yann; Canals, Benjamin; Rougemaille, Nicolas
2016-12-15
Artificial spin-ice systems are lithographically patterned arrangements of interacting magnetic nanostructures that were introduced as a way of investigating the effects of geometric frustration in a controlled manner. This approach has enabled unconventional states of matter to be visualized directly in real space, and has triggered research at the frontier between nanomagnetism, statistical thermodynamics and condensed matter physics. Despite efforts to create an artificial realization of the square-ice model (a two-dimensional geometrically frustrated spin-ice system defined on a square lattice), no simple geometry based on arrays of nanomagnets has successfully captured the macroscopically degenerate ground-state manifold of the model. Instead, square lattices of nanomagnets are characterized by a magnetically ordered ground state that consists of local loop configurations with alternating chirality. Here we show that all of the characteristics of the square-ice model are observed in an artificial square-ice system that consists of two sublattices of nanomagnets that are vertically separated by a small distance. The spin configurations we image after demagnetizing our arrays reveal unambiguous signatures of a Coulomb phase and algebraic spin-spin correlations, which are characterized by the presence of 'pinch' points in the associated magnetic structure factor. Local excitations, the classical analogues of magnetic monopoles, are free to evolve in an extensively degenerate, divergence-free vacuum. We thus provide a protocol that could be used to investigate collective magnetic phenomena, including Coulomb phases and the physics of ice-like materials.
Energy Efficient Engine (E3) combustion system component technology performance report
NASA Technical Reports Server (NTRS)
Burrus, D. L.; Chahrour, C. A.; Foltz, H. L.; Sabla, P. E.; Seto, S. P.; Taylor, J. R.
1984-01-01
The Energy Efficient Engine (E3) combustor effort was conducted as part of the overall NASA/GE E3 Program. This effort included the selection of an advanced double-annular combustion system design. The primary intent of this effort was to evolve a design that meets the stringent emissions and life goals of the E3, as well as all of the usual performance requirements of combustion systems for modern turbofan engines. Numerous detailed design studies were conducted to define the features of the combustion system design. Development test hardware was fabricated, and an extensive testing effort was undertaken to evaluate the combustion system subcomponents in order to verify and refine the design. Technology derived from this effort was incorporated into the engine combustion hardware design. The advanced engine combustion system was then evaluated in component testing to verify the design intent. What evolved from this effort was an advanced combustion system capable of satisfying all of the combustion system design objectives and requirements of the E3.
Denkyirah, Elisha Kwaku; Okoffo, Elvis Dartey; Adu, Derick Taylor; Aziz, Ahmed Abdul; Ofori, Amoako; Denkyirah, Elijah Kofi
2016-01-01
Pesticides are a significant component of modern agricultural technology that has been widely adopted across the globe to control pests, diseases, weeds and other plant pathogens, in an effort to reduce or eliminate yield losses and maintain high product quality. Although pesticides are said to be toxic and expose farmers to risk due to the hazardous effects of these chemicals, pesticide use among cocoa farmers in Ghana is still high. Furthermore, cocoa farmers do not apply pesticide on their cocoa farms at the recommended frequency of application. In view of this, the study assessed the factors influencing cocoa farmers' decision to use pesticide and the frequency of pesticide application. A total of 240 cocoa farmers from six cocoa growing communities in the Brong Ahafo Region of Ghana were selected for the study using the multi-stage sampling technique. The Probit and Tobit regression models were used to estimate factors influencing farmers' decision to use pesticide and frequency of pesticide application, respectively. Results of the study revealed that the use of pesticide is still high among farmers in the Region and that cocoa farmers do not follow the Ghana Cocoa Board recommended frequency of pesticide application. In addition, cocoa farmers in the study area were found to be using both Ghana Cocoa Board approved/recommended and unapproved pesticides for cocoa production. Gender, age, educational level, years of farming experience, access to extension service, availability of agrochemical shop and access to credit significantly influenced farmers' decision to use pesticides. Also, educational level, years of farming experience, membership of farmer-based organisation, access to extension service, access to credit and cocoa income significantly influenced frequency of pesticide application.
Since access to extension service is one key factor that reduces pesticide use and frequency of application among cocoa farmers, it is recommended that policies by government and non-governmental organisations should be aimed at mobilizing resources towards the expansion of extension education. In addition, extension service should target younger farmers as well as provide information on alternative pest control methods in order to reduce pesticide use among cocoa farmers. Furthermore, extension services/agents should target cocoa farmers with fewer years of farming experience and encourage cocoa farmers to join farmer-based organisations in order to decrease frequency of pesticide application.
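The Probit and Tobit models used in the study share a latent-variable formulation: probit models the binary decision to use pesticide at all, while tobit models the application frequency censored at zero. A minimal simulation of that shared structure (the coefficients and the covariate below are illustrative placeholders, not the study's estimates) might look like:

```python
import random

def simulate_latent(beta0, beta1, n=5000, rng=None):
    """Simulate the latent-variable setup shared by probit and tobit:
    y* = beta0 + beta1*x + e, with e ~ N(0, 1).
    Probit observes only the decision 1[y* > 0];
    tobit observes the frequency max(y*, 0), censored at zero."""
    rng = rng or random.Random(42)
    probit_y, tobit_y = [], []
    for _ in range(n):
        x = rng.random()            # e.g. a scaled covariate such as experience
        y_star = beta0 + beta1 * x + rng.gauss(0.0, 1.0)
        probit_y.append(1 if y_star > 0 else 0)   # use pesticide at all?
        tobit_y.append(max(y_star, 0.0))          # how often (censored at 0)
    return probit_y, tobit_y

probit_y, tobit_y = simulate_latent(beta0=0.5, beta1=1.0)
adoption_rate = sum(probit_y) / len(probit_y)
censored_share = sum(1 for y in tobit_y if y == 0.0) / len(tobit_y)
```

By construction, the probit "adopters" are exactly the uncensored tobit observations, which is why the two models can be estimated on the same survey data with different dependent variables.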
Type Safe Extensible Programming
NASA Astrophysics Data System (ADS)
Chae, Wonseok
2009-10-01
Software products evolve over time. Sometimes they evolve by adding new features, and sometimes by either fixing bugs or replacing outdated implementations with new ones. When software engineers fail to anticipate such evolution during development, they will eventually be forced to re-architect or re-build from scratch. Therefore, it has been common practice to prepare for changes so that software products are extensible over their lifetimes. However, making software extensible is challenging because it is difficult to anticipate successive changes and to provide adequate abstraction mechanisms over potential changes. Such extensibility mechanisms, furthermore, should not compromise any existing functionality during extension. Software engineers would benefit from a tool that provides a way to add extensions in a reliable way. It is natural to expect programming languages to serve this role. Extensible programming is one effort to address these issues. In this thesis, we present type safe extensible programming using the MLPolyR language. MLPolyR is an ML-like functional language whose type system provides type-safe extensibility mechanisms at several levels. After presenting the language, we will show how these extensibility mechanisms can be put to good use in the context of product line engineering. Product line engineering is an emerging software engineering paradigm that aims to manage variations, which originate from successive changes in software.
AFWAL FY80 Technical Accomplishments Report.
1981-12-01
...through cooperative effort of the Materials and Propulsion Laboratories. In addition to an extensive system... Certain compositions in the titanium aluminide system... Bonded Structures Technology Transitioned... Superplastically Formed and Diffusion Bonded Titanium Technology... First RSR Radial Wafer Blade Engine Test
DECISION-SUPPORT TOOLS FOR MANAGING WASTEWATER COLLECTION SYSTEMS
Wastewater collection systems are an extensive part of the nation's infrastructure. As these systems become older, more preventative maintenance and renewal are required. For municipalities to cost-effectively plan, organize, and implement this effort, they require improved inf...
US 93 preconstruction wildlife monitoring field methods handbook : final report.
DOT National Transportation Integrated Search
2006-11-01
The US 93 reconstruction project on the Flathead Indian Reservation in northwest Montana represents one of the most extensive wildlife-sensitive highway design efforts to occur in the continental United States. The reconstruction will include install...
Inquiry, Investigation, and Communication in the Student-Directed Laboratory.
ERIC Educational Resources Information Center
Janners, Martha Y.
1988-01-01
Describes how to organize a student-directed laboratory investigation which is based on amphibian metamorphosis, lasts for nearly a term, and involves extensive group effort. Explains the assignment, student response and opinion, formal paper, and instructor responsibilities. (RT)
Characterization of Stress Corrosion Cracking Using Laser Ultrasonics
DOT National Transportation Integrated Search
2008-08-31
In-service inspection of gas and oil pipelines is a subject of great current interest. Issues of safety and fitness for service have driven extensive efforts to develop effective monitoring and inspection techniques. A number of effective NDT techniq...
Design considerations for bridge deck joint-sealing systems : summary report.
DOT National Transportation Integrated Search
1992-07-01
This summary report covers a three-year research effort related to the study of bridge deck expansion joint movements. Bridge deck expansion joint systems often develop serious problems requiring extensive and expensive maintenance. Th...
Commercial Vehicle Information Exchange Window (CVIEW) Roadside Enforcement/Compliance Project
DOT National Transportation Integrated Search
2012-09-04
An extensive effort was undertaken by Clough Harbour & Associates on behalf of, and with assistance from, the New York State Department of Transportation (NYSDOT) in order to research and design a prototype roadside commercial vehicle electronic ...
76 FR 71066 - HUD Draft Environmental Justice Strategy, Extension of Public Comment Period
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... may access this number through TTY by calling the toll-free Federal Relay Service at (800) 877-8339... step in a larger Administration-wide effort to ensure strong protection from environmental and health...
The conception, birth, and growth of a missile umbilical system
NASA Technical Reports Server (NTRS)
Nordman, G. W.
1977-01-01
The design development of the Sprint 2 and Improved Sprint 2 Missile System (ISMS) umbilical systems is traced. The unique system requirements, the umbilical designs considered to meet those requirements, and the problems encountered and solutions derived during the design and development testing of the selected systems are described. The Sprint 2 development effort consisted of design, analysis, and testing activities. The ISMS effort involved an extensive trade study to determine the optimum design to meet the ISMS conditions.
Digital processing of the Mariner 10 images of Venus and Mercury
NASA Technical Reports Server (NTRS)
Soha, J. M.; Lynn, D. J.; Mosher, J. A.; Elliot, D. A.
1977-01-01
An extensive effort was devoted to the digital processing of the Mariner 10 images of Venus and Mercury at the Image Processing Laboratory of the Jet Propulsion Laboratory. This effort was designed to optimize the display of the considerable quantity of information contained in the images. Several image restoration, enhancement, and transformation procedures were applied; examples of these techniques are included. A particular task was the construction of large mosaics which characterize the surface of Mercury and the atmospheric structure of Venus.
Remote detection of riverine traffic using an ad hoc wireless sensor network
NASA Astrophysics Data System (ADS)
Athan, Stephan P.
2005-05-01
Trafficking of illegal drugs on riverine and inland waterways continues to proliferate in South America. While there has been a successful joint effort to cut off overland and air trafficking routes, there exists a vast river network in the Amazon region, consisting of over 13,000 water miles, that remains difficult to adequately monitor, increasing the likelihood of narcotics moving along this extensive river system. Hence, an effort is underway to provide remote unattended riverine detection in lieu of manned or attended detection measures.
The Effect of Failing to Recapitalize the B-52H Defensive Avionics System on Future Operations
2009-04-01
...reactive jamming, compensation for wing flexing to prevent wire chafing, and ensuring the necessary degree of cooling is available to extend LRU life... repair parts. Parts of this system, including the 6-gun CRT referenced earlier, were replaced with a commercial off-the-shelf item built by Condor... This effort will provide a small extension to the service life of the existing BWOs. There is also an effort underway to replace the BWOs with
Sources of Sahelian-Sudan moisture: Insights from a moisture-tracing atmospheric model
NASA Astrophysics Data System (ADS)
Salih, Abubakr A. M.; Zhang, Qiong; Pausata, Francesco S. R.; Tjernström, Michael
2016-07-01
The summer rainfall across Sahelian-Sudan is one of the main sources of water for agricultural, human, and animal needs. However, the rainfall is characterized by large interannual variability, which has attracted extensive scientific efforts to understand it. This study attempts to identify the source regions that contribute to the Sahelian-Sudan moisture budget during July through September. We have used an atmospheric general circulation model with an embedded moisture-tracing module (Community Atmosphere Model version 3), forced by observed (1979-2013) sea-surface temperatures. The result suggests that about 40% of the moisture comes with the moisture flow associated with the seasonal migration of the Intertropical Convergence Zone (ITCZ) and originates from the Guinea Coast, central Africa, and the Western Sahel. The Mediterranean Sea, Arabian Peninsula, and South Indian Ocean regions account for 10.2%, 8.1%, and 6.4%, respectively. Local evaporation and the rest of the globe supply the region with 20.3% and 13.2%, respectively. We also compared the result from this study to a previous analysis that used the Lagrangian model FLEXPART forced by ERA-Interim. The two approaches differ when comparing individual regions, but are in better agreement when neighboring regions with similar atmospheric flow features are grouped together. Interannual variability of the rainfall over the region is highly correlated with contributions from regions associated with the ITCZ movement, which is in turn linked to the Atlantic Multidecadal Oscillation. Our results are expected to provide insights for efforts on seasonal forecasting of the rainy season over Sahelian Sudan.
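The source attribution above is a budget decomposition; as a quick consistency check, the quoted contributions can be tallied and should close to roughly 100%, up to rounding and the approximate "about 40%" ITCZ figure:

```python
# Moisture source contributions to Sahelian-Sudan summer rainfall (percent),
# as quoted in the abstract; the ITCZ entry is the approximate ~40% estimate.
sources = {
    "ITCZ flow (Guinea Coast, central Africa, Western Sahel)": 40.0,
    "Mediterranean Sea": 10.2,
    "Arabian Peninsula": 8.1,
    "South Indian Ocean": 6.4,
    "local evaporation": 20.3,
    "rest of the globe": 13.2,
}
total = sum(sources.values())   # ~98.2%: the budget closes to within rounding
```

The ~1.8% residual is consistent with the rounded figures and the approximate ITCZ estimate rather than a missing source region.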
Stigma and intellectual disability: potential application of mental illness research.
Ditchman, Nicole; Werner, Shirli; Kosyluk, Kristin; Jones, Nev; Elg, Brianna; Corrigan, Patrick W
2013-05-01
Individuals with intellectual disabilities (ID) and individuals with mental illness are consistently found to be among the most socially excluded populations and continue to face substantial health, housing, and employment disparities due to stigma. Although this has spurred extensive research efforts and theoretical advancements in the study of stigma toward mental illness, the stigma of ID has received only limited attention. In this article we explore the application of mental illness stigma research for ID. We carefully reviewed the existing research on mental illness stigma as a foundation for a parallel summary of the empirical literature on attitudes and stigma related to ID. Based on our review, there has not been a systematic approach to the study of stigma toward ID. However, multilevel conceptual models of stigma have received much attention in the mental illness literature. These models have been used to inform targeted interventions and have application to the study of the stigma process for individuals with ID. Nonetheless, there are indeed key differences between, as well as substantial variability within, the ID and mental illness populations that must be considered. Stigma is an issue of social justice impacting the lives of individuals with ID, yet there remains virtually no systematic framework applied to the understanding of the stigma process for this group. Future research can draw on the stigma models developed in the mental illness literature to guide more rigorous research efforts and ultimately the development of effective, multilevel stigma-change strategies for ID.
Rich Support for Heterogeneous Polar Data in RAMADDA
NASA Astrophysics Data System (ADS)
McWhirter, J.; Crosby, C. J.; Griffith, P. C.; Khalsa, S.; Lazzara, M. A.; Weber, W. J.
2013-12-01
Difficult-to-navigate environments, tenuous logistics, strange forms, deeply rooted cultures: these are all experiences shared by Polar scientists in the field as well as by the developers of the underlying data management systems back in the office. Among the key data management challenges that Polar investigations present are the heterogeneity and complexity of the data that are generated. Polar regions are intensely studied across many science domains through a variety of techniques - satellite and aircraft remote sensing, in-situ observation networks, modeling, sociological investigations, and extensive PI-driven field project data collection. While many data management efforts focus on large homogeneous collections of data targeting specific science domains (e.g., satellite, GPS, modeling), multi-disciplinary efforts that focus on Polar data need to be able to address a wide range of data formats, science domains and user communities. There is growing use of the RAMADDA (Repository for Archiving, Managing and Accessing Diverse Data) system to manage and provide services for Polar data. RAMADDA is a freely available extensible data repository framework that supports a wide range of data types and services to allow the creation, management, discovery and use of data and metadata. The broad range of capabilities provided by RAMADDA and its extensibility makes it well-suited as an archive solution for Polar data. RAMADDA can run in a number of diverse contexts - as a centralized archive, at local institutions, and even on an investigator's laptop in the field, providing in-situ metadata and data management services. We are actively developing archives and support for a number of Polar initiatives: - NASA-Arctic Boreal Vulnerability Experiment (ABoVE): ABoVE is a long-term multi-instrument field campaign that will make use of a wide range of data.
We have developed an extensive ontology of program, project and site metadata in RAMADDA, in support of the ABoVE Science Definition Team and Project Office. See: http://above.nasa.gov - UNAVCO Terrestrial Laser Scanning (TLS): UNAVCO's Polar program provides support for terrestrial laser scanning field projects. We are using RAMADDA to archive these field projects, with over 40 projects ingested to date. - NASA-IceBridge: As part of the NASA LiDAR Access System (NLAS) project, RAMADDA supports numerous airborne and satellite LiDAR data sets - GLAS, LVIS, ATM, Paris, McORDS, etc. - Antarctic Meteorological Research Center (AMRC): Satellite and surface observation network - Support for numerous other data from AON-ACADIS, Greenland GC-Net, NOAA-GMD, AmeriFlux, etc. In this talk we will discuss some of the challenges that Polar data brings to geoinformatics and describe the approaches we have taken to address these challenges in RAMADDA.
Flowering time control and applications in plant breeding.
Jung, Christian; Müller, Andreas E
2009-10-01
Shifting the seasonal timing of reproduction is a major goal of plant breeding efforts to produce novel varieties that are better adapted to local environments and changing climatic conditions. The key regulators of floral transition have been studied extensively in model species, and in recent years a growing number of related genes have been identified in crop species, with some notable exceptions. These sequences and variants thereof, as well as several major genes which were only identified in crop species, can now be used by breeders as molecular markers and for targeted genetic modification of flowering time. This article reviews the major floral regulatory pathways and discusses current and novel strategies for altering bolting and flowering behavior in crop plants.
Second order tensor finite element
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley; Fly, J.; Berry, C.; Tworzydlo, W.; Vadaketh, S.; Bass, J.
1990-01-01
The results of a research and software development effort are presented for the finite element modeling of the static and dynamic behavior of anisotropic materials, with emphasis on single-crystal alloys. Various versions of two-dimensional and three-dimensional hybrid finite elements were implemented and compared with displacement-based elements. Both static and dynamic cases are considered. The hybrid elements developed in the project were incorporated into the SPAR finite element code. In an extension of the first phase of the project, optimization of experimental tests for anisotropic materials was addressed. In particular, the problems of calculating material properties from tensile tests and of calculating stresses from strain measurements were considered. For both cases, numerical procedures and software for the optimization of strain gauge and material axes orientation were developed.
Rice, Japonica (Oryza sativa L.).
Main, Marcy; Frame, Bronwyn; Wang, Kan
2015-01-01
The importance of rice, as a food crop, is reflected in the extensive global research being conducted in an effort to improve and better understand this particular agronomic plant. In regard to biotechnology, this has led to the development of numerous genetic transformation protocols. Over the years, many of these methods have become increasingly straightforward, rapid, and efficient, thereby making rice valuable as a model crop for scientific research and functional genomics. The focus of this chapter is on one such protocol that uses Agrobacterium-mediated transformation of Oryza sativa L. ssp. Japonica cv. Nipponbare with an emphasis on tissue desiccation. The explants consist of callus derived from mature seeds which are cocultivated on filter paper postinfection. Hygromycin selection is used for the recovery of subsequent genetically engineered events.
NASA Astrophysics Data System (ADS)
Willgoose, G. R.; Cohen, S.; Svoray, T.; Sela, S.; Hancock, G. R.
2010-12-01
Numerical models are an important tool for studying landscape processes, as they allow us to isolate specific processes and drivers and test various physics and spatio-temporal scenarios. Here we use a distributed, physically based soil evolution model (mARM4D) to describe the drivers and processes controlling soil-landscape evolution at a field site on the fringe between the Mediterranean and desert regions of Israel. This study is an initial effort in a larger project aimed at improving our understanding of the mechanisms and drivers that led to the extensive removal of soils from the loess-covered hillslopes of this region. This specific region is interesting as it is located between the Mediterranean climate region, in which widespread erosion from hillslopes was attributed to human activity during the Holocene, and the arid region, in which extensive removal of loess from hillslopes was shown to have been driven by climatic changes during the late Pleistocene. First we study the sediment transport mechanism of the soil-landscape evolution processes at our study site. We simulate soil-landscape evolution with only one sediment transport process (fluvial or diffusive) at a time. We find that diffusive sediment transport is likely the dominant process at this site, as it resulted in soil distributions that better correspond to current observations. We then simulate several realistic climatic/anthropogenic scenarios (based on the literature) in order to quantify the sensitivity of the soil-landscape evolution process to temporal fluctuations. We find that this site is relatively insensitive to short-term (several thousand years), sharp changes. This suggests that climate, rather than human activity, was the main driver for the extensive removal of loess from the hillslopes.
Model-assisted development of a laminography inspection system
NASA Astrophysics Data System (ADS)
Grandin, R.; Gray, J.
2012-05-01
Traditional computed tomography (CT) is an effective method of determining the internal structure of an object through non-destructive means; however, inspection of certain objects, such as those with planar geometries or with limited access, requires an alternate approach. An alternative is laminography, which has been the focus of a number of researchers in the past decade for both medical and industrial inspections. Many research efforts rely on geometrically simple analytical models, such as the Shepp-Logan phantom, for the development of their algorithms. Recent work at the Center for Non-Destructive Evaluation makes extensive use of a forward model, XRSIM, to study artifacts arising from the reconstruction method, the effects of complex geometries, and known issues such as high-density features on the laminography reconstruction process. The use of a model provides full knowledge of all aspects of the geometry and a means to quantitatively evaluate the impact of methods designed to reduce artifacts generated by the reconstruction methods or that result from the part geometry. We will illustrate the use of forward simulations to quantitatively assess reconstruction algorithm development and artifact reduction.
NASA Astrophysics Data System (ADS)
Mukherjee, Anamitra; Patel, Niravkumar D.; Bishop, Chris; Dagotto, Elbio
2015-06-01
Lattice spin-fermion models are important for studying correlated systems in which quantum dynamics allows a separation between slow and fast degrees of freedom. The fast degrees of freedom are treated quantum mechanically, while the slow variables, generically referred to as the "spins," are treated classically. At present, exact diagonalization coupled with classical Monte Carlo (ED + MC) is extensively used to solve numerically a general class of lattice spin-fermion problems. In this common setup, the classical variables (spins) are treated via the standard MC method while the fermion problem is solved by exact diagonalization. The "traveling cluster approximation" (TCA) is a real-space variant of the ED + MC method that allows spin-fermion problems to be solved on lattices of up to 10³ sites. In this publication, we present a novel reorganization of the TCA algorithm in a manner that can be efficiently parallelized. This allows us to solve generic spin-fermion models easily on 10⁴ lattice sites and, with some effort, on 10⁵ lattice sites, representing the record lattice sizes studied for this family of models.
de Gramatica, Martina; Massacci, Fabio; Shim, Woohyun; Turhan, Uğur; Williams, Julian
2017-02-01
We analyze the issue of agency costs in aviation security by combining results from a quantitative economic model with a qualitative study based on semi-structured interviews. Our model extends previous principal-agent models by combining the traditional fixed and varying monetary responses to physical and cognitive effort with nonmonetary welfare and potentially transferable value of employees' own human capital. To provide empirical evidence for the tradeoffs identified in the quantitative model, we have undertaken an extensive interview process with regulators, airport managers, security personnel, and those tasked with training security personnel from an airport operating in a relatively high-risk state, Turkey. Our results indicate that the effectiveness of additional training depends on the mix of "transferable skills" and "emotional" buy-in of the security agents. Principals need to identify on which side of a critical tipping point their agents are to ensure that additional training, with attached expectations of the burden of work, aligns the incentives of employees with the principals' own objectives. © 2016 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Lindsey, Rebecca; Goldman, Nir; Fried, Laurence
2017-06-01
Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first-principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions of atomic interactions by mapping forces arising from short DFT trajectories onto simple Chebyshev polynomial series. We examine the importance of including interactions beyond two-body terms, model transferability to different state points, and approaches to ensure smooth and reasonable model shape outside the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
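The force-matching approach described here represents pair interactions as Chebyshev series on a finite distance window. A minimal sketch of the evaluation step (the function names, coefficients, and cutoff values are illustrative, not fitted to any DFT data) maps a distance r in [r_min, r_max] onto [-1, 1] and evaluates the series with the Clenshaw recurrence:

```python
def cheb_eval(coeffs, x):
    """Evaluate sum_k c_k * T_k(x) via the Clenshaw recurrence, x in [-1, 1]."""
    b1 = b2 = 0.0
    for c in reversed(coeffs[1:]):
        b1, b2 = 2.0 * x * b1 - b2 + c, b1
    return x * b1 - b2 + coeffs[0]

def pair_force(r, coeffs, r_min, r_max):
    """Two-body force represented as a Chebyshev series fitted on the
    distance window [r_min, r_max]; zero outside the fitted domain."""
    if not (r_min <= r <= r_max):
        return 0.0                                   # beyond the cutoff
    x = 2.0 * (r - r_min) / (r_max - r_min) - 1.0    # map [r_min, r_max] -> [-1, 1]
    return cheb_eval(coeffs, x)
```

Fitting the coefficients then reduces to linear least squares of DFT forces against the Chebyshev basis, which is part of what makes the model orders of magnitude cheaper than DFT at evaluation time.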
The Chancellor's Model School Project (CMSP)
NASA Technical Reports Server (NTRS)
Lopez, Gil
1999-01-01
What does it take to create and implement a 7th to 8th grade middle school program where the great majority of students achieve at high academic levels regardless of their previous elementary school backgrounds? This was the major question that guided the research and development of a 7-year-long project effort entitled the Chancellor's Model School Project (CMSP) from September 1991 to August 1998. The CMSP effort, conducted largely in two New York City public schools, was aimed at creating and testing a prototype 7th and 8th grade model program that was organized and test-implemented in two distinct project phases: Phase I of the CMSP effort was conducted from 1991 to 1995 as a 7th to 8th grade extension of an existing K-6 elementary school, and Phase II was conducted from 1995 to 1998 as a 7th to 8th grade middle school program that became an integral part of a newly established 7-12th grade high school. In Phase I, the CMSP demonstrated that with a highly structured curriculum coupled with strong academic support and increased learning time, students participating in the CMSP were able to develop a strong foundation for rigorous high school coursework within the space of 2 years (at the 7th and 8th grades). Mathematics and Reading test score data during Phase I of the project clearly indicated that significant academic gains were obtained by almost all students -- at both the high and low ends of the spectrum -- regardless of their previous academic performance in the K-6 elementary school experience. The CMSP effort expanded in Phase II to include a fully operating 7-12 high school model. Achievement gains at the 7th and 8th grade levels in Phase II were tempered by the fact that incoming 7th grade students' academic background at the CMSP High School was significantly lower than that of students participating in Phase I.
Student performance in Phase II was also affected by the broadening of the CMSP effort from a 7-8th grade program to a fully functioning 7-12 high school, which, as a consequence, lessened the focus and structure available to the 7-8th grade students and teachers as compared to Phase I. Nevertheless, the CMSP does represent a unique curriculum model for 7th and 8th grade students in urban middle schools. Experience in both Phase I and Phase II of the project allowed the CMSP to be developed and tested along the broad range of parameters and characteristics that embody an operating public school in an urban environment.
The influence of mitigation on sage-grouse habitat selection within an energy development field.
Fedy, Bradley C; Kirol, Christopher P; Sutphin, Andrew L; Maechtle, Thomas L
2015-01-01
Growing global energy demands ensure the continued growth of energy development. Energy development in wildlife areas can significantly impact wildlife populations. Efforts to mitigate development impacts to wildlife are on-going, but the effectiveness of such efforts is seldom monitored or assessed. Greater sage-grouse (Centrocercus urophasianus) are sensitive to energy development and likely serve as an effective umbrella species for other sagebrush-steppe obligate wildlife. We assessed the response of birds within an energy development area before and after the implementation of mitigation action. Additionally, we quantified changes in habitat distribution and abundance in pre- and post-mitigation landscapes. Sage-grouse avoidance of energy development at large spatial scales is well documented; we therefore limited our research to areas directly within an energy development field in order to assess the influence of mitigation in close proximity to energy infrastructure. We used nest-location data (n = 488) within an energy development field to develop habitat selection models using logistic regression on data from 4 years of research prior to mitigation and from 4 years following the implementation of extensive mitigation efforts (e.g., decreased activity, buried powerlines). The post-mitigation habitat selection models indicated less avoidance of wells (well density β = 0.18 ± 0.08) than the pre-mitigation models (well density β = -0.09 ± 0.11). However, birds still avoided areas of high well density: no nests were found in areas with more than 4 wells per km2, and the majority of nests (63%) were located in areas with ≤ 1 well per km2. Several other model coefficients differed between the two time periods, indicating stronger selection for sagebrush (pre-mitigation β = 0.30 ± 0.09; post-mitigation β = 0.82 ± 0.08) and less avoidance of rugged terrain (pre-mitigation β = -0.35 ± 0.12; post-mitigation β = -0.05 ± 0.09).
Mitigation efforts implemented may be responsible for the measurable improvement in sage-grouse nesting habitats within the development area. However, we cannot reject alternative hypotheses concerning the influence of population density and intraspecific competition. Additionally, we were unable to assess the actual fitness consequences of mitigation or the source-sink dynamics of the habitats. We compared the pre-mitigation and post-mitigation models predicted as maps with habitats ranked from low to high relative probability of use (equal-area bins: 1 - 5). We found more improvement in habitat rank between the two time periods around mitigated wells compared to non-mitigated wells. Informed mitigation within energy development fields could help improve habitats within the field. We recommend that any mitigation effort include well-informed plans to monitor the effectiveness of the implemented mitigation actions that assess both habitat use and relevant fitness parameters.
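The pre/post comparison above rests on fitting logistic (resource-selection) models to used versus available points. Here is a minimal sketch with simulated data; the covariates, coefficients, and sample below are hypothetical, not the study's. It fits the model by Newton-Raphson (IRLS) and recovers a negative well-density coefficient (avoidance) and a positive sagebrush coefficient (selection).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
well_density = rng.uniform(0.0, 6.0, n)   # wells per km2 (hypothetical covariate)
sagebrush = rng.uniform(0.0, 1.0, n)      # sagebrush cover (hypothetical covariate)

# Assumed "true" selection surface: avoid wells, prefer sagebrush.
logit = 0.5 - 0.6 * well_density + 1.2 * sagebrush
used = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))  # 1 = nest, 0 = available

# Logistic regression via Newton-Raphson (IRLS).
X = np.column_stack([np.ones(n), well_density, sagebrush])
beta = np.zeros(3)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-(X @ beta)))          # fitted probabilities
    grad = X.T @ (used - mu)                        # score vector
    hess = X.T @ (X * (mu * (1.0 - mu))[:, None])   # observed information
    beta += np.linalg.solve(hess, grad)

b0, b_well, b_sage = beta
```

Fitting two such models, one to pre-mitigation and one to post-mitigation data, is what yields the shift in the well-density β reported in the abstract.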
Guidelines for development of the Iowa statewide transportation improvement program (STIP). Revised.
DOT National Transportation Integrated Search
2004-01-01
The Transportation Equity Act for the 21st Century (TEA-21) continues the Intermodal Surface Transportation Efficiency Act of 1991's requirement for an extensive, ongoing cooperative planning effort for programming federal funding. Iowa's STIP is dev...
Comprehensive Transit Plan for the Virgin Islands - Technical Report
DOT National Transportation Integrated Search
1989-01-01
This report contains a description of the elements and recommendations of a transportation study of the islands of St. Thomas, St. Croix, and St. John in the U.S. Virgin Island archipelago. An extensive data collection effort, including traffic volum...
Acute exposure to the tri-substituted organotin trimethyltin (TMT) causes neuronal degeneration in the hippocampus, amygdala, pyriform cortex, and neocortex. Developmental exposure to TMT impairs later learning and memory. Despite extensive efforts elucidating neuropathological...
... Multiple Sclerosis: Symptoms, Diagnosis, Treatment and Latest NIH Research Past Issues / Spring 2012 Table of Contents Symptoms ... my MS will ever go away? Latest NIH Research Scientists continue their extensive efforts to create new ...
TIMSS 2007 Assessment Frameworks
ERIC Educational Resources Information Center
Mullis, Ina V. S.; Martin, Michael O.; Ruddock, Graham J.; O'Sullivan, Christine Y.; Arora, Alka; Erberber, Ebru
2005-01-01
Developing the Trends in International Mathematics and Science Study (TIMSS) 2007 Assessment Frameworks represents an extensive collaborative effort involving individuals and expert groups from more than 60 countries around the world. The document contains three frameworks for implementing TIMSS 2007--the Mathematics Framework, the Science…
Curating and sharing structures and spectra for the environmental community
The increasing popularity of high mass accuracy non-target mass spectrometry methods has yielded extensive identification efforts based on spectral and chemical compound databases in the environmental community and beyond. Increasingly, new methods are relying on open data resour...
77 FR 11625 - Child Restraint Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
...),'' Stammen; Vehicle Research and Test Center, National Highway Traffic Safety Administration (September 2004... exceeded. However, during extensive post-NPRM booster seat testing, inconsistencies in the test protocol... substantial rulemaking and research efforts to try to address test variability. NHTSA investigated the ATD's...
Revision of empirical electric field modeling in the inner magnetosphere using Cluster data
NASA Astrophysics Data System (ADS)
Matsui, H.; Torbert, R. B.; Spence, H. E.; Khotyaintsev, Yu. V.; Lindqvist, P.-A.
2013-07-01
Using Cluster data from the Electron Drift (EDI) and the Electric Field and Wave (EFW) instruments, we revise our empirically-based, inner-magnetospheric electric field (UNH-IMEF) model at 2
Weak data do not make a free lunch, only a cheap meal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Zhipu; Rajashankar, Kanagalaghatta; Dauter, Zbigniew, E-mail: dauter@anl.gov
2014-02-01
Refinement and analysis of four structures with various data resolution cutoffs suggests that at present there are no reliable criteria for judging the diffraction data resolution limit and that the condition I/σ(I) = 2.0 is reasonable. However, extending the limit by about 0.2 Å beyond the resolution defined by this threshold does not deteriorate the quality of refined structures and in some cases may be beneficial. Four data sets were processed at resolutions significantly exceeding the criteria traditionally used for estimating the diffraction data resolution limit. The analysis of these data and the corresponding model-quality indicators suggests that the criteria of resolution limits widely adopted in the past may be somewhat conservative. Various parameters, such as R_merge and I/σ(I), optical resolution and the correlation coefficients CC_1/2 and CC*, can be used for judging the internal data quality, whereas the reliability factors R and R_free as well as the maximum-likelihood target values and real-space map correlation coefficients can be used to estimate the agreement between the data and the refined model. However, none of these criteria provide a reliable estimate of the data resolution cutoff limit. The analysis suggests that extension of the maximum resolution by about 0.2 Å beyond the currently adopted limit where the I/σ(I) value drops to 2.0 does not degrade the quality of the refined structural models, but may sometimes be advantageous. Such an extension may be particularly beneficial for significantly anisotropic diffraction. Extension of the maximum resolution at the stage of data collection and structure refinement is cheap in terms of the required effort and is definitely more advisable than accepting an overly conservative resolution cutoff, which is unfortunately quite frequent among the crystal structures deposited in the Protein Data Bank.
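The cutoff rule discussed here reduces to scanning per-shell statistics. A small sketch with made-up shell values (real values would come from a data-reduction log, not from this abstract): pick the highest-resolution shell with mean I/σ(I) ≥ 2.0, then optionally extend by about 0.2 Å as the abstract suggests.

```python
# Hypothetical per-shell statistics: (shell resolution in Angstrom, mean I/sigma(I)).
shells = [(3.0, 18.2), (2.5, 11.4), (2.2, 6.9), (2.0, 3.8),
          (1.9, 2.4), (1.8, 1.9), (1.7, 1.3), (1.6, 0.9)]

THRESHOLD = 2.0

# Highest resolution (smallest d-spacing) shell still meeting <I/sigma(I)> >= 2.0.
cutoff = min(d for d, i_over_sig in shells if i_over_sig >= THRESHOLD)

# Extend ~0.2 Angstrom beyond the threshold, per the abstract's finding that
# this does not degrade refined models and may help (e.g., anisotropic data).
extended = round(cutoff - 0.2, 2)
```

With these illustrative numbers the I/σ(I) = 2.0 rule gives a 1.9 Å cutoff, and the extended limit is 1.7 Å.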
Vision for an Open, Global Greenhouse Gas Information System (GHGIS)
NASA Astrophysics Data System (ADS)
Duren, R. M.; Butler, J. H.; Rotman, D.; Ciais, P.; Greenhouse Gas Information System Team
2010-12-01
Over the next few years, an increasing number of entities ranging from international, national, and regional governments, to businesses and private land-owners, are likely to become more involved in efforts to limit atmospheric concentrations of greenhouse gases. In such a world, geospatially resolved information about the location, amount, and rate of greenhouse gas (GHG) emissions will be needed, as well as the stocks and flows of all forms of carbon through the earth system. The ability to implement policies that limit GHG concentrations would be enhanced by a global, open, and transparent greenhouse gas information system (GHGIS). An operational and scientifically robust GHGIS would combine ground-based and space-based observations, carbon-cycle modeling, GHG inventories, synthesis analysis, and an extensive data integration and distribution system, to provide information about anthropogenic and natural sources, sinks, and fluxes of greenhouse gases at temporal and spatial scales relevant to decision making. The GHGIS effort was initiated in 2008 as a grassroots inter-agency collaboration intended to identify the needs for such a system, assess the capabilities of current assets, and suggest priorities for future research and development. We will present a vision for an open, global GHGIS including latest analysis of system requirements, critical gaps, and relationship to related efforts at various agencies, the Group on Earth Observations, and the Intergovernmental Panel on Climate Change.
Successful Community-Based Conservation: The Story of Millbank and Pterourus (Papilio) homerus
Garraway, Eric; Parnell, John; Lewis, Delano S.
2017-01-01
The literature on community-based environmental management is very extensive and the discussion of the pros and cons is continuing. Presented here is an example of a successful interaction between university-based entomologists and a local rural community, detailing the change in the attitude of the town of Millbank, Jamaica, from a Giant Swallowtail Butterfly collecting site to a model for community protection of a species and its environment. A review of some of the research work on community-based conservation efforts is included. These linkages take a considerable time to establish and the efforts spent by scientific personnel, governmental representatives and eco-tourists are itemized to emphasize how specific conservation activities have inspired confidence in the local community, thus engendering trust and mutual respect between the two groups. Reviews of the developed legislative support from both international and state entities also must be in place, and these are included in chronological detail as much as possible. Finally, a review of the long-term funding of educational and other local programs providing a level of stability to the conservation effort, until the local community can take over the protection of the species and/or habitat, is provided. Of utmost importance is a comprehensive educational campaign to not only sensitize the community, but the larger society, so that there can be buy-in from all stakeholders. PMID:28708090
Towards a Global Greenhouse Gas Information System (GHGIS)
NASA Astrophysics Data System (ADS)
Duren, Riley; Butler, James; Rotman, Doug; Miller, Charles; Decola, Phil; Sheffner, Edwin; Tucker, Compton; Mitchiner, John; Jonietz, Karl; Dimotakis, Paul
2010-05-01
Over the next few years, an increasing number of entities ranging from international, national, and regional governments, to businesses and private land-owners, are likely to become more involved in efforts to limit atmospheric concentrations of greenhouse gases. In such a world, geospatially resolved information about the location, amount, and rate of greenhouse gas (GHG) emissions will be needed, as well as the stocks and flows of all forms of carbon through terrestrial ecosystems and in the oceans. The ability to implement policies that limit GHG concentrations would be enhanced by a global, open, and transparent greenhouse gas information system (GHGIS). An operational and scientifically robust GHGIS would combine ground-based and space-based observations, carbon-cycle modeling, GHG inventories, meta-analysis, and an extensive data integration and distribution system, to provide information about sources, sinks, and fluxes of greenhouse gases at policy-relevant temporal and spatial scales. The GHGIS effort was initiated in 2008 as a grassroots inter-agency collaboration intended to rigorously identify the needs for such a system, assess the capabilities of current assets, and suggest priorities for future research and development. We will present a status of the GHGIS effort including our latest analysis and ideas for potential near-term pilot projects with potential relevance to European initiatives including the Global Monitoring for Environment and Security (GMES) and the Integrated Carbon Observing System (ICOS).
NASA Astrophysics Data System (ADS)
Lin, Shian-Jiann; Harris, Lucas; Chen, Jan-Huey; Zhao, Ming
2014-05-01
A multi-scale High-Resolution Atmosphere Model (HiRAM) is being developed at NOAA/Geophysical Fluid Dynamics Laboratory. The model's dynamical framework is the non-hydrostatic extension of the vertically Lagrangian finite-volume dynamical core (Lin 2004, Monthly Wea. Rev.) constructed on a stretchable (via Schmidt transformation) cubed-sphere grid. Physical parametrizations originally designed for IPCC-type climate predictions are in the process of being modified and made more "scale-aware", in an effort to make the model suitable for multi-scale weather-climate applications, with horizontal resolution ranging from 1 km (near the target high-resolution region) to as low as 400 km (near the antipodal point). One of the main goals of this development is to enable simulation of high impact weather phenomena (such as tornadoes, thunderstorms, category-5 hurricanes) within an IPCC-class climate modeling system previously thought impossible. We will present preliminary results, covering a very wide spectrum of temporal-spatial scales, ranging from simulation of tornado genesis (hours), Madden-Julian Oscillations (intra-seasonal), tropical cyclones (seasonal), to Quasi Biennial Oscillations (intra-decadal), using the same global multi-scale modeling system.
The Impact of State Legislation and Model Policies on Bullying in Schools.
Terry, Amanda
2018-04-01
The purpose of this study was to determine the impact of the coverage of state legislation and the expansiveness ratings of state model policies on the state-level prevalence of bullying in schools. The state-level prevalence of bullying in schools was based on cross-sectional data from the 2013 High School Youth Risk Behavior Survey. Multiple regression was conducted to determine whether the coverage of state legislation and the expansiveness rating of a state model policy affected the state-level prevalence of bullying in schools. The purpose and definition category of components in state legislation and the expansiveness rating of a state model policy were statistically significant predictors of the state-level prevalence of bullying in schools. The other 3 categories of components in state legislation (District Policy Development and Review, District Policy Components, and Additional Components) were not statistically significant predictors in the model. Extensive coverage in the purpose and definition category of components in state legislation and a high expansiveness rating of a state model policy may be important in efforts to reduce bullying in schools. Improving these areas may reduce the state-level prevalence of bullying in schools. © 2018, American School Health Association.
Computational Analyses of Pressurization in Cryogenic Tanks
NASA Technical Reports Server (NTRS)
Ahuja, Vineet; Hosangadi, Ashvin; Mattick, Stephen; Lee, Chun P.; Field, Robert E.; Ryan, Harry
2008-01-01
A) Advanced gas/liquid framework with real-fluids property routines:
I. A multi-fluid formulation was developed in the preconditioned CRUNCH CFD (registered trademark) code in which a mixture of liquid and gases can be specified: a) various options for equation-of-state specification are available (from simplified ideal fluid mixtures to real-fluid EOS such as SRK or BWR models); b) vaporization of liquids is driven by the pressure value relative to vapor pressure, and combustion of vapors is allowed; c) extensive validation has been undertaken.
II. Work is currently under way on primary break-up models and surface-tension effects for more rigorous phase-change modeling and interfacial dynamics.
B) The framework has been applied to run-time tanks at ground test facilities.
C) The framework has been used for J-2 upper-stage tank modeling:
1) NASA MSFC tank pressurization: hydrogen and oxygen tank pre-press, repress, and draining are being modeled at NASA MSFC.
2) NASA Ames tank safety effort: liquid hydrogen and oxygen are separated by a baffle in the J-2 tank. We are modeling pressure rise and possible combustion if a hole develops in the baffle and liquid hydrogen leaks into the oxygen tank. Tank pressure rise rates are simulated and the risk of combustion is evaluated.
NASA Technical Reports Server (NTRS)
Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.
2006-01-01
System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.
NASA Astrophysics Data System (ADS)
Hayley, Kevin; Schumacher, J.; MacMillan, G. J.; Boutin, L. C.
2014-05-01
Expanding groundwater datasets collected by automated sensors, and improved groundwater databases, have caused a rapid increase in calibration data available for groundwater modeling projects. Improved methods of subsurface characterization have increased the need for model complexity to represent geological and hydrogeological interpretations. The larger calibration datasets and the need for meaningful predictive uncertainty analysis have both increased the degree of parameterization necessary during model calibration. Due to these competing demands, modern groundwater modeling efforts require a massive degree of parallelization in order to remain computationally tractable. A methodology for the calibration of highly parameterized, computationally expensive models using the Amazon EC2 cloud computing service is presented. The calibration of a regional-scale model of groundwater flow in Alberta, Canada, is provided as an example. The model covers a 30,865-km2 domain and includes 28 hydrostratigraphic units. Aquifer properties were calibrated to more than 1,500 static hydraulic head measurements and 10 years of measurements during industrial groundwater use. Three regionally extensive aquifers were parameterized (with spatially variable hydraulic conductivity fields), as was the aerial recharge boundary condition, leading to 450 adjustable parameters in total. The PEST-based model calibration was parallelized on up to 250 computing nodes located on Amazon's EC2 servers.
NASA Technical Reports Server (NTRS)
Hayden, Jeffrey L.; Jeffries, Alan
2012-01-01
The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing, and data processing services for a varied fleet of satellites to support weather prediction, weather modeling, and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the system engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and the Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool are described.
The Data Platform for Climate Research and Action: Introducing Climate Watch
NASA Astrophysics Data System (ADS)
Hennig, R. J.; Ge, M.; Friedrich, J.; Lebling, K.; Carlock, G.; Arcipowska, A.; Mangan, E.; Biru, H.; Tankou, A.; Chaudhury, M.
2017-12-01
The Paris Agreement, adopted through Decision 1/CP.21, brings all nations together to take on ambitious efforts to combat climate change. Open access to climate data supporting climate research, advancing knowledge, and informing decision making is key to encouraging and strengthening efforts of stakeholders at all levels to address and respond to the effects of climate change. Climate Watch is a robust online data platform developed in response to the urgent need for knowledge and tools to empower climate research and action, including those of researchers, policy makers, the private sector, civil society, and all other non-state actors. Building on the rapidly growing technology of open data and information sharing, Climate Watch is equipped with an extensive amount of climate data, informative visualizations, a concise yet efficient user interface, and connections to the resources users need to gather insightful information on national and global progress towards delivering on the objectives of the Convention and the Paris Agreement. Climate Watch brings together hundreds of quantitative and qualitative indicators that are easy to explore, visualize, compare, and download at global, national, and sectoral levels: greenhouse gas (GHG) emissions for more than 190 countries over the 1850-2014 period, covering all seven Kyoto gases following IPCC source/sink categories; structured information on over 150 NDCs facilitating the clarity, understanding, and transparency of countries' contributions to address climate change; over 6,500 identified linkages between climate actions in NDCs across the 169 targets of the sustainable development goals (SDGs); over 200 indicators describing low-carbon pathways from models and scenarios by integrated assessment models (IAMs) and national sources; and data on vulnerability and risk, policies, finance, and many more.
Climate Watch platform is developed as part of the broader efforts within the World Resources Institute, the NDC Partnership, and in collaboration with GIZ, UNFCCC, World Bank, and Climate Analytics.
Analysis of hybrid electric/thermofluidic inputs for wet shape memory alloy actuators
NASA Astrophysics Data System (ADS)
Flemming, Leslie; Mascaro, Stephen
2013-01-01
A wet shape memory alloy (SMA) actuator is characterized by an SMA wire embedded within a compliant fluid-filled tube. Heating and cooling of the SMA wire produces a linear contraction and extension of the wire. Thermal energy can be transferred to and from the wire using combinations of resistive heating and free/forced convection. This paper analyzes the speed and efficiency of a simulated wet SMA actuator using a variety of control strategies involving different combinations of electrical and thermofluidic inputs. A computational fluid dynamics (CFD) model is used in conjunction with a temperature-strain model of the SMA wire to simulate the thermal response of the wire and compute strains, contraction/extension times and efficiency. The simulations produce cycle rates of up to 5 Hz for electrical heating and fluidic cooling, and up to 2 Hz for fluidic heating and cooling. The simulated results demonstrate efficiencies up to 0.5% for electric heating and up to 0.2% for fluidic heating. Using both electric and fluidic inputs concurrently improves the speed and efficiency of the actuator and allows for the actuator to remain contracted without continually delivering energy to the actuator, because of the thermal capacitance of the hot fluid. The characterized speeds and efficiencies are key requirements for implementing broader research efforts involving the intelligent control of electric and thermofluidic networks to optimize the speed and efficiency of wet actuator arrays.
NASA Astrophysics Data System (ADS)
Holmes, Sarah
2017-04-01
It is more important than ever to study the oceans and especially the shelf seas, which are disproportionately productive, sustaining over 90% of global fisheries. The economic and societal significance of these shallow oceans, as the interface through which society interacts with the marine environment, makes them highly relevant to the decisions of policy-makers and stakeholders. These decision-makers rely upon empirical data informed by consistent and extensive monitoring and assessment from experts in the field, yet long-term, spatially-extensive datasets of the marine environment do not exist or are of poor quality. Modelling the shelf seas with biogeochemical models can provide valuable data, allowing scientists to look at both past and future scenarios to estimate ecosystem response to change. In particular, the European Regional Sea Ecosystem Model (ERSEM) combines not only the complex hydrographical aspects of the North West European shelf, but also vast numbers of biological and chemical parameters. Though huge efforts across the modelling community are invested in developing and ultimately increasing the reliability of models such as ERSEM, this is typically achieved by looking at relationships with the aforementioned observed datasets, restricting model accuracy and our understanding of ecosystem processes. It is for this reason that proxy data of the marine environment are so valuable. Of all marine proxies available, sclerochronology, the study of the growth bands on long-lived marine molluscs, is the only one proven to provide novel, high-resolution, multi-centennial, annually-resolved, absolutely-dated archives of past ocean environment, analogous to dendrochronology.
For the first time, this PhD project will combine the proxy data of sclerochronology with model hindcast data from the ERSEM with the aim to better understand the North West European shelf sea environment and potentially improve predictions of future climate change in this region and beyond.
Tandem Cylinder Noise Predictions
NASA Technical Reports Server (NTRS)
Lockard, David P.; Khorrami, Mehdi R.; Choudhari, Meelan M.; Hutcheson, Florence V.; Brooks, Thomas F.; Stead, Daniel J.
2007-01-01
In an effort to better understand landing-gear noise sources, we have been examining a simplified configuration that still maintains some of the salient features of landing-gear flow fields. In particular, tandem cylinders have been studied because they model a variety of component level interactions. The present effort is directed at the case of two identical cylinders spatially separated in the streamwise direction by 3.7 diameters. Experimental measurements from the Basic Aerodynamic Research Tunnel (BART) and Quiet Flow Facility (QFF) at NASA Langley Research Center (LaRC) have provided steady surface pressures, detailed off-surface measurements of the flow field using Particle Image Velocimetry (PIV), hot-wire measurements in the wake of the rear cylinder, unsteady surface pressure data, and the radiated noise. The experiments were conducted at a Reynolds number of 1.66 × 10^5 based on the cylinder diameter. A trip was used on the upstream cylinder to ensure a fully turbulent shedding process and simulate the effects of a high Reynolds number flow. The parallel computational effort uses the three-dimensional Navier-Stokes solver CFL3D with a hybrid, zonal turbulence model that turns off the turbulence production term everywhere except in a narrow ring surrounding solid surfaces. The current calculations further explore the influence of the grid resolution and spanwise extent on the flow and associated radiated noise. Extensive comparisons with the experimental data are used to assess the ability of the computations to simulate the details of the flow. The results show that the pressure fluctuations on the upstream cylinder, caused by vortex shedding, are smaller than those generated on the downstream cylinder by wake interaction. Consequently, the downstream cylinder dominates the noise radiation, producing an overall directivity pattern that is similar to that of an isolated cylinder.
Only calculations based on the full length of the model span were able to capture the complete decay in the spanwise correlation, thereby producing reasonable noise radiation levels.
ERIC Educational Resources Information Center
Lowry, Christina; Little, Robert
1985-01-01
The benefits of prototyping as a basis for system design include better specifications, earlier discovery of omissions and extensions, and the likelihood of salvaging much of the effort expended on the prototype. Risks and methods of prototyping during rapid systems development are also noted. (Author/MLW)
MANUAL: BIOVENTING PRINCIPLES AND PRACTICE VOLUME II. BIOVENTING DESIGN
The results from bioventing research and development efforts and from the pilot-scale bioventing systems have been used to produce this two-volume manual. Although this design manual has been written based on extensive experience with petroleum hydrocarbons (and thus, many exampl...
Regional Sustainability: The San Luis Basin Metrics Project
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...
ERIC Educational Resources Information Center
Breslin, Patrick
1988-01-01
Describes cooperative campaign by central Chilean farmers to reduce use of dangerous chemicals. Describes cooperative's rural extension program targeting misuse of pesticides. Describes concern of chemical danger to local and foreign consumers. Cooperative's effort described as balancing short-term economic gains with long-term health and…
Masters Program at Claremont and Other University Extension Activities
ERIC Educational Resources Information Center
Carroll, Ann-Marie, Ed.
1971-01-01
Descriptions of a liberal studies program at Claremont College, California; the University of British Columbia local government project; recent Syracuse University publications in continuing education; Kansas State University teleteaching efforts; the interinstitutional "University without Walls" consortium; and the University of Washington…
Development of a Multidisciplinary Approach to Access Sustainability
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute the metrics. Moreover, individual metrics do not capture all aspects of a system that are relevan...
Social Network Structures among Groundnut Farmers
ERIC Educational Resources Information Center
Thuo, Mary; Bell, Alexandra A.; Bravo-Ureta, Boris E.; Okello, David K.; Okoko, Evelyn Nasambu; Kidula, Nelson L.; Deom, C. Michael; Puppala, Naveen
2013-01-01
Purpose: Groundnut farmers in East Africa have experienced declines in production despite research and extension efforts to increase productivity. This study examined how social network structures related to acquisition of information about new seed varieties and productivity among groundnut farmers in Uganda and Kenya.…
Negotiating Diversity: Fostering Collaborative Interpretations of Case Studies
ERIC Educational Resources Information Center
Guo, Shujie; Cockburn-Wootten, Cheryl; Munshi, Debashish
2014-01-01
The intercultural divides in values, perceptions, and interpretations of concepts have been studied extensively by international business and intercultural communication scholars. Consequentially, much effort in university classrooms is spent on focusing on the differences between groups and on finding ways to "manage" cultural…
Simulations of Bluff Body Flow Interaction for Noise Source Modeling
NASA Technical Reports Server (NTRS)
Khorrami, Mehdi R.; Lockard, David P.; Choudhari, Meelan M.; Jenkins, Luther N.; Neuhart, Dan H.; McGinley, Catherine B.
2006-01-01
The current study is a continuation of our effort to characterize the details of flow interaction between two cylinders in a tandem configuration. This configuration is viewed to possess many of the pertinent flow features of the highly interactive unsteady flow field associated with the main landing gear of large civil transports. The present effort extends our previous two-dimensional, unsteady, Reynolds Averaged Navier-Stokes computations to three dimensions using a quasilaminar, zonal approach, in conjunction with a two-equation turbulence model. Two distinct separation length-to-diameter ratios of L/D = 3.7 and 1.435, representing intermediate and short separation distances between the two cylinders, are simulated. The Mach 0.166 simulations are performed at a Reynolds number of Re = 1.66 × 10^5 to match the companion experiments at NASA Langley Research Center. Extensive comparisons with the measured steady and unsteady surface pressure and off-surface particle image velocimetry data show encouraging agreement. Both prominent and some of the more subtle trends in the mean and fluctuating flow fields are correctly predicted. Both computations and the measured data reveal a more robust and energetic shedding process at L/D = 3.7 in comparison with the weaker shedding in the shorter separation case of L/D = 1.435. The vortex shedding frequency based on the computed surface pressure spectra is in reasonable agreement with the measured Strouhal frequency.
100 Years of Attempts to Transform Physics Education
NASA Astrophysics Data System (ADS)
Otero, Valerie K.; Meltzer, David E.
2016-12-01
As far back as the late 1800s, U.S. physics teachers expressed many of the same ideas about physics education reform that are advocated today. However, several popular reform efforts eventually failed to have wide impact, despite strong and enthusiastic support within the physics education community. Broad-scale implementation of improved instructional models today may be just as elusive as it has been in the past, and for similar reasons. Although excellent instructional models exist and have been available for decades, effective and scalable plans for transforming practice on a national basis have yet to be developed and implemented. Present-day teachers, education researchers, and policy makers can find much to learn from past efforts, both in their successes and their failures. To this end, we present a brief outline of some key ideas in U.S. physics education during the past 130 years. We address three core questions that are prominent in the literature: (a) Why and how should physics be taught? (b) What physics should be taught? (c) To whom should physics be taught? Related issues include the role of the laboratory and attempts to make physics relevant to everyday life. We provide here only a brief summary of the issues and debates found in primary-source literature; an extensive collection of historical resources on physics education is available at https://sites.google.com/site/physicseducationhistory/home.
Bayesian inversion using a geologically realistic and discrete model space
NASA Astrophysics Data System (ADS)
Jaeggli, C.; Julien, S.; Renard, P.
2017-12-01
Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, several major issues remain to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts of geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior of the underground. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specially trained to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to solve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient handling of data by ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool. Nonetheless, the method is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametric systems are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.
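The Markov chain Monte Carlo sampling that the abstract above compares against can be illustrated with a minimal Metropolis sketch. This is a generic toy (the one-dimensional Gaussian "posterior" and all names are invented for illustration), not the paper's method:

```python
import math
import random

random.seed(0)

def log_posterior(theta):
    # Toy stand-in for a real inverse-problem posterior:
    # a Gaussian centered at 2.0 with unit variance.
    return -0.5 * (theta - 2.0) ** 2

def metropolis(n_steps, step=0.5, theta0=0.0):
    """Plain Metropolis sampler: propose a random-walk step,
    accept with probability min(1, posterior ratio)."""
    theta, lp = theta0, log_posterior(theta0)
    samples = []
    for _ in range(n_steps):
        proposal = theta + random.gauss(0.0, step)
        lp_prop = log_posterior(proposal)
        if math.log(random.random()) < lp_prop - lp:
            theta, lp = proposal, lp_prop
        samples.append(theta)
    return samples

samples = metropolis(20000)
mean = sum(samples) / len(samples)
print(round(mean, 1))  # should be close to the posterior mean of 2.0
```

The ergodicity the abstract mentions is what guarantees that, run long enough, the chain's sample average converges to the posterior mean; the cost of that guarantee is the many sequential evaluations that ensemble methods try to avoid.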
Verification of the Multi-Axial, Temperature and Time Dependent (MATT) Failure Criterion
NASA Technical Reports Server (NTRS)
Richardson, David E.; Macon, David J.
2005-01-01
An extensive test and analytical effort has been completed by the Space Shuttle's Reusable Solid Rocket Motor (RSRM) nozzle program to characterize the failure behavior of two epoxy adhesives (TIGA 321 and EA946). As part of this effort, a general failure model, the "Multi-Axial, Temperature, and Time Dependent" or MATT failure criterion was developed. In the initial development of this failure criterion, tests were conducted to provide validation of the theory under a wide range of test conditions. The purpose of this paper is to present additional verification of the MATT failure criterion, under new loading conditions for the adhesives TIGA 321 and EA946. In many cases, the loading conditions involve an extrapolation from the conditions under which the material models were originally developed. Testing was conducted using three loading conditions: multi-axial tension, torsional shear, and non-uniform tension in a bondline condition. Tests were conducted at constant and cyclic loading rates ranging over four orders of magnitude. Tests were conducted under environmental conditions of primary interest to the RSRM program. The temperature range was not extreme, but the loading rates were extreme (varying by four orders of magnitude). It should be noted that the testing was conducted at temperatures below the glass transition temperature of the TIGA 321 adhesive. However, for EA946, the testing was conducted at temperatures that bracketed the glass transition temperature.
Modelling methane fluxes from managed and restored peatlands
NASA Astrophysics Data System (ADS)
Cresto Aleina, F.; Rasche, L.; Hermans, R.; Subke, J. A.; Schneider, U. A.; Brovkin, V.
2015-12-01
European peatlands have been extensively managed over past centuries. Typical management activities consisted of drainage and afforestation, which led to considerable damage to the peat and potentially significant carbon loss. Recent efforts to restore previously managed peatlands have been carried out throughout Europe. These restoration efforts have direct implications for water table depth and greenhouse gas emissions, thus impacting the ecosystem services provided by peatland areas. In order to quantify the impact of peatland restoration on water table depth and greenhouse gas budget, we coupled the Environmental Policy Integrated Climate (EPIC) model to a process-based model for methane emissions (Walter and Heimann, 2000). The new model (EPIC-M) can potentially be applied at the European and even the global scale, but it has yet to be tested and evaluated. We present results of this new tool from different peatlands in the Flow Country, Scotland. Large parts of the peatlands of the region were drained and afforested during the 1980s, but since the late 1990s, programs to restore peatlands in the Flow Country have been implemented. This region therefore offers a range of peatlands, from near pristine to afforested and drained, with different restoration ages in between, where we can apply the EPIC-M model and validate it against experimental data from all stages of restoration. Goals of this study are to evaluate the EPIC-M model and its performance against in situ measurements of methane emissions and water table changes in drained and restored peatlands. Secondly, our purpose is to study the environmental impact of peatland restoration, including methane emissions due to the rewetting of drained surfaces. To do so, we forced the EPIC-M model with local meteorological and soil data, and simulated soil temperatures, water table dynamics, and greenhouse gas emissions.
This is the first step towards a European-wide application of the EPIC-M model for the assessment of the environmental impact of peatland restoration.
Modeling water table dynamics in managed and restored peatlands
NASA Astrophysics Data System (ADS)
Cresto Aleina, Fabio; Rasche, Livia; Hermans, Renée; Subke, Jens-Arne; Schneider, Uwe; Brovkin, Victor
2016-04-01
European peatlands have been extensively managed over past centuries. Typical management activities consisted of drainage and afforestation, which led to considerable damage to the peat and potentially significant carbon loss. Recent efforts to restore previously managed peatlands have been carried out throughout Europe. These restoration efforts have direct implications for water table depth and greenhouse gas emissions, thus impacting the ecosystem services provided by peatland areas. In order to quantify the impact of peatland restoration on water table depth and greenhouse gas budget, we coupled the Environmental Policy Integrated Climate (EPIC) model to a process-based model for methane emissions (Walter and Heimann, 2000). The new model (EPIC-M) can potentially be applied at the European and even the global scale, but it has yet to be tested and evaluated. We present results of this new tool from different peatlands in the Flow Country, Scotland. Large parts of the peatlands of the region were drained and afforested during the 1980s, but since the late 1990s, programs to restore peatlands in the Flow Country have been implemented. This region therefore offers a range of peatlands, from near pristine to afforested and drained, with different restoration ages in between, where we can apply the EPIC-M model and validate it against experimental data from all stages of restoration. Goals of this study are to evaluate the EPIC-M model and its performance against in situ measurements of methane emissions and water table changes in drained and restored peatlands. Secondly, our purpose is to study the environmental impact of peatland restoration, including methane emissions due to the rewetting of drained surfaces. To do so, we forced the EPIC-M model with local meteorological and soil data, and simulated soil temperatures, water table dynamics, and greenhouse gas emissions.
This is the first step towards a European-wide application of the EPIC-M model for the assessment of the environmental impact of peatland restoration.
Using aircraft and satellite observations to improve regulatory air quality models
NASA Astrophysics Data System (ADS)
Canty, T. P.; Vinciguerra, T.; Anderson, D. C.; Carpenter, S. F.; Goldberg, D. L.; Hembeck, L.; Montgomery, L.; Liu, X.; Salawitch, R. J.; Dickerson, R. R.
2014-12-01
Federal and state agencies rely on EPA approved models to develop attainment strategies that will bring states into compliance with the National Ambient Air Quality Standards (NAAQS). We will describe modifications to the Community Multi-Scale Air Quality (CMAQ) model and Comprehensive Air Quality Model with Extensions (CAMx) frameworks motivated by analysis of NASA satellite and aircraft measurements. Observations of tropospheric column NO2 from OMI have already led to the identification of an important deficiency in the chemical mechanisms used by models; data collected during the DISCOVER-AQ field campaign have been instrumental in devising an improved representation of the chemistry of nitrogen species. Our recent work has focused on the use of: OMI observations of tropospheric O3 to assess and improve the representation of boundary conditions used by AQ models, OMI NO2 to derive a top-down NOx emission inventory from commercial shipping vessels that affect air quality in the Eastern U.S., and OMI HCHO to assess the C5H8 emission inventories provided by biogenic emissions models. We will describe how these OMI-driven model improvements are being incorporated into the State Implementation Plans (SIPs) being prepared for submission to EPA in summer 2015 and how future modeling efforts may be impacted by our findings.
Kretsinger, Katrina; Strebel, Peter; Kezaala, Robert; Goodson, James L
2017-07-01
The Global Polio Eradication Initiative has built an extensive infrastructure with capabilities and resources that should be transitioned to measles and rubella elimination efforts. Measles continues to be a major cause of child mortality globally, and rubella continues to be the leading infectious cause of birth defects. Measles and rubella eradication is feasible and cost saving. The obvious similarities in strategies between polio elimination and measles and rubella elimination include the use of an extensive surveillance and laboratory network, outbreak preparedness and response, extensive communications and social mobilization networks, and the need for periodic supplementary immunization activities. Polio staff and resources are already connected with those of measles and rubella, and polio resources are concentrated in the countries with the highest burden of measles and rubella; transitioning existing capabilities to measles and rubella elimination efforts therefore allows for optimized use of resources and the best opportunity to incorporate important lessons learned from polio eradication. Measles and rubella elimination strategies rely heavily on achieving and maintaining high vaccination coverage through the routine immunization infrastructure, thus creating synergies with immunization systems approaches, in what is termed a "diagonal approach." © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
A New Extension Model: The Memorial Middle School Agricultural Extension and Education Center
ERIC Educational Resources Information Center
Skelton, Peter; Seevers, Brenda
2010-01-01
The Memorial Middle School Agricultural Extension and Education Center is a new model for Extension. The center applies the Cooperative Extension Service System philosophy and mission to developing public education-based programs. Programming primarily serves middle school students and teachers through agricultural and natural resource science…
A hierarchical nest survival model integrating incomplete temporally varying covariates
Converse, Sarah J; Royle, J Andrew; Adler, Peter H; Urbanek, Richard P; Barzen, Jeb A
2013-01-01
Nest success is a critical determinant of the dynamics of avian populations, and nest survival modeling has played a key role in advancing avian ecology and management. Beginning with the development of daily nest survival models, and proceeding through subsequent extensions, the capacity for modeling the effects of hypothesized factors on nest survival has expanded greatly. We extend nest survival models further by introducing an approach to deal with incompletely observed, temporally varying covariates using a hierarchical model. Hierarchical modeling offers a way to separate process and observational components of demographic models to obtain estimates of the parameters of primary interest, and to evaluate structural effects of ecological and management interest. We built a hierarchical model for daily nest survival to analyze nest data from reintroduced whooping cranes (Grus americana) in the Eastern Migratory Population. This reintroduction effort has been beset by poor reproduction, apparently due primarily to nest abandonment by breeding birds. We used the model to assess support for the hypothesis that nest abandonment is caused by harassment from biting insects. We obtained indices of blood-feeding insect populations based on the spatially interpolated counts of insects captured in carbon dioxide traps. However, insect trapping was not conducted daily, and so we had incomplete information on a temporally variable covariate of interest. We therefore supplemented our nest survival model with a parallel model for estimating the values of the missing insect covariates. We used Bayesian model selection to identify the best predictors of daily nest survival. Our results suggest that the black fly Simulium annulus may be negatively affecting nest survival of reintroduced whooping cranes, with decreasing nest survival as abundance of S. annulus increases. 
The modeling framework we have developed will be applied in the future to a larger data set to evaluate the biting-insect hypothesis and other hypotheses for nesting failure in this reintroduced population; resulting inferences will support ongoing efforts to manage this population via an adaptive management approach. Wider application of our approach offers promise for modeling the effects of other temporally varying, but imperfectly observed covariates on nest survival, including the possibility of modeling temporally varying covariates collected from incubating adults. PMID:24340185
A hierarchical nest survival model integrating incomplete temporally varying covariates
Converse, Sarah J.; Royle, J. Andrew; Adler, Peter H.; Urbanek, Richard P.; Barzen, Jeb A.
2013-01-01
Nest success is a critical determinant of the dynamics of avian populations, and nest survival modeling has played a key role in advancing avian ecology and management. Beginning with the development of daily nest survival models, and proceeding through subsequent extensions, the capacity for modeling the effects of hypothesized factors on nest survival has expanded greatly. We extend nest survival models further by introducing an approach to deal with incompletely observed, temporally varying covariates using a hierarchical model. Hierarchical modeling offers a way to separate process and observational components of demographic models to obtain estimates of the parameters of primary interest, and to evaluate structural effects of ecological and management interest. We built a hierarchical model for daily nest survival to analyze nest data from reintroduced whooping cranes (Grus americana) in the Eastern Migratory Population. This reintroduction effort has been beset by poor reproduction, apparently due primarily to nest abandonment by breeding birds. We used the model to assess support for the hypothesis that nest abandonment is caused by harassment from biting insects. We obtained indices of blood-feeding insect populations based on the spatially interpolated counts of insects captured in carbon dioxide traps. However, insect trapping was not conducted daily, and so we had incomplete information on a temporally variable covariate of interest. We therefore supplemented our nest survival model with a parallel model for estimating the values of the missing insect covariates. We used Bayesian model selection to identify the best predictors of daily nest survival. Our results suggest that the black fly Simulium annulus may be negatively affecting nest survival of reintroduced whooping cranes, with decreasing nest survival as abundance of S. annulus increases. 
The modeling framework we have developed will be applied in the future to a larger data set to evaluate the biting-insect hypothesis and other hypotheses for nesting failure in this reintroduced population; resulting inferences will support ongoing efforts to manage this population via an adaptive management approach. Wider application of our approach offers promise for modeling the effects of other temporally varying, but imperfectly observed covariates on nest survival, including the possibility of modeling temporally varying covariates collected from incubating adults.
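The daily nest survival models described in the two records above multiply per-day survival probabilities, each typically modeled on the logit scale as a function of covariates such as insect abundance. A minimal sketch of that structure, with invented coefficients and covariate values (none taken from the study):

```python
import math

def daily_survival(beta0, beta1, covariate):
    """Logit-linear daily survival probability, the standard
    building block of daily nest survival models."""
    eta = beta0 + beta1 * covariate
    return 1.0 / (1.0 + math.exp(-eta))

def nest_survival(beta0, beta1, covariates):
    """Probability a nest survives its whole exposure period:
    the product of the daily survival probabilities."""
    p = 1.0
    for x in covariates:
        p *= daily_survival(beta0, beta1, x)
    return p

# Hypothetical 10-day exposure with rising (standardized) black-fly abundance.
insects = [0.1 * d for d in range(10)]
print(round(nest_survival(3.0, -1.5, insects), 3))
```

A negative covariate coefficient (here -1.5) encodes the paper's hypothesized effect: overall nest survival drops as insect abundance rises. The hierarchical extension the authors introduce adds a second model that imputes the covariate values on days when no trapping occurred.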
Great Basin paleontological database
Zhang, N.; Blodgett, R.B.; Hofstra, A.H.
2008-01-01
The U.S. Geological Survey has constructed a paleontological database for the Great Basin physiographic province that can be served over the World Wide Web for data entry, queries, displays, and retrievals. It is similar to the web-database solution that we constructed for Alaskan paleontological data (www.alaskafossil.org). The first phase of this effort, compiling a paleontological bibliography for Nevada and portions of adjacent states in the Great Basin, has recently been completed. In addition, we are compiling paleontological reports (known as E&R reports) of the U.S. Geological Survey, which are another extensive source of legacy data for this region. Initial population of the database benefited from a recently published conodont data set and is otherwise focused on Devonian and Mississippian localities because strata of this age host important sedimentary exhalative (sedex) Au, Zn, and barite resources and enormous Carlin-type Au deposits. In addition, these strata are the most important petroleum source rocks in the region, and record the transition from extension to contraction associated with the Antler orogeny, the Alamo meteorite impact, and biotic crises associated with global oceanic anoxic events. The finished product will provide an invaluable tool for future geologic mapping, paleontological research, and mineral resource investigations in the Great Basin, making paleontological data acquired over nearly the past 150 yr readily available over the World Wide Web. A description of the structure of the database and the web interface developed for this effort are provided herein. This database is being used as a model for a National Paleontological Database (which we are currently developing for the U.S. Geological Survey) as well as for other paleontological databases now being developed in other parts of the globe. © 2008 Geological Society of America.
QuakeML - An XML Schema for Seismology
NASA Astrophysics Data System (ADS)
Wyss, A.; Schorlemmer, D.; Maraini, S.; Baer, M.; Wiemer, S.
2004-12-01
We propose an extensible format-definition for seismic data (QuakeML). Sharing data and seismic information efficiently is one of the most important issues for research and observational seismology in the future. The eXtensible Markup Language (XML) is playing an increasingly important role in the exchange of a variety of data. Due to its extensible definition capabilities, its wide acceptance and the existing large number of utilities and libraries for XML, a structured representation of various types of seismological data should in our opinion be developed by defining a 'QuakeML' standard. Here we present the QuakeML definitions for parameter databases and further efforts, e.g. a central QuakeML catalog database and a web portal for exchanging codes and stylesheets.
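The appeal of an XML-based exchange format like the one proposed above is that any XML-aware client can produce and consume event records with standard tooling. A small Python sketch using the standard library illustrates the idea; the element names here are invented for illustration and are not the actual QuakeML schema:

```python
import xml.etree.ElementTree as ET

# Build a minimal, hypothetical XML event record.
event = ET.Element("event", id="example2004")
origin = ET.SubElement(event, "origin")
ET.SubElement(origin, "latitude").text = "46.8"
ET.SubElement(origin, "longitude").text = "8.2"
ET.SubElement(origin, "time").text = "2004-12-01T00:00:00Z"
magnitude = ET.SubElement(event, "magnitude")
ET.SubElement(magnitude, "value").text = "4.3"

# Serialize for exchange...
xml_text = ET.tostring(event, encoding="unicode")
print(xml_text)

# ...and any recipient can parse it back without custom code.
parsed = ET.fromstring(xml_text)
print(parsed.find("origin/latitude").text)  # prints "46.8"
```

A schema definition (as QuakeML proposes) then lets producers and consumers validate such documents against an agreed structure rather than private conventions.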
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-15
... FEDERAL COMMUNICATIONS COMMISSION Notice of Public Information Collection Being Reviewed by the Federal Communications Commission for Extension Under Delegated Authority, Comments Requested March 5, 2010. SUMMARY: The Federal Communications Commission, as part of its continuing effort to reduce...
A REVIEW OF NICKEL PLATING BATH LIFE EXTENSION, NICKEL RECOVERY & COPPER RECOVERY FROM NICKEL BATHS
For metal finishing operations to remain competitive and in compliance with environmental requirements, companies must focus their efforts on pollution prevention to reduce waste generation and disposal costs, limit liability and restore maximum profits. By applying the pollutio...
A Bridge to the Future: Observations on Building a Digital Library.
ERIC Educational Resources Information Center
Gaunt, Marianne I.
2002-01-01
The experience of Rutgers University Libraries illustrates the extensive planning, work effort, possibilities, and investment required to develop the digital library. Examines these key areas: organizational structure; staff development needs; facilities and the new digital infrastructure; metadata standards/interoperability; digital collection…
Challenges and opportunities with standardized monitoring for management decision-making
USDA-ARS?s Scientific Manuscript database
The importance of monitoring for adaptive management of rangelands has been well established. However, the actual use of monitoring data in rangeland management decisions has been modest despite extensive efforts to develop and implement monitoring programs from local to national scales. More effect...
20180318 - Curating and sharing structures and spectra for the environmental community (ACS Spring)
The increasing popularity of high mass accuracy non-target mass spectrometry methods has yielded extensive identification efforts based on spectral and chemical compound databases in the environmental community and beyond. Increasingly, new methods are relying on open data resour...
Regional sustainable environmental management: sustainability metrics research for decision makers
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...
Development of a multidisciplinary approach to assess regional sustainability
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute the metrics. Moreover, individual metrics do not capture all aspects of a system that are relev...
ERIC Educational Resources Information Center
California Agriculture, 1994
1994-01-01
This special issue focuses on problems and challenges confronting the California family and on research and extension efforts to provide at least partial answers. Research briefs by staff include "Challenges Confront the California Family" (state trends in poverty, divorce, single-parent families, child abuse, delinquency, teen births,…
Aggregate freeze-thaw testing and d-cracking field performance : 30 years later : [summary].
DOT National Transportation Integrated Search
2014-09-01
Premature deterioration of concrete pavement due to D-cracking has been a problem in : Kansas since the 1930s. The Kansas Department of Transportation (KDOT) has made : significant efforts, including five extensive studies into the phenomenon of D-Cr...
EFFECTS OF CONAZOLE FUNGICIDES ON DEVELOPMENT AND PARTURITION IN THE RAT
Conazoles are fungicides used extensively in agriculture and as pharmaceuticals. As part of an effort to evaluate the changes in gene expression corresponding to reproductive toxicity, we examined the effects of three conazoles on pregnancy and neonates. Wistar Han rats were expo...
7 CFR 3403.4 - Three-phase program.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE SMALL BUSINESS INNOVATION RESEARCH GRANTS PROGRAM Program Description § 3403.4 Three-phase program. The Small Business Innovation Research Grants Program is... technical merit and feasibility of the proposed effort and the quality of performance of the small business...
Accessing Electronic Theses: Progress?
ERIC Educational Resources Information Center
Tennant, Roy
2000-01-01
Describes various ways by which universities provide access to their electronic theses and dissertations (ETDs), discussing UMI (University Microfilms International), XML (eXtensible Markup Language), and other formats. Discusses key leaders--national and international--in the ETD effort. Outlines the two main methods for locating ETDs. Presents a…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... extensive scientific studies, conferences and workshops. To guide the efforts of the GLRI, EPA and its... Web site at http://www.epa.gov/sab in advance of the meeting. Procedures for Providing Public Input...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-01
... Information Collection Activity Under OMB Review: Aviation Security Customer Satisfaction Performance... surveying travelers to measure customer satisfaction of aviation security in an effort to more efficiently... Title: Aviation Security Customer Satisfaction Performance Measurement Passenger Survey. Type of Request...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-26
... Information Collection Activity Under OMB Review: Aviation Security Customer Satisfaction Performance... surveying travelers to measure customer satisfaction of aviation security in an effort to more efficiently.... Information Collection Requirement OMB Control Number 1652-0013; Aviation Security Customer Satisfaction...
Methane (CH4), a potent greenhouse gas, is known to be produced and emitted from freshwater systems. Recently, extensive efforts have been directed toward quantifying methane emissions from these ecosystems, while additional research has focused on factors that may influence emissi...
Estimating carbon fluxes on small rotationally grazed pastures
USDA-ARS?s Scientific Manuscript database
Satellite-based Normalized Difference Vegetation Index (NDVI) data have been extensively used for estimating gross primary productivity (GPP) and yield of grazing lands throughout the world. Large-scale estimates of GPP are a necessary component of efforts to monitor the soil carbon balance of grazi...
Towards Cooperative Learning in Elementary School Physical Education
ERIC Educational Resources Information Center
Kirchner, Glenn
2005-01-01
The extensive amount of research evidence, at all levels of education and with all subject areas, consistently indicates that cooperative learning results in higher achievement, increased positive interpersonal relationships, and higher self-esteem than competitive or individualistic efforts. In physical education, individualistic learning is an…
Landmarks in the historical development of twenty first century food processing technologies.
Misra, N N; Koubaa, Mohamed; Roohinejad, Shahin; Juliano, Pablo; Alpas, Hami; Inácio, Rita S; Saraiva, Jorge A; Barba, Francisco J
2017-07-01
Over the course of centuries, various food processing technologies have been explored and implemented to provide safe, fresher-tasting and nutritive food products. Among these technologies, emerging food processes (e.g., cold plasma, pressurized fluids, pulsed electric fields, ohmic heating, radiofrequency electric fields, ultrasonics and megasonics, high hydrostatic pressure, high pressure homogenization, hyperbaric storage, and negative pressure cavitation extraction) have attracted much attention in the past decades. This is because, compared to their conventional counterparts, novel food processes allow a significant reduction in overall processing times with savings in energy consumption, while ensuring food safety and ample benefits for the industry. Notably, industry and university teams have made extensive efforts toward the development of novel technologies, with sound scientific knowledge of their effects on different food materials. The main objective of this review is to provide a historical account of the extensive efforts and inventions in the field of emerging food processing technologies from their inception to the present day. Copyright © 2017 Elsevier Ltd. All rights reserved.
Heat transfer with hockey-stick steam generator. [LMFBR]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, E; Gabler, M J
1977-11-01
The hockey-stick modular design concept is a good answer to future needs for reliable, economic LMFBR steam generators. The concept was successfully demonstrated in the 30 Mwt MSG test unit; scaled-up versions are currently in fabrication for CRBRP usage, and further scaling has been accomplished for PLBR applications. Design and performance characteristics are presented for the three generations of hockey-stick steam generators. The key features of the design are presented based on extensive analytical effort backed up by extensive ancillary test data. The bases for and actual performance evaluations are presented with emphasis on the CRBRP design. The design effort on these units has resulted in the development of analytical techniques that are directly applicable to steam generators for any LMFBR application. In conclusion, the hockey-stick steam generator concept has been proven to perform both thermally and hydraulically as predicted. The heat transfer characteristics are well defined, and proven analytical techniques are available, as are personnel experienced in their use.