Data-driven Modelling for decision making under uncertainty
NASA Astrophysics Data System (ADS)
Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus
2018-01-01
The rise of issues involving uncertainty in decision making has become a much-discussed topic in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data, and to find the best model for decision-making problems under uncertainty, using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and a Bayesian approach. Model criteria are tested to determine the smallest error; the model with the smallest error is taken as the best model.
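As a rough illustration of the selection rule just described (the candidate with the smallest error wins), the sketch below fits several model forms to the same data and keeps the one with the lowest RMSE. The candidate forms and the synthetic data are assumptions for illustration, not the paper's models.

```python
# Minimal sketch of model selection by smallest error (RMSE here).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.1, 10, 50)
y = 2.0 * x + 1.5 + rng.normal(0, 1.0, x.size)   # synthetic observations

def rmse(pred, obs):
    return np.sqrt(np.mean((pred - obs) ** 2))

candidates = {
    "linear":    np.polyval(np.polyfit(x, y, 1), x),
    "quadratic": np.polyval(np.polyfit(x, y, 2), x),
    "log":       np.polyval(np.polyfit(np.log(x), y, 1), np.log(x)),
}
errors = {name: rmse(pred, y) for name, pred in candidates.items()}
best = min(errors, key=errors.get)
print(errors, "-> best model:", best)
```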
Effects of a Data-Driven District-Level Reform Model
ERIC Educational Resources Information Center
Slavin, Robert E.; Holmes, GwenCarol; Madden, Nancy A.; Chamberlain, Anne; Cheung, Alan
2010-01-01
Despite a quarter-century of reform, US schools serving students in poverty continue to lag far behind other schools. There are proven programs, but these are not widely used. This large-scale experiment evaluated a district-level reform model created by the Center for Data-Driven Reform in Education (CDDRE). The CDDRE model provided consultation…
Zhang, Huaguang; Cui, Lili; Zhang, Xin; Luo, Yanhong
2011-12-01
In this paper, a novel data-driven robust approximate optimal tracking control scheme is proposed for unknown general nonlinear systems by using the adaptive dynamic programming (ADP) method. In the design of the controller, only available input-output data are required instead of known system dynamics. A data-driven model is established by a recurrent neural network (NN) to reconstruct the unknown system dynamics using available input-output data. By adding a novel adjustable term related to the modeling error, the resultant modeling error is first guaranteed to converge to zero. Then, based on the obtained data-driven model, the ADP method is utilized to design the approximate optimal tracking controller, which consists of the steady-state controller and the optimal feedback controller. Further, a robustifying term is developed to compensate for the NN approximation errors introduced by implementing the ADP method. Based on the Lyapunov approach, stability analysis of the closed-loop system is performed to show that the proposed controller guarantees that the system state asymptotically tracks the desired trajectory. Additionally, the obtained control input is proven to be close to the optimal control input within a small bound. Finally, two numerical examples are used to demonstrate the effectiveness of the proposed control scheme.
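A minimal sketch of the first step only, learning a surrogate of the unknown dynamics from input-output data: the paper uses a recurrent NN, while this sketch substitutes a plain feedforward regressor for brevity, and the "unknown" system is invented.

```python
# Learn x_{k+1} = f(x_k, u_k) from input-output samples, then roll it forward.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (2000, 2))                 # columns: state x_k, input u_k
x_next = 0.8 * X[:, 0] + 0.2 * np.sin(X[:, 1])    # dynamics unknown to the learner

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, x_next)                              # reconstruct dynamics from data

x = 0.5
for u in [0.1, -0.2, 0.3]:                        # simulate with the surrogate
    x = model.predict([[x, u]])[0]
print("surrogate state after 3 steps:", x)
```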
Data to Decisions: Creating a Culture of Model-Driven Drug Discovery.
Brown, Frank K; Kopti, Farida; Chang, Charlie Zhenyu; Johnson, Scott A; Glick, Meir; Waller, Chris L
2017-09-01
Merck & Co., Inc., Kenilworth, NJ, USA, is undergoing a transformation in the way that it prosecutes R&D programs. Through the adoption of a "model-driven" culture, enhanced R&D productivity is anticipated, both in the form of decreased attrition at each stage of the process and by providing a rational framework for understanding and learning from the data generated along the way. This new approach focuses on the concept of a "Design Cycle" that makes use of all the data possible, internally and externally, to drive decision-making. These data can take the form of bioactivity, 3D structures, genomics, pathway, PK/PD, safety data, etc. Synthesis of high-quality data into models utilizing both well-established and cutting-edge methods has been shown to yield high confidence predictions to prioritize decision-making and efficiently reposition resources within R&D. The goal is to design an adaptive research operating plan that uses both modeled data and experiments, rather than just testing, to drive project decision-making. To support this emerging culture, an ambitious information management (IT) program has been initiated to implement a harmonized platform to facilitate the construction of cross-domain workflows to enable data-driven decision-making and the construction and validation of predictive models. These goals are achieved through depositing model-ready data, agile persona-driven access to data, a unified cross-domain predictive model lifecycle management platform, and support for flexible scientist-developed workflows that simplify data manipulation and consume model services. The end-to-end nature of the platform, in turn, not only supports but also drives the culture change by enabling scientists to apply predictive sciences throughout their work and over the lifetime of a project. This shift in mindset for both scientists and IT was driven by an early impactful demonstration of the potential benefits of the platform, in which expert-level early discovery predictive models were made available from familiar desktop tools, such as ChemDraw. This was built using a workflow-driven service-oriented architecture (SOA) on top of the rigorous registration of all underlying model entities.
A Hybrid Physics-Based Data-Driven Approach for Point-Particle Force Modeling
NASA Astrophysics Data System (ADS)
Moore, Chandler; Akiki, Georges; Balachandar, S.
2017-11-01
This study improves upon the physics-based pairwise interaction extended point-particle (PIEP) model. The PIEP model leverages a physical framework to predict fluid-mediated interactions between solid particles. While the PIEP model is a powerful tool, its pairwise assumption leads to increased error in flows with high particle volume fractions. To reduce this error, a regression algorithm is used to model the differences between the current PIEP model's predictions and the results of direct numerical simulations (DNS) for an array of monodisperse solid particles subjected to various flow conditions. The resulting statistical model and the physical PIEP model are superimposed to construct a hybrid, physics-based data-driven PIEP model. It must be noted that the performance of a pure data-driven approach without the model form provided by the physical PIEP model is substantially inferior. The hybrid model's predictive capabilities are analyzed using additional DNS. In every case tested, the hybrid PIEP model's predictions are more accurate than those of the physical PIEP model. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1315138 and the U.S. DOE, NNSA, ASC Program, as a Cooperative Agreement under Contract No. DE-NA0002378.
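The hybrid construction can be sketched as follows: a regressor is trained only on the residuals between a physics-based prediction and reference data, then superimposed on the physics model. The stand-in physics function, features, and "DNS" values below are assumptions, not the actual PIEP model.

```python
# Hybrid physics + data-driven model: learn the residual, not the whole signal.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
features = rng.uniform(0, 1, (500, 3))      # e.g. volume fraction, Re, spacing

def physics_model(f):                       # stand-in for the pairwise physics model
    return 1.0 + 2.0 * f[:, 0]

truth = physics_model(features) + 0.5 * features[:, 1] ** 2   # stand-in "DNS"

residual_model = RandomForestRegressor(random_state=0)
residual_model.fit(features, truth - physics_model(features))

hybrid = physics_model(features) + residual_model.predict(features)
print("physics-only RMSE:", np.sqrt(np.mean((physics_model(features) - truth) ** 2)))
print("hybrid RMSE:     ", np.sqrt(np.mean((hybrid - truth) ** 2)))
```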
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
Implementing a Successful Faculty, Data Driven Model for Program Review.
ERIC Educational Resources Information Center
Beal, Suzanne; Davis, Shirley
Frederick Community College (Maryland) utilizes both the Instructional Accountability Program Review (IAPR) and the Career Program Review (CPR) to assess program outcomes and determine progress in meeting goals and objectives. The IAPR is a comprehensive review procedure conducted by faculty and associate deans to evaluate all transfer, career,…
A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aiken, Alex
The placement and movement of data is becoming the key limiting factor on both performance and energy efficiency of high performance computations. As systems generate more data, it is becoming increasingly difficult to actually move that data elsewhere for post-processing, as the rate of improvements in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme-scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details for both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.
USACM Thematic Workshop On Uncertainty Quantification And Data-Driven Modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, James R.
The USACM Thematic Workshop on Uncertainty Quantification and Data-Driven Modeling was held on March 23-24, 2017, in Austin, TX. The organizers of the technical program were James R. Stewart of Sandia National Laboratories and Krishna Garikipati of University of Michigan. The administrative organizer was Ruth Hengst, who serves as Program Coordinator for the USACM. The organization of this workshop was coordinated through the USACM Technical Thrust Area on Uncertainty Quantification and Probabilistic Analysis. The workshop website (http://uqpm2017.usacm.org) includes the presentation agenda as well as links to several of the presentation slides (permission to access the presentations was granted by each of those speakers, respectively). Herein, this final report contains the complete workshop program that includes the presentation agenda, the presentation abstracts, and the list of posters.
Lazy evaluation of FP programs: A data-flow approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Y.H.; Gaudiot, J.L.
1988-12-31
This paper presents a lazy evaluation system for the list-based functional language, Backus' FP, in a data-driven environment. A superset language of FP, called DFP (Demand-driven FP), is introduced. FP eager programs are transformed into DFP lazy programs which contain the notion of demands. The data-driven execution of DFP programs has the same effect as lazy evaluation. DFP lazy programs have the property of always evaluating a sufficient and necessary result. The infinite sequence generator is used to demonstrate the eager-lazy program transformation and the execution of the lazy programs.
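Python generators give a convenient analogy for the demand-driven behavior described here: an infinite sequence generator computes elements only when a consumer demands them, so exactly the sufficient and necessary prefix is ever evaluated. This is an illustration of the idea, not DFP itself.

```python
# Demand-driven evaluation of an infinite sequence via generators.
from itertools import count, islice

def naturals():
    # infinite sequence generator; nothing runs until a value is demanded
    for n in count(1):
        yield n

squares = (n * n for n in naturals())   # still no computation has happened
print(list(islice(squares, 5)))         # demand exactly five elements -> [1, 4, 9, 16, 25]
```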
Using New Remotely-sensed Biomass To Estimate Co2 Fluxes Over Siberia
NASA Astrophysics Data System (ADS)
Lafont, S.; Kergoat, L.; Dedieu, G.; Le Toan, T.
Two European programs recently focused on Siberia. The first one, Eurosiberian Carbonflux, was a feasibility study for an observation system of the regional CO2 fluxes. The second one, SIBERIA, was a major effort to develop and validate a biomass map of Siberia using radar data from satellites (J-ERS, ERS). Here, we extend the simulation of NPP performed for the first program by using the biomass data of the second program. The TURC model, used here, is a global NPP model based on light-use efficiency, where photosynthetic assimilation is driven by a satellite vegetation index and autotrophic respiration is driven by biomass. In this study, we present a "zoom" on the Siberian region. The TURC model was run with a fine resolution (a few kilometers) and a daily time step. We discuss the impact of the new biomass dataset on the estimation of Net Primary Productivity (NPP) and CO2 fluxes.
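A light-use-efficiency NPP estimate of the general kind TURC makes can be sketched in a few lines; all coefficients and the fAPAR-NDVI relation below are illustrative assumptions, not TURC's calibrated values.

```python
# Toy light-use-efficiency NPP: assimilation from a vegetation index and
# absorbed radiation, respiration from biomass. All constants are assumed.
def npp_estimate(par, ndvi, biomass,
                 lue=1.1,             # gC per MJ APAR (assumed)
                 resp_coeff=0.0015):  # gC per g biomass per day (assumed)
    fapar = max(0.0, 1.24 * ndvi - 0.17)   # simple fAPAR-NDVI relation (assumed)
    gpp = lue * fapar * par                # gross assimilation, gC/m2/day
    ra = resp_coeff * biomass              # autotrophic respiration, gC/m2/day
    return gpp - ra

print(npp_estimate(par=8.0, ndvi=0.6, biomass=1200.0))
```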
Computational challenges in modeling gene regulatory events.
Pataskar, Abhijeet; Tiwari, Vijay K
2016-10-19
Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.
NASA Astrophysics Data System (ADS)
Nelson, Kevin; Corbin, George; Blowers, Misty
2014-05-01
Machine learning is continuing to gain popularity due to its ability to solve problems that are difficult to model using conventional computer programming logic. Much of the current and past work has focused on algorithm development, data processing, and optimization. Lately, a subset of research has emerged which explores issues related to security. This research is gaining traction as systems employing these methods are being applied to both secure and adversarial environments. One of machine learning's biggest benefits, its data-driven versus logic-driven approach, is also a weakness if the data on which the models rely are corrupted. Adversaries could maliciously influence systems which address drift and data distribution changes using re-training and online learning. Our work is focused on exploring the resilience of various machine learning algorithms to these data-driven attacks. In this paper, we present our initial findings using Monte Carlo simulations, and statistical analysis, to explore the maximal achievable shift to a classification model, as well as the required amount of control over the data.
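The type of experiment described can be sketched as a Monte Carlo loop: repeatedly flip a fraction of training labels and measure how far the classifier's decision boundary drifts from the clean fit. The data, model, and shift metric below are simplified stand-ins.

```python
# Monte Carlo estimate of classifier shift under label-flipping poisoning.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(1, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

clean = LogisticRegression().fit(X, y)

shifts = []
for _ in range(100):                       # Monte Carlo trials
    y_poison = y.copy()
    flip = rng.choice(len(y), size=int(0.1 * len(y)), replace=False)
    y_poison[flip] = 1 - y_poison[flip]    # adversarial label flips
    poisoned = LogisticRegression().fit(X, y_poison)
    shifts.append(np.linalg.norm(poisoned.coef_ - clean.coef_))

print("mean boundary shift:", np.mean(shifts), "+/-", np.std(shifts))
```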
Pandey, Daya Shankar; Pan, Indranil; Das, Saptarshi; Leahy, James J; Kwapinski, Witold
2015-03-01
A multi-gene genetic programming technique is proposed as a new method to predict syngas yield production and the lower heating value for municipal solid waste gasification in a fluidized bed gasifier. The study shows that the predicted outputs of the municipal solid waste gasification process are in good agreement with the experimental dataset and also generalise well to validation (untrained) data. Published experimental datasets are used for model training and validation purposes. The results show the effectiveness of the genetic programming technique for solving complex nonlinear regression problems. The multi-gene genetic programming model is also compared with a single-gene genetic programming model to show the relative merits and demerits of the technique. This study demonstrates that the genetic programming based data-driven modelling strategy can be a good candidate for developing models for other types of fuels as well.
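For readers who want to try the approach, a symbolic regression in the same spirit can be run with the third-party gplearn package (an assumption here; the authors' multi-gene GP tool is not named in the abstract). The gasifier data below are synthetic stand-ins.

```python
# Symbolic regression via genetic programming, sketched with gplearn.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(4)
X = rng.uniform(0.2, 0.5, (200, 2))            # e.g. equivalence ratio, moisture
y = 4.0 * X[:, 0] - 3.0 * X[:, 0] * X[:, 1]    # stand-in for syngas LHV

est = SymbolicRegressor(population_size=500, generations=10,
                        function_set=('add', 'sub', 'mul'), random_state=0)
est.fit(X, y)
print(est._program)                            # the evolved symbolic expression
```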
NASA Technical Reports Server (NTRS)
Klutz, Glenn
1989-01-01
A facility was established that uses collected data and feeds it into mathematical models that generate improved data arrays by correcting for various losses, base line drift, and conversion to unity scaling. These developed data arrays have headers and other identifying information affixed and are subsequently stored in a Laser Materials and Characteristics data base which is accessible to various users. The two-part data base, absorption-emission spectra and tabulated data, is developed around twelve laser models. The tabulated section of the data base is divided into several parts: crystalline, optical, mechanical, and thermal properties; absorption and emission spectra information; chemical names and formulas; and miscellaneous. A menu-driven, language-free graphing program will reduce and/or remove the requirement that users become competent FORTRAN programmers and the concomitant requirement that they also spend several days to a few weeks becoming conversant with the GEOGRAF library and sequence of calls, and the continual refreshers of both. The work included becoming thoroughly conversant with, or at least very familiar with, GEOGRAF by GEOCOMP Corp. The development of the graphing program involved trial runs of the various callable library routines on dummy data in order to become familiar with actual implementation and sequencing. This was followed by trial runs with actual data base files and some additional data from current research that was not in the data base but currently needed graphs. After successful runs with dummy and real data using actual FORTRAN instructions, steps were undertaken to develop the menu-driven, language-free implementation of a program which would require only that the user know how to use microcomputers. The user would simply be responding to items displayed on the video screen. To assist the user in arriving at the optimum values needed for a specific graph, a paper-and-pencil checklist was made available to use on the trial runs.
Community College Dual Enrollment Faculty Orientation: A Utilization-Focused Approach
ERIC Educational Resources Information Center
Charlier, Hara D.; Duggan, Molly H.
2010-01-01
The current climate of accountability demands that institutions engage in data-driven program evaluation. In order to promote quality dual enrollment (DE) programs, institutions must support the adjunct faculty teaching college courses in high schools. This study uses Patton's utilization-focused model (1997) to conduct a formative evaluation of a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram
Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.
Software For Computing Reliability Of Other Software
NASA Technical Reports Server (NTRS)
Nikora, Allen; Antczak, Thomas M.; Lyu, Michael
1995-01-01
Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
The application of domain-driven design in NMS
NASA Astrophysics Data System (ADS)
Zhang, Jinsong; Chen, Yan; Qin, Shengjun
2011-12-01
In the traditional data-model-driven design approach, the system analysis and design phases are often separated, so requirements information cannot be expressed explicitly. The method also easily leads developers toward process-oriented programming, leaving code within and between modules and hierarchy layers disordered, so it is hard to meet system scalability requirements. The paper proposes a software hierarchy based on a rich domain model according to domain-driven design, named FHRDM; the Webwork + Spring + Hibernate (WSH) framework is then determined. Domain-driven design aims to construct a domain model which not only meets the demands of the field where the software exists but also meets the needs of software development. In this way, problems in Navigational Maritime System (NMS) development, such as large system business volumes, difficulty of requirement elicitation, high development costs, and long development cycles, can be resolved successfully.
A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty
Friedel, Michael J.
2011-01-01
This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors for the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
2015-03-01
…domains. Major model functions include: • Ground combat: light and heavy forces. • Air mobile forces. • Future forces. • Fixed-wing and rotary-wing… Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System… and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model.
Thermal Ablation Modeling for Silicate Materials
NASA Technical Reports Server (NTRS)
Chen, Yih-Kanq
2016-01-01
A general thermal ablation model for silicates is proposed. The model includes the mass losses through the balance between evaporation and condensation, and through the moving molten layer driven by surface shear force and pressure gradient. This model can be applied in the ablation simulation of the meteoroid and the glassy ablator for spacecraft Thermal Protection Systems. Time-dependent axisymmetric computations are performed by coupling the fluid dynamics code, the Data-Parallel Line Relaxation program, with the material response code, the Two-dimensional Implicit Thermal Ablation simulation program, to predict the mass loss rates and shape change. The predicted mass loss rates will be compared with available data for model validation, and parametric studies will also be performed for meteoroid earth entry conditions.
NASA Astrophysics Data System (ADS)
Dugan, H.; Hanson, P. C.; Weathers, K. C.
2016-12-01
In the water sciences there is a massive need for graduate students who possess the analytical and technical skills to deal with large datasets and function in the new paradigm of open, collaborative science. The Global Lake Ecological Observatory Network (GLEON) graduate fellowship program (GFP) was developed as an interdisciplinary training program to supplement the intensive disciplinary training of traditional graduate education. The primary goal of the GFP was to train a diverse cohort of graduate students in network science, open-web technologies, collaboration, and data analytics, and importantly to provide the opportunity to use these skills to conduct collaborative research resulting in publishable scientific products. The GFP is run as a series of three week-long workshops over two years that brings together a cohort of twelve students. In addition, fellows are expected to attend and contribute to at least one international GLEON all-hands meeting. Here, we provide examples of training modules in the GFP (model building, data QA/QC, information management, Bayesian modeling, open coding/version control, national data programs), scientific outputs (manuscripts, software products, and new global datasets) produced by the fellows, and the process by which this team science was catalyzed. Data-driven education that lets students apply learned skills to real research projects reinforces concepts, provides motivation, and can benefit their publication record. This program design is extendable to other institutions and networks.
Scholarly Concentration Program Development: A Generalizable, Data-Driven Approach.
Burk-Rafel, Jesse; Mullan, Patricia B; Wagenschutz, Heather; Pulst-Korenberg, Alexandra; Skye, Eric; Davis, Matthew M
2016-11-01
Scholarly concentration programs, also known as scholarly projects, pathways, tracks, or pursuits, are increasingly common in U.S. medical schools. However, systematic, data-driven program development methods have not been described. The authors examined scholarly concentration programs at U.S. medical schools that U.S. News & World Report ranked as top 25 for research or primary care (n = 43 institutions), coding concentrations and mission statements. Subsequently, the authors conducted a targeted needs assessment via a student-led, institution-wide survey, eliciting learners' preferences for 10 "Pathways" (i.e., concentrations) and 30 "Topics" (i.e., potential content) augmenting core curricula at their institution. Exploratory factor analysis (EFA) and a capacity optimization algorithm characterized best institutional options for learner-focused Pathway development. The authors identified scholarly concentration programs at 32 of 43 medical schools (74%), comprising 199 distinct concentrations (mean concentrations per program: 6.2, mode: 5, range: 1-16). Thematic analysis identified 10 content domains; most common were "Global/Public Health" (30 institutions; 94%) and "Clinical/Translational Research" (26 institutions; 81%). The institutional needs assessment (n = 468 medical students; response rate 60% overall, 97% among first-year students) demonstrated myriad student preferences for Pathways and Topics. EFA of Topic preferences identified eight factors, systematically related to Pathway preferences, informing content development. Capacity modeling indicated that offering six Pathways could guarantee 95% of first-year students (162/171) their first- or second-choice Pathway. This study demonstrates a generalizable, data-driven approach to scholarly concentration program development that reflects student preferences and institutional strengths, while optimizing program diversity within capacity constraints.
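The capacity-modeling step can be sketched as a standard assignment problem: expanding each Pathway into capacity-many seats lets scipy's assignment solver minimize total preference rank under capacity constraints. The preferences and capacities below are invented, not the survey data.

```python
# Capacity-constrained student-to-pathway assignment via seat expansion.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(5)
n_students, n_pathways, capacity = 30, 6, 5
# rank[i, j] = student i's preference rank for pathway j (0 = first choice)
rank = np.array([rng.permutation(n_pathways) for _ in range(n_students)])

cost = np.repeat(rank, capacity, axis=1)       # one column per seat
rows, cols = linear_sum_assignment(cost)       # minimize total preference rank
assigned = cols // capacity                    # map seat column back to pathway

got_top2 = np.mean(rank[rows, assigned] <= 1)
print(f"{100 * got_top2:.0f}% of students got a first- or second-choice pathway")
```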
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.
2013-07-15
Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
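A toy version of such a model, with two observable indicators feeding one latent "program" node, can be written with the third-party pgmpy package (an assumption here; PNNL's actual model structure and probabilities are not given in the abstract).

```python
# Toy Bayesian network: infer a latent program state from two indicators.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Program", "Imports"), ("Program", "Facilities")])
model.add_cpds(
    TabularCPD("Program", 2, [[0.95], [0.05]]),          # prior (invented)
    TabularCPD("Imports", 2, [[0.9, 0.3], [0.1, 0.7]],
               evidence=["Program"], evidence_card=[2]),
    TabularCPD("Facilities", 2, [[0.85, 0.2], [0.15, 0.8]],
               evidence=["Program"], evidence_card=[2]),
)
posterior = VariableElimination(model).query(
    ["Program"], evidence={"Imports": 1, "Facilities": 1})
print(posterior)
```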
NASA Astrophysics Data System (ADS)
Khankhasayev, Mikhail Kh.; Kurmanov, Zhanat B.; Plendl, Hans
1996-12-01
The Table of Contents for the full book PDF is as follows: * Preface * I. Review of Current Status of Nuclear Transmutation Projects * Accelerator-Driven Systems — Survey of the Research Programs in the World * The Los Alamos Accelerator-Driven Transmutation of Nuclear Waste Concept * Nuclear Waste Transmutation Program in the Czech Republic * Tentative Results of the ISTC Supported Study of the ADTT Plutonium Disposition * Recent Neutron Physics Investigations for the Back End of the Nuclear Fuel Cycle * Optimisation of Accelerator Systems for Transmutation of Nuclear Waste * Proton Linac of the Moscow Meson Factory for the ADTT Experiments * II. Computer Modeling of Nuclear Waste Transmutation Methods and Systems * Transmutation of Minor Actinides in Different Nuclear Facilities * Monte Carlo Modeling of Electro-nuclear Processes with Nonlinear Effects * Simulation of Hybrid Systems with a GEANT Based Program * Computer Study of 90Sr and 137Cs Transmutation by Proton Beam * Methods and Computer Codes for Burn-Up and Fast Transients Calculations in Subcritical Systems with External Sources * New Model of Calculation of Fission Product Yields for the ADTT Problem * Monte Carlo Simulation of Accelerator-Reactor Systems * III. Data Basis for Transmutation of Actinides and Fission Products * Nuclear Data in the Accelerator Driven Transmutation Problem * Nuclear Data to Study Radiation Damage, Activation, and Transmutation of Materials Irradiated by Particles of Intermediate and High Energies * Radium Institute Investigations on the Intermediate Energy Nuclear Data on Hybrid Nuclear Technologies * Nuclear Data Requirements in Intermediate Energy Range for Improvement of Calculations of ADTT Target Processes * IV. Experimental Studies and Projects * ADTT Experiments at the Los Alamos Neutron Science Center * Neutron Multiplicity Distributions for GeV Proton Induced Spallation Reactions on Thin and Thick Targets of Pb and U * Solid State Nuclear Track Detector and Radiochemical Studies on the Transmutation of Nuclei Using Relativistic Heavy Ions * Experimental and Theoretical Study of Radionuclide Production on the Electronuclear Plant Target and Construction Materials Irradiated by 1.5 GeV and 130 MeV Protons * Neutronics and Power Deposition Parameters of the Targets Proposed in the ISTC Project 17 * Multicycle Irradiation of Plutonium in Solid Fuel Heavy-Water Blanket of ADS * Compound Neutron Valve of Accelerator-Driven System Sectioned Blanket * Subcritical Channel-Type Reactor for Weapon Plutonium Utilization * Accelerator Driven Molten-Fluoride Reactor with Modular Heat Exchangers on PB-BI Eutectic * A New Conception of High Power Ion Linac for ADTT * Pions and Accelerator-Driven Transmutation of Nuclear Waste? * V. Problems and Perspectives * Accelerator-Driven Transmutation Technologies for Resolution of Long-Term Nuclear Waste Concerns * Closing the Nuclear Fuel-Cycle and Moving Toward a Sustainable Energy Development * Workshop Summary * List of Participants
NASA Astrophysics Data System (ADS)
Fu, Linyun; Ma, Xiaogang; Zheng, Jin; Goldstein, Justin; Duggan, Brian; West, Patrick; Aulenbach, Steve; Tilmes, Curt; Fox, Peter
2014-05-01
This poster will show how we used a case-driven iterative methodology to develop an ontology to represent the content structure and the associated provenance information in a National Climate Assessment (NCA) report of the US Global Change Research Program (USGCRP). We applied the W3C PROV-O ontology to implement a formal representation of provenance. We argue that the use case-driven, iterative development process and the application of a formal provenance ontology help efficiently incorporate domain knowledge from earth and environmental scientists in a well-structured model interoperable in the context of the Web of Data.
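The PROV-O usage can be sketched with rdflib (assumed available): provenance statements become RDF triples using terms such as prov:wasDerivedFrom. The entity names below are illustrative, not the actual NCA ontology terms.

```python
# Recording report provenance as PROV-O triples with rdflib.
from rdflib import Graph, Namespace

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/nca/")           # hypothetical namespace

g = Graph()
g.bind("prov", PROV)
figure = EX["figure-2-1"]
dataset = EX["temperature-dataset"]
g.add((figure, PROV.wasDerivedFrom, dataset))          # figure derived from data
g.add((dataset, PROV.wasAttributedTo, EX["usgcrp"]))   # data attributed to agency

print(g.serialize(format="turtle"))
```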
Texas Automated Buoy System 1995-2005 and Beyond
NASA Astrophysics Data System (ADS)
Guinasso, N. L.; Bender, L. C.; Walpert, J. N.; Lee, L. L.; Campbell, L.; Hetland, R. D.; Howard, M. K.; Martin, R. D.
2005-05-01
TABS was established in 1995 to provide data to assess oil spill movement along the Texas coast for the Texas General Land Office Oil Spill Prevention and Response Program. A system of nine automated buoys provides wind and current data in near real time. Two of these buoys are supported by the Flower Garden Banks Joint Industry Program. A TABS web site provides a public interface to view and download the data. A real time data analysis web page presents a wide variety of useful data products derived from the field measurements. Integration efforts now underway include transfer of buoy data to the National Data Buoy Center for quality control and incorporation into the Global Telecommunications Stream. The TGLO ocean circulation nowcast/forecast modeling system has been in continuous operation since 1998. Two models, POM and ROMS, are used to produce forecasts of near-surface wind driven currents up to 48 hours into the future. Both models are driven using wind fields obtained from the NAM (formerly Eta) forecast models operated by NOAA NCEP. Wind and current fields are displayed on websites in both static and animated forms and are updated four times per day. Under funding from the SURA/SCOOP program we are: (1) revamping the system to conform with the evolving Data Management and Communications (DMAC) framework adopted by the NSF Orion and OCEAN.US IOOS programs, (2) producing model-data comparisons, and (3) integrating the wind and current fields into the GNOME oil trajectory model used by NOAA/Hazmat. Academic research is planned to assimilate near real-time observations from TABS buoys and some 30-40 ADCP instruments scheduled to be mounted on offshore oil platforms in early 2005. Texas Automated Buoy System (TABS) and its associated modeling efforts provide a reliable source of accurate, up-to-date information on currents along the Texas coast. As the nation embarks on the development of an Integrated Ocean Observing System (IOOS), TABS will be an active participant as a foundational regional component to the national backbone of ocean observations.
Modeling Laser-Driven Laboratory Astrophysics Experiments Using the CRASH Code
NASA Astrophysics Data System (ADS)
Grosskopf, Michael; Keiter, P.; Kuranz, C. C.; Malamud, G.; Trantham, M.; Drake, R.
2013-06-01
Laser-driven, laboratory astrophysics experiments can provide important insight into the physical processes relevant to astrophysical systems. The radiation hydrodynamics code developed by the Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan has been used to model experimental designs for high-energy-density laboratory astrophysics campaigns on OMEGA and other high-energy laser facilities. This code is an Eulerian, block-adaptive AMR hydrodynamics code with implicit multigroup radiation transport and electron heat conduction. The CRASH model has been used on many applications including: radiative shocks, Kelvin-Helmholtz and Rayleigh-Taylor experiments on the OMEGA laser; as well as laser-driven ablative plumes in experiments by the Astrophysical Collisionless Shocks Experiments with Lasers (ACSEL) collaboration. We report a series of results with the CRASH code in support of design work for upcoming high-energy-density physics experiments, as well as comparison between existing experimental data and simulation results. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DE-FC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.
Genetic Programming for Automatic Hydrological Modelling
NASA Astrophysics Data System (ADS)
Chadalawada, Jayashree; Babovic, Vladan
2017-04-01
One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involves the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volumes of data and high levels of uncertainty. Each of the existing hydrological models varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is influenced by prior understanding and data include: choice of the technique for the induction of knowledge from data, identification of alternative structural hypotheses, definition of rules and constraints for meaningful, intelligent combination of model component hypotheses, and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs based on a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired from the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach for conceptual hydrological modeling: 1. Motivation and theoretical development, Water Resources Research, 47(11).
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
Model-Driven Development for scientific computing. An upgrade of the RHEEDGr program
NASA Astrophysics Data System (ADS)
Daniluk, Andrzej
2009-11-01
Model-Driven Engineering (MDE) is the software engineering discipline which considers models the most important element for software development, and for the maintenance and evolution of software, through model transformation. Model-Driven Architecture (MDA) is the approach to software development under the Model-Driven Engineering framework. This paper surveys the core MDA technology that was used to upgrade the RHEEDGR program to C++0x language standards. New version program summary: Program title: RHEEDGR-09; Catalogue identifier: ADUY_v3_0; Program summary URL:
Real-Time MENTAT programming language and architecture
NASA Technical Reports Server (NTRS)
Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.
1989-01-01
Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.
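The data-driven firing rule underlying this computation model can be illustrated with a toy scheduler: a node executes as soon as all of its input tokens are available. This is a deliberately simple stand-in, not MENTAT's runtime.

```python
# Toy data-flow execution: nodes fire when all input tokens have arrived.
def fire_when_ready(graph, tokens):
    """graph: node -> (inputs, function); tokens: name -> value."""
    done = set()
    while len(done) < len(graph):
        for node, (inputs, fn) in graph.items():
            if node not in done and all(i in tokens for i in inputs):
                tokens[node] = fn(*[tokens[i] for i in inputs])  # node fires
                done.add(node)
    return tokens

graph = {
    "sum":  (("a", "b"), lambda a, b: a + b),
    "prod": (("a", "b"), lambda a, b: a * b),
    "out":  (("sum", "prod"), lambda s, p: s - p),   # waits on both producers
}
print(fire_when_ready(graph, {"a": 3, "b": 4})["out"])   # -> -5
```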
Strategies for concurrent processing of complex algorithms in data driven architectures
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.
1988-01-01
The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and which communicates with a common global data memory. A new graph theoretic model called ATAMM which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
Re-Defining Language Teacher Cognition through a Data-Driven Model: The Case of Three EFL Teachers
ERIC Educational Resources Information Center
Öztürk, Gökhan; Gürbüz, Nurdan
2017-01-01
This study examined the main sources of the participant English as a foreign language (EFL) teachers' cognitions, their classroom practices and the impact of institutional context on these practices. The participants included three Turkish EFL instructors working at English preparatory programs at university level. The data were collected through…
Functional language and data flow architectures
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Patel, D. R.; Lang, T.
1983-01-01
This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages, and sequencing models such as data-flow, demand-driven and reduction which are essential at the machine organization level. Several examples of highly concurrent machines are described.
NASA Astrophysics Data System (ADS)
Rock, B. N.; Hale, S. R.; Graham, K. J.; Hayden, L.; Barber, L.; Perry, C.; Schloss, J.; Sullivan, E.; Yuan, J.; Abebe, E.; Mitchell, L.; Abrams, E.; Gagnon, M.
2008-12-01
Watershed Watch (NSF 0525433) engages early undergraduate students from two-year and four-year colleges in student-driven, full inquiry-based instruction in the biogeosciences. Program goals for Watershed Watch are to test if inquiry-rich student-driven projects sufficiently engage undeclared students (or noncommittal STEM majors) to declare a STEM major (or remain with their STEM major). A significant component of this program is an intensive two-week Summer course, in which undeclared freshmen research various aspects of a local watershed. Students develop their own research questions and study design, collect and analyze data, and produce a scientific poster or an oral presentation. The course objectives, curriculum and schedule are presented as a model for dissemination for other institutions and programs seeking to develop inquiry-rich courses designed to attract students into biogeoscience disciplines. Data from self-reported student feedback indicate the most important factors explaining high levels of student motivation and research excellence in the course are 1) working with committed, energetic, and enthusiastic faculty mentors; and 2) faculty mentors demonstrating high degrees of teamwork and coordination.
Validation of buoyancy driven spectral tensor model using HATS data
NASA Astrophysics Data System (ADS)
Chougule, A.; Mann, J.; Kelly, M.; Larsen, G. C.
2016-09-01
We present a homogeneous spectral tensor model for wind velocity and temperature fluctuations, driven by mean vertical shear and mean temperature gradient. Results from the model, including one-dimensional velocity and temperature spectra and the associated co-spectra, are shown in this paper. The model also reproduces two-point statistics, such as coherence and phases, via cross-spectra between two points separated in space. Model results are compared with observations from the Horizontal Array Turbulence Study (HATS) field program (Horst et al. 2004). The spectral velocity tensor in the model is described via five parameters: the dissipation rate (ɛ), length scale of energy-containing eddies (L), a turbulence anisotropy parameter (Γ), gradient Richardson number (Ri) representing the atmospheric stability and the rate of destruction of temperature variance (ηθ).
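The two-point statistics mentioned here, coherence and cross-spectral phase, can be computed for a pair of signals with scipy.signal; the synthetic signals below merely stand in for velocity measurements at two separated points.

```python
# Magnitude-squared coherence and cross-spectral phase between two signals.
import numpy as np
from scipy.signal import coherence, csd

fs = 20.0                                   # sampling frequency, Hz (assumed)
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(6)
common = np.sin(2 * np.pi * 0.1 * t)        # shared large-eddy component
u1 = common + 0.5 * rng.standard_normal(t.size)
u2 = np.roll(common, 40) + 0.5 * rng.standard_normal(t.size)   # separated point

f, Cxy = coherence(u1, u2, fs=fs, nperseg=1024)
f, Pxy = csd(u1, u2, fs=fs, nperseg=1024)
phase = np.angle(Pxy)                       # cross-spectral phase, radians
print("coherence near 0.1 Hz:", Cxy[np.argmin(np.abs(f - 0.1))])
```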
Logic integer programming models for signaling networks.
Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert
2009-05-01
We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
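A flavor of how logic statements become integer programs: the standard linear encoding of y = x1 AND x2 uses the constraints y <= x1, y <= x2, y >= x1 + x2 - 1 with binary variables. The brute-force check below verifies that encoding against the truth table; a real signaling-network model would hand such constraints to an IP solver.

```python
# Verify the integer-programming encoding of a logic AND gate.
from itertools import product

for x1, x2 in product((0, 1), repeat=2):
    feasible_y = [y for y in (0, 1)
                  if y <= x1 and y <= x2 and y >= x1 + x2 - 1]
    assert feasible_y == [x1 & x2]          # exactly the logical AND survives
    print(f"x1={x1} x2={x2} -> y={feasible_y[0]}")
```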
NASA Astrophysics Data System (ADS)
Wasser, L. A.; Gold, A. U.
2017-12-01
There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The Earth analytics education program, a core component of Earth Lab at the University of Colorado Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate / degree program. All programs share the goal of preparing a STEM workforce for successful earth analytics driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate/graduate-level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, combining synchronous in-person teaching and active hands-on classroom learning with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO optimization on online content reach and program visibility.
Data-Driven Hint Generation in Vast Solution Spaces: A Self-Improving Python Programming Tutor
ERIC Educational Resources Information Center
Rivers, Kelly; Koedinger, Kenneth R.
2017-01-01
To provide personalized help to students who are working on code-writing problems, we introduce a data-driven tutoring system, ITAP (Intelligent Teaching Assistant for Programming). ITAP uses state abstraction, path construction, and state reification to automatically generate personalized hints for students, even when given states that have not…
A Monthly Water-Balance Model Driven By a Graphical User Interface
McCabe, Gregory J.; Markstrom, Steven L.
2007-01-01
This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
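The Thornthwaite potential-evapotranspiration computation at the heart of such a monthly water balance can be sketched compactly; the version below omits the day-length correction, and the monthly temperatures are illustrative.

```python
# Thornthwaite PET sketch (uncorrected for day length).
def thornthwaite_pet(monthly_temps_c):
    """Return monthly PET in mm for 12 mean monthly temperatures (deg C)."""
    heat_index = sum((max(t, 0.0) / 5.0) ** 1.514 for t in monthly_temps_c)
    a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
         + 1.792e-2 * heat_index + 0.49239)
    return [16.0 * (10.0 * max(t, 0.0) / heat_index) ** a
            for t in monthly_temps_c]

temps = [-2, 0, 5, 10, 15, 20, 23, 22, 17, 11, 4, -1]   # example climate
print([round(p, 1) for p in thornthwaite_pet(temps)])
```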
Pharmacy Educator Motives to Pursue Pedagogical Knowledge.
Baia, Patricia; Strang, Aimee F
2016-10-25
Objective. To investigate motives of pharmacy educators who pursue pedagogical knowledge through professional development programs and to develop a model of motivation to inform future development. Methods. A mixed-methods approach was used to study both qualitative and quantitative data. Written narratives, postmodule quizzes, and survey data were collected during a 5-year period (2010-2014) from pharmacy educators who participated in an online professional development program titled Helping Educators Learn Pedagogy (HELP). Grounded theory was used to create a model of motivation for why pharmacy educators might pursue pedagogical knowledge. Results. Participants reported being driven intrinsically by a passion for their own learning (self-centered motivation) and by the need to improve student learning (student-centered motivation) and extrinsically by program design, funding, and administrator encouragement. Conclusion. A new model of pharmacy educator motivation to pursue pedagogy knowledge, Pedagogical Knowledge Acquisition Theory (PKAT), emerged as a blended intrinsic and extrinsic model, which may have value in developing future professional development programs.
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients results in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Data-free and data-driven spectral perturbations for RANS UQ
NASA Astrophysics Data System (ADS)
Edeling, Wouter; Mishra, Aashwin; Iaccarino, Gianluca
2017-11-01
Despite recent developments in high-fidelity turbulent flow simulations, RANS modeling is still vastly used by industry, due to its inherent low cost. Since accuracy is a concern in RANS modeling, model-form UQ is an essential tool for assessing the impacts of this uncertainty on quantities of interest. Applying the spectral decomposition to the modeled Reynolds-Stress Tensor (RST) allows for the introduction of decoupled perturbations into the baseline intensity (kinetic energy), shape (eigenvalues), and orientation (eigenvectors). This constitutes a natural methodology to evaluate the model-form uncertainty associated with different aspects of RST modeling. In a predictive setting, one frequently encounters an absence of any relevant reference data. To make data-free predictions with quantified uncertainty, we employ physical bounds to define maximum spectral perturbations a priori. When propagated, these perturbations yield intervals of engineering utility. High-fidelity data open up the possibility of inferring a distribution of uncertainty by means of various data-driven machine-learning techniques. We will demonstrate our framework on a number of flow problems where RANS models are prone to failure. This research was partially supported by the Defense Advanced Research Projects Agency under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo), and the DOE PSAAP-II program.
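The eigenspace perturbation idea can be illustrated compactly. The sketch below (not the authors' code) decomposes a modeled Reynolds-stress tensor into intensity, shape, and orientation, then nudges the eigenvalues toward the isotropic limiting state; the perturbation magnitude `delta` is an assumed free parameter.

```python
# A minimal sketch of eigenspace perturbation for Reynolds-stress model-form UQ.
import numpy as np

def perturb_rst(R, delta=0.3):
    """Perturb a 3x3 modeled Reynolds-stress tensor R toward isotropy by delta."""
    k = 0.5 * np.trace(R)                 # intensity: turbulent kinetic energy
    a = R / (2.0 * k) - np.eye(3) / 3.0   # anisotropy tensor
    lam, V = np.linalg.eigh(a)            # shape (eigenvalues), orientation (eigenvectors)
    lam_iso = np.zeros(3)                 # isotropic limiting state
    lam_new = (1.0 - delta) * lam + delta * lam_iso   # convex combination stays realizable
    a_new = V @ np.diag(lam_new) @ V.T
    return 2.0 * k * (a_new + np.eye(3) / 3.0)

R = np.array([[2.0, 0.3, 0.0], [0.3, 1.0, 0.1], [0.0, 0.1, 0.5]])
print(perturb_rst(R))
```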
Nguyen, Quoc-Thang; Miledi, Ricardo
2003-09-30
Current computer programs for intracellular recordings often lack advanced data management, are usually incompatible with other applications and are also difficult to adapt to new experiments. We have addressed these shortcomings in e-Phys, a suite of electrophysiology applications for intracellular recordings. The programs in e-Phys use Component Object Model (COM) technologies available in the Microsoft Windows operating system to provide enhanced data storage, increased interoperability between e-Phys and other COM-aware applications, and easy customization of data acquisition and analysis thanks to a script-based integrated programming environment. Data files are extensible, hierarchically organized and integrated in the Windows shell by using the Structured Storage technology. Data transfers to and from other programs are facilitated by implementing the ActiveX Automation standard and distributed COM (DCOM). ActiveX Scripting allows experimenters to write their own event-driven acquisition and analysis programs in the VBScript language from within e-Phys. Scripts can reuse components available from other programs on other machines to create distributed meta-applications. This paper describes the main features of e-Phys and how this package was used to determine the effect of the atypical antipsychotic drug clozapine on synaptic transmission at the neuromuscular junction.
Macro-actor execution on multilevel data-driven architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaudiot, J.L.; Najjar, W.
1988-12-31
The data-flow model of computation brings high programmability to multiprocessors at the expense of increased overhead. Applying the model at a higher level leads to better performance but also introduces a loss of parallelism. We demonstrate here syntax-directed program decomposition methods for the creation of large macro-actors in numerical algorithms. In order to alleviate some of the problems introduced by the lower-resolution interpretation, we describe a multilevel resolution scheme and analyze the requirements for its actual hardware and software integration.
Data-Driven Hint Generation from Peer Debugging Solutions
ERIC Educational Resources Information Center
Liu, Zhongxiu
2015-01-01
Data-driven methods have been a successful approach to generating hints for programming problems. However, the majority of previous studies are focused on procedural hints that aim at moving students to the next closest state to the solution. In this paper, I propose a data-driven method to generate remedy hints for BOTS, a game that teaches…
Air-Breathing Hypersonic Vehicle Tracking Control Based on Adaptive Dynamic Programming.
Mu, Chaoxu; Ni, Zhen; Sun, Changyin; He, Haibo
2017-03-01
In this paper, we propose a data-driven supplementary control approach with adaptive learning capability for air-breathing hypersonic vehicle tracking control based on action-dependent heuristic dynamic programming (ADHDP). The control action is generated by the combination of sliding mode control (SMC) and the ADHDP controller to track the desired velocity and the desired altitude. In particular, the ADHDP controller observes the differences between the actual velocity/altitude and the desired velocity/altitude, and then provides a supplementary control action accordingly. The ADHDP controller does not rely on an accurate mathematical model function and is data driven. Meanwhile, it is capable of adjusting its parameters online over time under various working conditions, which makes it well suited to hypersonic vehicle systems with parameter uncertainties and disturbances. We verify the adaptive supplementary control approach against traditional SMC in cruising flight, and provide three simulation studies to illustrate the improved performance of the proposed approach.
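A minimal numpy sketch of the action-dependent HDP ingredient — a critic that learns Q(x, u) from observed data via temporal-difference updates — is given below. The linear critic, feature choice, and learning constants are illustrative assumptions, not the paper's controller.

```python
# A minimal sketch of the ADHDP idea: a critic learns Q(x, u) from data alone.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=5)   # linear critic weights over features of (x, u)

def features(x, u):
    return np.array([x[0]**2, x[1]**2, u**2, x[0]*u, x[1]*u])

def critic(x, u):
    return W @ features(x, u)

def td_update(x, u, cost, x_next, u_next, lr=0.01, gamma=0.95):
    """One temporal-difference step on the critic weights; returns the TD error."""
    global W
    target = cost + gamma * critic(x_next, u_next)
    err = critic(x, u) - target
    W -= lr * err * features(x, u)
    return err

x, x_next = np.array([1.0, 0.5]), np.array([0.9, 0.45])
print(td_update(x, -0.2, 0.1, x_next, -0.18))
```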
Varying execution discipline to increase performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, P.L.; Maccabe, A.B.
1993-12-22
This research investigates the relationship between execution discipline and performance. The hypothesis has two parts: 1. Different execution disciplines exhibit different performance for different computations, and 2. These differences can be effectively predicted by heuristics. A machine model is developed that can vary its execution discipline. That is, the model can execute a given program using either the control-driven, data-driven or demand-driven execution discipline. This model is referred to as a "variable-execution-discipline" machine. The instruction set for the model is the Program Dependence Web (PDW). The first part of the hypothesis will be tested by simulating the execution of the machine model on a suite of computations, based on the Livermore Fortran Kernel (LFK) Test (a.k.a. the Livermore Loops), using all three execution disciplines. Heuristics are developed to predict relative performance. These heuristics predict (a) the execution time under each discipline for one iteration of each loop and (b) the number of iterations taken by that loop; then the heuristics use those predictions to develop a prediction for the execution of the entire loop. Similar calculations are performed for branch statements. The second part of the hypothesis will be tested by comparing the results of the simulated execution with the predictions produced by the heuristics. If the hypothesis is supported, then the door is open for the development of machines that can vary execution discipline to increase performance.
An Effective Assessment Model for Implementing Change and Improving Learning
ERIC Educational Resources Information Center
Mince, Rose; Ebersole, Tara
2008-01-01
Assessment at Community College of Baltimore County (CCBC) involves asking the right questions and using data to determine what changes should be implemented to enhance student learning. Guided by a 5-stage design, CCBC's assessment program is faculty-driven, risk-free, and externally validated. Curricular and pedagogical changes have resulted in…
Propeller aircraft interior noise model: User's manual for computer program
NASA Technical Reports Server (NTRS)
Wilby, E. G.; Pope, L. D.
1985-01-01
A computer program entitled PAIN (Propeller Aircraft Interior Noise) has been developed to permit calculation of the sound levels in the cabin of a propeller-driven airplane. The fuselage is modeled as a cylinder with a structurally integral floor, the cabin sidewall and floor being stiffened by ring frames, stringers and floor beams of arbitrary configurations. The cabin interior is covered with acoustic treatment and trim. The propeller noise consists of a series of tones at harmonics of the blade passage frequency. Input data required by the program include the mechanical and acoustical properties of the fuselage structure and sidewall trim. Also, the precise propeller noise signature must be defined on a grid that lies in the fuselage skin. The propeller data are generated with a propeller noise prediction program such as the NASA Langley ANOPP program. The program PAIN permits the calculation of the space-average interior sound levels for the first ten harmonics of a propeller rotating alongside the fuselage. User instructions for PAIN are given in the report. Development of the analytical model is presented in NASA CR 3813.
Goel, Purva; Bapat, Sanket; Vyas, Renu; Tambe, Amruta; Tambe, Sanjeev S
2015-11-13
The development of quantitative structure-retention relationships (QSRR) aims at constructing an appropriate linear/nonlinear model for the prediction of the retention behavior (such as the Kovats retention index) of a solute on a chromatographic column. Commonly, multiple linear regression and artificial neural networks are used in QSRR development in gas chromatography (GC). In this study, an artificial intelligence based data-driven modeling formalism, namely genetic programming (GP), has been introduced for the development of quantitative structure based models predicting Kovats retention indices (KRI). The novelty of the GP formalism is that, given an example dataset, it searches and optimizes both the form (structure) and the parameters of an appropriate linear/nonlinear data-fitting model. Thus, it is not necessary to pre-specify the form of the data-fitting model in GP-based modeling. These models are also less complex, simple to understand, and easy to deploy. The effectiveness of GP in constructing QSRRs has been demonstrated by developing models predicting KRIs of light hydrocarbons (case study I) and adamantane derivatives (case study II). In each case study, two-, three- and four-descriptor models have been developed using the KRI data available in the literature. The results of these studies clearly indicate that the GP-based models possess an excellent KRI prediction accuracy and generalization capability. Specifically, the best performing four-descriptor models in both case studies have yielded high (>0.9) values of the coefficient of determination (R(2)) and low values of root mean squared error (RMSE) and mean absolute percent error (MAPE) for training, test and validation set data. The characteristic feature of this study is that it introduces a practical and effective GP-based method for developing QSRRs in gas chromatography that can be gainfully utilized for developing other types of data-driven models in chromatography science.
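A hedged re-creation of GP-based QSRR development is possible with the open-source gplearn library (assumed here; the study used its own GP implementation). The descriptor matrix and retention indices below are synthetic stand-ins for real molecular data.

```python
# A sketch of GP-based QSRR modeling with gplearn: the GP evolves both the form
# and the parameters of the data-fitting model, as in the study's formalism.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(1)
X = rng.uniform(size=(60, 4))                      # four molecular descriptors (synthetic)
y = 300 + 120 * X[:, 0] + 80 * X[:, 1] * X[:, 2]   # synthetic Kovats-like index

gp = SymbolicRegressor(population_size=500, generations=20,
                       function_set=('add', 'sub', 'mul', 'div'),
                       parsimony_coefficient=0.001, random_state=0)
gp.fit(X[:45], y[:45])
print(gp._program)                 # evolved symbolic model (structure + parameters)
print(gp.score(X[45:], y[45:]))    # R^2 on held-out data
```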
2014-06-01
…from the ODM standard. Leveraging SPARX EA's Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL… The database server MySQL creates the physical schema that enables a user to store and retrieve data conforming to the vocabulary of the JC3IEDM.
Souza, W.R.
1987-01-01
This report documents a graphical display program for the U.S. Geological Survey finite-element groundwater flow and solute transport model. Graphic features of the program, SUTRA-PLOT (SUTRA: saturated/unsaturated transport), include: (1) plots of the finite-element mesh, (2) velocity vector plots, (3) contour plots of pressure, solute concentration, temperature, or saturation, and (4) a finite-element interpolator for gridding data prior to contouring. SUTRA-PLOT is written in FORTRAN 77 on a PRIME 750 computer system, and requires Version 9.0 or higher of the DISSPLA graphics library. The program requires two input files: the SUTRA input data list and the SUTRA simulation output listing. The program is menu driven, and specifications for individual types of plots are entered and may be edited interactively. Installation instructions, a source code listing, and a description of the computer code are given. Six examples of plotting applications are used to demonstrate various features of the plotting program. (Author's abstract)
Interactive graphical system for small-angle scattering analysis of polydisperse systems
NASA Astrophysics Data System (ADS)
Konarev, P. V.; Volkov, V. V.; Svergun, D. I.
2016-09-01
A program suite for one-dimensional small-angle scattering analysis of polydisperse systems and multiple data sets is presented. The main program, POLYSAS, has a menu-driven graphical user interface calling computational modules from the ATSAS package to perform data treatment and analysis. The graphical menu interface allows one to process multiple (time-, concentration- or temperature-dependent) data sets and interactively change the parameters for the data modelling using sliders. The graphical representation of the data is done via the Winteracter-based program SASPLOT. The package is designed for the analysis of polydisperse systems and mixtures, and permits one to obtain size distributions and evaluate the volume fractions of the components using linear and non-linear fitting algorithms as well as model-independent singular value decomposition. The use of the POLYSAS package is illustrated by recent examples of its application to study concentration-dependent oligomeric states of proteins and the time kinetics of polymer micelles for anticancer drug delivery.
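The mixture-analysis idea — expressing a measured curve as a non-negative linear combination of component scattering curves — can be sketched in a few lines of Python; POLYSAS itself wraps ATSAS modules and offers SVD and nonlinear fits beyond this. The curves below are toy data.

```python
# A minimal sketch of estimating component volume fractions from a measured
# small-angle scattering curve via non-negative least squares.
import numpy as np
from scipy.optimize import nnls

q = np.linspace(0.01, 0.5, 200)
I1 = np.exp(-(q * 30) ** 2 / 3)       # toy scattering curve of component 1
I2 = np.exp(-(q * 60) ** 2 / 3)       # toy scattering curve of component 2
I_mix = 0.7 * I1 + 0.3 * I2           # "measured" mixture data

A = np.column_stack([I1, I2])         # basis of known component curves
fractions, residual = nnls(A, I_mix)  # constrained to non-negative weights
print(fractions)                      # ~ [0.7, 0.3]
```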
Zhang, Yong; Huo, Meirong; Zhou, Jianping; Xie, Shaofei
2010-09-01
This study presents PKSolver, a freely available menu-driven add-in program for Microsoft Excel written in Visual Basic for Applications (VBA), for solving basic problems in pharmacokinetic (PK) and pharmacodynamic (PD) data analysis. The program provides a range of modules for PK and PD analysis including noncompartmental analysis (NCA), compartmental analysis (CA), and pharmacodynamic modeling. Two special built-in modules, multiple absorption sites (MAS) and enterohepatic circulation (EHC), were developed for fitting the double-peak concentration-time profile based on the classical one-compartment model. In addition, twenty frequently used pharmacokinetic functions were encoded as a macro and can be directly accessed in an Excel spreadsheet. To evaluate the program, a detailed comparison of modeling PK data using PKSolver and the professional PK/PD software packages WinNonlin and Scientist was performed. The results showed that the parameters estimated with PKSolver were satisfactory. In conclusion, PKSolver simplifies the PK and PD data analysis process, and its output can be generated in Microsoft Word in the form of an integrated report. The program provides pharmacokinetic researchers with a fast and easy-to-use tool for routine and basic PK and PD data analysis with a user-friendly interface.
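The kind of compartmental fit PKSolver automates can be illustrated with scipy; PKSolver itself is a VBA add-in for Excel, so the code below is an illustrative re-creation of a one-compartment oral-absorption fit with an assumed dose and toy concentration data.

```python
# A minimal sketch of a one-compartment oral-absorption PK fit.
import numpy as np
from scipy.optimize import curve_fit

def one_compartment_oral(t, ka, ke, V_F):
    dose = 100.0  # assumed dose (mg)
    # Standard first-order absorption / first-order elimination profile.
    return dose * ka / (V_F * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.array([0.5, 1, 2, 4, 6, 8, 12, 24])                    # hours
conc = np.array([2.8, 4.3, 5.1, 4.2, 3.1, 2.3, 1.2, 0.25])    # toy data (mg/L)
params, _ = curve_fit(one_compartment_oral, t, conc, p0=[1.0, 0.2, 10.0])
print(dict(zip(["ka", "ke", "V_F"], params)))
```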
An Integrated Analysis-Test Approach
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2003-01-01
This viewgraph presentation provides an overview of a project to develop a computer program that integrates data analysis and test procedures. The software application aims to offer a new perspective on traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also be usable on portable devices and allow 'quasi-real time' analysis of data sent by electronic means. Test methods reviewed during this presentation include: shaker swept-sine and random tests, shaker shock mode tests, shaker base-driven modal survey tests, and acoustic tests.
Thermal Ablation Modeling for Silicate Materials
NASA Technical Reports Server (NTRS)
Chen, Yih-Kanq
2016-01-01
A thermal ablation model for silicates is proposed. The model includes the mass losses through the balance between evaporation and condensation, and through the moving molten layer driven by surface shear force and pressure gradient. This model can be applied in ablation simulations of meteoroids or glassy Thermal Protection Systems for spacecraft. Time-dependent axisymmetric computations are performed by coupling the fluid dynamics code, the Data-Parallel Line Relaxation program, with the material response code, the Two-dimensional Implicit Thermal Ablation simulation program, to predict mass loss rates and shape change. For model validation, the surface recession of a fused amorphous quartz rod is computed, and the recession predictions reasonably agree with available data. The present parametric studies for two groups of meteoroid Earth entry conditions indicate that the mass loss through the moving molten layer is negligibly small for heat-flux conditions around 1 MW/cm^2.
Data-Driven Engineering of Social Dynamics: Pattern Matching and Profit Maximization
Peng, Huan-Kai; Lee, Hao-Chih; Pan, Jia-Yu; Marculescu, Radu
2016-01-01
In this paper, we define a new problem related to social media, namely, the data-driven engineering of social dynamics. More precisely, given a set of observations from the past, we aim at finding the best short-term intervention that can lead to predefined long-term outcomes. Toward this end, we propose a general formulation that covers two useful engineering tasks as special cases, namely, pattern matching and profit maximization. By incorporating a deep learning model, we derive a solution using convex relaxation and quadratic-programming transformation. Moreover, we propose a data-driven evaluation method in place of the expensive field experiments. Using a Twitter dataset, we demonstrate the effectiveness of our dynamics engineering approach for both pattern matching and profit maximization, and study the multifaceted interplay among several important factors of dynamics engineering, such as solution validity, pattern-matching accuracy, and intervention cost. Finally, the method we propose is general enough to work with multi-dimensional time series, so it can potentially be used in many other applications. PMID:26771830
Johnson, Douglas H.; Cook, R.D.
2013-01-01
In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.
Data-driven modeling, control and tools for cyber-physical energy systems
NASA Astrophysics Data System (ADS)
Behl, Madhur
Energy systems are experiencing a gradual but substantial change in moving away from being non-interactive and manually controlled systems to utilizing tight integration of both cyber (computation, communications, and control) and physical representations guided by first-principles-based models, at all scales and levels. Furthermore, peak power reduction programs like demand response (DR) are becoming increasingly important as the volatility on the grid continues to increase due to regulation, integration of renewables, and extreme weather conditions. In order to shield themselves from the risk of price volatility, end-user electricity consumers must monitor electricity prices and be flexible in the ways they choose to use electricity. This requires the use of control-oriented predictive models of an energy system's dynamics and energy consumption. Such models are needed for understanding and improving the overall energy efficiency and operating costs. However, learning dynamical models using grey/white-box approaches is cost- and time-prohibitive, since it often requires significant financial investments in retrofitting the system with several sensors and hiring domain experts for building the model. We present the use of data-driven methods for making model capture easy and efficient for cyber-physical energy systems. We develop Model-IQ, a methodology for analysis of uncertainty propagation for building inverse modeling and controls. Given a grey-box model structure and real input data from a temporary set of sensors, Model-IQ evaluates the effect of the uncertainty propagation from sensor data to model accuracy and to closed-loop control performance. We also developed a statistical method to quantify the bias in the sensor measurement and to determine near-optimal sensor placement and density for accurate data collection for model training and control. Using a real building test-bed, we show how performing an uncertainty analysis can reveal trends about inverse model accuracy and control performance, which can be used to make informed decisions about sensor requirements and data accuracy. We also present DR-Advisor, a data-driven demand response recommender system for the building's facilities manager, which provides suitable control actions to meet the desired load curtailment while maintaining operations and maximizing the economic reward. We develop a model-based control with regression trees algorithm (mbCRT), which allows us to perform closed-loop control for DR strategy synthesis for large commercial buildings. Our data-driven control synthesis algorithm outperforms rule-based demand response methods for a large DoE commercial reference building and leads to a significant amount of load curtailment (380 kW) and over $45,000 in savings, which is 37.9% of the summer energy bill for the building. The performance of DR-Advisor is also evaluated for 8 buildings on Penn's campus, where it achieves 92.8% to 98.9% prediction accuracy. We also compare DR-Advisor with other data-driven methods; it ranks 2nd on ASHRAE's benchmarking dataset for energy prediction.
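The regression-tree idea behind DR strategy synthesis can be sketched as follows (the actual mbCRT algorithm is considerably more involved): learn demand as a function of disturbance and control features, then query candidate setpoints during a DR event. All data and features below are illustrative assumptions.

```python
# A minimal sketch of regression-tree-based DR strategy selection.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
outdoor_temp = rng.uniform(20, 35, 500)                       # disturbance feature
setpoint = rng.uniform(21, 26, 500)                           # controllable feature
power_kw = 200 + 12 * (outdoor_temp - setpoint) + rng.normal(0, 5, 500)

X = np.column_stack([outdoor_temp, setpoint])
tree = DecisionTreeRegressor(max_depth=6).fit(X, power_kw)

# During a DR event at 32 C outside, pick the setpoint with the lowest predicted load.
candidates = np.column_stack([np.full(6, 32.0), np.linspace(21, 26, 6)])
best = candidates[np.argmin(tree.predict(candidates))]
print(best)
```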
A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction
NASA Astrophysics Data System (ADS)
Danandeh Mehr, Ali; Kahya, Ercan
2017-06-01
Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient of the model tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP, and eventually the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared to the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models as benchmarks, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution of noteworthy practical importance. In addition, the approach allows the user to bring human insight into the problem, examine evolved models, and pick out the best-performing programs for further analysis.
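The moving-average pre-processing ingredient is simple enough to show directly; the window length here is an assumption.

```python
# A minimal sketch of the moving-average filter applied to a streamflow series
# before GP model induction, to reduce the lagged-prediction effect.
import numpy as np

def moving_average(series, window=3):
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid")

flow = np.array([12.0, 14.5, 13.2, 20.1, 35.4, 28.0, 22.3, 18.9])  # toy daily flows
print(moving_average(flow))   # smoothed series fed to the MGGP model
```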
Model-Driven Engineering of Machine Executable Code
NASA Astrophysics Data System (ADS)
Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira
Implementing static analyses of machine-level executable code is labor intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs doing static analyses. Further, we report on important lessons learned on the benefits and drawbacks while using the following technologies: using the Scala programming language as target of code generation, using XML-Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint like tool. Finally, we report on the use of Prolog for writing model transformations.
NASA Astrophysics Data System (ADS)
Deines, J. M.; Kendall, A. D.; Butler, J. J., Jr.; Hyndman, D. W.
2017-12-01
Irrigation greatly enhances agricultural yields and stabilizes farmer incomes, but overexploitation of water resources has depleted groundwater aquifers around the globe. In much of the High Plains Aquifer (HPA) in the United States, water-level declines threaten the continued viability of agricultural operations reliant on irrigation. Policy and management institutions to address this sustainability challenge differ widely across the HPA and the world. In Kansas, grassroots-driven legislation in 2012 allowed local stakeholder groups to establish Local Enhanced Management Areas (LEMAs) and work with state officials to generate enforceable and monitored water use reduction programs. The pioneering LEMA was formed in 2013, following a popular vote by farmers within a 256 km2 region in northwestern Kansas. The group sought to reduce groundwater pumping by 20% through 2017 in order to stabilize water levels while minimally reducing crop productivity. Initial statistical estimates indicate the LEMA has been successful; planning is underway to extend it for five years (2018-2022) and to implement additional LEMAs in the wider groundwater management district. Here, we assess the efficacy of this first LEMA with coupled crop-hydrology models to quantify water budget impacts and any associated trade-offs in crop productivity. We drive these models with a novel data fusion of water use data and our recent remotely sensed Annual Irrigation Maps (AIM) dataset, allowing detailed tracking of irrigation water in space and time. Results from these process-based models provide detailed insights into changes in the physical system resulting from the LEMA program that can inform future stakeholder-driven management in Kansas and in stressed aquifers around the world.
Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian
2014-09-01
Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation or medical management. This article illustrates a PM approach that enables the economic potential of (cost-) effective disease management programs (DMPs) to be fully exploited by optimized candidate selection as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy to apply for health insurance companies. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate for this example of a setting that our model can compete with the expensive solutions offered by professional PM vendors and outperforms non-predictive standard approaches for DMP selection commonly used in the market.
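A hedged sketch of the GLM-based scoring step using statsmodels follows; the paper does not specify its covariates or link function, so the Gamma/log-link choice and the features below are illustrative assumptions.

```python
# A minimal sketch of GLM-based candidate scoring for DMP enrollment.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
age = rng.uniform(30, 80, n)                       # hypothetical covariate
prior_cost = rng.gamma(2.0, 500.0, n)              # hypothetical covariate
X = sm.add_constant(np.column_stack([age, prior_cost]))
future_cost = rng.gamma(2.0, 400.0 + 5 * age + 0.2 * prior_cost)  # synthetic outcome

# Gamma GLM with log link is a common choice for right-skewed cost data.
model = sm.GLM(future_cost, X, family=sm.families.Gamma(sm.families.links.Log()))
result = model.fit()
print(result.params)

scores = result.predict(X)                  # expected future cost per member
top_candidates = np.argsort(scores)[-50:]   # enroll highest-expected-cost members
```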
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Xin; Baker, Kyri A.; Christensen, Dane T.
This paper presents a user-preference-driven home energy management system (HEMS) for demand response (DR) with residential building loads and battery storage. The HEMS is based on a multi-objective model predictive control algorithm, where the objectives include energy cost, thermal comfort, and carbon emission. A multi-criterion decision making method originating from social science is used to quickly determine user preferences based on a brief survey and derive the weights of different objectives used in the optimization process. Besides the residential appliances used in traditional DR programs, a home battery system is integrated into the HEMS to improve the flexibility and reliability of the DR resources. Simulation studies have been performed on field data from a residential building stock data set. Appliance models and usage patterns were learned from the data to predict the DR resource availability. Results indicate the HEMS was able to provide a significant amount of load reduction with less than 20% prediction error in both heating and cooling cases.
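The weighted multi-objective trade-off at the heart of such a HEMS can be sketched with cvxpy (an assumed tool; the paper's controller also models thermal dynamics, appliances, and the battery). The weights stand in for the survey-derived user preferences.

```python
# A minimal sketch of the weighted cost/comfort/carbon objective in a HEMS.
import cvxpy as cp
import numpy as np

T = 24
price = np.concatenate([np.full(16, 0.10), np.full(4, 0.40), np.full(4, 0.10)])  # $/kWh
carbon = np.full(T, 0.5)                       # kgCO2 per kWh (assumed)
load = cp.Variable(T, nonneg=True)             # hourly controllable load (kWh)
comfort_penalty = cp.sum_squares(load - 1.5)   # deviation from preferred usage

w_cost, w_comfort, w_carbon = 0.5, 0.3, 0.2    # survey-derived preference weights
objective = cp.Minimize(w_cost * price @ load
                        + w_comfort * comfort_penalty
                        + w_carbon * carbon @ load)
problem = cp.Problem(objective, [cp.sum(load) >= 24.0])   # daily energy requirement
problem.solve()
print(np.round(load.value, 2))   # load shifted away from the high-price evening hours
```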
A Collaborative Data Chat: Teaching Summative Assessment Data Use in Pre-Service Teacher Education
ERIC Educational Resources Information Center
Piro, Jody S.; Dunlap, Karen; Shutt, Tammy
2014-01-01
As the quality of educational outputs has been problematized, accountability systems have driven reform based upon summative assessment data. These policies impact the ways that educators use data within schools and subsequently, how teacher education programs may adjust their curricula to teach data-driven decision-making to inform instruction.…
ERIC Educational Resources Information Center
Avanzino, Susan
2010-01-01
Communication departments are expected to conduct program level assessment, as well as assessment of communication in general education. Although the expectation for data-driven student learning assessment is growing, relatively few examples exist for doing so effectively. This article serves as a model to help faculty conduct effective assessment…
Harman, Elena; Azzam, Tarek
2018-02-01
This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach.
Tsui, Fu-Chiang; Espino, Jeremy U; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M
2005-01-01
The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include an event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks, and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance: systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions.
Propeller aircraft interior noise model. II - Scale-model and flight-test comparisons
NASA Technical Reports Server (NTRS)
Willis, C. M.; Mayes, W. H.
1987-01-01
A program for predicting the sound levels inside propeller-driven aircraft arising from sidewall transmission of airborne exterior noise is validated through comparisons of predictions with both scale-model test results and measurements obtained in flight tests on a turboprop aircraft. The program produced unbiased predictions for the scale-model tests, with a standard deviation of errors of about 4 dB. For the flight tests, the predictions revealed a bias of 2.62-4.28 dB (depending upon whether or not the data for the fourth harmonic were included), and the standard deviation of the errors ranged between 2.43 and 4.12 dB. The analytical model is shown to be capable of taking changes in the flight environment into account.
ERIC Educational Resources Information Center
Bandy, Tawana; Burkhauser, Mary; Metz, Allison J. R.
2009-01-01
Although many program managers look to data to inform decision-making and manage their programs, high-quality program data may not always be available. Yet such data are necessary for effective program implementation. The use of high-quality data facilitates program management, reduces reliance on anecdotal information, and ensures that data are…
Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E
2015-06-16
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.
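The data-driven nested-model analysis rests on standard nested-model statistics; a minimal Python illustration of an F-test between a simpler and a richer kinetic model follows (ROCKETSHIP itself is MATLAB, and its selection criteria may differ). The fit statistics below are toy numbers.

```python
# A minimal sketch of nested kinetic model selection via an F-test.
import numpy as np
from scipy.stats import f as f_dist

def nested_f_test(sse_simple, p_simple, sse_full, p_full, n):
    """p-value for whether the extra parameters significantly improve the fit."""
    num = (sse_simple - sse_full) / (p_full - p_simple)
    den = sse_full / (n - p_full)
    return f_dist.sf(num / den, p_full - p_simple, n - p_full)

# Toy numbers: a 2-parameter vs. a nested 3-parameter model on 50 time points.
print(nested_f_test(sse_simple=12.0, p_simple=2, sse_full=9.5, p_full=3, n=50))
```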
It's All About the Data: Workflow Systems and Weather
NASA Astrophysics Data System (ADS)
Plale, B.
2009-05-01
Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible under high-volume conditions, or the searchability and manageability of the resulting data products is disappointingly low. The provenance of a data product is a record of its lineage, or trace of the execution history that resulted in the product. The provenance of a forecast model result, e.g., captures information about the executable version of the model, configuration parameters, input data products, execution environment, and owner. Provenance enables data to be properly attributed and captures critical parameters about the model run so the quality of the result can be ascertained. Proper provenance is essential to providing reproducible scientific computing results. Workflow languages used in science discovery are complete programming languages, and in theory can support any logic expressible by a programming language. The execution environments supporting the workflow engines, on the other hand, are subject to constraints on physical resources, and hence in practice the workflow task graphs used in science utilize relatively few of the cataloged workflow patterns. It is important to note that these workflows are executed on demand, and are executed once. Into this context is introduced the need for science discovery that is responsive to real time information. If we can use simple programming models and abstractions to make scientific discovery involving real-time data accessible to specialists who share and utilize data across scientific domains, we bring science one step closer to solving the largest of human problems.
Using Data-Driven Model-Brain Mappings to Constrain Formal Models of Cognition
Borst, Jelmer P.; Nijboer, Menno; Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John R.
2015-01-01
In this paper we propose a method to create data-driven mappings from components of cognitive models to brain regions. Cognitive models are notoriously hard to evaluate, especially based on behavioral measures alone. Neuroimaging data can provide additional constraints, but this requires a mapping from model components to brain regions. Although such mappings can be based on the experience of the modeler or on a reading of the literature, a formal method is preferred to prevent researcher-based biases. In this paper we used model-based fMRI analysis to create a data-driven model-brain mapping for five modules of the ACT-R cognitive architecture. We then validated this mapping by applying it to two new datasets with associated models. The new mapping was at least as powerful as an existing mapping that was based on the literature, and indicated where the models were supported by the data and where they have to be improved. We conclude that data-driven model-brain mappings can provide strong constraints on cognitive models, and that model-based fMRI is a suitable way to create such mappings. PMID:25747601
Buonaccorsi, Giovanni A; Roberts, Caleb; Cheung, Sue; Watson, Yvonne; O'Connor, James P B; Davies, Karen; Jackson, Alan; Jayson, Gordon C; Parker, Geoff J M
2006-09-01
The quantitative analysis of dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) data is subject to model fitting errors caused by motion during the time-series data acquisition. However, the time-varying features that occur as a result of contrast enhancement can confound motion correction techniques based on conventional registration similarity measures. We have therefore developed a heuristic, locally controlled tracer kinetic model-driven registration procedure, in which the model accounts for contrast enhancement, and applied it to the registration of abdominal DCE-MRI data at high temporal resolution. Using severely motion-corrupted data sets that had been excluded from analysis in a clinical trial of an antiangiogenic agent, we compared the results obtained when using different models to drive the tracer kinetic model-driven registration with those obtained when using a conventional registration against the time series mean image volume. Using tracer kinetic model-driven registration, it was possible to improve model fitting by reducing the sum of squared errors but the improvement was only realized when using a model that adequately described the features of the time series data. The registration against the time series mean significantly distorted the time series data, as did tracer kinetic model-driven registration using a simpler model of contrast enhancement. When an appropriate model is used, tracer kinetic model-driven registration influences motion-corrupted model fit parameter estimates and provides significant improvements in localization in three-dimensional parameter maps. This has positive implications for the use of quantitative DCE-MRI for example in clinical trials of antiangiogenic or antivascular agents.
The Role of Guided Induction in Paper-Based Data-Driven Learning
ERIC Educational Resources Information Center
Smart, Jonathan
2014-01-01
This study examines the role of guided induction as an instructional approach in paper-based data-driven learning (DDL) in the context of an ESL grammar course during an intensive English program at an American public university. Specifically, it examines whether corpus-informed grammar instruction is more effective through inductive, data-driven…
Consistent data-driven computational mechanics
NASA Astrophysics Data System (ADS)
González, D.; Chinesta, F.; Cueto, E.
2018-05-01
We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of Industry 4.0.
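The flavor of model-free, distance-minimizing data-driven solvers on which such work builds can be conveyed with a toy statically determinate bar; the database, metric, and load below are all illustrative assumptions rather than the authors' formulation.

```python
# A minimal sketch of model-free data-driven computational mechanics: assign to
# the bar the measured material state closest to satisfying equilibrium.
import numpy as np

# Material database of measured (strain, stress) pairs -- no fitted model.
data = np.array([[0.000, 0.0], [0.001, 200.0], [0.002, 390.0], [0.003, 585.0]])
E_ref = 200_000.0   # reference modulus defining the phase-space metric
sigma_eq = 400.0    # stress demanded by equilibrium (statically determinate bar)

# Squared distance (in the energy metric) from each data point to the set of
# states satisfying equilibrium; strain is unconstrained in this toy case, so
# only the stress mismatch contributes.
d2 = (data[:, 1] - sigma_eq) ** 2 / E_ref
best_eps, best_sigma = data[np.argmin(d2)]
print(best_eps, best_sigma)   # measured state the solver assigns to the bar
```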
Yang, Chunguang G; Granite, Stephen J; Van Eyk, Jennifer E; Winslow, Raimond L
2006-11-01
Protein identification using MS is an important technique in proteomics as well as a major generator of proteomics data. We have designed the protein identification data object model (PDOM) and developed a parser based on this model to facilitate the analysis and storage of these data. The parser works with HTML or XML files saved or exported from MASCOT MS/MS ions search in peptide summary report or MASCOT PMF search in protein summary report. The program creates PDOM objects, eliminates redundancy in the input file, and has the capability to output any PDOM object to a relational database. This program facilitates additional analysis of MASCOT search results and aids the storage of protein identification information. The implementation is extensible and can serve as a template to develop parsers for other search engines. The parser can be used as a stand-alone application or can be driven by other Java programs. It is currently being used as the front end for a system that loads HTML and XML result files of MASCOT searches into a relational database. The source code is freely available at http://www.ccbm.jhu.edu and the program uses only free and open-source Java libraries.
NASA Astrophysics Data System (ADS)
Cunningham, Jessica D.
Newton's Universe (NU), an innovative teacher training program, strives to obtain measures from rural, middle school science teachers and their students to determine the impact of its distance learning course on understanding of temperature. No consensus exists on the most appropriate and useful method of analysis to measure change in psychological constructs over time. Several item response theory (IRT) models have been deemed useful in measuring change, which makes the choice of an IRT model not obvious. The appropriateness and utility of each model, including a comparison to a traditional analysis of variance approach, was investigated using middle school science student performance on an assessment over an instructional period. Predetermined criteria were outlined to guide model selection based on several factors including research questions, data properties, and meaningful interpretations to determine the most appropriate model for this study. All methods employed in this study reiterated one common interpretation of the data -- specifically, that the students of teachers with any NU course experience had significantly greater gains in performance over the instructional period. However, clear distinctions were made between an analysis of variance and the racked and stacked analysis using the Rasch model. Although limited research exists examining the usefulness of the Rasch model in measuring change in understanding over time, this study applied these methods and detailed plausible implications for data-driven decisions based upon results for NU and others. Being mindful of the advantages and usefulness of each method of analysis may help others make informed decisions about choosing an appropriate model to depict changes to evaluate other programs. Results may encourage other researchers to consider the meaningfulness of using IRT for this purpose. Results have implications for data-driven decisions for future professional development courses, in science education and other disciplines. KEYWORDS: Item Response Theory, Rasch Model, Racking and Stacking, Measuring Change in Student Performance, Newton's Universe teacher training
A 3D visualization system for molecular structures
NASA Technical Reports Server (NTRS)
Green, Terry J.
1989-01-01
The properties of molecules derive in part from their structures. Because of the importance of understanding molecular structures, various methodologies, ranging from first principles to empirical techniques, were developed for computing the structure of molecules. For large molecules such as polymer model compounds, the structural information is difficult to comprehend by examining tabulated data. Therefore, a molecular graphics display system, called MOLDS, was developed to help interpret the data. MOLDS is a menu-driven program developed to run on the LADC SNS computer systems. This program can read a data file generated by the modeling programs, or data can be entered using the keyboard. MOLDS has the following capabilities: draws the 3-D representation of a molecule using a stick, ball-and-stick, or space-filled model from Cartesian coordinates; draws different perspective views of the molecule; rotates the molecule about the X, Y, or Z axis or about some arbitrary line in space; zooms in on a small area of the molecule in order to obtain a better view of a specific region; and makes hard-copy representations of molecules on a graphics printer. In addition, MOLDS can be easily updated and readily adapted to run on most computer systems.
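A modern re-creation of MOLDS's core task — rendering atoms from Cartesian coordinates — takes only a few lines of Python with matplotlib (assumed here; MOLDS itself was a menu-driven program of its era). The coordinates are a toy water molecule.

```python
# A minimal sketch of rendering a molecule from Cartesian coordinates.
import matplotlib.pyplot as plt

atoms = {"O": [(0.000, 0.000, 0.000)],
         "H": [(0.757, 0.586, 0.000), (-0.757, 0.586, 0.000)]}
colors = {"O": "red", "H": "gray"}
sizes = {"O": 300, "H": 120}   # marker sizes standing in for atomic radii

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for element, coords in atoms.items():
    xs, ys, zs = zip(*coords)
    ax.scatter(xs, ys, zs, c=colors[element], s=sizes[element], label=element)
ax.legend()
plt.show()
```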
Wasserman, Deborah L
2010-05-01
This paper offers a framework for using a systems orientation and "foundational theory" to enhance theory-driven evaluations and logic models. The framework guides the process of identifying and explaining operative relationships and perspectives within human service program systems. Self-Determination Theory exemplifies how a foundational theory can be used to support the framework in a wide range of program evaluations. Two examples illustrate how applications of the framework have improved the evaluators' abilities to observe and explain program effect. In both exemplars improvements involved addressing and organizing into a single logic model heretofore seemingly disparate evaluation issues regarding valuing (by whose values); the role of organizational and program context; and evaluation anxiety and utilization.
Toward Computational Cumulative Biology by Combining Models of Biological Datasets
Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel
2014-01-01
A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations—for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database. PMID:25427176
General Purpose Data-Driven Monitoring for Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Martin, Rodney A.; Schwabacher, Mark A.; Spirkovska, Liljana; Taylor, William McCaa; Castle, Joseph P.; Mackey, Ryan M.
2009-01-01
As modern space propulsion and exploration systems improve in capability and efficiency, their designs are becoming increasingly sophisticated and complex. Determining the health state of these systems, using traditional parameter limit checking, model-based, or rule-based methods, is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults or failures. The Inductive Monitoring System (IMS) is a data-driven system health monitoring software tool that has been successfully applied to several aerospace applications. IMS uses a data mining technique called clustering to analyze archived system data and characterize normal interactions between parameters. The scope of IMS-based data-driven monitoring applications continues to expand with current development activities. Successful IMS deployment in the International Space Station (ISS) flight control room to monitor ISS attitude control systems has led to applications in other ISS flight control disciplines, such as thermal control. It has also generated interest in data-driven monitoring capability for Constellation, NASA's program to replace the Space Shuttle with new launch vehicles and spacecraft capable of returning astronauts to the Moon, and then on to Mars. Several projects are currently underway to evaluate and mature the IMS technology and complementary tools for use in the Constellation program. These include an experiment on board the Air Force TacSat-3 satellite, and ground systems monitoring for NASA's Ares I-X and Ares I launch vehicles. The TacSat-3 Vehicle System Management (TVSM) project is a software experiment to integrate fault and anomaly detection algorithms and diagnosis tools with executive and adaptive planning functions contained in the flight software on board the Air Force Research Laboratory TacSat-3 satellite. The TVSM software package will be uploaded after launch to monitor spacecraft subsystems such as power and guidance, navigation, and control (GN&C). It will analyze data in real time to demonstrate detection of faults and unusual conditions, diagnose problems, and react to threats to spacecraft health and mission goals. The experiment will demonstrate the feasibility and effectiveness of integrated system health management (ISHM) technologies with both ground and on-board experiments.
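To make the clustering-based monitoring idea concrete, here is a minimal Python sketch (not the actual IMS code): archived nominal data are clustered to characterize normal behavior, and a new sample is flagged when its distance to the nearest cluster centre exceeds a threshold learned from the archive. The cluster count, threshold percentile, and data are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Archived nominal operations data: rows are samples, columns sensor channels.
nominal = rng.normal(0.0, 1.0, size=(500, 4))

# Characterize normal behavior by clustering the archive (cluster count is
# an illustrative choice, not an IMS setting).
model = KMeans(n_clusters=8, n_init=10, random_state=0).fit(nominal)

# Derive an anomaly threshold from the distances of nominal data to their
# nearest cluster centres.
nominal_dist = np.min(model.transform(nominal), axis=1)
threshold = np.percentile(nominal_dist, 99.5)

def anomaly_score(sample):
    """Distance from a new sample to the nearest nominal cluster centre."""
    return float(np.min(model.transform(sample.reshape(1, -1))))

# Monitoring step: compare real-time data against the nominal characterization.
new_sample = np.array([4.0, -3.5, 2.2, 0.1])   # deviates from the archive
print(anomaly_score(new_sample) > threshold)    # True -> flag as anomalous
```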
Enhancing Extensive Reading with Data-Driven Learning
ERIC Educational Resources Information Center
Hadley, Gregory; Charles, Maggie
2017-01-01
This paper investigates using data-driven learning (DDL) as a means of stimulating greater lexicogrammatical knowledge and reading speed among lower proficiency learners in an extensive reading program. For 16 weekly 90-minute sessions, an experimental group (12 students) used DDL materials created from a corpus developed from the Oxford Bookworms…
NASA Astrophysics Data System (ADS)
Rock, B. N.; Hale, S.; Graham, K.; Hayden, L. B.
2009-12-01
Watershed Watch (NSF 0525433) engages early undergraduate students from two-year and four-year colleges in student-driven, full inquiry-based instruction in the biogeosciences. Program goals for Watershed Watch are to test whether inquiry-rich student-driven projects sufficiently engage undeclared students (or noncommittal STEM majors) to declare a STEM major (or remain with their STEM major). The program is a partnership between two four-year campuses - the University of New Hampshire (UNH), and Elizabeth City State University (ECSU, in North Carolina); and two two-year campuses - Great Bay Community College (GBCC, in New Hampshire) and the College of the Albemarle (COA, in North Carolina). The program focuses on two watersheds: the Merrimack River Watershed in New Hampshire and Massachusetts, and the Pasquotank River Watershed in Virginia and North Carolina. Both the terrestrial and aquatic components of both watersheds are evaluated using the student-driven projects. A significant component of this program is an intensive two-week Summer Research Institute (SRI), in which undeclared freshmen and sophomores investigate various aspects of their local watershed. Two Summer Research Institutes have been held on the UNH campus (2006 and 2008) and two on the ECSU campus (2007 and 2009). Students develop their own research questions and study design, collect and analyze data, and produce a scientific oral or poster presentation on the last day of the SRI. The course objectives, curriculum and schedule are presented as a model for dissemination for other institutions and programs seeking to develop inquiry-rich programs or courses designed to attract students into biogeoscience disciplines. Data from self-reported student feedback indicate the most important factors explaining high levels of student motivation and research excellence in the program are: 1) working with committed, energetic, and enthusiastic faculty mentors, and 2) faculty mentors demonstrating high degrees of teamwork and coordination. The past four Summer Research Institutes have engaged over 100 entry-level undergraduate students in the process of learning science by doing it, and approximately 50% of those participating have declared majors in a wide range of science fields. A total of eight Watershed Watch students have presented findings from their SRI research projects at AGU meetings in 2007, 2008, and 2009. This presentation will highlight the lessons learned over the past four years in the Watershed Watch program.
Model-Driven Approach for Body Area Network Application Development.
Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata
2016-05-12
This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application. PMID:27187394
NASA Technical Reports Server (NTRS)
Sainsbury-Carter, J. B.; Conaway, J. H.
1973-01-01
The development and implementation of a preprocessor system for the finite element analysis of helicopter fuselages is described. The system utilizes interactive graphics for the generation, display, and editing of NASTRAN data for fuselage models. It is operated from an IBM 2250 cathode ray tube (CRT) console driven by an IBM 370/145 computer. Real-time interaction plus automatic data generation reduces the nominal 6- to 10-week time for manual generation and checking of data to a few days. The interactive graphics system consists of a series of satellite programs operated from a central NASTRAN Systems Monitor. Fuselage structural models including the outer shell and internal structure may be rapidly generated. All numbering systems are automatically assigned. Hard-copy plots of the model labeled with GRID or element IDs are also available. General-purpose programs for displaying and editing NASTRAN data are included in the system. Utilization of the NASTRAN interactive graphics system has made possible the multiple finite element analysis of complex helicopter fuselage structures within design schedules.
Solving Common Mathematical Problems
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Mathematical Solutions Toolset is a collection of five software programs that rapidly solve some common mathematical problems. The programs consist of a set of Microsoft Excel worksheets. The programs provide for entry of input data and display of output data in a user-friendly, menu-driven format, and for automatic execution once the input data has been entered.
Kidman, Rachel; Nice, Johanna; Taylor, Tory; Thurman, Tonya R
2014-10-02
Home visiting is a popular component of programs for HIV-affected children in sub-Saharan Africa, but its implementation varies widely. While some home visitors are lay volunteers, other programs invest in more highly trained paraprofessional staff. This paper describes a study investigating whether additional investment in paraprofessional staffing translated into higher quality service delivery in one program context. Beneficiary children and caregivers at sites in KwaZulu-Natal, South Africa were interviewed after 2 years of program enrollment and asked to report about their experiences with home visiting. Analysis focused on intervention exposure, including visit intensity, duration and the kinds of emotional, informational and tangible support provided. Few beneficiaries reported receiving home visits in program models primarily driven by lay volunteers; when visits did occur, they were shorter and less frequent. Paraprofessional-driven programs not only provided significantly more home visits, but also provided greater interaction with the child, communication on a larger variety of topics, and more tangible support to caregivers. These results suggest that programs that invest in compensation and extensive training for home visitors are better able to serve and retain beneficiaries, and they support a move toward establishing a professional workforce of home visitors to support vulnerable children and families in South Africa.
Leading Change: A Case Study of Alamo Academies--An Industry-Driven Workforce Partnership Program
ERIC Educational Resources Information Center
Hu, Xiaodan; Bowman, Gene
2016-01-01
In this study, the authors focus on the initiation and development of the Alamo Academies, aiming to illustrate an exemplary industry-driven model that addresses workforce development in local community. After a brief introduction of the context, the authors summarized major factors that contribute to the success of the collaboration model,…
Data-based adjoint and H2 optimal control of the Ginzburg-Landau equation
NASA Astrophysics Data System (ADS)
Banks, Michael; Bodony, Daniel
2017-11-01
Equation-free, reduced-order methods of control are desirable when the governing system of interest is of very high dimension or the control is to be applied to a physical experiment. Two-phase flow optimal control problems, our target application, fit these criteria. Dynamic Mode Decomposition (DMD) is a data-driven method for model reduction that can be used to resolve the dynamics of very high dimensional systems and project the dynamics onto a smaller, more manageable basis. We evaluate the effectiveness of DMD-based forward and adjoint operator estimation when applied to H2 optimal control approaches for the linear and nonlinear Ginzburg-Landau equation. Perspectives on applying the data-driven adjoint to two-phase flow control will be given. This work was supported by the Office of Naval Research (ONR) as part of the Multidisciplinary University Research Initiatives (MURI) Program, under Grant Number N00014-16-1-2617.
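As a brief illustration of the DMD idea referenced above (a generic exact-DMD sketch, not the authors' implementation), the reduced operator is obtained from snapshot pairs via an SVD-truncated least-squares fit; the toy system and rank are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy snapshot data: columns are successive states of a known linear system,
# so DMD can recover its spectrum exactly (A_true is illustrative).
A_true = np.array([[0.9, -0.2],
                   [0.2,  0.9]])
X = np.empty((2, 50))
X[:, 0] = rng.normal(size=2)
for k in range(49):
    X[:, k + 1] = A_true @ X[:, k]
X1, X2 = X[:, :-1], X[:, 1:]          # snapshot pairs (x_k, x_{k+1})

# Exact DMD: reduced operator from a rank-r SVD of X1.
r = 2
U, s, Vh = np.linalg.svd(X1, full_matrices=False)
U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)

# Eigenvalues of A_tilde approximate those of the underlying operator;
# the corresponding DMD modes live in the original state space.
eigvals, W = np.linalg.eig(A_tilde)
modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
print(np.sort_complex(eigvals))        # matches eig(A_true) for this toy case
```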
Assessment of Proton Deflectometry for Exploding Wire Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beg, Farhat Nadeem
2013-09-25
This project provides the first demonstration of the application of proton deflectometry for the diagnosis of electromagnetic field topology and current-carrying regions in Z-pinch plasma experiments. Over the course of this project several milestones were achieved. High-energy proton beam generation was demonstrated on the short-pulse high-intensity Leopard laser (10 Joules in ~350 femtoseconds), and the proton beam generation was shown to be reproducible. Next, protons were used to probe the electromagnetic field structure of short-circuit loads in order to benchmark the two numerical codes, the resistive-magnetohydrodynamics (MHD) code, Gorgon, and the hybrid particle-in-cell code, LSP, for the interpretation of results. Lastly, the proton deflectometry technique was used to map the magnetic field structure of pulsed-power-driven plasma loads including wires and supersonic jets formed with metallic foils. Good agreement between the modeling and experiments has been obtained. The demonstrated technique holds great promise to significantly improve the understanding of current flow and electromagnetic field topology in pulsed-power-driven high energy density plasmas. Proton probing with a high intensity laser was for the first time implemented in the presence of the harsh debris and x-ray producing z-pinch environment driven by a mega-ampere-scale pulsed-power machine. The intellectual merit of the program was that it investigated strongly driven MHD systems and the influence of magnetic field topology on plasma evolution in pulsed-power-driven plasmas. The experimental program involved intense field-matter interaction in the generation of the proton probe, as well as the generation of plasma subjected to 1 MegaGauss scale magnetic fields. The computational aspect included two well-documented codes, used in combination for the first time to provide accurate interpretation of the experimental results. The broader impact included the support of 2 graduate students, one at UCSD and one at NTF, who were exposed to both the experimental physics work and the MHD and PIC modeling of the system. A first-generation college undergraduate student was employed to assist in experiments and data analysis throughout the project. Data resulting from the research program were broadly disseminated by publication in scientific journals, and presentation at international and national conferences and workshops.
Experimental and analytical research on the aerodynamics of wind driven turbines. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrbach, C.; Wainauski, H.; Worobel, R.
1977-12-01
This aerodynamic research program was aimed at providing a reliable, comprehensive data base on a series of wind turbine models covering a broad range of the prime aerodynamic and geometric variables. Such data obtained under controlled laboratory conditions on turbines designed by the same method, of the same size, and tested in the same wind tunnel had not been available in the literature. Moreover, this research program was further aimed at providing a basis for evaluating the adequacy of existing wind turbine aerodynamic design and performance methodology, for assessing the potential of recent advanced theories and for providing a basis for further method development and refinement.
ALCF Data Science Program: Productive Data-centric Supercomputing
NASA Astrophysics Data System (ADS)
Romero, Nichols; Vishwanath, Venkatram
The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflop Intel/Cray system. The program will transition to the 200-petaflop Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
From Population Databases to Research and Informed Health Decisions and Policy.
Machluf, Yossy; Tal, Orna; Navon, Amir; Chaiter, Yoram
2017-01-01
In the era of big data, the medical community is inspired to maximize the utilization and processing of the rapidly expanding medical datasets for clinical-related and policy-driven research. This requires a medical database that can be aggregated, interpreted, and integrated at both the individual and population levels. Policymakers seek data as a lever for wise, evidence-based decision-making and information-driven policy. Yet, bridging the gap between data collection, research, and policymaking is a major challenge. To bridge this gap, we propose a four-step model: (A) creating a conjoined task force of all relevant parties to declare a national program to promote collaborations; (B) promoting a national digital records project, or at least a network of synchronized and integrated databases, in an accessible transparent manner; (C) creating an interoperable national research environment to enable the analysis of the organized and integrated data and to generate evidence; and (D) utilizing the evidence to improve decision-making, to support a wisely chosen national policy. For the latter purpose, we also developed a novel multidimensional set of criteria to illuminate insights and estimate the risk for future morbidity based on current medical conditions. Used by policymakers, providers of health plans, caregivers, and health organizations, we presume this model will assist in transforming evidence generation to support the design of health policy and programs, as well as improved decision-making about health and health care, at all levels: individual, communal, organizational, and national.
Wong, Jessica J; McGregor, Marion; Mior, Silvano A; Loisel, Patrick
2014-01-01
The purpose of this study was to develop a model that evaluates the impact of policy changes on the number of workers' compensation lost-time back claims in Ontario, Canada, over a 30-year timeframe. The model was used to test the hypothesis that a theory- and policy-driven model would be sufficient in reproducing historical claims data in a robust manner and that policy changes would have a major impact on modeled data. The model was developed using system dynamics methods in the Vensim simulation program. The theoretical effects of policies for compensation benefit levels and experience rating fees were modeled. The model was built and validated using historical claims data from 1980 to 2009. Sensitivity analysis was used to evaluate the modeled data at extreme end points of variable input and timeframes. The degree of predictive value of the modeled data was measured by the coefficient of determination, root mean square error, and Theil's inequality coefficients. Correlation between modeled data and actual data was found to be meaningful (R² = 0.934), and the modeled data were stable at extreme end points. Among the effects explored, policy changes were found to be relatively minor drivers of back claims data, accounting for a 13% improvement in error. Simulation results suggested that unemployment, number of no-lost-time claims, number of injuries per worker, and recovery rate from back injuries outside of claims management to be sensitive drivers of back claims data. A robust systems-based model was developed and tested for use in future policy research in Ontario's workers' compensation. The study findings suggest that certain areas within and outside the workers' compensation system need to be considered when evaluating and changing policies around back claims. © 2014. Published by National University of Health Sciences. All rights reserved.
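The fit statistics named in this abstract follow standard definitions; as a hedged sketch (Theil's coefficient shown in its common U1 form, with illustrative arrays rather than the study's data), they can be computed as:

```python
import numpy as np

def fit_statistics(actual, modeled):
    """Coefficient of determination, RMSE, and Theil's U1 (0 = perfect fit)."""
    actual = np.asarray(actual, dtype=float)
    modeled = np.asarray(modeled, dtype=float)
    resid = actual - modeled
    r2 = 1.0 - np.sum(resid**2) / np.sum((actual - actual.mean())**2)
    rmse = np.sqrt(np.mean(resid**2))
    u1 = rmse / (np.sqrt(np.mean(actual**2)) + np.sqrt(np.mean(modeled**2)))
    return r2, rmse, u1

# Illustrative series standing in for annual lost-time claim counts.
actual = [120, 115, 130, 128, 110, 105]
modeled = [118, 117, 127, 130, 112, 103]
print(fit_statistics(actual, modeled))
```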
ERIC Educational Resources Information Center
Ponder, Gerald, Ed.; Strahan, David, Ed.
2005-01-01
This book presents cases of schools (Part One) and programs at the district level and beyond (Part Two) in which reform, while driven by high-stakes accountability, became larger and deeper through data-driven dialogue, culture change, organizational learning, and other elements of high performing cultures. Commentaries on cross-case patterns by…
Morrato, Elaine H; Smith, Meredith Y
2015-01-01
Pharmaceutical risk minimization programs are now an established requirement in the regulatory landscape. However, pharmaceutical companies have been slow to recognize and embrace the significant potential these programs offer in terms of enhancing trust with health care professionals and patients, and for providing a mechanism for bringing products to the market that might not otherwise have been approved. Pitfalls of the current drug development process include risk minimization programs that are not data-driven, missed opportunities to incorporate pragmatic methods and market-based insights, outmoded tools and data sources, lack of rapid evaluative learning to support timely adaptation, lack of systematic approaches for patient engagement, and questions on staffing and organizational infrastructure. We propose better integration of risk minimization with clinical drug development and commercialization work streams throughout the product lifecycle. We articulate a vision and propose broad adoption of organizational models for incorporating risk minimization expertise into the drug development process. Three organizational models are discussed and compared: outsource/external vendor, embedded risk management specialist model, and Center of Excellence. PMID:25750537
Tsui, Fu-Chiang; Espino, Jeremy U.; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M.
2005-01-01
The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance—systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions. PMID:16779138
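A toy sketch of the event-driven, cached query pattern described above (a single-process stand-in; the real system uses clustered servers and storage area networks, and all names here are invented for illustration):

```python
import queue
import threading
import time
from collections import defaultdict

events = queue.Queue()            # sales records arriving in near real time
daily_counts = defaultdict(int)   # aggregated time series held by the utility
cache = {}                        # query-result cache to cut response time

def ingest(record):
    """Event-driven entry point: producers simply enqueue records."""
    events.put(record)

def worker():
    """Consume events as they arrive and update the aggregates."""
    while True:
        rec = events.get()
        daily_counts[(rec["store"], rec["date"])] += rec["units"]
        cache.clear()             # aggregates changed; invalidate the cache

def query(store, date):
    """Answer from cache when possible; recompute only on a miss."""
    key = (store, date)
    if key not in cache:
        cache[key] = daily_counts[key]
    return cache[key]

threading.Thread(target=worker, daemon=True).start()
ingest({"store": "S1", "date": "2005-01-01", "units": 3})
time.sleep(0.1)                   # give the consumer thread time to run
print(query("S1", "2005-01-01"))  # -> 3
```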
The Station Community Mental Health Centre Inc: nurturing and empowering.
Taylor, Judy; Jones, Rosalind M; O'Reilly, Peta; Oldfield, Wayne; Blackburn, Anne
2010-01-01
Consumer-driven community mental health services play an important role in rehabilitation, recovery, and advocacy in rural and remote Australia. The origins of services often lie in the need to provide options for people with mental illness and their carers when there is a lack of on-the-ground support. This article adds to the information about the strengths and limitations of consumer-driven mental health services by presenting the findings of an evaluation of The Station Inc. in rural South Australia. This consumer-driven mental health service provides a safe and supportive environment, social connections, and activities for its members (those with a lived experience of mental illness). Using a realist evaluation approach, the evaluation identified the contextual factors and the program mechanisms that produce positive outcomes for members. The evaluation was conducted as participatory action research with The Station members, volunteers, management committee members, and staff involved in all phases of the research process. Because of the complexity of The Station's functioning, a realist evaluation using qualitative data was conducted to identify how the program worked, for whom, and in what circumstances. Twenty-five in-depth interviews were conducted with participants who were randomly selected from within the groups identified above. Interviews focused on The Station's role in assisting recovery from mental illness, the limitations and strengths of the program, and relationships with the mental health system. The Station's goals, policies and procedures, and the role of stakeholders were analysed in order to identify any links among these contextual factors, program mechanisms, and program outcomes. Qualitative data were entered into descriptive categories in N6 software (QSR; www.qsr.international.com). Data from the stakeholder analysis were entered into Microsoft Excel. Using an iterative approach to include the three data sets, a model was developed that identified important contextual factors that linked with two groups of program mechanisms that produced positive outcomes for members. Program mechanisms are categorised by descriptive themes referred to as 'nurturing' and 'empowering'. 'Nurturing' is experienced as a feeling of belonging and being accepted 'as one is', and 'empowerment' mechanisms engender a belief in oneself. Respondents identified features of The Station's program, policies, atmosphere, connections and networks, stakeholder relationships, and staff and volunteers that are nurturing and empowering. Five key contextual factors enable the program mechanisms to work. The Station's coordinators ensure that nurturing and empowerment processes are highlighted through careful facilitation. The governance arrangements, policies, and administrative systems at The Station are well developed but flexibly implemented so that they support the nurturing and empowerment processes. Support and legitimacy for the program is obtained from the mental health system at state and local levels. The Station obtains resources and connections to its rural community through key stakeholders, and a peak organisation, One Voice Network, acts as an advocate. Information about the benefits and limitations of consumer-driven mental health services in rural and remote Australia is in short supply. Increasing the available information about the contribution these services make may result in services being legitimised, understood, and resourced within mental health systems, thus making the services sustainable. The benefits of consumer-driven services are that they provide flexibility and adaptation, an ability to capture the energy and passion of rural communities to improve the wellbeing of community members, and they overcome the power differential that exists between professionals and 'patients' or 'clients'.
Predictive modeling of mosquito abundance and dengue transmission in Kenya
NASA Astrophysics Data System (ADS)
Caldwell, J.; Krystosik, A.; Mutuku, F.; Ndenga, B.; LaBeaud, D.; Mordecai, E.
2017-12-01
Approximately 390 million people are exposed to dengue virus every year, and with no widely available treatments or vaccines, predictive models of disease risk are valuable tools for vector control and disease prevention. The aim of this study was to modify and improve climate-driven predictive models of dengue vector abundance (Aedes spp. mosquitoes) and viral transmission to people in Kenya. We simulated disease transmission using a temperature-driven mechanistic model and compared model predictions with vector trap data for larvae, pupae, and adult mosquitoes collected between 2014 and 2017 at four sites across urban and rural villages in Kenya. We tested predictive capacity of our models using four temperature measurements (minimum, maximum, range, and anomalies) across daily, weekly, and monthly time scales. Our results indicate seasonal temperature variation is a key driving factor of Aedes mosquito abundance and disease transmission. These models can help vector control programs target specific locations and times when vectors are likely to be present, and can be modified for other Aedes-transmitted diseases and arboviral endemic regions around the world.
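Temperature-driven mechanistic models of this kind typically represent each mosquito or virus trait as a unimodal thermal response; one common functional form is the Briere curve. A minimal sketch follows, with placeholder parameter values that are not the study's fitted estimates:

```python
import numpy as np

def briere(T, c, T_min, T_max):
    """Briere thermal response: zero outside (T_min, T_max), unimodal inside."""
    T = np.asarray(T, dtype=float)
    inside = (T > T_min) & (T < T_max)
    val = c * T * (T - T_min) * np.sqrt(np.clip(T_max - T, 0.0, None))
    return np.where(inside, np.clip(val, 0.0, None), 0.0)

# Illustrative trait: mosquito biting rate a(T) across a seasonal range.
temps = np.arange(15.0, 40.0, 0.5)
a = briere(temps, c=2.0e-4, T_min=13.0, T_max=40.0)
print("peak biting rate near", temps[np.argmax(a)], "deg C")
```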
CIS Program Redesign Driven by IS2010 Model: A Case Study
ERIC Educational Resources Information Center
Surendran, Ken; Amer, Suhair; Schwieger, Dana
2012-01-01
The release of the IS2010 Model Curriculum has triggered review of existing Information Systems (IS) programs. It also provides an opportunity to replace low enrollment IS programs with flexible ones that focus on specific application domains. In this paper, the authors present a case study of their redesigned Computer Information Systems (CIS)…
Sauaia, Angela; Tuitt, Nicole R; Kaufman, Carol E; Hunt, Cerise; Ledezma-Amorosi, Mariana; Byers, Tim
2016-01-01
Project TEACH (Teaching Equity to Advance Community Health) is a capacity-building training program to empower community-based organizations and regional public health agencies to develop data-driven, evidence-based, outcomes-focused public health interventions. TEACH delivers training modules on topics such as logic models, health data, social determinants of health, evidence-based interventions, and program evaluation. Cohorts of 7 to 12 community-based organizations and regional public health agencies in each of the 6 Colorado Area Health Education Centers service areas participate in a 2-day training program tailored to their specific needs. From July 2008 to December 2011, TEACH trained 94 organizations and agencies across Colorado. Training modules were well received and resulted in significant improvement in knowledge in core content areas, as well as accomplishment of self-proposed organizational goals, grant applications/awards, and several community-academic partnerships.
NASA Astrophysics Data System (ADS)
Weyer, K. U.
2017-12-01
Coastal groundwater flow investigations at the Biscayne Bay, south of Miami, Florida, gave rise to the concept of density-driven flow of seawater into coastal aquifers creating a saltwater wedge. Within that wedge, convection-driven return flow of seawater and a dispersion zone were assumed by Cooper et al. (1964) to be the cause of the Biscayne aquifer 'sea water wedge'. This conclusion was based on the chloride distribution within the aquifer and on an analytical model concept assuming convection flow within a confined aquifer without taking non-chemical field data into consideration. This concept was later labelled the 'Henry Problem', which any numerical variable-density flow program must be able to simulate to be considered acceptable. Both 'density-driven flow' and Tothian 'groundwater flow systems' (with or without variable density conditions) are driven by gravitation. The difference between the two is the boundary conditions. 'Density-driven flow' occurs under hydrostatic boundary conditions, while Tothian 'groundwater flow systems' occur under hydrodynamic boundary conditions. Revisiting the Cooper et al. (1964) publication with its record of piezometric field data (heads) showed that the so-called sea water wedge has been caused by discharging deep saline groundwater driven by gravitational flow and not by denser sea water. Density-driven flow of seawater into the aquifer was not found reflected in the head measurements for low and high tide conditions, which had been taken contemporaneously with the chloride measurements. These head measurements had not been included in the flow interpretation. The very same head measurements indicated a clear dividing line between shallow local fresh groundwater flow and saline deep groundwater flow without the existence of a dispersion zone or a convection cell. The Biscayne situation emphasizes the need for any chemical interpretation of flow pattern to be supported by head data as energy indicators of flow fields. At the Biscayne site density-driven flow of seawater did and does not exist. Instead this site and the Florida coast line in general are the end points of local fresh and regional saline groundwater flow systems driven by gravity forces and not by density differences.
NASA Astrophysics Data System (ADS)
Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao
2017-03-01
Input variable selection is an essential step in the development of data-driven models for environmental, biological and industrial applications. Input variable selection eliminates irrelevant or redundant variables and identifies a suitable subset of variables as the input of a model; at the same time, it simplifies the model structure and improves computational efficiency. This paper describes the procedures of input variable selection for data-driven models for the measurement of liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, including partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN) and tree-based iterative input selection (IIS), are applied in this study. Typical data-driven models incorporating support vector machine (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM-based data-driven models and sensitivity analysis. The validation and analysis results suggest that the input variables selected from the PMI algorithm provide more effective information for the models to measure liquid mass flowrate, while the IIS algorithm provides fewer but more effective variables for the models to predict gas volume fraction.
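A minimal sketch of mutual-information-based input ranking in the spirit of the PMI step described above (full PMI additionally conditions on already-selected inputs; this simplified version only ranks candidates, and the data are invented):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVR

rng = np.random.default_rng(2)

# Candidate inputs: columns 0 and 1 informative, 2 and 3 irrelevant.
X = rng.normal(size=(300, 4))
y = 2.0 * X[:, 0] + np.sin(3.0 * X[:, 1]) + 0.1 * rng.normal(size=300)

# Rank candidates by estimated mutual information with the output.
mi = mutual_info_regression(X, y, random_state=0)
selected = np.argsort(mi)[::-1][:2]          # keep the two most informative
print("MI scores:", np.round(mi, 3), "selected columns:", selected)

# Build the data-driven model (SVM regression) on the selected subset only.
model = SVR().fit(X[:, selected], y)
print("training R^2:", round(model.score(X[:, selected], y), 3))
```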
Wu, Siqi; Joseph, Antony; Hammonds, Ann S; Celniker, Susan E; Yu, Bin; Frise, Erwin
2016-04-19
Spatial gene expression patterns enable the detection of local covariability and are extremely useful for identifying local gene interactions during normal development. The abundance of spatial expression data in recent years has led to the modeling and analysis of regulatory networks. The inherent complexity of such data makes it a challenge to extract biological information. We developed staNMF, a method that combines a scalable implementation of nonnegative matrix factorization (NMF) with a new stability-driven model selection criterion. When applied to a set of Drosophila early embryonic spatial gene expression images, one of the largest datasets of its kind, staNMF identified 21 principal patterns (PP). Providing a compact yet biologically interpretable representation of Drosophila expression patterns, PP are comparable to a fate map generated experimentally by laser ablation and show exceptional promise as a data-driven alternative to manual annotations. Our analysis mapped genes to cell-fate programs and assigned putative biological roles to uncharacterized genes. Finally, we used the PP to generate local transcription factor regulatory networks. Spatially local correlation networks were constructed for six PP that span along the embryonic anterior-posterior axis. Using a two-tail 5% cutoff on correlation, we reproduced 10 of the 11 links in the well-studied gap gene network. The performance of PP with the Drosophila data suggests that staNMF provides informative decompositions and constitutes a useful computational lens through which to extract biological insight from complex and often noisy gene expression data.
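A hedged sketch of the stability-driven idea behind staNMF: fit NMF repeatedly at each candidate rank with different initializations and prefer ranks whose learned patterns are reproducible. The stability score below is a simple correlation-matching heuristic, not the paper's exact criterion, and the data are synthetic:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)

# Illustrative nonnegative data generated from 3 underlying parts.
W0, H0 = rng.random((100, 3)), rng.random((3, 50))
X = W0 @ H0 + 0.01 * rng.random((100, 50))

def dictionary(X, rank, seed):
    """One NMF fit; rows of components_ are the learned patterns."""
    return NMF(n_components=rank, init="random", random_state=seed,
               max_iter=500).fit(X).components_

def stability(X, rank, runs=5):
    """Mean best-match correlation between patterns from repeated fits."""
    dicts = [dictionary(X, rank, seed) for seed in range(runs)]
    scores = []
    for i in range(runs):
        for j in range(i + 1, runs):
            C = np.corrcoef(dicts[i], dicts[j])[:rank, rank:]
            scores.append(np.abs(C).max(axis=1).mean())  # greedy matching
    return float(np.mean(scores))

for rank in (2, 3, 4, 5):
    print(rank, round(stability(X, rank), 3))  # most stable rank ~ true rank
```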
Calculation and analysis of cross-sections for p+184W reactions up to 200 MeV
NASA Astrophysics Data System (ADS)
Sun, Jian-Ping; Zhang, Zheng-Jun; Han, Yin-Lu
2015-08-01
A set of optimal proton optical potential parameters for p+184W reactions is obtained at incident proton energies up to 250 MeV. Based on these parameters, the reaction cross-sections, elastic scattering angular distributions, energy spectra and double-differential cross-sections of proton-induced reactions on 184W are calculated and analyzed by using theoretical models which integrate the optical model, distorted-wave Born approximation theory, intra-nuclear cascade model, exciton model, Hauser-Feshbach theory and evaporation model. The calculated results are compared with existing experimental data and good agreement is achieved. Supported by the National Basic Research Program of China, Technology Research of Accelerator Driven Sub-critical System for Nuclear Waste Transmutation (2007CB209903) and the Strategic Priority Research Program of the Chinese Academy of Sciences, Thorium Molten Salt Reactor Nuclear Energy System (XDA02010100)
Performance Analysis of a Ring Current Model Driven by Global MHD
NASA Astrophysics Data System (ADS)
Falasca, A.; Keller, K. A.; Fok, M.; Hesse, M.; Gombosi, T.
2003-12-01
Effectively modeling the high-energy particles in Earth's inner magnetosphere has the potential to improve safety in both manned and unmanned spacecraft. One model of this environment is the Fok Ring Current Model. This model can utilize as inputs both solar wind data, and empirical ionospheric electric field and magnetic field models. Alternatively, we have a procedure which allows the model to be driven by outputs from the BATS-R-US global MHD model. By using in-situ satellite data we will compare the predictive capability of this model in its original stand-alone form, to that of the model when driven by the BATS-R-US Global Magnetosphere Model. As a basis for comparison we use the April 2002 and May 2003 storms where suitable LANL geosynchronous data are available.
Harris, Eric S J; Erickson, Sean D; Tolopko, Andrew N; Cao, Shugeng; Craycroft, Jane A; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E; Eisenberg, David M
2011-05-17
Ethnobotanically driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically driven natural product collection and drug-discovery programs. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Harris, Eric S. J.; Erickson, Sean D.; Tolopko, Andrew N.; Cao, Shugeng; Craycroft, Jane A.; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E.; Eisenberg, David M.
2011-01-01
Aim of the study. Ethnobotanically-driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine-Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. Materials and Methods. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. Results. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. Conclusions. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically-driven natural product collection and drug-discovery programs. PMID:21420479
Allen, Victoria W; Shirasu-Hiza, Mimi
2018-01-01
Although programmed grooming is pervasive, its control is poorly understood. We addressed this gap by developing a high-throughput platform that allows long-term detection of grooming in Drosophila melanogaster. In our method, a k-nearest neighbors algorithm automatically classifies fly behavior and finds grooming events with over 90% accuracy in diverse genotypes. Our data show that flies spend ~13% of their waking time grooming, driven largely by two major internal programs. One of these programs regulates the timing of grooming and involves the core circadian clock components cycle, clock, and period. The second program regulates the duration of grooming and, while dependent on cycle and clock, appears to be independent of period. This emerging dual-control model, in which one program controls timing and another controls duration, resembles the two-process regulatory model of sleep. Together, our quantitative approach presents the opportunity for further dissection of mechanisms controlling long-term grooming in Drosophila. PMID:29485401
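A minimal illustration of the classification step described above: a k-nearest neighbors model labels frames as grooming or not from simple per-frame movement features. The features and data are invented placeholders, not the authors' tracking pipeline:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)

# Invented per-frame features (e.g. centroid speed, leg-motion energy).
n = 1000
grooming = rng.random(n) < 0.13   # ~13% of frames are grooming frames
speed = np.where(grooming, rng.normal(0.2, 0.1, n), rng.normal(1.0, 0.3, n))
leg_energy = np.where(grooming, rng.normal(1.5, 0.3, n), rng.normal(0.4, 0.2, n))
X = np.column_stack([speed, leg_energy])

# Train a k-nearest neighbors classifier to label frames as grooming or not.
X_tr, X_te, y_tr, y_te = train_test_split(X, grooming, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```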
Kindt, Merel; van den Hout, Marcel; Arntz, Arnoud; Drost, Jolijn
2008-12-01
Ehlers and Clark [(2000). A cognitive model of posttraumatic stress disorder. Behaviour Research and Therapy, 38, 319-345] propose that a predominance of data-driven processing during the trauma predicts subsequent PTSD. We wondered whether, apart from data-driven encoding, sustained data-driven processing after the trauma is also crucial for the development of PTSD. Both hypotheses were tested in two analogue experiments. Experiment 1 demonstrated that relative to conceptually-driven processing (n=20), data-driven processing after the film (n=14), resulted in more intrusions. Experiment 2 demonstrated that relative to the neutral condition (n=24) and the data-driven encoding condition (n=24), conceptual encoding (n=25) reduced suppression of intrusions and a trend emerged for memory fragmentation. The difference between the two encoding styles was due to the beneficial effect of induced conceptual encoding and not to the detrimental effect of data-driven encoding. The data support the viability of the distinction between data-driven/conceptually-driven processing for the understanding of the development of PTSD.
ERIC Educational Resources Information Center
Marsh, Julie A.; McCombs, Jennifer Sloan; Martorell, Francisco
2010-01-01
This article examines the convergence of two popular school improvement policies: instructional coaching and data-driven decision making (DDDM). Drawing on a mixed methods study of a statewide reading coach program in Florida middle schools, the article examines how coaches support DDDM and how this support relates to student and teacher outcomes.…
Parameterized data-driven fuzzy model based optimal control of a semi-batch reactor.
Kamesh, Reddi; Rani, K Yamuna
2016-09-01
A parameterized data-driven fuzzy (PDDF) model structure is proposed for semi-batch processes, and its application for optimal control is illustrated. The orthonormally parameterized input trajectories, initial states and process parameters are the inputs to the model, which predicts the output trajectories in terms of Fourier coefficients. Fuzzy rules are formulated based on the signs of a linear data-driven model, while the defuzzification step incorporates a linear regression model to shift the domain from input to output domain. The fuzzy model is employed to formulate an optimal control problem for single-rate as well as multi-rate systems. A simulation study on a multivariable semi-batch reactor system reveals that the proposed PDDF modeling approach is capable of capturing the nonlinear and time-varying behavior inherent in the semi-batch system fairly accurately, and the results of operating trajectory optimization using the proposed model are found to be comparable to the results obtained using the exact first-principles model, and are also found to be comparable to or better than parameterized data-driven artificial neural network model based optimization results. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
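To illustrate the orthonormal trajectory parameterization idea (a generic sketch, not the paper's exact basis or parameters): a batch input profile u(t) is compressed into a handful of coefficients on an orthonormal basis, which then serve as low-dimensional model inputs or decision variables.

```python
import numpy as np

# Time grid over one (normalized) batch run.
t = np.linspace(0.0, 1.0, 201)
dt = t[1] - t[0]

def basis(t, n_terms):
    """Orthonormal cosine basis on [0, 1]: 1, sqrt(2)cos(k*pi*t), k=1,2,..."""
    cols = [np.ones_like(t)]
    for k in range(1, n_terms):
        cols.append(np.sqrt(2.0) * np.cos(np.pi * k * t))
    return np.column_stack(cols)

# An example input trajectory u(t) to be compressed into a few coefficients.
u = np.clip(1.0 - 2.0 * t, 0.0, None) + 0.3 * np.sin(6.0 * t)

Phi = basis(t, n_terms=6)
coeffs = Phi.T @ u * dt          # project u onto the basis (simple quadrature)
u_hat = Phi @ coeffs             # low-order reconstruction from 6 numbers
print("max reconstruction error:", float(np.max(np.abs(u - u_hat))))
```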
Solving Partial Differential Equations in a data-driven multiprocessor environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaudiot, J.L.; Lin, C.M.; Hosseiniyar, M.
1988-12-31
Partial differential equations can be found in a host of engineering and scientific problems. The emergence of new parallel architectures has spurred research in the definition of parallel PDE solvers. Concurrently, highly programmable systems such as data-flow architectures have been proposed for the exploitation of large-scale parallelism. The implementation of some Partial Differential Equation solvers (such as the Jacobi method) on a tagged-token data-flow graph is demonstrated here. Asynchronous methods (chaotic relaxation) are studied and new scheduling approaches (the Token No-Labeling scheme) are introduced in order to support the implementation of the asynchronous methods in a data-driven environment. New high-level data-flow language program constructs are introduced in order to handle chaotic operations. Finally, the performance of the program graphs is demonstrated by a deterministic simulation of a message-passing data-flow multiprocessor. An analysis of the overhead in the data-flow graphs is undertaken to demonstrate the limits of parallel operations in dataflow PDE program graphs.
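The Jacobi method mentioned above is naturally parallel because each grid point's update depends only on its neighbours' previous values. A serial sketch of one such solve (2-D Laplace equation; the grid size and tolerance are illustrative):

```python
import numpy as np

# 2-D Laplace equation on a square grid; top boundary held at 1, others at 0.
n = 50
u = np.zeros((n, n))
u[0, :] = 1.0

for it in range(10_000):
    # Jacobi update: every interior point becomes the mean of its four
    # neighbours' *previous* values, so all points may update concurrently --
    # the property that maps naturally onto a data-driven architecture.
    u_new = u.copy()
    u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                u[1:-1, :-2] + u[1:-1, 2:])
    if np.max(np.abs(u_new - u)) < 1e-6:
        break
    u = u_new

print(it, "iterations; centre value:", round(u[n // 2, n // 2], 4))
```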
[Development of a program theory as a basis for the evaluation of a dementia special care unit].
Adlbrecht, Laura; Bartholomeyczik, Sabine; Mayer, Hanna
2018-06-01
Background: An existing dementia special care unit was to be evaluated. In order to build a sound foundation for the evaluation, a deep theoretical understanding of the implemented intervention is needed, which had not yet been explicated. One possibility to achieve this is the development of a program theory. Aim: The aim is to present a method to develop a program theory for the existing living and care concept of the dementia special care unit, which is used in a larger project for a theory-driven evaluation of the concept. Method: The evaluation is embedded in the framework of van Belle et al. (2010), and an action model and a change model (Chen, 2015) are created. For the specification of the change model, contribution analysis (Mayne, 2011) is applied. Data were collected in workshops with the developers and the nurses of the dementia special care unit, and a literature search concerning interventions and outcomes was carried out. The results were synthesized in a consensus workshop. Results: The action model describes the interventions of the dementia special care unit, the implementers, the organization and the context. The change model comprises the mechanisms through which interventions achieve outcomes. Conclusions: The results of the program theory can be employed to choose data collection methods and instruments for the evaluation. On the basis of the results of the evaluation, the program theory can be refined and adapted.
Event-driven simulation in SELMON: An overview of EDSE
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.
1992-01-01
EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
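A minimal sketch of the event-driven simulation pattern described above: a time-ordered event queue in which consuming one event may create future events. All names are illustrative; this is not the EDSE code.

```python
import heapq
import itertools

class EventSimulator:
    """Consume time-stamped events in order; handlers may create new events."""

    def __init__(self):
        self.queue = []
        self.now = 0.0
        self._seq = itertools.count()   # tie-breaker for equal timestamps

    def schedule(self, time, handler, *args):
        heapq.heappush(self.queue, (time, next(self._seq), handler, args))

    def run(self, until):
        while self.queue and self.queue[0][0] <= until:
            self.now, _, handler, args = heapq.heappop(self.queue)
            handler(self, *args)        # consuming an event may create events

def sensor_reading(sim, value):
    print(f"t={sim.now:.1f}: reading {value}")
    if value > 10:                      # model predicts a follow-on event
        sim.schedule(sim.now + 1.0, sensor_reading, value - 5)

sim = EventSimulator()
sim.schedule(0.0, sensor_reading, 12)
sim.run(until=5.0)
```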
NASA Astrophysics Data System (ADS)
Daniluk, Andrzej
2010-03-01
Scientific computing is the field of study concerned with constructing mathematical models, numerical solution techniques and with using computers to analyse and solve scientific and engineering problems. Model-Driven Development (MDD) has been proposed as a means to support the software development process through the use of a model-centric approach. This paper surveys the core MDD technology that was used to develop an application that allows computation of the RHEED intensities dynamically for a disordered surface. New version program summaryProgram title: RHEED1DProcess Catalogue identifier: ADUY_v4_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADUY_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 31 971 No. of bytes in distributed program, including test data, etc.: 3 039 820 Distribution format: tar.gz Programming language: Embarcadero C++ Builder Computer: Intel Core Duo-based PC Operating system: Windows XP, Vista, 7 RAM: more than 1 GB Classification: 4.3, 7.2, 6.2, 8, 14 Catalogue identifier of previous version: ADUY_v3_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2394 Does the new version supersede the previous version?: No Nature of problem: An application that implements numerical simulations should be constructed according to the CSFAR rules: clear and well-documented, simple, fast, accurate, and robust. A clearly written, externally and internally documented program is much easier to understand and modify. A simple program is much less prone to error and is more easily modified than one that is complicated. Simplicity and clarity also help make the program flexible. Making the program fast has economic benefits. It also allows flexibility because some of the features that make a program efficient can be traded off for greater accuracy. Making the program fast also has the benefit of allowing longer calculations with better resolution. The compromise between speed and accuracy has always posted one of the most troublesome challenges for the programmer. Almost all advances in numerical analysis have come about trying to reach these twin goals. Change in the basic algorithms will give greater improvements in accuracy and speed than using special numerical tricks or changing programming language. A robust program works correctly over a broad spectrum of input data. Solution method: The computational model of the program is based on the use of a dynamical diffraction theory in which the electrons are taken to be diffracted by a potential, which is periodic in the dimension perpendicular to the surface. In the case of a disordered surface we can use the proportional model of the scattering potential, in which the potential of a partially filled layer is taken to be the product of the coverage of this layer and the potential of a fully filled layer: U(θ,z)=∑ θ(t/τ)U(1,z), where U(1,z) stands for the potential for the full nth layer, and U(θ,z) the potential of the growing layer. Reasons for new version: Responding to the user feedback the RHEEDGr_09 program has been upgraded to a standard that allows carrying out computations of the RHEED intensities for a disordered surface. Also, functionality and documentation of the program have been improved. 
Summary of revisions: The logical structure of the Platform-Specific Model of the RHEEDGr_09 program has been modified according to the scheme shown in Fig. 1*. The class diagram in Fig. 1* is a static view of the main platform-specific elements of the RHEED1DProcess architecture. Fig. 2* provides a dynamic view by showing a simplified creation-and-destruction sequence diagram for the process. Fig. 3* shows the RHEED1DProcess use case model. As can be seen in Figs. 2-3*, the RHEED1DProcess has been designed as a slave process that runs as a separate thread inside each transaction generated by the master Growth09 program (see A. Daniluk, Model-Driven Development for scientific computing. Computations of RHEED intensities for a disordered surface, Part II, pii:S0010-4655(09)00386-5). The RHEED1DProcess requires the user to provide the appropriate parameters for the crystal structure under investigation. These parameters are loaded from the parameters.ini file at run-time. Instructions on the preparation of the .ini files can be found in the new distribution. The RHEED1DProcess also requires the user to provide the appropriate values of the layer coverage profiles. These values are loaded at run-time from the CoverageProfiles.dat file (generated by the Growth09 master application). The RHEED1DProcess enables carrying out one-dimensional dynamical calculations for the fcc lattice with a two-atom basis and for the fcc lattice with a one-atom basis; the zeroth Fourier component of the scattering potential in the TRHEED1D::crystPotUg() function can, however, be modified according to users' specific application requirements. * The figures mentioned can be downloaded; see "Supplementary material" below.
Unusual features: The program is distributed in the form of the main projects RHEED1DProcess.cbproj and Graph2D0x.cbproj with associated files, and should be compiled using Embarcadero RAD Studio 2010 along with the Together visual-modelling platform. The program should be compiled with English/USA regional and language options.
Additional comments: This version of the RHEED program is designed to run in conjunction with the GROWTH09 (ADVL_v3_0) program. It does not replace the previous, stand-alone, RHEEDGR-09 (ADUY_v3_0) version.
Running time: The typical running time is machine and user-parameter dependent.
References: [1] OMG, Model Driven Architecture Guide Version 1.0.1, 2003.
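The proportional potential model quoted in the solution method is simple enough to sketch directly. The snippet below is illustrative only: the Gaussian form of the layer potential and all numerical values are assumptions, not the actual RHEED1DProcess potential.

```python
import numpy as np

# Proportional scattering-potential model, U(θ, z) = Σ_n θ_n(t/τ) U_n(1, z):
# the potential of a partially filled layer is its coverage θ_n times the
# potential of the corresponding fully filled layer.
def growing_surface_potential(z, coverages, layer_spacing=1.0, width=0.2):
    U = np.zeros_like(z)
    for n, theta_n in enumerate(coverages):          # θ_n in [0, 1]
        z_n = n * layer_spacing                      # depth of the nth layer
        U_full = np.exp(-((z - z_n) / width) ** 2)   # U_n(1, z), illustrative shape
        U += theta_n * U_full
    return U

z = np.linspace(-1.0, 4.0, 500)
U = growing_surface_potential(z, coverages=[1.0, 1.0, 0.6, 0.1])
```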
Radiation Modeling with Direct Simulation Monte Carlo
NASA Technical Reports Server (NTRS)
Carlson, Ann B.; Hassan, H. A.
1991-01-01
Improvements in the modeling of radiation in low density shock waves with direct simulation Monte Carlo (DSMC) are the subject of this study. A new scheme to determine the relaxation collision numbers for excitation of electronic states is proposed. This scheme attempts to move the DSMC programs toward a more detailed modeling of the physics and more reliance on available rate data. The new method is compared with the current modeling technique and both techniques are compared with available experimental data. The differences in the results are evaluated. The test case is based on experimental measurements from the AVCO-Everett Research Laboratory electric arc-driven shock tube of a normal shock wave in air at 10 km/s and 0.1 Torr. The new method agrees with the available data as well as the results from the earlier scheme do, and is more easily extrapolated to different flow conditions.
Identifying Seizure Onset Zone From the Causal Connectivity Inferred Using Directed Information
NASA Astrophysics Data System (ADS)
Malladi, Rakesh; Kalamangalam, Giridhar; Tandon, Nitin; Aazhang, Behnaam
2016-10-01
In this paper, we developed a model-based and a data-driven estimator for directed information (DI) to infer the causal connectivity graph between electrocorticographic (ECoG) signals recorded from the brain and to identify the seizure onset zone (SOZ) in epileptic patients. Directed information, an information theoretic quantity, is a general metric to infer causal connectivity between time series and is not restricted to a particular class of models, unlike the popular metrics based on Granger causality or transfer entropy. The proposed estimators are shown to be almost surely convergent. Causal connectivity between ECoG electrodes in five epileptic patients is inferred using the proposed DI estimators, after validating their performance on simulated data. We then propose a model-based and a data-driven SOZ identification algorithm to identify the SOZ from the causal connectivity inferred using the model-based and data-driven DI estimators, respectively. The data-driven SOZ identification outperforms the model-based SOZ identification algorithm when benchmarked against visual analysis by neurologists, the current clinical gold standard. The causal connectivity analysis presented here is a first step towards developing novel non-surgical treatments for epilepsy.
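As a rough illustration of the plug-in flavor of such estimators (the paper's actual estimators are more general), the sketch below estimates a directed-information-style quantity I(X_{t-1}; Y_t | Y_{t-1}) from empirical frequencies of two binarized time series, under a first-order Markov assumption. The function name and the simulated data are illustrative.

```python
import numpy as np
from collections import Counter

def di_rate_markov1(x, y):
    """Plug-in estimate of I(X_{t-1}; Y_t | Y_{t-1}) for discrete series."""
    triples = Counter(zip(x[:-1], y[1:], y[:-1]))   # (x_prev, y_now, y_prev)
    n = sum(triples.values())
    p = {k: v / n for k, v in triples.items()}
    def marg(keep):                                 # marginalize the triple pmf
        m = Counter()
        for k, v in p.items():
            m[tuple(k[i] for i in keep)] += v
        return m
    p_xz, p_yz, p_z = marg((0, 2)), marg((1, 2)), marg((2,))
    return sum(v * np.log2(v * p_z[(k[2],)]
                           / (p_xz[(k[0], k[2])] * p_yz[(k[1], k[2])]))
               for k, v in p.items())

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1) ^ (rng.random(5000) < 0.1)  # y copies x with 10% noise
print(di_rate_markov1(x, y))  # clearly positive; di_rate_markov1(y, x) is near 0
```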
Overview of ASC Capability Computing System Governance Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott W.
This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.
A realist review of family-based interventions for children of substance abusing parents.
Usher, Amelia M; McShane, Kelly E; Dwyer, Candice
2015-12-18
Millions of children across North America and Europe live in families with alcohol or drug abusing parents. These children are at risk for a number of negative social, emotional and developmental outcomes, including an increased likelihood of developing a substance use disorder later in life. Family-based intervention programs for children with substance abusing parents can yield positive outcomes. This study is a realist review of evaluations of family-based interventions aimed at improving psychosocial outcomes for children of substance abusing parents (COSAPs). The primary objectives were to uncover patterns of contextual factors and mechanisms that generate program outcomes, and advance program theory in this field. Realist review methodology was chosen as the most appropriate method of systematic review because it is a theory-driven approach that seeks to explore mechanisms underlying program effectiveness (or lack thereof). A systematic and comprehensive search of academic and grey literature uncovered 32 documents spanning 7 different intervention programs. Data was extracted from the included documents using abstraction templates designed to code for contexts, mechanisms and outcomes of each program. Two candidate program theories of family addiction were used to guide data analysis: the family disease model and the family prevention model. Data analysis was undertaken by a research team using an iterative process of comparison and checking with original documents to determine patterns within the data. Programs originating in both the family disease model and the family prevention model were uncovered, along with hybrid programs that successfully included components from each candidate program theory. Four demi-regularities were found to account for the effectiveness of programs included in this review: (1) opportunities for positive parent-child interactions, (2) supportive peer-to-peer relationships, (3) the power of knowledge, and (4) engaging hard to reach families using strategies that are responsive to socio-economic needs and matching services to client lived experience. This review yielded new findings that had not otherwise been explored in COSAP program research and are discussed in order to help expand program theory. Implications for practice and evaluation are further discussed.
Combining Model-driven and Schema-based Program Synthesis
NASA Technical Reports Server (NTRS)
Denney, Ewen; Whittle, John
2004-01-01
We describe ongoing work which aims to extend the schema-based program synthesis paradigm with explicit models. In this context, schemas can be considered as model-to-model transformations. The combination of schemas with explicit models offers a number of advantages, namely, that building synthesis systems becomes much easier since the models can be used in verification and in adaptation of the synthesis systems. We illustrate our approach using an example from signal processing.
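The idea that a schema acts as a model-to-model transformation can be shown with a toy example. This is not the authors' synthesis system; the dictionary model representation and the `fir_filter_schema` rule are invented purely for illustration.

```python
# Toy schema-as-transformation: models are plain dictionaries, and a schema
# maps an abstract signal-processing model to a more refined one.
def fir_filter_schema(model):
    """Refine an abstract 'filter' node into an explicit FIR tap model."""
    assert model["kind"] == "filter"
    order = model.get("order", 4)
    return {
        "kind": "fir",
        "taps": order * [1.0 / order],   # moving-average taps as a default
        "input": model["input"],
    }

abstract_model = {"kind": "filter", "order": 8, "input": "sensor_stream"}
refined_model = fir_filter_schema(abstract_model)   # one model-to-model step
```

Because both sides of the transformation are explicit models, each step can in principle be checked or adapted, which is the advantage the abstract claims for combining schemas with explicit models.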
NASA Astrophysics Data System (ADS)
Cantrell, P.; Ewing-Taylor, J.; Crippen, K. J.; Smith, K. D.; Snelson, C. M.
2004-12-01
Education professionals and seismologists under the emerging SUN (Shaking Up Nevada) program are leveraging the existing infrastructure of the real-time Nevada K-12 Seismic Network to provide a unique inquiry based science experience for teachers. The concept and effort are driven by teacher needs and emphasize rigorous content knowledge acquisition coupled with the translation of that knowledge into an integrated seismology based earth sciences curriculum development process. We are developing a pedagogical framework, graduate level coursework, and materials to initiate the SUN model for teacher professional development in an effort to integrate the research benefits of real-time seismic data with science education needs in Nevada. A component of SUN is to evaluate teacher acquisition of qualified seismological and earth science information and pedagogy both in workshops and in the classroom and to assess the impact on student achievement. SUN's mission is to positively impact earth science education practices. With the upcoming EarthScope initiative, the program is timely and will incorporate EarthScope real-time seismic data (USArray) and educational materials in graduate course materials and teacher development programs. A number of schools in Nevada are contributing real-time data from both inexpensive and high-quality seismographs that are integrated with Nevada regional seismic network operations as well as the IRIS DMC. A powerful and unique component of the Nevada technology model is that schools can receive "stable" continuous live data feeds from 100's seismograph stations in Nevada, California and world (including live data from Earthworm systems and the IRIS DMC BUD - Buffer of Uniform Data). Students and teachers see their own networked seismograph station within a global context, as participants in regional and global monitoring. The robust real-time Internet communications protocols invoked in the Nevada network provide for local data acquisition, remote multi-channel data access, local time-series data management, interactive multi-window waveform display and time-series analysis with centralized meta-data control. Formally integrating educational seismology into the K-12 science curriculum with an overall "positive" impact to science education practices necessarily requires a collaborative effort between professional educators and seismologists yet driven exclusively by teacher needs.
PSA: A program to streamline orbit determination for launch support operations
NASA Technical Reports Server (NTRS)
Legerton, V. N.; Mottinger, N. A.
1988-01-01
An interactive, menu-driven computer program was written to streamline the orbit determination process during the critical launch support phase of a mission. Residing on a virtual memory minicomputer, this program retains in-core the quantities needed to obtain a least squares estimate of the spacecraft trajectory, with interactive displays to assist in rapid radiometric data evaluation. Menu-driven displays allow real time filter and data strategy development. Graphical and tabular displays can be sent to a laser printer for analysis without exiting the program. Products generated by this program feed back to the main orbit determination program in order to further refine the estimate of the trajectory. The final estimate provides a spacecraft ephemeris which is transmitted to the mission control center and used for antenna pointing and frequency predict generation by the Deep Space Network. The development and implementation process of this program differs from that used for most other navigation software by allowing the users to check important operating features during development and have changes made as needed.
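The least-squares step at the heart of such a program can be sketched in a few lines. This is a conceptual toy, not PSA's filter: the polynomial trajectory, the noise level, and the variable names are all illustrative assumptions.

```python
import numpy as np

# Toy least-squares estimate: fit polynomial trajectory parameters to noisy
# radiometric-style range observations, then inspect residuals for editing.
t = np.linspace(0.0, 100.0, 50)                  # observation times, s
true_params = np.array([1.0e4, 50.0, -0.2])      # range, rate, half-accel
rng = np.random.default_rng(1)
A = np.vstack([np.ones_like(t), t, t**2]).T      # design matrix
obs = A @ true_params + rng.normal(0.0, 5.0, t.size)

est, *_ = np.linalg.lstsq(A, obs, rcond=None)    # least-squares solution
residuals = obs - A @ est                        # displayed for data evaluation
```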
NASA Astrophysics Data System (ADS)
Garno, Joshua; Ouellet, Frederick; Koneru, Rahul; Balachandar, Sivaramakrishnan; Rollin, Bertrand
2017-11-01
An analytic model to describe the hydrodynamic forces on an explosively driven particle is not currently available. The Maxey-Riley-Gatignol (MRG) particle force equation generalized for compressible flows is well studied in shock-tube applications, and captures the evolution of particle force extracted from controlled shock-tube experiments. In these experiments only the shock-particle interaction was examined, and the effects of the contact line were not investigated. In the present work, the predictive capability of this model is considered for the case where a particle is explosively ejected from a rigid barrel into ambient air. Particle trajectory information extracted from simulations is compared with experimental data. This configuration ensures that both the shock and contact produced by the detonation will influence the motion of the particle. The simulations are carried out using a finite-volume Euler-Lagrange code with the JWL equation of state to handle the explosive products. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.
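For reference, the JWL equation of state named above has a standard closed form, sketched below. The constants shown are commonly quoted TNT-like values used here only for illustration; they are not necessarily the parameters used in the cited simulations.

```python
import numpy as np

# Jones-Wilkins-Lee (JWL) pressure from relative volume V = v/v0 and
# internal energy E per unit initial volume:
#   p = A(1 - ω/(R1 V)) e^(-R1 V) + B(1 - ω/(R2 V)) e^(-R2 V) + ωE/V
def jwl_pressure(V, E, A=3.712e11, B=3.231e9, R1=4.15, R2=0.95, omega=0.30):
    return (A * (1.0 - omega / (R1 * V)) * np.exp(-R1 * V)
            + B * (1.0 - omega / (R2 * V)) * np.exp(-R2 * V)
            + omega * E / V)

p = jwl_pressure(V=1.0, E=7.0e9)   # Pa, for illustrative TNT-like inputs
```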
Data-driven model-independent searches for long-lived particles at the LHC
NASA Astrophysics Data System (ADS)
Coccaro, Andrea; Curtin, David; Lubatti, H. J.; Russell, Heather; Shelton, Jessie
2016-12-01
Neutral long-lived particles (LLPs) are highly motivated by many beyond the Standard Model scenarios, such as theories of supersymmetry, baryogenesis, and neutral naturalness, and present both tremendous discovery opportunities and experimental challenges for the LHC. A major bottleneck for current LLP searches is the prediction of Standard Model backgrounds, which are often impossible to simulate accurately. In this paper, we propose a general strategy for obtaining differential, data-driven background estimates in LLP searches, thereby notably extending the range of LLP masses and lifetimes that can be discovered at the LHC. We focus on LLPs decaying in the ATLAS muon system, where triggers providing both signal and control samples are available at LHC run 2. While many existing searches require two displaced decays, a detailed knowledge of backgrounds will allow for very inclusive searches that require just one detected LLP decay. As we demonstrate for the h → XX signal model of LLP pair production in exotic Higgs decays, this results in dramatic sensitivity improvements for proper lifetimes ≳ 10 m. In theories of neutral naturalness, this extends reach to glueball masses far below the bb̄ threshold. Our strategy readily generalizes to other signal models and other detector subsystems. This framework therefore lends itself to the development of a systematic, model-independent LLP search program, in analogy to the highly successful simplified-model framework of prompt searches.
Long, Tammy M.; Ebert-May, Diane
2014-01-01
Graduate teaching assistants (TAs) are increasingly responsible for instruction in undergraduate science, technology, engineering, and mathematics (STEM) courses. Various professional development (PD) programs have been developed and implemented to prepare TAs for this role, but data about effectiveness are lacking and are derived almost exclusively from self-reported surveys. In this study, we describe the design of a reformed PD (RPD) model and apply Kirkpatrick's Evaluation Framework to evaluate multiple outcomes of TA PD before, during, and after implementing RPD. This framework allows evaluation that includes both direct measures and self-reported data. In RPD, TAs created and aligned learning objectives and assessments and incorporated more learner-centered instructional practices in their teaching. However, these data are inconsistent with TAs’ self-reported perceptions about RPD and suggest that single measures are insufficient to evaluate TA PD programs. PMID:26086654
Model-Driven Energy Intelligence
2015-03-01
[Report front-matter fragments only: the study assesses a building information model (BIM) for operations and estimates the potential impact on energy performance at Fort Jackson. Subject terms: Building Information Modeling (BIM), Energy, ECMs, monitoring. Abbreviations: AHU, Air Handling Unit; API, Application Programming Interface; BIM, building information model; BLCC, Building Life Cycle Cost.]
A Distributed Laboratory for Event-Driven Coastal Prediction and Hazard Planning
NASA Astrophysics Data System (ADS)
Bogden, P.; Allen, G.; MacLaren, J.; Creager, G. J.; Flournoy, L.; Sheng, Y. P.; Graber, H.; Graves, S.; Conover, H.; Luettich, R.; Perrie, W.; Ramakrishnan, L.; Reed, D. A.; Wang, H. V.
2006-12-01
The 2005 Atlantic hurricane season was the most active in recorded history. Collectively, 2005 hurricanes caused more than 2,280 deaths and record damages of over 100 billion dollars. Of the storms that made landfall, Dennis, Emily, Katrina, Rita, and Wilma caused most of the destruction. Accurate predictions of storm-driven surge, wave height, and inundation can save lives and help keep recovery costs down, provided the information gets to emergency response managers in time. The information must be available well in advance of landfall so that responders can weigh the costs of unnecessary evacuation against the costs of inadequate preparation. The SURA Coastal Ocean Observing and Prediction (SCOOP) Program is a multi-institution collaboration implementing a modular, distributed service-oriented architecture for real time prediction and visualization of the impacts of extreme atmospheric events. The modular infrastructure enables real-time prediction of multi-scale, multi-model, dynamic, data-driven applications. SURA institutions are working together to create a virtual and distributed laboratory integrating coastal models, simulation data, and observations with computational resources and high speed networks. The loosely coupled architecture allows teams of computer and coastal scientists at multiple institutions to innovate complex system components that are interconnected with relatively stable interfaces. The operational system standardizes at the interface level to enable substantial innovation by complementary communities of coastal and computer scientists. This architectural philosophy solves a long-standing problem associated with the transition from research to operations. The SCOOP Program thereby implements a prototype laboratory consistent with the vision of a national, multi-agency initiative called the Integrated Ocean Observing System (IOOS). Several service-oriented components of the SCOOP enterprise architecture have already been designed and implemented, including data archive and transport services, metadata registry and retrieval (catalog), resource management, and portal interfaces. SCOOP partners are integrating these at the service level and implementing reconfigurable workflows for several kinds of user scenarios, and are working with resource providers to prototype new policies and technologies for on-demand computing.
Real-time Specification and Forecasting for HF Links During Disturbed Conditions
NASA Astrophysics Data System (ADS)
Rice, D.; Hunsuker, R. D.; Eccles, J.; Sojka, J. J.
2004-05-01
The HF communications community has long been dependent on climatological ionosphere descriptions to support HF propagation programs. Additionally, these programs include the solar zenith angle and frequency-squared variation of HF absorption but do not include space weather effects due to solar x-ray events and sporadic E layers. Real-time specification and forecasting of HF links is desired in programs such as the Operational Space Environment Network Display (OpSend). The creation of HF illumination maps requires proper specification of the D, E, and F regions of the ionosphere. We present results and validation efforts of the Data-Driven D region (DDDR) model of HF absorption for mid-latitude HF paths. The DDDR programs assimilate real-time data such as the NOAA/GOES 12 x-ray measurements to produce space-weather-related absorption predictions. The data-driven model is being validated with observations from the HF Investigation of D-Region Ionospheric Variation Experiment (HIDIVE). Monitoring of standard time-frequency HF stations has been employed for the past three decades. The passive monitoring technique used in HIDIVE was mainly applied to studies of the high-latitude and equatorial ionosphere; thus long-term, quantitative data on the mid-latitude ionosphere are difficult to find in the archival literature. HIDIVE is a careful examination of long-term observations of HF absorption to study seasonal variation and space weather events. Simultaneous continuous measurements of NOAA/GOES 12 solar x-ray flux and calibrated HF signal strength were initiated in December 2002 to provide validation data for the DDDR model. Continuous recording of transmissions of standard time-frequency stations (WWV and WWVH) over the range of 2.5 to 20.0 MHz and 5-minute averages of 1.0 to 8.0 nm solar x-ray flux have been studied for 35 solar flares ranging from Class C to Class X from March through August 2003 during the descending phase of solar cycle 23. The monitoring stations are located at Providence, Utah and at Klamath Falls, Oregon and continuous recordings are planned through August 2005. In particular, we will examine the extreme solar events of October-November 2003 as an example of the societal impact of space weather. This is timely because of renewed interest in the use of HF circuits by the military and by commercial airlines.
Implementation and evaluation of a monthly water balance model over the US on an 800 m grid
Hostetler, Steven W.; Alder, Jay R.
2016-01-01
We simulate the 1950–2010 water balance for the conterminous U.S. (CONUS) with a monthly water balance model (MWBM) using the 800 m Parameter-elevation Regression on Independent Slopes Model (PRISM) data set as model input. We employed observed snow and streamflow data sets to guide modification of the snow and potential evapotranspiration components in the default model and to evaluate model performance. Based on various metrics and sensitivity tests, the modified model yields reasonably good simulations of seasonal snowpack in the West (range of bias of ±50 mm at 68% of 713 SNOTEL sites), the gradients and magnitudes of actual evapotranspiration, and runoff (median correlation of 0.83 and median Nash-Sutcliffe efficiency of 0.6 between simulated and observed annual time series at 1427 USGS gage sites). The model generally performs well along the Pacific Coast, the high elevations of the Basin and Range and over the Midwest and East, but not as well over the dry areas of the Southwest and upper Plains regions due, in part, to the apportioning of direct versus delayed runoff. Sensitivity testing and application of the MWBM to simulate the future water balance at four National Parks when driven by 30 climate models from the Climate Model Intercomparison Program Phase 5 (CMIP5) demonstrate that the model is useful for evaluating first-order, climate driven hydrologic change on monthly and annual time scales.
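A minimal sketch of one cell-month of a Thornthwaite-style monthly water balance is shown below, assuming the basic structure such models share (the cited MWBM is more detailed). All parameter values, the degree-day melt term, and the function name are illustrative, not the calibrated CONUS values.

```python
# One cell-month of a simple monthly water balance: precipitation falls as
# snow or rain by temperature, a degree-day term melts the snowpack,
# evapotranspiration is limited by available water, and soil water above
# capacity becomes runoff.
def mwbm_step(precip_mm, temp_c, pet_mm, soil_mm, snow_mm,
              capacity_mm=150.0, ddf_mm_per_deg=15.0, t_snow=0.0):
    if temp_c <= t_snow:
        snow_mm += precip_mm                     # precipitation falls as snow
        water_in = 0.0
    else:
        melt = min(snow_mm, ddf_mm_per_deg * (temp_c - t_snow))
        snow_mm -= melt                          # degree-day snowmelt
        water_in = precip_mm + melt
    aet = min(pet_mm, soil_mm + water_in)        # ET limited by available water
    soil_mm += water_in - aet
    runoff = max(0.0, soil_mm - capacity_mm)     # surplus becomes runoff
    soil_mm = min(soil_mm, capacity_mm)
    return soil_mm, snow_mm, runoff, aet

soil, snow, runoff, aet = mwbm_step(80.0, 4.0, 30.0, soil_mm=50.0, snow_mm=120.0)
```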
ERIC Educational Resources Information Center
Stevenson, Joseph Martin; Payne, Alfredda Hunt
2016-01-01
This chapter describes how data analysis and data-driven decision making were critical for designing, developing, and assessing a new academic program. The authors--one, the program's founder; the other, an alumna--begin by highlighting some of the elements in the program's incubation and, subsequently, describe some of the components for data…
The wind of the M-type AGB star RT Virginis probed by VLTI/MIDI
NASA Astrophysics Data System (ADS)
Sacuto, S.; Ramstedt, S.; Höfner, S.; Olofsson, H.; Bladh, S.; Eriksson, K.; Aringer, B.; Klotz, D.; Maercker, M.
2013-03-01
Aims: We study the circumstellar environment of the M-type AGB star RT Vir using mid-infrared high spatial resolution observations from the ESO-VLTI focal instrument MIDI. The aim of this study is to provide observational constraints on the theoretical prediction that the winds of M-type AGB objects can be driven by photon scattering on iron-free silicate grains located in the close environment (about 2 to 3 stellar radii) of the star. Methods: We interpreted spectro-interferometric data, first using wavelength-dependent geometric models. We then used a self-consistent dynamic model atmosphere containing a time-dependent description of grain growth for pure forsterite dust particles to reproduce the photometric, spectrometric, and interferometric measurements of RT Vir. Since the hydrodynamic computation needs stellar parameters as input, a considerable effort was first made to determine these parameters. Results: MIDI differential phases reveal the presence of an asymmetry in the stellar vicinity. Results from the geometrical modeling give us clues to the presence of aluminum and silicate dust in the close circumstellar environment (<5 stellar radii). Comparison between the spectro-interferometric data and a self-consistent dust-driven wind model reveals that silicate dust has to be present in the region between 2 to 3 stellar radii to reproduce the 59 and 63 m baseline visibility measurements around 9.8 μm. This gives additional observational evidence in favor of winds driven by photon scattering on iron-free silicate grains located in the close vicinity of an M-type star. However, other sources of opacity are clearly missing if the 10-13 μm visibility measurements are to be reproduced for all baselines. Conclusions: This study is a first attempt to understand the wind mechanism of M-type AGB stars by comparing photometric, spectrometric, and interferometric measurements with state-of-the-art, self-consistent dust-driven wind models. The agreement of the dynamic model atmosphere with interferometric measurements in the 8-10 μm spectral region gives additional observational evidence that the winds of M-type stars can be driven by photon scattering on iron-free silicate grains. Finally, a larger statistical study and progress in advanced self-consistent 3D modeling are still required to solve the remaining problems. Based on observations made with the Very Large Telescope Interferometer at Paranal Observatory under programs 083.D-0234 and 086.D-0737 (Open Time Observations).
Data Driven Program Planning for GIS Instruction
ERIC Educational Resources Information Center
Scarletto, Edith
2013-01-01
This study used both focus groups (qualitative) and survey data (quantitative) to develop and expand an instruction program for GIS services. It examined the needs and preferences faculty and graduate students have for learning about GIS applications for teaching and research. While faculty preferred in person workshops and graduate students…
The Future of Family Engagement in Residential Care Settings
ERIC Educational Resources Information Center
Affronti, Melissa L.; Levison-Johnson, Jody
2009-01-01
Residential programs for children and youth are increasingly implementing engagement strategies to promote family-centered and family-driven models of care (Leichtman, 2008). The practice of engagement is a fairly new area of research, especially in residential care. Driven by their goal to increase the use of state-of-the-art family engagement…
Closing the Loop: How We Better Serve Our Students through a Comprehensive Assessment Process
ERIC Educational Resources Information Center
Arcario, Paul; Eynon, Bret; Klages, Marisa; Polnariev, Bernard A.
2013-01-01
Outcomes assessment is often driven by demands for accountability. LaGuardia Community College's outcomes assessment model has advanced student learning, shaped academic program development, and created an impressive culture of faculty-driven assessment. Our inquiry-based approach uses ePortfolios for collection of student work and demonstrates…
ERIC Educational Resources Information Center
Streifer, Philip A.; Schumann, Jeffrey A.
2005-01-01
The implementation of No Child Left Behind (NCLB) presents important challenges for schools across the nation to identify problems that lead to poor performance. Yet schools must intervene with instructional programs that can make a difference and evaluate the effectiveness of such programs. New advances in artificial intelligence (AI) data-mining…
Theory-Based Stakeholder Evaluation
ERIC Educational Resources Information Center
Hansen, Morten Balle; Vedung, Evert
2010-01-01
This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…
Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam
2015-04-01
We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade ≥ 3) and ED (Grade ≥ 1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) models and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED models. As a proof-of-concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
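The LKB NTCP model named above has a compact standard form: the dose-volume histogram is reduced to a generalized equivalent uniform dose (gEUD), which a probit curve maps to a complication probability. The sketch below follows that standard form; treating a genetic variable as a dose-modifying factor on TD50 is one common generalization and is only an assumption here, as are all parameter values.

```python
import numpy as np
from scipy.stats import norm

# Lyman-Kutcher-Burman NTCP:
#   gEUD = (Σ_i v_i D_i^(1/n))^n,  NTCP = Φ((gEUD - TD50) / (m · TD50))
def lkb_ntcp(doses_gy, volumes, n=0.1, m=0.15, td50=80.0, snp_dmf=1.0):
    v = np.asarray(volumes, float)
    v = v / v.sum()                                     # fractional volumes
    geud = np.sum(v * np.asarray(doses_gy) ** (1.0 / n)) ** n
    t = (geud - snp_dmf * td50) / (m * snp_dmf * td50)  # snp_dmf: illustrative
    return norm.cdf(t)                                  # dose-modifying factor

ntcp = lkb_ntcp([70.0, 60.0, 40.0], [0.2, 0.3, 0.5], snp_dmf=0.9)  # risk allele
```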
NASA Astrophysics Data System (ADS)
Worssam, J. B.
2017-12-01
Field research is finally within classroom walls: data driven and hands on, with students using a series of electronic projects to show evidence of scientific mentor collaboration. You do not want to miss this session, in which I will be sharing the steps to develop an interactive mentor program between scientists in the field and students in the classroom. Using Next Generation Science Standards and Common Core language skills, you will be able to blend scientific exploration with scientific writing and communication skills. Learn how to make connections in your own community with STEM businesses, agencies, and organizations. Learn how to connect with scientists across the globe to make your classroom instruction interactive and live for all students. Scientists, you too will want to participate: see how you can reach out and be a part of the K-12 educational system with students learning about YOUR science, a great component for NSF grants! "Scientists in the Classroom" is a model program for all, bringing real time science, data, and knowledge into the classroom.
Determination of the Parameter Sets for the Best Performance of IPS-driven ENLIL Model
NASA Astrophysics Data System (ADS)
Yun, Jongyeon; Choi, Kyu-Cheol; Yi, Jonghyuk; Kim, Jaehun; Odstrcil, Dusan
2016-12-01
The interplanetary scintillation-driven (IPS-driven) ENLIL model was jointly developed by the University of California, San Diego (UCSD) and the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). The model has been in operation at the Korean Space Weather Center (KSWC) since 2014. The IPS-driven ENLIL model has a variety of ambient solar wind parameters, and the results of the model depend on the combination of these parameters. We have conducted research to determine the best combination of parameters to improve the performance of the IPS-driven ENLIL model. The model results for 1,440 combinations of input parameters are compared with Advanced Composition Explorer (ACE) observation data. In this way, the top 10 parameter sets showing the best performance were determined. Finally, the characteristics of these parameter sets were analyzed, and the application of the results to the IPS-driven ENLIL model is discussed.
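The selection procedure described, running every parameter combination and ranking against observations, can be sketched generically. The `run_model` callable below is a stand-in for the IPS-driven ENLIL execution (not reproducible here), and the RMSE criterion and grid contents are assumptions for illustration.

```python
import itertools
import numpy as np

# Run the model for every combination of ambient-solar-wind parameters,
# score each run against ACE observations, and keep the 10 best sets.
def rank_parameter_sets(param_grid, run_model, ace_speed):
    scored = []
    for combo in itertools.product(*param_grid.values()):
        params = dict(zip(param_grid.keys(), combo))
        model_speed = run_model(**params)          # simulated time series
        rmse = np.sqrt(np.mean((model_speed - ace_speed) ** 2))
        scored.append((rmse, params))
    return sorted(scored, key=lambda s: s[0])[:10]  # top 10 parameter sets

# Illustrative grid; 1,440 combinations would come from a larger product.
grid = {"fast_wind_kms": (650.0, 700.0, 750.0), "density_cm3": (3.0, 5.0)}
```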
Gerwin, Philip M; Norinsky, Rada M; Tolwani, Ravi J
2018-03-01
Laboratory animal programs and core laboratories often set service rates based on cost estimates. However, actual costs may be unknown, and service rates may not reflect the actual cost of services. Accurately evaluating the actual costs of services can be challenging and time-consuming. We used a time-driven activity-based costing (ABC) model to determine the cost of services provided by a resource laboratory at our institution. The time-driven approach is more efficient for calculating costs than a traditional ABC model. We calculated only 2 parameters: the time required to perform an activity and the unit cost of the activity based on employee cost. This method allowed us to rapidly and accurately calculate the actual cost of services provided, including microinjection of a DNA construct, microinjection of embryonic stem cells, embryo transfer, and in vitro fertilization. We successfully implemented a time-driven ABC model to evaluate the cost of these services and the capacity of labor used to deliver them. We determined how actual costs compared with current service rates. In addition, we determined that the labor supplied to conduct all services (10,645 min/wk) exceeded the practical labor capacity (8400 min/wk), indicating that the laboratory team was highly efficient and that additional labor capacity was needed to prevent overloading of the current team. Importantly, this time-driven ABC approach allowed us to establish a baseline model that can easily be updated to reflect operational changes or changes in labor costs. We demonstrated that a time-driven ABC model is a powerful management tool that can be applied to other core facilities as well as to entire animal programs, providing valuable information that can be used to set rates based on the actual cost of services and to improve operating efficiency.
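The two-parameter calculation is simple enough to show directly. The minute figures below are taken from the abstract; the dollar figures and service times are illustrative assumptions, not the laboratory's actual numbers.

```python
# Time-driven ABC: a capacity cost rate from labor cost and practical
# capacity, then each service's cost as time multiplied by the rate.
weekly_labor_cost = 12_000.0              # $/wk, illustrative
practical_capacity = 8_400.0              # min/wk, from the abstract
rate = weekly_labor_cost / practical_capacity          # $/min

service_minutes = {"embryo_transfer": 90, "ivf": 240}  # illustrative times
service_cost = {k: t * rate for k, t in service_minutes.items()}

supplied = 10_645.0                       # min/wk actually used (abstract)
overload = supplied - practical_capacity  # > 0: extra capacity is needed
```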
Petri net model for analysis of concurrently processed complex algorithms
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.
1986-01-01
This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data-driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data-driven architecture. A candidate architecture is also presented.
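The core Petri-net mechanics, places holding tokens and transitions firing when their inputs are marked, can be sketched in a few lines. The net below is a generic toy standing in for the data/control-flow graphs the paper models; the place and transition names are invented.

```python
# Minimal Petri net: places carry tokens; a transition is enabled when all
# its input places hold a token, and firing moves tokens from inputs to
# outputs, which is how data-driven execution is modeled.
marking = {"data_ready": 1, "processor_free": 1, "result": 0}
transitions = {
    "start_compute": {"in": ["data_ready", "processor_free"], "out": ["result"]},
}

def enabled(t):
    return all(marking[p] > 0 for p in transitions[t]["in"])

def fire(t):
    assert enabled(t), f"{t} is not enabled"
    for p in transitions[t]["in"]:
        marking[p] -= 1            # consume input tokens
    for p in transitions[t]["out"]:
        marking[p] += 1            # produce output tokens

fire("start_compute")              # marking becomes {'result': 1, ...}
```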
García-Sancho, Miguel
2012-03-01
This paper argues that the history of the computer, of the practice of computation and of the notions of 'data' and 'programme' are essential for a critical account of the emergence and implications of data-driven research. In order to show this, I focus on the transition that the investigations on the worm C. elegans experienced in the Laboratory of Molecular Biology of Cambridge (UK). Throughout the 1980s, this research programme evolved from a study of the genetic basis of the worm's development and behaviour to a DNA mapping and sequencing initiative. By examining the changing computing technologies which were used at the Laboratory, I demonstrate that by the time of this transition researchers shifted from modelling the worm's genetic programme on a mainframe apparatus to writing minicomputer programs aimed at providing map and sequence data which was then circulated to other groups working on the genetics of C. elegans. The shift in the worm research should thus not be explained simply by the application of computers, which transformed the project from a hypothesis-driven to a data-intensive endeavour. The key factor was rather a historically specific technology, in-house and easily programmable minicomputers, which redefined the way of achieving the project's long-standing goal, leading the genetic programme to co-evolve with the practices of data production and distribution. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Rutherford, Helena J. V.; Mayes, Linda C.; Fisher, Philip A.
2016-01-01
The use of theory-driven models to develop and evaluate family-based intervention programs has a long history in psychology. Some of the first evidence-based parenting programs to address child problem behavior, developed in the 1970s, were grounded in causal models derived from longitudinal developmental research. The same translational…
NASA Astrophysics Data System (ADS)
Weyer, K. U.
2016-12-01
Coastal groundwater flow investigations at the Cutler site of the Biscayne Bay south of Miami, Florida, gave rise to the dominant concept of density-driven flow of sea water into coastal aquifers, indicated as a saltwater wedge. Within that wedge, convection-type return flow of seawater and a dispersion zone were concluded by Cooper et al. (1964, USGS Water Supply Paper 1613-C) to be the cause of the Biscayne aquifer 'sea water wedge'. This conclusion was based merely on the chloride distribution within the aquifer and on an analytical model concept assuming convection flow within a confined aquifer, without taking non-chemical field data into consideration. This concept was later labelled the 'Henry Problem', which any numerical variable-density flow program has to be able to simulate to be considered acceptable. Revisiting the above summarizing publication with its record of piezometric field data (heads) showed that the so-called sea water wedge was actually caused by discharging deep saline groundwater driven by gravitational flow, and not by denser sea water. Density-driven flow of seawater into the aquifer was not reflected in the head measurements for low and high tide conditions, which had been taken contemporaneously with the chloride measurements. These head measurements had not been included in the flow interpretation. The very same head measurements indicated a clear dividing line between shallow local fresh groundwater flow and saline deep groundwater flow, without the existence of a dispersion zone or a convection cell. The Biscayne situation emphasizes the need for any chemical interpretation of flow patterns to be backed up by head data as energy indicators of flow fields. At the Biscayne site, density-driven flow of seawater did not and does not exist. Instead, this site and the Florida coastline in general are the end points of local fresh and regional saline groundwater flow systems driven by gravity forces and not by density differences.
A data driven control method for structure vibration suppression
NASA Astrophysics Data System (ADS)
Xie, Yangmin; Wang, Chao; Shi, Hang; Shi, Junwei
2018-02-01
High radio-frequency space applications have motivated continuous research on vibration suppression of large space structures in both academia and industry. This paper introduces a novel data-driven control method to suppress vibrations of flexible structures and experimentally validates the suppression performance. Unlike model-based control approaches, the data-driven control method designs a controller directly from the input-output test data of the structure, without requiring parametric dynamics, and is hence free of system modeling. It utilizes the discrete frequency response obtained via spectral analysis and formulates a non-convex optimization problem to obtain optimized controller parameters with a predefined controller structure. This approach is then experimentally applied to an end-driving flexible beam-mass structure. The experiment results show that the presented method can achieve competitive disturbance rejection compared to a model-based mixed sensitivity controller under the same design criterion, but with much lower controller order and less design effort, demonstrating that the proposed data-driven control is an effective approach for vibration suppression of flexible structures.
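A rough sketch of this pipeline, estimate the frequency response from input-output data by spectral analysis, then tune fixed-structure gains against a frequency-domain criterion, is shown below. The simulated plant, the PD controller structure, and the worst-case sensitivity cost are illustrative assumptions, not the paper's controller or criterion.

```python
import numpy as np
from scipy import signal, optimize

fs = 1000.0
rng = np.random.default_rng(2)
u = rng.normal(size=20_000)                               # excitation input
plant = signal.dlti([0.05], [1.0, -1.9, 0.905], dt=1/fs)  # stand-in flexible mode
_, y = signal.dlsim(plant, u)
y = y.ravel() + 0.01 * rng.normal(size=u.size)            # measured output

f, Puy = signal.csd(u, y, fs=fs, nperseg=1024)            # cross-spectrum
_, Puu = signal.welch(u, fs=fs, nperseg=1024)             # input auto-spectrum
G = Puy / Puu                                             # empirical FRF

def cost(k):                                              # PD-like controller
    kp, kd = k
    C = kp + kd * 1j * 2.0 * np.pi * f
    S = 1.0 / (1.0 + C * G)                               # sensitivity function
    return np.max(np.abs(S))                              # worst-case criterion

k_opt = optimize.minimize(cost, x0=[1.0, 0.001], method="Nelder-Mead").x
```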
Lower extremity EMG-driven modeling of walking with automated adjustment of musculoskeletal geometry
Meyer, Andrew J.; Patten, Carolynn
2017-01-01
Neuromusculoskeletal disorders affecting walking ability are often difficult to manage, in part due to limited understanding of how a patient’s lower extremity muscle excitations contribute to the patient’s lower extremity joint moments. To assist in the study of these disorders, researchers have developed electromyography (EMG) driven neuromusculoskeletal models utilizing scaled generic musculoskeletal geometry. While these models can predict individual muscle contributions to lower extremity joint moments during walking, the accuracy of the predictions can be hindered by errors in the scaled geometry. This study presents a novel EMG-driven modeling method that automatically adjusts surrogate representations of the patient’s musculoskeletal geometry to improve prediction of lower extremity joint moments during walking. In addition to commonly adjusted neuromusculoskeletal model parameters, the proposed method adjusts model parameters defining muscle-tendon lengths, velocities, and moment arms. We evaluated our EMG-driven modeling method using data collected from a high-functioning hemiparetic subject walking on an instrumented treadmill at speeds ranging from 0.4 to 0.8 m/s. EMG-driven model parameter values were calibrated to match inverse dynamic moments for five degrees of freedom in each leg while keeping musculoskeletal geometry close to that of an initial scaled musculoskeletal model. We found that our EMG-driven modeling method incorporating automated adjustment of musculoskeletal geometry predicted net joint moments during walking more accurately than did the same method without geometric adjustments. Geometric adjustments improved moment prediction errors by 25% on average and up to 52%, with the largest improvements occurring at the hip. Predicted adjustments to musculoskeletal geometry were comparable to errors reported in the literature between scaled generic geometric models and measurements made from imaging data. Our results demonstrate that with appropriate experimental data, joint moment predictions for walking generated by an EMG-driven model can be improved significantly when automated adjustment of musculoskeletal geometry is included in the model calibration process. PMID:28700708
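A highly simplified sketch of the EMG-driven idea is given below: each muscle's contribution to the joint moment is its processed EMG times a calibrated gain (lumping strength and moment arm into one number), and calibration matches inverse-dynamics moments. Real EMG-driven models, including the one above, separate activation dynamics, Hill-type force properties, and the musculoskeletal geometry being adjusted; everything here is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
emg = rng.random((200, 3))                      # processed EMG, 3 muscles
true_gains = np.array([75.0, 36.0, -18.0])      # N·m per unit EMG, illustrative
id_moment = emg @ true_gains + rng.normal(0.0, 1.0, 200)  # inverse dynamics

# Calibrate per-muscle gains so predicted moments match inverse dynamics.
fit = least_squares(lambda g: emg @ g - id_moment, x0=np.zeros(3))
print(fit.x)   # recovers approximately the true per-muscle gains
```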
Activation of the PD-1 pathway contributes to immune escape in EGFR-driven lung tumors
Akbay, Esra A; Koyama, Shohei; Carretero, Julian; Altabef, Abigail; Tchaicha, Jeremy H; Christensen, Camilla L; Mikse, Oliver R; Cherniack, Andrew D; Beauchamp, Ellen M; Pugh, Trevor J; Wilkerson, Matthew D; Fecci, Peter E; Butaney, Mohit; Reibel, Jacob B; Soucheray, Margaret; Cohoon, Travis J; Janne, Pasi A; Meyerson, Matthew; Hayes, D. Neil; Shapiro, Geoffrey I; Shimamura, Takeshi; Sholl, Lynette M; Rodig, Scott J; Freeman, Gordon J; Hammerman, Peter S; Dranoff, Glenn; Wong, Kwok-Kin
2013-01-01
The success in lung cancer therapy with Programmed Death (PD)-1 blockade suggests that immune escape mechanisms contribute to lung tumor pathogenesis. We identified a correlation between Epidermal Growth Factor Receptor (EGFR) pathway activation and a signature of immunosuppression manifested by upregulation of PD-1, PD-L1, cytotoxic T lymphocyte antigen-4 (CTLA-4), and multiple tumor-promoting inflammatory cytokines. We observed decreased cytotoxic T cells and increased markers of T cell exhaustion in mouse models of EGFR-driven lung cancer. PD-1 antibody blockade improved the survival of mice with EGFR-driven adenocarcinomas by enhancing effector T cell function and lowering the levels of tumor-promoting cytokines. Expression of mutant EGFR in bronchial epithelial cells induced PD-L1, and PD-L1 expression was reduced by EGFR inhibitors in non-small cell lung cancer cell lines with activated EGFR. These data suggest that oncogenic EGFR signaling remodels the tumor microenvironment to trigger immune escape, and mechanistically link treatment response to PD-1 inhibition. PMID:24078774
ERIC Educational Resources Information Center
Senger, Karen
2012-01-01
Purpose: The purposes of this study were to investigate and describe how elementary teachers in exited Program Improvement-Safe Harbor schools acquire student achievement data through assessments, the strategies and reflections utilized to make sense of the data to improve student achievement, ensure curriculum and instructional goals are aligned,…
Testing the Accuracy of Data-driven MHD Simulations of Active Region Evolution and Eruption
NASA Astrophysics Data System (ADS)
Leake, J. E.; Linton, M.; Schuck, P. W.
2017-12-01
Models for the evolution of the solar coronal magnetic field are vital for understanding solar activity, yet the best measurements of the magnetic field lie at the photosphere, necessitating the recent development of coronal models which are "data-driven" at the photosphere. Using magnetohydrodynamic simulations of active region formation and our recently created validation framework, we investigate the source of errors in data-driven models that use surface measurements of the magnetic field, and derived MHD quantities, to model the coronal magnetic field. The primary sources of errors in these studies are the temporal and spatial resolution of the surface measurements. We will discuss the implications of these studies for accurately modeling the build-up and release of coronal magnetic energy based on photospheric magnetic field observations.
Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay
2017-11-01
Data-driven fault detection plays an important role in industrial systems due to its applicability in case of unknown physical models. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs a JITL scheme for process description, with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault-detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
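The just-in-time learning scheme can be illustrated with a small sketch: for each query, fit a local model on the nearest historical samples and flag a fault when the residual is large. The local affine model, neighbourhood size, and injected fault below are illustrative choices, not the paper's exact design.

```python
import numpy as np

def jitl_residual(X_hist, y_hist, x_query, y_query, k=30):
    """JITL sketch: local affine model on the k nearest neighbours."""
    d = np.linalg.norm(X_hist - x_query, axis=1)
    idx = np.argsort(d)[:k]                        # k nearest neighbours
    A = np.hstack([X_hist[idx], np.ones((k, 1))])  # local affine regressors
    theta, *_ = np.linalg.lstsq(A, y_hist[idx], rcond=None)
    y_pred = np.append(x_query, 1.0) @ theta
    return abs(y_query - y_pred)                   # compare to a threshold

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 2))
y = X @ np.array([1.0, -2.0]) + 0.01 * rng.normal(size=500)
r = jitl_residual(X, y, X[10], y[10] + 5.0)  # injected fault: large residual
```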
A system-level model for the microbial regulatory genome.
Brooks, Aaron N; Reiss, David J; Allard, Antoine; Wu, Wei-Ju; Salvanha, Diego M; Plaisier, Christopher L; Chandrasekaran, Sriram; Pan, Min; Kaur, Amardeep; Baliga, Nitin S
2014-07-15
Microbes can tailor transcriptional responses to diverse environmental challenges despite having streamlined genomes and a limited number of regulators. Here, we present data-driven models that capture the dynamic interplay of the environment and genome-encoded regulatory programs of two types of prokaryotes: Escherichia coli (a bacterium) and Halobacterium salinarum (an archaeon). The models reveal how the genome-wide distributions of cis-acting gene regulatory elements and the conditional influences of transcription factors at each of those elements encode programs for eliciting a wide array of environment-specific responses. We demonstrate how these programs partition transcriptional regulation of genes within regulons and operons to re-organize gene-gene functional associations in each environment. The models capture fitness-relevant co-regulation by different transcriptional control mechanisms acting across the entire genome, to define a generalized, system-level organizing principle for prokaryotic gene regulatory networks that goes well beyond existing paradigms of gene regulation. An online resource (http://egrin2.systemsbiology.net) has been developed to facilitate multiscale exploration of conditional gene regulation in the two prokaryotes. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.
Data-Driven Learning of Q-Matrix
ERIC Educational Resources Information Center
Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2012-01-01
The recent surge of interests in cognitive assessment has led to developments of novel statistical models for diagnostic classification. Central to many such models is the well-known "Q"-matrix, which specifies the item-attribute relationships. This article proposes a data-driven approach to identification of the "Q"-matrix and estimation of…
NASA Astrophysics Data System (ADS)
Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.
2018-05-01
Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used to account for the flushing of the different floodplain storages. The resulting hybrid model performs very well on approximately 3 years of daily validation data, with a Nash-Sutcliffe efficiency (NSE) of 0.89 and a root mean squared error (RMSE) of 12.62 mg L-1 (over a range from approximately 50 to 250 mg L-1). Each component of the hybrid model results in noticeable improvements in model performance corresponding to the range of flows for which they are developed. The predictive performance of the hybrid model is significantly better than that of a benchmark process-driven model (NSE = -0.14, RMSE = 41.10 mg L-1, Gbench index = 0.90) and slightly better than that of a benchmark data-driven (ANN) model (NSE = 0.83, RMSE = 15.93 mg L-1, Gbench index = 0.36). Apart from improved predictive performance, the hybrid model also has advantages over the ANN benchmark model in terms of increased capacity for improving system understanding and greater ability to support management decisions.
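Structurally, the hybrid framework amounts to composing the most suitable sub-model for each sub-process and summing their contributions. The sketch below shows only that composition; the three callables are stand-ins for the paper's process-driven in-stream transport model, ANN groundwater-accession model, and floodplain-flushing regressions, and the interface is an assumption.

```python
# Structural sketch of a hybrid process/data-driven salinity model.
class HybridSalinityModel:
    def __init__(self, transport_model, groundwater_ann, flushing_regressions):
        self.transport = transport_model          # process-driven sub-model
        self.groundwater = groundwater_ann        # data-driven (ANN) sub-model
        self.flushing = flushing_regressions      # per-storage regressions

    def predict(self, flow, level, upstream_salinity):
        salinity = self.transport(flow, upstream_salinity)   # in-stream transport
        salinity += self.groundwater(flow, level)            # saline accession
        for regression in self.flushing:          # active during overbank flows
            salinity += regression(flow)
        return salinity
```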
Data-driven Modeling of Metal-oxide Sensors with Dynamic Bayesian Networks
NASA Astrophysics Data System (ADS)
Gosangi, Rakesh; Gutierrez-Osuna, Ricardo
2011-09-01
We present a data-driven probabilistic framework to model the transient response of MOX sensors modulated with a sequence of voltage steps. Analytical models of MOX sensors are usually built based on the physico-chemical properties of the sensing materials. Although building these models provides an insight into the sensor behavior, they also require a thorough understanding of the underlying operating principles. Here we propose a data-driven approach to characterize the dynamical relationship between sensor inputs and outputs. Namely, we use dynamic Bayesian networks (DBNs), probabilistic models that represent temporal relations between a set of random variables. We identify a set of control variables that influence the sensor responses, create a graphical representation that captures the causal relations between these variables, and finally train the model with experimental data. We validated the approach on experimental data in terms of predictive accuracy and classification performance. Our results show that DBNs can accurately predict the dynamic response of MOX sensors, as well as capture the discriminatory information present in the sensor transients.
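At its simplest, the modeled dependence is a conditional table P(R_t | R_{t-1}, V_t) propagated forward in time. The hand-built sketch below shows that one-step propagation with a coarse two-state sensor; the CPT values and state names are illustrative, whereas the paper learns such structures from experimental data.

```python
# Discrete DBN sketch for a MOX sensor: resistance state depends on its
# previous state and the current heater-voltage step, P(R_t | R_{t-1}, V_t).
cpt = {  # (prev_state, voltage_step) -> P(state = "high")
    ("low", "step_up"): 0.7, ("low", "step_down"): 0.1,
    ("high", "step_up"): 0.9, ("high", "step_down"): 0.3,
}

def predict_state_dist(prev_dist, voltage_step):
    """Propagate P(R_{t-1}) one step through the transition CPT."""
    p_high = sum(p * cpt[(s, voltage_step)] for s, p in prev_dist.items())
    return {"high": p_high, "low": 1.0 - p_high}

dist = {"low": 1.0, "high": 0.0}
for step in ["step_up", "step_up", "step_down"]:   # voltage modulation sequence
    dist = predict_state_dist(dist, step)          # predicted transient response
```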
Replacement model of city bus: A dynamic programming approach
NASA Astrophysics Data System (ADS)
Arifin, Dadang; Yusuf, Edhi
2017-06-01
This paper aims to develop a replacement model for city bus vehicles operated in Bandung City. This study is driven by real cases encountered by the Damri Company in its efforts to improve services to the public. The replacement model proposes two policy alternatives: first, to maintain or keep the vehicles; second, to replace them with new ones, taking into account operating costs, revenue, salvage value, and the acquisition cost of a new vehicle. A deterministic dynamic programming approach is used to solve the model. The optimization process was heuristically executed using empirical data from Perum Damri. The output of the model is the replacement schedule and the best policy once a vehicle has passed its economic life. Based on the results, the technical life of the bus is approximately 20 years, while the economic life is on average 9 (nine) years. This means that after a bus has been operated for 9 (nine) years, managers should consider a rejuvenation policy.
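The keep-or-replace recursion behind such a model can be sketched as a deterministic DP over the planning horizon. The cost functions and numbers below are illustrative placeholders; the paper calibrates operating cost, revenue, salvage value, and acquisition cost from Perum Damri data.

```python
from functools import lru_cache

# Deterministic replacement DP: each year, either keep the bus (pay an
# age-dependent net operating cost) or replace it (acquisition minus salvage).
ACQ = 900.0                                      # new-bus acquisition cost
def net_cost(age): return 40.0 + 12.0 * age      # operating cost minus revenue
def salvage(age):  return max(0.0, 300.0 - 25.0 * age)
HORIZON = 20

@lru_cache(maxsize=None)
def best(year, age):
    if year == HORIZON:
        return -salvage(age), []                 # recover salvage at the end
    keep_cost, keep_plan = best(year + 1, age + 1)
    keep = (net_cost(age) + keep_cost, ["keep"] + keep_plan)
    rep_cost, rep_plan = best(year + 1, 1)
    replace = (ACQ - salvage(age) + net_cost(0) + rep_cost,
               ["replace"] + rep_plan)
    return min(keep, replace, key=lambda c: c[0])

total_cost, policy = best(0, 3)                  # start with a 3-year-old bus
```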
The Spring of Systems Biology-Driven Breeding.
Lavarenne, Jérémy; Guyomarc'h, Soazig; Sallaud, Christophe; Gantet, Pascal; Lucas, Mikaël
2018-05-12
Genetics and molecular biology have contributed to the development of rationalized plant breeding programs. Recent developments in both high-throughput experimental analyses of biological systems and in silico data processing offer the possibility to address the whole gene regulatory network (GRN) controlling a given trait. GRN models can be applied to identify topological features helping to shortlist potential candidate genes for breeding purposes. Time-series data sets can be used to support dynamic modelling of the network. This will enable a deeper comprehension of network behaviour and the identification of the few elements to be genetically rewired to push the system towards a modified phenotype of interest. This paves the way to design more efficient, systems biology-based breeding strategies. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hancher, M.; Lieber, A.; Scott, L.
2017-12-01
The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.
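For flavour, a hedged sketch of this kind of in-platform analysis with the Earth Engine Python API follows; the dataset IDs and band names are assumptions to be verified against the Earth Engine Data Catalog, and a fresh environment may need ee.Authenticate() first:

```python
import ee

ee.Initialize()   # may require a prior ee.Authenticate() in a fresh environment

# Dataset IDs below are assumptions; verify them in the Earth Engine Data Catalog.
pop = ee.ImageCollection("CIESIN/GPWv411/GPW_Population_Count").first()
lights = (ee.ImageCollection("NOAA/VIIRS/DNB/MONTHLY_V1/VCMCFG")
          .filterDate("2017-01-01", "2018-01-01")
          .mean()
          .select("avg_rad"))

# Example area of interest (a rectangle around Nairobi; purely illustrative).
region = ee.Geometry.Rectangle([36.6, -1.5, 37.2, -1.1])

# Join population and nighttime-lights statistics over the region server-side.
stats = lights.addBands(pop).reduceRegion(
    reducer=ee.Reducer.mean(), geometry=region, scale=1000, maxPixels=1e9)
print(stats.getInfo())
```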
Numerical model for learning concepts of streamflow simulation
DeLong, L.L.; ,
1993-01-01
Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional stream-flow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.
A Framework to Survey the Energy Efficiency of Installed Motor Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Prakash; Hasanbeigi, Ali; McKane, Aimee
2013-08-01
While motors are ubiquitous throughout the globe, there is insufficient data to properly assess their level of energy efficiency across regional boundaries. Furthermore, many of the existing data sets focus on motor efficiency and neglect the connected drive and system. Without a comprehensive survey of the installed motor system base, a baseline energy efficiency of a country or region’s motor systems cannot be developed. The lack of data impedes government agencies, utilities, manufacturers, distributors, and energy managers when identifying where to invest resources to capture potential energy savings, creating programs aimed at reducing electrical energy consumption, or quantifying the impacts of such programs. This paper will outline a data collection framework for use when conducting a survey under a variety of execution models to characterize motor system energy efficiency within a country or region. The framework is intended to standardize the data collected, ensuring consistency across independently conducted surveys. Consistency allows the surveys to be leveraged against each other, enabling comparisons to motor system energy efficiencies from other regions. In creating the framework, an analysis of various motor driven systems, including compressed air, pumping, and fan systems, was conducted and relevant parameters characterizing the efficiency of these systems were identified. A database using the framework will enable policymakers and industry to better assess the improvement potential of their installed motor system base, particularly with respect to other regions, assisting in efforts to promote improvements to the energy efficiency of motor driven systems.
Data Driven Model Development for the Supersonic Semispan Transport (S4T)
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2011-01-01
We investigate two common approaches to model development for robust control synthesis in the aerospace community: namely, reduced order aeroservoelastic modelling based on structural finite-element and computational fluid dynamics based aerodynamic models, and a data-driven system identification procedure. It is shown, via analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data using a system identification approach, that it is possible to estimate a model at a fixed Mach number which is parsimonious and robust across varying dynamic pressures.
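As a generic illustration of the data-driven route (not the S4T procedure itself), a low-order ARX model can be estimated from input-output records by least squares:

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Least-squares fit of y[t] = -a1*y[t-1] - ... + b1*u[t-1] + ... (ARX)."""
    n = max(na, nb)
    rows = [np.concatenate([-y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(n, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[n:], rcond=None)
    return theta[:na], theta[na:]          # (a1..a_na), (b1..b_nb)

# Synthetic stable 2nd-order system standing in for wind-tunnel records.
rng = np.random.default_rng(1)
u = rng.standard_normal(500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 1.5 * y[t-1] - 0.7 * y[t-2] + 0.5 * u[t-1] + 0.1 * rng.standard_normal()

a, b = fit_arx(u, y)
print("a:", np.round(a, 3), "b:", np.round(b, 3))  # expect a ~ [-1.5, 0.7], b ~ [0.5, 0]
```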
Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation
Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee
2018-01-01
This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact, as well as the state of the tool deformation. The input-output relationship of the model was represented by a radial basis functions network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of recording and rendering devices and can be easily deployed to an existing simulator. The model also supports complex interactions, such as self and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the perceptual soundness of the proposed model, while the user study revealed the force feedback of the proposed simulator to be realistic. PMID:29342964
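The core regression step can be sketched with SciPy's RBFInterpolator, assuming a six-dimensional contact/deformation state mapped to a three-component force; the data below are synthetic stand-ins for recorded tool-surface contacts:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(500, 6))            # 6-D input: contact + tool state
F = np.column_stack([                            # toy 3-D force response
    np.exp(-np.sum(X[:, :3] ** 2, axis=1)),
    X[:, 3] * X[:, 4],
    np.tanh(X[:, 5]),
])

# Fit an RBF network and evaluate it at a new contact state.
model = RBFInterpolator(X, F, kernel="thin_plate_spline", smoothing=1e-3)
query = rng.uniform(-1, 1, size=(1, 6))
print("predicted force:", model(query))
```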
Odyssey® Math. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2017
2017-01-01
"Odyssey® Math" is a web-based program developed by Compass Learning® for mathematics instruction in grades K-8. The online program includes a mathematics curriculum and formative assessments designed to support differentiated and data-driven instruction. Based on assessment results, the program generates an individualized sequence of…
NASA Technical Reports Server (NTRS)
Mauldin, Lemuel E., III
1993-01-01
Travel Forecaster is menu-driven, easy-to-use computer program that plans, forecasts cost, and tracks actual vs. planned cost of business-related travel of division or branch of organization and compiles information into data base to aid travel planner. Ability of program to handle multiple trip entries makes it valuable time-saving device.
Bridge-scour analysis using the water surface profile (WSPRO) model
Mueller, David S.; ,
1993-01-01
A program was developed to extract the hydraulic information required for bridge-scour computations from the Water-Surface Profile computation model (WSPRO). The program is written in compiled BASIC and is menu driven. Using only ground points, the program can compute the average ground elevation, the cross-sectional area below a specified datum, or create a Drawing Exchange Format (DXF) file of the cross section. Using both ground points and hydraulic information from the equal-conveyance tubes computed by WSPRO, the program can compute hydraulic parameters at a user-specified station or in a user-specified subsection of the cross section. The program can identify the maximum velocity in a cross section and the velocity and depth at a user-specified station, as well as the average velocity, average depth, average ground elevation, width perpendicular to the flow, cross-sectional area of flow, and discharge in a subsection of the cross section. The program does not provide help or suggestions as to what data should be extracted; therefore, the user must understand the scour equations and associated variables to be able to extract the proper information from the WSPRO output.
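One of the reported quantities, the cross-sectional area below a specified datum, illustrates the kind of computation involved; the sketch below uses trapezoidal integration over surveyed (station, elevation) pairs and, as a simplification, does not interpolate where the ground pierces the datum between survey points:

```python
import numpy as np

def area_below_datum(station, elevation, datum):
    """Trapezoidal area of the channel cross section below `datum`.
    Simplification: bank crossings between survey points are not interpolated."""
    x = np.asarray(station, float)
    depth = np.clip(datum - np.asarray(elevation, float), 0.0, None)
    return float(np.sum(0.5 * (depth[1:] + depth[:-1]) * np.diff(x)))

station   = [0, 5, 10, 15, 20, 25, 30]   # ft across the channel (example survey)
elevation = [12, 9, 6, 5, 6, 10, 13]     # ft, surveyed ground points
print(area_below_datum(station, elevation, datum=11.0), "sq ft")
```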
Combat Wound Initiative program.
Stojadinovic, Alexander; Elster, Eric; Potter, Benjamin K; Davis, Thomas A; Tadaki, Doug K; Brown, Trevor S; Ahlers, Stephen; Attinger, Christopher E; Andersen, Romney C; Burris, David; Centeno, Jose; Champion, Hunter; Crumbley, David R; Denobile, John; Duga, Michael; Dunne, James R; Eberhardt, John; Ennis, William J; Forsberg, Jonathan A; Hawksworth, Jason; Helling, Thomas S; Lazarus, Gerald S; Milner, Stephen M; Mullick, Florabel G; Owner, Christopher R; Pasquina, Paul F; Patel, Chirag R; Peoples, George E; Nissan, Aviram; Ring, Michael; Sandberg, Glenn D; Schaden, Wolfgang; Schultz, Gregory S; Scofield, Tom; Shawen, Scott B; Sheppard, Forest R; Stannard, James P; Weina, Peter J; Zenilman, Jonathan M
2010-07-01
The Combat Wound Initiative (CWI) program is a collaborative, multidisciplinary, and interservice public-private partnership that provides personalized, state-of-the-art, and complex wound care via targeted clinical and translational research. The CWI uses a bench-to-bedside approach to translational research, including the rapid development of a human extracorporeal shock wave therapy (ESWT) study in complex wounds after establishing the potential efficacy, biologic mechanisms, and safety of this treatment modality in a murine model. Additional clinical trials include the prospective use of clinical data, serum and wound biomarkers, and wound gene expression profiles to predict wound healing/failure and additional clinical patient outcomes following combat-related trauma. These clinical research data are analyzed using machine-based learning algorithms to develop predictive treatment models to guide clinical decision-making. Future CWI directions include additional clinical trials and study centers and the refinement and deployment of our genetically driven, personalized medicine initiative to provide patient-specific care across multiple medical disciplines, with an emphasis on combat casualty care.
ERIC Educational Resources Information Center
Moore, John W.
1986-01-01
Describes: (1) spreadsheet programs (including VisiCalc) for experiments; (2) event-driven data acquisition (using ADALAB with an Acculab Infrared Spectrometer); (3) microcomputer-controlled cyclic voltammetry; (4) inexpensive computerized experiments; (5) the "KC? Discoverer" program; and (6) MOLDOT (space-filling perspective diagrams of…
NASA Astrophysics Data System (ADS)
Tinoco, R. O.; Goldstein, E. B.; Coco, G.
2016-12-01
We use a machine learning approach to seek accurate, physically sound predictors, to estimate two relevant flow parameters for open-channel vegetated flows: mean velocities and drag coefficients. A genetic programming algorithm is used to find a robust relationship between properties of the vegetation and flow parameters. We use data published from several laboratory experiments covering a broad range of conditions to obtain: a) in the case of mean flow, an equation that matches the accuracy of other predictors from recent literature while showing a less complex structure, and b) for drag coefficients, a predictor that relies on both single element and array parameters. We investigate different criteria for dataset size and data selection to evaluate their impact on the resulting predictor, as well as simple strategies to obtain only dimensionally consistent equations, and avoid the need for dimensional coefficients. The results show that a proper methodology can deliver physically sound models representative of the processes involved, such that genetic programming and machine learning techniques can be used as powerful tools to study complicated phenomena and develop not only purely empirical, but "hybrid" models, coupling results from machine learning methodologies into physics-based models.
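A toy version of this workflow (not the published predictors) can be set up with the gplearn library, assuming it is installed; a parsimony penalty discourages overly complex expressions:

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor   # assumes gplearn is installed

# Hypothetical features and target; the real studies use published
# laboratory data on vegetation properties and flow conditions.
rng = np.random.default_rng(3)
X = rng.uniform(0.1, 2.0, size=(300, 2))        # e.g. stem density, a Reynolds-like number
y = 1.2 + 0.8 / X[:, 1] + 0.3 * X[:, 0]         # toy drag-coefficient relation

gp = SymbolicRegressor(population_size=1000, generations=20,
                       function_set=("add", "sub", "mul", "div"),
                       parsimony_coefficient=0.01,   # penalize complex expressions
                       random_state=0)
gp.fit(X, y)
print(gp._program)                               # the best evolved expression
```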
2008-11-13
…consumption patterns, and production status. The current version of the AAVS DataMart contains apparel and textile data…which stores the summary of the activity by item; Daily Issues, which contains all the issues for the day; Daily Receipts, which contains all receipts…entered for the day; and Open Requisitions, which contains all open DSCP Requisitions and Local Purchase Orders. Supply and financial transactions are…
High performance cellular level agent-based simulation with FLAME for the GPU.
Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela
2010-05-01
Driven by the availability of experimental data and ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template driven framework for agent-based modelling (ABM) on parallel architectures ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.
A framework for the automated data-driven constitutive characterization of composites
J.G. Michopoulos; John Hermanson; T. Furukawa; A. Iliopoulos
2010-01-01
We present advances on the development of a mechatronically and algorithmically automated framework for the data-driven identification of constitutive material models based on energy density considerations. These models can capture both the linear and nonlinear constitutive response of multiaxially loaded composite materials in a manner that accounts for progressive...
Smart Meter Driven Segmentation: What Your Consumption Says About You
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albert, A; Rajagopal, R
With the rollout of smart metering infrastructure at scale, demand-response (DR) programs may now be tailored based on users' consumption patterns as mined from sensed data. For issuing DR events it is key to understand the inter-temporal consumption dynamics so as to appropriately segment the user population. We propose to infer occupancy states from consumption time series data using a hidden Markov model framework. Occupancy is characterized in this model by 1) magnitude, 2) duration, and 3) variability. We show that users may be grouped according to their consumption patterns into groups that exhibit qualitatively different dynamics that may be exploited for program enrollment purposes. We investigate empirically the information that residential energy consumers' temporal energy demand patterns characterized by these three dimensions may convey about their demographic, household, and appliance stock characteristics. Our analysis shows that temporal patterns in the user's consumption data can predict with good accuracy certain user characteristics. We use this framework to argue that there is a large degree of individual predictability in user consumption at a population level.
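The inference step can be sketched with the hmmlearn library (assumed installed): fit a Gaussian HMM to a consumption series and read off state magnitudes and persistence; synthetic data stand in for smart-meter readings:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM            # assumes hmmlearn is installed

# Synthetic consumption series: blocks of hours drawn from three regimes.
rng = np.random.default_rng(4)
true_states = np.repeat(rng.integers(0, 3, 60), 24)
means, scales = np.array([0.3, 1.0, 2.5]), np.array([0.05, 0.2, 0.5])
kwh = rng.normal(means[true_states], scales[true_states]).clip(0).reshape(-1, 1)

model = GaussianHMM(n_components=3, covariance_type="diag",
                    n_iter=200, random_state=0)
model.fit(kwh)
states = model.predict(kwh)                     # inferred occupancy-like states
print("state means (magnitude):", model.means_.ravel().round(2))
print("state persistence (duration proxy):", np.diag(model.transmat_).round(3))
```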
Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander
2010-01-01
This chapter will provide a thorough end-to-end description of the process for evaluation of three different data-driven algorithms for anomaly detection to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three different types being deployed. The other two types of algorithms being deployed include a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in the associated literature. Although there are three different categories of algorithms that have been selected for deployment, our main focus in this chapter will be on the evaluation of the three candidates for data-driven anomaly detection. These algorithms will be evaluated upon their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch Space Shuttle operations, rather than based on heritage as performed in previous studies. Robust detection will allow for the achievement of pre-specified minimum false alarm and/or missed detection rates in the selection of alert thresholds. All algorithms will also be optimized with respect to an aggregation of these same criteria. Our study relies upon the use of Shuttle data to act as a proxy for, and in preparation for application to, Ares I-X data, which uses a very similar hardware platform for the subsystems that are being targeted (the TVC - Thrust Vector Control - subsystem of the SRB (Solid Rocket Booster)).
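The threshold-selection idea is simple to illustrate: given anomaly scores computed on nominal (fault-free) data, pick the alert threshold as the quantile matching a pre-specified maximum false-alarm rate. A sketch with synthetic score distributions:

```python
import numpy as np

def alert_threshold(nominal_scores, max_false_alarm_rate=0.01):
    """Threshold as a high quantile of scores on fault-free data."""
    return float(np.quantile(nominal_scores, 1.0 - max_false_alarm_rate))

rng = np.random.default_rng(5)
nominal = rng.gamma(2.0, 1.0, 10_000)    # anomaly scores, fault-free operations
faulty  = rng.gamma(6.0, 1.0, 1_000)     # anomaly scores during seeded faults

thr = alert_threshold(nominal, 0.01)
print(f"threshold={thr:.2f}  "
      f"false alarm rate={np.mean(nominal > thr):.3f}  "
      f"missed detection rate={np.mean(faulty <= thr):.3f}")
```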
A data acquisition and control program for axial-torsional fatigue testing
NASA Technical Reports Server (NTRS)
Kalluri, Sreeramesh; Bonacuse, Peter J.
1989-01-01
A computer program was developed for data acquisition and control of axial-torsional fatigue experiments. The multitasked, interrupt-driven program was written in Pascal and Assembly. This program is capable of dual-channel control and six-channel data acquisition. It can be utilized to perform inphase and out-of-phase axial-torsional isothermal fatigue or deformation experiments. The program was successfully used to conduct inphase axial-torsional fatigue experiments on 304 stainless steel at room temperature and on Hastelloy X at 800 C. The details of the software and some of the results generated to date are presented.
Installation effects on performance of multiple model V/STOL lift fans
NASA Technical Reports Server (NTRS)
Diedrich, J. H.; Clough, N.; Lieblein, S.
1972-01-01
An experimental program was performed in which the individual performance of multiple VTOL model lift fans was measured. The model tested consisted of three 5.5 in. diameter tip-turbine driven model VTOL lift fans mounted chordwise in a two-dimensional wing to simulate a pod-type array. The performance data provided significant insight into possible thrust variations and losses caused by the presence of cover doors, adjacent fuselage panels, and adjacent fans. The effect of a partial loss of drive air supply (simulated gas generator failure) on fan performance was also investigated. The results of the tests demonstrated that lift fan installation variables and hardware can have a significant effect on the thrust of the individual fans.
Autonomous Soil Assessment System: A Data-Driven Approach to Planetary Mobility Hazard Detection
NASA Astrophysics Data System (ADS)
Raimalwala, K.; Faragalli, M.; Reid, E.
2018-04-01
The Autonomous Soil Assessment System predicts mobility hazards for rovers. Its development and performance are presented, with focus on its data-driven models, machine learning algorithms, and real-time sensor data fusion for predictive analytics.
Devil is in the details: Using logic models to investigate program process.
Peyton, David J; Scicchitano, Michael
2017-12-01
Theory-based logic models are commonly developed as part of requirements for grant funding. As a tool to communicate complex social programs, theory based logic models are an effective visual communication. However, after initial development, theory based logic models are often abandoned and remain in their initial form despite changes in the program process. This paper examines the potential benefits of committing time and resources to revising the initial theory driven logic model and developing detailed logic models that describe key activities to accurately reflect the program and assist in effective program management. The authors use a funded special education teacher preparation program to exemplify the utility of drill down logic models. The paper concludes with lessons learned from the iterative revision process and suggests how the process can lead to more flexible and calibrated program management. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.
2016-04-01
Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damages of an instrumented structure without necessitating the mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance using Internet technology and resources. The main challenges in developing such a framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as to classify the data to trace anomalies in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
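A condensed sketch of this pipeline - window-level feature extraction followed by supervised classification of healthy versus damaged responses - is shown below on synthetic signals; the feature set and classifier are illustrative choices, not those of the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def features(window):
    """Simple damage-sensitive features for one response window."""
    spectrum = np.abs(np.fft.rfft(window))
    return [window.std(), np.abs(window).max(), float(np.argmax(spectrum))]

# Synthetic accelerometer-like windows: stiffness loss shifts the dominant frequency.
rng = np.random.default_rng(6)
X, y = [], []
for label, freq in [(0, 1.00), (1, 0.85)]:      # 0 = healthy, 1 = damaged
    for _ in range(200):
        t = np.linspace(0, 10, 512)
        sig = np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(t.size)
        X.append(features(sig)); y.append(label)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```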
Speed management program plan.
DOT National Transportation Integrated Search
2014-05-01
Changing public attitudes regarding speeding and speed management will require a comprehensive and concerted effort, involving a wide variety of strategies. This plan identifies six primary focus areas: : A. Data and Data-Driven Approaches, : B. Rese...
Power-Law Modeling of Cancer Cell Fates Driven by Signaling Data to Reveal Drug Effects
Zhang, Fan; Wu, Min; Kwoh, Chee Keong; Zheng, Jie
2016-01-01
Extracellular signals are captured and transmitted by signaling proteins inside a cell. An important type of cellular response to the signals is the cell fate decision, e.g., apoptosis. However, the underlying mechanisms of cell fate regulation are still unclear, thus comprehensive and detailed kinetic models are not yet available. Alternatively, data-driven models are promising to bridge signaling data with the phenotypic measurements of cell fates. The traditional linear model for data-driven modeling of signaling pathways has its limitations because it assumes that a cell fate is proportional to the activities of signaling proteins, which is unlikely in complex biological systems. Therefore, we propose a power-law model to relate the activities of all the measured signaling proteins to the probabilities of cell fates. In our experiments, we compared our nonlinear power-law model with the linear model on three cancer datasets with phosphoproteomics and cell fate measurements, which demonstrated that the nonlinear model has superior performance on cell fate prediction. By in silico simulation of virtual protein knock-down, the proposed model is able to reveal drug effects which can complement traditional approaches such as binding affinity analysis. Moreover, our model is able to capture cell line specific information to distinguish one cell line from another in cell fate prediction. Our results show that the power-law data-driven model is able to perform better in cell fate prediction and provide more insights into the signaling pathways for cancer cell fates than the linear model. PMID:27764199
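The power-law form y = c * prod(x_i ** b_i) becomes linear in its parameters after taking logarithms, so it can be fitted by ordinary least squares; the sketch below also mimics the in silico knock-down idea on toy data:

```python
import numpy as np

# Toy data: four protein activities and a cell-fate readout with power-law structure.
rng = np.random.default_rng(7)
X = rng.uniform(0.5, 3.0, size=(200, 4))
true_b, true_c = np.array([0.8, -0.5, 0.3, 0.0]), 2.0
y = true_c * np.prod(X ** true_b, axis=1) * np.exp(0.05 * rng.standard_normal(200))

# log y = log c + sum_i b_i * log x_i  ->  ordinary least squares.
A = np.column_stack([np.log(X), np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
b_hat, c_hat = coef[:-1], float(np.exp(coef[-1]))
print("exponents:", b_hat.round(2), "scale:", round(c_hat, 2))

# In silico "knock-down" of protein 1: suppress its activity and re-predict.
x = X.mean(axis=0).copy()
x[1] *= 0.1
print("predicted response after knock-down:", float(c_hat * np.prod(x ** b_hat)))
```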
NASA Technical Reports Server (NTRS)
Shapiro, Bruce E.; Levchenko, Andre; Meyerowitz, Elliot M.; Wold, Barbara J.; Mjolsness, Eric D.
2003-01-01
Cellerator describes single and multi-cellular signal transduction networks (STN) with a compact, optionally palette-driven, arrow-based notation to represent biochemical reactions and transcriptional activation. Multi-compartment systems are represented as graphs with STNs embedded in each node. Interactions include mass-action, enzymatic, allosteric and connectionist models. Reactions are translated into differential equations and can be solved numerically to generate predictive time courses or output as systems of equations that can be read by other programs. Cellerator simulations are fully extensible and portable to any operating system that supports Mathematica, and can be indefinitely nested within larger data structures to produce highly scaleable models.
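The reaction-to-ODE translation at the heart of this approach can be illustrated for a basic enzymatic scheme S + E <-> SE -> P + E under mass-action kinetics (a generic sketch, not Cellerator output):

```python
import numpy as np
from scipy.integrate import solve_ivp

k1, km1, k2 = 1.0, 0.5, 0.3        # rate constants for S+E <-> SE -> P+E

def rhs(t, state):
    S, E, SE, P = state
    v1, vm1, v2 = k1 * S * E, km1 * SE, k2 * SE   # mass-action rates
    return [-v1 + vm1,            # dS/dt
            -v1 + vm1 + v2,       # dE/dt (enzyme recycled)
            v1 - vm1 - v2,        # dSE/dt
            v2]                   # dP/dt

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.2, 0.0, 0.0])
print("product concentration at t=50:", round(float(sol.y[3, -1]), 3))
```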
Studying Weather and Climate Using Atmospheric Retrospective Analyses
NASA Astrophysics Data System (ADS)
Bosilovich, M. G.
2014-12-01
Over the last 35 years, tremendous amounts of satellite observations of the Earth's atmosphere have been collected alongside the much longer and more diverse record of in situ measurements. The satellite data records have disparate qualities, structure and uncertainty, which make comparing weather from the 80s and the 2000s a challenging prospect. Likewise, in-situ data records lack complete coverage of the Earth in both space and time. Atmospheric reanalyses use the observations with numerical models and data assimilation to produce continuous and consistent weather data records for periods longer than decades. The result is a simplified data format with a relatively straightforward learning curve, with many more variables available (through the modeling component of the system) but driven by a full suite of observational data. The simplified data format eases the introduction to weather and climate data analysis. Some examples are provided from undergraduate meteorology program internship projects. We will present the students' progression through the projects, from their initial understanding and competencies to some final results and the skills learned along the way. Reanalyses are a leading research tool in weather and climate, but they can also provide an introductory experience, allowing students to develop an understanding of the physical system while learning basic programming and analysis skills.
Application driven interface generation for EASIE. M.S. Thesis
NASA Technical Reports Server (NTRS)
Kao, Ya-Chen
1992-01-01
The Environment for Application Software Integration and Execution (EASIE) provides a user interface and a set of utility programs which support the rapid integration and execution of analysis programs about a central relational database. EASIE provides users with two basic modes of execution. One of them is a menu-driven execution mode, called Application-Driven Execution (ADE), which provides sufficient guidance to review data, select a menu action item, and execute an application program. The other mode of execution, called Complete Control Execution (CCE), provides an extended executive interface which allows in-depth control of the design process. Currently, the EASIE system is based on alphanumeric techniques only. It is the purpose of this project to extend the flexibility of the EASIE system in the ADE mode by implementing it in a window system. Secondly, a set of utilities will be developed to assist the experienced engineer in the generation of an ADE application.
NASA Astrophysics Data System (ADS)
Adams, P. E.; Heinrichs, J. F.
2009-12-01
One of the greatest challenges facing the world is climate change. Coupled with this challenge is an under-informed population that has not received a rigorous education about climate change other than what is available through the media. Fort Hays State University is piloting a course on climate change targeted to students early in their academic careers. The course is modeled after our past work (NSF DUE-0088818) of integrating content knowledge instruction and student-driven research where there was a positive correlation between student research engagement and student knowledge gains. The current course, based on prior findings, utilizes a mix of inquiry-based instruction, problem-based learning, and student-driven research to educate and engage the students in understanding climate change. The course was collaboratively developed by a geoscientist and science educator both of whom are active in citizen science programs. The emphasis on civic engagement by students is reflected in the course structure. The course model is unique in that 50% of the course is dedicated to developing core knowledge and technical skills (e.g. critical analysis, writing, data acquisition, data representation, and research design), and 50% to conducting a research project using available data sets from federal agencies and research groups. A key element of the course is a focus on local and regional data sets to make climate change relevant to the students. The research serves as a means of civic engagement by the students as they are tasked to understand their role in communicating their research findings to the community and coping with the local and regional changes they find through their research.
NASA Astrophysics Data System (ADS)
Adams, P. E.; Heinrichs, J. F.
2010-12-01
One of the greatest challenges facing the world is climate change. Coupled with this challenge is an under-informed population that has not received a rigorous education about climate change other than what is available through the media. Fort Hays State University is in a second year of piloting a course on climate change targeted to students early in their academic careers. The course is modeled after our past work (NSF DUE-0088818) of integrating content knowledge instruction and student-driven research where there was a positive correlation between student research engagement and student knowledge gains. The second pilot offering utilizes a mix of inquiry-based instruction, problem-based learning, and student-driven research to educate and engage the students in understanding climate change. The course was collaboratively developed by a geoscientist and science educator both of whom are active in citizen science programs. The course model is unique in that 50% of the course is dedicated to developing core knowledge and technical skills (e.g. global climate change, critical analysis, writing, data acquisition, data representation, and research design), and 50% to conducting a research project using available data sets from federal agencies and research groups. A key element of the course is a focus on data sets to make climate change relevant to the students. The research serves as a means of civic engagement by the students as they are tasked to understand their role in communicating their research findings to the community and coping with the local and regional changes they find through their research. The impacts of course changes from the first offering to the second offering of the course will be reported, as well as the structure of the course.
Advancing data reuse in phyloinformatics using an ontology-driven Semantic Web approach.
Panahiazar, Maryam; Sheth, Amit P; Ranabahu, Ajith; Vos, Rutger A; Leebens-Mack, Jim
2013-01-01
Phylogenetic analyses can resolve historical relationships among genes, organisms or higher taxa. Understanding such relationships can elucidate a wide range of biological phenomena, including, for example, the importance of gene and genome duplications in the evolution of gene function, the role of adaptation as a driver of diversification, or the evolutionary consequences of biogeographic shifts. Phyloinformaticists are developing data standards, databases and communication protocols (e.g. Application Programming Interfaces, APIs) to extend the accessibility of gene trees, species trees, and the metadata necessary to interpret these trees, thus enabling researchers across the life sciences to reuse phylogenetic knowledge. Specifically, Semantic Web technologies are being developed to make phylogenetic knowledge interpretable by web agents, thereby enabling intelligently automated, high-throughput reuse of results generated by phylogenetic research. This manuscript describes an ontology-driven, semantic problem-solving environment for phylogenetic analyses and introduces artefacts that can support phyloinformatic efforts to promote the accessibility of trees and underlying metadata. PhylOnt is an extensible ontology with concepts describing tree types and tree building methodologies including estimation methods, models and programs. In addition, we present the PhylAnt platform for annotating scientific articles and NeXML files with PhylOnt concepts. The novelty of this work is the annotation of NeXML files and phylogenetics-related documents with the PhylOnt ontology. This approach advances data reuse in phyloinformatics.
Accurate position estimation methods based on electrical impedance tomography measurements
NASA Astrophysics Data System (ADS)
Vergara, Samuel; Sbarbaro, Daniel; Johansen, T. A.
2017-08-01
Electrical impedance tomography (EIT) is a technology that estimates the electrical properties of a body or a cross section. Its main advantages are its non-invasiveness, low cost and operation free of radiation. The estimation of the conductivity field leads to low-resolution images compared with other technologies, and a high computational cost. However, in many applications the target information has a low intrinsic dimensionality within the conductivity field. The estimation of this low-dimensional information is addressed in this work. It proposes optimization-based and data-driven approaches for estimating this low-dimensional information. The accuracy of the results obtained with these approaches depends on modelling and experimental conditions. Optimization approaches are sensitive to model discretization, the type of cost function and the searching algorithm. Data-driven methods are sensitive to the assumed model structure and the data set used for parameter estimation. The system configuration and experimental conditions, such as the number of electrodes and the signal-to-noise ratio (SNR), also have an impact on the results. In order to illustrate the effects of all these factors, the position estimation of a circular anomaly is addressed. Optimization methods based on weighted error cost functions and derivative-free optimization algorithms provided the best results. Data-driven approaches based on linear models provided, in this case, good estimates, but the use of nonlinear models enhanced the estimation accuracy. The results obtained by optimization-based algorithms were less sensitive to experimental conditions, such as the number of electrodes and SNR, than the data-driven approaches. Position estimation mean squared errors for simulation and experimental conditions were more than twice as large for the optimization-based approaches as for the data-driven ones. The experimental position estimation mean squared error of the data-driven models using a 16-electrode setup was less than 0.05% of the tomograph radius value. These results demonstrate that the proposed approaches can estimate an object’s position accurately based on EIT measurements if enough process information is available for training or modelling. Since they do not require complex calculations, it is possible to use them in real-time applications without requiring high-performance computers.
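A compressed sketch of the data-driven route, comparing a linear and a nonlinear regressor that map boundary measurements to anomaly position, is given below; the synthetic forward map and the 208-measurement vector (a typical count for a 16-electrode adjacent protocol) are assumptions for illustration:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor

# Synthetic "measurements": 208 boundary voltages per frame, nonlinearly
# dependent on the anomaly position (the true EIT forward model is replaced
# by a fixed random projection plus tanh for illustration).
rng = np.random.default_rng(8)
pos = rng.uniform(-0.7, 0.7, size=(2000, 2))
proj = rng.standard_normal((2, 208))
V = np.tanh(pos @ proj) + 0.01 * rng.standard_normal((2000, 208))

linear = Ridge(alpha=1e-3).fit(V[:1500], pos[:1500])
nonlin = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000,
                      random_state=0).fit(V[:1500], pos[:1500])
for name, m in [("linear", linear), ("nonlinear", nonlin)]:
    err = np.mean(np.sum((m.predict(V[1500:]) - pos[1500:]) ** 2, axis=1))
    print(name, "position MSE:", round(float(err), 5))
```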
Model driven development of clinical information systems using openEHR.
Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, Jim
2011-01-01
openEHR and the recent international standard (ISO 13606) defined a model driven software development methodology for health information systems. However there is little evidence in the literature describing implementation; especially for desktop clinical applications. This paper presents an implementation pathway using .Net/C# technology for Microsoft Windows desktop platforms. An endoscopy reporting application driven by openEHR Archetypes and Templates has been developed. A set of novel GUI directives has been defined and presented which guides the automatic graphical user interface generator to render widgets properly. We also reveal the development steps and important design decisions; from modelling to the final software product. This might provide guidance for other developers and form evidence required for the adoption of these standards for vendors and national programs alike.
ERIC Educational Resources Information Center
Varnavas, Andreas P.; Soteriou, Andreas C.
2002-01-01
Presents and discusses the approach used by the Higher Hotel Institute in Cyprus to incorporate total quality management through establishment of a customer-driven management culture in its hospitality education program. Discusses how it collects and uses service-quality related data from future employers, staff, and students in pursuing this…
NASA Astrophysics Data System (ADS)
Rath, S.; Sengupta, P. P.; Singh, A. P.; Marik, A. K.; Talukdar, P.
2013-07-01
Accurate prediction of roll force during hot strip rolling is essential for model-based operation of hot strip mills. Traditionally, mathematical models based on the theory of plastic deformation have been used for prediction of roll force. In the last decade, data-driven models like artificial neural networks have been tried for prediction of roll force. Pure mathematical models have accuracy limitations, whereas data-driven models have difficulty in convergence when applied to industrial conditions. Hybrid models integrating the traditional mathematical formulations and data-driven methods are being developed in different parts of the world. This paper discusses the methodology of development of an innovative hybrid mathematical-artificial neural network model. In the mathematical model, the most important factor influencing accuracy is the flow stress of steel. Coefficients of the standard flow stress equation, calculated by a parameter estimation technique, have been used in the model. The hybrid model has been trained and validated with input and output data collected from the finishing stands of the Hot Strip Mill, Bokaro Steel Plant, India. It has been found that model accuracy is improved with the hybrid model over the traditional mathematical model.
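One common hybrid construction consistent with this description is a physics-based force estimate corrected by an ANN trained on its residuals. The sketch below is generic: the flow-stress law, its coefficients and the contact-geometry factor are illustrative stand-ins, not the mill model:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def physics_force(strain, strain_rate, temp_K):
    """Stand-in plastic-deformation model; coefficients are illustrative."""
    flow_stress = 900.0 * strain**0.2 * strain_rate**0.1 * np.exp(2000.0 / temp_K)
    return flow_stress * 1.8e-3            # toy contact-geometry factor

rng = np.random.default_rng(9)
X = np.column_stack([rng.uniform(0.1, 0.5, 800),         # strain
                     rng.uniform(5.0, 50.0, 800),        # strain rate, 1/s
                     rng.uniform(1100.0, 1400.0, 800)])  # temperature, K
measured = physics_force(*X.T) * rng.normal(1.05, 0.03, 800)  # synthetic mill data

base = physics_force(*X.T)
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                   random_state=0).fit(X, measured - base)    # learn the residual
hybrid = base + ann.predict(X)
print("mean abs error, physics only:", float(np.mean(np.abs(measured - base))))
print("mean abs error, hybrid:     ", float(np.mean(np.abs(measured - hybrid))))
```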
Keselman, Alla; Ahmed, Einas A; Williamson, Deborah C; Kelly, Janice E; Dutcher, Gale A
2015-04-01
This paper describes a qualitative evaluation of a small-scale program aiming to improve health information literacy, leadership skills, and interest in health careers among high school students in a low-income, primarily minority community. Graduates participated in semi-structured interviews, transcripts of which were coded with a combination of objectives-driven and data-driven categories. The program had a positive impact on the participants' health information competency, leadership skills, academic orientation, and interest in health careers. Program enablers included a supportive network of adults, novel experiences, and strong mentorship. The study suggests that health information can provide a powerful context for enabling disadvantaged students' community engagement and academic success.
Scenario driven data modelling: a method for integrating diverse sources of data and data streams
2011-01-01
Background: Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allow both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web, make it possible to deal with complicated heterogeneous data in new and interesting ways. Results: This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions: We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario-derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
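The multi-relational directed graph at the core of SDDM can be sketched with the rdflib library; the vocabulary and instances below are invented for illustration, not the ontology used in the paper:

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/ndm1/")   # invented vocabulary, for illustration
g = Graph()
g.bind("ex", EX)

# A few multi-relational edges linking gene, isolate and media-report nodes.
g.add((EX.NDM1, RDF.type, EX.ResistanceGene))
g.add((EX.isolate42, EX.carriesGene, EX.NDM1))
g.add((EX.isolate42, EX.reportedIn, EX.newsItem17))
g.add((EX.newsItem17, EX.publishedOn, Literal("2011-05-02")))

# A graph traversal answering an end-user question: where was NDM-1 reported?
q = """SELECT ?report WHERE {
         ?isolate ex:carriesGene ex:NDM1 .
         ?isolate ex:reportedIn ?report .
       }"""
for row in g.query(q, initNs={"ex": EX}):
    print(row.report)
```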
Corporate social responsibility for nanotechnology oversight.
Kuzma, Jennifer; Kuzhabekova, Aliya
2011-11-01
Growing public concern and uncertainties surrounding emerging technologies suggest the need for socially-responsible behavior of companies in the development and implementation of oversight systems for them. In this paper, we argue that corporate social responsibility (CSR) is an important aspect of nanotechnology oversight given the role of trust in shaping public attitudes about nanotechnology and the lack of data about the health and environmental risks of nanoproducts. We argue that CSR is strengthened by the adoption of stakeholder-driven models and attention to moral principles in policies and programs. In this context, we examine drivers of CSR, contextual and leadership factors that influence CSR, and strategies for CSR. To illustrate these concepts, we discuss existing cases of CSR-like behavior in nanotechnology companies, and then provide examples of how companies producing nanomedicines can exhibit morally-driven CSR behavior.
Designing an optimal software intensive system acquisition: A game theoretic approach
NASA Astrophysics Data System (ADS)
Buettner, Douglas John
The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of the strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule- and cost-driven strategies demonstrates that the higher cost and effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game theory reasoning for schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model to describe the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality for schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that the use of a multi-player dynamic Nash bargaining game provides a solution for the observed lack-of-quality game between the government (the acquirer) and "large-corporation" software developers. A note is provided that argues this multi-player dynamic Nash bargaining game also provides the solution to Freeman Dyson's problem of finding a way to place a label of good or bad on systems.
Data Driven Model Development for the SuperSonic SemiSpan Transport (S4T)
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2011-01-01
In this report, we will investigate two common approaches to model development for robust control synthesis in the aerospace community; namely, reduced order aeroservoelastic modelling based on structural finite-element and computational fluid dynamics based aerodynamic models, and a data-driven system identification procedure. It is shown via analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data that by using a system identification approach it is possible to estimate a model at a fixed Mach, which is parsimonious and robust across varying dynamic pressures.
NASA Astrophysics Data System (ADS)
Zheng, Feifei; Maier, Holger R.; Wu, Wenyan; Dandy, Graeme C.; Gupta, Hoshin V.; Zhang, Tuqiao
2018-02-01
Hydrological models are used for a wide variety of engineering purposes, including streamflow forecasting and flood-risk estimation. To develop such models, it is common to allocate the available data to calibration and evaluation data subsets. Surprisingly, the issue of how this allocation can affect model evaluation performance has been largely ignored in the research literature. This paper discusses the evaluation performance bias that can arise from how available data are allocated to calibration and evaluation subsets. As a first step to assessing this issue in a statistically rigorous fashion, we present a comprehensive investigation of the influence of data allocation on the development of data-driven artificial neural network (ANN) models of streamflow. Four well-known formal data splitting methods are applied to 754 catchments from Australia and the U.S. to develop 902,483 ANN models. Results clearly show that the choice of the method used for data allocation has a significant impact on model performance, particularly for runoff data that are more highly skewed, highlighting the importance of considering the impact of data splitting when developing hydrological models. The statistical behavior of the data splitting methods investigated is discussed and guidance is offered on the selection of the most appropriate data splitting methods to achieve representative evaluation performance for streamflow data with different statistical properties. Although our results are obtained for data-driven models, they highlight the fact that this issue is likely to have a significant impact on all types of hydrological models, especially conceptual rainfall-runoff models.
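The sensitivity to allocation is easy to demonstrate: for a skewed (e.g., lognormal-like) daily flow series, different splitting rules leave the evaluation subset with visibly different tail statistics. A minimal sketch comparing a random split with a systematic every-kth-day split:

```python
import numpy as np

rng = np.random.default_rng(10)
flow = rng.lognormal(mean=2.0, sigma=1.2, size=3650)   # skewed daily flow series

idx = rng.permutation(flow.size)
random_eval = flow[idx[: flow.size // 3]]              # random third for evaluation
systematic_eval = flow[::3]                            # every third day instead

for name, sub in [("random", random_eval), ("systematic", systematic_eval)]:
    print(f"{name:>10}: mean={sub.mean():7.2f}  "
          f"p99={np.quantile(sub, 0.99):8.2f}  max={sub.max():8.2f}  "
          f"(full-record max={flow.max():.2f})")
```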
Confessions of Former Teen Program Participants: Two Decades Later
ERIC Educational Resources Information Center
Flores, Fabrizio; Wyrick, Gabrielle; Zwicky, Calder
2014-01-01
As a companion to more data-driven articles and studies that consider the long-term impact of art museum teen programs on alumni, this article takes the form of a person to person interview with two founding teen members of important programs that emerged in the 1990s. Talking candidly about the impact of their program participation, Calder Zwicky…
NASA Technical Reports Server (NTRS)
Niiler, Peran P.
2004-01-01
The scientific objective of this research program was to utilize drifter data, Jason-1 altimeter data, and a variety of wind data for the determination of the time-mean and time-variable wind-driven surface currents of the global ocean. Accomplishing this task required the interpolation of 6-hourly winds onto drifter tracks and the computation of the wind-coherent motions of the drifters. These calculations showed that the Ekman current model proposed by Ralph and Niiler for the tropical Pacific was valid for all the oceans south of 40N latitude. Improvements to the RN99 model were computed, and poster presentations of the results were given in several ocean science venues, including the November 2004 GODAE meeting in St. Petersburg, FL.
NASA Astrophysics Data System (ADS)
Wang, Hexiang; Schuster, Eugenio; Rafiq, Tariq; Kritz, Arnold; Ding, Siye
2016-10-01
Extensive research has been conducted to find high-performance operating scenarios characterized by high fusion gain, good confinement, plasma stability and possible steady-state operation. A key plasma property that is related to both the stability and performance of these advanced plasma scenarios is the safety factor profile. A key component of the EAST research program is the exploration of non-inductively driven steady-state plasmas with the recently upgraded heating and current drive capabilities that include lower hybrid current drive and neutral beam injection. Anticipating the need for tight regulation of the safety factor profile in these plasma scenarios, a first-principles-driven (FPD) control-oriented model is proposed to describe the safety factor profile evolution in EAST in response to the different actuators. The TRANSP simulation code is employed to tailor the FPD model to the EAST tokamak geometry and to convert it into a form suitable for control design. The FPD control-oriented model's prediction capabilities are demonstrated by comparing predictions with experimental data from EAST. Supported by the US DOE under DE-SC0010537, DE-FG02-92ER54141 and DE-SC0013977.
Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria
2016-09-23
The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In the hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains were not present in any pattern. High intakes of sodium, fats and sugars were observed in the hypothesis-driven analysis, with low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits, which will be important in order to drive public health programs and improve their efficiency in monitoring and evaluating the dietary patterns of populations.
Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin
2016-01-01
The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and Kriging method is realized to generate an anomaly indicator. The experiment results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035
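The TCP-inspired idea can be caricatured as additive-increase/multiplicative-decrease on the sampling interval: sleep longer while readings are stable, wake up faster when they change. Thresholds and limits below are illustrative, not from the paper:

```python
def next_interval(interval, change, threshold=0.5,
                  step=5.0, backoff=0.5, lo=5.0, hi=300.0):
    """Next sleep interval (s): additive increase when quiet,
    multiplicative decrease when a significant change is observed."""
    if abs(change) > threshold:
        return max(lo, interval * backoff)   # activity: sample faster
    return min(hi, interval + step)          # quiet: sleep longer

interval, last = 30.0, 0.0
for reading in [0.10, 0.12, 0.11, 0.90, 1.60, 1.70, 1.71, 1.70, 1.72]:
    interval = next_interval(interval, reading - last)
    last = reading
    print(f"reading={reading:.2f} -> next interval {interval:5.1f} s")
```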
ERIC Educational Resources Information Center
Wymbs, Cliff
2016-01-01
The designing of a new, potentially disruptive, curricular program, is not without challenges; however, it can be rewarding for students, faculty, and employers and serve as a template for other academics to follow. To be effective, the new data analytics program should be driven by business input and academic leadership that incorporates…
DEBRIS: a computer program for analyzing channel cross sections
Patrick Deenihan; Thomas E. Lisle
1988-01-01
DEBRIS is a menu-driven, interactive computer program written in FORTRAN 77 for recording and plotting survey data and for computing hydraulic variables and depths of scour and fill. It was developed for use with the USDA Forest Service's Data General computer system, with the AOS/VS operating system. By using menus, the operator does not need to know any...
Wu, Xiao; Shen, Jiong; Li, Yiguo; Lee, Kwang Y
2014-05-01
This paper develops a novel data-driven fuzzy modeling strategy and predictive controller for a boiler-turbine unit using fuzzy clustering and subspace identification (SID) methods. To deal with the nonlinear behavior of the boiler-turbine unit, fuzzy clustering is used to provide an appropriate division of the operation region and develop the structure of the fuzzy model. Then, by combining the input data with the corresponding fuzzy membership functions, the SID method is extended to extract the local state-space model parameters. Owing to the advantages of both methods, the resulting fuzzy model can represent the boiler-turbine unit very closely, and a fuzzy model predictive controller is designed based on this model. As an alternative approach, a direct data-driven fuzzy predictive control is also developed following the same clustering and subspace methods, where intermediate subspace matrices developed during the identification procedure are utilized directly as the predictor. Simulation results show the advantages and effectiveness of the proposed approach.
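The fuzzy-clustering step can be illustrated with a plain fuzzy c-means partition of operating-point data; the subspace identification step that the paper combines with it is omitted. This is a hedged sketch, not the paper's algorithm; the cluster count, fuzzifier m, and synthetic data are assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: returns cluster centres and membership matrix U (n x c)."""
    gen = np.random.default_rng(seed)
    U = gen.dirichlet(np.ones(c), size=len(X))          # random fuzzy partition
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=-1) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                  # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Operating-point data (e.g., load and pressure); the memberships would then weight
# the input data fed to a local subspace identification in each fuzzy region.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(mu, 0.3, size=(50, 2)) for mu in ([0, 0], [3, 1], [6, 2])])
centres, U = fuzzy_c_means(X, c=3)
print(np.round(centres, 2))
```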
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
Chaste: A test-driven approach to software development for biological modelling
NASA Astrophysics Data System (ADS)
Pitt-Francis, Joe; Pathmanathan, Pras; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Fletcher, Alexander G.; Mirams, Gary R.; Murray, Philip; Osborne, James M.; Walter, Alex; Chapman, S. Jon; Garny, Alan; van Leeuwen, Ingeborg M. M.; Maini, Philip K.; Rodríguez, Blanca; Waters, Sarah L.; Whiteley, Jonathan P.; Byrne, Helen M.; Gavaghan, David J.
2009-12-01
Chaste ('Cancer, heart and soft-tissue environment') is a software library and a set of test suites for computational simulations in the domain of biology. Current functionality has arisen from modelling in the fields of cancer, cardiac physiology and soft-tissue mechanics. It is released under the LGPL 2.1 licence. Chaste has been developed using agile programming methods. The project began in 2005 when it was reasoned that the modelling of a variety of physiological phenomena required both a generic mathematical modelling framework, and a generic computational/simulation framework. The Chaste project evolved from the Integrative Biology (IB) e-Science Project, an inter-institutional project aimed at developing a suitable IT infrastructure to support physiome-level computational modelling, with a primary focus on cardiac and cancer modelling.
Program summary
Program title: Chaste
Catalogue identifier: AEFD_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFD_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: LGPL 2.1
No. of lines in distributed program, including test data, etc.: 5 407 321
No. of bytes in distributed program, including test data, etc.: 42 004 554
Distribution format: tar.gz
Programming language: C++
Operating system: Unix
Has the code been vectorised or parallelized?: Yes. Parallelized using MPI.
RAM: < 90 Megabytes for two of the scenarios described in Section 6 of the manuscript (Monodomain re-entry on a slab or Cylindrical crypt simulation). Up to 16 Gigabytes (distributed across processors) for a full-resolution bidomain cardiac simulation.
Classification: 3
External routines: Boost, CodeSynthesis XSD, CxxTest, HDF5, METIS, MPI, PETSc, Triangle, Xerces
Nature of problem: Chaste may be used for solving coupled ODE and PDE systems arising from modelling biological systems. Use of Chaste in two application areas is described in this paper: cardiac electrophysiology and intestinal crypt dynamics.
Solution method: Coupled multi-physics with PDE, ODE and discrete mechanics simulation.
Running time: The largest cardiac simulation described in the manuscript takes about 6 hours to run on a single 3 GHz core. See the results section (Section 6) of the manuscript for discussion of parallel scaling.
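Chaste itself is C++ tested with CxxTest, but the test-driven style the paper describes can be illustrated in miniature. Below is a hedged Python/pytest analogue in which the tests for a hypothetical logistic-growth model are written to pin down the expected behaviour before, or alongside, the implementation; none of this code comes from Chaste.

```python
# test_logistic.py -- in the test-driven style, these tests specify behaviour first.
import numpy as np

def logistic_growth(n0, r, K, t):
    """Closed-form logistic solution N(t); the unit under test."""
    return K / (1 + (K / n0 - 1) * np.exp(-r * t))

def test_initial_condition():
    assert np.isclose(logistic_growth(10.0, 0.5, 100.0, 0.0), 10.0)

def test_carrying_capacity_is_asymptote():
    assert np.isclose(logistic_growth(10.0, 0.5, 100.0, 1e3), 100.0)

def test_monotone_growth_below_capacity():
    t = np.linspace(0, 20, 50)
    n = logistic_growth(10.0, 0.5, 100.0, t)
    assert np.all(np.diff(n) > 0)
```

Run with `pytest test_logistic.py`; a failing test precedes (and motivates) each change to the model code.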
Curriculum Redesign in Veterinary Medicine: Part II.
Macik, Maria L; Chaney, Kristin P; Turner, Jacqueline S; Rogers, Kenita S; Scallan, Elizabeth M; Korich, Jodi A; Fowler, Debra; Keefe, Lisa M
Curricular review is considered a necessary component for growth and enhancement of academic programs and requires time, energy, creativity, and persistence from both faculty and administration. On a larger scale, a comprehensive redesign effort involves forming a dedicated faculty redesign team, developing program learning outcomes, mapping the existing curriculum, and reviewing the curriculum in light of collected stakeholder data. The faculty of the Texas A&M University College of Veterinary Medicine & Biomedical Sciences (TAMU) recently embarked on a comprehensive curriculum redesign effort through partnership with the university's Center for Teaching Excellence. Using a previously developed evidence-based model of program redesign, TAMU created a process for use in veterinary medical education, which is described in detail in the first part of this article series. An additional component of the redesign process that is understated, yet vital for success, is faculty buy-in and support. Without faculty engagement, implementation of data-driven curricular changes stemming from program evaluation may be challenging. This second part of the article series describes the methodology for encouraging faculty engagement through the final steps of the redesign initiative and the lessons learned by TAMU through the redesign process.
Accountability: A M.E.A.S.U.R.E of the Impact School Counselors Have on Student Achievement.
ERIC Educational Resources Information Center
Dahir, Carol A.; Stone, Carolyn B.
2003-01-01
Presents information on M.E.A.S.U.R.E., a process that assists school counselors in delivering a data-driven school counseling program. Highlights challenges faced by school counselors in an accountability-driven environment and provisions of the No Child Left Behind Act. (Contains 10 references and 6 tables.) (GCP)
Champagne, François; Lemieux-Charles, Louise; Duranceau, Marie-France; MacKean, Gail; Reay, Trish
2014-05-02
The impact of efforts by healthcare organizations to enhance the use of evidence to improve organizational processes through training programs has seldom been assessed. We therefore endeavored to assess whether and how the training of mid- and senior-level healthcare managers could lead to organizational change. We conducted a theory-driven evaluation of the organizational impact of healthcare leaders' participation in two training programs using a logic model based on Nonaka's theory of knowledge conversion. We analyzed six case studies nested within the two programs using three embedded units of analysis (individual, group and organization). Interviews were conducted during intensive one-week data collection site visits. A total of 84 people were interviewed. We found that the impact of training could primarily be felt in trainees' immediate work environments. The conversion of attitudes was found to be easier to achieve than the conversion of skills. Our results show that, although socialization and externalization were common in all cases, a lack of combination impeded the conversion of skills. We also identified several individual, organizational and program design factors that facilitated and/or impeded the dissemination of the attitudes and skills gained by trainees to other organizational members. Our theory-driven evaluation showed that factors before, during and after training can influence the extent of skills and knowledge transfer. Our evaluation went further than previous research by revealing the influence--both positive and negative--of specific organizational factors on extending the impact of training programs.
Cryo-EM Data Are Superior to Contact and Interface Information in Integrative Modeling.
de Vries, Sjoerd J; Chauvot de Beauchêne, Isaure; Schindler, Christina E M; Zacharias, Martin
2016-02-23
Protein-protein interactions carry out a large variety of essential cellular processes. Cryo-electron microscopy (cryo-EM) is a powerful technique for the modeling of protein-protein interactions at a wide range of resolutions, and recent developments have caused a revolution in the field. At low resolution, cryo-EM maps can drive integrative modeling of the interaction, assembling existing structures into the map. Other experimental techniques can provide information on the interface or on the contacts between the monomers in the complex. This inevitably raises the question of which type of data is best suited to drive integrative modeling approaches. A systematic comparison of the prediction accuracy and specificity of the different integrative modeling paradigms has been unavailable to date. Here, we compare EM-driven, interface-driven, and contact-driven integrative modeling paradigms. Models were generated for the protein docking benchmark using the ATTRACT docking engine and evaluated using the CAPRI two-star criterion. At 20 Å resolution, EM-driven modeling achieved a success rate of 100%, outperforming the other paradigms even with perfect interface and contact information. Therefore, even very low resolution cryo-EM data is superior in predicting heterodimeric and heterotrimeric protein assemblies. Our study demonstrates that a force field is not necessary: cryo-EM data alone is sufficient to accurately guide the monomers into place. The resulting rigid models successfully identify regions of conformational change, opening up perspectives for targeted flexible remodeling.
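The core of EM-driven scoring, fitting a candidate placement against a density map, can be sketched with a normalized cross-correlation between a rasterized model and the map. The following Python sketch is illustrative only and is not ATTRACT's energy function; the rasterization, blur width, and synthetic "map" are all assumptions.

```python
import numpy as np

def atoms_to_density(coords, shape, voxel=2.0, sigma=2.0):
    """Rasterize point atoms onto a grid, then apply a crude FFT Gaussian blur."""
    grid = np.zeros(shape)
    idx = np.clip(np.round(coords / voxel).astype(int), 0, np.array(shape) - 1)
    for i, j, k in idx:
        grid[i, j, k] += 1.0
    kx = np.fft.fftfreq(shape[0])[:, None, None]
    ky = np.fft.fftfreq(shape[1])[None, :, None]
    kz = np.fft.fftfreq(shape[2])[None, None, :]
    g = np.exp(-2 * (np.pi * sigma / voxel) ** 2 * (kx**2 + ky**2 + kz**2))
    return np.real(np.fft.ifftn(np.fft.fftn(grid) * g))

def ncc(a, b):
    """Normalized cross-correlation between model density and EM map."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
true_coords = rng.uniform(10, 50, size=(200, 3))
em_map = atoms_to_density(true_coords, (32, 32, 32))      # stand-in "experimental" map
good = ncc(atoms_to_density(true_coords + 1.0, (32, 32, 32)), em_map)
bad = ncc(atoms_to_density(rng.uniform(10, 50, (200, 3)), (32, 32, 32)), em_map)
print(f"near-native NCC={good:.3f}  random placement NCC={bad:.3f}")
```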
ERIC Educational Resources Information Center
Dierker, Lisa; Ward, Nadia; Alexander, Jalen; Donate, Emmanuel
2017-01-01
Background: Upward trends in data-oriented careers threaten to further increase the underrepresentation of both females and individuals from racial minority groups in programs focused on data analysis and applied statistics. To begin to develop the necessary skills for a data-oriented career, project-based learning seems the most promising given…
NASA Astrophysics Data System (ADS)
Grosskopf, M. J.; Drake, R. P.; Trantham, M. R.; Kuranz, C. C.; Keiter, P. A.; Rutter, E. M.; Sweeney, R. M.; Malamud, G.
2012-10-01
The radiation hydrodynamics code developed by the Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan has been used to model experimental designs for high-energy-density physics campaigns on OMEGA and other high-energy laser facilities. This code is an Eulerian, block-adaptive AMR hydrodynamics code with implicit multigroup radiation transport and electron heat conduction. CRASH model results have shown good agreement with experimental results from a variety of applications, including radiative shock, Kelvin-Helmholtz and Rayleigh-Taylor experiments on the OMEGA laser, as well as laser-driven ablative plumes in experiments by the Astrophysical Collisionless Shocks Experiments with Lasers (ACSEL) collaboration. We report a series of results with the CRASH code in support of design work for upcoming high-energy-density physics experiments, as well as comparisons between existing experimental data and simulation results. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DE-FC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.
Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.
Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe
2018-01-01
Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight predictable future trends regarding methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical relationships underlying in-silico systems biology models.
USDA-ARS?s Scientific Manuscript database
Recent years have witnessed a call for evidence-based decisions in conservation and natural resource management, including data-driven decision-making. Adaptive management (AM) is one prevalent model for integrating scientific data into decision-making, yet AM has faced numerous challenges and limit...
Statistical and engineering methods for model enhancement
NASA Astrophysics Data System (ADS)
Chang, Chia-Jung
Models that describe the performance of a physical process are essential for quality prediction, experimental planning, process control and optimization. Engineering models developed from the underlying physics/mechanics of the process, such as analytic models or finite element models, are widely used to capture the deterministic trend of the process. However, stochastic randomness in the system usually introduces discrepancies between physics-based model predictions and real observations. Alternatively, statistical models can be used to obtain predictions based purely on the data generated from the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based and statistical models, mitigating their individual drawbacks and providing models with better accuracy by combining the strengths of both. The proposed model enhancement methodologies comprise two streams: (1) data-driven enhancement approaches and (2) engineering-driven enhancement approaches. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring and decision optimization. Among data-driven enhancement approaches, the Gaussian process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we propose a novel enhancement procedure, named “Minimal Adjustment”, which brings the physical model closer to the data by making minimal changes to it. This is achieved by approximating the GP model by a linear regression model and then applying simultaneous variable selection over the model adjustment and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Rather than enhancing the model from a data-driven perspective, an alternative approach is to adjust the model by incorporating additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are demonstrated through two applications. The first application concerns polymer composite quality, where nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantification of its characteristics. In Chapter 3, we develop an engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize the nanoparticle dispersion status of nanocomposite polymers, quantitatively representing the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of the limited amount of accessible data due to time-consuming sampling schemes. The second application statistically calibrates engineering-driven force models of the laser-assisted micro milling (LAMM) process, facilitating a systematic understanding and optimization of the targeted processes.
In Chapter 4, the force prediction interval is derived by incorporating variability in the runout parameters as well as in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval. To conclude, this dissertation draws attention to model enhancement, which has considerable impact on the modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and further applied to various applications. These research activities developed engineering-compliant models for adequate system prediction based on observational data with complex variable relationships and uncertainty, facilitating process planning, monitoring, and real-time control.
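The "Minimal Adjustment" idea of Chapter 2, approximating the discrepancy with a linear model and simultaneously selecting adjustment and experimental-bias terms, can be caricatured with a sparse regression. The sketch below uses scikit-learn's Lasso as a stand-in for the dissertation's variable-selection machinery; the basis functions, batch indicator, and data are invented.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 80)
physics = np.sin(2 * np.pi * x)                      # physics-based model prediction
block = (np.arange(80) >= 40).astype(float)          # indicator for a second experimental batch
y = physics + 0.3 * block + rng.normal(0, 0.05, 80)  # data: a batch bias, no real model error

# Candidate adjustment basis (polynomials) plus an experimental-bias indicator.
A = np.column_stack([x, x**2, x**3, block])
lasso = Lasso(alpha=0.01).fit(A, y - physics)        # sparse fit of the discrepancy
print(dict(zip(["x", "x^2", "x^3", "batch bias"], np.round(lasso.coef_, 3))))
# Ideally only the batch-bias coefficient survives: the physics model then needs
# minimal adjustment, and the discrepancy is attributed to experimental error.
```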
Data integrity systems for organ contours in radiation therapy planning.
Shah, Veeraj P; Lakshminarayanan, Pranav; Moore, Joseph; Tran, Phuoc T; Quon, Harry; Deville, Curtiland; McNutt, Todd R
2018-06-12
The purpose of this research is to develop effective data integrity models for contoured anatomy in a radiotherapy workflow, for both real-time and retrospective analysis. Within this study, two classes of contour integrity models were developed: data-driven models and contiguousness models. The data-driven models aim to highlight contours that deviate from a gross set of contours from similar disease sites and encompass the following regions of interest (ROI): bladder, femoral heads, spinal cord, and rectum. The contiguousness models, which individually analyze the geometry of contours to detect possible errors, are applied across many different ROIs and are divided into two metrics: Extent and Region Growing over volume. After analysis, we found that 70% of detected bladder contours were verified as suspicious. The spinal cord and rectum models verified that 73% and 80% of contours were suspicious, respectively. The contiguousness models were the most accurate models, and the Region Growing model was the most accurate submodel. 100% of the detected noncontiguous contours were verified as suspicious, but in the cases of spinal cord, femoral heads, bladder, and rectum, the Region Growing model detected an additional two to five suspicious contours that the Extent model failed to detect. When conducting a blind review to detect false negatives, it was found that all the data-driven models failed to detect all suspicious contours. The Region Growing contiguousness model produced zero false negatives in all regions of interest other than prostate. With regard to runtime, the contiguousness-via-Extent model took an average of 0.2 s per contour. On the other hand, the Region Growing method had a longer runtime, dependent on the number of voxels in the contour. Both contiguousness models have potential for real-time use in clinical radiotherapy, while the data-driven models are better suited for retrospective use.
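The contiguousness-via-region-growing check lends itself to a compact sketch: label the connected components of the binary contour mask and flag the contour if it is fragmented. The Python sketch below, using scipy.ndimage, is a plausible reading of the method rather than the authors' code; the connectivity choice and the toy mask are assumptions.

```python
import numpy as np
from scipy import ndimage

def contiguousness_check(mask):
    """Region-growing-style check: label connected components of a binary
    contour mask and flag the contour if more than one component exists."""
    labels, n = ndimage.label(mask)              # default 6-connectivity in 3D
    sizes = np.bincount(labels.ravel())[1:]      # voxels per component (skip background)
    return n, sizes, n > 1                       # suspicious if fragmented

# Toy 3D organ mask with a stray disconnected blob.
mask = np.zeros((20, 20, 20), dtype=bool)
mask[5:10, 5:10, 5:10] = True                    # main structure
mask[15, 15, 15] = True                          # stray voxel that should be flagged
print(contiguousness_check(mask))                # (2, array([125, 1]), True)
```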
Open-ocean boundary conditions from interior data: Local and remote forcing of Massachusetts Bay
Bogden, P.S.; Malanotte-Rizzoli, P.; Signell, R.
1996-01-01
Massachusetts and Cape Cod Bays form a semienclosed coastal basin that opens onto the much larger Gulf of Maine. Subtidal circulation in the bay is driven by local winds and remotely driven flows from the gulf. The local-wind-forced flow is estimated with a regional shallow water model driven by wind measurements. The model uses a gravity wave radiation condition along the open-ocean boundary. Results compare reasonably well with observed currents near the coast. In some offshore regions, however, modeled flows are an order of magnitude less energetic than the data. Strong flows are observed even during periods of weak local wind forcing. Poor model-data comparisons are attributable, at least in part, to open-ocean boundary conditions that neglect the effects of remote forcing. Velocity measurements from within Massachusetts Bay are used to estimate the remotely forced component of the flow. The data are combined with shallow water dynamics in an inverse-model formulation that follows the theory of Bennett and McIntosh [1982], who considered tides. We extend their analysis to consider the subtidal response to transient forcing. The inverse model adjusts the a priori open-ocean boundary condition, thereby minimizing a combined measure of model-data misfit and boundary condition adjustment. A "consistency criterion" determines the optimal trade-off between the two. The criterion is based on a measure of plausibility for the inverse solution. The "consistent" inverse solution reproduces 56% of the average squared variation in the data. The local-wind-driven flow alone accounts for half of the model skill. The other half is attributable to remotely forced flows from the Gulf of Maine. The unexplained 44% comes from measurement errors and model errors that are not accounted for in the analysis.
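The trade-off the inverse model negotiates, misfit versus boundary-condition adjustment, has the familiar ridge/Tikhonov form. The sketch below is a generic linearized caricature, not the Bennett and McIntosh formulation; the operator G, noise level, and the scan over the trade-off weight lam are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(size=(40, 10))             # linearized map: boundary adjustment -> interior flow
m_true = np.zeros(10); m_true[3] = 1.0
d = G @ m_true + rng.normal(0, 0.2, 40)   # interior velocity "measurements"

def solve(lam):
    """Minimize ||G m - d||^2 + lam * ||m||^2 (ridge form of the trade-off)."""
    return np.linalg.solve(G.T @ G + lam * np.eye(10), G.T @ d)

for lam in [1e-3, 1e-1, 1e1]:
    m = solve(lam)
    misfit = np.linalg.norm(G @ m - d)
    print(f"lam={lam:g}  misfit={misfit:.2f}  ||adjustment||={np.linalg.norm(m):.2f}")
# A consistency-style criterion would pick lam where the misfit matches the
# assumed measurement-error level rather than fitting the noise.
```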
Supervised dictionary learning for inferring concurrent brain networks.
Zhao, Shijie; Han, Junwei; Lv, Jinglei; Jiang, Xi; Hu, Xintao; Zhao, Yu; Ge, Bao; Guo, Lei; Liu, Tianming
2015-10-01
Task-based fMRI (tfMRI) has been widely used to explore functional brain networks via predefined stimulus paradigms in the fMRI scan. Traditionally, the general linear model (GLM) has been the dominant approach to detect task-evoked networks. However, GLM focuses on task-evoked or event-evoked brain responses and possibly ignores intrinsic brain functions. In comparison, dictionary learning and sparse coding methods have attracted much attention recently, and these methods have shown promise in automatically and systematically decomposing fMRI signals into meaningful task-evoked and intrinsic concurrent networks. Nevertheless, two notable limitations of current data-driven dictionary learning methods are that prior knowledge of the task paradigm is not sufficiently utilized and that establishing correspondences among dictionary atoms across different brains has been challenging. In this paper, we propose a novel supervised dictionary learning and sparse coding method for inferring functional networks from tfMRI data, which combines the advantages of model-driven and data-driven methods. The basic idea is to fix the task stimulus curves as predefined model-driven dictionary atoms and only optimize the remaining portion of data-driven dictionary atoms. Application of this novel methodology to the publicly available Human Connectome Project (HCP) tfMRI datasets achieved promising results.
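The key mechanism, holding the task-paradigm atoms fixed while learning the remaining data-driven atoms, can be sketched in a few lines. The following Python sketch alternates sparse coding (scikit-learn's sparse_encode) with a least-squares update of only the free atoms; atom counts, the sparsity penalty, and the synthetic boxcar task are assumptions, and this is not the authors' optimizer.

```python
import numpy as np
from sklearn.decomposition import sparse_encode

def supervised_dl(X, D_task, n_free=5, alpha=0.1, iters=10, seed=0):
    """Dictionary learning with the task regressors held fixed.

    X      : (n_voxels, n_time) fMRI signal matrix
    D_task : (n_task, n_time) fixed model-driven atoms (stimulus curves)
    Only the n_free data-driven atoms are updated."""
    rng = np.random.default_rng(seed)
    idx0 = rng.choice(len(X), size=n_free, replace=False)
    D_free = X[idx0] / np.linalg.norm(X[idx0], axis=1, keepdims=True)
    for _ in range(iters):
        D = np.vstack([D_task, D_free])
        C = sparse_encode(X, D, algorithm="lasso_lars", alpha=alpha)  # sparse codes
        C_task, C_free = C[:, :len(D_task)], C[:, len(D_task):]
        R = X - C_task @ D_task                  # residual unexplained by the task model
        D_free, *_ = np.linalg.lstsq(C_free, R, rcond=None)
        norms = np.linalg.norm(D_free, axis=1, keepdims=True)
        D_free /= np.where(norms > 0, norms, 1.0)
    return np.vstack([D_task, D_free]), C

# Synthetic demo: one boxcar "task" atom plus unknown intrinsic components.
t = np.linspace(0, 1, 200)
task = (np.sin(8 * np.pi * t) > 0).astype(float)[None, :]
X = np.outer(np.random.default_rng(1).normal(size=300), task[0]) \
    + 0.5 * np.random.default_rng(2).normal(size=(300, 200))
D, C = supervised_dl(X, task)
```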
Systems Engineering for Distributed, Live, Virtual, and Constructive (LVC) Simulation
2010-12-01
programming languages like the Scala programming language (Wampler et al. 2009) provide tighter control of syntax guidance and problem... subsequently linked to the technical design. Doing this within a data-driven systems engineering infrastructure allows generative programming techniques
Chen, Gang; Glen, Daniel R.; Saad, Ziad S.; Hamilton, J. Paul; Thomason, Moriah E.; Gotlib, Ian H.; Cox, Robert W.
2011-01-01
Vector autoregression (VAR) and structural equation modeling (SEM) are two popular brain-network modeling tools. VAR, which is a data-driven approach, assumes that connected regions exert time-lagged influences on one another. In contrast, the hypothesis-driven SEM is used to validate an existing connectivity model, where connected regions have contemporaneous interactions among them. We present the two models in detail and discuss their applicability to FMRI data and their interpretational limits. We also propose a unified approach that models both lagged and contemporaneous effects. The unifying model, structural vector autoregression (SVAR), may improve statistical and explanatory power, and avoids some prevalent pitfalls that can occur when VAR and SEM are utilized separately. PMID:21975109
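The data-driven half of an SVAR analysis can be sketched with an off-the-shelf VAR fit: the lag-1 coefficients capture the time-lagged influences, while the residual correlations are the raw material for the contemporaneous (SEM-like) part. The sketch below uses statsmodels on synthetic two-region data; the coupling coefficients are invented.

```python
import numpy as np
from statsmodels.tsa.api import VAR

# Two synthetic ROI time series with a known lagged influence x -> y.
rng = np.random.default_rng(0)
n = 500
x = np.zeros(n); y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.4 * x[t - 1] + 0.3 * y[t - 1] + rng.normal()

res = VAR(np.column_stack([x, y])).fit(1)   # data-driven lagged (VAR) part
print(res.coefs[0])                         # lag-1 coefficient matrix (x -> y visible)
print(np.corrcoef(res.resid.T))             # residual correlations: input to the
                                            # contemporaneous (SEM-like) part of an SVAR
```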
Satellite Data Processing System (SDPS) users manual V1.0
NASA Technical Reports Server (NTRS)
Caruso, Michael; Dunn, Chris
1989-01-01
SDPS is a menu-driven interactive program designed to facilitate the display and output of image and line-based data sets common to telemetry, modeling and remote sensing. This program can be used to display up to four separate raster images and overlay line-based data such as coastlines, ship tracks and velocity vectors. The program uses multiple windows to communicate information with the user. At any given time, the program may have up to four image display windows as well as auxiliary windows containing information about each image displayed. SDPS is not a commercial program; it does not contain complete type checking or error diagnostics, which may allow the program to crash. Known anomalies are mentioned in the appropriate sections as notes or cautions. SDPS was designed to be used on Sun Microsystems workstations running SunView1 (Sun Visual/Integrated Environment for Workstations). It was primarily designed for workstations equipped with color monitors, but most of the line-based functions and several of the raster-based functions can be used with monochrome monitors. The program currently runs on Sun 3 series workstations running Sun OS 4.0 and should port easily to Sun 4 and Sun 386 series workstations with SunView1. Users should also be familiar with UNIX, Sun workstations and the SunView window system.
Imaging plus X: multimodal models of neurodegenerative disease.
Oxtoby, Neil P; Alexander, Daniel C
2017-08-01
This article argues that the time is approaching for data-driven disease modelling to take centre stage in the study and management of neurodegenerative disease. The snowstorm of data now available to the clinician defies qualitative evaluation; the heterogeneity of data types complicates integration through traditional statistical methods; and the large datasets becoming available remain far from the big-data sizes necessary for fully data-driven machine-learning approaches. The recent emergence of data-driven disease progression models provides a balance between imposed knowledge of disease features and patterns learned from data. The resulting models are both predictive of disease progression in individual patients and informative in terms of revealing underlying biological patterns. Largely inspired by observational models, data-driven disease progression models have emerged in the last few years as a feasible means for understanding the development of neurodegenerative diseases. These models have revealed insights into frontotemporal dementia, Huntington's disease, multiple sclerosis, Parkinson's disease and other conditions. For example, event-based models have revealed finer graded understanding of progression patterns; self-modelling regression and differential equation models have provided data-driven biomarker trajectories; spatiotemporal models have shown that brain shape changes, for example of the hippocampus, can occur before detectable neurodegeneration; and network models have provided some support for prion-like mechanistic hypotheses of disease propagation. The most mature results are in sporadic Alzheimer's disease, in large part because of the availability of the Alzheimer's disease neuroimaging initiative dataset. Results generally support the prevailing amyloid-led hypothetical model of Alzheimer's disease, while revealing finer detail and insight into disease progression. The emerging field of disease progression modelling provides a natural mechanism to integrate different kinds of information, for example from imaging, serum and cerebrospinal fluid markers and cognitive tests, to obtain new insights into progressive diseases. Such insights include fine-grained longitudinal patterns of neurodegeneration, from early stages, and the heterogeneity of these trajectories over the population. More pragmatically, such models enable finer precision in patient staging and stratification, prediction of progression rates and earlier and better identification of at-risk individuals. We argue that this will make disease progression modelling invaluable for recruitment and end-points in future clinical trials, potentially ameliorating the high failure rate in trials of, e.g., Alzheimer's disease therapies. We review the state of the art in these techniques and discuss the future steps required to translate the ideas to front-line application.
Data-driven integration of genome-scale regulatory and metabolic network models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Imam, Saheed; Schauble, Sascha; Brooks, Aaron N.
Microbes are diverse and extremely versatile organisms that play vital roles in all ecological niches. Understanding and harnessing microbial systems will be key to the sustainability of our planet. One approach to improving our knowledge of microbial processes is through data-driven and mechanism-informed computational modeling. Individual models of biological networks (such as metabolism, transcription, and signaling) have played pivotal roles in driving microbial research through the years. These networks, however, are highly interconnected and function in concert, a fact that has led to the development of a variety of approaches aimed at simulating the integrated functions of two or more network types. Though the task of integrating these different models is fraught with new challenges, the large amounts of high-throughput data sets being generated, and the algorithms being developed, mean that the time is at hand for concerted efforts to build integrated regulatory-metabolic networks in a data-driven fashion. Lastly, in this perspective, we review current approaches for constructing integrated regulatory-metabolic models and outline new strategies for future development of these network models for any microbial system.
Stochastic Geometric Models with Non-stationary Spatial Correlations in Lagrangian Fluid Flows
NASA Astrophysics Data System (ADS)
Gay-Balmaz, François; Holm, Darryl D.
2018-01-01
Inspired by spatiotemporal observations from satellites of the trajectories of objects drifting near the surface of the ocean in the National Oceanic and Atmospheric Administration's "Global Drifter Program", this paper develops data-driven stochastic models of geophysical fluid dynamics (GFD) with non-stationary spatial correlations representing the dynamical behaviour of oceanic currents. Three models are considered. Model 1 from Holm (Proc R Soc A 471:20140963, 2015) is reviewed, in which the spatial correlations are time independent. Two new models, called Model 2 and Model 3, introduce two different symmetry breaking mechanisms by which the spatial correlations may be advected by the flow. These models are derived using reduction by symmetry of stochastic variational principles, leading to stochastic Hamiltonian systems, whose momentum maps, conservation laws and Lie-Poisson bracket structures are used in developing the new stochastic Hamiltonian models of GFD.
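A hedged reconstruction of the common starting point, following the cited Holm (2015) framework: Lagrangian trajectories acquire a Stratonovich noise term built from spatial correlation vector fields. In Model 1 the ξ_i are fixed in space; Models 2 and 3 additionally let them be advected by the flow.

```latex
% Stochastic Lagrangian trajectories in the Holm (2015) framework (Model 1):
% fixed spatial correlation fields xi_i(x), Stratonovich noise.
\mathrm{d}\mathbf{x}_t \;=\; \mathbf{u}(\mathbf{x}_t,t)\,\mathrm{d}t
  \;+\; \sum_{i} \boldsymbol{\xi}_i(\mathbf{x}_t)\circ \mathrm{d}W_t^{\,i}
```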
Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya
2013-01-01
Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time while maintaining the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option for normalizing datasets from untargeted LC-MS experiments. PMID:23808607
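Cyclic-loess normalization itself is standard and can be sketched directly: for each pair of samples, fit a loess curve of the log-ratio M against the mean log-intensity A and split the fitted trend between the two samples. The Python sketch below uses statsmodels' lowess; the smoothing fraction, cycle count, and synthetic block effect are assumptions.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def cyclic_loess(logX, n_cycles=3, frac=0.5):
    """Cyclic-loess normalization of a (n_features, n_samples) log-intensity matrix.

    For every sample pair, a loess curve of M = log-ratio versus A = mean
    log-intensity is fitted and split evenly between the two samples."""
    X = logX.copy()
    n = X.shape[1]
    for _ in range(n_cycles):
        for i in range(n):
            for j in range(i + 1, n):
                A = 0.5 * (X[:, i] + X[:, j])
                M = X[:, i] - X[:, j]
                fit = lowess(M, A, frac=frac, return_sorted=False)  # trend to remove
                X[:, i] -= fit / 2
                X[:, j] += fit / 2
    return X

# Toy LC-MS block effect: samples 2 and 3 carry an intensity-dependent bias.
rng = np.random.default_rng(0)
base = rng.normal(20, 2, size=(500, 1))
X = base + rng.normal(0, 0.1, size=(500, 4))
X[:, 2:] += 0.02 * base                      # systematic, intensity-dependent drift
print(np.round(X.mean(axis=0), 2), np.round(cyclic_loess(X).mean(axis=0), 2))
```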
Data-driven non-linear elasticity: constitutive manifold construction and problem discretization
NASA Astrophysics Data System (ADS)
Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco
2017-11-01
The use of constitutive equations calibrated from data has been implemented into standard numerical solvers for successfully addressing a variety of problems encountered in simulation-based engineering sciences (SBES). However, complexity is constantly increasing due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first, of axiomatic character, is related to balance laws (momentum, mass, energy, ...), whereas the second consists of models that scientists have extracted from collected data, either natural or synthetic. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations will employ laws, universally recognized as epistemic, while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of which are inaccessible with today's testing facilities. Such difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible, and then using a data-driven inverse approach in order to generate the whole constitutive manifold from a few complex experimental tests, as discussed in the present work.
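A toy version of "constitutive manifold from data" in one dimension: never write a constitutive law, just evaluate stress from the sampled stress-strain cloud by nearest-neighbour averaging, and solve equilibrium against it. The data, neighbour count, and bisection solve below are all invented for illustration and are far simpler than the paper's setting.

```python
import numpy as np

# Hypothetical uniaxial stress-strain samples from "complex tests" (here synthetic,
# generated by a hidden nonlinear law plus noise).
rng = np.random.default_rng(0)
eps_data = rng.uniform(0.0, 0.1, 400)
sig_data = 200.0 * np.tanh(30.0 * eps_data) + rng.normal(0, 1.0, 400)  # MPa

def sigma_from_manifold(eps, k=15):
    """Evaluate the data-driven constitutive manifold by k-nearest-neighbour
    averaging in strain; no explicit constitutive model is ever written down."""
    idx = np.argsort(np.abs(eps_data - eps))[:k]
    return sig_data[idx].mean()

def solve_bar(sig_applied, lo=0.0, hi=0.1, tol=1e-6):
    """1D bar in equilibrium: find the strain whose manifold stress balances the load."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if sigma_from_manifold(mid) < sig_applied:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"strain under 150 MPa: {solve_bar(150.0):.4f}")
```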
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ping; Lv, Youbin; Wang, Hong
Optimal operation of a practical blast furnace (BF) ironmaking process depends largely on a good measurement of molten iron quality (MIQ) indices. However, measuring the MIQ online is not feasible using the available techniques. In this paper, a novel data-driven robust modeling approach is proposed for online estimation of MIQ using improved random vector functional-link networks (RVFLNs). Since the output weights of traditional RVFLNs are obtained by the least squares approach, a robustness problem may occur when the training dataset is contaminated with outliers, which affects the modeling accuracy of RVFLNs. To solve this problem, a robust RVFLN based on Cauchy distribution weighted M-estimation is proposed. Since the weights of different outlier data are properly determined by the Cauchy distribution, their corresponding contributions to modeling can be properly distinguished, and thus robust and better modeling results can be achieved. Moreover, given that the BF is a complex nonlinear system with numerous coupled variables, data-driven canonical correlation analysis is employed to identify the most influential components from the multitudinous factors that affect the MIQ indices, to reduce the model dimension. Finally, experiments using industrial data and comparative studies demonstrate that the obtained model produces better modeling and estimation accuracy and stronger robustness than other modeling methods.
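The Cauchy-weighted M-estimation step can be sketched as iteratively reweighted least squares on the RVFLN output weights: residuals are converted to weights w = 1/(1 + (r/(c s))^2) so that outliers contribute little. The Python sketch below is a hedged reading of the approach, not the authors' code; the network width, tuning constant, and contaminated data are assumptions.

```python
import numpy as np

def rvfln_features(X, n_hidden=50, seed=0):
    """Random hidden layer with direct input links, as in an RVFLN."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    return np.hstack([X, np.tanh(X @ W + b)])

def robust_output_weights(H, y, c=2.385, iters=20, ridge=1e-6):
    """Cauchy-weighted IRLS for the output weights instead of plain least squares."""
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ y)
    for _ in range(iters):
        r = y - H @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12      # robust scale estimate
        w = 1.0 / (1.0 + (r / (c * s)) ** 2)           # Cauchy weights downweight outliers
        Hw = H * w[:, None]
        beta = np.linalg.solve(Hw.T @ H + ridge * np.eye(H.shape[1]), Hw.T @ y)
    return beta

# Outlier-contaminated training data: the robust weights recover the trend.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, 200)
y[::20] += 5.0                                         # gross outliers
H = rvfln_features(X)
beta = robust_output_weights(H, y)
print(f"clean-point RMSE: {np.sqrt(np.mean((H @ beta - y)[1::2] ** 2)):.3f}")
```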
Assessing Argumentative Representation with Bayesian Network Models in Debatable Social Issues
ERIC Educational Resources Information Center
Zhang, Zhidong; Lu, Jingyan
2014-01-01
This study seeks to obtain argumentation models, which represent argumentative processes and an assessment structure in secondary school debatable issues in the social sciences. The argumentation model was developed based on mixed methods, a combination of both theory-driven and data-driven methods. The coding system provided a combining point by…
Koss, Mary P; Bachar, Karen J; Hopkins, C Quince; Carlson, Carolyn
2004-12-01
Problems in criminal justice system response to date-acquaintance rape and nonpenetration sexual offenses include (a) they are markers of a sexual offending career, yet are viewed as minor; (b) perpetrators are not held accountable in ways that reduce reoffense; and (c) criminal justice response disappoints and traumatizes victims. To address these problems, a collaboration of victim services, prosecutors, legal scholars, and public health professionals are implementing and evaluating RESTORE, a victim-driven, community-based restorative justice program for selected sex crimes. RESTORE prepares survivors, responsible persons (offenders), and both parties' families and friends for face-to-face dialogue to identify the harm and develop a redress plan. The program then monitors the offender's compliance for 12 months. The article summarizes empirical data on problems in criminal justice response, defines restorative justice models, and examines outcome. Then the RESTORE program processes and goals are described. The article highlights community collaboration in building and sustaining this program.
Data-based virtual unmodeled dynamics driven multivariable nonlinear adaptive switching control.
Chai, Tianyou; Zhang, Yajun; Wang, Hong; Su, Chun-Yi; Sun, Jing
2011-12-01
For a complex industrial system, its multivariable and nonlinear nature generally makes it very difficult, if not impossible, to obtain an accurate model, especially when the model structure is unknown. Control of this class of complex systems is difficult to handle with traditional controller designs built around operating points. This paper, however, explores the concepts of a controller-driven model and virtual unmodeled dynamics to propose a new design framework. The design consists of two controllers with distinct functions. First, using input and output data, a self-tuning controller is constructed based on a linear controller-driven model. Then the output signals of the controller-driven model are compared with the true outputs of the system to produce so-called virtual unmodeled dynamics. Based on the compensator of the virtual unmodeled dynamics, a second controller based on a nonlinear controller-driven model is proposed. The two controllers are integrated by an adaptive switching control algorithm to take advantage of their complementary features: one offers stabilization and the other provides improved performance. Conditions for the stability and convergence of the closed-loop system are analyzed. Both simulation and experimental tests on a heavily coupled nonlinear twin-tank system are carried out to confirm the effectiveness of the proposed method.
Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.
Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil
2012-07-01
Recent advances in high-throughput biotechnologies have led to the rapid growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, there exist several challenges in this fast growing field of reverse engineering biomolecular systems: (i) to integrate heterogeneous biochemical data for data mining, (ii) to combine top-down and bottom-up approaches for systems modeling and (iii) to validate system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems (REBMS) using integrated workflow of data mining, systems modeling and synthetic biology.
ERIC Educational Resources Information Center
Shimpi, Priya M.; Paik, Jae H.; Wanerman, Todd; Johnson, Rebecca; Li, Hui; Duh, Shinchieh
2015-01-01
The current English-language research and educational program was driven by an initiative to create a more interactive, theme-based bilingual language education model for preschools in Chengdu, China. During a 2-week teacher education program centered at the Experimental Kindergarten of the Chinese Academy of Sciences in Chengdu, China, a team of…
NASA Astrophysics Data System (ADS)
Forkel, Matthias; Dorigo, Wouter; Lasslop, Gitta; Teubner, Irene; Chuvieco, Emilio; Thonicke, Kirsten
2017-12-01
Vegetation fires affect human infrastructures, ecosystems, global vegetation distribution, and atmospheric composition. However, the climatic, environmental, and socioeconomic factors that control global fire activity in vegetation are only poorly understood, and they are represented in global process-oriented vegetation-fire models in various complexities and formulations. Data-driven model approaches such as machine learning algorithms have successfully been used to identify and better understand the factors controlling fire activity. However, such machine learning models cannot be easily adapted or even implemented within process-oriented global vegetation-fire models. To overcome this gap between machine learning-based approaches and process-oriented global fire models, we introduce here a new flexible data-driven fire modelling approach (Satellite Observations to predict FIre Activity, SOFIA approach version 1). SOFIA models can use several predictor variables and functional relationships to estimate burned area, and can be easily adapted to more complex process-oriented vegetation-fire models. We created an ensemble of SOFIA models to test the importance of several predictor variables. SOFIA models achieve the highest performance in predicting burned area if they account for a direct restriction of fire activity under wet conditions and if they include a land cover-dependent restriction or allowance of fire activity by vegetation density and biomass. The use of vegetation optical depth data from microwave satellite observations, a proxy for vegetation biomass and water content, reaches higher model performance than commonly used vegetation variables from optical sensors. We further analyse spatial patterns of the sensitivity of burned area to anthropogenic, climate, and vegetation predictor variables. We finally discuss how multiple observational datasets on climate, hydrological, vegetation, and socioeconomic variables, together with data-driven modelling and model-data integration approaches, can guide the future development of global process-oriented vegetation-fire models.
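The multiplicative structure described, a maximum burned area modulated by suppression and allowance terms, can be sketched as a product of logistic responses fitted to data. The sketch below is a schematic SOFIA-like model with invented predictors and parameters, not the published SOFIA v1 configuration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, k, x0):
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

def burned_area(X, bmax, k_wet, w0, k_veg, v0):
    """SOFIA-style multiplicative model: a maximum burned-area fraction limited
    by a wetness suppression term and a vegetation-density allowance term."""
    wet, veg = X
    return bmax * (1.0 - logistic(wet, k_wet, w0)) * logistic(veg, k_veg, v0)

# Synthetic predictor/target data standing in for satellite observations.
rng = np.random.default_rng(0)
wet = rng.uniform(0, 1, 500)     # e.g., a soil-moisture proxy
veg = rng.uniform(0, 1, 500)     # e.g., vegetation optical depth
ba = burned_area((wet, veg), 0.3, 10, 0.5, 8, 0.4) + rng.normal(0, 0.005, 500)

popt, _ = curve_fit(burned_area, (wet, veg), ba, p0=[0.2, 5, 0.5, 5, 0.5])
print(np.round(popt, 2))         # recovered parameters of the response functions
```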
Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course
ERIC Educational Resources Information Center
McGowan, Ian S.
2016-01-01
Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…
ERIC Educational Resources Information Center
Poole, Dennis L.; Nelson, Joan; Carnahan, Sharon; Chepenik, Nancy G.; Tubiak, Christine
2000-01-01
Developed and field tested the Performance Accountability Quality Scale (PAQS) on 191 program performance measurement systems developed by nonprofit agencies in central Florida. Preliminary findings indicate that the PAQS provides a structure for obtaining expert opinions based on a theory-driven model about the quality of proposed measurement…
Designing a Dynamic Data Driven Application System for Estimating Real-Time Load of DOC in a River
NASA Astrophysics Data System (ADS)
Ouyang, Y.; None
2011-12-01
Understanding the dynamics of naturally occurring dissolved organic carbon (DOC) in a river is central to estimating surface water quality, aquatic carbon cycling, and climate change. Currently, determination of DOC in surface water is primarily accomplished by manually collecting samples for laboratory analysis, which requires at least 24 hours; little effort has been devoted to monitoring real-time variations of DOC in a river, owing to the lack of suitable and/or cost-effective wireless sensors. However, when considering human health, carbon footprints, and the effects of urbanization, industry, and agriculture on water resource supply, timely DOC information may be critical. We have developed a new paradigm, a dynamic data driven application system (DDDAS), for estimating the real-time load of DOC into a river. This DDDAS consists of the following four components: (1) a Visual Basic (VB) program for downloading US Geological Survey real-time chlorophyll and discharge data; (2) a STELLA model for evaluating real-time DOC load based on the relationship between chlorophyll a, DOC, and river discharge; (3) a batch file for linking the VB program and the STELLA model; and (4) a Microsoft Windows Scheduled Tasks wizard for executing the model and displaying output on a computer screen at selected times. Results show that the real-time load of DOC in the St. Johns River basin near Satsuma, Putnam County, Florida, USA varied over a range from -13,143 to 29,248 kg/h at the selected site. The negative loads occurred because of the backflow in the estuarine reach of the river. The cumulative load of DOC in the river for the selected site at the end of the simulation (178 hours) was about 1.2 tons. Our results support the utility of the DDDAS developed in this study for estimating the real-time variations of DOC in river ecosystems.
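The load computation at the heart of the system is simple enough to sketch: convert chlorophyll to a DOC concentration through a site-specific regression, multiply by discharge, and convert units. In the Python sketch below the chlorophyll-DOC relationship and all readings are hypothetical; only the unit algebra (mg/L x m3/s x 3600/1000 = kg/h) is definite. Negative discharge reproduces the negative loads mentioned for tidal backflow.

```python
import numpy as np

def doc_from_chlorophyll(chl_ug_L, a=0.05, b=5.0):
    """Hypothetical linear chlorophyll-a -> DOC relationship (mg/L); the actual
    site-specific regression in the study is not reproduced here."""
    return a * chl_ug_L + b

def doc_load_kg_per_h(chl_ug_L, discharge_m3_s):
    """Load = concentration x discharge, with unit conversion:
    mg/L (= g/m3) * m3/s = g/s; times 3600 s/h divided by 1000 g/kg gives kg/h."""
    conc_mg_L = doc_from_chlorophyll(chl_ug_L)
    return conc_mg_L * discharge_m3_s * 3600.0 / 1000.0

# One hour of hypothetical real-time readings (negative discharge = tidal backflow).
chl = np.array([12.0, 14.0, 13.5])
q = np.array([250.0, -80.0, 120.0])
print(np.round(doc_load_kg_per_h(chl, q), 1))   # negative loads during backflow
```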
Just-in-time Database-Driven Web Applications
2003-01-01
"Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109
Large eddy simulations of time-dependent and buoyancy-driven channel flows
NASA Technical Reports Server (NTRS)
Cabot, William H.
1993-01-01
The primary goal of this work has been to assess the performance of the dynamic SGS model in the large eddy simulation (LES) of channel flows in a variety of situations, viz., in temporal development of channel flow turned by a transverse pressure gradient and especially in buoyancy-driven turbulent flows such as Rayleigh-Benard and internally heated channel convection. For buoyancy-driven flows, there are additional buoyant terms that are possible in the base models, and one objective has been to determine if the dynamic SGS model results are sensitive to such terms. The ultimate goal is to determine the minimal base model needed in the dynamic SGS model to provide accurate results in flows with more complicated physical features. In addition, a program of direct numerical simulation (DNS) of fully compressible channel convection has been undertaken to determine stratification and compressibility effects. These simulations are intended to provide a comparative base for performing the LES of compressible (or highly stratified, pseudo-compressible) convection at high Reynolds number in the future.
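For reference, the machinery behind the dynamic SGS model can be written down in its standard Germano-identity form with Lilly's least-squares coefficient; this is textbook material rather than a quote from the report, and buoyancy-extended base models would add a buoyant term whose coefficient can be determined the same way.

```latex
% Dynamic Smagorinsky coefficient via the Germano identity (Lilly least squares).
% Overbars: grid filter (width \bar{\Delta}); tildes: test filter (width \hat{\Delta}).
L_{ij} = \widetilde{\bar{u}_i \bar{u}_j} - \tilde{\bar{u}}_i \tilde{\bar{u}}_j ,
\qquad
M_{ij} = 2\bar{\Delta}^2 \widetilde{|\bar{S}|\bar{S}_{ij}}
       - 2\hat{\Delta}^2 |\tilde{\bar{S}}|\tilde{\bar{S}}_{ij} ,
\qquad
C = \frac{\langle L_{ij} M_{ij} \rangle}{\langle M_{ij} M_{ij} \rangle}
```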
Masso, Majid; Vaisman, Iosif I
2014-01-01
The AUTO-MUTE 2.0 stand-alone software package includes a collection of programs for predicting functional changes to proteins upon single residue substitutions, developed by combining structure-based features with trained statistical learning models. Three of the predictors evaluate changes to protein stability upon mutation, each complementing a distinct experimental approach. Two additional classifiers are available, one for predicting activity changes due to residue replacements and the other for determining the disease potential of mutations associated with nonsynonymous single nucleotide polymorphisms (nsSNPs) in human proteins. These five command-line driven tools, as well as all the supporting programs, complement those that run our AUTO-MUTE web-based server. Nevertheless, all the codes have been rewritten and substantially altered for the new portable software, and they incorporate several new features based on user feedback. Included among these upgrades is the ability to perform three highly requested tasks: to run "big data" batch jobs; to generate predictions using modified protein data bank (PDB) structures, and unpublished personal models prepared using standard PDB file formatting; and to utilize NMR structure files that contain multiple models.
Odyssey Reading. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2012
2012-01-01
"Odyssey Reading," published by CompassLearning[R], is a web-based K-12 reading/language arts program designed to allow for instructional differentiation and data-driven decision making. The online program includes electronic curricula and materials for individual or small-group work, assessments aligned with state curriculum standards,…
NASA Technical Reports Server (NTRS)
Roth, D. J.; Hull, D. R.
1994-01-01
IMAGEP manipulates digital image data to effect various processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines; within the subroutines are sub-subroutines, also selected via keyboard. The algorithm has possible scientific, industrial, and biomedical applications in the study of flows in materials, the analysis of steels and ores, and pathology, respectively.
Decision Aids Using Heterogeneous Intelligence Analysis
2010-08-20
developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort...has evolved as the program has matured and now includes multiple data sources, as well as interfaces out to the ONR architectural framework.
NASA Astrophysics Data System (ADS)
Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun
2017-11-01
In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems.
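The Ogata-Banks solution that serves as the physically based data model here has a closed form that is straightforward to code. The sketch below implements it and fits velocity and dispersion to a toy breakthrough curve; all parameter values and observations are placeholders, not EIT site data.

```python
# Ogata-Banks solution: 1-D advection-dispersion with a constant-concentration
# inlet boundary C(0,t) = C0 under steady flow.
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

def ogata_banks(x, t, v, D, C0=1.0):
    """Concentration at distance x (m) and time t (s); v: velocity, D: dispersion."""
    x, t = np.asarray(x, float), np.asarray(t, float)
    a = erfc((x - v * t) / (2.0 * np.sqrt(D * t)))
    b = np.exp(v * x / D) * erfc((x + v * t) / (2.0 * np.sqrt(D * t)))
    return 0.5 * C0 * (a + b)

# Fit v and D to an observed breakthrough curve (placeholder data, sensor at 5 m).
t_obs = np.array([1.0, 2.0, 4.0, 8.0, 16.0]) * 3600.0     # s
c_obs = np.array([0.02, 0.10, 0.35, 0.70, 0.95])          # relative concentration
f = lambda t, v, D: ogata_banks(5.0, t, v, D)
(v_hat, D_hat), _ = curve_fit(f, t_obs, c_obs, p0=[1e-4, 1e-3], bounds=(0, np.inf))
print(v_hat, D_hat)
```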
Data Center Energy Practitioner (DCEP) Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Traber, Kim; Salim, Munther; Sartor, Dale A.
2016-02-02
The main objective of the DCEP program is to raise the standards of those involved in energy assessments of data centers in order to accelerate energy savings. The program is driven by the fact that significant knowledge, training, and skills are required to perform accurate energy assessments, and it will raise the confidence level in energy assessments in data centers. Those who pass the exam are recognized as Data Center Energy Practitioners (DCEPs) by issuance of a certificate. Hardware req.: PC, MAC. Software req.: Windows; related/auxiliary software: MS Office. Type of files: executable modules, user guide. Documentation: e-user manual, http://www.1.eere.energy.gov/industry/datacenters/ (12/10/15, new documentation URL: https://datacenters.lbl.gov/dcep).
Vodovotz, Yoram; Xia, Ashley; Read, Elizabeth L.; Bassaganya-Riera, Josep; Hafler, David A.; Sontag, Eduardo; Wang, Jin; Tsang, John S.; Day, Judy D.; Kleinstein, Steven; Butte, Atul J.; Altman, Matthew C; Hammond, Ross; Sealfon, Stuart C.
2016-01-01
Emergent responses of the immune system result from integration of molecular and cellular networks over time and across multiple organs. High-content and high-throughput analysis technologies, concomitantly with data-driven and mechanistic modeling, hold promise for systematic interrogation of these complex pathways. However, connecting genetic variation and molecular mechanisms to individual phenotypes and health outcomes has proven elusive. Gaps remain in data, and disagreements persist about the value of mechanistic modeling for immunology. Here, we present the perspectives that emerged from the NIAID workshop “Complex Systems Science, Modeling and Immunity” and subsequent discussions regarding the potential synergy of high-throughput data acquisition, data-driven modeling and mechanistic modeling to define new mechanisms of immunological disease and to accelerate the translation of these insights into therapies. PMID:27986392
Data-driven modelling of social forces and collective behaviour in zebrafish.
Zienkiewicz, Adam K; Ladu, Fabrizio; Barton, David A W; Porfiri, Maurizio; Bernardo, Mario Di
2018-04-14
Zebrafish are rapidly emerging as a powerful model organism in hypothesis-driven studies targeting a number of functional and dysfunctional processes. Mathematical models of zebrafish behaviour can inform the design of experiments, through the unprecedented ability to perform pilot trials on a computer. At the same time, in-silico experiments could help refining the analysis of real data, by enabling the systematic investigation of key neurobehavioural factors. Here, we establish a data-driven model of zebrafish social interaction. Specifically, we derive a set of interaction rules to capture the primary response mechanisms which have been observed experimentally. Contrary to previous studies, we include dynamic speed regulation in addition to turning responses, which together provide attractive, repulsive and alignment interactions between individuals. The resulting multi-agent model provides a novel, bottom-up framework to describe both the spontaneous motion and individual-level interaction dynamics of zebrafish, inferred directly from experimental observations. Copyright © 2018 Elsevier Ltd. All rights reserved.
Modeling of Diamond Field-Emitter-Arrays for high brightness photocathode applications
NASA Astrophysics Data System (ADS)
Kwan, Thomas; Huang, Chengkun; Piryatinski, Andrei; Lewellen, John; Nichols, Kimberly; Choi, Bo; Pavlenko, Vitaly; Shchegolkov, Dmitry; Nguyen, Dinh; Andrews, Heather; Simakov, Evgenya
2017-10-01
We propose to employ Diamond Field-Emitter-Arrays (DFEAs) as high-current-density, ultra-low-emittance photocathodes for compact laser-driven dielectric accelerators capable of generating ultra-high-brightness electron beams for advanced applications. We develop a semi-classical Monte-Carlo photoemission model for DFEAs that includes carriers' transport to the emitter surface and tunneling through the surface under external fields. The model accounts for the electronic structure size quantization affecting the transport and tunneling process within the sharp diamond tips. We compare this first-principles model with other field emission models, such as the Child-Langmuir and Murphy-Good models. By further including effects of carrier photoexcitation, we perform simulations of the DFEAs' photoemission quantum yield and the emitted electron beam. Details of the theoretical model and validation against preliminary experimental data will be presented. Work supported by the LDRD program at LANL.
Data-Driven Software Framework for Web-Based ISS Telescience
NASA Technical Reports Server (NTRS)
Tso, Kam S.
2005-01-01
Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Recent Advances in Ionospheric Modeling Using the USU GAIM Data Assimilation Models
NASA Astrophysics Data System (ADS)
Scherliess, L.; Thompson, D. C.; Schunk, R. W.
2009-12-01
The ionospheric plasma distribution at low and mid latitudes has been shown to display both a background state (climatology) and a disturbed state (weather). Ionospheric climatology has been successfully modeled, but ionospheric weather has been much more difficult to model because the ionosphere can vary significantly on an hour-by-hour basis. Unfortunately, ionospheric weather can have detrimental effects on several human activities and systems, including high-frequency communications, over-the-horizon radars, and survey and navigation systems using Global Positioning System (GPS) satellites. As shown by meteorologists and oceanographers, the most reliable weather models are physics-based, data-driven models that use Kalman filter or other data assimilation techniques. Since the state of a medium (ocean, lower atmosphere, ionosphere) is driven by complex and frequently nonlinear internal and external processes, it is not possible to accurately specify all of the drivers and initial conditions of the medium. Therefore physics-based models alone cannot provide reliable specifications and forecasts. In an effort to better understand the ionosphere and to mitigate its adverse effects on military and civilian operations, specification and forecast models are being developed that use state-of-the-art data assimilation techniques. Over the past decade, Utah State University (USU) has developed two data assimilation models for the ionosphere as part of the USU Global Assimilation of Ionospheric Measurements (GAIM) program and one of these models has been implemented at the Air Force Weather Agency for operational use. The USU-GAIM models are also being used for scientific studies, and this should lead to a dramatic advance in our understanding of ionospheric physics; similar to what occurred in meteorology and oceanography after the introduction of data assimilation models in those fields. Both USU-GAIM models are capable of assimilating data from a variety of data sources, including in situ electron densities from satellites, bottomside electron density profiles from ionosondes, total electron content (TEC) measurements between ground receivers and the GPS satellites, occultation data from satellite constellations, and ultraviolet emissions from the ionosphere measured by satellites. We will present the current status of the model development and discuss the employed data assimilation technique. Recent examples of the ionosphere specifications obtained from our model runs will be presented with an emphasis on the ionospheric plasma distribution during the current low solar activity conditions. Various comparisons with independent data will also be shown in an effort to validate the models.
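For readers unfamiliar with the data assimilation machinery mentioned above, the sketch below shows one forecast/analysis cycle of a generic linear Kalman filter; the matrices and the two-component toy state are illustrative assumptions, not the USU-GAIM formulation.

```python
# One forecast/analysis cycle of a linear Kalman filter -- the generic data
# assimilation machinery referenced above, not the USU-GAIM implementation.
import numpy as np

def kalman_step(x, P, F, Q, H, R, y):
    # Forecast: propagate the state estimate and its error covariance.
    x_f = F @ x
    P_f = F @ P @ F.T + Q
    # Analysis: blend forecast with observation y according to their errors.
    S = H @ P_f @ H.T + R                      # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)
    P_a = (np.eye(len(x)) - K @ H) @ P_f
    return x_a, P_a

# Toy example: two-component "density" state, only the first component observed.
x, P = np.array([1.0, 0.5]), np.eye(2)
F, Q = np.array([[1.0, 0.1], [0.0, 1.0]]), 0.01 * np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.04]])
x, P = kalman_step(x, P, F, Q, H, R, y=np.array([1.3]))
print(x)
```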
NASA Astrophysics Data System (ADS)
Zhang, Wenkun; Zhang, Hanming; Wang, Linyuan; Cai, Ailong; Li, Lei; Yan, Bin
2018-02-01
Limited angle computed tomography (CT) reconstruction is widely performed in medical diagnosis and industrial testing because of the size of objects, engine/armor inspection requirements, and limited scan flexibility. Limited angle reconstruction necessitates optimization-based methods that utilize additional sparse priors. However, most conventional methods solely exploit sparsity priors in the spatial domain. When the CT projection suffers from serious data deficiency or various noises, obtaining reconstructed images that meet quality requirements becomes difficult and challenging. To solve this problem, this paper develops an adaptive reconstruction method for the limited angle CT problem. The proposed method simultaneously uses a spatial and Radon domain regularization model based on total variation (TV) and a data-driven tight frame. The data-driven tight frame, derived from wavelet transformation, aims at exploiting sparsity priors of the sinogram in the Radon domain. Unlike existing works that utilize a pre-constructed sparse transformation, the framelets of the data-driven regularization model can be adaptively learned from the latest projection data during iterative reconstruction to provide optimal sparse approximations for a given sinogram. At the same time, an effective alternating direction method is designed to solve the simultaneous spatial and Radon domain regularization model. Experiments on both simulated and real data demonstrate that the proposed algorithm shows better performance in artifact suppression and detail preservation than algorithms using only a spatial domain regularization model. Quantitative evaluations of the results also indicate that the proposed algorithm, with its learning strategy, performs better than dual-domain algorithms without a learned regularization model.
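The spatial-domain half of such a model can be illustrated compactly. The sketch below performs smoothed total-variation denoising by gradient descent; it is a simplified stand-in for one ingredient of the method, which additionally learns a tight frame in the Radon domain and solves the joint model with an alternating direction scheme.

```python
# Gradient descent on 0.5*||x - y||^2 + lam * TV_eps(x), a smoothed
# total-variation denoiser illustrating only the spatial regularization idea.
import numpy as np

def tv_denoise(y, lam=0.1, eps=1e-6, step=0.2, iters=200):
    x = y.copy()
    for _ in range(iters):
        dx = np.diff(x, axis=1, append=x[:, -1:])     # horizontal differences
        dy = np.diff(x, axis=0, append=x[-1:, :])     # vertical differences
        mag = np.sqrt(dx**2 + dy**2 + eps)
        px, py = dx / mag, dy / mag
        # Negative divergence of the normalized gradient gives the TV gradient
        # (periodic-boundary shortcut via np.roll, acceptable for a sketch).
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        x -= step * ((x - y) - lam * div)
    return x

# Toy piecewise-constant image plus noise.
noisy = np.clip(np.random.rand(64, 64) * 0.1 + (np.arange(64) > 32), 0, 1)
print(tv_denoise(noisy).shape)
```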
The Nursing Leadership Institute program evaluation: a critique
Havaei, Farinaz; MacPhee, Maura
2015-01-01
A theory-driven program evaluation was conducted for a nursing leadership program, as a collaborative project between university faculty, the nurses’ union, the provincial Ministry of Health, and its chief nursing officers. A collaborative logic model process was used to engage stakeholders, and mixed methods approaches were used to answer evaluation questions. Despite demonstrated, successful outcomes, the leadership program was not supported with continued funding. This paper examines what happened during the evaluation process: What factors failed to sustain this program? PMID:29355180
Data-driven Model of the ICME Propagation through the Solar Corona and Inner Heliosphere
NASA Astrophysics Data System (ADS)
Yalim, M. S.; Pogorelov, N.; Singh, T.; Liu, Y.
2017-12-01
The solar wind (SW) emerging from the Sun is the main driver of solar events that may lead to geomagnetic storms, the primary causes of space weather disturbances that affect the magnetic environment of Earth and may have hazardous effects on space-borne and ground-based technological systems as well as human health. Accurate modeling of the SW is therefore very important for understanding the underlying mechanisms of such storms. Getting ready for the Parker Solar Probe mission, we have developed a data-driven magnetohydrodynamic (MHD) model of the global solar corona which utilizes characteristic boundary conditions implemented within the Multi-Scale Fluid-Kinetic Simulation Suite (MS-FLUKSS), a collection of problem-oriented routines incorporated into the Chombo adaptive mesh refinement framework developed at Lawrence Berkeley National Laboratory. Our global solar corona model can be driven by both synoptic and synchronic vector magnetogram data obtained by the Solar Dynamics Observatory/Helioseismic and Magnetic Imager (SDO/HMI) and by the horizontal velocity data on the photosphere obtained by applying the Differential Affine Velocity Estimator for Vector Magnetograms (DAVE4VM) method to the HMI-observed vector magnetic fields. Our CME generation model is based on Gibson-Low-type flux ropes, the parameters of which are determined from analysis of observational data from STEREO/SECCHI, SDO/AIA and SOHO/LASCO, and by applying the Graduated Cylindrical Shell model for the flux rope reconstruction. In this study, we will present the results of three-dimensional global simulations of ICME propagation through our characteristically-consistent MHD model of the background SW from the Sun to Earth, driven by HMI-observed vector magnetic fields, and validate our results using multiple spacecraft data at 1 AU.
Zhang, Xike; Zhang, Qiuwen; Zhang, Gui; Nie, Zhiping; Gui, Zifan; Que, Huafei
2018-01-01
Daily land surface temperature (LST) forecasting is of great significance for application in climate-related, agricultural, eco-environmental, or industrial studies. Hybrid data-driven prediction models using Ensemble Empirical Mode Decomposition (EEMD) coupled with Machine Learning (ML) algorithms are useful for achieving these purposes because they can reduce the difficulty of modeling, require less historical data, are easy to develop, and are less complex than physical models. In this article, a computationally simple, less data-intensive, fast and efficient novel hybrid data-driven model called the EEMD Long Short-Term Memory (LSTM) neural network, namely EEMD-LSTM, is proposed to reduce the difficulty of modeling and to improve prediction accuracy. The daily LST data series from the Mapoling and Zhijiang stations in the Dongting Lake basin, central south China, from 1 January 2014 to 31 December 2016 is used as a case study. The EEMD is firstly employed to decompose the original daily LST data series into many Intrinsic Mode Functions (IMFs) and a single residue item. Then, the Partial Autocorrelation Function (PACF) is used to obtain the number of input data sample points for LSTM models. Next, the LSTM models are constructed to predict the decompositions. All the predicted results of the decompositions are aggregated as the final daily LST. Finally, the prediction performance of the hybrid EEMD-LSTM model is assessed in terms of the Mean Square Error (MSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), Root Mean Square Error (RMSE), Pearson Correlation Coefficient (CC) and Nash-Sutcliffe Coefficient of Efficiency (NSCE). To validate the hybrid data-driven model, the hybrid EEMD-LSTM model is compared with the Recurrent Neural Network (RNN), LSTM and Empirical Mode Decomposition (EMD) coupled with RNN, EMD-LSTM and EEMD-RNN models, and their comparison results demonstrate that the hybrid EEMD-LSTM model performs better than the other five models. The scatterplots of the predicted results of the six models versus the original daily LST data series show that the hybrid EEMD-LSTM model is superior to the other five models. It is concluded that the proposed hybrid EEMD-LSTM model in this study is a suitable tool for temperature forecasting. PMID:29883381
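A condensed sketch of the decompose-predict-aggregate pipeline follows. It assumes the PyEMD package (pip name EMD-signal) and Keras are available, replaces the paper's PACF-based lag selection with a fixed lag of 7, and runs on a synthetic series rather than the Mapoling/Zhijiang data.

```python
# EEMD-LSTM sketch: decompose the series, model each component with a small
# LSTM, and sum the component predictions back into one forecast.
import numpy as np
from PyEMD import EEMD            # pip package "EMD-signal"
from tensorflow import keras

rng = np.random.default_rng(0)
lst = 20 + 10 * np.sin(np.arange(730) * 2 * np.pi / 365) + rng.normal(0, 1, 730)

LAG = 7                            # stands in for the paper's PACF-based choice
def lagged(series):
    X = np.stack([series[i:i + LAG] for i in range(len(series) - LAG)])
    return X[..., None], series[LAG:]          # (samples, LAG, 1), next values

imfs = EEMD(trials=20).eemd(lst)               # ensemble IMFs (sum ~ signal)
recon = np.zeros(len(lst) - LAG)
for comp in imfs:
    X, y = lagged(comp)
    model = keras.Sequential([keras.layers.LSTM(16, input_shape=(LAG, 1)),
                              keras.layers.Dense(1)])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, verbose=0)
    recon += model.predict(X, verbose=0).ravel()   # aggregate IMF predictions

print(np.sqrt(np.mean((recon - lst[LAG:]) ** 2)))  # in-sample RMSE
```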
ODISEES: Ontology-Driven Interactive Search Environment for Earth Sciences
NASA Technical Reports Server (NTRS)
Rutherford, Matthew T.; Huffer, Elisabeth B.; Kusterer, John M.; Quam, Brandi M.
2015-01-01
This paper discusses the Ontology-driven Interactive Search Environment for Earth Sciences (ODISEES) project currently being developed to aid researchers attempting to find usable data among an overabundance of closely related data. ODISEES' ontological structure relies on a modular, adaptable concept modeling approach, which allows the domain to be modeled more or less as it is without worrying about terminology or external requirements. In the model, variables are individually assigned semantic content based on the characteristics of the measurements they represent, allowing intuitive discovery and comparison of data without requiring the user to sift through large numbers of data sets and variables to find the desired information.
The Principal's Mind-Set for Data
ERIC Educational Resources Information Center
Fox, Dennis
2013-01-01
Is there a school leader anywhere who hasn't been directed, or at least encouraged, to "analyze the data" and practice what has been termed "data-driven decision-making"? Today's principal is expected to be able to skillfully collect, organize, analyze, interpret and use a variety of data in order to improve instruction, services and programs for…
Motakis, E S; Nason, G P; Fryzlewicz, P; Rutter, G A
2006-10-15
Many standard statistical techniques are effective on data that are normally distributed with constant variance. Microarray data typically violate these assumptions since they come from non-Gaussian distributions with a non-trivial mean-variance relationship. Several methods have been proposed that transform microarray data to stabilize variance and draw its distribution towards the Gaussian. Some methods, such as log or generalized log, rely on an underlying model for the data. Others, such as the spread-versus-level plot, do not. We propose an alternative data-driven multiscale approach, called the Data-Driven Haar-Fisz for microarrays (DDHFm) with replicates. DDHFm has the advantage of being 'distribution-free' in the sense that no parametric model for the underlying microarray data is required to be specified or estimated; hence, DDHFm can be applied very generally, not just to microarray data. DDHFm achieves very good variance stabilization of microarray data with replicates and produces transformed intensities that are approximately normally distributed. Simulation studies show that it performs better than other existing methods. Application of DDHFm to real one-color cDNA data validates these results. The R package of the Data-Driven Haar-Fisz transform (DDHFm) for microarrays is available in Bioconductor and CRAN.
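The classical (non-data-driven) Haar-Fisz transform is compact enough to sketch. Below, Poisson-like counts are stabilized by dividing Haar detail coefficients by the square root of the corresponding smooth coefficients; DDHFm's data-driven step replaces that square-root mean-variance law with a function estimated from replicates. Input length is assumed to be a power of two.

```python
# Classical Haar-Fisz variance stabilization for Poisson-like counts.
import numpy as np

def haar_fisz(x):
    x = np.asarray(x, float)
    details, s = [], x
    while len(s) > 1:
        a, b = s[0::2], s[1::2]
        s_next, d = (a + b) / 2.0, (a - b) / 2.0
        with np.errstate(divide="ignore", invalid="ignore"):
            f = np.where(s_next > 0, d / np.sqrt(s_next), 0.0)   # Fisz step
        details.append(f)
        s = s_next
    # Invert the Haar pyramid using the stabilized detail coefficients.
    u = s
    for f in reversed(details):
        out = np.empty(2 * len(u))
        out[0::2], out[1::2] = u + f, u - f
        u = out
    return u

counts = np.random.poisson(lam=np.linspace(1, 50, 64)).astype(float)
hf = haar_fisz(counts)
# Local variability of the two halves: very unequal before, comparable after.
print(np.diff(counts)[:31].var(), np.diff(counts)[31:].var())
print(np.diff(hf)[:31].var(), np.diff(hf)[31:].var())
```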
Model-driven meta-analyses for informing health care: a diabetes meta-analysis as an exemplar.
Brown, Sharon A; Becker, Betsy Jane; García, Alexandra A; Brown, Adama; Ramírez, Gilbert
2015-04-01
A relatively novel type of meta-analysis, a model-driven meta-analysis, involves the quantitative synthesis of descriptive, correlational data and is useful for identifying key predictors of health outcomes and informing clinical guidelines. Few such meta-analyses have been conducted and thus, large bodies of research remain unsynthesized and uninterpreted for application in health care. We describe the unique challenges of conducting a model-driven meta-analysis, focusing primarily on issues related to locating a sample of published and unpublished primary studies, extracting and verifying descriptive and correlational data, and conducting analyses. A current meta-analysis of the research on predictors of key health outcomes in diabetes is used to illustrate our main points. © The Author(s) 2014.
GLOBEC (Global Ocean Ecosystems Dynamics: Northwest Atlantic program
NASA Technical Reports Server (NTRS)
1991-01-01
The specific objective of the meeting was to plan an experiment in the Northwestern Atlantic to study the marine ecosystem and its role, together with that of climate and physical dynamics, in determining fisheries recruitment. The underlying focus of the GLOBEC initiative is to understand the marine ecosystem as it relates to marine living resources and to understand how fluctuations in these resources are driven by climate change and exploitation. In this sense the goal is a solid scientific program to provide basic information concerning major fisheries stocks and the environment that sustains them. The plan is to attempt to reach this understanding through a multidisciplinary program that brings to bear new techniques as disparate as numerical fluid dynamic models of ocean circulation, molecular biology, and modern acoustic imaging. The effort will also make use of the massive historical data sets on fisheries and the state of the climate in a coordinated manner.
The Greenhouse Gas (GHG) Technology Verification Center is one of 12 independently operated verification centers established by the U.S. Environmental Protection Agency. The Center provides third-party performance data to stakeholders interested in environmetnal technologies tha...
Data-Driven Instructional Leadership
ERIC Educational Resources Information Center
Blink, Rebecca
2006-01-01
With real-world examples from actual schools, this book illustrates how to nurture a culture of continuous improvement, meet the needs of individual students, foster an environment of high expectations, and meet the requirements of NCLB. Each component of the Data-Driven Instructional Leadership (DDIS) model represents several branches of…
Indicators of ecosystem function identify alternate states in the sagebrush steppe.
Kachergis, Emily; Rocca, Monique E; Fernandez-Gimenez, Maria E
2011-10-01
Models of ecosystem change that incorporate nonlinear dynamics and thresholds, such as state-and-transition models (STMs), are increasingly popular tools for land management decision-making. However, few models are based on systematic collection and documentation of ecological data, and of these, most rely solely on structural indicators (species composition) to identify states and transitions. As STMs are adopted as an assessment framework throughout the United States, finding effective and efficient ways to create data-driven models that integrate ecosystem function and structure is vital. This study aims to (1) evaluate the utility of functional indicators (indicators of rangeland health, IRH) as proxies for more difficult ecosystem function measurements and (2) create a data-driven STM for the sagebrush steppe of Colorado, USA, that incorporates both ecosystem structure and function. We sampled soils, plant communities, and IRH at 41 plots with similar clayey soils but different site histories to identify potential states and infer the effects of management practices and disturbances on transitions. We found that many IRH were correlated with quantitative measures of functional indicators, suggesting that the IRH can be used to approximate ecosystem function. In addition to a reference state that functions as expected for this soil type, we identified four biotically and functionally distinct potential states, consistent with the theoretical concept of alternate states. Three potential states were related to management practices (chemical and mechanical shrub treatments and seeding history) while one was related only to ecosystem processes (erosion). IRH and potential states were also related to environmental variation (slope, soil texture), suggesting that there are environmental factors within areas with similar soils that affect ecosystem dynamics and should be noted within STMs. Our approach generated an objective, data-driven model of ecosystem dynamics for rangeland management. Our findings suggest that the IRH approximate ecosystem processes and can distinguish between alternate states and communities and identify transitions when building data-driven STMs. Functional indicators are a simple, efficient way to create data-driven models that are consistent with alternate state theory. Managers can use them to improve current model-building methods and thus apply state-and-transition models more broadly for land management decision-making.
NASA Astrophysics Data System (ADS)
Butell, Bart
1996-02-01
Microsoft's Visual Basic (VB) and Borland's Delphi provide extremely robust programming environments for delivering multimedia solutions for interactive kiosks, games and titles. Their object oriented use of standard and custom controls enables a user to build extremely powerful applications. A multipurpose, database enabled programming environment that can provide an event driven interface functions as a multimedia kernel. This kernel can provide a variety of authoring solutions (e.g. a timeline based model similar to Macromedia Director or a node authoring model similar to Icon Author). At the heart of the kernel is a set of low level multimedia components providing object oriented interfaces for graphics, audio, video and imaging. Data preparation tools (e.g., layout, palette and Sprite Editors) could be built to manage the media database. The flexible interface for VB allows the construction of an infinite number of user models. The proliferation of these models within a popular, easy to use environment will allow the vast developer segment of 'producer' types to bring their ideas to the market. This is the key to building exciting, content rich multimedia solutions. Microsoft's VB and Borland's Delphi environments combined with multimedia components enable these possibilities.
Horvath, Monica M.; Rusincovitch, Shelley A.; Brinson, Stephanie; Shang, Howard C.; Evans, Steve; Ferranti, Jeffrey M.
2015-01-01
Purpose: Data generated in the care of patients are widely used to support clinical research and quality improvement, which has hastened the development of self-service query tools. User interface design for such tools, execution of query activity, and underlying application architecture have not been widely reported, and existing tools reflect a wide heterogeneity of methods and technical frameworks. We describe the design, application architecture, and use of a self-service model for enterprise data delivery within Duke Medicine. Methods: Our query platform, the Duke Enterprise Data Unified Content Explorer (DEDUCE), supports enhanced data exploration, cohort identification, and data extraction from our enterprise data warehouse (EDW) using a series of modular environments that interact with a central keystone module, Cohort Manager (CM). A data-driven application architecture is implemented through three components: an application data dictionary, the concept of “smart dimensions”, and dynamically-generated user interfaces. Results: DEDUCE CM allows flexible hierarchies of EDW queries within a grid-like workspace. A cohort “join” functionality allows switching between filters based on criteria occurring within or across patient encounters. To date, 674 users have been trained and activated in DEDUCE, and logon activity shows a steady increase, with variability between months. A comparison of filter conditions and export criteria shows that these activities have different patterns of usage across subject areas. Conclusions: Organizations with sophisticated EDWs may find that users benefit from development of advanced query functionality, complementary to the user interfaces and infrastructure used in other well-published models. Driven by its EDW context, the DEDUCE application architecture was also designed to be responsive to source data and to allow modification through alterations in metadata rather than programming, allowing an agile response to source system changes. PMID:25051403
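The "alterations in metadata rather than programming" idea can be illustrated with a toy metadata-driven query builder: the dictionary, not the application code, determines which filters exist and what SQL they produce. The table and column names below are invented, not DEDUCE's.

```python
# Toy metadata-driven query builder: the data dictionary drives both the
# available filters and the generated SQL. All names here are invented.
DATA_DICTIONARY = {
    "age":       {"label": "Patient age",    "column": "pt.age",      "type": "range"},
    "encounter": {"label": "Encounter type", "column": "enc.type",    "type": "choice"},
    "a1c":       {"label": "Hemoglobin A1c", "column": "lab.a1c_pct", "type": "range"},
}

def build_where(filters):
    """filters: {"age": (40, 65), "encounter": "inpatient", ...}"""
    clauses, params = [], []
    for name, value in filters.items():
        meta = DATA_DICTIONARY[name]              # unknown names fail loudly
        if meta["type"] == "range":
            clauses.append(f"{meta['column']} BETWEEN ? AND ?")
            params.extend(value)
        else:
            clauses.append(f"{meta['column']} = ?")
            params.append(value)
    return " AND ".join(clauses), params

print(build_where({"age": (40, 65), "encounter": "inpatient"}))
# Adding a new filter means adding one dictionary entry -- no code changes.
```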
Convergence of service, policy, and science toward consumer-driven mental health care.
Carroll, Christopher D; Manderscheid, Ronald W; Daniels, Allen S; Compagni, Amelia
2006-12-01
A common theme is emerging in sentinel reports on the United States health care system. Consumer relevance and demands on service systems and practices are influencing how mental health care is delivered and how systems will be shaped in the future. The present report seeks to assemble a confluence of consumer-driven themes from noteworthy reports on the state of the mental health system in the U.S. It also explores innovative efforts, promising practices, collaborative efforts, as well as identification of barriers to consumer-directed care, with possible solutions. The report reviews the relevant public mental health policy and data used in published work. The findings indicate an increasing public and private interest in promoting consumer-driven care, even though historical systems of care predominate, and often create, barriers to widespread redesign of a consumer-centered mental health care system. Innovative consumer-driven practices are increasing as quality, choice, and self-determination become integral parts of a redesigned U.S. mental health care system. The use of consumer-driven approaches in mental health is limited at best. These programs challenge industry norms and traditional practices. Limitations include the need for additional and thorough evaluations of effectiveness (cost and clinical) and replicability of consumer-directed programs. Consumer-driven services indicate that mental health consumers are expecting to be more participative in their mental health care. This expectation will influence how traditional mental health services and providers become more consumer-centric and meet the demand. Public and private interest in consumer-driven health care range from creating cost-conscious consumers to individualized control of recovery. The health care sector should seek to invest more resources in the provision of consumer-driven health care programs. The results of this study have implications and are informative for other countries where consumer-directed care is delivered in either the private or public health care systems. More research is needed to obtain further evidence on the use of consumer-driven services and their overall effectiveness.
Profiling Arthritis Pain with a Decision Tree.
Hung, Man; Bounsanga, Jerry; Liu, Fangzhou; Voss, Maren W
2018-06-01
Arthritis is the leading cause of work disability and contributes to lost productivity. Previous studies showed that various factors predict pain, but they were limited in sample size and scope from a data analytics perspective. The current study applied machine learning algorithms to identify predictors of pain associated with arthritis in a large national sample. Using data from the 2011 to 2012 Medical Expenditure Panel Survey, data mining was performed to develop algorithms to identify factors and patterns that contribute to risk of pain. The model incorporated over 200 variables within the algorithm development, including demographic data, medical claims, laboratory tests, patient-reported outcomes, and sociobehavioral characteristics. The developed algorithms to predict pain utilize variables readily available in patient medical records. Using the machine learning classification algorithm J48 with 50-fold cross-validation, we found that the model can significantly distinguish those with and without pain (c-statistic = 0.9108). The F measure was 0.856, accuracy rate was 85.68%, sensitivity was 0.862, specificity was 0.852, and precision was 0.849. Physical and mental function scores, the ability to climb stairs, and overall assessment of feeling were the most discriminative predictors from the 12 identified variables, predicting pain with 86% accuracy for individuals with arthritis. In this era of rapid expansion of big data application, the nature of healthcare research is moving from hypothesis-driven to data-driven solutions. The algorithms generated in this study offer new insights on individualized pain prediction, allowing the development of cost-effective care management programs for those experiencing arthritis pain. © 2017 World Institute of Pain.
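A minimal reproduction of the modeling setup, with scikit-learn's CART decision tree standing in for Weka's J48 (a C4.5 implementation) and synthetic data standing in for the MEPS records:

```python
# Cross-validated decision-tree classification in the spirit of the study's
# J48 analysis; features and pain labels here are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=200, n_informative=12,
                           random_state=0)        # ~200 predictors, as in MEPS
clf = DecisionTreeClassifier(max_depth=6, min_samples_leaf=20, random_state=0)
auc = cross_val_score(clf, X, y, cv=50, scoring="roc_auc")   # 50-fold CV
print(f"mean AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
```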
Produce and Consume Linked Data with Drupal!
NASA Astrophysics Data System (ADS)
Corlosquet, Stéphane; Delbru, Renaud; Clark, Tim; Polleres, Axel; Decker, Stefan
Currently a large number of Web sites are driven by Content Management Systems (CMS) which manage textual and multimedia content but also - inherently - carry valuable information about a site's structure and content model. Exposing this structured information to the Web of Data has so far required considerable expertise in RDF and OWL modelling and additional programming effort. In this paper we tackle one of the most popular CMS: Drupal. We enable site administrators to export their site content model and data to the Web of Data without requiring extensive knowledge on Semantic Web technologies. Our modules create RDFa annotations and - optionally - a SPARQL endpoint for any Drupal site out of the box. Likewise, we add the means to map the site data to existing ontologies on the Web with a search interface to find commonly used ontology terms. We also allow a Drupal site administrator to include existing RDF data from remote SPARQL endpoints on the Web in the site. When brought together, these features allow networked RDF Drupal sites that reuse and enrich Linked Data. We finally discuss the adoption of our modules and report on a use case in the biomedical field and the current status of its deployment.
de Vries, Natalie Jane; Carlson, Jamie; Moscato, Pablo
2014-01-01
Online consumer behavior in general and online customer engagement with brands in particular, has become a major focus of research activity fuelled by the exponential increase of interactive functions of the internet and social media platforms and applications. Current research in this area is mostly hypothesis-driven and much debate about the concept of Customer Engagement and its related constructs remains existent in the literature. In this paper, we aim to propose a novel methodology for reverse engineering a consumer behavior model for online customer engagement, based on a computational and data-driven perspective. This methodology could be generalized and prove useful for future research in the fields of consumer behaviors using questionnaire data or studies investigating other types of human behaviors. The method we propose contains five main stages; symbolic regression analysis, graph building, community detection, evaluation of results and finally, investigation of directed cycles and common feedback loops. The ‘communities’ of questionnaire items that emerge from our community detection method form possible ‘functional constructs’ inferred from data rather than assumed from literature and theory. Our results show consistent partitioning of questionnaire items into such ‘functional constructs’ suggesting the method proposed here could be adopted as a new data-driven way of human behavior modeling. PMID:25036766
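Two of the five pipeline stages (graph building and community detection) can be sketched directly. The correlation threshold, the synthetic questionnaire responses, and the greedy modularity algorithm below are illustrative stand-ins, not the authors' exact procedure.

```python
# Build a correlation graph over questionnaire items, then detect communities
# of items -- the candidate "functional constructs" described above.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 3))                     # 3 hidden "constructs"
loadings = np.repeat(np.eye(3), 4, axis=0)             # 12 questionnaire items
responses = latent @ loadings.T + 0.5 * rng.normal(size=(300, 12))

corr = np.corrcoef(responses.T)
G = nx.Graph()
items = [f"Q{i+1}" for i in range(12)]
G.add_nodes_from(items)
for i in range(12):
    for j in range(i + 1, 12):
        if abs(corr[i, j]) > 0.3:                      # keep strong item links
            G.add_edge(items[i], items[j], weight=abs(corr[i, j]))

for k, comm in enumerate(greedy_modularity_communities(G, weight="weight")):
    print(f"functional construct {k + 1}: {sorted(comm)}")
```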
Predictive Monitoring for Improved Management of Glucose Levels
Reifman, Jaques; Rajaraman, Srinivasan; Gribok, Andrei; Ward, W. Kenneth
2007-01-01
Background Recent developments and expected near-future improvements in continuous glucose monitoring (CGM) devices provide opportunities to couple them with mathematical forecasting models to produce predictive monitoring systems for early, proactive glycemia management of diabetes mellitus patients before glucose levels drift to undesirable levels. This article assesses the feasibility of data-driven models to serve as the forecasting engine of predictive monitoring systems. Methods We investigated the capabilities of data-driven autoregressive (AR) models to (1) capture the correlations in glucose time-series data, (2) make accurate predictions as a function of prediction horizon, and (3) be made portable from individual to individual without any need for model tuning. The investigation is performed by employing CGM data from nine type 1 diabetic subjects collected over a continuous 5-day period. Results With CGM data serving as the gold standard, AR model-based predictions of glucose levels assessed over nine subjects with Clarke error grid analysis indicated that, for a 30-minute prediction horizon, individually tuned models yield 97.6 to 100.0% of data in the clinically acceptable zones A and B, whereas cross-subject, portable models yield 95.8 to 99.7% of data in zones A and B. Conclusions This study shows that, for a 30-minute prediction horizon, data-driven AR models provide sufficiently-accurate and clinically-acceptable estimates of glucose levels for timely, proactive therapy and should be considered as the modeling engine for predictive monitoring of patients with type 1 diabetes mellitus. It also suggests that AR models can be made portable from individual to individual with minor performance penalties, while greatly reducing the burden associated with model tuning and data collection for model development. PMID:19885110
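The AR modeling engine itself is simple to sketch: fit coefficients by least squares on lagged values, then forecast recursively. Below, a synthetic CGM-like trace is predicted 30 minutes ahead (6 steps at 5-minute sampling); the model order and the data are illustrative, not the paper's tuning.

```python
# Autoregressive prediction of a CGM-like series with a 30-minute horizon.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(1440)                                  # 5 days of 5-min samples
glucose = 110 + 30 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 3, t.size)

def fit_ar(y, order=30):
    """Least-squares AR(order) coefficients, oldest lag first."""
    X = np.stack([y[i:len(y) - order + i] for i in range(order)], axis=1)
    coef, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
    return coef

def forecast(y, coef, steps=6):
    hist = list(y[-len(coef):])
    for _ in range(steps):                           # recursive multi-step
        hist.append(float(np.dot(coef, hist[-len(coef):])))
    return np.array(hist[-steps:])

mu = glucose.mean()                                  # fit on mean-removed data
coef = fit_ar(glucose[:-6] - mu)
pred = forecast(glucose[:-6] - mu, coef) + mu        # 30-min-ahead prediction
print(pred.round(1), glucose[-6:].round(1))
```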
Issues and Theoretical Constructs regarding Parent Education for Autism Spectrum Disorders
ERIC Educational Resources Information Center
Steiner, Amanda M.; Koegel, Lynn K.; Koegel, Robert L.; Ence, Whitney A.
2012-01-01
Participation of parents of children with autism is commonplace in most comprehensive intervention programs, yet, there is limited research relating to the best practices in this area. This article provides an overview of parent education programs for young children with autism and details data-driven procedures which are associated with improved…
Effective Practices in Teacher Preparation Programs: Reading Action Research Projects
ERIC Educational Resources Information Center
Beal, Jennifer S.
2018-01-01
Data-driven instruction is simply good educational practice. In the deaf education certification program--which confers master's degrees in education at Valdosta State University, Georgia, through a variety of online options--the professors address this issue directly with graduate students, all of whom are teacher candidates. One of the ways they…
DOE Office of Scientific and Technical Information (OSTI.GOV)
VAN ZEIJTS,J.; DOTTAVIO,T.; FRAK,B.
The Relativistic Heavy Ion Collider (RHIC) has a high-level asynchronous time-line driven by a controlling program called the "Sequencer". Most high-level magnet and beam related issues are orchestrated by this system. The system also plays an important role in coordinated data acquisition and saving. We present the program, operator interface, operational impact and experience.
A Comprehensive Evaluation of a K-5 Chinese Language Immersion Program
ERIC Educational Resources Information Center
Jacobson, Shoufen
2013-01-01
This dissertation was designed to provide a comprehensive data-driven evaluation of a Chinese language Immersion Program (CIP) for the stakeholders. CIP was implemented in 2006 with a goal for students to become proficient in the Chinese language and develop increased cultural awareness while reaching at least the same level of academic…
2014-01-01
Background The impact of efforts by healthcare organizations to enhance the use of evidence to improve organizational processes through training programs has seldom been assessed. We therefore endeavored to assess whether and how the training of mid- and senior-level healthcare managers could lead to organizational change. Methods We conducted a theory-driven evaluation of the organizational impact of healthcare leaders’ participation in two training programs using a logic model based on Nonaka’s theory of knowledge conversion. We analyzed six case studies nested within the two programs using three embedded units of analysis (individual, group and organization). Interviews were conducted during intensive one-week data collection site visits. A total of 84 people were interviewed. Results We found that the impact of training could primarily be felt in trainees’ immediate work environments. The conversion of attitudes was found to be easier to achieve than the conversion of skills. Our results show that, although socialization and externalization were common in all cases, a lack of combination impeded the conversion of skills. We also identified several individual, organizational and program design factors that facilitated and/or impeded the dissemination of the attitudes and skills gained by trainees to other organizational members. Conclusions Our theory-driven evaluation showed that factors before, during and after training can influence the extent of skills and knowledge transfer. Our evaluation went further than previous research by revealing the influence—both positive and negative—of specific organizational factors on extending the impact of training programs. PMID:24885800
Yun Chen; Hui Yang
2014-01-01
The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from rich datasets is limited. There is a dire need to go beyond current medical practices, and develop data-driven methods and tools that will enable and help (i) the handling of big data, (ii) the extraction of data-driven knowledge, (iii) the exploitation of acquired knowledge for optimizing clinical decisions. The present study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed the postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches and yields better results based on the evaluation of real-world ICU data from 4000 subjects in the database. This research shows great potentials for the use of data-driven analytics to improve the quality of healthcare services.
Data-driven outbreak forecasting with a simple nonlinear growth model
Lega, Joceline; Brown, Heidi E.
2016-01-01
Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders. PMID:27770752
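The growth-curve property EpiGro exploits can be illustrated by fitting a logistic curve to cumulative case reports and reading off final size, peak timing, and a duration scale. The weekly counts below are synthetic, and the plain logistic fit is a simplification of EpiGro's actual procedure.

```python
# Fit a logistic curve to cumulative case reports and read off order-of-
# magnitude outbreak characteristics. Case counts are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))   # K: final size, t0: peak week

weeks = np.arange(12)
cases = np.array([3, 7, 15, 31, 60, 105, 160, 210, 245, 265, 276, 281])

(K, r, t0), _ = curve_fit(logistic, weeks, cases, p0=[cases[-1] * 2, 0.5, 6])
print(f"projected final size ~{K:.0f} cases, incidence peak near week {t0:.1f}")
# Duration scale: the outbreak runs roughly from t0 - 3/r to t0 + 3/r.
print(f"approximate duration ~{6.0 / r:.1f} weeks")
```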
A data driven nonlinear stochastic model for blood glucose dynamics.
Zhang, Yan; Holt, Tim A; Khovanova, Natalia
2016-03-01
The development of adequate mathematical models for blood glucose dynamics may improve early diagnosis and control of diabetes mellitus (DM). We have developed a stochastic nonlinear second order differential equation to describe the response of blood glucose concentration to food intake using continuous glucose monitoring (CGM) data. A variational Bayesian learning scheme was applied to define the number and values of the system's parameters by iterative optimisation of free energy. The model has the minimal order and number of parameters to successfully describe blood glucose dynamics in people with and without DM. The model accounts for the nonlinearity and stochasticity of the underlying glucose-insulin dynamic process. Being data-driven, it takes full advantage of available CGM data and, at the same time, reflects the intrinsic characteristics of the glucose-insulin system without detailed knowledge of the physiological mechanisms. We have shown that the dynamics of some postprandial blood glucose excursions can be described by a reduced (linear) model, previously seen in the literature. A comprehensive analysis demonstrates that deterministic system parameters belong to different ranges for diabetes and controls. Implications for clinical practice are discussed. This is the first study introducing a continuous data-driven nonlinear stochastic model capable of describing both DM and non-DM profiles. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
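The model class, though not the authors' fitted equation, can be sketched with an Euler-Maruyama integration of a second-order stochastic differential equation driven by a meal input; every drift term and parameter value below is invented for illustration.

```python
# Euler-Maruyama integration of a second-order stochastic model of this class:
# glucose deviation g responds to a meal input u(t) like a damped nonlinear
# oscillator with additive noise. All terms and parameters are invented.
import numpy as np

def simulate(T=600.0, dt=0.5, zeta=0.15, w=0.05, beta=0.002, sigma=0.05):
    n = int(T / dt)
    g = np.zeros(n)          # glucose deviation from baseline (mmol/L)
    v = 0.0                  # its rate of change
    rng = np.random.default_rng(3)
    for k in range(n - 1):
        u = 1.0 if 30.0 <= k * dt < 60.0 else 0.0      # 30-min meal input
        drift_v = -2 * zeta * w * v - w**2 * g - beta * g**3 + 0.01 * u
        v += drift_v * dt + sigma * np.sqrt(dt) * rng.normal()
        g[k + 1] = g[k] + v * dt
    return g

g = simulate()
print(g.max(), g[-1])        # postprandial excursion, then relaxation
```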
Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun
2017-11-01
In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., the Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where part of the earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems. Copyright © 2017 Elsevier B.V. All rights reserved.
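The Ogata-Banks solution named above is a standard closed form for one-dimensional advection-dispersion with a constant-concentration inlet under steady flow; a short implementation follows. The parameter values are illustrative, not the EIT site's.

```python
# Ogata-Banks solution for 1-D advection-dispersion with constant inlet c0.
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0=1.0):
    """Concentration at distance x and time t (velocity v, dispersion D)."""
    a = (x - v * t) / (2.0 * np.sqrt(D * t))
    b = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

x = np.linspace(0.1, 50.0, 6)                 # distance, m (illustrative)
print(ogata_banks(x, t=30.0, v=0.5, D=1.0))   # t in days, v in m/day, D in m^2/day
```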
Yu, Yangyang R; Abbas, Paulette I; Smith, Carolyn M; Carberry, Kathleen E; Ren, Hui; Patel, Binita; Nuchtern, Jed G; Lopez, Monica E
2016-12-01
As reimbursement programs shift to value-based payment models emphasizing quality and efficient healthcare delivery, there exists a need to better understand process management to unearth true costs of patient care. We sought to identify cost-reduction opportunities in simple appendicitis management by applying a time-driven activity-based costing (TDABC) methodology to this high-volume surgical condition. Process maps were created using medical record time stamps. Labor capacity cost rates were calculated using national median physician salaries, weighted nurse-patient ratios, and hospital cost data. Consumable costs for supplies, pharmacy, laboratory, and food were derived from the hospital general ledger. Time-driven activity-based costing resulted in precise per-minute calculation of personnel costs. The highest costs were in the operating room ($747.07), hospital floor ($388.20), and emergency department ($296.21). Major contributors to length of stay were emergency department evaluation (270 min), operating room availability (395 min), and post-operative monitoring (1128 min). The TDABC model yielded $1712.16 in personnel costs and $1041.23 in consumable costs, for a total appendicitis cost of $2753.39. Inefficiencies in healthcare delivery can be identified through TDABC. Triage-based standing delegation orders, advanced practice providers, and same-day discharge protocols are proposed cost-reducing interventions to optimize value-based care for simple appendicitis. Level of Evidence: II. Copyright © 2016 Elsevier Inc. All rights reserved.
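The TDABC arithmetic is simple: each process step's duration is multiplied by the per-minute capacity cost rate of the resources it consumes, then consumables are added from the general ledger. In the sketch below, the step durations and consumables total echo the figures quoted above, while the per-minute rates are invented for illustration.

```python
# Time-driven activity-based costing: minutes x capacity cost rate per step.
steps = {
    "emergency department": (270, 1.10),   # (minutes, assumed $/minute rate)
    "operating room":       (395, 1.89),
    "post-op monitoring":   (1128, 0.34),
}
personnel = sum(minutes * rate for minutes, rate in steps.values())
consumables = 1041.23                      # from the general ledger, per the abstract
print(f"estimated episode cost: ${personnel + consumables:.2f}")
```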
Two facets of stress and indirect effects on child diet through emotion-driven eating.
Tate, Eleanor B; Spruijt-Metz, Donna; Pickering, Trevor A; Pentz, Mary Ann
2015-08-01
Stress has been associated with high-calorie, low-nutrient food intake (HCLN) and emotion-driven eating (EDE). However, effects on healthy food intake remain unknown. This study examined two facets of stress (self-efficacy, perceived helplessness) and food consumption, mediated by EDE. Cross-sectional data from fourth-graders (n = 978; 52% female, 28% Hispanic) in an obesity intervention used self-report to assess self-efficacy, helplessness, EDE, fruit/vegetable (FV) intake, and high-calorie/low-nutrient (HCLN) food. Higher stress self-efficacy was associated with higher FV intake, β = .354, p < 0.001, and stress perceived helplessness had an indirect effect on HCLN intake through emotion-driven eating, indirect effect = .094, p < 0.001; χ²(347) = 659.930, p < 0.001, CFI = 0.940, TLI = 0.930, RMSEA = 0.030, p = 1.00, adjusting for gender, ethnicity, BMI z-score, and program group. Stress self-efficacy may be more important for healthy food intake, and perceived helplessness may indicate emotion-driven eating and unhealthy snack food intake. Obesity prevention programs may consider teaching stress management techniques to avoid emotion-driven eating. Copyright © 2015 Elsevier Ltd. All rights reserved.
Automatic translation of MPI source into a latency-tolerant, data-driven form
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Tan; Cicotti, Pietro; Bylaska, Eric
2017-03-06
Hiding communication behind useful computation is an important performance programming technique but remains an inscrutable programming exercise even for the expert. We present Bamboo, a code transformation framework that can realize communication overlap in applications written in MPI without the need to intrusively modify the source code. We reformulate MPI source into a task dependency graph representation, which partially orders the tasks, enabling the program to execute in a data-driven fashion under the control of an external runtime system. Experimental results demonstrate that Bamboo significantly reduces communication delays while requiring only modest amounts of programmer annotation for a variety of applications and platforms, including those employing co-processors and accelerators. Moreover, Bamboo's performance meets or exceeds that of labor-intensive hand coding. As a result, the translator is more than a means of hiding communication costs automatically; it demonstrates the utility of semantic-level optimization against a well-known library.
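Bamboo's key idea is reformulating MPI code as a task dependency graph executed in data-driven fashion. The toy sketch below illustrates that execution model with Python's standard-library graphlib, dispatching tasks as soon as their predecessors finish; it is not Bamboo's actual runtime, and the task names are invented.

```python
# Data-driven execution of a task dependency graph (Python 3.9+).
from graphlib import TopologicalSorter

graph = {                                  # task -> set of tasks it depends on
    "halo_exchange": set(),
    "interior_compute": set(),             # independent: overlaps with communication
    "boundary_compute": {"halo_exchange"},
    "reduce": {"interior_compute", "boundary_compute"},
}
ts = TopologicalSorter(graph)
ts.prepare()
while ts.is_active():
    for task in ts.get_ready():            # every task whose inputs are ready
        print("running", task)             # a real runtime would dispatch these concurrently
        ts.done(task)
```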
Data-driven train set crash dynamics simulation
NASA Astrophysics Data System (ADS)
Tang, Zhao; Zhu, Yunrui; Nie, Yinyu; Guo, Shihui; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2017-02-01
Traditional finite element (FE) methods are arguably expensive in computation/simulation of the train crash. High computational cost limits their direct applications in investigating dynamic behaviours of an entire train set for crashworthiness design and structural optimisation. On the contrary, multi-body modelling is widely used because of its low computational cost with the trade-off in accuracy. In this study, a data-driven train crash modelling method is proposed to improve the performance of a multi-body dynamics simulation of train set crash without increasing the computational burden. This is achieved by the parallel random forest algorithm, which is a machine learning approach that extracts useful patterns of force-displacement curves and predicts a force-displacement relation in a given collision condition from a collection of offline FE simulation data on various collision conditions, namely different crash velocities in our analysis. Using the FE simulation results as a benchmark, we compared our method with traditional multi-body modelling methods and the result shows that our data-driven method improves the accuracy over traditional multi-body models in train crash simulation and runs at the same level of efficiency.
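As a rough illustration of the surrogate idea, the sketch below trains scikit-learn's RandomForestRegressor to map crash velocity and displacement to force on a synthetic toy force law, standing in for the paper's parallel random forest trained on offline FE simulation data.

```python
# Random-forest surrogate for force-displacement responses (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
v = rng.uniform(5, 30, 500)                   # crash velocity, m/s
d = rng.uniform(0, 1.0, 500)                  # displacement, m
F = 50 * d * v + 5 * np.sin(8 * d) * v + rng.normal(0, 2, 500)   # toy force law

model = RandomForestRegressor(n_estimators=200, n_jobs=-1).fit(np.c_[v, d], F)
print(model.predict([[20.0, 0.5]]))           # predicted force at an unseen condition
```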
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios, especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered; indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated into a program named ALICE. This program integrates mechanical stability analysis through GIS software, taking data uncertainty into account. The method proposes a quantitative classification of landslide hazard and offers a useful tool to save time and improve efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps, as it is the only way to account for some influential factors in slope stability, such as the heterogeneity of geological formations or the effects of anthropogenic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones depending on geotechnics and on different hydrological contexts varying in time. This work, carried out within the framework of the SafeLand project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
Capture the Human Side of Learning: Data Makeover Puts Students Front and Center
ERIC Educational Resources Information Center
Sharratt, Lyn; Fullan, Michael
2013-01-01
Education is overloaded with programs and data. The growth of digital power has aided and abetted the spread of accountability-driven data--Adequate Yearly Progress, test results for every child in every grade, Common Core standards, formative and summative assessments. Technology accelerates the onslaught of data. All this information goes for…
A method for using real world data in breast cancer modeling.
Pobiruchin, Monika; Bochum, Sylvia; Martens, Uwe M; Kieser, Meinhard; Schramm, Wendelin
2016-04-01
Today, hospitals and other health care-related institutions are accumulating a growing bulk of real-world clinical data. Such data offer new possibilities for the generation of disease models for health economic evaluation. In this article, we propose a new approach to leverage cancer registry data for the development of Markov models. Records of breast cancer patients from a clinical cancer registry were used to construct a real-world data-driven disease model. We describe a model generation process that maps database structures to disease state definitions based on medical expert knowledge. Software was programmed in Java to automatically derive a model structure and transition probabilities. We illustrate our method with the reconstruction of a published breast cancer reference model derived primarily from clinical study data. In doing so, we exported longitudinal patient data from a clinical cancer registry covering eight years. The patient cohort (n=892) comprised HER2-positive and HER2-negative women treated with or without Trastuzumab. The models generated with this method for the respective patient cohorts were comparable to the reference model in their structure and treatment effects. However, our computed disease models reflect a more detailed picture of the transition probabilities, especially for disease-free survival and recurrence. Our work presents an approach to extract Markov models semi-automatically using real-world data from a clinical cancer registry. Health care decision makers may benefit from more realistic disease models to improve health care-related planning and actions based on their own data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
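The core derivation step, turning longitudinal registry records into Markov transition probabilities, can be sketched as counting state-to-state moves per model cycle and row-normalizing. The example below uses invented state names and toy trajectories, not the registry's schema or the paper's Java implementation.

```python
# Estimate a Markov transition matrix from longitudinal patient trajectories.
import numpy as np

states = ["disease-free", "recurrence", "death"]
idx = {s: i for i, s in enumerate(states)}
trajectories = [                                  # one state per model cycle (toy data)
    ["disease-free", "disease-free", "recurrence", "death"],
    ["disease-free", "disease-free", "disease-free", "disease-free"],
    ["disease-free", "recurrence", "recurrence", "death"],
]
counts = np.zeros((3, 3))
for traj in trajectories:
    for a, b in zip(traj, traj[1:]):              # count observed moves
        counts[idx[a], idx[b]] += 1
counts[idx["death"], idx["death"]] += 1           # death is an absorbing state
P = counts / counts.sum(axis=1, keepdims=True)    # row-normalize to probabilities
print(np.round(P, 2))
```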
Mission Driven Scene Understanding: Candidate Model Training and Validation
2016-09-01
One of the candidate engines we are evaluating for mission-driven scene understanding is a convolutional neural network (CNN) program (Theano-AlexNet) installed on a Windows 10 notebook computer. To the best of our knowledge, an implementation of the open-source, Python-based AlexNet CNN on a Windows notebook computer has not been previously reported. In this report, we present progress toward proof-of-principle testing.
RADC Multi-Dimensional Signal-Processing Research Program.
1980-09-30
Noise-driven linear filters permit development of the joint probability density function, or likelihood function, for the image: the image is modeled as the output of a spatial linear filter driven by white noise (see Fig. 1, model for image formation), whose probability density function is assumed known. The report also covers methods of accelerating convergence, application to image deblurring, extensions, and convergence of iterative signal-processing algorithms.
Teachers' Intentions to Use National Literacy and Numeracy Assessment Data: A Pilot Study
ERIC Educational Resources Information Center
Pierce, Robyn; Chick, Helen
2011-01-01
In recent years the educational policy environment has emphasised data-driven change. This has increased the expectation for school personnel to use statistical information to inform their programs and to improve teaching practices. Such data include system reports of student achievement tests and socio-economic profiles provided to schools by…
NASA Technical Reports Server (NTRS)
Smith, James A.
2003-01-01
This paper addresses the fundamental question of why birds occur where and when they do, i.e., what are the causative factors that determine the spatio-temporal distributions, abundance, or richness of bird species? In this paper we outline the first steps toward building a satellite-data-driven model of avian energetics and species richness based on individual bird physiology, morphology, and interaction with the spatio-temporal habitat. To evaluate our model, we will use the North American Breeding Bird Survey and Christmas Bird Count data for species richness, wintering and breeding range. Long-term and current satellite data series include AVHRR, Landsat, and MODIS.
Climate-based models for West Nile Culex mosquito vectors in the Northeastern US
NASA Astrophysics Data System (ADS)
Gong, Hongfei; Degaetano, Arthur T.; Harrington, Laura C.
2011-05-01
Climate-based models simulating Culex mosquito population abundance in the Northeastern US were developed. Two West Nile vector species, Culex pipiens and Culex restuans, were included in model simulations. The model was optimized by a parameter-space search within biological bounds. Mosquito population dynamics were driven by major environmental factors including temperature, rainfall, evaporation rate, and photoperiod. The results show a strong correlation between the timing of early population increases (an early warning of West Nile virus risk) and the decreases in late summer. Simulated abundance was highly correlated with actual mosquito capture in New Jersey light traps and validated with field data. This climate-based model simulates the population dynamics of both the adult and immature life stages of Culex arbovirus vectors in the Northeastern US. It is expected to have direct and practical application for mosquito control and West Nile prevention programs.
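A hedged sketch of the parameter-space-search idea follows: simulate a toy temperature-driven abundance model and choose, within biological bounds, the parameters that best match observed captures. The model form and all values below are illustrative assumptions, not the published model.

```python
# Grid search over biologically bounded parameters of a toy abundance model.
import numpy as np

rng = np.random.default_rng(11)
days = np.arange(180)
temp = 15 + 10 * np.sin(2 * np.pi * (days - 60) / 365)     # seasonal temperature, toy

def simulate(r, T0):
    pop = np.empty(days.size)
    pop[0] = 10.0
    for i in range(1, days.size):
        growth = r * max(temp[i] - T0, 0.0)                # degree-day style forcing
        pop[i] = pop[i-1] + growth * pop[i-1] * (1 - pop[i-1] / 1e4)
    return pop

observed = simulate(0.012, 12.0) * rng.lognormal(0, 0.1, days.size)  # noisy trap counts
best = min(((r, T0) for r in np.linspace(0.005, 0.02, 16)            # biological bounds
                    for T0 in np.linspace(8, 16, 17)),
           key=lambda p: np.sum((simulate(*p) - observed) ** 2))
print("best-fit (r, T0):", best)
```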
ERIC Educational Resources Information Center
Shaw, Rhonda R.
2017-01-01
Education reform is inevitable; however, the journey of reform must ensure that educators are equipped to meet the diverse needs of all children within the classrooms throughout. Data-driven decision making is going to be the driving force for making that happen. This mixed model research was designed to show how implementing data-driven…
Data-Driven H∞ Control for Nonlinear Distributed Parameter Systems.
Luo, Biao; Huang, Tingwen; Wu, Huai-Ning; Yang, Xiong
2015-11-01
The data-driven H∞ control problem of nonlinear distributed parameter systems is considered in this paper. An off-policy learning method is developed to learn the H∞ control policy from real system data rather than from a mathematical model. First, Karhunen-Loève decomposition is used to compute the empirical eigenfunctions, which are then employed to derive a reduced-order model (ROM) of the slow subsystem based on singular perturbation theory. The H∞ control problem is reformulated based on the ROM, where it can, in theory, be transformed into solving the Hamilton-Jacobi-Isaacs (HJI) equation. To learn the solution of the HJI equation from real system data, a data-driven off-policy learning approach is proposed based on the simultaneous policy update algorithm, and its convergence is proved. For implementation purposes, a neural network (NN)-based action-critic structure is developed, where a critic NN and two action NNs are employed to approximate the value function, control policy, and disturbance policy, respectively. Subsequently, a least-squares NN weight-tuning rule is derived with the method of weighted residuals. Finally, the developed data-driven off-policy learning approach is applied to a nonlinear diffusion-reaction process, and the obtained results demonstrate its effectiveness.
A Data System for a Rapid Evaluation Class of Subscale Aerial Vehicle
NASA Technical Reports Server (NTRS)
Hogge, Edward F.; Quach, Cuong C.; Vazquez, Sixto L.; Hill, Boyd L.
2011-01-01
A low-cost, rapid evaluation test aircraft is used to develop and test airframe damage diagnosis algorithms at Langley Research Center as part of NASA's Aviation Safety Program. The remotely operated subscale aircraft is instrumented with sensors to monitor structural response during flight. Data are collected for good and compromised airframe configurations to develop data-driven models for diagnosing airframe state. This paper describes the data acquisition system (DAS) of the rapid evaluation test aircraft. A PC/104 form factor DAS was developed to allow use of MATLAB/Simulink simulation code in Langley's existing subscale aircraft flight test infrastructure. The small scale of the test aircraft permitted laboratory testing of the actual flight article under controlled conditions. The low cost and modularity of the DAS permitted adaptation to various flight experiment requirements.
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
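The sampler at the heart of this analysis, random-walk Metropolis, is compact enough to sketch. The toy example below estimates a single parameter from synthetic data with a flat prior; the phenology models and trial data of the study are not reproduced.

```python
# Random-walk Metropolis for one parameter (e.g., a thermal-time requirement).
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(100.0, 5.0, size=50)        # stand-in observations

def log_post(theta):
    """Log-posterior: flat prior + Gaussian likelihood with known sigma=5."""
    return -0.5 * np.sum((data - theta) ** 2) / 5.0**2

theta, samples = 90.0, []
for _ in range(5000):
    prop = theta + rng.normal(scale=2.0)      # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                          # accept; otherwise keep current value
    samples.append(theta)
print(f"posterior mean ~ {np.mean(samples[1000:]):.2f} (after burn-in)")
```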
Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones
2015-07-01
Final technical report, Iowa State University, July 2015. Reported contributions include a machine analysis system to detect novel, sophisticated Android malware, and an innovative library summarization technique and its incorporation into the analysis.
Spreter Von Kreudenstein, Thomas; Lario, Paula I; Dixit, Surjit B
2014-01-01
Computational and structure-guided methods can make significant contributions to the development of solutions for difficult protein engineering problems, including the optimization of the next generation of engineered antibodies. In this paper, we describe a contemporary industrial antibody engineering program based on a hypothesis-driven in silico protein optimization method. The foundational concepts and methods of computational protein engineering are discussed, and an example of a computational modeling and structure-guided protein engineering workflow is provided for the design of a best-in-class heterodimeric Fc with high purity and favorable biophysical properties. We present the engineering rationale as well as structural and functional characterization data on these engineered designs. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Bouya, Zahra; Terkildsen, Michael
2016-07-01
The Australian Space Forecast Centre (ASFC) provides space weather forecasts to a diverse group of customers. Space Weather Services (SWS) within the Australian Bureau of Meteorology is focussed both on developing tailored products and services for the key customer groups and on supporting ASFC operations. Research in SWS is largely centred on the development of data-driven models using a range of solar-terrestrial data. This paper covers data requirements, approaches, and recent SWS activities for data-driven modelling, with a focus on regional ionospheric specification and forecasting.
Latest in Campus Transportation
ERIC Educational Resources Information Center
Molloy, Larry
1974-01-01
Innovations in handling bicycles, autos, and buses are appearing on campuses across the country. Computer-driven shuttle cars and monorails are on the way. Provides information sources for more data about ongoing, innovative campus transportation programs. (Author)
Magnani, Barbarajean; Harubin, Beth; Katz, Judith F; Zuckerman, Andrea L; Strohsnitter, William C
2016-12-01
See, Test & Treat is a pathologist-driven program to provide cervical and breast cancer screening to underserved and underinsured patient populations. This program is largely funded by the CAP Foundation (College of American Pathologists, Northfield, Illinois) and is a collaborative effort among several medical specialties united to address gaps in the current health care system. The objective was to provide an outline for administering a See, Test & Treat program, using an academic medical center as a model for providing care, and to collate the results of 5 years of data on the See, Test & Treat program's findings. Sources include data from patients seen at Tufts Medical Center (Boston, Massachusetts) who presented to the See, Test & Treat program and institutional data between 2010 and 2014 detailing how to organize and operationalize a volunteer cancer-screening program. During the 5-year course of the program, 203 women were provided free cervical and breast cancer screening. Of the 169 patients who obtained Papanicolaou screening, 36 (21.3%) had abnormal Papanicolaou tests. In addition, 16 of 130 patients (12.3%) who underwent mammography had abnormal findings. In general, women from ethnic populations have barriers that prevent them from participating in cancer screening. However, the CAP Foundation's See, Test & Treat program is designed to reduce those barriers for these women by providing care that addresses cultural, financial, and practical issues. Although screening programs are helpful in identifying those who need further treatment, obtaining further treatment for these patients continues to be a challenge.
NASA Astrophysics Data System (ADS)
Sboev, A.; Moloshnikov, I.; Gudovskikh, D.; Rybka, R.
2017-12-01
In this work we compare several data-driven approaches to the task of identifying an author's gender in texts with or without gender imitation. The data corpus was specially gathered for this task with crowdsourcing. The best models are a convolutional neural network with morphological input data (F1-measure: 88% ± 3) for texts without imitation, and a gradient boosting model with a vector of character n-gram frequencies as input data (F1-measure: 64% ± 3) for texts with gender imitation. A method for filtering the crowdsourced corpus using a limited reference sample of texts to increase the accuracy of the result is discussed.
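A minimal sketch of the character n-gram plus gradient boosting configuration reported as best for imitated texts, using scikit-learn; the corpus, labels, and hyperparameters here are placeholders.

```python
# Character n-gram frequencies feeding a gradient boosting classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import make_pipeline

texts = ["short sample text one", "another brief sample",
         "a third tiny text", "the fourth example"]
labels = [0, 1, 0, 1]                                      # toy author-gender codes

clf = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),  # char n-gram frequencies
    GradientBoostingClassifier(n_estimators=50),
)
clf.fit(texts, labels)
print(clf.predict(["one more unseen text"]))
```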
Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.
2012-01-01
Background: The ability to identify the risk factors related to an adverse condition, e.g., a heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or the literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data-driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge- and data-driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the performance metric, the Area Under the ROC Curve (AUC). The combined knowledge- and data-driven risk factors significantly outperform knowledge-based risk factors alone. Furthermore, those additional risk factors were confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge- and data-driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
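The core idea, a sparse regression whose penalty is relaxed for known risk factors and stronger for candidate data-driven ones, can be emulated with a standard L1 fit by rescaling columns, since enlarging a column's scale effectively weakens that feature's penalty. The sketch below uses synthetic data and invented feature roles; it is not the authors' exact formulation.

```python
# L1 logistic regression with per-feature penalties emulated via column scaling.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
X = rng.normal(size=(1000, 6))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=1000) > 0).astype(int)

known = np.array([1, 1, 1, 0, 0, 0], dtype=bool)   # first 3 = "known" factors (toy)
scale = np.where(known, 2.0, 1.0)                  # larger scale => weaker L1 penalty
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X * scale, y)
print(np.round(clf.coef_ * scale, 2))              # coefficients on the original scale
```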
NASA Technical Reports Server (NTRS)
Thompson, David E.; Rajkumar, T.; Clancy, Daniel (Technical Monitor)
2002-01-01
The San Francisco Bay Delta is a large hydrodynamic complex that incorporates the Sacramento and San Joaquin Estuaries, the Burman Marsh, and the San Francisco Bay proper. Competition for the use of this extensive water system comes from the fisheries industry, the agricultural industry, and the marine and estuarine animal species within the Delta. As tidal fluctuations occur, more saline water pushes upstream, allowing fish to migrate beyond the Burman Marsh for breeding and habitat occupation. However, the agriculture industry does not want extensive salinity intrusion to impact water quality for human and plant consumption. The balance is regulated by pumping stations located along the estuaries and reservoirs, whereby flushing of fresh water keeps the saline intrusion at bay. The pumping schedule is driven by data collected at various locations within the Bay Delta and by numerical models that predict the salinity intrusion as part of a larger model of the system. The Interagency Ecological Program (IEP) for the San Francisco Bay/Sacramento-San Joaquin Estuary collects, monitors, and archives the data, and the Department of Water Resources provides a numerical model simulation (DSM2) from which predictions are made that drive the pumping schedule. A problem with this procedure is that the numerical simulation takes roughly 16 hours to complete a prediction. We have created a neural net, optimized with a genetic algorithm, that takes as input the archived data from multiple stations and predicts stage, salinity, and flow at the Carquinez Straits (at the downstream end of the Burman Marsh). This model appears robust in its predictions and operates much faster than the current numerical DSM2 model. Because the system is strongly tidally driven, we used both Principal Component Analysis and Fast Fourier Transforms to discover dominant features within the IEP data. We then filtered out the dominant tidal forcing to discover non-primary tidal effects, and used this to enhance the neural network by mapping input-output relationships in a more efficient manner. Furthermore, the neural network implicitly incorporates both the hydrodynamic and water quality models into a single predictive system. Although our model has not yet been enhanced to demonstrate improved pumping schedules, it could support better decision-making procedures that may then be implemented by State agencies if desired. Our intention is now to use this model in the smaller Elkhorn Slough complex near Monterey Bay, where no such hydrodynamic model currently exists. At the Elkhorn Slough, we are fusing the neural net model of tidally driven flow with in situ flow data and airborne and satellite remote sensing data. These data further constrain the behavior of the model in predicting the longer-term health and future of this vital estuary.
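The FFT step described above, isolating and removing the dominant tidal constituent so the network can focus on non-primary effects, can be sketched as follows on a synthetic stage record. The constituent period and amplitudes are assumptions, not IEP data.

```python
# Locate and remove the dominant tidal constituent with an FFT.
import numpy as np

dt_hours = 1.0
t = np.arange(0, 24 * 30, dt_hours)                  # one month of hourly samples
tide = 1.2 * np.sin(2 * np.pi * t / 12.42)           # assumed M2 constituent (~12.42 h)
slow = 0.3 * np.sin(2 * np.pi * t / 150.0)           # assumed non-tidal variation
signal = tide + slow + 0.05 * np.random.default_rng(6).normal(size=t.size)

spec = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(t.size, d=dt_hours)
k = np.argmax(np.abs(spec[1:])) + 1                  # dominant non-DC spectral peak
spec[k - 1:k + 2] = 0                                # zero out the dominant tidal band
residual = np.fft.irfft(spec, n=t.size)              # non-primary effects remain
print(f"dominant period ~ {1 / freqs[k]:.2f} h; residual std {residual.std():.3f}")
```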
Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil
2015-10-06
Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on academic and learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach.
Using Program Theory-Driven Evaluation Science to Crack the Da Vinci Code
ERIC Educational Resources Information Center
Donaldson, Stewart I.
2005-01-01
Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…
Machine Learning and Deep Learning Models to Predict Runoff Water Quantity and Quality
NASA Astrophysics Data System (ADS)
Bradford, S. A.; Liang, J.; Li, W.; Murata, T.; Simunek, J.
2017-12-01
Contaminants can be rapidly transported at the soil surface by runoff to surface water bodies. Physically-based models, which are based on the mathematical description of the main hydrological processes, are key tools for predicting surface water impairment. Alongside physically-based models, data-driven models are becoming increasingly popular for describing the behavior of hydrological and water resources systems, since they can complement or even replace physically-based models. In this presentation we propose a new data-driven model as an alternative to a physically-based overland flow and transport model. First, we developed a physically-based numerical model to simulate overland flow and contaminant transport (the HYDRUS-1D overland flow module). A large number of numerical simulations were carried out to develop a database containing information about the impact of various input parameters (weather patterns, surface topography, vegetation, soil conditions, contaminants, and best management practices) on runoff water quantity and quality outputs. This database was used to train data-driven models. Three different methods (Neural Networks, Support Vector Machines, and Recurrent Neural Networks) were explored to prepare input-output functional relations. Results demonstrate the ability and limitations of machine learning and deep learning models to predict runoff water quantity and quality.
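As a sketch of the model-comparison step, the snippet below cross-validates two of the named learner families (Support Vector Machines and a small neural network) on a synthetic stand-in for the simulation-generated database; inputs, outputs, and hyperparameters are illustrative.

```python
# Compare candidate data-driven learners on a simulation-derived database (toy).
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.uniform(size=(400, 4))      # rainfall, slope, roughness, initial moisture (toy)
y = X[:, 0] * (1 - X[:, 3]) * (1 + X[:, 1]) + 0.05 * rng.normal(size=400)  # toy runoff

for name, est in [("SVR", SVR()),
                  ("MLP", MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                                       random_state=0))]:
    score = cross_val_score(est, X, y, cv=5, scoring="r2").mean()
    print(name, round(score, 3))    # pick the learner with the best CV fit
```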
All Hands on Deck: A Comprehensive, Results-Driven Counseling Model
ERIC Educational Resources Information Center
Salina, Charles; Girtz, Suzann; Eppinga, Joanie; Martinez, David; Kilian, Diana Blumer; Lozano, Elizabeth; Martinez, Adrian P.; Crowe, Dustin; De La Barrera, Maria; Mendez, Maribel Madrigal; Shines, Terry
2014-01-01
A graduation rate of 49% alarmed Sunnyside High School in 2009. With graduation rates in the bottom 5% statewide, Sunnyside was awarded a federally funded School Improvement Grant. The "turnaround" principal and the school counselors aligned goals with the ASCA National Model through the program All Hands On Deck (AHOD), based on…
MICRO-U 70.1: Training Model of an Instructional Institution, Users Manual.
ERIC Educational Resources Information Center
Springer, Colby H.
MICRO-U is a student-demand-driven deterministic model. Student enrollment, by degree program, is used to develop an Instructional Work Load Matrix. Linear equations using Weekly Student Contact Hours (WSCH), Full-Time Equivalent (FTE) students, FTE faculty, and number of disciplines determine library, central administration, and physical plant…
A Model for Mapping Linkages between Health and Education Agencies To Improve School Health.
ERIC Educational Resources Information Center
St. Leger, Lawrence; Nutbeam, Don
2000-01-01
Reviews the evolution of efforts to develop effective, sustainable school health programs, arguing that efforts were significantly driven by public health priorities and have not adequately accounted for educational perspectives. A model illustrating linkages between different school-based inputs and strategies and long-term health and educational…
Inspiring Teaching: Preparing Teachers to Succeed in Mission-Driven Schools
ERIC Educational Resources Information Center
Feiman-Nemser, Sharon, Ed.; Tamir, Eran, Ed.; Hammerness, Karen, Ed.
2014-01-01
How can we best prepare pre-service teachers to succeed in the classroom--and to stay in teaching over time? The one-size-fits-all model of traditional teacher education programs has been widely criticized, yet the most popular alternative--fast-track programs--have at best a mixed record of success. An increasing number of districts and charter…
ERIC Educational Resources Information Center
Fields, Deborah; Vasudevan, Veena; Kafai, Yasmin B.
2015-01-01
We highlight ways to support interest-driven creation of digital media in Scratch, a visual-based programming language and community, within a high school programming workshop. We describe a collaborative approach, the programmers' collective, that builds on social models found in do-it-yourself and open source communities, but with scaffolding…
Saxton, Michael J
2007-01-01
Modeling obstructed diffusion is essential to the understanding of diffusion-mediated processes in the crowded cellular environment. Simple Monte Carlo techniques for modeling obstructed random walks are explained and related to Brownian dynamics and more complicated Monte Carlo methods. Random number generation is reviewed in the context of random walk simulations. Programming techniques and event-driven algorithms are discussed as ways to speed simulations.
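A minimal obstructed-random-walk Monte Carlo in the spirit of the techniques reviewed: immobile obstacles occupy a fraction of lattice sites, blocked moves are rejected, and the mean-squared displacement (MSD) is tracked. The lattice size, obstacle fraction, and step counts below are arbitrary choices for illustration.

```python
# Obstructed random walk on a periodic square lattice with point obstacles.
import numpy as np

rng = np.random.default_rng(8)
L, steps, walkers, phi = 64, 1000, 200, 0.3   # lattice, steps, walkers, obstacle fraction
obstacles = rng.random((L, L)) < phi          # immobile obstacles
moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

msd = np.zeros(steps)
for _ in range(walkers):
    while True:                               # start each walker on a free site
        pos = rng.integers(0, L, size=2)
        if not obstacles[pos[0], pos[1]]:
            break
    start = pos.copy()
    for s in range(steps):
        trial = pos + moves[rng.integers(4)]  # unwrapped coordinates for the MSD
        if not obstacles[trial[0] % L, trial[1] % L]:   # blocked moves are rejected
            pos = trial
        msd[s] += np.sum((pos - start) ** 2)
msd /= walkers
print("MSD after", steps, "steps:", msd[-1])  # sublinear growth signals anomalous diffusion
```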
Using Flexible Data-Driven Frameworks to Enhance School Psychology Training and Practice
ERIC Educational Resources Information Center
Coleman, Stephanie L.; Hendricker, Elise
2016-01-01
While a great number of scientific advances have been made in school psychology, the research to practice gap continues to exist, which has significant implications for training future school psychologists. Training in flexible, data-driven models may help school psychology trainees develop important competencies that will benefit them throughout…
Bustamante, Carlos D.; Valero-Cuevas, Francisco J.
2010-01-01
The field of complex biomechanical modeling has begun to rely on Monte Carlo techniques to investigate the effects of parameter variability and measurement uncertainty on model outputs, search for optimal parameter combinations, and define model limitations. However, advanced stochastic methods to perform data-driven explorations, such as Markov chain Monte Carlo (MCMC), become necessary as the number of model parameters increases. Here, we demonstrate the feasibility of an MCMC approach to improve the fitness of realistically large biomechanical models, in what is, to our knowledge, its first such use. We used a Metropolis-Hastings algorithm to search increasingly complex parameter landscapes (3, 8, 24, and 36 dimensions) to uncover underlying distributions of anatomical parameters of a "truth model" of the human thumb on the basis of simulated kinematic data (thumbnail location, orientation, and linear and angular velocities) polluted by zero-mean, uncorrelated multivariate Gaussian "measurement noise." Driven by these data, ten Markov chains searched each model parameter space for the subspace that best fit the data (posterior distribution). As expected, the convergence time increased, more local minima were found, and marginal distributions broadened as the parameter space complexity increased. In the 36-D scenario, some chains found local minima, but the majority of chains converged to the true posterior distribution (confirmed using a cross-validation dataset), thus demonstrating the feasibility and utility of these methods for realistically large biomechanical problems. PMID:19272906
Genomic signal processing: from matrix algebra to genetic networks.
Alter, Orly
2007-01-01
DNA microarrays make it possible, for the first time, to record the complete genomic signals that guide the progression of cellular processes. Future discovery in biology and medicine will come from the mathematical modeling of these data, which hold the key to fundamental understanding of life on the molecular level, as well as answers to questions regarding diagnosis, treatment, and drug development. This chapter reviews the first data-driven models that were created from these genome-scale data, through adaptations and generalizations of mathematical frameworks from matrix algebra that have proven successful in describing the physical world, in such diverse areas as mechanics and perception: the singular value decomposition model, the generalized singular value decomposition comparative model, and the pseudoinverse projection integrative model. These models provide mathematical descriptions of the genetic networks that generate and sense the measured data, where the mathematical variables and operations represent biological reality. The variables, patterns uncovered in the data, correlate with activities of cellular elements such as regulators or transcription factors that drive the measured signals and cellular states where these elements are active. The operations, such as data reconstruction, rotation, and classification in subspaces of selected patterns, simulate experimental observation of only the cellular programs that these patterns represent. These models are illustrated in the analyses of RNA expression data from yeast and human during their cell cycle programs and DNA-binding data from yeast cell cycle transcription factors and replication initiation proteins. Two alternative pictures of RNA expression oscillations during the cell cycle that emerge from these analyses, which parallel well-known designs of physical oscillators, convey the capacity of the models to elucidate the design principles of cellular systems, as well as guide the design of synthetic ones. In these analyses, the power of the models to predict previously unknown biological principles is demonstrated with a prediction of a novel mechanism of regulation that correlates DNA replication initiation with cell cycle-regulated RNA transcription in yeast. These models may become the foundation of a future in which biological systems are modeled as physical systems are today.
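The first of these frameworks, the singular value decomposition model, is easy to sketch: decompose a genes-by-arrays matrix and read off the fraction of overall expression captured by each "eigengene". The data below are random stand-ins for real microarray measurements.

```python
# SVD of an expression matrix: eigengenes and their expression fractions.
import numpy as np

rng = np.random.default_rng(9)
expression = rng.normal(size=(2000, 14))           # genes x arrays (e.g., time points)
U, s, Vt = np.linalg.svd(expression, full_matrices=False)
fraction = s**2 / np.sum(s**2)                     # "eigenexpression" fractions
print(np.round(fraction[:3], 3))                   # contribution of leading eigengenes
# Vt[0] is the leading eigengene's pattern across arrays (e.g., cell-cycle phase).
```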
The EPA Comptox Chemistry Dashboard: A Web-Based Data ...
The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data, but recent developments have focused on a new software architecture that assembles the resources into a single platform. A new web application, the CompTox Chemistry Dashboard, provides access to data associated with ~720,000 chemical substances. These data include experimental and predicted physicochemical property data, bioassay screening data associated with the ToxCast program, product and functional use information and a myriad of related data of value to environmental scientists. The dashboard provides chemical-based searching based on chemical names, synonyms and CAS Registry Numbers. Flexible search capabilities allow for chemical identification.
The EPA CompTox Chemistry Dashboard - an online resource ...
The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data. Recent work has focused on the development of a new architecture that assembles the resources into a single platform. With a focus on delivering access to Open Data streams, web service integration accessibility and a user-friendly web application, the CompTox Dashboard provides access to data associated with ~720,000 chemical substances. These data include research data in the form of bioassay screening data associated with the ToxCast program, experimental and predicted physicochemical properties, product and functional use information and related data of value to environmental scientists. This presentation will provide an overview of the CompTox Dashboard and its value.
NASA Astrophysics Data System (ADS)
Li, Zhiyong; Hoagg, Jesse B.; Martin, Alexandre; Bailey, Sean C. C.
2018-03-01
This paper presents a data-driven computational model for simulating unsteady turbulent flows where sparse measurement data are available. The model uses the retrospective cost adaptation (RCA) algorithm to automatically adjust the closure coefficients of the Reynolds-averaged Navier-Stokes (RANS) k-ω turbulence equations to improve agreement between the simulated flow and the measurements. The RCA-RANS k-ω model is verified for steady flow using a pipe-flow test case and for unsteady flow using a surface-mounted-cube test case. Measurements used for adaptation of the verification cases are obtained from baseline simulations with known closure coefficients. These verification test cases demonstrate that the RCA-RANS k-ω model can successfully adapt the closure coefficients to improve agreement between the simulated flow field and a set of sparse flow-field measurements. Furthermore, the RCA-RANS k-ω model improves agreement between the simulated flow and the baseline flow at locations at which measurements do not exist. The RCA-RANS k-ω model is also validated with experimental data from two test cases: steady pipe flow, and unsteady flow past a square cylinder. In both test cases, the adaptation improves agreement with experimental data in comparison to the results from a non-adaptive RANS k-ω model that uses the standard values of the k-ω closure coefficients. For the steady pipe flow, adaptation is driven by mean stream-wise velocity measurements at 24 locations along the pipe radius. The RCA-RANS k-ω model reduces the average velocity error at these locations by over 35%. For the unsteady flow over a square cylinder, adaptation is driven by time-varying surface pressure measurements at 2 locations on the square cylinder. The RCA-RANS k-ω model reduces the average surface-pressure error at these locations by 88.8%.
Lara A. Roman; E. Gregory McPherson; Bryant C. Scharenbroch; Julia Bartens
2013-01-01
Urban forest monitoring data are essential to assess the impacts of tree planting campaigns and management programs. Local practitioners have monitoring projects that have not been well documented in the urban forestry literature. To learn more about practitioner-driven monitoring efforts, the authors surveyed 32 local urban forestry organizations across the United...
Lessons Learned from a Data-Driven College Access Program: The National College Advising Corps
ERIC Educational Resources Information Center
Horng, Eileen L.; Evans, Brent J.; antonio, anthony l.; Foster, Jesse D.; Kalamkarian, Hoori S.; Hurd, Nicole F.; Bettinger, Eric P.
2013-01-01
This chapter discusses the collaboration between a national college access program, the National College Advising Corps (NCAC), and its research and evaluation team at Stanford University. NCAC is currently active in almost four hundred high schools and through the placement of a recent college graduate to serve as a college adviser provides…
Vodovotz, Yoram; Xia, Ashley; Read, Elizabeth L; Bassaganya-Riera, Josep; Hafler, David A; Sontag, Eduardo; Wang, Jin; Tsang, John S; Day, Judy D; Kleinstein, Steven H; Butte, Atul J; Altman, Matthew C; Hammond, Ross; Sealfon, Stuart C
2017-02-01
Emergent responses of the immune system result from the integration of molecular and cellular networks over time and across multiple organs. High-content and high-throughput analysis technologies, concomitantly with data-driven and mechanistic modeling, hold promise for the systematic interrogation of these complex pathways. However, connecting genetic variation and molecular mechanisms to individual phenotypes and health outcomes has proven elusive. Gaps remain in data, and disagreements persist about the value of mechanistic modeling for immunology. Here, we present the perspectives that emerged from the National Institute of Allergy and Infectious Disease (NIAID) workshop 'Complex Systems Science, Modeling and Immunity' and subsequent discussions regarding the potential synergy of high-throughput data acquisition, data-driven modeling, and mechanistic modeling to define new mechanisms of immunological disease and to accelerate the translation of these insights into therapies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Dealing with Multiple Solutions in Structural Vector Autoregressive Models.
Beltz, Adriene M; Molenaar, Peter C M
2016-01-01
Structural vector autoregressive models (VARs) hold great potential for psychological science, particularly for time series data analysis. They capture the magnitude, direction of influence, and temporal (lagged and contemporaneous) nature of relations among variables. Unified structural equation modeling (uSEM) is an optimal structural VAR instantiation, according to large-scale simulation studies, and it is implemented within an SEM framework. However, little is known about the uniqueness of uSEM results. Thus, the goal of this study was to investigate whether multiple solutions result from uSEM analysis and, if so, to demonstrate ways to select an optimal solution. This was accomplished with two simulated data sets, an empirical data set concerning children's dyadic play, and modifications to the group iterative multiple model estimation (GIMME) program, which implements uSEMs with group- and individual-level relations in a data-driven manner. Results revealed multiple solutions when there were large contemporaneous relations among variables. Results also verified several ways to select the correct solution when the complete solution set was generated, such as the use of cross-validation, maximum standardized residuals, and information criteria. This work has immediate and direct implications for the analysis of time series data and for the inferences drawn from those data concerning human behavior.
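uSEM and GIMME themselves are implemented elsewhere (GIMME is an R program), so the sketch below only illustrates the generic practice of comparing competing time-series solutions with information criteria, here lag orders of a vector autoregression fit with statsmodels on synthetic bivariate data.

```python
# Compare competing VAR specifications by BIC (stand-in for solution selection).
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(10)
n = 300
x = np.zeros((n, 2))
for t in range(1, n):                     # simulate a known VAR(1) process
    x[t, 0] = 0.5 * x[t-1, 0] + 0.3 * x[t-1, 1] + rng.normal(scale=0.5)
    x[t, 1] = 0.4 * x[t-1, 1] + rng.normal(scale=0.5)

model = VAR(x)
for lag in (1, 2, 3):                     # competing specifications
    res = model.fit(lag)
    print(f"lag {lag}: BIC = {res.bic:.3f}")   # prefer the minimum-BIC solution
```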
NASA Technical Reports Server (NTRS)
Kelly, A. J.; Jahn, R. G.; Choueiri, E. Y.
1990-01-01
The dominant unstable electrostatic wave modes of an electromagnetically accelerated plasma are investigated. The study is the first part of a three-phase program aimed at characterizing the current-driven turbulent dissipation degrading the efficiency of Lorentz force plasma accelerators such as the MPD thruster. The analysis uses a kinetic theory that includes magnetic and thermal effects as well as those of an electron current transverse to the magnetic field and collisions, thus combining all the features of previous models. Analytical and numerical solutions allow a detailed description of threshold criteria, finite growth behavior, destabilization mechanisms and maximized-growth characteristics of the dominant unstable modes. The lower hybrid current-driven instability is implicated as dominant and was found to preserve its character in the collisional plasma regime.
NASA Technical Reports Server (NTRS)
Eluszkiewicz, Janusz; Nehrkorn, Thomas; Wofsy, Steven C.; Matross, Daniel; Gerbig, Christoph; Lin, John C.; Freitas, Saulo; Longo, Marcos; Andrews, Arlyn E.; Peters, Wouter
2007-01-01
This paper evaluates simulations of atmospheric CO2 measured in 2004 at continental surface and airborne receptors, intended to test the capability to use data with high temporal and spatial resolution for analyses of carbon sources and sinks at regional and continental scales. The simulations were performed using the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by the Weather Research and Forecasting (WRF) model, and linked to surface fluxes from the satellite-driven Vegetation Photosynthesis and Respiration Model (VPRM). The simulations provide detailed representations of hourly CO2 tower data and reproduce the shapes of airborne vertical profiles with high fidelity. WRF meteorology gives superior model performance compared with standard meteorological products, and the impact of including WRF convective mass fluxes in the STILT trajectory calculations is significant in individual cases. Important biases in the simulation are associated with the nighttime CO2 build-up and subsequent morning transition to convective conditions, and with errors in the advected lateral boundary condition. Comparison of STILT simulations driven by the WRF model against those driven by the Brazilian variant of the Regional Atmospheric Modeling System (BRAMS) shows that model-to-model differences are smaller than those between an individual transport model and observations, pointing to systematic errors in the simulated transport. Future developments in the WRF model's data assimilation capabilities, basic research into the fundamental aspects of trajectory calculations, and intercomparison studies involving other transport models are possible avenues for reducing these errors. Overall, the STILT/WRF/VPRM framework offers a powerful tool for continental and regional scale carbon flux estimates.
Empirically Driven Variable Selection for the Estimation of Causal Effects with Observational Data
ERIC Educational Resources Information Center
Keller, Bryan; Chen, Jianshen
2016-01-01
Observational studies are common in educational research, where subjects self-select or are otherwise non-randomly assigned to different interventions (e.g., educational programs, grade retention, special education). Unbiased estimation of a causal effect with observational data depends crucially on the assumption of ignorability, which specifies…
ERIC Educational Resources Information Center
Wyse, Sara A.; Long, Tammy M.; Ebert-May, Diane
2014-01-01
Graduate teaching assistants (TAs) are increasingly responsible for instruction in undergraduate science, technology, engineering, and mathematics (STEM) courses. Various professional development (PD) programs have been developed and implemented to prepare TAs for this role, but data about effectiveness are lacking and are derived almost…
Using GIS Tools and Environmental Scanning to Forecast Industry Workforce Needs
ERIC Educational Resources Information Center
Gaertner, Elaine; Fleming, Kevin; Marquez, Michelle
2009-01-01
The Centers of Excellence (COE) provide regional workforce data on high growth, high demand industries and occupations for use by community colleges in program planning and resource enhancement. This article discusses the environmental scanning research methodology and its application to data-driven decision making in community college program…
Sensor Web Dynamic Measurement Techniques and Adaptive Observing Strategies
NASA Technical Reports Server (NTRS)
Talabac, Stephen J.
2004-01-01
Sensor Web observing systems may have the potential to significantly improve our ability to monitor, understand, and predict the evolution of rapidly evolving, transient, or variable environmental features and events. This improvement will come about by integrating novel data collection techniques, new or improved instruments, emerging communications technologies and protocols, sensor mark-up languages, and interoperable planning and scheduling systems. In contrast to today's observing systems, "event-driven" sensor webs will synthesize real- or near-real time measurements and information from other platforms and then react by reconfiguring the platforms and instruments to invoke new measurement modes and adaptive observation strategies. Similarly, "model-driven" sensor webs will utilize environmental prediction models to initiate targeted sensor measurements or to use a new observing strategy. The sensor web concept contrasts with today's data collection techniques and observing system operations concepts where independent measurements are made by remote sensing and in situ platforms that do not share, and therefore cannot act upon, potentially useful complementary sensor measurement data and platform state information. This presentation describes NASA's view of event-driven and model-driven Sensor Webs and highlights several research and development activities at the Goddard Space Flight Center.
Big-Data-Driven Stem Cell Science and Tissue Engineering: Vision and Unique Opportunities.
Del Sol, Antonio; Thiesen, Hans J; Imitola, Jaime; Carazo Salas, Rafael E
2017-02-02
Achieving the promises of stem cell science to generate precise disease models and designer cell samples for personalized therapeutics will require harnessing pheno-genotypic cell-level data quantitatively and predictively in the lab and clinic. Those requirements could be met by developing a Big-Data-driven stem cell science strategy and community.
A review of surrogate models and their application to groundwater modeling
NASA Astrophysics Data System (ADS)
Asher, M. J.; Croke, B. F. W.; Jakeman, A. J.; Peeters, L. J. M.
2015-08-01
The spatially and temporally variable parameters and inputs to complex groundwater models typically result in long runtimes which hinder comprehensive calibration, sensitivity, and uncertainty analysis. Surrogate modeling aims to provide a simpler, and hence faster, model which emulates the specified output of a more complex model as a function of its inputs and parameters. In this review paper, we summarize surrogate modeling techniques in three categories: data-driven, projection-based, and hierarchical approaches. Data-driven surrogates approximate a groundwater model through an empirical model that captures the input-output mapping of the original model. Projection-based models reduce the dimensionality of the parameter space by projecting the governing equations onto a basis of orthonormal vectors. In hierarchical or multifidelity methods the surrogate is created by simplifying the representation of the physical system, such as by ignoring certain processes, or reducing the numerical resolution. In discussing the application of these methods to groundwater modeling, we note several imbalances in the existing literature: a large body of work on data-driven approaches seemingly ignores major drawbacks of the methods; only a fraction of the literature focuses on creating surrogates to reproduce the outputs of fully distributed groundwater models, despite these being ubiquitous in practice; and a number of the more advanced surrogate modeling methods have yet to be fully applied in a groundwater modeling context.
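To make the data-driven category concrete, here is a minimal sketch of a surrogate built by Gaussian process regression on input-output pairs from an "expensive" model; the one-parameter toy function below merely stands in for a long-running groundwater simulator, and the kernel settings are illustrative assumptions.

# Minimal sketch of a data-driven surrogate: emulate an expensive model's
# input-output map with Gaussian process regression (scikit-learn).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(k):
    # Stand-in for a costly simulator output (e.g., a head prediction).
    return np.sin(3 * k) + 0.5 * k

X_train = np.linspace(0, 2, 15).reshape(-1, 1)   # sampled parameter values
y_train = expensive_model(X_train).ravel()       # corresponding outputs

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(X_train, y_train)

# The surrogate predicts, with uncertainty, at a fraction of the cost,
# which is what enables comprehensive calibration and uncertainty analysis.
mean, std = gp.predict(np.array([[0.7], [1.4]]), return_std=True)
print(mean, std)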
EnviroNET: An on-line environment data base for LDEF data
NASA Technical Reports Server (NTRS)
Lauriente, Michael
1992-01-01
EnviroNET is an on-line, free-form database intended to provide a centralized depository for a wide range of technical information on environmentally induced interactions, of use to Space Shuttle customers and spacecraft designers. It provides a user-friendly, menu-driven format on globally connected networks and is available twenty-four hours a day, every day. The information, which is updated regularly, includes expository text, tabular numerical data, charts and graphs, and models. The system pools space data collected over the years by NASA, USAF, other government facilities, industry, universities, and ESA. The models accept parameter input from the user and calculate and display the derived values corresponding to that input. In addition to the archive, interactive graphics programs are also available on space debris, the neutral atmosphere, radiation, the magnetic field, and the ionosphere. A user-friendly interface is standard for all the models, with pop-up help windows describing inputs, outputs, and caveats. The system will eventually simplify mission analysis with analytical tools and deliver solutions for computationally intensive graphical applications to run 'what if' scenarios. A proposed plan for developing a repository of LDEF information for a user group concludes the presentation.
BrainLiner: A Neuroinformatics Platform for Sharing Time-Aligned Brain-Behavior Data
Takemiya, Makoto; Majima, Kei; Tsukamoto, Mitsuaki; Kamitani, Yukiyasu
2016-01-01
Data-driven neuroscience aims to find statistical relationships between brain activity and task behavior from large-scale datasets. To facilitate high-throughput data processing and modeling, we created BrainLiner as a web platform for sharing time-aligned, brain-behavior data. Using an HDF5-based data format, BrainLiner treats brain activity and data related to behavior with the same salience, aligning both behavioral and brain activity data on a common time axis. This facilitates learning the relationship between behavior and brain activity. Using a common data file format also simplifies data processing and analyses. Properties describing data are unambiguously defined using a schema, allowing machine-readable definition of data. The BrainLiner platform allows users to upload and download data, as well as to explore and search for data from the web platform. A WebGL-based data explorer can visualize highly detailed neurophysiological data from within the web browser, and a data-driven search feature allows users to search for similar time windows of data. This increases transparency, and allows for visual inspection of neural coding. BrainLiner thus provides an essential set of tools for data sharing and data-driven modeling. PMID:26858636
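The core idea of a common time axis is easy to demonstrate with HDF5. The sketch below uses hypothetical group and dataset names rather than BrainLiner's actual schema; only the layout principle (brain and behavior arrays aligned to one shared time vector) follows the description above.

# Minimal sketch: time-aligned brain and behavior data in one HDF5 file.
import h5py
import numpy as np

fs = 100.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)                # shared time axis
brain = np.random.randn(t.size, 64)         # e.g., 64 recording channels
behavior = (np.sin(t) > 0).astype(np.int8)  # e.g., a binary task state

with h5py.File("session01.h5", "w") as f:
    f.create_dataset("time", data=t)        # the common axis
    g = f.create_group("brain")
    g.create_dataset("activity", data=brain)
    g.attrs["sampling_rate_hz"] = fs
    b = f.create_group("behavior")
    b.create_dataset("task_state", data=behavior)
    b.attrs["sampling_rate_hz"] = fs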
Sustainable Cost Models for mHealth at Scale: Modeling Program Data from m4RH Tanzania.
Mangone, Emily R; Agarwal, Smisha; L'Engle, Kelly; Lasway, Christine; Zan, Trinity; van Beijma, Hajo; Orkis, Jennifer; Karam, Robert
2016-01-01
There is increasing evidence that mobile phone health interventions ("mHealth") can improve health behaviors and outcomes and are critically important in low-resource, low-access settings. However, the majority of mHealth programs in developing countries fail to reach scale. One reason may be the challenge of developing financially sustainable programs. The goal of this paper is to explore strategies for mHealth program sustainability and develop cost-recovery models for program implementers using 2014 operational program data from Mobile for Reproductive Health (m4RH), a national text-message (SMS) based health communication service in Tanzania. We delineated 2014 m4RH program costs and considered three strategies for cost-recovery for the m4RH program: user pay-for-service, SMS cost reduction, and strategic partnerships. These inputs were used to develop four different cost-recovery scenarios. The four scenarios leveraged strategic partnerships to reduce per-SMS program costs and create per-SMS program revenue and varied the structure for user financial contribution. Finally, we conducted break-even and uncertainty analyses to evaluate the costs and revenues of these models at the 2014 user volume (125,320) and at any possible break-even volume. In three of four scenarios, costs exceeded revenue by $94,596, $34,443, and $84,571 at the 2014 user volume. However, these costs represented large reductions (54%, 83%, and 58%, respectively) from the 2014 program cost of $203,475. Scenario four, in which the lowest per-SMS rate ($0.01 per SMS) was negotiated and users paid for all m4RH SMS sent or received, achieved a $5,660 profit at the 2014 user volume. A Monte Carlo uncertainty analysis demonstrated that break-even points were driven by user volume rather than variations in program costs. These results reveal that breaking even was only probable when all SMS costs were transferred to users and the lowest per-SMS cost was negotiated with telecom partners. While this strategy was sustainable for the implementer, a central concern is that health information may not reach those who are too poor to pay, limiting the program's reach and impact. Incorporating strategies presented here may make mHealth programs more appealing to funders and investors but need further consideration to balance sustainability, scale, and impact.
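The break-even reasoning in the study reduces to simple margin arithmetic, sketched below; the fixed cost, per-user cost, and per-user revenue figures are placeholders, not the actual m4RH parameters.

# Minimal sketch of break-even logic: revenue must cover fixed plus
# variable costs. Numbers are hypothetical, not m4RH program data.
def break_even_volume(fixed_cost, cost_per_user, revenue_per_user):
    """Smallest user volume at which revenue covers total cost."""
    margin = revenue_per_user - cost_per_user
    if margin <= 0:
        return None  # never breaks even: each user adds a net loss
    return fixed_cost / margin

# Hypothetical scenario: a negotiated low SMS rate, users paying per SMS.
print(break_even_volume(fixed_cost=50_000,
                        cost_per_user=0.35,
                        revenue_per_user=0.80))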
UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.
Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun
2013-12-01
Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
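The factorization underlying UTOPIAN is standard NMF, which the following minimal sketch runs on a toy corpus with scikit-learn; the semi-supervised, interactive steering that distinguishes UTOPIAN is not reproduced here.

# Minimal sketch: NMF topic modeling on a toy corpus (scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = [
    "graph layout visual analytics",
    "topic model document clustering",
    "interactive visual exploration of documents",
    "matrix factorization for topic discovery",
]
vec = TfidfVectorizer()
X = vec.fit_transform(docs)                 # term weights per document
nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)                    # document-topic weights
H = nmf.components_                         # topic-term weights

terms = vec.get_feature_names_out()
for k, topic in enumerate(H):
    top = topic.argsort()[-3:][::-1]
    print(f"topic {k}:", [terms[i] for i in top])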
NASA Astrophysics Data System (ADS)
Breil, Marcus; Panitz, Hans-Jürgen
2014-05-01
Climate predictions on decadal timescales constitute a new field of research, closing the gap between short-term and seasonal weather predictions and long-term climate projections. Therefore, the Federal Ministry of Education and Research in Germany (BMBF) has recently funded the research program MiKlip (Mittelfristige Klimaprognosen), which aims to create a model system that can provide reliable decadal climate forecasts. Recent studies have suggested that one region with high potential decadal predictability is West Africa. Therefore, the project DEPARTURE (DEcadal Prediction of African Rainfall and ATlantic HURricanE Activity) was established within the MiKlip program to assess the feasibility and the potential added value of regional decadal climate predictions for West Africa. To quantify the potential decadal climate predictability, a multi-model approach with the three different regional climate models REMO, WRF and COSMO-CLM (CCLM) will be realized. The presented research will contribute to DEPARTURE by performing hindcast ensemble simulations with CCLM, driven by global decadal MPI-ESM-LR simulations. One focus is on the dynamic soil-vegetation-climate interaction on decadal timescales. Recent studies indicate that there are significant feedbacks between the land surface and the atmosphere, which might influence decadal climate variability substantially. To investigate this connection, two different SVATs (the Community Land Model (CLM) and VEG3D) will be coupled with CCLM, replacing TERRA_ML, the standard SVAT implemented in CCLM. In this way, sensitive model parameters shall be identified and the understanding of important processes improved. As a first step, TERRA_ML is substituted by VEG3D, a SVAT developed at IMK-TRO, Karlsruhe, Germany. Compared to TERRA_ML, VEG3D includes an explicit vegetation layer using a big-leaf approach, yielding higher correlations with observations, as shown in previous studies. The coupling of VEG3D with CCLM is performed using the OASIS3-MCT coupling software, developed by CERFACS, Toulouse, France. Results of CCLM simulations using both SVATs are analysed and compared for the DEPARTURE model domain. ERA-Interim-driven CCLM simulations with VEG3D showed better agreement with observational data than simulations with TERRA_ML, especially for densely vegetated areas. This will be demonstrated exemplarily. Additionally, results for MPI-ESM-LR-driven decadal hindcast simulations (1966-1975) are analysed and presented.
CEREF: A hybrid data-driven model for forecasting annual streamflow from a socio-hydrological system
NASA Astrophysics Data System (ADS)
Zhang, Hongbo; Singh, Vijay P.; Wang, Bin; Yu, Yinghao
2016-09-01
Hydrological forecasting is complicated by flow regime alterations in a coupled socio-hydrologic system, encountering increasingly non-stationary, nonlinear and irregular changes, which make decision support difficult for future water resources management. Currently, many hybrid data-driven models, based on the decomposition-prediction-reconstruction principle, have been developed to improve the ability to make predictions of annual streamflow. However, many problems require further investigation, chief among which is that the direction of the trend component decomposed from an annual streamflow series is always difficult to ascertain. In this paper, a hybrid data-driven model was proposed to address this issue, which combined empirical mode decomposition (EMD), radial basis function neural networks (RBFNN), and an external forces (EF) variable, also called the CEREF model. The hybrid model employed EMD for decomposition and RBFNN for intrinsic mode function (IMF) forecasting, and determined future trend component directions by regression with EF as basin water demand, representing the social component in the socio-hydrologic system. The Wuding River basin was considered for the case study, and two standard statistical measures, root mean squared error (RMSE) and mean absolute error (MAE), were used to evaluate the performance of the CEREF model and compare it with other models: the autoregressive (AR) model, RBFNN, and EMD-RBFNN. Results indicated that the CEREF model had lower RMSE and MAE statistics (by 42.8% and 7.6%, respectively) than the other models, and provided a superior alternative for forecasting annual runoff in the Wuding River basin. Moreover, the CEREF model can enlarge the effective intervals of streamflow forecasting compared to the EMD-RBFNN model by introducing the water demand planned by the government to improve long-term prediction accuracy. In addition, we considered the high-frequency component, a frequent subject of concern in EMD-based forecasting, and results showed that removing the high-frequency component is an effective measure to improve forecasting precision; it is suggested for use with the CEREF model for better performance. Finally, the study concluded that the CEREF model can forecast non-stationary annual streamflow change as a co-evolution of hydrologic and social systems with better accuracy, and that removing the high-frequency component can further improve its performance. The CEREF model is beneficial for data-driven hydrologic forecasting in complex socio-hydrologic systems and, as a simple data-driven socio-hydrologic forecasting model, deserves more attention.
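The decomposition-prediction-reconstruction principle itself can be shown in a few lines. In the sketch below a fitted linear trend stands in for the EMD trend component and persistence stands in for the RBFNN component forecasts, so this is only the skeleton of a CEREF-style model, on synthetic data.

# Minimal sketch of decompose-forecast-recombine: split a series into a
# trend and a residual, forecast each separately, and sum the forecasts.
# A linear trend and persistence stand in for EMD and RBFNN here.
import numpy as np

rng = np.random.default_rng(8)
years = np.arange(40)
flow = 100 - 0.8 * years + 5 * np.sin(years / 3) + rng.normal(0, 2, 40)

coef = np.polyfit(years, flow, 1)           # "trend component" (stand-in)
trend = np.polyval(coef, years)
residual = flow - trend                     # remaining component

# Forecast next year: extrapolate the trend, persist the residual.
flow_next = np.polyval(coef, years[-1] + 1) + residual[-1]
print("forecast:", flow_next)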
Velpuri, N.M.; Senay, G.B.; Asante, K.O.
2012-01-01
Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuation. The hydrology and water balance of this lake have not been well understood due to its remote location and the unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations, with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004-2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of the lake's evaporative demand. During the modelling period, Lake Turkana showed seasonal variations of 1-2 m, and the lake level fluctuated by up to 4 m between 1998 and 2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated satellite-driven water balance model for (i) quantitative assessment of the impact of basin development activities on lake levels and (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for the calibration and validation of hydrologic models in ungauged basins.
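The water-balance bookkeeping behind such a model is a simple mass-balance update, sketched below with synthetic inflow, rainfall, and evaporation series; the fixed surface area is an order-of-magnitude assumption, not the study's bathymetry-derived area-level relation.

# Minimal sketch of lake water-balance bookkeeping:
# storage change = inflow + over-lake rainfall - over-lake evaporation.
import numpy as np

months = 24
inflow = np.random.default_rng(1).gamma(2.0, 0.5, months)  # km^3/month
rain = 0.2 * np.ones(months)                               # km^3/month
evap = 0.7 * np.ones(months)                               # km^3/month

lake_area_km2 = 7.0e3            # assumed fixed surface area (rough scale)
storage = np.zeros(months + 1)   # storage anomaly in km^3

for m in range(months):
    storage[m + 1] = storage[m] + inflow[m] + rain[m] - evap[m]

# Convert the km^3 storage anomaly to a level anomaly in meters.
level_anomaly_m = storage * 1e9 / (lake_area_km2 * 1e6)
print(level_anomaly_m[-1])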
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
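As a toy illustration of the hybrid idea, the sketch below compares measured performance against a physics-style expectation curve and flags deviations beyond a data-derived tolerance; the curve, readings, and threshold are all invented for the example.

# Minimal sketch of hybrid FDD logic: physics-based expectation plus a
# data-driven tolerance. Curve and numbers are illustrative only.
import numpy as np

def expected_cop(load_fraction):
    # Stand-in engineering part-load efficiency curve for a chiller.
    return 5.5 * (1 - (load_fraction - 0.8) ** 2)

load = np.array([0.4, 0.6, 0.8, 0.9])
measured_cop = np.array([4.5, 5.2, 5.4, 3.9])   # last reading degraded

residual = expected_cop(load) - measured_cop
tolerance = 0.5                                  # e.g., learned from history
print("fault flags:", residual > tolerance)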
ISPE: A knowledge-based system for fluidization studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all specified goals are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information), and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE to the user, ASPEN, LASER and the C language is shown in Figure 1.
Naqshbandi Hayward, Mariam; Paquette-Warren, Jann; Harris, Stewart B
2016-07-26
Given the dramatic rise and impact of chronic diseases and gaps in care among Indigenous peoples in Canada, a shift from the dominant episodic and responsive healthcare model most common in First Nations communities to one that places emphasis on proactive prevention and chronic disease management is urgently needed. The Transformation of Indigenous Primary Healthcare Delivery (FORGE AHEAD) Program partners with 11 First Nations communities across six provinces in Canada to develop and evaluate community-driven quality improvement (QI) initiatives to enhance chronic disease care. FORGE AHEAD is a 5-year research program (2013-2017) that utilizes a pre-post mixed-methods observational design rooted in participatory research principles to work with communities in developing culturally relevant innovations and improved access to available services. This intensive program incorporates a series of 10 inter-related and progressive program activities designed to foster community-driven initiatives, with type 2 diabetes mellitus as the action disease. Preparatory activities include a national community profile survey, a best practice and policy literature review, and readiness tool development. Community-level intervention activities include community and clinical readiness consultations, development of a diabetes registry and surveillance system, and QI activities. With a focus on capacity building, all community-level activities are driven by trained community members who champion QI initiatives in their community. Program wrap-up activities include readiness tool validation, cost analysis and process evaluation. In collaboration with Health Canada and the Aboriginal Diabetes Initiative, scale-up toolkits will be developed in order to build on lessons learned, tools and methods, and to fuel sustainability and spread of successful innovations. The outcomes of this research program, its related costs and the subsequent policy recommendations will have the potential to significantly affect future policy decisions pertaining to chronic disease care in First Nations communities in Canada. Current ClinicalTrials.gov protocol ID: NCT02234973. Date of registration: July 30, 2014.
A data-driven dynamics simulation framework for railway vehicles
NASA Astrophysics Data System (ADS)
Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2018-03-01
The finite element (FE) method is essential for simulating vehicle dynamics with fine detail, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in large deformations undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when a highly nonlinear dynamic process is involved. To retain the advantages of both methods, this paper proposes a data-driven simulation framework for the dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be formulated by a surrogate element (or surrogate elements) to replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into an MB model. The framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model to build surrogate elements can greatly reduce simulation time without sacrificing accuracy.
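The surrogate-element idea can be sketched with NumPy's Legendre utilities: fit a polynomial to force-response training data, then call the cheap fit inside the multi-body loop. The stiffening-spring "FE element" below is synthetic, as are the displacement range and polynomial degree.

# Minimal sketch: a Legendre polynomial regression surrogate for an
# expensive force-response element, fitted to training data.
import numpy as np
from numpy.polynomial import legendre

def fe_element_force(x):
    # Stand-in for force-displacement data from FE simulation.
    return 1e3 * x + 5e4 * x**3

x_train = np.linspace(-0.05, 0.05, 50)   # displacements in training data
f_train = fe_element_force(x_train)

coeffs = legendre.legfit(x_train, f_train, deg=5)

def surrogate_force(x):
    # Cheap replacement callable, usable inside an MB time-stepping loop.
    return legendre.legval(x, coeffs)

print(surrogate_force(0.03), fe_element_force(0.03))  # should agree closely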
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.
The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine-learning-based, data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as examples of a widely used data-driven classification/modeling strategy.
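As a concrete instance of the models named above, here is a Hidden Markov Model forward pass in plain NumPy; the two hidden states, emission symbols, and probabilities are toy values rather than a cybersecurity data set.

# Minimal sketch: HMM forward algorithm, computing the probability of an
# observation sequence under toy transition/emission parameters.
import numpy as np

A = np.array([[0.9, 0.1],    # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],    # emission probabilities for each state
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])    # initial state distribution

obs = [0, 1, 1, 0]           # observed symbol sequence

alpha = pi * B[:, obs[0]]    # forward variable at t = 0
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

print("P(observations) =", alpha.sum())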
Design and experiment of data-driven modeling and flutter control of a prototype wing
NASA Astrophysics Data System (ADS)
Lum, Kai-Yew; Xu, Cai-Lin; Lu, Zhenbo; Lai, Kwok-Leung; Cui, Yongdong
2017-06-01
This paper presents an approach for data-driven modeling of aeroelasticity and its application to flutter control design of a wind-tunnel wing model. Modeling is centered on system identification of unsteady aerodynamic loads using computational fluid dynamics data, and adopts a nonlinear multivariable extension of the Hammerstein-Wiener system. The formulation is in modal coordinates of the elastic structure, and yields a reduced-order model of the aeroelastic feedback loop that is parametrized by airspeed. Flutter suppression is thus cast as a robust stabilization problem over uncertain airspeed, for which a low-order H∞ controller is computed. The paper discusses in detail parameter sensitivity and observability of the model, the former to justify the chosen model structure, and the latter to provide a criterion for physical sensor placement. Wind tunnel experiments confirm the validity of the modeling approach and the effectiveness of the control design.
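To make the identification step concrete, the sketch below fits a Hammerstein-type model (a static polynomial nonlinearity followed by linear dynamics), a simplified cousin of the Hammerstein-Wiener structure used in the paper; the data, model orders, and true coefficients are synthetic assumptions.

# Minimal sketch: Hammerstein-type identification via least squares on an
# ARX regression with polynomial input terms. Data are simulated.
import numpy as np

rng = np.random.default_rng(7)
N = 400
u = rng.normal(size=N)
v = u + 0.5 * u**3                          # unknown static nonlinearity
y = np.zeros(N)
for t in range(1, N):
    y[t] = 0.7 * y[t - 1] + 0.3 * v[t - 1]  # unknown linear dynamics
y += 0.01 * rng.normal(size=N)              # measurement noise

# Regress y[t] on y[t-1] and polynomial terms of u[t-1].
Phi = np.column_stack([y[:-1], u[:-1], u[:-1] ** 2, u[:-1] ** 3])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print(theta)  # expected near [0.7, 0.3, 0.0, 0.15]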
Consideration in selecting crops for the human-rated life support system: a Linear Programming model
NASA Technical Reports Server (NTRS)
Wheeler, E. F.; Kossowski, J.; Goto, E.; Langhans, R. W.; White, G.; Albright, L. D.; Wilcox, D.; Henninger, D. L. (Principal Investigator)
1996-01-01
A Linear Programming model has been constructed which aids in selecting appropriate crops for CELSS (Controlled Environment Life Support System) food production. A team of Controlled Environment Agriculture (CEA) faculty, staff, graduate students and invited experts representing more than a dozen disciplines provided a wide range of expertise in developing the model and the crop production program. The model incorporates nutritional content and controlled-environment based production yields of carefully chosen crops into a framework where a crop mix can be constructed to suit the astronauts' needs. The crew's nutritional requirements can be adequately satisfied with only a few crops (assuming vitamin and mineral supplements are provided), but this will not be satisfactory from a culinary standpoint. This model is flexible enough that taste- and variety-driven food choices can be built into the model.
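The structure of such an LP is easy to sketch with SciPy: minimize total growing area subject to nutrient constraints. The crops, per-area yields, and requirements below are illustrative placeholders, not the CELSS team's data.

# Minimal sketch of an LP crop-selection model: minimize growing area
# subject to meeting daily nutrient targets. Values are hypothetical.
import numpy as np
from scipy.optimize import linprog

# Decision variables: growing area (m^2) of wheat, potato, soybean.
c = np.ones(3)                                 # minimize total area

A_nutrient = np.array([[55.0, 70.0, 40.0],     # kcal per m^2 per day
                       [2.0, 1.5, 3.5]])       # g protein per m^2 per day
b_required = np.array([2800.0, 90.0])          # per crew member per day

# linprog handles <= constraints, so negate to express "at least".
res = linprog(c, A_ub=-A_nutrient, b_ub=-b_required,
              bounds=[(0, None)] * 3)
print(res.x, "m^2 per crop;", res.x.sum(), "m^2 total")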
NASA Technical Reports Server (NTRS)
Celaya, Jose; Saxena, Abhinav; Saha, Sankalita; Goebel, Kai F.
2011-01-01
An approach for predicting the remaining useful life of power MOSFET (metal-oxide-semiconductor field-effect transistor) devices has been developed. Power MOSFETs are semiconductor switching devices that are instrumental in electronics equipment such as that used in the operation and control of modern aircraft and spacecraft. The MOSFETs examined here were aged under thermal overstress in a controlled experiment, and continuous performance degradation data were collected from the accelerated aging experiment. Die-attach degradation was determined to be the primary failure mode. The collected run-to-failure data were analyzed, and it was revealed that ON-state resistance increased as the die-attach degraded under high thermal stresses. Results from finite element simulation analysis support the observations from the experimental data. Data-driven and model-based prognostics algorithms were investigated, with ON-state resistance used as the primary precursor-of-failure feature. A Gaussian process regression algorithm was explored as an example of a data-driven technique, and an extended Kalman filter and a particle filter were used as examples of model-based techniques. Both methods were able to provide valid results. Prognostic performance metrics were employed to evaluate and compare the algorithms.
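A stripped-down version of the data-driven idea is to fit a degradation trend to the precursor and extrapolate it to a failure threshold. The sketch below does this with an exponential fit on synthetic ON-resistance data; the paper's Gaussian process, extended Kalman filter, and particle filter algorithms are considerably more sophisticated, and the threshold here is an assumption.

# Minimal sketch: extrapolate an exponential degradation trend in
# ON-resistance to a failure threshold to estimate remaining useful life.
import numpy as np

t = np.linspace(0, 100, 60)   # aging time, arbitrary units
r_on = 0.05 * np.exp(0.005 * t) \
       + np.random.default_rng(2).normal(0, 1e-3, t.size)

# Fit log(R_on) = log(a) + b*t with linear least squares.
b, log_a = np.polyfit(t, np.log(r_on), 1)

threshold = 2.5 * np.exp(log_a)      # assumed failure: 2.5x initial R_on
t_fail = (np.log(threshold) - log_a) / b
print("estimated RUL:", t_fail - t[-1])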
Dynamically adaptive data-driven simulation of extreme hydrological flows
NASA Astrophysics Data System (ADS)
Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint
2018-02-01
Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
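As an illustration of the ensemble-based assimilation step, here is a minimal stochastic ensemble Kalman filter analysis update in NumPy; the state dimension, observation operator, and error statistics are toy assumptions, and the coupling to adaptive mesh refinement is not represented.

# Minimal sketch: stochastic EnKF analysis step, nudging forecast ensemble
# members toward observations using ensemble-estimated covariances.
import numpy as np

rng = np.random.default_rng(3)
n_state, n_obs, n_ens = 50, 5, 20

X = rng.normal(size=(n_state, n_ens))            # forecast ensemble
H = np.zeros((n_obs, n_state))                   # observation operator
H[np.arange(n_obs), np.arange(0, n_state, 10)] = 1.0
R = 0.5 * np.eye(n_obs)                          # observation error cov.
y = rng.normal(size=n_obs)                       # observations

A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
HX = H @ X
HA = HX - HX.mean(axis=1, keepdims=True)

P_hh = HA @ HA.T / (n_ens - 1) + R               # innovation covariance
P_xh = A @ HA.T / (n_ens - 1)                    # cross covariance
K = P_xh @ np.linalg.inv(P_hh)                   # Kalman gain

# Perturb observations per member (the "stochastic" in stochastic EnKF).
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
X_analysis = X + K @ (Y - HX)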
Smooth Particle Hydrodynamics-based Wind Representation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Smith, Curtis; Hess, Stephen
2016-12-01
As a result of the 2011 accident at the Fukushima Dai-ichi NPP and other operational NPP experience, there is an identified need to better characterize and evaluate the potential impacts of externally generated hazards on NPP safety. Due to the ubiquitous occurrence of high winds around the world and the possible extreme magnitude of the hazard that has been observed, the assessment of the impact of the high-winds hazard has been identified as an important activity by both NPP owner-operators and regulatory authorities. However, recent experience obtained from the conduct of high-winds risk assessments indicates that such activities have been both labor-intensive and expensive to perform. Additionally, the existing suite of methods and tools to conduct such assessments (which were developed decades ago) does not make use of modern computational architectures (e.g., parallel processing, object-oriented programming techniques, or simple user interfaces) or methods (e.g., efficient and robust numerical-solution schemes). As a result, the current suite of methods and tools will rapidly become obsolete. Physics-based 3D simulation methods can provide information to assist in the RISMC PRA methodology. This research is intended to determine what benefits SPH methods could bring to high-winds simulations for the purposes of assessing their potential impact on NPP safety. The initial investigation has determined that SPH can simulate key areas of high-wind events with reasonable accuracy compared to other methods. Some problems, such as simulation voids, need to be addressed, but possible solutions have been identified and will be tested in continued work. This work also demonstrated that SPH simulations can provide a means for simulating debris movement; however, further investigation is needed into the capability to determine the impact of high winds and of wind-driven debris that lead to SSC failures. SPH simulations alone would be limited in size and computation time. An advanced method of combining results from grid-based methods with SPH through a data-driven model is proposed. This method could allow for more accurate simulation of particle movement near rigid bodies even with larger SPH particle sizes. If successful, the data-driven model would eliminate the need for an SPH turbulence model and increase the simulation domain size. Continued research beyond the scope of this project will be needed in order to determine the viability of a data-driven model.
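The basic SPH operation referenced above, density summation over a smoothing kernel, fits in a few lines; the particle count, masses, and smoothing length below are toy values.

# Minimal sketch: SPH density summation with the standard 3D cubic
# spline kernel. Particle configuration is synthetic.
import numpy as np

def cubic_spline_kernel(r, h):
    # Cubic spline kernel with support radius h, normalized in 3D.
    q = r / h
    sigma = 8.0 / (np.pi * h**3)
    w = np.where(q <= 0.5,
                 1 - 6 * q**2 + 6 * q**3,
                 np.where(q <= 1.0, 2 * (1 - q) ** 3, 0.0))
    return sigma * w

rng = np.random.default_rng(9)
pos = rng.uniform(0, 1, size=(200, 3))   # particle positions (m)
mass = np.full(200, 1.25e-3)             # particle masses (kg), assumed
h = 0.1                                  # smoothing length (m)

# Density at each particle: rho_i = sum_j m_j * W(|r_i - r_j|, h).
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
rho = (mass[None, :] * cubic_spline_kernel(dist, h)).sum(axis=1)
print(rho.mean())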
US hydropower resource assessment for Hawaii
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francfort, J.E.
1996-09-01
The US DOE is developing an estimate of the undeveloped hydropower potential in the US. The Hydropower Evaluation Software (HES) is a computer model developed by INEL for this purpose. HES measures the undeveloped hydropower resources available in the US, using uniform criteria for measurement. The software was tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the PC user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes, and generate reports. This report describes the resource assessment results for the State of Hawaii.
Austin, Åsa N.; Hansen, Joakim P.; Donadi, Serena; Eklöf, Johan S.
2017-01-01
Field surveys often show that high water turbidity limits cover of aquatic vegetation, while many small-scale experiments show that vegetation can reduce turbidity by decreasing water flow, stabilizing sediments, and competing with phytoplankton for nutrients. Here we bridged these two views by exploring the direction and strength of causal relationships between aquatic vegetation and turbidity across seasons (spring and late summer) and spatial scales (local and regional), using causal modeling based on data from a field survey along the central Swedish Baltic Sea coast. The two best-fitting regional-scale models both suggested that in spring, high cover of vegetation reduces water turbidity. In summer, the relationships differed between the two models: in the first model, high vegetation cover reduced turbidity, while in the second model, reduction of summer turbidity by high vegetation cover in spring had a positive effect on summer vegetation, which suggests a positive feedback of vegetation on itself. Nitrogen load had a positive effect on turbidity in both seasons, which was comparable in strength to the effect of vegetation on turbidity. To assess whether the effect of vegetation was primarily caused by sediment stabilization or a reduction of phytoplankton, we also tested models where turbidity was replaced by phytoplankton fluorescence or sediment-driven turbidity. The best-fitting regional-scale models suggested that high sediment-driven turbidity in spring reduces vegetation cover in summer, which in turn has a negative effect on sediment-driven turbidity in summer, indicating a potential positive feedback of sediment-driven turbidity on itself. Using data at the local scale, few relationships were significant, likely due to the influence of unmeasured variables and/or spatial heterogeneity. In summary, causal modeling based on data from a large-scale field survey suggested that aquatic vegetation can reduce turbidity at regional scales, and that high vegetation cover vs. high sediment-driven turbidity may represent two self-enhancing, alternative states of shallow bay ecosystems. PMID:28854185
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
Implementation of transportation asset management in Grandview, Missouri : final report.
DOT National Transportation Integrated Search
2017-02-01
The successful implementation of transportation asset management (TAM) by local governments facilitates the optimization of limited resources. The use of a data-driven TAM program helps to identify and prioritize needs, identify and dedicate resource...
Measuring and Understanding Authentic Youth Engagement: The Youth-Adult Partnership Rubric
ERIC Educational Resources Information Center
Wu, Heng-Chieh Jamie; Kornbluh, Mariah; Weiss, John; Roddy, Lori
2016-01-01
Commonly described as youth-led or youth-driven, the youth-adult partnership (Y-AP) model has gained increasing popularity in out-of-school time (OST) programs in the past two decades (Larson, Walker, & Pearce, 2005; Zeldin, Christens, & Powers, 2013). The Y-AP model is defined as "the practice of (a) multiple youth and multiple…
From Status to Power: New Models at the Intersection of Two Theories
ERIC Educational Resources Information Center
Thye, Shane R.; Willer, David; Markovsky, Barry
2006-01-01
The study of group processes has benefited from longstanding programs of theory-driven research on status and power. The present work constructs a bridge between two formal theories of status and power: Status Characteristics Theory and Network Exchange Theory. Two theoretical models, one for "status value" and one for "status influence,"…
Evaluation of regional climate simulations for air quality modelling purposes
NASA Astrophysics Data System (ADS)
Menut, Laurent; Tripathi, Om P.; Colette, Augustin; Vautard, Robert; Flaounas, Emmanouil; Bessagnet, Bertrand
2013-05-01
In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCM) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that is to be added to all other types of uncertainties in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalyses-forced with GCM-forced regional climate simulations. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation by GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalyses-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.
Data Mining for Understanding and Improving Decision-Making Affecting Ground Delay Programs
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Wang, Yao Xun; Sridhar, Banavar
2013-01-01
The continuous growth in the demand for air transportation results in an imbalance between airspace capacity and traffic demand. The airspace capacity of a region depends on the ability of the system to maintain safe separation between aircraft in the region. In addition to growing demand, airspace capacity is severely limited by convective weather. During such conditions, traffic managers at the FAA's Air Traffic Control System Command Center (ATCSCC) and dispatchers at various Airline Operations Centers (AOCs) collaborate to mitigate the demand-capacity imbalance caused by weather. The end result is the implementation of a set of Traffic Flow Management (TFM) initiatives such as ground delay programs, reroute advisories, flow metering, and ground stops. Data mining is the automated process of analyzing large sets of data and then extracting patterns in the data. Data mining tools are capable of predicting behaviors and future trends, allowing an organization to benefit from past experience in making knowledge-driven decisions. The work reported in this paper is focused on ground delay programs. Data mining algorithms have the potential to develop associations between weather patterns and the corresponding ground delay program responses. If successful, they can be used to improve and standardize TFM decisions, resulting in better predictability of traffic flows on days with reliable weather forecasts. The approach here seeks to develop a set of data mining and machine learning models and apply them to historical archives of weather observations and forecasts and TFM initiatives to determine the extent to which such models can predict and explain the observed traffic flow behaviors.
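The kind of association such algorithms learn can be sketched with a simple classifier mapping weather features to a ground delay program decision; the features, labels, and decision rule below are synthetic placeholders rather than historical TFM data.

# Minimal sketch: learn weather -> ground delay program associations with
# a decision tree on synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_days = 500
# Hypothetical daily features: convective coverage (%), wind (kt), ceiling (ft).
X = np.column_stack([rng.uniform(0, 60, n_days),
                     rng.uniform(0, 35, n_days),
                     rng.uniform(200, 5000, n_days)])
# Synthetic stand-in for the historical "GDP issued?" label.
y = ((X[:, 0] > 30) | (X[:, 2] < 1000)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))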
Blagosklonny, Mikhail V
2013-06-15
If life were created by intelligent design, we would indeed age from accumulation of molecular damage. Repair is costly and limited by energetic resources, and we would allocate resources rationally. But, albeit elegant, this design is fictional. Instead, nature blindly selects for short-term benefits of robust developmental growth. "Quasi-programmed" by the blind watchmaker, aging is a wasteful and aimless continuation of developmental growth, driven by nutrient-sensing, growth-promoting signaling pathways such as MTOR (mechanistic target of rapamycin). A continuous post-developmental activity of such gerogenic pathways leads to hyperfunctions (aging), loss of homeostasis, age-related diseases, non-random organ damage and death. This model is consistent with a view that (1) soma is disposable, (2) aging and menopause are not programmed and (3) accumulation of random molecular damage is not a cause of aging as we know it.
Zhou, Ping; Guo, Dongwei; Wang, Hong; Chai, Tianyou
2017-09-29
Optimal operation of an industrial blast furnace (BF) ironmaking process largely depends on a reliable measurement of molten iron quality (MIQ) indices, which is not feasible with conventional sensors. This paper proposes a novel data-driven robust modeling method for the online estimation and control of MIQ indices. First, a nonlinear autoregressive exogenous (NARX) model is constructed for the MIQ indices to completely capture the nonlinear dynamics of the BF process. Then, considering that standard least-squares support vector regression (LS-SVR) cannot directly cope with the multioutput problem, a multitask transfer learning approach is proposed to design a novel multioutput LS-SVR (M-LS-SVR) for the learning of the NARX model. Furthermore, a novel M-estimator is proposed to reduce the interference of outliers and improve the robustness of the M-LS-SVR model. Since the weights of different outlier data are properly given by the weight function, their corresponding contributions to modeling can be properly distinguished, and thus a robust modeling result can be achieved. Finally, a novel multiobjective evaluation index of modeling performance is developed by comprehensively considering the root-mean-square error of modeling and the correlation coefficient on trend fitting, based on which the nondominated sorting genetic algorithm II is used to globally optimize the model parameters. Both experiments using industrial data and industrial applications illustrate that the proposed method can efficiently eliminate the adverse effects caused by fluctuations in BF process data, indicating stronger robustness and higher accuracy. Moreover, control testing shows that the developed model can be well applied to realize data-driven control of the BF process.
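Since LS-SVR solves a ridge-type problem in a kernel feature space, kernel ridge regression makes a close, readily available stand-in for a basic multioutput fit, sketched below; the multitask coupling and the robust M-estimator reweighting of the paper are not reproduced, and the inputs and outputs are synthetic stand-ins for lagged process variables and MIQ indices.

# Minimal sketch: multioutput kernel ridge regression as an LS-SVR-like
# baseline. Data are synthetic placeholders.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 6))            # e.g., lagged process variables
Y = np.column_stack([
    X @ rng.normal(size=6) + 0.1 * rng.normal(size=300),
    np.tanh(X[:, 0]) + 0.1 * rng.normal(size=300),
])                                       # e.g., two MIQ indices

model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.2).fit(X, Y)
print(model.predict(X[:3]))              # two estimates per sample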
ERIC Educational Resources Information Center
Oldham, Dale Smith
2012-01-01
Educators and policymakers have been concerned about the problem of early literacy performance for many decades. Despite educational reform to increase standards, many children consistently fail to read at levels that enable them to compete globally. The purpose of this study was to provide a data-driven program evaluation of a reading…
The Alliance for Innovation in Maternal Health Care: A Way Forward.
Mahoney, Jeanne
2018-06-01
The Alliance for Innovation in Maternal Health is a program supported by the Health Resources and Services Administration to reduce maternal mortality and severe maternal morbidity in the United States. The program develops bundles of evidence-based action steps for birth facilities to adapt. Progress is monitored at the facility, state, and national levels to foster data-driven quality improvement efforts.
NASA Astrophysics Data System (ADS)
Agrawal, Ankit; Choudhary, Alok
2016-05-01
Our ability to collect "big data" has greatly surpassed our capability to analyze it, underscoring the emergence of the fourth paradigm of science, which is data-driven discovery. The need for data informatics is also emphasized by the Materials Genome Initiative (MGI), further boosting the emerging field of materials informatics. In this article, we look at how data-driven techniques are playing a big role in deciphering processing-structure-property-performance relationships in materials, with illustrative examples of both forward models (property prediction) and inverse models (materials discovery). Such analytics can significantly reduce time-to-insight and accelerate cost-effective materials discovery, which is the goal of MGI.
The Data-Driven Approach to Spectroscopic Analyses
NASA Astrophysics Data System (ADS)
Ness, M.
2018-01-01
I review the data-driven approach to spectroscopy, The Cannon, a method for deriving fundamental diagnostics of galaxy formation, namely precise chemical compositions and stellar ages, across the many stellar surveys that are mapping the Milky Way. With The Cannon, the abundances and stellar parameters from the multitude of stellar surveys can be placed directly on the same scale, using stars in common between the surveys. Furthermore, the information that resides in the data can be fully extracted; this has resulted in higher-precision stellar parameters and abundances being delivered from spectroscopic data, and it has opened up new avenues in galactic archeology, for example in the determination of ages for red giant stars across the Galactic disk. Coupled with Gaia distances, proper motions, and derived orbit families, the stellar age and individual abundance information delivered at the precision obtained with the data-driven approach provides very strong constraints on the evolution and birthplace of stars in the Milky Way. I review the role of data-driven spectroscopy as we enter an era in which we have both the data and the tools to build the ultimate conglomerate of galactic information, and I highlight further applications of data-driven models in the coming decade.
A Data-driven Approach for Forecasting Next-day River Discharge
NASA Astrophysics Data System (ADS)
Sharif, H. O.; Billah, K. S.
2017-12-01
This study focuses on evaluating the performance of the Soil and Water Assessment Tool (SWAT) eco-hydrological model, a simple Auto-Regressive with eXogenous input (ARX) model, and a Gene Expression Programming (GEP)-based model in one-day-ahead forecasting of discharge of a subtropical basin (the upper Kentucky River Basin). The three models were calibrated with daily flow at a US Geological Survey (USGS) stream gauging station not affected by flow regulation for the period 2002-2005. The calibrated models were then validated at the same gauging station, as well as at another USGS gauge 88 km downstream, for the period 2008-2010. The results suggest that the simple models outperform a sophisticated hydrological model, with GEP having the advantage of generating functional relationships that allow scientific investigation of the complex nonlinear interrelationships among input variables. Unlike SWAT, GEP and, to some extent, ARX are less sensitive to the length of the calibration time series and do not require a spin-up period.
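As a hedged illustration of the ARX idea (not the authors' configuration; the lag orders and the synthetic rainfall input below are assumptions), one-day-ahead discharge can be fit by ordinary least squares on lagged values:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
rain = rng.gamma(2.0, 1.0, T)                 # exogenous input (illustrative "rainfall")
q = np.zeros(T)                               # discharge series
for t in range(T - 1):                        # synthetic truth with ARX(1,1) dynamics
    q[t + 1] = 0.8 * q[t] + 0.5 * rain[t] + rng.normal(0.0, 0.1)

# ARX(1,1) fit: q[t+1] ~ a*q[t] + b*rain[t] + c, solved by ordinary least squares
X = np.column_stack([q[:-1], rain[:-1], np.ones(T - 1)])
y = q[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
rmse = np.sqrt(np.mean((X @ coef - y) ** 2))
print("a, b, c =", np.round(coef, 3), " one-day-ahead RMSE =", round(rmse, 3))
```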
Using connectome-based predictive modeling to predict individual behavior from brain connectivity
Shen, Xilin; Finn, Emily S.; Scheinost, Dustin; Rosenberg, Monica D.; Chun, Marvin M.; Papademetris, Xenophon; Constable, R Todd
2017-01-01
Neuroimaging is a fast-developing research area in which anatomical and functional images of human brains are collected using techniques such as functional magnetic resonance imaging (fMRI), diffusion tensor imaging (DTI), and electroencephalography (EEG). Technical advances and large-scale datasets have allowed for the development of models capable of predicting individual differences in traits and behavior using brain connectivity measures derived from neuroimaging data. Here, we present connectome-based predictive modeling (CPM), a data-driven protocol for developing predictive models of brain-behavior relationships from connectivity data using cross-validation. The protocol includes the following steps: 1) feature selection, 2) feature summarization, 3) model building, and 4) assessment of prediction significance. We also include suggestions for visualizing the most predictive features (i.e., brain connections). The final result should be a generalizable model that takes brain connectivity data as input and generates predictions of behavioral measures in novel subjects, accounting for a significant amount of the variance in these measures. It has been demonstrated that the CPM protocol performs equivalently to or better than most existing approaches in brain-behavior prediction. Moreover, because CPM focuses on linear modeling and a purely data-driven approach, neuroscientists with limited or no experience in machine learning or optimization will find it easy to implement the protocol. Depending on the volume of data to be processed, the protocol can take 10–100 minutes for model building, 1–48 hours for permutation testing, and 10–20 minutes for visualization of results. PMID:28182017
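The four steps translate into little code. Below is a hedged sketch on synthetic data (the edge matrix, behavior vector, p-value threshold, and positive-network-only choice are illustrative assumptions, and step 4's permutation test is reduced to a single correlation):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subj, n_edges = 100, 500
conn = rng.normal(size=(n_subj, n_edges))                     # edge strengths per subject
behav = conn[:, :10].sum(axis=1) + rng.normal(0, 1, n_subj)   # behavior tied to 10 edges

preds = np.empty(n_subj)
for i in range(n_subj):                        # leave-one-out cross-validation
    tr = np.arange(n_subj) != i
    r = np.empty(n_edges); p = np.empty(n_edges)
    for j in range(n_edges):
        r[j], p[j] = stats.pearsonr(conn[tr, j], behav[tr])
    mask = (p < 0.01) & (r > 0)                # 1) feature selection (positive edges)
    summary = conn[:, mask].sum(axis=1)        # 2) feature summarization
    slope, icept = np.polyfit(summary[tr], behav[tr], 1)  # 3) model building
    preds[i] = slope * summary[i] + icept      # predict the held-out subject

rr, pp = stats.pearsonr(preds, behav)          # 4) assess prediction significance
print(f"predicted-observed r = {rr:.2f} (p = {pp:.1g})")
```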
Data-driven outbreak forecasting with a simple nonlinear growth model.
Lega, Joceline; Brown, Heidi E
2016-12-01
Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
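EpiGro's exact formulation is not reproduced here, but the flavor of such parameter-light, data-driven forecasting can be shown with a generic logistic-growth fit to noisy cumulative case counts (the logistic form and all data below are illustrative assumptions, not the authors' method):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Cumulative cases under simple logistic growth; K is the final outbreak size."""
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.arange(60.0)                            # days since outbreak start
rng = np.random.default_rng(2)
cases = logistic(t, 5000, 0.25, 30) + rng.normal(0, 50, t.size)   # noisy reports

(K, r, t0), _ = curve_fit(logistic, t, cases, p0=[cases.max(), 0.1, t.mean()])
print(f"estimated final size ~ {K:.0f}, peak incidence near day {t0:.0f}")
```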
A software toolbox for robotics
NASA Technical Reports Server (NTRS)
Sanwal, J. C.
1985-01-01
A method for programming cooperating manipulators, guided by a geometric description of the task to be performed, is given. This requires a suitable language together with a method for describing the workplace and the objects in it in geometric terms. A task-level command language and its implementation for concurrently driven multiple robot arms are described. The language is suitable for driving a cell in which manipulators, end effectors, and sensors are controlled by their own dedicated processors. These processors can communicate with each other through a communication network. A mechanism for keeping track of the history of the commands already executed allows the command language for the manipulators to be event-driven. A frame-based world modeling system is utilized to describe the objects in the work environment and any relationships that hold between these objects. This system provides a versatile tool for managing information about the world model. Default actions normally needed are invoked when the database is updated or accessed. Most of the first-level error recovery is also invoked by the database by utilizing the concept of demons. The package can be utilized to generate task-level commands in a problem solver or a planner.
Data-driven Inference and Investigation of Thermosphere Dynamics and Variations
NASA Astrophysics Data System (ADS)
Mehta, P. M.; Linares, R.
2017-12-01
This paper presents a methodology for data-driven inference and investigation of thermosphere dynamics and variations. The approach uses data-driven modal analysis to extract the most energetic modes of variation for neutral thermospheric species using proper orthogonal decomposition, where the time-independent modes or basis represent the dynamics and the time-dependent coefficients or amplitudes represent the model parameters. The data-driven modal analysis approach, combined with sparse, discrete observations, is used to infer amplitudes for the dynamic modes and to calibrate the energy content of the system. In this work, two different data types, namely the number density measurements from TIMED/GUVI and the mass density measurements from CHAMP/GRACE, are simultaneously ingested for an accurate and self-consistent specification of the thermosphere. The assimilation process is achieved with a nonlinear least squares solver and allows estimation/tuning of the model parameters or amplitudes rather than the driver. In this work, we use the Naval Research Lab's MSIS model to derive the most energetic modes for six different species: He, O, N2, O2, H, and N. We examine the dominant drivers of variations for helium in MSIS and observe that seasonal latitudinal variation accounts for about 80% of the dynamic energy, with a strong preference of helium for the winter hemisphere. We also observe enhanced helium presence near the poles at GRACE altitudes during periods of low solar activity (Feb 2007), as previously deduced. We will also examine the storm-time response of helium derived from observations. The results are expected to be useful in tuning/calibration of the physics-based models.
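The core modal-analysis step is compact. Below is a hedged sketch under stated assumptions: a synthetic field stands in for MSIS output, and an SVD supplies the proper-orthogonal-decomposition modes and amplitudes described above.

```python
import numpy as np

# Synthetic "density" snapshots: rows are times, columns are grid points.
t = np.linspace(0, 2 * np.pi, 200)[:, None]
x = np.linspace(0, 1, 500)[None, :]
field = np.sin(t) * np.sin(np.pi * x) + 0.3 * np.cos(3 * t) * np.sin(3 * np.pi * x)
field += 0.01 * np.random.default_rng(3).normal(size=field.shape)

# POD via SVD of the mean-removed snapshot matrix.
U, s, Vt = np.linalg.svd(field - field.mean(axis=0), full_matrices=False)
modes = Vt[:2]                # time-independent spatial modes (the basis)
amps = U[:, :2] * s[:2]       # time-dependent amplitudes (the model parameters)
energy = (s[:2] ** 2).sum() / (s ** 2).sum()
print(f"energy captured by the first 2 modes: {energy:.3f}")
```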
Some Recent Developments in Turbulence Closure Modeling
NASA Astrophysics Data System (ADS)
Durbin, Paul A.
2018-01-01
Turbulence closure models are central to a good deal of applied computational fluid dynamical analysis. Closure modeling endures as a productive area of research. This review covers recent developments in elliptic relaxation and elliptic blending models, unified rotation and curvature corrections, transition prediction, hybrid simulation, and data-driven methods. The focus is on closure models in which transport equations are solved for scalar variables, such as the turbulent kinetic energy, a timescale, or a measure of anisotropy. Algebraic constitutive representations are reviewed for their role in relating scalar closures to the Reynolds stress tensor. Seamless and nonzonal methods, which invoke a single closure model, are reviewed, especially detached eddy simulation (DES) and adaptive DES. Other topics surveyed include data-driven modeling and intermittency and laminar fluctuation models for transition prediction. The review concludes with an outlook.
McFadden, David G.; Politi, Katerina; Bhutkar, Arjun; Chen, Frances K.; Song, Xiaoling; Pirun, Mono; Santiago, Philip M.; Kim-Kiselak, Caroline; Platt, James T.; Lee, Emily; Hodges, Emily; Rosebrock, Adam P.; Bronson, Roderick T.; Socci, Nicholas D.; Hannon, Gregory J.; Jacks, Tyler; Varmus, Harold
2016-01-01
Genetically engineered mouse models (GEMMs) of cancer are increasingly being used to assess putative driver mutations identified by large-scale sequencing of human cancer genomes. To accurately interpret experiments that introduce additional mutations, an understanding of the somatic genetic profile and evolution of GEMM tumors is necessary. Here, we performed whole-exome sequencing of tumors from three GEMMs of lung adenocarcinoma driven by mutant epidermal growth factor receptor (EGFR), mutant Kirsten rat sarcoma viral oncogene homolog (Kras), or overexpression of MYC proto-oncogene. Tumors from EGFR- and Kras-driven models exhibited, respectively, 0.02 and 0.07 nonsynonymous mutations per megabase, a dramatically lower average mutational frequency than observed in human lung adenocarcinomas. Tumors from models driven by strong cancer drivers (mutant EGFR and Kras) harbored few mutations in known cancer genes, whereas tumors driven by MYC, a weaker initiating oncogene in the murine lung, acquired recurrent clonal oncogenic Kras mutations. In addition, although EGFR- and Kras-driven models both exhibited recurrent whole-chromosome DNA copy number alterations, the specific chromosomes altered by gain or loss were different in each model. These data demonstrate that GEMM tumors exhibit relatively simple somatic genotypes compared with human cancers of a similar type, making these autochthonous model systems useful for additive engineering approaches to assess the potential of novel mutations on tumorigenesis, cancer progression, and drug sensitivity. PMID:27702896
Baldi, Pierre
2011-12-27
A response is presented to sentiments expressed in "Data-Driven High-Throughput Prediction of the 3-D Structure of Small Molecules: Review and Progress. A Response from The Cambridge Crystallographic Data Centre", recently published in the Journal of Chemical Information and Modeling, (1) which may give readers a misleading impression regarding significant impediments to scientific research posed by the CCDC.
NASA Astrophysics Data System (ADS)
Kim, J. B.; Kim, Y.
2017-12-01
This study investigates how the water and carbon fluxes, as well as the vegetation distribution, on the Korean Peninsula would vary with climate change. Ecosystem Demography (ED) Model version 2 (ED2) is used in this study; it is an integrated terrestrial biosphere model that utilizes a set of size- and age-structured partial differential equations to track the changing structure and composition of the plant canopy. Using the vegetation distribution data of Jeju Island, located at the southern end of the Korean Peninsula, ED2 is set up and driven for the past 10 years. The results of ED2 are then evaluated and adjusted with observed forestry data (growth and mortality) and with flux tower and MODIS satellite data (evapotranspiration (ET) and gross primary production (GPP)). The adjusted ED2 is used to simulate the water and carbon fluxes as well as vegetation dynamics on the Korean Peninsula for the historical period, with the model evaluated against the MODIS satellite data. Finally, the climate scenarios of RCP 2.6 and 6.0 are used to predict the fluxes and vegetation distribution of the Korean Peninsula in the future. Using this state-of-the-art terrestrial ecosystem model, the study provides a better understanding of the future ecosystem vulnerability of the Korean Peninsula. Acknowledgements: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2015R1C1A2A01054800) and by the Korea Meteorological Administration R&D Program under Grant KMIPA 2015-6180. This work was also supported by the Yonsei University Future-leading Research Initiative of 2015 (2016-22-0061).
Exploration Supply Chain Simulation
NASA Technical Reports Server (NTRS)
2008-01-01
The Exploration Supply Chain Simulation project was chartered by the NASA Exploration Systems Mission Directorate to develop a software tool, with proper data, to quantitatively analyze supply chains for future program planning. The tool is a discrete-event simulation that uses the basic supply chain concepts of planning, sourcing, making, delivering, and returning. This supply chain perspective is combined with other discrete or continuous simulation factors. Discrete resource events (such as launch or delivery reviews) are represented as organizational functional units. Continuous resources (such as civil service or contractor program functions) are defined as enabling functional units. Concepts of fixed and variable costs are included in the model to allow the discrete events to interact with cost calculations. The definition file is intrinsic to the model, but a blank start can be initiated at any time. The current definition file is an Orion Ares I crew launch vehicle. Parameters stretch from Kennedy Space Center across and into other program entities (Michoud Assembly Facility, Alliant Techsystems, Stennis Space Center, Johnson Space Center, etc.), though these will only gain detail as the file continues to evolve. The Orion Ares I file definition in the tool continues to evolve, and analysis from this tool is expected in 2008. This is the first application of such business-driven modeling to a joint NASA/government and aerospace-contractor endeavor.
Exploring Cloud Computing Tools to Enhance Team-Based Problem Solving for Challenging Behavior
ERIC Educational Resources Information Center
Johnson, LeAnne D.
2017-01-01
Data-driven decision making is central to improving success of children. Actualizing the use of data is challenging when addressing the social, emotional, and behavioral needs of children across different types of early childhood programs (i.e., early childhood special education, early childhood family education, Head Start, and childcare).…
An Agent Allocation System for the West Virginia University Extension Service
ERIC Educational Resources Information Center
Dougherty, Michael John; Eades, Daniel
2015-01-01
Extension recognizes the importance of data in guiding programming decisions at the local level. However, allocating personnel resources and specializations at the state level is a more complex process. The West Virginia University Extension Service has adopted a data-driven process to determine the number, location, and specializations of county…
A DATA-DRIVEN MODEL FOR SPECTRA: FINDING DOUBLE REDSHIFTS IN THE SLOAN DIGITAL SKY SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsalmantza, P.; Hogg, David W., E-mail: vivitsal@mpia.de
2012-07-10
We present a data-driven method, heteroscedastic matrix factorization, a kind of probabilistic factor analysis, for modeling or performing dimensionality reduction on observed spectra or other high-dimensional data with known but non-uniform observational uncertainties. The method uses an iterative inverse-variance-weighted least-squares minimization procedure to generate a best set of basis functions. The method is similar to principal components analysis (PCA), but with the substantial advantage that it uses measurement uncertainties in a responsible way and accounts naturally for poorly measured and missing data; it models the variance in the noise-deconvolved data space. A regularization can be applied, in the form of a smoothness prior (inspired by Gaussian processes) or a non-negative constraint, without making the method prohibitively slow. Because the method optimizes a justified scalar (related to the likelihood), the basis provides a better fit to the data in a probabilistic sense than any PCA basis. We test the method on Sloan Digital Sky Survey (SDSS) spectra, concentrating on spectra known to contain two redshift components: these are spectra of gravitational lens candidates and massive black hole binaries. We apply a hypothesis test to compare one-redshift and two-redshift models for these spectra, utilizing the data-driven model trained on a random subset of all SDSS spectra. This test confirms 129 of the 131 lens candidates in our sample and all of the known binary candidates, and turns up very few false positives.
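A toy version of the iterative inverse-variance-weighted least-squares factorization, run on synthetic "spectra" with known per-pixel uncertainties (the smoothness prior and non-negative constraint are omitted; dimensions and rank are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
n_spec, n_pix, k = 100, 50, 3
sigma = rng.uniform(0.05, 0.5, size=(n_spec, n_pix))     # known per-pixel uncertainties
X = rng.normal(size=(n_spec, k)) @ rng.normal(size=(k, n_pix))
X += sigma * rng.normal(size=X.shape)                    # heteroscedastic noise
w = 1.0 / sigma**2                                       # inverse-variance weights

A = rng.normal(size=(n_spec, k))                         # per-spectrum coefficients
G = rng.normal(size=(k, n_pix))                          # basis functions
for _ in range(50):                                      # alternating weighted least squares
    for i in range(n_spec):
        A[i] = np.linalg.solve(G @ (w[i][:, None] * G.T), G @ (w[i] * X[i]))
    for j in range(n_pix):
        G[:, j] = np.linalg.solve(A.T @ (w[:, j][:, None] * A), A.T @ (w[:, j] * X[:, j]))

chi2 = np.sum(w * (X - A @ G) ** 2) / (n_spec * n_pix)
print(f"chi^2 per data point: {chi2:.2f}")               # approaches ~1 for a good fit
```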
Xu, Wenjun; Chen, Jie; Lau, Henry Y K; Ren, Hongliang
2017-09-01
Accurate motion control of flexible surgical manipulators is crucial in tissue manipulation tasks. The tendon-driven serpentine manipulator (TSM) is one of the most widely adopted flexible mechanisms in minimally invasive surgery because of its enhanced maneuverability in tortuous environments. The TSM, however, exhibits high nonlinearities, and a conventional analytical kinematics model is insufficient to achieve high accuracy. To account for the system nonlinearities, we applied a data-driven approach to encode the system inverse kinematics. Three regression methods, extreme learning machine (ELM), Gaussian mixture regression (GMR), and K-nearest neighbors regression (KNNR), were implemented to learn a nonlinear mapping from the robot's 3D position states to the control inputs. The performance of the three algorithms was evaluated in both simulation and physical trajectory-tracking experiments. KNNR performed the best in the tracking experiments, with the lowest RMSE of 2.1275 mm. The proposed inverse kinematics learning methods provide an alternative and efficient way to accurately model the tendon-driven flexible manipulator. Copyright © 2016 John Wiley & Sons, Ltd.
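A hedged sketch of the KNNR variant with scikit-learn, using a hypothetical two-tendon forward model in place of the physical TSM: sampled input-position pairs train a regressor that maps desired tip positions back to control inputs.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(5)

def forward(u):
    """Hypothetical nonlinear forward model: tendon inputs -> 3D tip position."""
    return np.column_stack([np.sin(u[:, 0]) + 0.1 * u[:, 1],
                            np.cos(u[:, 0]) * u[:, 1],
                            0.5 * u[:, 0] * u[:, 1]])

u_train = rng.uniform(-1, 1, size=(2000, 2))      # sampled control inputs
p_train = forward(u_train)                        # corresponding tip positions
ik = KNeighborsRegressor(n_neighbors=5, weights="distance").fit(p_train, u_train)

p_goal = forward(rng.uniform(-1, 1, size=(5, 2))) # reachable target positions
u_cmd = ik.predict(p_goal)                        # learned inverse kinematics
err = np.linalg.norm(forward(u_cmd) - p_goal, axis=1)
print("tip position RMSE:", np.sqrt((err ** 2).mean()))
```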
Mining on Big Data Using Hadoop MapReduce Model
NASA Astrophysics Data System (ADS)
Salman Ahmed, G.; Bhattacharya, Sweta
2017-11-01
Conventional parallel algorithms for frequent itemset mining must balance loads of similar data among computing nodes. This paper reviews that process by analyzing the critical performance drawback of common parallel frequent itemset mining algorithms: given a very large dataset, the data partitioning strategies in existing solutions suffer high communication and mining overhead induced by redundant transactions transmitted among computing nodes. We address this drawback by developing a data partitioning approach on Hadoop using the MapReduce programming model, with the goal of speeding up parallel frequent itemset mining on Hadoop clusters. Incorporating a similarity metric and the locality-sensitive hashing technique, the approach places highly similar transactions into the same data partition to improve locality without creating an excessive number of redundant transactions. We implemented the approach on a 34-node Hadoop cluster, driven by a range of datasets created by the IBM Quest market-basket synthetic data generator. Experiments reveal that the approach reduces network and computing loads by eliminating redundant transactions on Hadoop nodes, and that it considerably outperforms the other models considered.
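The locality-sensitive-hashing idea behind the partitioning can be sketched without the Hadoop plumbing (the item universe, signature length, and partition count are illustrative assumptions): MinHash signatures route similar transactions to the same partition.

```python
import numpy as np

rng = np.random.default_rng(6)
n_items, n_hashes, n_parts = 100, 8, 4
transactions = [set(rng.choice(n_items, size=int(rng.integers(3, 10)), replace=False))
                for _ in range(1000)]

# MinHash signature: for each random permutation of item ids, keep the
# smallest permuted id present in the transaction.
perms = [rng.permutation(n_items) for _ in range(n_hashes)]
def signature(t):
    return tuple(int(min(p[i] for i in t)) for p in perms)

# LSH property: similar transactions tend to share signatures, so hashing the
# signature sends them to the same partition and reduces cross-node redundancy.
parts = [hash(signature(t)) % n_parts for t in transactions]
print("partition sizes:", np.bincount(parts, minlength=n_parts))
```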
Origin of the pulse-like signature of shallow long-period volcano seismicity
Chouet, Bernard A.; Dawson, Phillip B.
2016-01-01
Short-duration, pulse-like long-period (LP) events are a characteristic type of seismicity accompanying eruptive activity at Mount Etna in Italy in 2004 and 2008 and at Turrialba Volcano in Costa Rica and Ubinas Volcano in Peru in 2009. We use the discrete wave number method to compute the free surface response in the near field of a rectangular tensile crack embedded in a homogeneous elastic half space and to gain insights into the origin of the LP pulses. Two source models are considered, including (1) a vertical fluid-driven crack and (2) a unilateral tensile rupture growing at a fixed sub-Rayleigh velocity with constant opening on a vertical crack. We apply cross correlation to the synthetics and data to demonstrate that a fluid-driven crack provides a natural explanation for these data with realistic source sizes and fluid properties. Our modeling points to shallow sources (<1 km depth), whose signatures are representative of the Rayleigh pulse sampled at epicentral distances >∼1 km. While a slow-rupture failure provides another potential model for these events, the synthetics and resulting fits to the data are not optimal in this model compared to a fluid-driven source. We infer that pulse-like LP signatures are parts of the continuum of responses produced by shallow fluid-driven sources in volcanoes.
Developing and evaluating predictive strategies to elucidate the mode of biological activity of environmental chemicals is a major objective of the concerted efforts of the US EPA's computational toxicology program.
Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes
Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A
2014-01-01
This study presents a fusion of data-driven and physics-driven methodologies for energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Average with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code and simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested for three extended time periods totalling several weeks of observations. The time periods covered quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed. PMID:26167432
Shim, Hongseok; Kim, Ji Hyun; Kim, Chan Yeong; Hwang, Sohyun; Kim, Hyojin; Yang, Sunmo; Lee, Ji Eun; Lee, Insuk
2016-11-16
Whole exome sequencing (WES) accelerates disease gene discovery using rare genetic variants, but further statistical and functional evidence is required to avoid false-discovery. To complement variant-driven disease gene discovery, here we present function-driven disease gene discovery in zebrafish (Danio rerio), a promising human disease model owing to its high anatomical and genomic similarity to humans. To facilitate zebrafish-based function-driven disease gene discovery, we developed a genome-scale co-functional network of zebrafish genes, DanioNet (www.inetbio.org/danionet), which was constructed by Bayesian integration of genomics big data. Rigorous statistical assessment confirmed the high prediction capacity of DanioNet for a wide variety of human diseases. To demonstrate the feasibility of the function-driven disease gene discovery using DanioNet, we predicted genes for ciliopathies and performed experimental validation for eight candidate genes. We also validated the existence of heterozygous rare variants in the candidate genes of individuals with ciliopathies yet not in controls derived from the UK10K consortium, suggesting that these variants are potentially involved in enhancing the risk of ciliopathies. These results showed that an integrated genomics big data for a model animal of diseases can expand our opportunity for harnessing WES data in disease gene discovery. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Carbon Consequences of Forest Disturbance and Recovery Across the Conterminous United States
NASA Technical Reports Server (NTRS)
Williams, Christopher A.; Collatz, G. James; Masek, Jeffrey; Goward, Samuel N.
2012-01-01
Forests of North America are thought to constitute a significant long-term sink for atmospheric carbon. The United States Forest Service Forest Inventory and Analysis (FIA) program has developed a large database of stock changes derived from consecutive estimates of growing stock volume in the US. These data reveal a large and relatively stable increase in forest carbon stocks over the last two decades or more. The mechanisms underlying this national increase in forest stocks may include recovery of forests from past disturbances, net increases in forest area, and growth enhancement driven by climate or by fertilization by CO2 and nitrogen. Here we estimate the forest recovery component of the observed stock changes using FIA data on the age structure of US forests and carbon stocks as a function of age. The latter are used to parameterize forest disturbance and recovery processes in a carbon cycle model. We then apply the resulting disturbance/recovery dynamics to landscapes and regions based on the forest age distributions. The analysis centers on 28 representative climate settings spread across the forested regions of the conterminous US. We estimate carbon fluxes for each region and propagate uncertainties in calibration data through to the predicted fluxes. The largest recovery-driven carbon sinks are found in the South Central, Pacific Northwest, and Pacific Southwest regions, with spatially averaged net ecosystem productivity (NEP) of about 100 g C per square meter per year driven by forest age structure. Carbon sinks from recovery in the Northeast and Northern Lake States remain moderate to large owing to the legacy of historical clearing and relatively low modern disturbance rates from harvest and fire. At the continental scale, we find a conterminous US forest NEP of only 0.16 Pg C per year from age structure in 2005, or only 0.047 Pg C per year of forest stock change after accounting for fire emissions and harvest transfers. Recent estimates of NEP derived from inventory stock change, harvest, and fire data show twice the NEP sink we derive from forest age distributions. We discuss possible reasons for the discrepancies, including modeling errors and the possibility of climate and/or fertilization (CO2 or N) growth enhancements.
Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks.
Vlachas, Pantelis R; Byeon, Wonmin; Wan, Zhong Y; Sapsis, Themistoklis P; Koumoutsakos, Petros
2018-05-01
We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto-Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GPs in short-term forecasting accuracy in all applications considered. A hybrid architecture, extending the LSTM with a mean stochastic model (MSM-LSTM), is proposed to ensure convergence to the invariant measure. This novel hybrid method is fully data-driven and extends the forecasting capabilities of LSTM networks.
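As a hedged, minimal illustration of one-step-ahead LSTM forecasting (not the authors' reduced-order or MSM-LSTM architecture), the PyTorch sketch below trains on a chaotic logistic-map series standing in for the Lorenz 96 data:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.empty(2000)
x[0] = 0.5
for t in range(1999):                      # chaotic logistic map as a stand-in series
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

L = 16                                     # history window length
N = len(x) - L
X = torch.stack([x[i:i + L] for i in range(N)]).unsqueeze(-1)  # (N, L, 1)
y = x[L:].unsqueeze(-1)                                        # next value per window

class Forecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, seq):
        out, _ = self.lstm(seq)
        return self.head(out[:, -1])       # one-step-ahead prediction

model = Forecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):                       # full-batch training for brevity
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()
print("training MSE:", float(loss))
```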
A data-driven prediction method for fast-slow systems
NASA Astrophysics Data System (ADS)
Groth, Andreas; Chekroun, Mickael; Kondrashov, Dmitri; Ghil, Michael
2016-04-01
In this work, we present a prediction method for processes that exhibit a mixture of variability on slow and fast scales. The method relies on combining empirical model reduction (EMR) with singular spectrum analysis (SSA). EMR is a data-driven methodology for constructing stochastic low-dimensional models that account for nonlinearity and serial correlation in the estimated noise, while SSA provides a decomposition of the complex dynamics into low-order components that capture spatio-temporal behavior on different time scales. Our study focuses on the data-driven modeling of partial observations from dynamical systems that exhibit power spectra with broad peaks. The main result is that the combination of SSA pre-filtering with EMR modeling improves, under certain circumstances, the modeling and prediction skill of such a system, as compared to a standard EMR prediction based on raw data. Specifically, it is the separation into "fast" and "slow" temporal scales by the SSA pre-filtering that achieves the improvement. We show, in particular, that the resulting EMR-SSA emulators help predict intermittent behavior such as rapid transitions between specific regions of the system's phase space. This capability of the EMR-SSA prediction is demonstrated on two low-dimensional models: the Rössler system and a Lotka-Volterra model for interspecies competition. In either case, the chaotic dynamics is produced through a Shilnikov-type mechanism, and we argue that the latter seems to be an important ingredient for the good prediction skill of EMR-SSA emulators. Shilnikov-type behavior has been shown to arise in various complex geophysical fluid models, such as baroclinic quasi-geostrophic flows in the mid-latitude atmosphere and wind-driven double-gyre ocean circulation models. This pervasiveness of the Shilnikov mechanism of fast-slow transition opens interesting perspectives for extending the proposed EMR-SSA approach to more realistic situations.
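A minimal SSA pre-filtering sketch (the window length and synthetic two-scale series are illustrative; the EMR modeling stage is not shown): embed the series into a trajectory matrix, take an SVD, and diagonally average leading components back into slow and fast parts.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(500)
series = (np.sin(2 * np.pi * t / 50)          # slow oscillation
          + 0.4 * np.sin(2 * np.pi * t / 7)   # fast oscillation
          + 0.3 * rng.normal(size=t.size))    # noise

M = 60                                        # embedding window
K = len(series) - M + 1
traj = np.column_stack([series[i:i + K] for i in range(M)])  # trajectory matrix (K, M)
U, s, Vt = np.linalg.svd(traj, full_matrices=False)

def reconstruct(k):
    """Diagonal-average rank-1 component k back into a time series."""
    comp = s[k] * np.outer(U[:, k], Vt[k])
    rec = np.zeros(len(series)); cnt = np.zeros(len(series))
    for j in range(M):
        rec[j:j + K] += comp[:, j]
        cnt[j:j + K] += 1
    return rec / cnt

slow = reconstruct(0) + reconstruct(1)        # leading pair ~ the slow oscillation
print("variance captured by leading pair:", (s[:2] ** 2).sum() / (s ** 2).sum())
```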
2011-09-30
Data-Driven Boundary Correction and Optimization of a Nearshore Wave and Hydrodynamic Model to Enable Rapid Environmental Assessment. Grant Number: N00014-09-1-0503. http://ceprofs.civil.tamu.edu/jkaihatu/research/proj.html
Clustering and Network Analysis of Reverse Phase Protein Array Data.
Byron, Adam
2017-01-01
Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
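As a minimal sketch of the first approach, hierarchical clustering of a synthetic sample-by-antibody matrix with SciPy (the metric and linkage choices are illustrative; the network analysis step is not shown):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(8)
data = rng.normal(size=(40, 120))          # 40 samples x 120 antibody readouts
data[:20, :30] += 1.5                      # planted group-specific signal

dists = pdist(data, metric="correlation")  # 1 - Pearson correlation between samples
Z = linkage(dists, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```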
NASA Astrophysics Data System (ADS)
Georgiev, Bozhidar; Georgieva, Adriana
2013-12-01
This paper presents some possibilities for implementing test-driven development as a programming method. It offers a different point of view on creating advanced programming techniques: tests are built before the program source, together with all the necessary software tools and modules. This nontraditional approach, which eases the programmer's work by building tests first, is a preferable way of developing software. It allows comparatively simple programming (applicable in different object-oriented programming languages such as JAVA, XML, PYTHON, etc.), offers a predictable way to develop software tools, and helps create better software that is also easier to maintain. Test-driven programming is able to replace more complicated conventional paradigms used by many programmers.
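A self-contained toy of the test-first cycle described above, in Python (the function and its test are invented purely for illustration): the test is written before the implementation exists, fails, and then the minimal implementation makes it pass.

```python
import unittest

# Step 1 (red): the test is written first and specifies the desired behavior.
class TestMovingAverage(unittest.TestCase):
    def test_window_of_two(self):
        self.assertEqual(moving_average([1, 3, 5], 2), [2.0, 4.0])

# Step 2 (green): the minimal implementation that makes the test pass.
def moving_average(xs, w):
    return [sum(xs[i:i + w]) / w for i in range(len(xs) - w + 1)]

# Step 3 (refactor): improve the code freely; the test guards against regressions.
if __name__ == "__main__":
    unittest.main()
```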
Wyse, Sara A; Long, Tammy M; Ebert-May, Diane
2014-01-01
Graduate teaching assistants (TAs) are increasingly responsible for instruction in undergraduate science, technology, engineering, and mathematics (STEM) courses. Various professional development (PD) programs have been developed and implemented to prepare TAs for this role, but data about effectiveness are lacking and are derived almost exclusively from self-reported surveys. In this study, we describe the design of a reformed PD (RPD) model and apply Kirkpatrick's Evaluation Framework to evaluate multiple outcomes of TA PD before, during, and after implementing RPD. This framework allows evaluation that includes both direct measures and self-reported data. In RPD, TAs created and aligned learning objectives and assessments and incorporated more learner-centered instructional practices in their teaching. However, these data are inconsistent with TAs' self-reported perceptions about RPD and suggest that single measures are insufficient to evaluate TA PD programs. © 2014 Wyse et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Graphical user interface for image acquisition and processing
Goldberg, Kenneth A.
2002-01-01
An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed; it also provides a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.
Off-line programming motion and process commands for robotic welding of Space Shuttle main engines
NASA Technical Reports Server (NTRS)
Ruokangas, C. C.; Guthmiller, W. A.; Pierson, B. L.; Sliwinski, K. E.; Lee, J. M. F.
1987-01-01
The off-line-programming software and hardware being developed for robotic welding of the Space Shuttle main engine are described and illustrated with diagrams, drawings, graphs, and photographs. The menu-driven workstation-based interactive programming system is designed to permit generation of both motion and process commands for the robotic workcell by weld engineers (with only limited knowledge of programming or CAD systems) on the production floor. Consideration is given to the user interface, geometric-sources interfaces, overall menu structure, weld-parameter data base, and displays of run time and archived data. Ongoing efforts to address limitations related to automatic-downhand-configuration coordinated motion, a lack of source codes for the motion-control software, CAD data incompatibility, interfacing with the robotic workcell, and definition of the welding data base are discussed.
NASA Astrophysics Data System (ADS)
Tellman, B.; Schwarz, B.
2014-12-01
This talk describes the development of a web application to predict and communicate vulnerability to floods given publicly available data, disaster science, and geotech cloud capabilities. The proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological data sets, including census data, nighttime lights (NTL), and WorldPop to derive social parameters, together with elevation, satellite imagery, rainfall, and observed flood data from the Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data-driven" modeling that trades physics for statistics in a parallelized environment. A data-driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals a 97% accuracy of the model in predicting flooded pixels. Testing on multiple storms is required to further validate this initially promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments, predicting who might require immediate evacuation and where, could serve as an early warning. This type of early warning system would be especially relevant in data-poor places lacking the computing power, high-resolution data such as LiDAR and stream gauges, or hydrologic expertise to run physically based models in real time. As the data-driven model presented relies on globally available data, the only real-time data input required would be typical data from a weather service, e.g., precipitation or a coarse-resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
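A hedged sketch of the statistical core on synthetic predictors: a plain (global, not geographically weighted) logistic regression of flooded pixels on elevation and rainfall. Earth Engine data access and the spatial weighting are omitted, and all variables below are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n = 5000
elev = rng.uniform(0, 50, n)                   # height above nearest drainage, m
rain = rng.uniform(0, 200, n)                  # storm rainfall, mm
# synthetic truth: lower and wetter pixels flood more often
p_true = 1.0 / (1.0 + np.exp(2.0 + 0.15 * elev - 0.02 * rain))
flooded = rng.random(n) < p_true

X = np.column_stack([elev, rain])
clf = LogisticRegression(max_iter=1000).fit(X, flooded)
print("in-sample accuracy:", round(clf.score(X, flooded), 3))
print("P(flood) for 3 pixels:", np.round(clf.predict_proba(X[:3])[:, 1], 3))
```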
An interactive review system for NASTRAN
NASA Technical Reports Server (NTRS)
Durocher, L. L.; Gasper, A. F.
1982-01-01
An interactive review system that addresses the problems of model display, model error checking, and postprocessing is described. The menu driven system consists of four programs whose advantages and limitations are detailed. The interface between NASTRAN and MOVIE-BYU, the modifications required to make MOVIE usable in a finite element context, and the resulting capabilities of MOVIE as a graphics postprocessor for NASTRAN are illustrated.
An Ontology Driven Information Architecture for Interoperable Disparate Data Sources
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Dan; Hardman, Sean; Joyner, Ronald; Mattmann, Chris; Ramirez, Paul; Kelly, Sean; Castano, Rebecca
2011-01-01
The mission of the Planetary Data System is to facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data produced by or relevant to NASA's planetary missions, research programs, and data analysis programs. The vision is: (1) to gather and preserve the data obtained from exploration of the Solar System by the U.S. and other nations; (2) to facilitate new and exciting discoveries by providing access to and ensuring usability of those data for the worldwide community; (3) to inspire the public through availability and distribution of the body of knowledge reflected in the PDS data collection. PDS is a federation of heterogeneous nodes, including science and support nodes.
Witt, Emitt C.
2015-01-01
Growing use of two-dimensional (2-D) hydraulic models has created a need for high-resolution data to support flood volume estimates, floodplain-specific engineering data, and accurate flood inundation scenarios. Elevation data are a critical input to these models: they guide the flood wave across the landscape and allow the computation of valuable engineering-specific data that provide a better understanding of flooding impacts on structures, debris movement, bed scour, and direction. High-resolution elevation data that can benefit the 2-D flood modeling community are becoming publicly available. Comparison of these newly available data with legacy data suggests that better modeling outcomes are achieved by using 3D Elevation Program (3DEP) lidar point data and the derived 1 m Digital Elevation Model (DEM) product relative to the legacy 3 m, 10 m, or 30 m products currently available in the U.S. Geological Survey (USGS) National Elevation Dataset. Within the low topographic relief of a coastal floodplain, the newer 3DEP data better resolved elevations within the forested and swampy areas, achieving simulations that compared well with a historic flooding event. Results show that the 1 m DEM derived from the 3DEP lidar source provides a more conservative estimate of specific energy, static pressure, and impact pressure for grid elements at maximum flow relative to the legacy DEM data. Better flood simulations are critically important in coastal floodplains, where climate-change-driven storm frequency and sea level rise will contribute to more frequent flooding events.
NASA Astrophysics Data System (ADS)
Thompson, D. E.; Rajkumar, T.
2002-12-01
The San Francisco Bay Delta is a large hydrodynamic complex that incorporates the Sacramento and San Joaquin Estuaries, the Suisun Marsh, and San Francisco Bay proper. Competition exists for the use of this extensive water system from the fisheries industry, the agricultural industry, and the marine and estuarine animal species within the Delta. As tidal fluctuations occur, more saline water pushes upstream, allowing fish to migrate beyond the Suisun Marsh for breeding and habitat occupation. However, the agriculture industry does not want extensive salinity intrusion to impact water quality for human and plant consumption. The balance is regulated by pumping stations located along the estuaries and reservoirs, whereby flushing of fresh water keeps the saline intrusion at bay. The pumping schedule is driven by data collected at various locations within the Bay Delta and by numerical models that predict the salinity intrusion as part of a larger model of the system. The Interagency Ecological Program (IEP) for the San Francisco Bay / Sacramento-San Joaquin Estuary collects, monitors, and archives the data, and the Department of Water Resources provides a numerical model simulation (DSM2) from which predictions are made that drive the pumping schedule. A problem with DSM2 is that the numerical simulation takes roughly 16 hours to complete a prediction. We have created a neural net, optimized with a genetic algorithm, that takes as input the archived data from multiple gauging stations and predicts stage, salinity, and flow at the Carquinez Strait (at the downstream end of the Suisun Marsh). This model appears robust in its predictions and operates much faster than the current numerical DSM2 model. Because the Bay Delta is strongly tidally driven, we used both principal component analysis and fast Fourier transforms to discover dominant features within the IEP data. We then filtered out the dominant tidal forcing to discover non-primary tidal effects and used this to enhance the neural network by mapping input-output relationships in a more efficient manner. Furthermore, the neural network implicitly incorporates both the hydrodynamic and water quality models into a single predictive system. Although our model has not yet been enhanced to demonstrate improved pumping schedules, it could support better decision-making procedures that may then be implemented by State agencies if desired. Our intention is now to use our calibrated Bay Delta neural model in the smaller Elkhorn Slough complex near Monterey Bay, where no such hydrodynamic model currently exists. At the Elkhorn Slough, we are fusing the neural net model of tidally driven flow with in situ flow data and airborne and satellite remote sensing data. These further constrain the behavior of the model in predicting the longer-term health and future of this vital estuary. In particular, we are using visible data to explore the effects of the sediment plume that washes into Monterey Bay, and infrared data and thermal emissivities to characterize the plant habitat along the margins of the Slough as salinity intrusion and sediment removal change the boundary of the estuary. The details of the Bay Delta neural net model and its application to the Elkhorn Slough are presented in this paper.
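The tidal pre-filtering step can be sketched as follows (the periods, sampling rate, and notch width are illustrative assumptions): remove the dominant tidal lines in the Fourier domain to expose the non-tidal residual that the neural network then targets.

```python
import numpy as np

rng = np.random.default_rng(10)
dt = 0.25                                       # hours between samples
t = np.arange(0, 24 * 60, dt)                   # 60 days of stage data
stage = (1.2 * np.sin(2 * np.pi * t / 12.42)    # M2 tide, 12.42 h period
         + 0.5 * np.sin(2 * np.pi * t / 24.0)   # diurnal component
         + 0.2 * rng.normal(size=t.size))       # non-tidal residual

spec = np.fft.rfft(stage)
freq = np.fft.rfftfreq(t.size, d=dt)            # cycles per hour
tidal = (np.abs(freq - 1 / 12.42) < 0.002) | (np.abs(freq - 1 / 24.0) < 0.002)
spec[tidal] = 0                                 # notch out the dominant tidal lines
residual = np.fft.irfft(spec, n=t.size)
print("residual std after de-tiding:", round(float(residual.std()), 3))
```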
A satellite-driven, client-server hydro-economic model prototype for agricultural water management
NASA Astrophysics Data System (ADS)
Maneta, Marco; Kimball, John; He, Mingzhu; Payton Gardner, W.
2017-04-01
Anticipating agricultural water demand, land reallocation, and the impact on farm revenues associated with different policy or climate constraints is a challenge for water managers and policy makers. While current integrated decision support systems based on programming methods provide estimates of farmer reaction to external constraints, they have important shortcomings, such as the high cost of the data collection surveys necessary to calibrate the model, biases associated with inadequate farm sampling, infrequent model updates and recalibration, model overfitting, and their deterministic nature, among other problems. In addition, the administration of water supplies and the generation of policies that promote sustainable agricultural regions depend on more than one bureau or office. Unfortunately, managers from local and regional agencies often use different datasets of variable quality, which complicates coordinated action. To overcome these limitations, we present a client-server, integrated hydro-economic modeling and observation framework driven by satellite remote sensing and other ancillary information from regional monitoring networks. The core of the framework is a stochastic data assimilation system that sequentially ingests remote sensing observations and corrects the parameters of the hydro-economic model at unprecedented spatial and temporal resolutions. An economic model of agricultural production, based on mathematical programming, requires information on crop type and extent, crop yield, crop transpiration, and irrigation technology. A regional hydro-climatologic model provides biophysical constraints to the economic model of agricultural production with a level of detail that permits the study of the spatial impact of large- and small-scale water use decisions. Crop type and extent are obtained from the Cropland Data Layer (CDL), a multi-sensor operational classification of crops maintained by the United States Department of Agriculture. Because this product is only available for the conterminous United States, the framework is currently only applicable in this region. To obtain information on crop phenology, productivity, and transpiration at adequate spatial and temporal frequencies, we blend high-spatial-resolution Landsat information with high-temporal-fidelity MODIS imagery. The result is a 30 m, 8-day fused dataset of crop greenness that is subsequently transformed into productivity and transpiration by adapting existing forest productivity and transpiration algorithms for agricultural applications. To ensure that all involved agencies work with identical information and that end users are sheltered from the computational burden of storing and processing remote sensing data, this modeling framework is integrated in a client-server architecture based on the Hydra platform (www.hydraplatform.org). Assimilation and processing of resource-intensive remote sensing information, as well as hydrologic and other ancillary data, occur on the server side. With this architecture, our decision support system becomes a lightweight 'app' that connects to the server to retrieve the latest information regarding water demands, land use, yields, and hydrologic information required to run different management scenarios. This architecture ensures that all agencies and teams involved in water management use the same, up-to-date information in their simulations.
NASA Astrophysics Data System (ADS)
Iungo, Giacomo Valerio; Camarri, Simone; Ciri, Umberto; El-Asha, Said; Leonardi, Stefano; Rotea, Mario A.; Santhanagopalan, Vignesh; Viola, Francesco; Zhan, Lu
2016-11-01
Site conditions, such as topography and local climate, as well as wind farm layout, strongly affect the performance of a wind power plant. Therefore, predictions of wake interactions and their effects on power production remain a great challenge in wind energy. For this study, an onshore wind turbine array was monitored through lidar measurements, SCADA, and met-tower data. Power losses due to wake interactions were estimated to be approximately 4% and 2% of the total power production under stable and convective conditions, respectively. This dataset was then leveraged for the calibration of a data-driven RANS (DDRANS) solver, which is a compelling tool for prediction of wind turbine wakes and power production. DDRANS is characterized by a computational cost as low as that of engineering wake models, with adequate accuracy achieved through data-driven tuning of the turbulence closure model. DDRANS is based on a parabolic formulation and on axisymmetry and boundary layer approximations, which allow low computational costs. The turbulence closure model consists of a mixing-length model, which is optimally calibrated against the experimental dataset. Assessment of DDRANS is then performed through lidar and SCADA data for different atmospheric conditions. This material is based upon work supported by the National Science Foundation under the I/UCRC WindSTAR, NSF Award IIP 1362033.
Domain Specific Language Support for Exascale. Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baden, Scott
The project developed a domain-specific translator to enable legacy MPI source code to tolerate communication delays, which are increasing over time due to technological factors. The translator performs source-to-source translation that incorporates semantic information into the translation process. The output of the translator is a C program that runs as a data-driven program and uses an existing runtime to overlap communication automatically.
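For readers unfamiliar with the overlap pattern the translator automates, a minimal hand-written illustration follows. It uses Python's mpi4py rather than the project's C output: post non-blocking sends and receives, compute on independent data, then wait.

```python
# Overlap pattern in Python/mpi4py (run with: mpiexec -n 2 python overlap.py):
# post non-blocking communication, compute on independent data, then wait.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
other = (rank + 1) % 2

send = np.full(1_000_000, rank, dtype='d')
recv = np.empty(1_000_000, dtype='d')

reqs = [comm.Isend(send, dest=other), comm.Irecv(recv, source=other)]
local = np.sin(send).sum()            # useful work hiding the transfer
MPI.Request.Waitall(reqs)             # communication guaranteed finished here
print(rank, local, recv[0])
```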
NASA Astrophysics Data System (ADS)
Christiansen, Christian; Hartmann, Daniel
This paper documents a package of menu-driven POLYPASCAL87 computer programs for handling grouped observation data from both sieving (increment data) and settling tube procedures (cumulative data). The package is designed deliberately for use on IBM-compatible personal computers. Two of the programs solve the numerical problem of determining estimates of the four (main) parameters of the log-hyperbolic distribution and their derivatives. The package also contains a program for determining the mean, sorting, skewness, and kurtosis according to the standard moments. Moreover, the package contains procedures for smoothing and grouping of settling tube data. A graphic part of the package plots the data in a log-log plot together with the estimated log-hyperbolic curve; all estimated parameters are listed alongside the plot. Another graphic option is a plot of the log-hyperbolic shape triangle with the (χ,ζ) position of the sample.
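The moment statistics mentioned here are a standard weighted-moment computation over grain-size classes. A compact sketch, assuming class midpoints in phi units and weight percentages (the values are invented for illustration):

```python
import numpy as np

def grouped_moments(midpoints, freqs):
    """Standard moment statistics (mean, sorting, skewness, kurtosis)
    for grouped grain-size data, e.g. sieve classes in phi units."""
    midpoints = np.asarray(midpoints, float)
    w = np.asarray(freqs, float) / np.sum(freqs)   # normalized weights
    m = np.sum(w * midpoints)                      # mean
    mu = lambda k: np.sum(w * (midpoints - m) ** k)
    sd = np.sqrt(mu(2))                            # sorting (std deviation)
    return m, sd, mu(3) / sd**3, mu(4) / sd**4     # skewness, kurtosis

phi_mid = [-1.0, 0.0, 1.0, 2.0, 3.0]    # class midpoints (phi)
weight = [5.0, 20.0, 40.0, 25.0, 10.0]  # weight per class (%)
print(grouped_moments(phi_mid, weight))
```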
The Steward Observatory asteroid relational database
NASA Technical Reports Server (NTRS)
Sykes, Mark V.; Alvarezdelcastillo, Elizabeth M.
1991-01-01
The Steward Observatory Asteroid Relational Database (SOARD) was created as a flexible tool for undertaking studies of asteroid populations and sub-populations, to probe the biases intrinsic to asteroid databases, to ascertain the completeness of data pertaining to specific problems, to aid in the development of observational programs, and to develop pedagogical materials. To date, SOARD has compiled an extensive list of data available on asteroids and made it accessible through a single menu-driven database program. Users may obtain tailored lists of asteroid properties for any subset of asteroids, or output files which are suitable for plotting spectral data on individual asteroids. The program has online help as well as user and programmer documentation manuals. SOARD has already provided data to fulfill requests by members of the astronomical community. SOARD continues to grow as data are added to the database and new features are added to the program.
Impact of the HITECH Act on physicians' adoption of electronic health records.
Mennemeyer, Stephen T; Menachemi, Nir; Rahurkar, Saurabh; Ford, Eric W
2016-03-01
The Health Information Technology for Economic and Clinical Health (HITECH) Act has distributed billions of dollars to physicians as incentives for adopting certified electronic health records (EHRs) through the meaningful use (MU) program, ultimately aimed at improving healthcare outcomes. The authors examine the extent to which the MU program impacted the EHR adoption curve that existed prior to the Act. Bass and Gamma Shifted Gompertz (G/SG) diffusion models of the adoption of "Any" and "Basic" EHR systems in physicians' offices, using consistent data series covering 2001-2013 and 2006-2013, respectively, are estimated to determine whether adoption was stimulated during either a PrePay (2009-2010) period of subsidy anticipation or a PostPay (2011-2013) period when payments were actually made. Adoption of Any EHR system may have increased by as much as 7 percentage points above the level predicted in the absence of the MU subsidies. This estimate, however, lacks statistical significance and becomes smaller or negative under alternative model specifications. No substantial effects are found for Basic systems. The models suggest that adoption was largely driven by "imitation" effects (q-coefficient), as physicians mimic their peers' technology use or respond to mandates. Small and often insignificant "innovation" effects (p-coefficient) are found, suggesting little enthusiasm among physicians who are leaders in technology adoption. The authors find weak evidence of an impact of the MU program on EHR uptake. This is consistent with reports that many current EHR systems reduce physician productivity, lack data sharing capabilities, and need to incorporate other key interoperability features (e.g., application program interfaces). © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
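For readers unfamiliar with the Bass model estimated here, cumulative adoption follows a closed-form S-curve governed by the innovation coefficient p, the imitation coefficient q and the market potential m. A minimal sketch with invented parameter values (not the paper's estimates):

```python
import numpy as np

def bass_cumulative(p, q, m, t):
    """Closed-form cumulative adopters of the Bass diffusion model:
    p = innovation coefficient, q = imitation coefficient,
    m = market potential, t = time since launch."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

years = np.arange(0, 13)                            # e.g. 2001-2013
adopters = bass_cumulative(p=0.01, q=0.5, m=100.0, t=years)
print(np.round(adopters, 1))                        # percent of practices
```

A small p with a large q, as the paper reports, produces exactly this shape: a slow start by "innovators" followed by rapid peer-driven uptake.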
Protein-Protein Interface Predictions by Data-Driven Methods: A Review
Xue, Li C; Dobbs, Drena; Bonvin, Alexandre M.J.J.; Honavar, Vasant
2015-01-01
Reliably pinpointing which specific amino acid residues form the interface(s) between a protein and its binding partner(s) is critical for understanding the structural and physicochemical determinants of protein recognition and binding affinity, and has wide applications in modeling and validating protein interactions predicted by high-throughput methods, in engineering proteins, and in prioritizing drug targets. Here, we review the basic concepts, principles and recent advances in computational approaches to the analysis and prediction of protein-protein interfaces. We point out caveats for objectively evaluating interface predictors, and discuss various applications of data-driven interface predictors for improving energy model-driven protein-protein docking. Finally, we stress the importance of exploiting binding partner information in reliably predicting interfaces and highlight recent advances in this emerging direction. PMID:26460190
NASA Astrophysics Data System (ADS)
Andrina, G.; Basso, V.; Saitta, L.
2004-08-01
The effort to optimise the AIV process has in recent years focused mainly on the standardisation of approaches and on the application of new methodologies. But the earlier the intervention, the greater the benefits in terms of cost and schedule. Early phases of the AIV process have until now relied on standards that need to be tailored through company and personal expertise. A study was therefore conducted to explore the possibility of developing an expert system to help in making choices in the early, conceptual phase of Assembly, Integration and Verification (AIV), namely the Model Philosophy and the test definition. The work focused on a hybrid approach, allowing interaction between historical data and human expertise. The expert system that has been prototyped exploits both information elicited from domain experts and the results of a Data Mining activity on existing databases of verification data from completed projects. The Data Mining algorithms allow the extraction of past experience resident in the ESA/MATD database, which contains information in the form of statistical summaries, costs, and frequencies of on-ground and in-flight failures. Non-trivial associations found in this way could then be used by the experts to guide new decisions in a controlled (standards-driven) way at the beginning of, or during, the AIV process. Moreover, the Expert AIV system could allow compilation of a set of feasible AIV schedules to support further programmatic-driven choices.
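The data-mining step rests on finding associations with sufficient support and confidence. A toy sketch of that building block follows; the project records are invented, and the actual ESA/MATD mining is of course far richer.

```python
from itertools import combinations

# Toy verification records: each set lists attributes of one past project.
records = [
    {"thermal_test", "vibration_test", "flight_failure"},
    {"thermal_test", "vibration_test"},
    {"vibration_test", "flight_failure"},
    {"thermal_test"},
]

def rules(records, min_support=0.25, min_confidence=0.6):
    """Enumerate one-to-one association rules A -> B with their support
    and confidence, the basic ingredient of association mining."""
    n = len(records)
    items = sorted(set().union(*records))
    for a, b in combinations(items, 2):
        for lhs, rhs in ((a, b), (b, a)):
            both = sum(lhs in r and rhs in r for r in records) / n
            lhs_freq = sum(lhs in r for r in records) / n
            if both >= min_support and both / lhs_freq >= min_confidence:
                yield lhs, rhs, both, both / lhs_freq

for lhs, rhs, sup, conf in rules(records):
    print(f"{lhs} -> {rhs}  support={sup:.2f} confidence={conf:.2f}")
```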
Integrating Information & Communications Technologies into the Classroom
ERIC Educational Resources Information Center
Tomei, Lawrence, Ed.
2007-01-01
"Integrating Information & Communications Technologies Into the Classroom" examines topics critical to business, computer science, and information technology education, such as: school improvement and reform, standards-based technology education programs, data-driven decision making, and strategic technology education planning. This book also…
Modeling of developmental toxicology presents a significant challenge to computational toxicology due to endpoint complexity and lack of data coverage. These challenges largely account for the relatively few modeling successes using the structure–activity relationship (SAR) parad...
Linear dynamical modes as new variables for data-driven ENSO forecast
NASA Astrophysics Data System (ADS)
Gavrilov, Andrey; Seleznev, Aleksei; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander; Kurths, Juergen
2018-05-01
A new data-driven model for the analysis and prediction of spatially distributed time series is proposed. The model is based on a linear dynamical mode (LDM) decomposition of the observed data, which is derived from a recently developed nonlinear dimensionality reduction approach. The key point of this approach is its ability to take into account simple dynamical properties of the observed system by revealing the system's dominant time scales. The LDMs are used as new variables for the empirical construction of a nonlinear stochastic evolution operator. The method is applied to the sea surface temperature anomaly field in the tropical belt, where the El Niño-Southern Oscillation (ENSO) is the main mode of variability. The advantage of LDMs over the traditionally used empirical orthogonal function decomposition is demonstrated for these data. Specifically, it is shown that the new model has a competitive ENSO forecast skill in comparison with other existing ENSO models.
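As a point of reference for the LDM approach, the baseline EOF decomposition the paper compares against can be computed with a single SVD of the space-time anomaly matrix. A minimal sketch, with synthetic data standing in for the SST anomaly field:

```python
import numpy as np

def eof_decomposition(field, n_modes=3):
    """Classical EOF analysis of a space-time anomaly field (time x space).
    Returns spatial patterns, principal components and the
    explained-variance fraction of each retained mode."""
    anomalies = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    var = s**2 / np.sum(s**2)
    return vt[:n_modes], u[:, :n_modes] * s[:n_modes], var[:n_modes]

rng = np.random.default_rng(2)
sst = rng.normal(size=(240, 500))     # 20 years monthly x 500 grid points
patterns, pcs, var = eof_decomposition(sst)
print(var)                            # leading explained-variance fractions
```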
Modeling Martian Dust Using Mars-GRAM
NASA Technical Reports Server (NTRS)
Justh, Hilary L.; Justus, C. G.
2010-01-01
Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Mars-GRAM and MGCM use surface topography from the Mars Global Surveyor Mars Orbiter Laser Altimeter (MOLA), with altitudes referenced to the MOLA areoid, or constant potential surface. Traditional Mars-GRAM options for representing the mean atmosphere along entry corridors include: TES Mapping Years 1 and 2, with Mars-GRAM data coming from MGCM model results driven by observed TES dust optical depth; and TES Mapping Year 0, with user-controlled dust optical depth and Mars-GRAM data interpolated from MGCM model results driven by selected values of globally-uniform dust optical depth. Mars-GRAM 2005 has been validated against Radio Science data, and against both nadir and limb data from the Thermal Emission Spectrometer (TES).
Amin, Waqas; Singh, Harpreet; Pople, Andre K.; Winters, Sharon; Dhir, Rajiv; Parwani, Anil V.; Becich, Michael J.
2010-01-01
Context: Tissue banking informatics deals with standardized annotation, collection and storage of biospecimens that can further be shared by researchers. Over the last decade, the Department of Biomedical Informatics (DBMI) at the University of Pittsburgh has developed various tissue banking informatics tools to expedite translational medicine research. In this review, we describe the technical approach and capabilities of these models. Design: Clinical annotation of biospecimens requires data retrieval from various clinical information systems and the de-identification of the data by an honest broker. Based upon these requirements, DBMI, with its collaborators, has developed both Oracle-based organ-specific data marts and a more generic, model-driven architecture for biorepositories. The organ-specific models are developed utilizing Oracle 9.2.0.1 server tools and software applications and the model-driven architecture is implemented in a J2EE framework. Result: The organ-specific biorepositories implemented by DBMI include the Cooperative Prostate Cancer Tissue Resource (http://www.cpctr.info/), Pennsylvania Cancer Alliance Bioinformatics Consortium (http://pcabc.upmc.edu/main.cfm), EDRN Colorectal and Pancreatic Neoplasm Database (http://edrn.nci.nih.gov/) and Specialized Programs of Research Excellence (SPORE) Head and Neck Neoplasm Database (http://spores.nci.nih.gov/current/hn/index.htm). The model-based architecture is represented by the National Mesothelioma Virtual Bank (http://mesotissue.org/). These biorepositories provide thousands of well annotated biospecimens for the researchers that are searchable through query interfaces available via the Internet. Conclusion: These systems, developed and supported by our institute, serve to form a common platform for cancer research to accelerate progress in clinical and translational research. In addition, they provide a tangible infrastructure and resource for exposing research resources and biospecimen services in collaboration with the clinical anatomic pathology laboratory information system (APLIS) and the cancer registry information systems. PMID:20922029
A DNA-based semantic fusion model for remote sensing data.
Sun, Heng; Weng, Jian; Yu, Guangchuang; Massawe, Richard H
2013-01-01
Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, ontology can represent, process and manage widespread knowledge. Nowadays, many researchers use ontology to collect and organize data's semantic information in order to maximize research productivity. In this paper, we first describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel, bit-wise manner and each individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved; for example, the cluster-based multi-process program reduces the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB of source data files. Moreover, the size of the result file recording the DNA sequences is 54.51 GB for the parallel C program, compared with 57.89 GB for the sequential Perl program. This shows that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building a type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the integration of knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation. This process depends solely on ligation reactions and screening operations instead of the ontology.
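The bit-wise encoding idea can be illustrated with the simplest possible scheme, two bits per nucleotide. The mapping below is an assumption for illustration only; the paper's actual coding scheme may differ.

```python
# Minimal sketch of bit-wise DNA encoding: every two bits of the input are
# mapped to one nucleotide. The 2-bit-per-base table is a common textbook
# choice, assumed here for illustration.
BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}

def encode(data: bytes) -> str:
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):        # read the byte 2 bits at a time
            seq.append(BASE[(byte >> shift) & 0b11])
    return "".join(seq)

print(encode(b"NDVI"))                    # 4 bytes -> 16 bases
```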
A new approach to configurable primary data collection.
Stanek, J; Babkin, E; Zubov, M
2016-09-01
The formats, semantics and operational rules of data processing tasks in genomics (and health in general) are highly divergent and can change rapidly. In such an environment, the problem of consistent transformation and loading of heterogeneous input data to various target repositories becomes a critical success factor. The objective of the project was to design a new conceptual approach to configurable data transformation, de-identification, and submission of health and genomic data sets. The main motivation was to facilitate automated or human-driven data uploading, as well as consolidation of heterogeneous sources in large genomic or health projects. Modern methods of on-demand specialization of generic software components were applied. For the specification of input-output data and the required data collection activities, we propose a simple data model of flat tables, as well as a domain-oriented graphical interface and a portable representation of transformations in XML. Using such methods, a prototype of the Configurable Data Collection System (CDCS) was implemented in the Java programming language with Swing graphical interfaces. The core logic of transformations was implemented as a library of reusable plugins. The solution is implemented as a software prototype for a configurable service-oriented system for semi-automatic data collection, transformation, sanitization and safe uploading to heterogeneous data repositories. To address the dynamic nature of data schemas and data collection processes, the CDCS prototype facilitates interactive, user-driven configuration of the data collection process and extends the basic functionality with a wide range of third-party plugins. Notably, our solution also allows for the reduction of manual data entry for data originally missing in the output data sets. First experiments and feedback from domain experts confirm that the prototype is flexible, configurable and extensible; runs well on data owners' systems; and is not dependent on vendors' standards. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Marschall, Raphael; Su, Cheng-Chin; Liao, Ying; Rubin, Martin; Wu, Jong-Shinn; Thomas, Nicolas; Altwegg, Kathrin; Sierks, Holger; OSIRIS, ROSINA
2016-10-01
The study by [1] has proposed the idea that the cometary dust jets in the northern hemisphere of comet 67P/Churyumov-Gerasimenko arise mainly from rough, cliff-like terrain. Using our 3D gas and dust dynamics coma model [2], we have run simulations targeting the question of whether areas with high gravitational slopes alone can indeed account for both the ROSINA/COPS and the OSIRIS data obtained from mid August to end October 2014. The basis of our simulations is the shape model "SHAP4S" of [3]. Surface temperatures have been defined using a simple 1-D thermal model (including insolation, shadowing, thermal emission and sublimation, but neglecting conduction) computed for each facet of the shape model, allowing a consistent and known description of the gas flux and its initial temperature. In the next step we use the DSMC program PDSC++ [4] to calculate the gas properties in 3D space. The gas solution can be compared with the in situ measurements by ROSINA/COPS. In a subsequent step, dust particles are introduced into the gas flow to determine dust densities and, with a column integrator and Mie theory, dust brightnesses that can be compared to OSIRIS data. To examine cliff activity we have divided the surface into two sets: one with gravitational slopes larger than 30°, which we call cliffs, and one with slopes less than 30°, which we shall call plains. We have set up two models, "cliffs only" and "plains only", where the respective set of areas is active and the others are inert. The outgassing areas are assumed to be purely insolation driven. The "cliffs only" model is a statistically equally good fit to the ROSINA/COPS data as the global insolation-driven model presented in [2]. The "plains only" model, on the other hand, is statistically inferior to the "cliffs only" model. We found in [2] that increased activity in the Hapi region of the comet (the inhomogeneous model) improves the fit of the gas results significantly. We can show in this study that a "cliffs + Hapi" model fits the ROSINA/COPS data equally well as the inhomogeneous model. These results are consistent with OSIRIS data. [1] Vincent et al., 2016, A&A, 587, A14; [2] Marschall et al., 2016, A&A, 589, A90; [3] Preusker et al., 2015, A&A, 583, A33; [4] Su, C. C., 2013.
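The per-facet thermal model balances absorbed insolation against thermal emission and sublimation cooling. The sketch below solves such a balance for a single facet; the water-ice vapor-pressure constants and latent heat are rough textbook values used only for illustration, not the constants of the authors' model.

```python
import numpy as np
from scipy.optimize import brentq

SIGMA = 5.67e-8          # Stefan-Boltzmann constant [W m-2 K-4]

# Illustrative facet energy balance in the spirit of the abstract:
# absorbed insolation = thermal emission + sublimation cooling.
def sublimation_flux(T, A=3.56e12, B=6141.667, L=2.6e6, m=2.99e-26, k=1.38e-23):
    p = A * np.exp(-B / T)                             # vapor pressure [Pa]
    return L * p * np.sqrt(m / (2 * np.pi * k * T))    # cooling flux [W m-2]

def facet_temperature(flux_in, emissivity=0.95):
    balance = lambda T: flux_in - emissivity * SIGMA * T**4 - sublimation_flux(T)
    return brentq(balance, 50.0, 400.0)                # root of the balance

S = 1361.0 / 3.5**2                                    # solar flux at 3.5 au
print(f"{facet_temperature((1 - 0.04) * S * np.cos(np.radians(30))):.1f} K")
```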
Prospects of second generation artificial intelligence tools in calibration of chemical sensors.
Braibanti, Antonio; Rao, Rupenaguntla Sambasiva; Ramam, Veluri Anantha; Rao, Gollapalli Nageswara; Rao, Vaddadi Venkata Panakala
2005-05-01
Multivariate data-driven calibration models with neural networks (NNs) are developed for binary (Cu++ and Ca++) and quaternary (K+, Ca++, NO3- and Cl-) ion-selective electrode (ISE) data. The response profiles of ISEs with concentration are non-linear and sub-Nernstian. This task represents function approximation of multivariate, multi-response, correlated, non-linear data with unknown noise structure, i.e., multi-component calibration/prediction in chemometric parlance. Radial basis function (RBF) and Fuzzy-ARTMAP-NN models implemented in the software packages TRAJAN and Professional II are employed for the calibration. The optimum NN models reported are based on residuals in concentration space. Being a data-driven information technology, an NN does not require a model, or a prior or posterior distribution of the data or noise structure. Missing information, spikes or newer trends in different concentration ranges can be modeled through novelty detection. Two simulated data sets generated from mathematical functions are modeled as a function of the number of data points and network parameters such as the number of neurons and nearest neighbors. The success of RBF and Fuzzy-ARTMAP-NNs in developing adequate calibration models for experimental data, and function approximation models for more complex simulated data sets, establishes AI2 (artificial intelligence, 2nd generation) as a promising technology for quantitation.
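A minimal sketch of the RBF-network calibration idea, assuming Gaussian basis functions with fixed centers and output weights fitted by linear least squares; the sub-Nernstian electrode response below is synthetic, and the actual training procedure in TRAJAN will differ.

```python
import numpy as np

# RBF-network calibration sketch: Gaussian basis functions with fixed
# centers, output weights fitted by linear least squares. The electrode
# response is synthetic (non-linear, sub-Nernstian); a real calibration
# would use measured potentials at known concentrations.
rng = np.random.default_rng(3)
log_c = np.linspace(-6, -2, 40)                           # log10 concentration
emf = 25.0 * np.tanh(log_c + 4) + rng.normal(0, 0.5, 40)  # electrode potential

centers = np.linspace(emf.min(), emf.max(), 8)
width = (emf.max() - emf.min()) / 8
Phi = np.exp(-(emf[:, None] - centers[None, :]) ** 2 / (2 * width**2))

w, *_ = np.linalg.lstsq(Phi, log_c, rcond=None)           # inverse calibration
rmse = np.sqrt(np.mean((Phi @ w - log_c) ** 2))
print(f"calibration RMSE (log units): {rmse:.3f}")
```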
Measuring Conditions and Consequences of Tracking in the High School Curriculum
ERIC Educational Resources Information Center
Archbald, Doug; Keleher, Julia
2008-01-01
Despite a decade of advocacy and advances in technology, data-driven decision making remains an elusive vision for most high schools. This article identifies key data systems design needs and presents methods for monitoring, managing, and improving programs. Because of its continuing salience, we focus on the issue of tracking (ability grouping).…
Kam, Chi-Ming; Greenberg, Mark T; Walls, Carla T
2003-03-01
In order for empirically validated school-based prevention programs to "go to scale," it is important to understand the processes underlying program dissemination. Data collected in effectiveness trials, especially those measuring the quality of program implementation and administrative support, are valuable in explicating important factors influencing implementation. This study describes findings regarding quality of implementation in a recent effectiveness trial conducted in a high-risk, American urban community. This delinquency prevention trial is a locally owned intervention, which used the Promoting Alternative THinking Skills Curriculum as its major program component. The intervention involved 350 first graders in 6 inner-city public schools. Three schools implemented the intervention and the other 3 were comparison schools from the same school district. Although intervention effects were not found for all the intervention schools, the intervention was effective in improving children's emotional competence and reducing their aggression in schools which effectively supported the intervention. This study, utilizing data from the 3 intervention schools (13 classrooms and 164 students), suggested that 2 factors contributed to the success of the intervention: (a) adequate support from school principals and (b) high degree of classroom implementation by teachers. These findings are discussed in light of the theory-driven models in program evaluation that emphasized the importance of the multiple factors influencing the implementation of school-based interventions.
NASA Astrophysics Data System (ADS)
Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.
2017-12-01
Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as an extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to add strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).
Swash saturation: an assessment of available models
NASA Astrophysics Data System (ADS)
Hughes, Michael G.; Baldock, Tom E.; Aagaard, Troels
2018-06-01
An extensive, previously published field data set (Hughes et al., Mar Geol 355, 88-97, 2014) representing the full range of micro-tidal beach states (reflective, intermediate and dissipative) is used to investigate swash saturation. Two models that predict the behavior of saturated swash are tested: one driven by standing waves and the other driven by bores. Despite being based on entirely different premises, they predict similar trends in the limiting (saturated) swash height with respect to its dependency on frequency and beach gradient. For a given frequency and beach gradient, however, the bore-driven model predicts a larger saturated swash height by a factor of 2.5. Both models broadly predict the general behavior of swash saturation evident in the data, but neither model is accurate in detail. While swash saturation in the short-wave frequency band is common on some beach types, it does not always occur across all beach types. Further work is required on wave reflection/breaking and the role of wave-wave and wave-swash interactions to determine limiting swash heights on natural beaches.
Voss, Frank; Maule, Alec
2013-01-01
A model for simulating daily maximum and mean water temperatures was developed by linking two existing models: one developed by the U.S. Geological Survey and one developed by the Bureau of Reclamation. The study area included the lower Yakima River main stem between Roza Dam and West Richland, Washington. To automate execution of the labor-intensive models, a database-driven model automation program was developed to decrease operation costs, reduce user error, and provide the capability to quickly perform simulations for multiple management and climate change scenarios. Microsoft SQL Server 2008 R2 Integration Services packages were developed to (1) integrate climate, flow, and stream geometry data from diverse sources (such as weather stations, a hydrologic model, and field measurements) into a single relational database; (2) programmatically generate heavily formatted model input files; (3) iteratively run water temperature simulations; (4) process simulation results for export to other models; and (5) create a database-driven infrastructure that facilitated experimentation with a variety of scenarios, node permutations, weather data, and hydrologic conditions while minimizing the costs of running the model with various configurations. As a proof-of-concept exercise, water temperatures were simulated for a "Current Conditions" scenario, where local weather data from 1980 through 2005 were used as input, and for "Plus 1" and "Plus 2" climate warming scenarios, where the average annual air temperatures used in the Current Conditions scenario were increased by 1 degree Celsius (°C) and by 2°C, respectively. Average monthly mean daily water temperatures simulated for the Current Conditions scenario were compared to measured values at the Bureau of Reclamation Hydromet gage at Kiona, Washington, for 2002-05. Differences ranged between 1.1°C and 1.9°C for February, March, May, and June, and were less than 0.8°C for the remaining months of the year. The differences between current conditions and measured monthly values for the two warmest months (July and August) were 0.5°C and 0.2°C, respectively. The model predicted that water temperature generally becomes less sensitive to air temperature increases as the distance from the mouth of the river decreases. As a consequence, the difference between climate warming scenarios also decreased. The pattern of decreasing sensitivity is most pronounced from August to October. Interactive graphing tools were developed to explore the relative sensitivity of average monthly and mean daily water temperature to increases in air temperature for model output locations along the lower Yakima River main stem.
NASA Astrophysics Data System (ADS)
Tu, Weichao; Cunningham, G. S.; Chen, Y.; Henderson, M. G.; Camporeale, E.; Reeves, G. D.
2013-10-01
As a response to the Geospace Environment Modeling (GEM) "Global Radiation Belt Modeling Challenge," a 3D diffusion model is used to simulate the radiation belt electron dynamics during two intervals of the Combined Release and Radiation Effects Satellite (CRRES) mission, 15 August to 15 October 1990 and 1 February to 31 July 1991. The 3D diffusion model, developed as part of the Dynamic Radiation Environment Assimilation Model (DREAM) project, includes radial, pitch angle, and momentum diffusion and mixed pitch angle-momentum diffusion, which are driven by dynamic wave databases from the statistical CRRES wave data, including plasmaspheric hiss, lower-band, and upper-band chorus. By comparing the DREAM3D model outputs to the CRRES electron phase space density (PSD) data, we find that, with a data-driven boundary condition at Lmax = 5.5, the electron enhancements can generally be explained by radial diffusion, though additional local heating from chorus waves is required. Because the PSD reductions are included in the boundary condition at Lmax = 5.5, our model captures the fast electron dropouts over a large L range, producing better model performance compared to previously published results. Plasmaspheric hiss produces electron losses inside the plasmasphere, but the model still sometimes overestimates the PSD there. Test simulations using reduced radial diffusion coefficients or increased pitch angle diffusion coefficients inside the plasmasphere suggest that better wave models and more realistic radial diffusion coefficients, both inside and outside the plasmasphere, are needed to improve the model performance. Statistically, the results show that, with the data-driven outer boundary condition, including radial diffusion and plasmaspheric hiss is sufficient to model the electrons during geomagnetically quiet times, but to best capture the radiation belt variations during active times, pitch angle and momentum diffusion from chorus waves are required.
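One ingredient of such a model is radial diffusion with a data-driven outer boundary. The sketch below integrates the standard 1-D radial diffusion equation with an explicit finite-difference step; the D_LL parameterization and boundary values are toy choices, not DREAM3D's actual coefficients.

```python
import numpy as np

# Schematic 1-D radial diffusion step of the kind included in DREAM3D:
#   df/dt = L^2 d/dL ( D_LL / L^2 * df/dL ),
# with a fixed, data-driven phase space density at the outer boundary
# Lmax = 5.5. Coefficients and values are illustrative only.
L = np.linspace(1.5, 5.5, 81)
dL = L[1] - L[0]
D = 1e-3 * (L / 5.5) ** 10                 # toy D_LL [1/day]
D_mid = 0.5 * (D[:-1] + D[1:])             # diffusivity at cell faces
L_mid = 0.5 * (L[:-1] + L[1:])

f = 1e-6 * np.ones_like(L)                 # initial phase space density
f[-1] = 1.0                                # outer boundary from observations

dt = 0.4 * dL**2 / D.max()                 # explicit stability limit [days]
for _ in range(int(30 / dt)):              # evolve for ~30 days
    flux = D_mid / L_mid**2 * np.diff(f) / dL
    f[1:-1] += dt * L[1:-1] ** 2 * np.diff(flux) / dL
print(f[::16])                             # inward-diffusing profile
```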
Pyrotechnic modeling for the NSI and pin puller
NASA Technical Reports Server (NTRS)
Powers, Joseph M.; Gonthier, Keith A.
1993-01-01
A discussion concerning the modeling of pyrotechnically driven actuators is presented in viewgraph format. The following topics are discussed: literature search, constitutive data for full-scale model, simple deterministic model, observed phenomena, and results from simple model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohn, Michael; Adams, Paul
2006-09-05
The L3 system is a computational steering environment for image processing and scientific computing. It consists of an interactive graphical language and interface. Its purpose is to help advanced users control their computational software and to assist in the management of data accumulated during numerical experiments. L3 provides a combination of features not found in other environments; these are:
- textual and graphical construction of programs
- persistence of programs and associated data
- direct mapping between the scripts, the parameters, and the produced data
- implicit hierarchical data organization
- full programmability, including conditionals and functions
- incremental execution of programs
The software includes the l3 language and the graphical environment. The language is a single-assignment functional language; the implementation consists of a lexer, parser, interpreter, storage handler, and editing support. The graphical environment is an event-driven nested list viewer/editor providing graphical elements corresponding to the language. These elements are both the representation of a user's program and active interfaces to the values computed by that program.
Shlizerman, Eli; Riffell, Jeffrey A.; Kutz, J. Nathan
2014-01-01
The antennal lobe (AL), the olfactory processing center in insects, is able to process stimuli into distinct neural activity patterns, called olfactory neural codes. To model their dynamics we perform multichannel recordings from the projection neurons in the AL driven by different odorants. We then derive a dynamic neuronal network from the electrophysiological data. The network consists of lateral-inhibitory neurons and excitatory neurons (modeled as firing-rate units), and is capable of producing unique olfactory neural codes for the tested odorants. To construct the network, we (1) design a projection, an odor space, for the neural recordings from the AL, which discriminates between distinct odorant trajectories, (2) characterize scent recognition, i.e., decision-making based on olfactory signals, and (3) infer the wiring of the neural circuit, the connectome of the AL. We show that the constructed model is consistent with biological observations, such as contrast enhancement and robustness to noise. The study suggests a data-driven approach to answering a key biological question: how lateral inhibitory neurons can be wired to excitatory neurons to permit robust activity patterns. PMID:25165442
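The network class described, excitatory firing-rate units shaped by lateral inhibition, can be illustrated with a few lines of numerical integration. The connectivity below is random for illustration; the paper infers it from the multichannel recordings.

```python
import numpy as np

# Minimal firing-rate network with excitatory units and lateral inhibition:
#   tau * dr/dt = -r + relu(W @ r + stimulus)
rng = np.random.default_rng(4)
n_exc, n_inh = 20, 10
W = np.zeros((30, 30))
W[:, :n_exc] = rng.uniform(0.0, 0.1, (30, n_exc))    # excitatory columns
W[:, n_exc:] = -rng.uniform(0.0, 0.3, (30, n_inh))   # inhibitory columns

r = np.zeros(30)
stim = np.concatenate([rng.uniform(0, 1, n_exc), np.zeros(n_inh)])
tau, dt = 20.0, 1.0                                  # time constants [ms]
for _ in range(500):                                 # 500 ms of odor input
    r += dt / tau * (-r + np.maximum(0.0, W @ r + stim))
print(np.round(r[:5], 3))                            # steady-state "code"
```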
NASA Astrophysics Data System (ADS)
Sambrotto, R.
2015-12-01
The Secondary School Field Research Program is a field and laboratory internship for high school students at the Lamont-Doherty Earth Observatory. Over the past 11 years it has grown into a significant program, engaging approximately 50 high school and college students each summer, most of them from ethnic and economic groups that are under-represented in the STEM fields. The internships are based on research-driven science questions on estuarine physics, chemistry, ecology and the paleo-environment. Field studies are linked to associated laboratory analyses whose results are reported by the students as a final project. For the past two years, we have focused on the transition to an institutional program, with sustainable funding and organizational structures. At a grant-driven institution whose mission is largely restricted to basic research, institutionalization has not been an easy task. To leverage scarce resources we have implemented a layered structure that relies on near-peer mentoring. So a typical research team might include a mix of new and more experienced high school students, a college student, a high school science teacher and a Lamont researcher as a mentor. Graduates of the program are employed to assist with administration. Knowledge and best practices diffuse through the organization in an organic, if not entirely structured, fashion. We have found that a key to long-term funding has been survival: as we have sustained a successful program and developed a model adapted to Lamont's unique environment, we have attracted longer term core financing on which grant-driven extensions can be built. The result is a highly flexible program that is student-centered in the context of a broader research culture connecting our participants with the advantages of working at a premier soft-money research institution.
Simplified subsurface modelling: data assimilation and violated model assumptions
NASA Astrophysics Data System (ADS)
Erdal, Daniel; Lange, Natascha; Neuweiler, Insa
2017-04-01
Integrated models are gaining more and more attention in hydrological modelling, as they can better represent the interaction between different compartments. Naturally, these models come with larger numbers of unknowns and greater demands on computational resources than stand-alone models. If large model domains are to be represented, e.g. at the catchment scale, the resolution of the numerical grid needs to be reduced or the model itself needs to be simplified. Both approaches lead to a reduced ability to reproduce the present processes. This lack of model accuracy may be compensated for by using data assimilation methods. In these methods, observations are used to update the model states, and optionally the model parameters as well, in order to reduce the model error induced by the imposed simplifications. What is unclear is whether these methods combined with strongly simplified models result in completely data-driven models, or whether they can even be used to make adequate predictions of the model state for times when no observations are available. In the current work we consider the combined groundwater and unsaturated zone, which can be modelled in a physically consistent way using 3D models solving the Richards equation. For use in simple predictions, however, simpler approaches may be considered. The question investigated here is whether a simpler model, in which the groundwater is modelled as a horizontal 2D model and the unsaturated zone as a few sparse 1D columns, can be used within an Ensemble Kalman filter to give predictions of groundwater levels and unsaturated fluxes. This is tested under conditions where the feedbacks between the two model compartments are large (e.g. a shallow groundwater table) and the simplification assumptions are clearly violated. Such a case may be a steep hill-slope or pumping wells, creating lateral fluxes in the unsaturated zone, or strongly heterogeneous structures creating unaccounted-for flows in both the saturated and unsaturated compartments. Under such circumstances, direct modelling using a simplified model will not provide good results. However, a more data-driven (e.g. grey-box) approach, driven by the filter, may still provide an improved understanding of the system. Comparisons between full 3D simulations and simplified filter-driven models will be shown and the resulting benefits and drawbacks will be discussed.
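The analysis step of the Ensemble Kalman filter mentioned here is compact enough to sketch in full. This is the textbook perturbed-observation variant with invented numbers, not the authors' configuration:

```python
import numpy as np

# Textbook ensemble Kalman filter analysis step (perturbed observations),
# the kind of update applied to a simplified groundwater model.
def enkf_update(ensemble, obs, obs_op, obs_err_std, rng):
    """ensemble: (n_state, n_members); obs: (n_obs,); obs_op: (n_obs, n_state)."""
    n_obs, n_mem = len(obs), ensemble.shape[1]
    Hx = obs_op @ ensemble
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    HA = Hx - Hx.mean(axis=1, keepdims=True)
    P_hh = HA @ HA.T / (n_mem - 1) + obs_err_std**2 * np.eye(n_obs)
    K = (A @ HA.T / (n_mem - 1)) @ np.linalg.inv(P_hh)     # Kalman gain
    perturbed = obs[:, None] + rng.normal(0, obs_err_std, (n_obs, n_mem))
    return ensemble + K @ (perturbed - Hx)

rng = np.random.default_rng(5)
ens = rng.normal(10.0, 1.0, (50, 100))          # 50 heads, 100 members
H = np.zeros((2, 50)); H[0, 10] = H[1, 40] = 1.0   # observe 2 wells
ens = enkf_update(ens, np.array([9.2, 10.5]), H, 0.1, rng)
print(ens.mean(axis=1)[[10, 40]])               # pulled toward observations
```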
Wei, Qinglai; Song, Ruizhuo; Yan, Pengfei
2016-02-01
This paper is concerned with a new data-driven zero-sum neuro-optimal control problem for continuous-time unknown nonlinear systems with disturbance. According to the input-output data of the nonlinear system, an effective recurrent neural network is introduced to reconstruct the dynamics of the nonlinear system. Considering the system disturbance as a control input, a two-player zero-sum optimal control problem is established. Adaptive dynamic programming (ADP) is developed to obtain the optimal control under the worst case of the disturbance. Three single-layer neural networks, including one critic and two action networks, are employed to approximate the performance index function, the optimal control law, and the disturbance, respectively, for facilitating the implementation of the ADP method. Convergence properties of the ADP method are developed to show that the system state will converge to a finite neighborhood of the equilibrium. The weight matrices of the critic and the two action networks are also convergent to finite neighborhoods of their optimal ones. Finally, the simulation results will show the effectiveness of the developed data-driven ADP methods.
Continuing Education: Market Driven or Learner Centered? Myths and Realities.
ERIC Educational Resources Information Center
Kerka, Sandra
At the heart of the controversy over market-driven continuing education programs is the issue of whether they are necessarily antithetical to the principles and philosophy of adult learning. Opponents identify the following problems of market-driven programs: they perpetuate inequality by neglecting needs of those less able to pay; they may meet…
Parallel line analysis: multifunctional software for the biomedical sciences
NASA Technical Reports Server (NTRS)
Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.
1990-01-01
An easy-to-use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
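The core parallel-line computation, a common slope fitted to both preparations and the potency ratio taken from the horizontal offset between the lines, can be sketched as a small least-squares problem (synthetic data; the FORTRAN program additionally performs the ANCOVA validity tests and confidence limits):

```python
import numpy as np

# Parallel-line assay sketch: fit a common slope to the standard and test
# preparations, then estimate relative potency from the horizontal offset.
log_dose = np.log10(np.tile([1, 2, 4, 8], 2))
is_test = np.repeat([0, 1], 4)
rng = np.random.default_rng(6)
resp = 2.0 + 1.5 * log_dose + 0.6 * is_test + rng.normal(0, 0.05, 8)

# Design matrix: intercept, preparation offset, common slope.
X = np.column_stack([np.ones(8), is_test, log_dose])
(a_std, offset, slope), *_ = np.linalg.lstsq(X, resp, rcond=None)
potency_ratio = 10 ** (offset / slope)           # test relative to standard
print(f"estimated relative potency: {potency_ratio:.2f}")
```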
NASA Astrophysics Data System (ADS)
Robinet, A.; Castelle, B.; Idier, D.; Le Cozannet, G.; Déqué, M.; Charles, E.
2016-12-01
Modeling studies addressing daily to interannual coastal evolution typically relate shoreline change with waves, currents and sediment transport through complex processes and feedbacks. For wave-dominated environments, the main driver (waves) is controlled by the regional atmospheric circulation. Here a simple weather regime-driven shoreline model is developed for a 15-year shoreline dataset (2000-2014) collected at Truc Vert beach, Bay of Biscay, SW France. In all, 16 weather regimes (four per season) are considered. The centroids and occurrences are computed using the ERA-40 and ERA-Interim reanalyses, applying k-means and EOF methods to the anomalies of the 500-hPa geopotential height over the North Atlantic Basin. The weather regime-driven shoreline model explains 70% of the observed interannual shoreline variability. The application of a proven wave-driven equilibrium shoreline model to the same period shows that both models have similar skills at the interannual scale. Relation between the weather regimes and the wave climate in the Bay of Biscay is investigated and the primary weather regimes impacting shoreline change are identified. For instance, the winter zonal regime characterized by a strengthening of the pressure gradient between the Iceland low and the Azores high is associated with high-energy wave conditions and is found to drive an increase in the shoreline erosion rate. The study demonstrates the predictability of interannual shoreline change from a limited number of weather regimes, which opens new perspectives for shoreline change modeling and encourages long-term shoreline monitoring programs.
1989-12-01
The report describes the programmer's model of the hardware utilized in the microcomputers and interrupt-driven serial communication considerations, including support for interrupt procedures and for a larger memory model. The programming model of Table 2.1 is common to the Intel 8088, 8086 and 80x86 series of microprocessors used in the IBM PC/AT.
Open-source chemogenomic data-driven algorithms for predicting drug-target interactions.
Hao, Ming; Bryant, Stephen H; Wang, Yanli
2018-02-06
While novel technologies such as high-throughput screening have advanced, together with significant investment by pharmaceutical companies during the past decades, the success rate for drug development has not improved, prompting researchers to look for new strategies of drug discovery. Drug repositioning is a potential approach to solve this dilemma. However, experimental identification and validation of potential drug targets encoded by the human genome is both costly and time-consuming. Therefore, effective computational approaches have been proposed to facilitate drug repositioning, and these have proved to be successful in drug discovery. Doubtlessly, the availability of open-accessible data from basic chemical biology research and the success of human genome sequencing are crucial to developing effective in silico drug repositioning methods that allow the identification of potential targets for existing drugs. In this work, we review several chemogenomic data-driven computational algorithms with publicly accessible source code for predicting drug-target interactions (DTIs). We organize these algorithms by model properties and model evolutionary relationships. We re-implemented five representative algorithms in the R programming language and compared them by means of mean percentile ranking, a new recall-based evaluation metric in the DTI prediction research field. We anticipate that this review will be objective and helpful to researchers who would like to further improve existing algorithms or need to choose appropriate algorithms to infer potential DTIs in their projects. The source codes for DTI predictions are available at: https://github.com/minghao2016/chemogenomicAlg4DTIpred. Published by Oxford University Press 2018. This work is written by US Government employees and is in the public domain in the US.
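Definitions of mean percentile ranking vary slightly across papers; one plausible reading, averaging the percentile rank of each known target within its drug's score-sorted candidate list, is sketched below (in Python rather than the paper's R):

```python
import numpy as np

# One plausible reading of the recall-based "mean percentile ranking":
# for every known drug-target pair, find the percentile rank of the true
# target within the drug's score-sorted candidates, then average.
def mean_percentile_ranking(scores, true_pairs):
    """scores: (n_drugs, n_targets) prediction matrix;
    true_pairs: list of (drug_idx, target_idx) known interactions."""
    ranks = []
    for d, t in true_pairs:
        order = np.argsort(-scores[d])            # best-scored target first
        rank = np.where(order == t)[0][0]         # 0 = top of the list
        ranks.append(rank / (scores.shape[1] - 1))
    return 100.0 * np.mean(ranks)                 # lower is better

rng = np.random.default_rng(7)
S = rng.random((5, 100))
pairs = [(0, 3), (1, 42), (2, 7)]
for d, t in pairs:
    S[d, t] += 0.5                                # boost the true targets
print(f"MPR: {mean_percentile_ranking(S, pairs):.1f}%")
```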
NASA Technical Reports Server (NTRS)
Goodman, Michael L.; Kwan, Chiman; Ayhan, Bulent; Shang, Eric L.
2017-01-01
There are many flare forecasting models. For an excellent review and comparison of some of them, see Barnes et al. (2016). All these models are successful to some degree, but there is a need for better models. We claim that the most successful models explicitly or implicitly base their forecasts on various estimates of components of the photospheric current density J, based on observations of the photospheric magnetic field B. However, none of the models we are aware of compute the complete J. We seek to develop a better model based on computing the complete photospheric J. Initial results from this model are presented in this talk. We present a data-driven, near-photospheric, 3-D, non-force-free magnetohydrodynamic (MHD) model that computes time series of the total J, and the associated resistive heating rate, in each pixel at the photosphere in the neutral line regions (NLRs) of 14 active regions (ARs). The model is driven by time series of B measured by the Helioseismic & Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO) satellite. Spurious Doppler periods due to SDO orbital motion are filtered out of the time series of B in every AR pixel; errors in B due to these periods can be significant.
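Although the model computes the complete J, the component most commonly derived from vector magnetograms is the vertical one, Jz = (∂By/∂x − ∂Bx/∂y)/μ0 from Ampère's law. A finite-difference sketch on a uniform grid with synthetic field data (not HMI measurements):

```python
import numpy as np

# Vertical current density from horizontal field components via Ampere's
# law, evaluated with centered finite differences on a uniform grid.
MU0 = 4e-7 * np.pi                      # vacuum permeability [H/m]

def vertical_current(bx, by, dx):
    """bx, by: horizontal field components [T] on a (ny, nx) grid;
    dx: pixel size [m]. Returns Jz [A/m^2]."""
    dby_dx = np.gradient(by, dx, axis=1)
    dbx_dy = np.gradient(bx, dx, axis=0)
    return (dby_dx - dbx_dy) / MU0

y, x = np.mgrid[-50:50, -50:50] * 3.6e5         # ~0.5 arcsec pixels [m]
bx, by = -y * 1e-9, x * 1e-9                    # synthetic sheared field [T]
print(vertical_current(bx, by, 3.6e5).mean())   # uniform Jz for this field
```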
Data-driven Applications for the Sun-Earth System
NASA Astrophysics Data System (ADS)
Kondrashov, D. A.
2016-12-01
Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first-principles physical models. However, the equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate the fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from the atmospheric and oceanic sciences that have proved useful for space physics applications, such as a computationally efficient implementation of the Kalman filter in radiation belt modeling, solar wind gap-filling by Singular Spectrum Analysis, and a low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will also be demonstrated.
Data-driven indexing mechanism for the recognition of polyhedral objects
NASA Astrophysics Data System (ADS)
McLean, Stewart; Horan, Peter; Caelli, Terry M.
1992-02-01
This paper is concerned with the problem of searching large model databases. To date, most object recognition systems have concentrated on the problem of matching using simple searching algorithms. This is quite acceptable when the number of object models is small. However, in the future, general purpose computer vision systems will be required to recognize hundreds or perhaps thousands of objects and, in such circumstances, efficient searching algorithms will be needed. The problem of searching a large model database is one which must be addressed if future computer vision systems are to be at all effective. In this paper we present a method we call data-driven feature-indexed hypothesis generation as one solution to the problem of searching large model databases.
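The indexing idea, quantized features keying a hash table whose buckets vote for candidate models, can be sketched in a few lines; the feature tuples and quantization step below are invented for illustration:

```python
# Toy feature indexing for model-database search: quantized feature
# descriptors key a hash table whose buckets vote for candidate models,
# so recognition avoids a linear scan over the whole database.
from collections import defaultdict

index = defaultdict(set)

def add_model(model_id, features, q=0.25):
    for f in features:                       # f: e.g. (angle, length) pair
        key = tuple(round(v / q) for v in f)
        index[key].add(model_id)

def candidates(scene_features, q=0.25):
    votes = defaultdict(int)
    for f in scene_features:
        key = tuple(round(v / q) for v in f)
        for m in index[key]:
            votes[m] += 1
    return sorted(votes, key=votes.get, reverse=True)

add_model("cube", [(90.0, 1.0), (90.0, 1.0)])
add_model("wedge", [(60.0, 1.2), (90.0, 1.0)])
print(candidates([(89.9, 1.02)]))            # both models share this key
```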
Michael, Edwin; Singh, Brajendra K; Mayala, Benjamin K; Smith, Morgan E; Hampton, Scott; Nabrzyski, Jaroslaw
2017-09-27
There are growing demands for predicting the prospects of achieving the global elimination of neglected tropical diseases as a result of the institution of large-scale nation-wide intervention programs by the WHO-set target year of 2020. Such predictions will be uncertain due to the impacts that spatial heterogeneity and scaling effects will have on parasite transmission processes, which will introduce significant aggregation errors into any attempt aiming to predict the outcomes of interventions at the broader spatial levels relevant to policy making. We describe a modeling platform that addresses this problem of upscaling from local settings to facilitate predictions at regional levels by the discovery and use of locality-specific transmission models, and we illustrate the utility of using this approach to evaluate the prospects for eliminating the vector-borne disease, lymphatic filariasis (LF), in sub-Saharan Africa by the WHO target year of 2020 using currently applied or newly proposed intervention strategies. METHODS AND RESULTS: We show how a computational platform that couples site-specific data discovery with model fitting and calibration can allow both learning of local LF transmission models and simulations of the impact of interventions that take a fuller account of the fine-scale heterogeneous transmission of this parasitic disease within endemic countries. We highlight how such a spatially hierarchical modeling tool that incorporates actual data regarding the roll-out of national drug treatment programs and spatial variability in infection patterns into the modeling process can produce more realistic predictions of timelines to LF elimination at coarse spatial scales, ranging from district to country to continental levels. Our results show that when locally applicable extinction thresholds are used, only three countries are likely to meet the goal of LF elimination by 2020 using currently applied mass drug treatments, and that switching to more intensive drug regimens, increasing the frequency of treatments, or switching to new triple drug regimens will be required if LF elimination is to be accelerated in Africa. The proportion of countries that would meet the goal of eliminating LF by 2020 may, however, reach up to 24/36 if the WHO 1% microfilaremia prevalence threshold is used and sequential mass drug deliveries are applied in countries. We have developed and applied a data-driven spatially hierarchical computational platform that uses the discovery of locally applicable transmission models in order to predict the prospects for eliminating the macroparasitic disease, LF, at the coarser country level in sub-Saharan Africa. We show that fine-scale spatial heterogeneity in local parasite transmission and extinction dynamics, as well as the exact nature of intervention roll-outs in countries, will impact the timelines to achieving national LF elimination on this continent.
Yield Hardening of Electrorheological Fluids in Channel Flow
NASA Astrophysics Data System (ADS)
Helal, Ahmed; Qian, Bian; McKinley, Gareth H.; Hosoi, A. E.
2016-06-01
Electrorheological fluids offer potential for developing rapidly actuated hydraulic devices where shear forces or pressure-driven flow are present. In this study, the Bingham yield stress of electrorheological fluids with different particle volume fractions is investigated experimentally in wall-driven and pressure-driven flow modes using measurements in a parallel-plate rheometer and a microfluidic channel, respectively. A modified Krieger-Dougherty model can be used to describe the effects of the particle volume fraction on the yield stress and is in good agreement with the viscometric data. However, significant yield hardening in pressure-driven channel flow is observed and attributed to an increase and eventual saturation of the particle volume fraction in the channel. A phenomenological physical model linking the densification and consequent microstructure to the ratio of the particle aggregation time scale compared to the convective time scale is presented and used to predict the enhancement in yield stress in channel flow, enabling us to reconcile discrepancies in the literature between wall-driven and pressure-driven flows.
Hensler, David; Richardson, Chad L; Brown, Joslyn; Tseng, Christine; DeCamp, Phyllis J; Yang, Amy; Pawlowski, Anna; Ho, Bing; Ison, Michael G
2018-04-01
Prophylaxis with valganciclovir reduces the incidence of cytomegalovirus (CMV) infection following solid organ transplant (SOT). Under-dosing of valganciclovir is associated with an increased risk of CMV infection and development of ganciclovir-resistant CMV. An automated electronic health record (EHR)-based, pharmacist-driven program was developed to optimize dosing of valganciclovir in solid organ transplant recipients at a large transplant center. Two cohorts of kidney, pancreas-kidney, and liver transplant recipients from our center pre-implementation (April 2011-March 2012, n = 303) and post-implementation of the optimization program (September 2012-August 2013, n = 263) had demographic and key outcomes data collected for 1 year post-transplant. The 1-year incidence of CMV infection dropped from 56 (18.5%) to 32 (12.2%; P = .05), and the incidence of breakthrough infections on prophylaxis was nearly halved (61% vs 34%, P = .03) after implementation of the dose optimization program. The hazard ratio of developing CMV was 1.64 (95% CI 1.06-2.60, P = .027) for the pre-implementation group after adjusting for potential confounders. The program also resulted in a numerical reduction in the number of ganciclovir-resistant CMV cases (2 [0.7%] pre-implementation vs 0 post-implementation). An EHR-based, pharmacist-driven valganciclovir dose optimization program was associated with a reduction in CMV infections.
NASA Astrophysics Data System (ADS)
Post, Vincent E. A.; Houben, Georg J.
2017-08-01
Due to the growing vulnerability of low-lying coastal zones to flooding by seawater, there is a current need for studies of the impact of such inundations on fresh groundwater resources. The knowledge in the literature is biased towards tropical atoll environments, and only a few studies have specifically investigated the effect of density-driven downward flow, even though its importance is widely acknowledged. The present study is based on previously unpublished hydrochemical data collected on the island of Baltrum following a devastating storm in 1962, which uniquely show the impact of seawater inundation on a freshwater lens in a siliciclastic aquifer. The field data show that about 3 kg of Cl per m2 of inundated land area, or 18 cm of seawater, infiltrated, and that elevated salinities persisted at the measurement depths of 4 and 6 m for at least 4 years, and for at least 6 years at greater depths. Numerical models support the assertion that the shape of the measured salinographs, i.e., an initial sharp rise in the salt concentration with time followed by a continually slowing decrease, must be attributed to density-driven salt fingering. Models that did not consider density effects fail to simulate the observed patterns. Transient recharge, model dimension, and lateral flow modify the details of the simulation results, but in all models density-driven vertical flow dominates the overall system behaviour. The diminishing importance of density-driven flow at greater depths, however, in combination with slow recharge-driven flow rates, prolongs flushing times and enhances the risk of brackish-water up-coning when pumping is resumed too soon.
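The quoted equivalence between the chloride load and the infiltrated seawater depth can be checked with simple arithmetic; the chloride concentration assumed below (about 19 g per litre, a typical open-ocean value) is an assumption, not a number taken from the study.

```python
# Back-of-the-envelope check: how much seawater must infiltrate per m2
# of land to deliver ~3 kg of chloride? Assumes ~19 g Cl per litre of
# seawater (typical open-ocean value; an assumption, not study data).
cl_load_kg_per_m2 = 3.0
cl_conc_kg_per_l = 0.019
litres_per_m2 = cl_load_kg_per_m2 / cl_conc_kg_per_l   # ~158 L per m2
depth_cm = litres_per_m2 / 10.0                         # 1 L/m2 = 1 mm depth
print(f"{litres_per_m2:.0f} L/m2, i.e. ~{depth_cm:.0f} cm of seawater")
```

This gives roughly 16 cm, consistent with the reported 18 cm once a slightly lower local chloride concentration than the open-ocean value is assumed.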
2014-04-25
EA’s Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL file and generate the corresponding UML...ObjectItemStructure specification shown in Figure 10. Running this script in the relational database server MySQL creates the physical schema that
Bordier, Cecile; Puja, Francesco; Macaluso, Emiliano
2013-01-01
The investigation of brain activity using naturalistic, ecologically-valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g. independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor or cognitive functions. Here, we propose using a biologically-plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, which both involved subjects watching an episode of a TV-series. In Exp 1, we manipulated the presentation by switching on-and-off color, motion and/or sound at variable intervals, whereas in Exp 2, the video was played in the original version, with all the consequent continuous changes of the different sensory features intact. Both for vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to the overall stimulus saliency. Results showed that activity in occipital visual cortex and the superior temporal auditory cortex co-varied with changes of low-level features. Visual saliency was found to further boost activity in extra-striate visual cortex plus posterior parietal cortex, while auditory saliency was found to enhance activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified “sensory” networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes, e.g., these processes could relate to modality, stimulus features and/or saliency. We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom–up signals on brain activity during viewing of complex and dynamic multisensory stimuli, beyond the capability of purely data-driven approaches. PMID:23202431
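A minimal sketch of the hypothesis-driven step described here: a model-derived saliency time course is convolved with a canonical haemodynamic response function (HRF) and entered as a regressor in an ordinary-least-squares GLM. The HRF shape, repetition time, and voxel signal below are simplified stand-ins, not the study's actual pipeline.

```python
import numpy as np
from scipy.stats import gamma

TR = 2.0                      # repetition time in seconds (assumed)
n_scans = 300

# Simplified canonical HRF sampled at the TR (difference of two gammas).
t = np.arange(0, 30, TR)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)
hrf /= hrf.sum()

rng = np.random.default_rng(0)
saliency = rng.random(n_scans)        # stand-in for a model-derived statistic

# Convolution turns the stimulus statistic into a BOLD-scale regressor.
regressor = np.convolve(saliency, hrf)[:n_scans]

# Ordinary least squares GLM fit at a single synthetic voxel.
y = 0.8 * regressor + rng.normal(0, 0.1, n_scans)
X = np.column_stack([regressor, np.ones(n_scans)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated saliency effect: {beta[0]:.2f}")
```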
NASA Astrophysics Data System (ADS)
Vernon, F.; Arrott, M.; Orcutt, J. A.; Mueller, C.; Case, J.; De Wardener, G.; Kerfoot, J.; Schofield, O.
2013-12-01
An approach sophisticated enough to handle a variety of data sources and scales, yet easy enough to promote wide use and mainstream adoption, is required to address the following mappings: from the authored domain of observation to the requested domain of interest; from the authored spatiotemporal resolution to the requested resolution; and from the representation of data placed on a wide variety of discrete mesh types to the use of those data as a continuous field with selectable continuity. The Open Geospatial Consortium's (OGC) Reference Model[1], with its direct association with the ISO 19000 series standards, provides a comprehensive foundation to represent all data on any type of mesh structure, aka "Discrete Coverages". The Reference Model also provides the specification for the core operations required to utilize any Discrete Coverage. The FEniCS Project[2] provides a comprehensive model for representing the basis functions on mesh structures as "Degrees of Freedom", presenting discrete data as continuous fields with variable continuity. In this talk, we present the research and development that the OOI Cyberinfrastructure Project is pursuing to integrate these approaches into a comprehensive Application Programming Interface (API) to author, acquire, and operate on a broad range of data formulations, from time series, trajectories, and tables through to time-variant finite difference grids and finite element meshes.
Small Aircraft Data Distribution System
NASA Technical Reports Server (NTRS)
Chazanoff, Seth L.; Dinardo, Steven J.
2012-01-01
The CARVE Small Aircraft Data Distribution System acquires the aircraft location and attitude data required by the various programs running on a distributed network. The system distributes the data it acquires to the data acquisition programs for inclusion in their data files. It uses UDP (User Datagram Protocol) to broadcast data over a LAN (Local Area Network) to any programs that might have a use for the data. The program is easily adaptable to acquire additional data and log those data to disk. The current version also drives displays using precision pitch and roll information to aid the pilot in maintaining a level attitude for radar/radiometer mapping, beyond the degree achievable by flying visually or by using a standard gyro-driven attitude indicator. The software is designed to acquire an array of data to help the mission manager make real-time decisions about the effectiveness of the flight. These data are displayed for the mission manager and broadcast to the other experiments on the aircraft for inclusion in their data files. The program also drives real-time precision pitch and roll displays for the pilot and copilot to aid them in maintaining the desired attitude, when required, during data acquisition on mapping lines.
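A minimal sketch of the UDP broadcast pattern the system relies on; the port number and packet layout below are invented for illustration, since the article does not specify the actual wire format.

```python
import socket
import struct
import time

BROADCAST_ADDR = ("255.255.255.255", 5005)   # hypothetical port
PACKET_FMT = "<dddd"                         # timestamp, lat, lon, alt (assumed layout)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

def broadcast_state(lat, lon, alt):
    """Broadcast one aircraft state sample to all listeners on the LAN."""
    payload = struct.pack(PACKET_FMT, time.time(), lat, lon, alt)
    sock.sendto(payload, BROADCAST_ADDR)

broadcast_state(64.84, -147.72, 3200.0)   # example fix near Fairbanks
```

Any acquisition program on the LAN can then bind the same port and unpack incoming packets with the matching struct format.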
A practical guide to big data research in psychology.
Chen, Eric Evan; Wojcik, Sean P
2016-12-01
The massive volume of data that now covers a wide variety of human behaviors offers researchers in psychology an unprecedented opportunity to conduct innovative theory- and data-driven field research. This article is a practical guide to conducting big data research, covering data management, acquisition, processing, and analytics (including key supervised and unsupervised learning data mining methods). It is accompanied by walkthrough tutorials on data acquisition, text analysis with latent Dirichlet allocation topic modeling, and classification with support vector machines. Big data practitioners in academia, industry, and the community have built a comprehensive base of tools and knowledge that makes big data research accessible to researchers in a broad range of fields. However, big data research does require knowledge of software programming and a different analytical mindset. For those willing to acquire the requisite skills, innovative analyses of unexpected or previously untapped data sources can offer fresh ways to develop, test, and extend theories. When conducted with care and respect, big data research can become an essential complement to traditional research.
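A toy sketch of the two tutorial techniques named above, topic modeling with latent Dirichlet allocation and classification with support vector machines, using scikit-learn; the corpus and labels are synthetic stand-ins.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.svm import SVC

docs = ["budget deficit tax policy", "election vote campaign rally",
        "neural network training data", "model accuracy test data"]
labels = [0, 0, 1, 1]          # synthetic document labels

# Bag-of-words counts feed the topic model.
X = CountVectorizer().fit_transform(docs)

# Unsupervised step: discover latent topics with LDA.
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)

# Supervised step: classify documents in topic space with an SVM.
clf = SVC(kernel="linear").fit(topics, labels)
print(clf.predict(topics))
```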
ERIC Educational Resources Information Center
Beisel, Raymond W.
This report describes development of the "Prepare Them for the Future" project, a K-3 activity-oriented science curriculum. The program, funded through two grants, was driven by the need to boost the distressed labor-based economy in rural western Pennsylvania. Data showed a drop of 1,100 coal-mining jobs between 1980 and 1986 in Indiana…
A NASTRAN-based computer program for structural dynamic analysis of Horizontal Axis Wind Turbines
NASA Technical Reports Server (NTRS)
Lobitz, Don W.
1995-01-01
This paper describes a computer program developed for structural dynamic analysis of horizontal axis wind turbines (HAWTs). It is based on the finite element method through its reliance on NASTRAN for the development of mass, stiffness, and damping matrices of the tower and rotor, which are treated in NASTRAN as separate structures. The tower is modeled in a stationary frame and the rotor in one rotating at a constant angular velocity. The two structures are subsequently joined together (external to NASTRAN) using a time-dependent transformation consistent with the hub configuration. Aerodynamic loads are computed with an established flow model based on strip theory. Aeroelastic effects are included by incorporating the local velocity and twisting deformation of the blade in the load computation. The turbulent nature of the wind, both in space and time, is modeled by adding in stochastic wind increments. The resulting equations of motion are solved in the time domain using the implicit Newmark-Beta integrator. Preliminary comparisons with data from the Boeing/NASA MOD2 HAWT indicate that the code is capable of accurately and efficiently predicting the response of HAWTs driven by turbulent winds.
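A minimal single-degree-of-freedom sketch of the implicit Newmark-beta scheme mentioned above; the structural parameters are arbitrary, and this is not the NASTRAN-based code itself.

```python
import numpy as np

def newmark_sdof(m, c, k, f, dt, u0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Implicit Newmark-beta integration of m*u'' + c*u' + k*u = f(t)
    for a single degree of freedom (average-acceleration variant)."""
    n = len(f)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (f[0] - c * v0 - k * u0) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    for i in range(n - 1):
        rhs = (f[i + 1]
               + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                      + (0.5 / beta - 1.0) * a[i])
               + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1.0) * v[i]
                      + dt * (0.5 * gamma / beta - 1.0) * a[i]))
        u[i + 1] = rhs / keff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (0.5 / beta - 1.0) * a[i])
        v[i + 1] = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a

# Free vibration of a unit mass-spring system (omega = 1 rad/s): the
# displacement should return to ~1.0 after one period (t = 2*pi).
u, _, _ = newmark_sdof(m=1.0, c=0.0, k=1.0, f=np.zeros(2000), dt=0.01, u0=1.0)
print(f"u after one period: {u[628]:.3f}")
```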
Dynamic Data Driven Methods for Self-aware Aerospace Vehicles
2015-04-08
structural response model that incorporates multiple degradation or failure modes, including damaged panel strength (BVID, thru-hole), damaged panel...stiffness (BVID, thru-hole), loose fastener, fretted fastener hole, and disbonded surface. • A new data-driven approach for the online updating of the flight...between the first and second plies. The panels were reinforced around the borders of the panel with through holes to simulate mounting the wing skins to
Integrating geo web services for a user driven exploratory analysis
NASA Astrophysics Data System (ADS)
Moncrieff, Simon; Turdukulov, Ulanbek; Gulland, Elizabeth-Kate
2016-04-01
In data exploration, several online data sources may need to be dynamically aggregated or summarised over a spatial region, a time interval, or a set of attributes. With respect to thematic data, web services are mainly used to present results, leading to a supplier-driven service model that limits exploration of the data. In this paper we propose a user-need-driven service model based on geo web processing services. The aim of the framework is to provide a method for scalable and interactive access to various geographic data sources on the web. The architecture combines a data query, a processing technique, and a visualisation methodology to rapidly integrate and visually summarise properties of a dataset. We illustrate the environment on a health-related use case that derives the Age Standardised Rate: a dynamic index that requires integrating existing interoperable web services for demographic data with standalone, non-spatial, secure database servers used in health research. Although the example is specific to the health field, the architecture and the proposed approach are relevant and applicable to other fields that require integration and visualisation of geo datasets from various web services, and we believe the approach is generic.
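As a sketch of the summary index mentioned above, direct age standardisation computes a weighted sum of age-specific rates using a standard population; all counts and weights below are made-up illustrative numbers, not real health data.

```python
# Direct age standardisation: weight age-specific rates by standard
# population shares. All numbers are illustrative, not real health data.
age_bands   = ["0-14", "15-44", "45-64", "65+"]
cases       = [    5,      20,      60,   120]   # events per band
population  = [30000,   50000,   40000, 20000]   # persons at risk per band
std_weights = [ 0.25,    0.40,    0.22,  0.13]   # standard population shares

asr = sum(w * c / p for w, c, p in zip(std_weights, cases, population))
print(f"Age Standardised Rate: {asr * 100000:.1f} per 100,000")
```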
LabPatch, an acquisition and analysis program for patch-clamp electrophysiology.
Robinson, T; Thomsen, L; Huizinga, J D
2000-05-01
An acquisition and analysis program, "LabPatch," has been developed for use in patch-clamp research. LabPatch controls any patch-clamp amplifier, acquires and records data, runs voltage protocols, plots and analyzes data, and connects to spreadsheet and database programs. Controls within LabPatch are grouped by function on one screen, much like an oscilloscope front panel. The software is mouse driven, so that the user need only point and click. Finally, the ability to copy data to other programs running in Windows 95/98, and the ability to keep track of experiments using a database, make LabPatch extremely versatile. The system requirements include Windows 95/98, at least a 100-MHz processor and 16 MB RAM, a data acquisition card, digital-to-analog converter, and a patch-clamp amplifier. LabPatch is available free of charge at http://www.fhs.mcmaster.ca/huizinga/.
ISPE: A knowledge-based system for fluidization studies. 1990 Annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize, and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine whether all "specified goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation using inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information), and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER, and the C language is shown in Figure 1.
Procedures and models for estimating preconstruction costs of highway projects.
DOT National Transportation Integrated Search
2012-07-01
This study presents data-driven, component-based PE cost prediction models utilizing critical factors retrieved from ten years of historical project data obtained from the ODOT roadway division. The study used factor analysis of covariance and corr...
Real-time quality monitoring in debutanizer column with regression tree and ANFIS
NASA Astrophysics Data System (ADS)
Siddharth, Kumar; Pathak, Amey; Pani, Ajaya Kumar
2018-05-01
A debutanizer column is an integral part of any petroleum refinery. Online composition monitoring of debutanizer column outlet streams is highly desirable in order to maximize the production of liquefied petroleum gas. In this article, data-driven models for a debutanizer column are developed for real-time composition monitoring. The dataset used has seven process variables as inputs, and the output is the butane concentration in the debutanizer column bottom product. The input-output dataset is divided equally into a training (calibration) set and a validation (testing) set. The training set data were used to develop fuzzy inference, adaptive neuro-fuzzy (ANFIS), and regression tree models for the debutanizer column. The accuracy of the developed models was evaluated by simulating the models with the validation dataset. The ANFIS model is observed to have better estimation accuracy than the other models developed in this work and than many data-driven models proposed so far in the literature for the debutanizer column.
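A minimal sketch of one of the data-driven soft sensors described above, a regression tree trained on an equal calibration/validation split; the process data here are synthetic, and the fuzzy and ANFIS models are not reproduced.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 7))      # seven synthetic process variables
y = 0.5 * X[:, 0] - 0.3 * X[:, 3] + 0.1 * rng.normal(size=2000)  # stand-in butane content

# Equal split into calibration (training) and validation (testing) sets.
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

tree = DecisionTreeRegressor(max_depth=6).fit(X_tr, y_tr)
rmse = mean_squared_error(y_va, tree.predict(X_va)) ** 0.5
print(f"validation RMSE: {rmse:.3f}")
```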
Asynchronous Data Retrieval from an Object-Oriented Database
NASA Astrophysics Data System (ADS)
Gilbert, Jonathan P.; Bic, Lubomir
We present an object-oriented semantic database model which, similar to other object-oriented systems, combines the virtues of four concepts: the functional data model, a property inheritance hierarchy, abstract data types and message-driven computation. The main emphasis is on the last of these four concepts. We describe generic procedures that permit queries to be processed in a purely message-driven manner. A database is represented as a network of nodes and directed arcs, in which each node is a logical processing element, capable of communicating with other nodes by exchanging messages. This eliminates the need for shared memory and for centralized control during query processing. Hence, the model is suitable for implementation on a multiprocessor computer architecture, consisting of large numbers of loosely coupled processing elements.
Factors associated with children being driven to school: implications for walk to school programs.
Wen, Li Ming; Fry, Denise; Rissel, Chris; Dirkis, Helen; Balafas, Angela; Merom, Dafna
2008-04-01
In this study, we examined factors associated with children being driven to school. Participants were 1603 students (aged 9-11 years) and their parents from 24 public primary schools in inner western Sydney, Australia. Students recorded their modes of travel to and from school for 5 days in a student survey. Parents recorded their demographic data, their attitudes to travel, and their modes of travel to work, using a self-administered survey. An analysis of the two linked data sets found that 41% of students travelled by car to or from school for more than 5 trips per week. Almost a third (32%) of students walked all the way. Only 1% of students rode a bike and 22% used more than one mode of travel. Of those who were driven, 29% lived less than 1 km and a further 18% lived between 1 and 1.5 km from school. Factors associated with car travel (after adjusting for other potential confounders) were mode of parents' travel to work, parent attitudes, number of cars in the household, and distance from home to school. To be effective, walk to school programs need to address the link between parent journey to work and student journey to school.
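A sketch of the kind of adjusted analysis reported above, a logistic regression of car travel on several factors at once; the variable names, effect sizes, and data below are invented stand-ins, not the study's actual model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1603
# Synthetic stand-ins for the measured factors (all invented).
distance_km   = rng.exponential(1.5, n)
cars_in_house = rng.integers(0, 3, n)
parent_drives = rng.integers(0, 2, n)           # parent drives to work?

# Simulate the outcome from an assumed true model, then re-estimate it.
logit = -1.5 + 0.9 * distance_km + 0.5 * cars_in_house + 0.8 * parent_drives
driven = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([distance_km, cars_in_house, parent_drives]))
fit = sm.Logit(driven, X).fit(disp=0)
print("adjusted odds ratios:", np.exp(fit.params[1:]))
```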
Balakrishnan, Ilango; Harris, Peter; Birks, Diane K; Griesinger, Andrea; Amani, Vladimir; Cristiano, Brian; Remke, Marc; Taylor, Michael D; Handler, Michael; Foreman, Nicholas K; Vibhakar, Rajeev
2014-01-01
Medulloblastoma is a pediatric brain tumor with a variable prognosis due to clinical and genomic heterogeneity. Among the 4 major genomic sub-groups, patients with MYC-amplified tumors have a particularly poor prognosis despite therapy with surgery, radiation, and chemotherapy. Targeting the MYC oncogene has traditionally been problematic. Here we report that MYC-driven medulloblastoma can be targeted by inhibition of the bromodomain protein BRD4. We show that bromodomain inhibition with JQ1 restricts c-MYC-driven transcriptional programs in medulloblastoma, suppresses medulloblastoma cell growth, and induces cell cycle arrest. Importantly, JQ1 suppresses stem cell-associated signaling in medulloblastoma cells and inhibits medulloblastoma tumor cell self-renewal. Additionally, JQ1 promotes senescence in medulloblastoma cells by activating cell cycle kinase inhibitors and inhibiting the activity of E2F1. Furthermore, BRD4 inhibition displayed an anti-proliferative, pro-senescence effect in a medulloblastoma model in vivo. In clinical samples, we found that transcriptional programs suppressed by JQ1 are associated with adverse risk in medulloblastoma patients. Our work indicates that BRD4 inhibition attenuates stem cell signaling in MYC-driven medulloblastoma and demonstrates the feasibility of BET bromodomain inhibition as a therapeutic approach in vivo. PMID:24796395
Ambrose, Adrian Jacques H; Andaya, January M; Yamada, Seiji; Maskarinec, Gregory G
2014-08-01
In the current rapidly evolving healthcare environment of the United States, social justice programs in pre-medical and medical education are needed to cultivate socially conscious health professionals inclined toward interdisciplinary collaborations. To address ongoing healthcare inequalities, medical education must help medical students become physicians skilled not only in the biomedical management of diseases, but also in identifying and addressing the social and structural determinants of patients' daily lives. Using a longitudinal Problem-Based Learning (PBL) methodology, the medical students and faculty advisers at the University of Hawai'i John A. Burns School of Medicine (JABSOM) developed the Social Justice Curriculum Program (SJCP) to supplement the biomedical curriculum. The SJCP consists of three components: (1) active self-directed learning and didactics, (2) implementation and action, and (3) self-reflection and personal growth. The purpose of introducing a student-driven SJ curriculum is to expose students to the various components of SJ in health and medicine, and to maximize engagement by using their own inputs for content and design. It is our hope that the SJCP will serve as a logistic and research-oriented model for future student-driven SJ programs that respond to global health inequalities by cultivating skills and interest in leadership and community service.
Parallel Computation of Ocean-Atmosphere-Wave Coupled Storm Surge Model
NASA Astrophysics Data System (ADS)
Kim, K.; Yamashita, T.
2003-12-01
Ocean-atmosphere interactions are very important in the formation and development of tropical storms. These interactions dominate the exchange of heat, momentum, and moisture fluxes. Heat flux is usually computed using a bulk equation, in which the air-sea interface supplies heat energy to the atmosphere and to the storm. The dynamical interaction is most often one-way: it is the atmosphere that drives the ocean. The winds transfer momentum to both ocean surface waves and ocean currents. Wind waves play an important role in the exchange of momentum, heat, and matter between the atmosphere and the ocean. Storm surges can be considered as phenomena of mean sea-level change resulting from the frictional stresses of strong winds blowing toward the land, which cause wind set-up; the low atmospheric pressure at the centre of the cyclone can additionally raise the sea level. In addition to this rise in water level, another wave factor must be considered: a rise of mean sea level due to white-cap wave dissipation. In bounded bodies of water, such as small seas, wind-driven sea-level set-up is much more serious than the inverted-barometer effect, and the effects of wind waves on the wind-driven current play an important role. It is therefore necessary to develop a coupled system of a full-spectral third-generation wind-wave model (WAM or WAVEWATCH III), a mesoscale atmosphere model (MM5), and a coastal ocean model (POM) to simulate these physical interactions. As such a coupled system is too heavy for personal computing, a parallel computing system should be developed. In this study we first developed, on a Beowulf cluster, a coupling system of the atmosphere model, the ocean wave model, and the coastal ocean model for the simulation of storm surge. It was applied to the simulation of the storm surge caused by Typhoon Bart (T9918) in the Yatsushiro Sea. The atmosphere model and the ocean model were parallelized using the SPMD method. The wave-current interface model was developed by defining the wave-breaking stresses. We also developed a coupling program to collect and distribute the exchanged data within the parallel system. All models and the coupler execute at the same time; each performs its own computation and exchanges data in a coordinated manner. MPMD programming was used to couple the models: the coupler and each model are organized into separate groups, computation proceeds per group, and messages are passed globally when data are exchanged. The data are exchanged every 60 seconds of model time, the least common multiple of the time steps of the atmosphere, wave, and ocean models. The model was applied to storm surge simulation in the Yatsushiro Sea, where a numerical model that did not include the wave-breaking stress could not reproduce the observed maximum surge height. The simulation including the wave-breaking stress effects reproduced the observed maximum height of 450 cm at Matsuai.
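A schematic of the exchange scheduling only: three components with different internal time steps synchronize at their least common multiple. The step sizes below are assumed for illustration, and the real system used MPI process groups for MM5, WAM/WAVEWATCH III, and POM rather than this toy loop.

```python
# Toy coupler: three components with different time steps exchange fields
# every 60 s of model time, their least common multiple. Step sizes are
# illustrative, not the actual model settings.
from math import lcm

steps = {"atmosphere": 20, "wave": 30, "ocean": 12}   # seconds (assumed)
exchange_every = lcm(*steps.values())                 # 60 s

def advance(name, dt):
    # Placeholder for one internal time step of a component model.
    pass

t = 0
while t < 300:
    for name, dt in steps.items():
        for _ in range(exchange_every // dt):
            advance(name, dt)
    t += exchange_every
    print(f"t = {t:3d} s: coupler gathers and redistributes fluxes/stresses")
```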
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experience has demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
Multidimensional Data Modeling for Business Process Analysis
NASA Astrophysics Data System (ADS)
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.
Hydrodynamical and Spectral Simulations of HMXB Winds
NASA Astrophysics Data System (ADS)
Mauche, Christopher W.; Liedahl, D. A.; Plewa, T.
2006-09-01
We describe the results of a research program to develop improved models of the X-ray spectra of cosmic sources such as X-ray binaries, CVs, and AGN in which UV line-driven mass flows are photoionized by an X-ray source. Work to date has focused on high-mass X-ray binaries (HMXBs) and on Vela X-1 in particular, for which there are high-quality Chandra HETG spectra in the archive. Our research program combines FLASH hydrodynamic calculations, XSTAR photoionization calculations, HULLAC atomic data, improved calculations of the line force multiplier, X-ray emission models appropriate to X-ray photoionized plasmas, and Monte Carlo radiation transport. We will present movies of the relevant physical quantities (density, temperature, ionization parameter, velocity) from a FLASH two-dimensional time-dependent simulation of Vela X-1, maps showing the emissivity distributions of the X-ray emission lines, and a preliminary comparison of the resulting synthetic spectra to the Chandra HETG spectra. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
Status of Ongoing Work in Software TRAs/TRLs
2010-04-29
to changes/updates being driven by corporate market dynamics • Changes not under control or under the influence of the PMO! • On programs with long...observed and reported: research articles, peer-reviewed white papers, point papers, early conceptual models in academia, experimental basic research
Treatment Effects for Adolescent Struggling Readers: An Application of Moderated Mediation
ERIC Educational Resources Information Center
Roberts, Greg; Fletcher, Jack M.; Stuebing, Karla K.; Barth, Amy E.; Vaughn, Sharon
2013-01-01
This study used multigroup structural equations to evaluate the possibility that a theory-driven, evidence-based, yearlong reading program for sixth-grade struggling readers moderates the interrelationships among elements of the simple model of reading (i.e., listening comprehension, word reading, and reading comprehension; Hoover & Gough,…
Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.
Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir
2013-10-31
Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology.
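As a minimal example of the hypothesis-based strand, the sketch below integrates the standard target-cell-limited model of viral dynamics with SciPy; the parameter values are generic illustrative choices, not estimates from the reviewed studies.

```python
import numpy as np
from scipy.integrate import odeint

def target_cell_model(y, t, lam, d, beta, delta, p, c):
    """Standard target-cell-limited viral dynamics model:
    T' = lam - d*T - beta*T*V,  I' = beta*T*V - delta*I,  V' = p*I - c*V."""
    T, I, V = y
    return [lam - d * T - beta * T * V,
            beta * T * V - delta * I,
            p * I - c * V]

t = np.linspace(0, 30, 300)                 # days
y0 = [1e6, 0.0, 10.0]                       # target cells, infected cells, virions
sol = odeint(target_cell_model, y0, t,
             args=(1e4, 0.01, 2e-7, 1.0, 100.0, 5.0))   # illustrative parameters
print("peak viral load ~ %.2e" % sol[:, 2].max())
```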
Simulating Descent and Landing of a Spacecraft
NASA Technical Reports Server (NTRS)
Balaram, J.; Jain, Abhinandan; Martin, Bryan; Lim, Christopher; Henriquez, David; McMahon, Elihu; Sohl, Garrett; Banerjee, Pranab; Steele, Robert; Bentley, Timothy
2005-01-01
The Dynamics Simulator for Entry, Descent, and Surface landing (DSENDS) software performs high-fidelity simulation of the Entry, Descent, and Landing (EDL) of a spacecraft into the atmosphere and onto the surface of a planet or a smaller body. DSENDS is an extension of the DShell and DARTS programs, which afford capabilities for mathematical modeling of the dynamics of a spacecraft as a whole and of its instruments, actuators, and other subsystems. DSENDS enables the modeling (including real-time simulation) of flight-train elements and all spacecraft responses during various phases of EDL. DSENDS provides high-fidelity models of the aerodynamics of entry bodies and parachutes plus supporting models of atmospheres. Terrain and real-time responses of terrain-imaging radar and lidar instruments can also be modeled. The program includes modules for simulation of guidance, navigation, hypersonic steering, and powered descent. Automated state-machine-driven model switching is used to represent spacecraft separations and reconfigurations. Models for computing landing contact and impact forces are expected to be added. DSENDS can be used as a stand-alone program or incorporated into a larger program that simulates operations in real time.
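A toy illustration of automated state-machine-driven model switching of the sort described above; the phase names and trigger conditions are simplified inventions, not DSENDS internals.

```python
# Toy state machine switching simulation models across EDL phases.
# Phase names and trigger conditions are simplified placeholders.
TRANSITIONS = {
    "entry":             ("parachute_descent", lambda s: s["dyn_pressure"] < 800.0),
    "parachute_descent": ("powered_descent",   lambda s: s["altitude"] < 2000.0),
    "powered_descent":   ("landed",            lambda s: s["altitude"] <= 0.0),
}

def step_phase(phase, state):
    """Advance to the next phase when its trigger condition is met."""
    if phase in TRANSITIONS:
        nxt, cond = TRANSITIONS[phase]
        if cond(state):
            print(f"switching models: {phase} -> {nxt}")
            return nxt
    return phase

phase = "entry"
for state in [{"dyn_pressure": 1200, "altitude": 30000},
              {"dyn_pressure": 600,  "altitude": 9000},
              {"dyn_pressure": 50,   "altitude": 1500},
              {"dyn_pressure": 0,    "altitude": 0}]:
    phase = step_phase(phase, state)
```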
NPLOT: an Interactive Plotting Program for NASTRAN Finite Element Models
NASA Technical Reports Server (NTRS)
Jones, G. K.; Mcentire, K. J.
1985-01-01
NPLOT (NASTRAN Plot) is an interactive computer graphics program for plotting undeformed and deformed NASTRAN finite element models. Developed at NASA's Goddard Space Flight Center, the program provides flexible element selection and grid point, ASET, and SPC degree-of-freedom labelling. It is easy to use and provides a combination menu- and command-driven user interface. NPLOT also provides very fast hidden-line and haloed-line algorithms. The hidden-line algorithm in NPLOT proved to be both very accurate and several times faster than other existing hidden-line algorithms. A fast spatial bucket sort and horizon edge computation are used to achieve this high level of performance. The hidden-line and haloed-line algorithms are the primary features that distinguish NPLOT from other plotting programs.
Sridhar, Vishnu B; Tian, Peifang; Dale, Anders M; Devor, Anna; Saisan, Payam A
2014-01-01
We present a database client software, Neurovascular Network Explorer 1.0 (NNE 1.0), that uses a MATLAB®-based Graphical User Interface (GUI) for interaction with a database of 2-photon single-vessel diameter measurements from our previous publication (Tian et al., 2010). These data are of particular interest for modeling the hemodynamic response. NNE 1.0 is downloaded by the user and then runs either as a MATLAB script or as a standalone program on a Windows platform. The GUI allows browsing the database according to parameters specified by the user, simple manipulation and visualization of the retrieved records (such as averaging and peak-normalization), and export of the results. Further, we provide the NNE 1.0 source code. With this source code, the user can organize their own experimental results into a database, given the appropriate data structure and naming conventions, and thus share their data in a user-friendly format with other investigators. NNE 1.0 provides an example of a seamless and low-cost solution for sharing experimental data by a regular-size neuroscience laboratory and may serve as a general template, facilitating dissemination of biological results and accelerating data-driven modeling approaches.
Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe
2013-11-01
In this article, we present a newly designed inverse umbrella surface aerator and test its performance in driving the flow of an oxidation ditch. Results show that it performs better in driving the oxidation ditch than the original design, with a higher average velocity and a more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. An improved momentum source term approach to simulating the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four turbulence models were investigated with the approach, including the standard k-ε model, the RNG k-ε model, the realizable k-ε model, and the Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame (MRF) and sliding mesh (SM) approaches. Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than that of MRF and close to that of SM. The momentum source term approach also has lower computational expense, is simpler to preprocess, and is easier to use.
Modeling Cable and Guide Channel Interaction in a High-Strength Cable-Driven Continuum Manipulator
Moses, Matthew S.; Murphy, Ryan J.; Kutzer, Michael D. M.; Armand, Mehran
2016-01-01
This paper presents several mechanical models of a high-strength cable-driven dexterous manipulator designed for surgical procedures. A stiffness model is presented that distinguishes between contributions from the cables and the backbone. A physics-based model incorporating cable friction is developed and its predictions are compared with experimental data. The data show that under high tension and high curvature, the shape of the manipulator deviates significantly from a circular arc. However, simple parametric models can fit the shape with good accuracy. The motivating application for this study is to develop a model so that shape can be predicted using easily measured quantities such as tension, so that real-time navigation may be performed, especially in minimally-invasive surgical procedures, while reducing the need for hazardous imaging methods such as fluoroscopy. PMID:27818607
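A sketch of fitting the circular-arc (constant-curvature) parametric model mentioned above to measured backbone points, using an algebraic least-squares circle fit on synthetic data; large residuals from such a fit would flag the non-circular shapes reported under high tension and curvature.

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares circle fit (Kasa method): solves
    x^2 + y^2 + a*x + b*y + c = 0 for a, b, c."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    a, b, c = sol
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx**2 + cy**2 - c)
    return cx, cy, r

# Synthetic backbone points on a quarter arc of radius 40 with small noise.
theta = np.linspace(0, np.pi / 2, 20)
rng = np.random.default_rng(3)
x = 40.0 * np.cos(theta) + rng.normal(0, 0.1, 20)
y = 40.0 * np.sin(theta) + rng.normal(0, 0.1, 20)

cx, cy, r = fit_circle(x, y)
print(f"fitted radius: {r:.1f} (curvature {1.0 / r:.4f})")
```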
A suite of R packages for web-enabled modeling and analysis of surface waters
NASA Astrophysics Data System (ADS)
Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.
2014-12-01
Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.
CLARA: CLAS12 Reconstruction and Analysis Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo
2016-11-01
In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and a service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven, data-centric architecture of CLARA presents an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
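A toy flavour of the flow-based-programming style described above: independent services consume and emit transient data objects and are chained into an application. The service names and data fields are invented for illustration and are not the CLARA API.

```python
# Toy flow-based composition: each "service" consumes a transient data
# object and emits a new one; a chain of services forms an application.
# Names and fields are invented for illustration, not CLARA's API.
def decode(event):
    return {"hits": event["raw"] * 2}       # stand-in decoding service

def track(data):
    return {"tracks": data["hits"] - 1}     # stand-in tracking service

def write_out(data):
    print("reconstructed:", data)           # terminal sink service

PIPELINE = [decode, track, write_out]

def run(event):
    data = event
    for service in PIPELINE:
        data = service(data) or data        # stream flows service to service
    return data

for raw in range(3):
    run({"raw": raw})
```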