A framework for the nationwide multimode transportation demand analysis.
DOT National Transportation Integrated Search
2010-09-01
This study attempts to analyze the impact of traffic on the US highway system considering both passenger vehicles and trucks. For the analysis, a pseudo-dynamic traffic assignment model is proposed to estimate the time-dependent link flow from th...
Deriving flow directions for coarse-resolution (1-4 km) gridded hydrologic modeling
NASA Astrophysics Data System (ADS)
Reed, Seann M.
2003-09-01
The National Weather Service Hydrology Laboratory (NWS-HL) is currently testing a grid-based distributed hydrologic model at a resolution (4 km) commensurate with operational, radar-based precipitation products. To implement distributed routing algorithms in this framework, a flow direction must be assigned to each model cell. A new algorithm, referred to as cell outlet tracing with an area threshold (COTAT), has been developed to automatically, accurately, and efficiently assign flow directions to coarse-resolution grid cells using information from any higher-resolution digital elevation model. Although similar to previously published algorithms, this approach offers some advantages. Use of an area threshold allows more control over the tendency to produce diagonal flow directions. Analyses of results at output resolutions ranging from 300 m to 4000 m indicate that it is possible to choose an area threshold that produces minimal differences in average network flow lengths across this range of scales. Flow direction grids at a 4 km resolution have been produced for the conterminous United States.
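The abstract does not reproduce the COTAT algorithm itself. As an illustration only of the underlying task — assigning one of eight flow directions to each grid cell from elevations — here is a minimal steepest-descent (D8-style) sketch; COTAT additionally traces cell outlets on a finer DEM and applies an area threshold, which this sketch omits.

```python
import numpy as np

# The eight neighbor offsets (row, col) a cell may drain toward.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_directions(dem):
    """Assign each cell the index of its steepest-descent neighbor.

    Returns -1 for pits/cells with no downhill neighbor. Distance-weighted
    drop is used so diagonal neighbors are not unfairly favored.
    """
    rows, cols = dem.shape
    dirs = np.full((rows, cols), -1, dtype=int)  # -1 marks pits
    for r in range(rows):
        for c in range(cols):
            best_drop, best_k = 0.0, -1
            for k, (dr, dc) in enumerate(OFFSETS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = (dr * dr + dc * dc) ** 0.5
                    drop = (dem[r, c] - dem[rr, cc]) / dist
                    if drop > best_drop:
                        best_drop, best_k = drop, k
            dirs[r, c] = best_k
    return dirs

dem = np.array([[3.0, 2.0, 1.0],
                [3.0, 2.0, 1.0],
                [3.0, 2.0, 0.5]])
print(d8_directions(dem))
```

On this toy DEM every cell drains east toward the low corner, which is flagged as a pit.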
NASA Astrophysics Data System (ADS)
Li, Xue-yan; Li, Xue-mei; Yang, Lingrun; Li, Jing
2018-07-01
Most previous studies on dynamic traffic assignment are based on a traditional analytical framework; for instance, the idea of Dynamic User Equilibrium has been widely used to depict both route choice and departure-time choice. However, some recent studies have demonstrated that dynamic traffic flow assignment depends largely on travelers' degree of rationality, travelers' heterogeneity, and what traffic information the travelers have. In this paper, we develop a new self-adaptive multi-agent model to depict travelers' behavior in Dynamic Traffic Assignment. We use Cumulative Prospect Theory with heterogeneous reference points to illustrate travelers' bounded rationality. We use a reinforcement-learning model to depict travelers' route and departure-time choice behavior under the condition of imperfect information. We design the evolution rule of travelers' expected arrival time and the algorithm of traffic flow assignment. Compared with the traditional model, the self-adaptive multi-agent model proposed in this paper can effectively help travelers avoid the rush hour. Finally, we report and analyze the effect of travelers' group behavior on the transportation system, and give some insights into the relation between travelers' group behavior and the performance of the transportation system.
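The abstract does not give the model's equations. As a hedged illustration of the Cumulative Prospect Theory building block it mentions, here is a sketch of the standard value and probability-weighting functions often used for perceived travel time; the parameter values are the conventional Tversky-Kahneman estimates, not necessarily those of this paper, and the reference point is fixed at zero for simplicity.

```python
def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value of outcome x relative to a reference point at 0:
    gains are concave, losses convex and weighted more heavily (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small p overweighted, large p underweighted."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

# A traveler weighing an uncertain 10-minute saving (prob 0.5) against an
# equally likely 5-minute loss relative to the expected arrival time:
prospect = weight(0.5) * cpt_value(10.0) + weight(0.5) * cpt_value(-5.0)
print(round(prospect, 3))
```

Because losses loom larger than gains, the prospect comes out negative even though the expected time saving is positive, which is the kind of bounded-rationality effect the paper models.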
Zhang, Xuejun; Lei, Jiaxing
2015-01-01
Considering reducing the airspace congestion and the flight delay simultaneously, this paper formulates the airway network flow assignment (ANFA) problem as a multiobjective optimization model and presents a new multiobjective optimization framework to solve it. Firstly, an effective multi-island parallel evolution algorithm with multiple evolution populations is employed to improve the optimization capability. Secondly, the nondominated sorting genetic algorithm II is applied for each population. In addition, a cooperative coevolution algorithm is adapted to divide the ANFA problem into several low-dimensional biobjective optimization problems which are easier to deal with. Finally, in order to maintain the diversity of solutions and to avoid prematurity, a dynamic adjustment operator based on solution congestion degree is specifically designed for the ANFA problem. Simulation results using the real traffic data from China air route network and daily flight plans demonstrate that the proposed approach can improve the solution quality effectively, showing superiority to the existing approaches such as the multiobjective genetic algorithm, the well-known multiobjective evolutionary algorithm based on decomposition, and a cooperative coevolution multiobjective algorithm as well as other parallel evolution algorithms with different migration topology. PMID:26180840
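NSGA-II, cited above, is built around nondominated sorting. As an illustrative sketch only (not the paper's multi-island parallel framework), here is the extraction of the first nondominated front for a bi-objective minimization problem such as congestion vs. delay:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(points):
    """Solutions not dominated by any other point: the Pareto front of rank 1."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Five candidate assignments scored on (congestion, delay), both minimized.
pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(first_front(pts))
```

Here (3, 3) and (5, 5) are dominated by (2, 2); the remaining three points form the trade-off front a decision-maker would choose among.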
VisFlow - Web-based Visualization Framework for Tabular Data with a Subset Flow Model.
Yu, Bowen; Silva, Claudio T
2017-01-01
Data flow systems allow the user to design a flow diagram that specifies the relations between system components which process, filter or visually present the data. Visualization systems may benefit from user-defined data flows as an analysis typically consists of rendering multiple plots on demand and performing different types of interactive queries across coordinated views. In this paper, we propose VisFlow, a web-based visualization framework for tabular data that employs a specific type of data flow model called the subset flow model. VisFlow focuses on interactive queries within the data flow, overcoming the limitation of interactivity from past computational data flow systems. In particular, VisFlow applies embedded visualizations and supports interactive selections, brushing and linking within a visualization-oriented data flow. The model requires all data transmitted by the flow to be a data item subset (i.e. groups of table rows) of some original input table, so that rendering properties can be assigned to the subset unambiguously for tracking and comparison. VisFlow features the analysis flexibility of a flow diagram, and at the same time reduces the diagram complexity and improves usability. We demonstrate the capability of VisFlow on two case studies with domain experts on real-world datasets showing that VisFlow is capable of accomplishing a considerable set of visualization and analysis tasks. The VisFlow system is available as open source on GitHub.
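The following toy sketch (not VisFlow's actual API) illustrates the subset flow idea described above: every node receives and emits subsets of rows of one original table, so rendering properties can be attached to rows unambiguously as they move through the diagram.

```python
# A single source table; nodes pass around subsets of its row ids.
table = [
    {"id": 0, "mpg": 33, "cyl": 4},
    {"id": 1, "mpg": 18, "cyl": 8},
    {"id": 2, "mpg": 25, "cyl": 4},
]

def filter_node(subset, pred):
    """A data-flow node: maps a subset (list of row ids) to a smaller subset."""
    return [i for i in subset if pred(table[i])]

def render_node(subset, color):
    """Assign a rendering property to each row of the incoming subset."""
    return {i: color for i in subset}

# Compose nodes as a flow diagram: source -> filter -> renderer.
selected = filter_node([0, 1, 2], lambda r: r["cyl"] == 4)
props = render_node(selected, "red")
print(selected, props)
```

Because every edge carries row ids of the same source table, a brushed selection in one view can be recolored consistently in every downstream view.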
Belcher, Wayne R.; Faunt, Claudia C.; D'Agnese, Frank A.
2002-01-01
The U.S. Geological Survey, in cooperation with the Department of Energy and other Federal, State, and local agencies, is evaluating the hydrogeologic characteristics of the Death Valley regional ground-water flow system. The ground-water flow system covers an area of about 100,000 square kilometers from latitude 35° to 38°15' North and longitude 115° to 118° West, with the flow system proper comprising about 45,000 square kilometers. The Death Valley regional ground-water flow system is one of the larger flow systems within the Southwestern United States and includes in its boundaries the Nevada Test Site, Yucca Mountain, and much of Death Valley. Part of this study includes the construction of a three-dimensional hydrogeologic framework model to serve as the foundation for the development of a steady-state regional ground-water flow model. The digital framework model provides a computer-based description of the geometry and composition of the hydrogeologic units that control regional flow. The framework model of the region was constructed by merging two previous framework models constructed for the Yucca Mountain Project and the Environmental Restoration Program Underground Test Area studies at the Nevada Test Site. The hydrologic characteristics of the region result from a currently arid climate and complex geology. Interbasinal regional ground-water flow occurs through a thick carbonate-rock sequence of Paleozoic age, a locally thick volcanic-rock sequence of Tertiary age, and basin-fill alluvium of Tertiary and Quaternary age. Throughout the system, deep and shallow ground-water flow may be controlled by extensive and pervasive regional and local faults and fractures. The framework model was constructed using data from several sources to define the geometry of the regional hydrogeologic units.
These data sources include (1) a 1:250,000-scale hydrogeologic-map compilation of the region; (2) regional-scale geologic cross sections; (3) borehole information, and (4) gridded surfaces from a previous three-dimensional geologic model. In addition, digital elevation model data were used in conjunction with these data to define ground-surface altitudes. These data, properly oriented in three dimensions by using geographic information systems, were combined and gridded to produce the upper surfaces of the hydrogeologic units used in the flow model. The final geometry of the framework model is constructed as a volumetric model by incorporating the intersections of these gridded surfaces and by applying fault truncation rules to structural features from the geologic map and cross sections. The cells defining the geometry of the hydrogeologic framework model can be assigned several attributes such as lithology, hydrogeologic unit, thickness, and top and bottom altitudes.
The Numerical Simulation of Time Dependent Flow Structures Over a Natural Gravel Surface.
NASA Astrophysics Data System (ADS)
Hardy, R. J.; Lane, S. N.; Ferguson, R. I.; Parsons, D. R.
2004-05-01
Research undertaken over the last few years has demonstrated the importance of the structure of gravel river beds for understanding the interaction between fluid flow and sediment transport processes. This includes the observation of periodic high-speed fluid wedges interconnected by low-speed flow regions. Our understanding of these flows has been enhanced significantly through a series of laboratory experiments and supported by field observations. However, the potential of high resolution three dimensional Computational Fluid Dynamics (CFD) modeling has yet to be fully developed. This is largely the result of the problems of designing numerically stable meshes for use with complex bed topographies and that Reynolds averaged turbulence schemes are applied. This paper develops two novel techniques for dealing with these issues. The first is the development and validation of a method for representing the complex surface topography of gravel-bed rivers in high resolution three-dimensional computational fluid dynamic models. This is based upon a porosity treatment with a regular structured grid and the application of a porosity modification to the mass conservation equation in which: fully blocked cells are assigned a porosity of zero; fully unblocked cells are assigned a porosity of one; and partly blocked cells are assigned a porosity of between 0 and 1, according to the percentage of the cell volume that is blocked. The second is the application of Large Eddy Simulation (LES) which enables time dependent flow structures to be numerically predicted over the complex bed topographies. The regular structured grid with the embedded porosity algorithm maintains a constant grid cell size throughout the domain implying a constant filter scale for the LES simulation. 
This enables the prediction of coherent structures, repetitive quasi-cyclic large-scale turbulent motions, over the gravel surface which are of a similar magnitude and frequency to those previously observed in both flume and field studies. These structures are formed by topographic forcing within the domain and are scaled with the flow depth. Finally, this provides the numerical framework for the prediction of sediment transport within a time dependent framework. The turbulent motions make a significant contribution to the turbulent shear stress and the pressure fluctuations which significantly affect the forces acting on the bed and potentially control sediment motion.
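The porosity treatment described above has a simple core: each cell of the regular grid gets a porosity equal to its unblocked volume fraction. The sketch below is a minimal one-dimensional illustration under the assumption that the bed surface cuts a cell horizontally; the actual method computes the blocked fraction of the full 3-D cell volume.

```python
def cell_porosity(cell_bottom, cell_top, bed_elevation):
    """Fraction of the cell volume above the local bed surface, in [0, 1].

    Fully blocked cells get 0, fully open cells get 1, and partly blocked
    cells get the fraction of the cell that lies above the bed.
    """
    if bed_elevation >= cell_top:      # cell entirely inside the bed
        return 0.0
    if bed_elevation <= cell_bottom:   # cell entirely open to the flow
        return 1.0
    return (cell_top - bed_elevation) / (cell_top - cell_bottom)

# A cell spanning elevations 0-1 m with the bed at 0.25 m is 75% open:
print(cell_porosity(0.0, 1.0, 0.25))
```

This porosity then multiplies the mass-conservation terms, letting a regular structured grid (and hence a constant LES filter scale) represent an irregular gravel surface.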
Fleet Assignment Using Collective Intelligence
NASA Technical Reports Server (NTRS)
Antoine, Nicolas E.; Bieniawski, Stefan R.; Kroo, Ilan M.; Wolpert, David H.
2004-01-01
Product distribution theory is a new collective intelligence-based framework for analyzing and controlling distributed systems. Its usefulness in distributed stochastic optimization is illustrated here through an airline fleet assignment problem. This problem involves the allocation of aircraft to a set of flight legs in order to meet passenger demand, while satisfying a variety of linear and non-linear constraints. Over the course of the day, the routing of each aircraft is determined in order to minimize the number of required flights for a given fleet. The associated flow continuity and aircraft count constraints have led researchers to focus on obtaining quasi-optimal solutions, especially at larger scales. In this paper, the authors propose the application of this new stochastic optimization algorithm to a non-linear objective cold start fleet assignment problem. Results show that the optimizer can successfully solve such highly-constrained problems (130 variables, 184 constraints).
NASA Astrophysics Data System (ADS)
Saengow, Chaimongkol; Giacomin, A. Jeffrey
2018-03-01
In this paper, we provide a new exact framework for analyzing the most commonly measured behaviors in large-amplitude oscillatory shear flow (LAOS), a popular flow for studying the nonlinear physics of complex fluids. Specifically, the strain rate sweep (also called the strain sweep) is used routinely to identify the onset of nonlinearity. By the strain rate sweep, we mean a sequence of LAOS experiments conducted at the same frequency, performed one after another, with increasing shear rate amplitude. In this paper, we give exact expressions for the nonlinear complex viscosity and the corresponding nonlinear complex normal stress coefficients, for the Oldroyd 8-constant framework for oscillatory shear sweeps. We choose the Oldroyd 8-constant framework for its rich diversity of popular special cases (we list 18 of these). We evaluate the Fourier integrals of our previous exact solution to get exact expressions for the real and imaginary parts of the complex viscosity, and for the complex normal stress coefficients, as functions of both test frequency and shear rate amplitude. We explore the role of infinite shear rate viscosity on strain rate sweep responses for the special case of the corotational Jeffreys fluid. We find that raising η∞ raises the real part of the complex viscosity and lowers the imaginary part. In our worked examples, we thus first use the corotational Jeffreys fluid, and then, for greater accuracy, we use the Johnson-Segalman fluid, to describe the strain rate sweep response of molten atactic polystyrene. For our comparisons with data, we use the Spriggs relations to generalize the Oldroyd 8-constant framework to multimode. Our generalization yields, unequivocally, a longest fluid relaxation time, used to assign Weissenberg and Deborah numbers to each oscillatory shear flow experiment. We then locate each experiment in the Pipkin space.
Integrated consensus-based frameworks for unmanned vehicle routing and targeting assignment
NASA Astrophysics Data System (ADS)
Barnawi, Waleed T.
Unmanned aerial vehicles (UAVs) are increasingly deployed in complex and dynamic environments to perform multiple tasks cooperatively with other UAVs that contribute to overarching mission effectiveness. Studies by the Department of Defense (DoD) indicate future operations may include anti-access/area-denial (A2AD) environments which limit human teleoperator decision-making and control. This research addresses the problem of decentralized vehicle re-routing and task reassignments through consensus-based UAV decision-making. An Integrated Consensus-Based Framework (ICF) is formulated as a solution to the combined single task assignment problem and vehicle routing problem. The multiple assignment and vehicle routing problem is solved with the Integrated Consensus-Based Bundle Framework (ICBF). The frameworks are hierarchically decomposed into two levels. The bottom layer utilizes the renowned Dijkstra's Algorithm. The top layer addresses task assignment with two methods. The single assignment approach is called the Caravan Auction (CarA) Algorithm. This technique extends the Consensus-Based Auction Algorithm (CBAA) to provide awareness for task completion by agents and adopt abandoned tasks. The multiple assignment approach, called the Caravan Auction Bundle Algorithm (CarAB), extends the Consensus-Based Bundle Algorithm (CBBA) by providing awareness for lost resources, prioritizing remaining tasks, and adopting abandoned tasks. Research questions are investigated regarding the novelty and performance of the proposed frameworks. Conclusions regarding the research questions are provided through hypothesis testing, with Monte Carlo simulations supplying the supporting evidence. The approach provided in this research addresses current and future military operations for unmanned aerial vehicles.
However, the general framework implied by the proposed research is adaptable to any unmanned vehicle. Civil applications involving missions with limited human observability, such as exploration and fire surveillance, could also benefit from independent UAV task assignment.
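The auction phase underlying CBAA-style methods can be sketched very simply. The following is an illustrative, centralized stand-in for the consensus process (the real algorithms resolve conflicts by message passing among agents, and the CarA/CarAB extensions for task abandonment and adoption are omitted):

```python
def auction(scores):
    """Greedy single-task assignment: scores[i][j] = value of task j to agent i.

    Repeatedly awards the globally highest remaining bid, so each agent gets
    at most one task and each task at most one agent.
    """
    assignment = {}
    taken = set()
    bids = sorted(
        ((s, i, j) for i, row in enumerate(scores) for j, s in enumerate(row)),
        reverse=True,
    )
    for s, i, j in bids:
        if i not in assignment and j not in taken:
            assignment[i] = j
            taken.add(j)
    return assignment

# Agent 0 strongly prefers task 0; agent 1 then settles for task 1.
print(auction([[10, 3], [9, 8]]))
```

In the decentralized versions, each agent computes its own bids locally and the max-consensus on bids replaces the global sort used here.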
Razzini, Katia
2015-01-01
The regulatory framework of official controls on food safety, the criteria and methods spanning from the planning of interventions in the field of official control to the management of information flows, and the standards described in the operations manual of the local competent authorities drafted by the Lombardy Region (2011) were evaluated. A questionnaire of 10 multiple-choice questions, drafted in partnership with EPAM (the Association of Provincial Public Retail and Catering Businesses in Milan) and administered to 107 food service establishments in Milan, shows that 92% of managers approve the introduction of a grading system. The regulatory framework is planned to support the implementation of risk assignment; unfortunately, the attribution of risk category to retail and catering businesses still differs among regions. PMID:27800403
Abbas, Ahmed; Guo, Xianrong; Jing, Bing-Yi; Gao, Xin
2014-06-01
Despite significant advances in automated nuclear magnetic resonance-based protein structure determination, the high numbers of false positives and false negatives among the peaks selected by fully automated methods remain a problem. These false positives and negatives impair the performance of resonance assignment methods. One of the main reasons for this problem is that the computational research community often considers peak picking and resonance assignment to be two separate problems, whereas spectroscopists use expert knowledge to pick peaks and assign their resonances at the same time. We propose a novel framework that simultaneously conducts slice picking and spin system forming, an essential step in resonance assignment. Our framework then employs a genetic algorithm, directed by both connectivity information and amino acid typing information from the spin systems, to assign the spin systems to residues. The inputs to our framework can be as few as two commonly used spectra, i.e., CBCA(CO)NH and HNCACB. Different from the existing peak picking and resonance assignment methods that treat peaks as the units, our method is based on 'slices', which are one-dimensional vectors in three-dimensional spectra that correspond to certain ([Formula: see text]) values. Experimental results on both benchmark simulated data sets and four real protein data sets demonstrate that our method significantly outperforms the state-of-the-art methods while using fewer spectra than those methods. Our method is freely available at http://sfb.kaust.edu.sa/Pages/Software.aspx.
Personal, Expository, Critical, and Creative: Using Writing in Mathematics Courses
ERIC Educational Resources Information Center
Braun, Benjamin
2014-01-01
This article provides a framework for creating and using writing assignments based on four types of writing: personal, expository, critical, and creative. This framework includes specific areas of student growth affected by these writing styles. Illustrative sample assignments are given throughout for each type of writing and various combinations…
A classification scheme for turbulent flows based on their joint velocity-intermittency structure
NASA Astrophysics Data System (ADS)
Keylock, C. J.; Nishimura, K.; Peinke, J.
2011-12-01
Kolmogorov's classic theory for turbulence assumed an independence between velocity increments and the value for the velocity itself. However, this assumption is questionable, particularly in complex geophysical flows. Here we propose a framework for studying velocity-intermittency coupling that is similar in essence to the popular quadrant analysis method for studying near-wall flows. However, we study the dominant (longitudinal) velocity component along with a measure of the roughness of the signal, given mathematically by its series of Hölder exponents. Thus, we permit a possible dependence between velocity and intermittency. We compare boundary layer data obtained in a wind tunnel to turbulent jets and wake flows. These flow classes all have distinct velocity-intermittency characteristics, which cause them to be readily distinguished using our technique. Our method is much simpler and quicker to apply than approaches that condition the velocity increment statistics at some scale, r, on the increment statistics at a neighbouring, larger spatial scale, r+Δ, and the velocity itself. Classification of environmental flows is then possible based on their similarities to the idealised flow classes and we demonstrate this using laboratory data for flow in a parallel-channel confluence where the region of flow recirculation in the lee of the step is discriminated as a flow class distinct from boundary layer, jet and wake flows. Hence, using our method, it is possible to assign a flow classification to complex geophysical, turbulent flows depending upon which idealised flow class they most resemble.
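The joint velocity-intermittency quadrant idea described above can be sketched compactly. Estimating pointwise Hölder exponents from a velocity series is itself nontrivial, so in this illustrative sketch the exponent series is assumed to be given and only the quadrant bookkeeping is shown:

```python
import numpy as np

def quadrant_fractions(u, h):
    """Fraction of samples in each joint quadrant of normalized velocity u'
    and normalized Holder-exponent fluctuation h' (higher h = smoother signal)."""
    u = (u - u.mean()) / u.std()
    h = (h - h.mean()) / h.std()
    q = {1: 0, 2: 0, 3: 0, 4: 0}
    for ui, hi in zip(u, h):
        if ui >= 0 and hi >= 0:
            q[1] += 1   # fast, smooth
        elif ui < 0 and hi >= 0:
            q[2] += 1   # slow, smooth
        elif ui < 0 and hi < 0:
            q[3] += 1   # slow, rough
        else:
            q[4] += 1   # fast, rough
    n = len(u)
    return {k: v / n for k, v in q.items()}

rng = np.random.default_rng(0)
u = rng.normal(size=1000)
# Perfectly anti-correlated synthetic example: fast flow is always rough.
print(quadrant_fractions(u, -u))
```

Different flow classes (boundary layer, jet, wake) populate these quadrants in characteristically different proportions, which is what permits the classification described in the abstract.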
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Shih-Miao; Hwang, Ho-Ling; Davidson, Diane
2016-07-01
The Freight Analysis Framework (FAF) integrates data from a variety of sources to create a comprehensive picture of nationwide freight movements among states and major metropolitan areas for all modes of transportation. It provides a national picture of current freight flows to, from, and within the United States, assigns selected flows to the transportation network, and projects freight flow patterns into the future. The latest release of FAF is known as FAF4, with a base year of 2012. The FAF4 origin-destination-commodity-mode (ODCM) matrix is provided at the national and state levels, for major metropolitan areas, and for major gateways with significant freight activities (e.g., El Paso, Texas). The U.S. Department of Energy (DOE) is interested in using the FAF4 database for its strategic planning and policy analysis, particularly in association with the transportation of energy commodities. However, the geographic specification that DOE requires is a county-level ODCM matrix. Unfortunately, the geographic regions in the FAF4 database were not available at the DOE-desired detail. Due to this limitation, DOE tasked Oak Ridge National Laboratory (ORNL) to assist in generating estimates of county-level flows for selected energy commodities by mode of transportation.
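The abstract does not state ORNL's disaggregation method. As a hypothetical sketch of one common approach to this kind of task, here is a proportional split of a state-to-state flow onto counties using county weights (e.g., employment in the relevant industry); the weights and tonnages are made up for illustration.

```python
def disaggregate(flow_tons, origin_weights, dest_weights):
    """Split one state-to-state flow into county-to-county flows in
    proportion to origin-side and destination-side county weights."""
    ow = {c: w / sum(origin_weights.values()) for c, w in origin_weights.items()}
    dw = {c: w / sum(dest_weights.values()) for c, w in dest_weights.items()}
    return {(o, d): flow_tons * po * pd
            for o, po in ow.items() for d, pd in dw.items()}

# 1000 tons from a state with two origin counties (3:1 weights) to a
# state with two equally weighted destination counties:
flows = disaggregate(1000.0, {"Cnty A": 3, "Cnty B": 1},
                     {"Cnty C": 1, "Cnty D": 1})
print(flows)
```

By construction the county flows sum back to the original state-level total, which is the key consistency property any such disaggregation must preserve.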
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, Ho-Ling; Hargrove, Stephanie; Chin, Shih-Miao
The Freight Analysis Framework (FAF) integrates data from a variety of sources to create a comprehensive national picture of freight movements among states and major metropolitan areas by all modes of transportation. It provides a national picture of current freight flows to, from, and within the United States, assigns the flows to the transportation network, and projects freight flow patterns into the future. The FAF4 is the fourth database of its kind: FAF1 provided estimates for truck, rail, and water tonnage for calendar year 1998; FAF2 provided a more complete picture based on the 2002 Commodity Flow Survey (CFS); and FAF3 made further improvements building on the 2007 CFS. Since the first FAF effort, a number of changes in both data sources and products have taken place. The FAF4 flow matrix described in this report is used as the base-year data to forecast future freight activities, projecting shipment weights and values from year 2020 to 2045 in five-year intervals. It also provides the basis for annual estimates to the FAF4 flow matrix, aiming to provide users with the timeliest data. Furthermore, FAF4 truck freight is routed on the national highway network to produce the FAF4 network database and flow assignments for truck. This report details the data sources and methodologies applied to develop the base year 2012 FAF4 database. An overview of the FAF4 components is briefly discussed in Section 2. Effects on FAF4 from the changes in the 2012 CFS are highlighted in Section 3. Section 4 provides a general discussion on the process used in filling data gaps within the domestic CFS matrix, specifically on the estimation of CFS suppressed/unpublished cells. Over a dozen CFS OOS components of FAF4 are then addressed in Section 5 through Section 11 of this report. This includes discussions of farm-based agricultural shipments in Section 5 and shipments from fishery and logging sectors in Section 6.
Shipments of municipal solid wastes and debris from construction and demolition activities are covered in Section 7. Movements involving OOS industry sectors on Retail, Services, and Household/Business Moves are addressed in Section 8. Flows of OOS commodity on crude petroleum and natural gas are presented in Sections 9 and 10, respectively. Discussions regarding shipments of foreign trade, including trade with Canada/Mexico, international airfreight, and waterborne foreign trade, are then discussed in Section 11. Several appendices are also provided at the end of this report to offer additional information.
Developing Team Skills through a Collaborative Writing Assignment
ERIC Educational Resources Information Center
Thomas, Theda Ann
2014-01-01
Employers want students who are able to work effectively as members of a team, and expect universities to develop this ability in their graduates. This paper proposes a framework for a collaborative writing assignment that specifically develops students' ability to work in teams. The framework has been tested using two iterations of an action…
A compositional framework for reaction networks
NASA Astrophysics Data System (ADS)
Baez, John C.; Pollard, Blake S.
Reaction networks, or equivalently Petri nets, are a general framework for describing processes in which entities of various kinds interact and turn into other entities. In chemistry, where the reactions are assigned ‘rate constants’, any reaction network gives rise to a nonlinear dynamical system called its ‘rate equation’. Here we generalize these ideas to ‘open’ reaction networks, which allow entities to flow in and out at certain designated inputs and outputs. We treat open reaction networks as morphisms in a category. Composing two such morphisms connects the outputs of the first to the inputs of the second. We construct a functor sending any open reaction network to its corresponding ‘open dynamical system’. This provides a compositional framework for studying the dynamics of reaction networks. We then turn to statics: that is, steady state solutions of open dynamical systems. We construct a ‘black-boxing’ functor that sends any open dynamical system to the relation that it imposes between input and output variables in steady states. This extends our earlier work on black-boxing for Markov processes.
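The 'rate equation' mentioned above has a standard concrete form under mass-action kinetics: dx/dt = (S_products - S_reactants)^T v(x), where each reaction's rate is its rate constant times the product of reactant concentrations. The following sketch evaluates it for a single reaction A + B -> C with an assumed rate constant k = 2:

```python
import numpy as np

def rate_equation(x, S_reactants, S_products, k):
    """Mass-action rate of change dx/dt for concentrations x.

    S_reactants[r, s] / S_products[r, s] give the stoichiometric coefficient
    of species s as a reactant / product of reaction r; k[r] is its rate constant.
    """
    v = k * np.prod(x ** S_reactants, axis=1)     # one rate per reaction
    return (S_products - S_reactants).T @ v

S_reac = np.array([[1, 1, 0]])   # A + B consumed
S_prod = np.array([[0, 0, 1]])   # C produced
k = np.array([2.0])
x = np.array([1.0, 0.5, 0.0])    # concentrations of A, B, C
print(rate_equation(x, S_reac, S_prod, k))
```

A and B are depleted at the rate k[A][B] = 1 while C is produced at the same rate; the paper's contribution is making such systems composable when species are allowed to flow in and out at designated inputs and outputs.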
Applying a Framework to Evaluate Assignment Marking Software: A Case Study on Lightwork
ERIC Educational Resources Information Center
Heinrich, Eva; Milne, John
2012-01-01
This article presents the findings of a qualitative evaluation on the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, combines with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation a framework has…
Assignment Procedures in the Air Force Procurement Management Information System.
ERIC Educational Resources Information Center
Ward, Joe H., Jr.; And Others
An overview is presented of the procedure for offering jobs in the Air Force Procurement Management Information System (PROMIS), an assignment system which makes possible the use of human resources research findings to improve individual personnel assignments. A general framework for viewing personnel assignment systems is presented; then job…
Saenz-Agudelo, P; Jones, G P; Thorrold, S R; Planes, S
2009-04-01
The application of spatially explicit models of population dynamics to fisheries management and the design of marine reserve network systems has been limited due to a lack of empirical estimates of larval dispersal. Here we compared assignment tests and parentage analysis for examining larval retention and connectivity under two different gene flow scenarios using panda clownfish (Amphiprion polymnus) in Papua New Guinea. A metapopulation of panda clownfish in Bootless Bay with little or no genetic differentiation among five spatially discrete locations separated by 2-6 km provided the high gene flow scenario. The low gene flow scenario compared the Bootless Bay metapopulation with a genetically distinct population (FST = 0.1) located at Schumann Island, New Britain, 1500 km to the northeast. We used assignment tests and parentage analysis based on microsatellite DNA data to identify natal origins of 177 juveniles in Bootless Bay and 73 juveniles at Schumann Island. At low rates of gene flow, assignment tests correctly classified juveniles to their source population. On the other hand, parentage analysis led to an overestimate of self-recruitment within the two populations due to the significant deviation from panmixia when both populations were pooled. At high gene flow (within Bootless Bay), assignment tests underestimated self-recruitment and connectivity among subpopulations, and grossly overestimated self-recruitment within the overall metapopulation. However, the assignment tests did identify immigrants from distant (genetically distinct) populations. Parentage analysis clearly provided the most accurate estimates of connectivity in situations of high gene flow.
Application of dynamic traffic assignment to advanced managed lane modeling.
DOT National Transportation Integrated Search
2013-11-01
In this study, a demand estimation framework is developed for assessing managed lane (ML) strategies by utilizing dynamic traffic assignment (DTA) modeling, instead of the traditional approaches that are based on static traffic assignment...
Multidisciplinary Environments: A History of Engineering Framework Development
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Gillian, Ronnie E.
2006-01-01
This paper traces the history of engineering frameworks and their use by Multidisciplinary Design Optimization (MDO) practitioners. The approach is to reference papers that have been presented at one of the ten previous Multidisciplinary Analysis and Optimization (MA&O) conferences. By limiting the search to MA&O papers, the authors can (1) identify the key ideas that led to general purpose MDO frameworks and (2) uncover roadblocks that delayed the development of these ideas. The authors make no attempt to assign credit for revolutionary ideas or to assign blame for missed opportunities. Rather, the goal is to trace the various threads of computer architecture and software framework research and to observe how these threads contributed to the commercial framework products available today.
Leadership Stability in Army Reserve Component Units
2013-01-01
or recognized, RC units could have more time because they may appear late in the force flow, particularly if AC units go earlier or the flow is...8, or deployment minus eight months), many new arrivals flowed into the unit, including many who would eventually deploy with the unit. Almost all...mobilization. Thus, those assigned are treated as 100 percent. To the right, we display the percentage (out of those assigned) who flowed into various
Three-dimensional geologic model of the Arbuckle-Simpson aquifer, south-central Oklahoma
Faith, Jason R.; Blome, Charles D.; Pantea, Michael P.; Puckette, James O.; Halihan, Todd; Osborn, Noel; Christenson, Scott; Pack, Skip
2010-01-01
The Arbuckle-Simpson aquifer of south-central Oklahoma encompasses more than 850 square kilometers and is the principal water resource for south-central Oklahoma. Rock units comprising the aquifer are characterized by limestone, dolomite, and sandstones assigned to two lower Paleozoic units: the Arbuckle and Simpson Groups. Also considered to be part of the aquifer is the underlying Cambrian-age Timbered Hills Group that contains limestone and sandstone. The highly faulted and fractured nature of the Arbuckle-Simpson units and the variable thickness (600 to 2,750 meters) increase the complexity of determining the subsurface geologic framework of this aquifer. A three-dimensional EarthVision (Trademark) geologic framework model was constructed to quantify the geometric relationships of the rock units of the Arbuckle-Simpson aquifer in the Hunton anticline area. This 3-D EarthVision (Trademark) geologic framework model incorporates 54 faults and four modeled units: basement, Arbuckle-Timbered Hills Group, Simpson Group, and post-Simpson. Primary data used to define the model's 54 faults and four modeled surfaces were obtained from geophysical logs, cores, and cuttings from 126 water and petroleum wells. The 3-D framework model both depicts the volumetric extent of the aquifer and provides the stratigraphic layer thickness and elevation data used to construct a MODFLOW-2000 regional groundwater-flow model.
Global Optimization of Emergency Evacuation Assignments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Lee; Yuan, Fang; Chin, Shih-Miao
2006-01-01
Conventional emergency evacuation plans often assign evacuees to fixed routes or destinations based mainly on geographic proximity. Such approaches can be inefficient if the roads are congested, blocked, or otherwise dangerous because of the emergency. By not constraining evacuees to prespecified destinations, a one-destination evacuation approach provides flexibility in the optimization process. We present a framework for the simultaneous optimization of evacuation-traffic distribution and assignment. Based on the one-destination evacuation concept, we can obtain the optimal destination and route assignment by solving a one-destination traffic-assignment problem on a modified network representation. In a county-wide, large-scale evacuation case study, the one-destination model yields substantial improvement over the conventional approach, with the overall evacuation time reduced by more than 60 percent. More importantly, emergency planners can easily implement this framework by instructing evacuees to go to destinations that the one-destination optimization process selects.
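The one-destination concept above can be sketched with a standard graph transformation: connect every safe exit to a single virtual super-destination with zero-cost links, then a single shortest-path computation from that virtual node (over reversed edges) yields every origin's optimal exit and route at once. A minimal stdlib-Python sketch; the network, node names, and travel times below are illustrative assumptions, not data from the study, and it ignores the congestion effects the full traffic-assignment model captures:

```python
import heapq

SUPER = "__super__"  # virtual one-destination node

def one_destination_routes(edges, exits):
    """Compute each node's optimal exit and route by running Dijkstra from a
    virtual super-destination over the reversed network.
    edges: {(u, v): travel_time} directed links; exits: set of safe exit nodes."""
    rev = {}
    for (u, v), w in edges.items():
        rev.setdefault(v, []).append((u, w))      # reverse each road link
    rev[SUPER] = [(x, 0.0) for x in exits]        # zero-cost links to all exits
    dist, nxt = {SUPER: 0.0}, {}
    pq = [(0.0, SUPER)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                               # stale queue entry
        for v, w in rev.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                nxt[v] = u                         # v's next hop toward its exit
                heapq.heappush(pq, (d + w, v))
    return dist, nxt

def route(nxt, origin):
    """Follow next-hop pointers from an origin to its assigned exit."""
    path = [origin]
    while nxt.get(path[-1], SUPER) != SUPER:
        path.append(nxt[path[-1]])
    return path
```

For a toy network with roads A-B-X (cost 1 each) and a slow direct road A-Y (cost 5), where X and Y are exits, origin A is assigned exit X via B at cost 2; the evacuee never has to be told a destination in advance, only the next hop.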
Qualls, Joseph; Russomanno, David J.
2011-01-01
The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081
ActionMap: A web-based software that automates loci assignments to framework maps.
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-07-01
Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).
A "Journey in Feminist Theory Together": The "Doing Feminist Theory through Digital Video" Project
ERIC Educational Resources Information Center
Hurst, Rachel Alpha Johnston
2014-01-01
"Doing Feminist Theory Through Digital Video" is an assignment I designed for my undergraduate feminist theory course, where students created a short digital video on a concept in feminist theory. I outline the assignment and the pedagogical and epistemological frameworks that structured the assignment (digital storytelling,…
A Bayesian Nonparametric Causal Model for Regression Discontinuity Designs
ERIC Educational Resources Information Center
Karabatsos, George; Walker, Stephen G.
2013-01-01
The regression discontinuity (RD) design (Thistlewaite & Campbell, 1960; Cook, 2008) provides a framework to identify and estimate causal effects from a non-randomized design. Each subject of a RD design is assigned to the treatment (versus assignment to a non-treatment) whenever her/his observed value of the assignment variable equals or…
Linking the History of Radiation Biology to the Hallmarks of Cancer
Boss, Mary-Keara; Bristow, Robert; Dewhirst, Mark W.
2014-01-01
Hanahan and Weinberg recently updated their conceptual framework of the “Hallmarks of Cancer”. The original article, published in 2000, is among the most highly cited reviews in the field of oncology. The goal of this review is to highlight important discoveries in radiation biology that pertain to the Hallmarks. We identified early studies that exemplified how ionizing radiation affects the hallmarks or how radiation was used experimentally to advance the understanding of key hallmarks. A literature search was performed to obtain relevant primary research, and topics were assigned to a particular hallmark to allow an organized, chronological account of the radiobiological advancements. The hallmarks are reviewed in an order that flows from cellular to microenvironmental effects. PMID:24811865
NASA Astrophysics Data System (ADS)
McCoy, Amy L.; Holmes, S. Rankin; Boisjolie, Brett A.
2018-03-01
Securing environmental flows in support of freshwater biodiversity is an evolving field of practice. An example of a large-scale program dedicated to restoring environmental flows is the Columbia Basin Water Transactions Program in the Pacific Northwest region of North America, which has been restoring flows in dewatered tributary habitats for imperiled salmon species over the past decade. This paper discusses a four-tiered flow restoration accounting framework for tracking the implementation and impacts of water transactions as an effective tool for adaptive management. The flow restoration accounting framework provides compliance and flow accounting information to monitor transaction efficacy. We review the implementation of the flow restoration accounting framework to demonstrate (a) the extent of water transactions that have been implemented over the past decade, (b) the volumes of restored flow in meeting flow targets for restoring habitat for anadromous fish species, and (c) an example of aquatic habitat enhancement that resulted from Columbia Basin Water Transactions Program investments. Project results show that from 2002 to 2015, the Columbia Basin Water Transactions Program has completed more than 450 water rights transactions, restoring approximately 1.59 million megaliters to date, with an additional 10.98 million megaliters of flow protected for use over the next 100 years. This has resulted in the watering of over 2414 stream kilometers within the Columbia Basin. We conclude with a discussion of the insights gained through the implementation of the flow restoration accounting framework. Understanding the approach and efficacy of a monitoring framework applied across a large river basin can be informative to emerging flow-restoration and adaptive management efforts in areas of conservation concern.
Modeling regional freight flow assignment through intermodal terminals
DOT National Transportation Integrated Search
2005-03-01
An analytical model is developed to assign regional freight across a multimodal highway and railway network using geographic information systems. As part of the regional planning process, the model is an iterative procedure that assigns multimodal fr...
Single machine scheduling with slack due dates assignment
NASA Astrophysics Data System (ADS)
Liu, Weiguo; Hu, Xiangpei; Wang, Xuyin
2017-04-01
This paper considers a single machine scheduling problem in which each job is assigned an individual due date based on a common flow allowance (i.e., all jobs have slack due dates). The goal is to find a sequence of jobs, together with a due date assignment, that minimizes a non-regular criterion comprising the total weighted absolute lateness value and common flow allowance cost, where the weight is a position-dependent weight. In order to solve this problem, an ? time algorithm is proposed. Some extensions of the problem are also shown.
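The slack (SLK) due-date rule described above sets each due date to the job's own processing time plus a common flow allowance q, so for a fixed sequence the objective is straightforward to evaluate. A hedged sketch: the function name, the unit allowance cost alpha, and the sample jobs are invented for illustration and do not come from the paper:

```python
def slack_due_date_cost(p, q, w, alpha=1.0):
    """Cost of a job sequence under the SLK rule d_j = p_j + q.
    p: processing times in sequence order; q: common flow allowance;
    w: position-dependent weights; alpha: unit cost of the allowance."""
    t, lateness = 0.0, 0.0
    for pj, wj in zip(p, w):
        t += pj                        # completion time C_j
        due = pj + q                   # slack due date d_j = p_j + q
        lateness += wj * abs(t - due)  # weighted absolute lateness term
    return lateness + alpha * q        # plus the flow-allowance cost
```

For two unit-weight jobs with processing times 2 and 3 and allowance q = 1, the first job finishes at 2 against a due date of 3 and the second at 5 against 4, giving a total cost of 1 + 1 + q = 3. The full problem additionally optimizes the sequence and q, which this sketch does not attempt.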
New optimization model for routing and spectrum assignment with nodes insecurity
NASA Astrophysics Data System (ADS)
Xuan, Hejun; Wang, Yuping; Xu, Zhanqi; Hao, Shanshan; Wang, Xiaoli
2017-04-01
By adopting the orthogonal frequency division multiplexing technology, elastic optical networks can provide the flexible and variable bandwidth allocation to each connection request and get higher spectrum utilization. The routing and spectrum assignment problem in elastic optical network is a well-known NP-hard problem. In addition, information security has received worldwide attention. We combine these two problems to investigate the routing and spectrum assignment problem with the guaranteed security in elastic optical network, and establish a new optimization model to minimize the maximum index of the used frequency slots, which is used to determine an optimal routing and spectrum assignment scheme. To solve the model effectively, a hybrid genetic algorithm framework integrating a heuristic algorithm into a genetic algorithm is proposed. The heuristic algorithm is first used to sort the connection requests and then the genetic algorithm is designed to look for an optimal routing and spectrum assignment scheme. In the genetic algorithm, tailor-made crossover, mutation and local search operators are designed. Moreover, simulation experiments are conducted with three heuristic strategies, and the experimental results indicate the effectiveness of the proposed model and algorithm framework.
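The spectrum-assignment half of the problem can be illustrated with a simple first-fit heuristic, a common baseline for routing and spectrum assignment (RSA), not the paper's genetic algorithm: each request receives the lowest block of contiguous slots that is free on every link of its precomputed route, which respects the spectrum continuity and contiguity constraints. A stdlib-Python sketch with invented link names and slot counts:

```python
def first_fit_spectrum(requests, n_slots=32):
    """Greedy first-fit RSA sketch. Each request is (path_links, n_needed)
    with a precomputed route; assign the lowest-index block of contiguous
    slots that is free on every link of the path. Returns the starting slot
    per request, or None if the request is blocked."""
    used = {}        # link -> set of occupied slot indices
    assignment = []
    for links, need in requests:
        placed = None
        for start in range(n_slots - need + 1):
            block = range(start, start + need)
            # continuity: the same block must be free on all links of the path
            if all(s not in used.get(l, set()) for l in links for s in block):
                placed = start
                break
        if placed is not None:
            for l in links:
                used.setdefault(l, set()).update(range(placed, placed + need))
        assignment.append(placed)
    return assignment
```

Minimizing the maximum used slot index, as the paper's model does, rewards assignments that pack low-index blocks tightly; first-fit approximates this greedily, while the proposed genetic algorithm additionally searches over request orderings and routes.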
Achieving Agility and Stability in Large-Scale Software Development
2013-01-16
temporary team is assigned to prepare layers and frameworks for future feature teams. (Slide diagram: Presentation Layer, Domain Layer, Data Access Layer, Framework.)
Lampe, David C.
2009-01-01
The U.S. Geological Survey is assessing groundwater availability in the Lake Michigan Basin. As part of the assessment, a variable-density groundwater-flow model is being developed to simulate the effects of groundwater use on water availability throughout the basin. The hydrogeologic framework for the Lake Michigan Basin model was developed by grouping the bedrock geology of the study area into hydrogeologic units on the basis of the functioning of each unit as an aquifer or confining layer within the basin. Available data were evaluated based on the areal extent of coverage within the study area, and procedures were established to characterize areas with sparse data coverage. Top and bottom altitudes for each hydrogeologic unit were interpolated in a geographic information system for input to the model and compared with existing maps of subsurface formations. Fourteen bedrock hydrogeologic units, making up 17 bedrock model layers, were defined, and they range in age from the Jurassic Period red beds of central Michigan to the Cambrian Period Mount Simon Sandstone. Information on groundwater salinity in the Lake Michigan Basin was compiled to create an input dataset for the variable-density groundwater-flow simulation. Data presented in this report are referred to as 'salinity data' and are reported in terms of total dissolved solids. Salinity data were not available for each hydrogeologic unit. Available datasets were assigned to a hydrogeologic unit, entered into a spatial database, and data quality was visually evaluated. A geographic information system was used to interpolate salinity distributions for each hydrogeologic unit with available data. Hydrogeologic units with no available data either were set equal to neighboring units or were vertically interpolated by use of values from units above and below.
Marsh-Tootle, Wendy L; Funkhouser, Ellen; Frazier, Marcela G; Crenshaw, Katie; Wall, Terry C
2010-02-01
To evaluate knowledge, attitudes, and environment of primary care providers, and to develop a conceptual framework showing their impact on self-reported pre-school vision screening (PVS) behaviors. Eligible primary care providers were individuals who filed claims with Medicaid agencies in Alabama, South Carolina, or Illinois, for at least eight well child checks for children aged 3 or 4 years during 1 year. Responses were obtained on-line from providers who enrolled in the intervention arm of a randomized trial to improve PVS. We calculated a summary score per provider per facet: (1) for behavior and knowledge, each correct answer was assigned a value of +1; and (2) for attitudes and environment, responses indicating support for PVS were assigned a value of +1, and other responses were assigned -1. Responses were available from 53 participants (43 of 49 enrolled pediatricians, 8 of 14 enrolled family physicians, one general physician, and one nurse practitioner). Recognizing that amblyopia often presents without outward signs was positively related to good PVS: [odds ratio (OR) = 3.9; p = 0.06]. Reporting that "preschool VS interrupts patient flow" posed a significant barrier (OR = 0.2; p = 0.05). Providers with high summed scores on attitudes (OR = 6.0; p = 0.03), or knowledge and attitudes (OR = 11.4; p < 0.001) were significantly more likely to report good PVS behavior. There was a significant trend between the number of "good" scores on knowledge, attitudes or environment, and "good" PVS behavior (p = 0.04). PVS is influenced by positive attitudes, especially when combined with knowledge about amblyopia. Interventions to improve PVS should target multiple facets, emphasizing (1) asymptomatic children are at risk for amblyopia, (2) specific evidence-based tests have high testability and sensitivity for amblyopia in pre-school children, and (3) new tests minimize interruptions to patient flow.
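The per-facet scoring scheme described above (+1 per correct behavior or knowledge item; +1 for PVS-supportive and -1 for other attitude or environment responses) can be sketched directly. The function name and the boolean encoding of responses are assumptions for illustration, not from the study:

```python
def facet_score(responses, facet):
    """Summary score per provider per facet.
    responses: booleans (correct / supportive = True).
    Behavior and knowledge items add +1 when correct; attitude and
    environment items add +1 when supportive, -1 otherwise."""
    if facet in ("behavior", "knowledge"):
        return sum(1 for r in responses if r)
    return sum(1 if r else -1 for r in responses)
```

Under this coding, a provider answering two of three knowledge items correctly scores 2 on knowledge, while one supportive and one unsupportive attitude response cancel to 0.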
A Holistic Framework for Environmental Flows Determination in Hydropower Contexts
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Bevelhimer, Mark S
2013-05-01
Among the ecological science community, the consensus view is that the natural flow regime sustains the ecological integrity of river systems. This prevailing viewpoint by many environmental stakeholders has progressively led to increased pressure on hydropower dam owners to change plant operations to affect downstream river flows with the intention of providing better conditions for aquatic biological communities. Identifying the necessary magnitude, frequency, duration, timing, or rate of change of stream flows to meet ecological needs in a hydropower context is challenging because the ecological responses to changes in flows may not be fully known, there are usually a multitude of competing users of flow, and implementing environmental flows usually comes at a price to energy production. Realistically, hydropower managers must develop a reduced set of goals that provide the most benefit to the identified ecological needs. As a part of the Department of Energy (DOE) Water Power Program, the Instream Flow Project (IFP) was carried out by Oak Ridge National Laboratory (ORNL), Pacific Northwest National Laboratory (PNNL), and Argonne National Laboratory (ANL) as an attempt to develop tools aimed at defining environmental flow needs for hydropower operations. The application of these tools ranges from national to site-specific scales; thus, the utility of each tool will depend on various phases of the environmental flow process. Given the complexity and sheer volume of applications used to determine environmentally acceptable flows for hydropower, a framework is needed to organize efforts into a staged process dependent upon spatial, temporal, and functional attributes. By far, the predominant domain for determining environmental flows related to hydropower is within the Federal Energy Regulatory Commission (FERC) relicensing process. This process can take multiple years and can be very expensive depending on the scale of each hydropower project.
The utility of such a framework is that it can expedite the environmental flow process by 1) organizing data and applications to identify predictable relationships between flows and ecology, and 2) suggesting when and where tools should be used in the environmental flow process. In addition to regulatory procedures, a framework should also provide the coordination for a comprehensive research agenda to guide the science of environmental flows. This research program has benefits reaching further than environmental flow determination alone by providing modeling applications, data, and geospatial layers to inform potential hydropower development. We address several objectives within this document that highlight the limitations of existing environmental flow paradigms and their applications to hydropower while presenting a new framework tailored to hydropower needs. Herein, we address the following objectives: 1) Provide a brief overview of the Natural Flow Regime paradigm and existing environmental flow frameworks that have been used to determine ecologically sensitive stream flows for hydropower operations. 2) Describe a new conceptual framework to aid in determining flows needed to meet ecological objectives with regard to hydropower operations. The framework is centralized around determining predictable relationships between flow and ecological responses. 3) Provide evidence of how efforts from ORNL, PNNL, and ANL have filled some of the gaps in this broader framework, and suggest how the framework can be used to set the stage for a research agenda for environmental flow.
Towards catchment classification in data-scarce regions
Auerbach, Daniel A.; Buchanan, Brian P.; Alexiades, Alex V.; ...
2016-01-29
Assessing spatial variation in hydrologic processes can help to inform freshwater management and advance ecological understanding, yet many areas lack sufficient flow records on which to base classifications. Seeking to address this challenge, we apply concepts developed in data-rich settings to public, global data in order to demonstrate a broadly replicable approach to characterizing hydrologic variation. The proposed approach groups the basins associated with reaches in a river network according to key environmental drivers of hydrologic conditions. This initial study examines Colorado (USA), where long-term streamflow records permit comparison to previously distinguished flow regime types, and the Republic of Ecuador, where data limitations preclude such analysis. The flow regime types assigned to gages in Colorado corresponded reasonably well to the classes distinguished from environmental features. The divisions in Ecuador reflected major known biophysical gradients while also providing a higher resolution supplement to an existing depiction of freshwater ecoregions. Although freshwater policy and management decisions occur amidst uncertainty and imperfect knowledge, this classification framework offers a rigorous and transferable means to distinguish catchments in data-scarce regions. The maps and attributes of the resulting ecohydrologic classes offer a departure point for additional study and data collection programs such as the placement of stations in under-monitored classes, and the divisions may serve as a preliminary template with which to structure conservation efforts such as environmental flow assessments.
Pyne, Matthew I.; Carlisle, Daren M.; Konrad, Christopher P.; Stein, Eric D.
2017-01-01
Regional classification of streams is an early step in the Ecological Limits of Hydrologic Alteration framework. Many stream classifications are based on an inductive approach using hydrologic data from minimally disturbed basins, but this approach may underrepresent streams from heavily disturbed basins or sparsely gaged arid regions. An alternative is a deductive approach, using watershed climate, land use, and geomorphology to classify streams, but this approach may miss important hydrological characteristics of streams. We classified all stream reaches in California using both approaches. First, we used Bayesian and hierarchical clustering to classify reaches according to watershed characteristics. Streams were clustered into seven classes according to elevation, sedimentary rock, and winter precipitation. Permutation-based analysis of variance and random forest analyses were used to determine which hydrologic variables best separate streams into their respective classes. Stream typology (i.e., the class that a stream reach is assigned to) is shaped mainly by patterns of high and mean flow behavior within the stream's landscape context. Additionally, random forest was used to determine which hydrologic variables best separate minimally disturbed reference streams from non-reference streams in each of the seven classes. In contrast to stream typology, deviation from reference conditions is more difficult to detect and is largely defined by changes in low-flow variables, average daily flow, and duration of flow. Our combined deductive/inductive approach allows us to estimate flow under minimally disturbed conditions based on the deductive analysis and compare to measured flow based on the inductive analysis in order to estimate hydrologic change.
Rainbow: A Framework for Analysing Computer-Mediated Pedagogical Debates
ERIC Educational Resources Information Center
Baker, Michael; Andriessen, Jerry; Lund, Kristine; van Amelsvoort, Marie; Quignard, Matthieu
2007-01-01
In this paper we present a framework for analysing when and how students engage in a specific form of interactive knowledge elaboration in CSCL environments: broadening and deepening understanding of a space of debate. The framework is termed "Rainbow," as it comprises seven principal analytical categories, to each of which a colour is assigned,…
Organizing Environmental Flow Frameworks to Meet Hydropower Mitigation Needs
NASA Astrophysics Data System (ADS)
McManamay, Ryan A.; Brewer, Shannon K.; Jager, Henriette I.; Troia, Matthew J.
2016-09-01
The global recognition of the importance of natural flow regimes to sustain the ecological integrity of river systems has led to increased societal pressure on the hydropower industry to change plant operations to improve downstream aquatic ecosystems. However, a complete reinstatement of natural flow regimes is often unrealistic when balancing water needs for ecosystems, energy production, and other human uses. Thus, stakeholders must identify a prioritized subset of flow prescriptions that meet ecological objectives in light of realistic constraints. Yet, isolating aspects of flow regimes to restore downstream of hydropower facilities is among the greatest challenges of environmental flow science due, in part, to the sheer volume of available environmental flow tools in conjunction with complex negotiation-based regulatory procedures. Herein, we propose an organizational framework that structures information and existing flow paradigms into a staged process that assists stakeholders in implementing environmental flows for hydropower facilities. The framework identifies areas where regulations fall short of the needed scientific process, and provides suggestions for stakeholders to ameliorate those situations through advanced preparation. We highlight the strengths of existing flow paradigms in their application to hydropower settings and suggest when and where tools are most applicable. Our suggested framework increases the effectiveness and efficiency of the e-flow implementation process by rapidly establishing a knowledge base and decreasing uncertainty so more time can be devoted to filling knowledge gaps. Lastly, the framework provides the structure for a coordinated research agenda to further the science of environmental flows related to hydropower environments.
The Flow Engine Framework: A Cognitive Model of Optimal Human Experience
Šimleša, Milija; Guegan, Jérôme; Blanchard, Edouard; Tarpin-Bernard, Franck; Buisine, Stéphanie
2018-01-01
Flow is a well-known concept in the fields of positive and applied psychology. Examination of a large body of flow literature suggests there is a need for a conceptual model rooted in a cognitive approach to explain how this psychological phenomenon works. In this paper, we propose the Flow Engine Framework, a theoretical model explaining dynamic interactions between rearranged flow components and fundamental cognitive processes. Using an IPO framework (Inputs – Processes – Outputs) including a feedback process, we organize flow characteristics into three logically related categories: inputs (requirements for flow), mediating and moderating cognitive processes (attentional and motivational mechanisms) and outputs (subjective and objective outcomes), describing the flow process. Comparing flow to an engine, inputs are depicted as flow-fuel, core processes as cylinder strokes, and outputs as the power created to provide motion. PMID:29899807
ERIC Educational Resources Information Center
Gaske, Dan
1992-01-01
Provides a graphical framework for presenting interactions among current account flows, capital account flows, and exchange rates. Suggests that the two types of flows must be considered separately in discussions of foreign exchange equilibrium and balance of payments flows. Supplies sample graphs and instructions for applying the framework to real…
Warid, Warid; Hizam, Hashim; Mariun, Norman; Abdul-Wahab, Noor Izzri
2016-01-01
This paper proposes a new formulation for the multi-objective optimal power flow (MOOPF) problem for meshed power networks considering distributed generation. An efficacious multi-objective fuzzy linear programming optimization (MFLP) algorithm is proposed to solve the aforementioned problem with and without considering the distributed generation (DG) effect. A variant combination of objectives is considered for simultaneous optimization, including power loss, voltage stability, and shunt capacitors MVAR reserve. Fuzzy membership functions for these objectives are designed with extreme targets, whereas the inequality constraints are treated as hard constraints. The multi-objective fuzzy optimal power flow (OPF) formulation was converted into a crisp OPF in a successive linear programming (SLP) framework and solved using an efficient interior point method (IPM). To test the efficacy of the proposed approach, simulations are performed on the IEEE 30-bus and IEEE 118-bus test systems. The MFLP optimization is solved for several optimization cases. The obtained results are compared with those presented in the literature. A unique solution with a high satisfaction for the assigned targets is gained. Results demonstrate the effectiveness of the proposed MFLP technique in terms of solution optimality and rapid convergence. Moreover, the results indicate that using the optimal DG location with the MFLP algorithm provides the solution with the highest quality.
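The fuzzy-membership construction described in this abstract can be illustrated with a minimal sketch. This is not the authors' exact formulation: the linear membership shape, the max-min ("least satisfied objective") aggregation, and all names below are illustrative assumptions commonly used in fuzzy linear programming.

```python
def linear_membership(value, best, worst):
    """Linear fuzzy membership: 1.0 at the target (best) value, 0.0 at the
    worst acceptable value, clamped in between. Works for minimization
    (best < worst) and maximization (best > worst) objectives alike."""
    if best == worst:
        return 1.0
    mu = (worst - value) / (worst - best)
    return max(0.0, min(1.0, mu))

def overall_satisfaction(values, targets):
    """Max-min aggregation: the solution is only as satisfactory as its
    least-satisfied objective. `targets` pairs each objective value with
    its (best, worst) bounds."""
    return min(linear_membership(v, b, w) for v, (b, w) in zip(values, targets))
```

In an SLP setting, one would maximize this minimum membership subject to linearized network constraints; here the functions only evaluate a candidate solution.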
Creating Dissonance in Pre-Service Teachers' Field Experiences
ERIC Educational Resources Information Center
Eisenhardt, Sara; Besnoy, Kevin; Steele, Emily
2012-01-01
The study is practical in nature and addresses the call for investigating effective aspects of field experiences in teacher preparation. The authors designed a framework of assignments requiring the pre-service teachers to collect data about two diverse elementary students in their assigned elementary classroom during the twelve weeks of their…
Gender Assignment in Contemporary Standard Russian: A Comprehensive Analysis in Optimality Theory
ERIC Educational Resources Information Center
Galbreath, Blake Lee Everett
2010-01-01
The purpose of this dissertation is to provide a comprehensive analysis of gender assignment in Contemporary Standard Russian within the framework of Optimality Theory (Prince and Smolensky 1993). The result of the dissertation is the establishment of the phonological, morphological, semantic, and faithfulness constraints necessary to assign…
The Communication Audit: A Framework for Teaching Management Communication.
ERIC Educational Resources Information Center
Shelby, Annette N.; Reinsch, N. Lamar, Jr.
1996-01-01
Describes a communication audit project used in a graduate-level management communication course. Reviews literature concerning communication audits, explains why and how an audit project is used in the author's classes, and describes specific audit-related assignments. Concludes that, although a challenging assignment, the audit is worthwhile.…
ERIC Educational Resources Information Center
Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis
2007-01-01
This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…
Organizing environmental flow frameworks to meet hydropower mitigation needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A.; Brewer, Shannon K.; Jager, Henriette I.
The global recognition of the importance of natural flow regimes to sustain the ecological integrity of river systems has led to increased societal pressure on the hydropower industry to change plant operations to improve downstream aquatic ecosystems. However, a complete reinstatement of natural flow regimes is often unrealistic when balancing water needs for ecosystems, energy production, and other human uses. Thus, stakeholders must identify a prioritized subset of flow prescriptions that meet ecological objectives in light of realistic constraints. Yet, isolating aspects of flow regimes to restore downstream of hydropower facilities is among the greatest challenges of environmental flow science due, in part, to the sheer volume of available environmental flow tools in conjunction with complex negotiation-based regulatory procedures. Here, we propose an organizational framework that structures information and existing flow paradigms into a staged process that assists stakeholders in implementing environmental flows for hydropower facilities. The framework identifies areas where regulations fall short of the needed scientific process and provides suggestions for stakeholders to ameliorate those situations through advanced preparation. We highlight the strengths of existing flow paradigms in their application to hydropower settings and suggest when and where tools are most applicable. In conclusion, our suggested framework increases the effectiveness and efficiency of the e-flow implementation process by rapidly establishing a knowledge base and decreasing uncertainty so more time can be devoted to filling knowledge gaps. As a result, the framework provides the structure for a coordinated research agenda to further the science of environmental flows related to hydropower environments.
Saenz, Juan A.; Chen, Qingshan; Ringler, Todd
2015-05-19
Recent work has shown that taking the thickness-weighted average (TWA) of the Boussinesq equations in buoyancy coordinates results in exact equations governing the prognostic residual mean flow where eddy–mean flow interactions appear in the horizontal momentum equations as the divergence of the Eliassen–Palm flux tensor (EPFT). It has been proposed that, given the mathematical tractability of the TWA equations, the physical interpretation of the EPFT, and its relation to potential vorticity fluxes, the TWA is an appropriate framework for modeling ocean circulation with parameterized eddies. The authors test the feasibility of this proposition and investigate the connections between the TWA framework and the conventional framework used in models, where Eulerian mean flow prognostic variables are solved for. Using the TWA framework as a starting point, this study explores the well-known connections between vertical transfer of horizontal momentum by eddy form drag and eddy overturning by the bolus velocity, used by Greatbatch and Lamb and Gent and McWilliams to parameterize eddies. After implementing the TWA framework in an ocean general circulation model, we verify our analysis by comparing the flows in an idealized Southern Ocean configuration simulated using the TWA and conventional frameworks with the same mesoscale eddy parameterization.
Pre-equilibrium Longitudinal Flow in the IP-Glasma Framework for Pb+Pb Collisions at the LHC
NASA Astrophysics Data System (ADS)
McDonald, Scott; Shen, Chun; Fillion-Gourdeau, François; Jeon, Sangyong; Gale, Charles
2017-08-01
In this work, we debut a new implementation of IP-Glasma and quantify the pre-equilibrium longitudinal flow in the IP-Glasma framework. The saturation-physics-based IP-Glasma model naturally provides a non-zero initial longitudinal flow through its pre-equilibrium Yang-Mills evolution. A hybrid IP-Glasma+MUSIC+UrQMD framework is employed to test this new implementation against experimental data and to make further predictions about hadronic flow observables in Pb+Pb collisions at 5.02 TeV. Finally, the non-zero pre-equilibrium longitudinal flow of the IP-Glasma model is quantified, and its origin is briefly discussed.
Persuasive Writing, A Curriculum Design: K-12.
ERIC Educational Resources Information Center
Bennett, Susan G., Ed.
In the spirit of the Texas Hill Country Writing Project and in response to the requirements of the Texas Assessment of Basic Skills, this guide presents writing assignments reflecting a commitment to a unified writing program for kindergarten through grade twelve. The framework for the assignments is adopted from the discourse theory of James…
A Framework for Military Decision Making Under Risks.
1996-06-01
…Force Base, Alabama, June 1996. Approved for public release; distribution unlimited. Disclaimer: the conclusions and opinions expressed in this… followed by an assignment as the operations officer for 1st Battalion, 509th Airborne Infantry Regiment. Next, LTC Schultz was assigned to Fort… [Front matter: Acknowledgments, Abstract, 1. Introduction]
A Competitive and Experiential Assignment in Search Engine Optimization Strategy
ERIC Educational Resources Information Center
Clarke, Theresa B.; Clarke, Irvine, III
2014-01-01
Despite an increase in ad spending and demand for employees with expertise in search engine optimization (SEO), methods for teaching this important marketing strategy have received little coverage in the literature. Using Bloom's cognitive goals hierarchy as a framework, this experiential assignment provides a process for educators who may be new…
"I'm Not so Sure…": Teacher Educator Action Research into Uncertainty
ERIC Educational Resources Information Center
Rogers, Carrie
2016-01-01
Using a framework of uncertainty that is informed by Hannah Arendt's philosophy this four-semester action research project describes the creation and analysis of an assignment that allows teacher candidates to explore their own uncertainties in regards to the teaching profession. This action research project examines the assignment and its…
ERIC Educational Resources Information Center
Campbell, Laurie O.; Cox, Thomas D.
2018-01-01
Students within this study followed the ICSDR (Identify, Conceptualize/Connect, Storyboard, Develop, Review/Reflect/Revise) development model to create digital video, as a personalized and active learning assignment. The participants, graduate students in education, indicated that following the ICSDR framework for student-authored video guided…
Joint Inference of Population Assignment and Demographic History
Choi, Sang Chul; Hey, Jody
2011-01-01
A new approach to assigning individuals to populations using genetic data is described. Most existing methods work by maximizing Hardy–Weinberg and linkage equilibrium within populations, neither of which will apply for many demographic histories. By including a demographic model, within a likelihood framework based on coalescent theory, we can jointly study demographic history and population assignment. Genealogies and population assignments are sampled from a posterior distribution using a general isolation-with-migration model for multiple populations. A measure of partition distance between assignments facilitates not only the summary of a posterior sample of assignments, but also the estimation of the posterior density for the demographic history. It is shown that joint estimates of assignment and demographic history are possible, including estimation of population phylogeny for samples from three populations. The new method is compared to results of a widely used assignment method, using simulated and published empirical data sets. PMID:21775468
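The partition-distance summary used above to compare population assignments can be sketched as a brute-force computation: since population labels are arbitrary, the distance is the number of individuals left unmatched under the best relabeling. The function name and the exhaustive label-matching search are illustrative assumptions, not the authors' implementation (which would use an efficient matching algorithm).

```python
from itertools import permutations

def partition_distance(a, b):
    """Minimum number of individuals whose assignment must change to make
    partition `a` identical to partition `b`, searching over all matchings
    of cluster labels. Exponential in the number of clusters, so intended
    only for small numbers of populations."""
    labels_a, labels_b = sorted(set(a)), sorted(set(b))
    k = max(len(labels_a), len(labels_b))  # pad to a common cluster count
    best_agreement = 0
    for perm in permutations(range(k)):
        # perm maps the index of a cluster in `a` to a cluster index in `b`
        agree = 0
        for x, y in zip(a, b):
            j = perm[labels_a.index(x)]
            if j < len(labels_b) and labels_b[j] == y:
                agree += 1
        best_agreement = max(best_agreement, agree)
    return len(a) - best_agreement
```

Averaging this distance over a posterior sample of assignments gives one way to summarize assignment uncertainty, as the abstract describes.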
Flow motifs reveal limitations of the static framework to represent human interactions
NASA Astrophysics Data System (ADS)
Rocha, Luis E. C.; Blondel, Vincent D.
2013-04-01
Networks are commonly used to define underlying interaction structures where infections, information, or other quantities may spread. Although the standard approach has been to aggregate all links into a static structure, some studies have shown that the time order in which the links are established may alter the dynamics of spreading. In this paper, we study the impact of the time ordering on the limits of flow on various empirical temporal networks. By using random walk dynamics, we estimate the flow on links and convert the original undirected network (temporal and static) into a directed flow network. We then introduce the concept of flow motifs and quantify the divergence in the representativity of motifs when using the temporal and static frameworks. We find that the regularity of contacts and persistence of vertices (common in email communication and face-to-face interactions) result in small differences in the limits of flow for both frameworks. On the other hand, in the case of communication within a dating site and of a sexual network, the flow between vertices changes significantly in the temporal framework such that the static approximation poorly represents the structure of contacts. We have also observed that cliques with 3 and 4 vertices containing only low-flow links are more represented than the same cliques with all high-flow links. The representativity of these low-flow cliques is higher in the temporal framework. Our results suggest that the flow between vertices connected in cliques depends on the topological context in which they are placed and on the time sequence in which the links are established. The structure of the clique alone does not completely characterize the potential of flow between the vertices.
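The flow-estimation step described above (running random walks and orienting each link by its dominant traversal direction) can be sketched roughly as below. This is a simplified static-network stand-in: all names and defaults are assumptions, and the paper's temporal version would additionally respect the time order of links.

```python
import random
from collections import defaultdict

def random_walk_flows(edges, n_walks=2000, walk_len=20, seed=1):
    """Estimate directed flow on each link of an undirected network by
    counting traversals of many short random walks, then orienting each
    link in its dominant (net) direction."""
    rng = random.Random(seed)
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    traversals = defaultdict(int)
    nodes = list(adj)
    for _ in range(n_walks):
        node = rng.choice(nodes)
        for _ in range(walk_len):
            nxt = rng.choice(adj[node])
            traversals[(node, nxt)] += 1
            node = nxt
    flows = {}
    for u, v in edges:
        net = traversals[(u, v)] - traversals[(v, u)]
        flows[(u, v) if net >= 0 else (v, u)] = abs(net)
    return flows
```

The resulting directed flow network is the object on which motifs (small directed subgraphs with high/low-flow links) would then be counted.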
Code of Federal Regulations, 2014 CFR
2014-04-01
... and its committees, including the establishment of a clear and documented risk management framework... C derivatives clearing organization are consistent with the risk management framework established by... management committee, and material risk decisions; (9) Assign responsibility and accountability for risk...
A national framework for disaster health education in Australia.
FitzGerald, Gerard J; Aitken, Peter; Arbon, Paul; Archer, Frank; Cooper, David; Leggat, Peter; Myers, Colin; Robertson, Andrew; Tarrant, Michael; Davis, Elinor R
2010-01-01
Recent events have heightened awareness of disaster health issues and the need to prepare the health workforce to plan for and respond to major incidents. This has been reinforced at an international level by the World Association for Disaster and Emergency Medicine, which has proposed an international educational framework. The aim of this paper is to outline the development of a national educational framework for disaster health in Australia. The framework was developed on the basis of the literature and the previous experience of members of a National Collaborative for Disaster Health Education and Research. The Collaborative was brought together in a series of workshops and teleconferences, utilizing a modified Delphi technique to finalize the content at each level of the framework and to assign a value to the inclusion of that content at the various levels. The framework identifies seven educational levels along with educational outcomes for each level. The framework also identifies the recommended contents at each level and assigns a rating of depth for each component. The framework is not intended as a detailed curriculum, but rather as a guide for educationalists to develop specific programs at each level. This educational framework will provide an infrastructure around which future educational programs in Disaster Health in Australia may be designed and delivered. It will permit improved articulation for students between the various levels and greater consistency between programs so that operational responders may have a consistent language and operational approach to the management of major events.
A Framework to Assess the Cumulative Hydrological Impacts of Dams on flow Regime
NASA Astrophysics Data System (ADS)
Wang, Y.; Wang, D.
2016-12-01
In this study, we propose a framework to assess the cumulative impact of dams on the hydrological regime and use it to investigate the impacts of the Three Gorges Dam on the flow regime of the Yangtze River. We reconstructed the unregulated flow series to compare with the regulated flow series over the same period. Eco-surplus, eco-deficit, and the Indicators of Hydrologic Alteration (IHA) parameters were used to examine hydrological regime change. Among the IHA parameters, the Wilcoxon signed-rank test and principal components analysis identified the representative indicators of hydrological alteration. Eco-surplus and eco-deficit showed that the reservoir also changed the seasonal flow regime in autumn and winter. Changes in annual extreme flows and October flows lead to negative ecological implications downstream of the Three Gorges Dam. Ecological operation of the Three Gorges Dam is necessary to mitigate the negative effects on the river ecosystem in the middle reach of the Yangtze River. The framework proposed here could be a robust method for assessing the cumulative impacts of reservoir operation.
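A much-simplified reading of the eco-surplus and eco-deficit indicators named above: surplus as flow volume above a high-flow envelope of the unregulated regime, deficit as shortfall below a low-flow envelope. The published indicators are defined on flow-duration curves over seasons; this per-time-step envelope version, and all names, are illustrative assumptions.

```python
def eco_surplus_deficit(regulated, p25, p75):
    """Eco-surplus: regulated-flow volume above the 75th-percentile envelope
    of the unregulated regime; eco-deficit: shortfall below the
    25th-percentile envelope. All three arguments are aligned series of
    flows for the same time steps."""
    surplus = sum(max(0.0, q - hi) for q, hi in zip(regulated, p75))
    deficit = sum(max(0.0, lo - q) for q, lo in zip(regulated, p25))
    return surplus, deficit
```

A large deficit in autumn/winter would flag the kind of seasonal alteration the abstract reports downstream of the dam.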
Coupled Effects of non-Newtonian Rheology and Aperture Variability on Flow in a Single Fracture
NASA Astrophysics Data System (ADS)
Di Federico, V.; Felisa, G.; Lauriola, I.; Longo, S.
2017-12-01
Modeling of non-Newtonian flow in fractured media is essential in hydraulic fracturing and drilling operations, EOR, environmental remediation, and to understand magma intrusions. An important step in the modeling effort is a detailed understanding of flow in a single fracture, as the fracture aperture is spatially variable. A large bibliography exists on Newtonian and non-Newtonian flow in variable aperture fractures. Ultimately, stochastic or deterministic modeling leads to the flowrate under a given pressure gradient as a function of the parameters describing the aperture variability and the fluid rheology. Typically, analytical or numerical studies are performed adopting a power-law (Oswald-de Waele) model. Yet the power-law model, routinely used e.g. for hydro-fracturing modeling, does not characterize real fluids at low and high shear rates. A more appropriate rheological model is provided by e.g. the four-parameter Carreau constitutive equation, which is in turn approximated by the more tractable truncated power-law model. Moreover, fluids of interest may exhibit yield stress, which requires the Bingham or Herschel-Bulkely model. This study employs different rheological models in the context of flow in variable aperture fractures, with the aim of understanding the coupled effect of rheology and aperture spatial variability with a simplified model. The aperture variation, modeled within a stochastic or deterministic framework, is taken to be one-dimensional and i) perpendicular; ii) parallel to the flow direction; for stochastic modeling, the influence of different distribution functions is examined. Results for the different rheological models are compared with those obtained for the pure power-law. The adoption of the latter model leads to overestimation of the flowrate, more so for large aperture variability. The presence of yield stress also induces significant changes in the resulting flowrate for assigned external pressure gradient.
Agent Based Modeling of Air Carrier Behavior for Evaluation of Technology Equipage and Adoption
NASA Technical Reports Server (NTRS)
Horio, Brant M.; DeCicco, Anthony H.; Stouffer, Virginia L.; Hasan, Shahab; Rosenbaum, Rebecca L.; Smith, Jeremy C.
2014-01-01
As part of ongoing research, the National Aeronautics and Space Administration (NASA) and LMI developed a research framework to assist policymakers in identifying impacts on the U.S. air transportation system (ATS) of potential policies and technology related to the implementation of the Next Generation Air Transportation System (NextGen). This framework, called the Air Transportation System Evolutionary Simulation (ATS-EVOS), integrates multiple models into a single process flow to best simulate responses by U.S. commercial airlines and other ATS stakeholders to NextGen-related policies, and in turn, how those responses impact the ATS. Development of this framework required NASA and LMI to create an agent-based model of airline and passenger behavior. This Airline Evolutionary Simulation (AIRLINE-EVOS) models airline decisions about tactical airfare and schedule adjustments, and strategic decisions related to fleet assignments, market prices, and equipage. AIRLINE-EVOS models its own heterogeneous population of passenger agents that interact with airlines; this interaction allows the model to simulate the cycle of action-reaction as airlines compete with each other and engage passengers. We validated a baseline configuration of AIRLINE-EVOS against Airline Origin and Destination Survey (DB1B) data and subject matter expert opinion, and we verified the ATS-EVOS framework and agent behavior logic through scenario-based experiments. These experiments demonstrated AIRLINE-EVOS's capabilities in responding to an input price shock in fuel prices, and to equipage challenges in a series of analyses based on potential incentive policies for best equipped best served, optimal-wind routing, and traffic management initiative exemption concepts.
NASA Astrophysics Data System (ADS)
Soltanmohammadi, Hossein; Osanloo, Morteza; Aghajani Bazzazi, Abbas
2009-08-01
This study takes advantage of a previously developed framework for mined land suitability analysis (MLSA), consisting of economic, social, technical, and mine-site factors, to achieve a partial and also a complete pre-order of feasible post-mining land-uses. Analysis by an outranking multi-attribute decision-making (MADM) technique called PROMETHEE (preference ranking organization method for enrichment evaluation) was chosen because of its clear advantages in the field of MLSA compared with MADM ranking techniques. Application of the proposed approach to a mined land proceeds through several successive steps. First, the performance of the MLSA attributes is scored locally by each individual decision maker (DM). Then the assigned performance scores are normalized and the deviation amplitudes of non-dominated alternatives are calculated. Weights of the attributes are calculated in a separate procedure by another MADM technique, the analytical hierarchy process (AHP). Using the Gaussian preference function together with the weights, the preference indexes of the land-use alternatives are obtained. Calculation of the outgoing and entering flows of the alternatives and one-by-one comparison of these values lead to a partial pre-order of the alternatives, and calculation of the net flows leads to a ranked preference for each land-use. At the final step, utilizing the PROMETHEE group decision support system, which incorporates the judgments of all the DMs, a consensual ranking can be derived. In this paper, the preference order of post-mining land-uses for a hypothetical mined land has been derived according to the judgments of one DM to demonstrate the applicability of the proposed approach.
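The outgoing, entering, and net flows described above can be sketched as a minimal PROMETHEE II computation with a Gaussian preference function. The threshold `s`, the equal scaling of criteria, and the "higher score is better" convention are assumptions for illustration, not the paper's calibrated values.

```python
import math

def promethee_net_flows(scores, weights, s=0.5):
    """PROMETHEE II net flows with a Gaussian preference function.
    `scores[i][k]` is the performance of alternative i on criterion k
    (higher = better); `weights[k]` are criterion weights (e.g. from AHP);
    `s` is the Gaussian threshold parameter."""
    n = len(scores)

    def pref(d):
        # Gaussian preference: 0 for non-positive difference, rising smoothly
        return 0.0 if d <= 0 else 1.0 - math.exp(-(d * d) / (2 * s * s))

    # Aggregated preference index pi[i][j]: weighted preference of i over j.
    pi = [[sum(w * pref(a - b) for w, a, b in zip(weights, scores[i], scores[j]))
           for j in range(n)] for i in range(n)]
    phi_plus = [sum(pi[i]) / (n - 1) for i in range(n)]                  # outgoing flow
    phi_minus = [sum(row[i] for row in pi) / (n - 1) for i in range(n)]  # entering flow
    return [p - m for p, m in zip(phi_plus, phi_minus)]                  # net flow
```

Sorting land-use alternatives by net flow yields the complete pre-order; comparing phi_plus and phi_minus separately yields the partial pre-order (PROMETHEE I).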
NASA Astrophysics Data System (ADS)
Kincaid, T. R.; Meyer, B. A.
2009-12-01
In groundwater flow modeling, aquifer permeability is typically defined through model calibration. Since the pattern and size of conduits are part of a karstic permeability framework, those parameters should be constrainable through the same process given a sufficient density of measured conditions. H2H Associates has completed a dual-permeability steady-state model of groundwater flow through the western Santa Fe River Basin, Florida from which a 380.9 km network of saturated conduits was delineated through model calibration to heads and spring discharges. Two calibration datasets were compiled describing average high-water and average low-water conditions based on heads at 145 wells and discharge from 18 springs for the high-water scenario and heads at 188 wells and discharge from 9 springs for the low-water scenario. An initial conduit network was defined by assigning paths along mapped conduits and inferring paths along potentiometric troughs between springs and swallets that had been connected by groundwater tracing. These initial conduit assignments accounted for only 13.75 and 34.1 km of the final conduit network respectively. The model was setup using FEFLOW™ where conduits were described as discrete features embedded in a porous matrix. Flow in the conduits was described by the Manning-Strickler equation where variables for conduit area and roughness were used to adjust the volume and velocity of spring flows. Matrix flow was described by Darcy’s law where hydraulic conductivity variations were limited to three geologically defined internally homogeneous zones that ranged from ~2E-6 m/s to ~4E-3 m/s. Recharge for both the high-water and low-water periods was determined through a water budget analysis where variations were restricted to nine zones defined by land-use. All remaining variations in observed head were then assumed to be due to conduits. 
The model was iteratively calibrated to the high-water and low-water datasets wherein the location, size and roughness of the conduits were assigned as needed to accurately simulate observed heads and spring discharges while bounding simulated velocities by the tracer test results. Conduit diameters were adjusted to support high-water spring discharges but the locations were best determined by calibration to the low-water head field. The final model calibrated to within 5% of the total head change across the model region at 143 of the 145 wells in the high-water scenario and at 176 of the 188 wells in the low-water scenario. Simulated spring discharges fell within 13% of the observed range under high-water conditions and to within 100% of the observed range under low-water conditions. Simulated velocities ranged from as low as 10^-4 m/day in the matrix to as high as 10^3 m/day in the largest conduits. The significance of these results that we emphasize here is two-fold. First, plausible karstic groundwater flow conditions can be reasonably simulated if adequate efforts are made to include springs, swallets, caves, and traced flow paths. And second, detailed saturated conduit networks can be delineated from careful evaluation of hydraulic head data particularly when dense datasets can be constructed by correlating values obtained from different wells under similar hydraulic periods.
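The Manning-Strickler relation used for conduit flow in this model can be written v = k_s * R_h^(2/3) * S^(1/2), with hydraulic radius R_h = A/P, Strickler roughness k_s (the inverse of Manning's n), and hydraulic gradient S. A small sketch in SI units, assuming a circular conduit flowing full (function names are illustrative, not FEFLOW's API):

```python
import math

def manning_strickler_velocity(k_s, area, perimeter, gradient):
    """Mean flow velocity via Manning-Strickler: v = k_s * R_h**(2/3) * S**0.5,
    with hydraulic radius R_h = area / wetted perimeter (SI units)."""
    r_h = area / perimeter
    return k_s * r_h ** (2.0 / 3.0) * math.sqrt(gradient)

def conduit_discharge(k_s, diameter, gradient):
    """Discharge Q = v * A for a circular conduit flowing full
    (R_h reduces to diameter / 4 in this geometry)."""
    area = math.pi * diameter ** 2 / 4.0
    perimeter = math.pi * diameter
    return manning_strickler_velocity(k_s, area, perimeter, gradient) * area
```

During calibration, diameter (via area) and roughness k_s are the two knobs described in the abstract for adjusting the volume and velocity of spring flows.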
Looking for a Possible Framework to Teach Contemporary Art in Primary School
ERIC Educational Resources Information Center
Vahter, Edna
2016-01-01
Traditionally, the learning of arts in the Estonian primary school has meant completion of practical assignments given by the teacher. The new national curriculum for basic school adopted in 2010 sets out new requirements for art education where the emphasis, in addition to practical assignments, is on discussion and understanding of art. The…
A Taxonomy of Digital Media Types for Learner-Generated Digital Media Assignments
ERIC Educational Resources Information Center
Reyna, Jorge; Hanham, Jose; Meier, Peter
2017-01-01
The notion of students as co-creators of content in higher education is gaining popularity, with an ever-increasing emphasis on the development of digital media assignments. In a separate paper, the authors introduced the Digital Media Literacies Framework, which is composed of three interrelated domains: (1) conceptual, (2) functional, and (3)…
ERIC Educational Resources Information Center
Vos, Hans J.
An approach to simultaneous optimization of assignments of subjects to treatments followed by an end-of-mastery test is presented using the framework of Bayesian decision theory. Focus is on demonstrating how rules for the simultaneous optimization of sequences of decisions can be found. The main advantages of the simultaneous approach, compared…
ERIC Educational Resources Information Center
Silva, Pedro
2017-01-01
There are several technological tools which aim to support first year students' challenges, especially when it comes to academic writing. This paper analyses one of these tools, Wiley's AssignMentor. The Technological Pedagogical Content Knowledge framework was used to systematise this analysis. The paper showed an alignment between the tools'…
Sense of place: Mount Desert Island residents and Acadia National Park
Nicole L. Ballinger; Robert E. Manning
1998-01-01
The framework of sense of place, developed by humanistic geographers, has been employed by researchers in their efforts to explain the range of attachments, values, and meanings assigned to natural areas. This study used an exploratory approach to address the range of values and meanings assigned by local residents to places in Acadia National Park. Qualitative...
Students' Understanding and Perceptions of Assigned Team Roles in a Classroom Laboratory Environment
ERIC Educational Resources Information Center
Ott, Laura E.; Kephart, Kerrie; Stolle-McAllister, Kathleen; LaCourse, William R.
2018-01-01
Using a cooperative learning framework in a quantitative reasoning laboratory course, students were assigned to static teams of four in which they adopted roles that rotated regularly. The roles included: team leader, protocol manager, data recorder, and researcher. Using a mixed-methods approach, we investigated students' perceptions of the team…
Constructing cardiovascular fitness knowledge in physical education
Zhang, Tan; Chen, Ang; Chen, Senlin; Hong, Deockki; Loflin, Jerry; Ennis, Catherine
2015-01-01
In physical education, it has become necessary for children to learn kinesiological knowledge for understanding the benefits of physical activity and developing a physically active lifestyle. This study was conducted to determine the extent to which cognitive assignments about healthful living and fitness contributed to knowledge growth on cardiorespiratory fitness and health. Fourth-grade students (N = 616) from 15 randomly sampled urban elementary schools completed 34 cognitive assignments related to the cardiorespiratory physical activities they were engaged in across 10 lessons. Performance on the assignments was analyzed in relation to their knowledge gain measured using a standardized knowledge test. A multivariate discriminant analysis revealed that the cognitive assignments contributed to knowledge gain but the contribution varied assignment by assignment. A multiple regression analysis indicated that students’ assignment performance by lesson contributed positively to their knowledge growth scores. A content analysis based on the constructivist learning framework showed that observing–reasoning assignments contributed the most to knowledge growth. Analytical and analytical–application assignments contributed less than the constructivist theories would predict. PMID:25995702
Neural Meta-Memes Framework for Combinatorial Optimization
NASA Astrophysics Data System (ADS)
Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon
In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through an empirical study on a class of combinatorial problems, the quadratic assignment problem (QAP).
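The QAP named above minimizes total flow-weighted distance over assignments of facilities to locations. A minimal sketch of the objective and of one elementary "meme" of the kind NMMF would coordinate, here a pairwise-swap local search (the instance numbers are invented; NMMF's actual memes and neural coordinator are not reproduced):

```python
from itertools import permutations

def qap_cost(flow, dist, perm):
    """QAP objective: total flow(i, j) * dist(perm(i), perm(j)) when
    facility i is assigned to location perm(i)."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def swap_local_search(flow, dist, perm):
    """One elementary 'meme': accept pairwise swaps while they lower cost."""
    perm, best = list(perm), qap_cost(flow, dist, perm)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for j in range(i + 1, len(perm)):
                perm[i], perm[j] = perm[j], perm[i]
                cost = qap_cost(flow, dist, perm)
                if cost < best:
                    best, improved = cost, True
                else:
                    perm[i], perm[j] = perm[j], perm[i]  # undo the swap
    return perm, best

# Tiny invented instance (4 facilities, 4 locations):
FLOW = [[0, 3, 0, 2], [3, 0, 0, 1], [0, 0, 0, 4], [2, 1, 4, 0]]
DIST = [[0, 22, 53, 53], [22, 0, 40, 62], [53, 40, 0, 55], [53, 62, 55, 0]]
best_perm, best_cost = swap_local_search(FLOW, DIST, [0, 1, 2, 3])
```

On an instance this small the result can be checked against brute-force enumeration; for realistic sizes the point of a meta-framework is precisely that no single meme suffices.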
A task scheduler framework for self-powered wireless sensors.
Nordman, Mikael M
2003-10-01
The cost and inconvenience of cabling are factors limiting widespread use of intelligent sensors. Recent developments in short-range, low-power radio seem to provide an opening to this problem, making the development of wireless sensors feasible. For these sensors, however, energy availability is a main concern. The common solution is either to use a battery or to harvest ambient energy. The benefit of harvested ambient energy is that the energy supply can be considered to last a lifetime, saving the user from concerns related to energy management. The problem, however, is the unpredictability and unsteady behavior of ambient energy sources. This becomes a main concern for sensors that run multiple tasks at different priorities. This paper proposes a new scheduler framework that enables the reliable assignment of task priorities and scheduling in sensors powered by ambient energy. The framework, based on environment parameters, virtual queues, and a state machine with transition conditions, dynamically manages task execution according to priorities. The framework is assessed in a test system powered by a solar panel. The results show the functionality of the framework and how task execution is handled reliably without violating the assigned priority scheme.
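The combination of priority queues and an energy-driven state machine described above can be sketched minimally as follows, assuming a single threshold-based low-energy state; the task fields, threshold, and admission policy are illustrative and not the paper's actual framework:

```python
import heapq

def schedule(tasks, energy_budget, low_threshold=0.3):
    """Sketch of priority scheduling gated by an energy state machine:
    tasks run in priority order (0 = most urgent); when the remaining
    harvested energy drops below a threshold fraction of the budget, the
    scheduler enters a low-energy state admitting only priority-0 tasks."""
    queue = [(t["priority"], i, t) for i, t in enumerate(tasks)]
    heapq.heapify(queue)
    executed, remaining = [], energy_budget
    while queue:
        priority, _, task = heapq.heappop(queue)
        if remaining < low_threshold * energy_budget and priority > 0:
            continue                 # state machine defers non-critical work
        if task["energy"] > remaining:
            continue                 # cannot afford this task at all
        remaining -= task["energy"]
        executed.append(task["name"])
    return executed, remaining

tasks = [{"name": "sample",   "priority": 0, "energy": 2},
         {"name": "log",      "priority": 2, "energy": 3},
         {"name": "transmit", "priority": 1, "energy": 4},
         {"name": "resample", "priority": 0, "energy": 2}]
done, left = schedule(tasks, energy_budget=10)
```

Low-priority work is dropped only once the energy state flips, so the priority scheme is never inverted, which is the property the paper's evaluation emphasizes.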
A Markov Random Field Framework for Protein Side-Chain Resonance Assignment
NASA Astrophysics Data System (ADS)
Zeng, Jianyang; Zhou, Pei; Donald, Bruce Randall
Nuclear magnetic resonance (NMR) spectroscopy plays a critical role in structural genomics, and serves as a primary tool for determining protein structures, dynamics and interactions in physiologically-relevant solution conditions. The current speed of protein structure determination via NMR is limited by the lengthy time required in resonance assignment, which maps spectral peaks to specific atoms and residues in the primary sequence. Although numerous algorithms have been developed to address the backbone resonance assignment problem [68,2,10,37,14,64,1,31,60], little work has been done to automate side-chain resonance assignment [43, 48, 5]. Most previous attempts in assigning side-chain resonances depend on a set of NMR experiments that record through-bond interactions with side-chain protons for each residue. Unfortunately, these NMR experiments have low sensitivity and limited performance on large proteins, which makes it difficult to obtain enough side-chain resonance assignments. On the other hand, it is essential to obtain almost all of the side-chain resonance assignments as a prerequisite for high-resolution structure determination. To overcome this deficiency, we present a novel side-chain resonance assignment algorithm based on alternative NMR experiments measuring through-space interactions between protons in the protein, which also provide crucial distance restraints and are normally required in high-resolution structure determination. We cast the side-chain resonance assignment problem into a Markov Random Field (MRF) framework, and extend and apply combinatorial protein design algorithms to compute the optimal solution that best interprets the NMR data. Our MRF framework captures the contact map information of the protein derived from NMR spectra, and exploits the structural information available from the backbone conformations determined by orientational restraints and a set of discretized side-chain conformations (i.e., rotamers). 
A Hausdorff-based computation is employed in the scoring function to evaluate the probability of side-chain resonance assignments to generate the observed NMR spectra. The complexity of the assignment problem is first reduced by using a dead-end elimination (DEE) algorithm, which prunes side-chain resonance assignments that are provably not part of the optimal solution. Then an A* search algorithm is used to find a set of optimal side-chain resonance assignments that best fit the NMR data. We have tested our algorithm on NMR data for five proteins, including the FF Domain 2 of human transcription elongation factor CA150 (FF2), the B1 domain of Protein G (GB1), human ubiquitin, the ubiquitin-binding zinc finger domain of the human Y-family DNA polymerase Eta (pol η UBZ), and the human Set2-Rpb1 interacting domain (hSRI). Our algorithm assigns resonances for more than 90% of the protons in the proteins, and achieves about 80% correct side-chain resonance assignments. The final structures computed using distance restraints resulting from the set of assigned side-chain resonances have backbone RMSD 0.5 - 1.4 Å and all-heavy-atom RMSD 1.0 - 2.2 Å from the reference structures that were determined by X-ray crystallography or traditional NMR approaches. These results demonstrate that our algorithm can be successfully applied to automate side-chain resonance assignment and high-quality protein structure determination. Since our algorithm does not require any specific NMR experiments for measuring the through-bond interactions with side-chain protons, it can save a significant amount of both experimental cost and spectrometer time, and hence accelerate the NMR structure determination process.
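The dead-end elimination step mentioned above can be sketched with the classic Goldstein criterion; the data layout and toy energies below are illustrative, and the paper's actual NMR-derived scoring is not reproduced:

```python
def dee_prune(self_E, pair_E):
    """Goldstein dead-end elimination: candidate r at position i is pruned
    if some competitor t at i satisfies
      E(i_r) - E(i_t) + sum_j min_s [E(i_r, j_s) - E(i_t, j_s)] > 0,
    i.e. switching r -> t lowers the energy of every completion, so r
    provably cannot appear in the optimal assignment.
    self_E[i][r]: self energy of candidate r at position i.
    pair_E[(i, j)][r][s]: pair energy between r at i and s at j."""
    n = len(self_E)
    pruned = set()
    for i in range(n):
        for r in range(len(self_E[i])):
            for t in range(len(self_E[i])):
                if t == r:
                    continue
                gap = self_E[i][r] - self_E[i][t]
                for j in range(n):
                    if j != i:
                        gap += min(pair_E[(i, j)][r][s] - pair_E[(i, j)][t][s]
                                   for s in range(len(self_E[j])))
                if gap > 0:
                    pruned.add((i, r))
                    break
    return pruned

# Toy instance: at position 0, candidate 0 is dominated by candidate 1.
SELF = [[5.0, 1.0], [0.0, 0.0]]
PAIR = {(0, 1): [[0.0, 0.0], [0.0, 0.0]],
        (1, 0): [[0.0, 0.0], [0.0, 0.0]]}
```

Pruning shrinks the space that the subsequent A* search must explore while guaranteeing the optimum is preserved.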
Smart-Grid Backbone Network Real-Time Delay Reduction via Integer Programming.
Pagadrai, Sasikanth; Yilmaz, Muhittin; Valluri, Pratyush
2016-08-01
This research investigates an optimal delay-based virtual topology design using integer linear programming (ILP), which is applied to current backbone networks such as smart-grid real-time communication systems. A network traffic matrix is applied and the corresponding virtual topology problem is solved using ILP formulations that include a network delay-dependent objective function and lightpath routing, wavelength assignment, wavelength continuity, flow routing, and traffic loss constraints. The proposed optimization approach provides an efficient deterministic integration of intelligent sensing and decision making, and network learning features for superior smart-grid operations by adaptively responding to time-varying network traffic data as well as operational constraints to maintain optimal virtual topologies. A representative optical backbone network has been utilized to demonstrate the proposed optimization framework; simulation results indicate that superior smart-grid network performance can be achieved using commercial networks and integer programming.
Some implementational issues of convection schemes for finite volume formulations
NASA Technical Reports Server (NTRS)
Thakur, Siddharth; Shyy, Wei
1993-01-01
Two higher-order upwind schemes - second-order upwind and QUICK - are examined in terms of their interpretation, implementation as well as performance for a recirculating flow in a lid-driven cavity, in the context of a control volume formulation using the SIMPLE algorithm. The present formulation of these schemes is based on a unified framework wherein the first-order upwind scheme is chosen as the basis, with the remaining terms being assigned to the source term. The performance of these schemes is contrasted with the first-order upwind and second-order central difference schemes. Also addressed in this study is the issue of boundary treatment associated with these higher-order upwind schemes. Two different boundary treatments - one that uses a two-point scheme consistently within a given control volume at the boundary, and the other that maintains consistency of flux across the interior face between the adjacent control volumes - are formulated and evaluated.
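The unified framework described above, a first-order upwind basis with the higher-order remainder deferred to the source term, can be sketched for a single face value on a uniform 1-D grid with flow in the +x direction (a simplified illustration, not the authors' control-volume code):

```python
def face_value(phi, i, scheme="quick"):
    """East-face value at i+1/2 on a uniform grid with flow in +x, split
    into a first-order upwind basis plus a higher-order correction that a
    SIMPLE-type solver would defer to the source term.
    phi[i] is the upwind node, phi[i-1] far-upwind, phi[i+1] downstream."""
    basis = phi[i]                      # first-order upwind
    if scheme == "upwind1":
        correction = 0.0
    elif scheme == "upwind2":           # phi_f = 1.5*phi_U - 0.5*phi_UU
        correction = 0.5 * (phi[i] - phi[i - 1])
    elif scheme == "quick":             # phi_f = (6*phi_U + 3*phi_D - phi_UU)/8
        correction = (3.0 * phi[i + 1] - 2.0 * phi[i] - phi[i - 1]) / 8.0
    else:
        raise ValueError("unknown scheme: " + scheme)
    return basis, correction

# Both higher-order schemes reproduce a linear profile exactly:
phi = [0.0, 1.0, 2.0, 3.0]
u2 = sum(face_value(phi, 1, "upwind2"))
uq = sum(face_value(phi, 1, "quick"))
```

Keeping only the upwind basis in the coefficient matrix preserves diagonal dominance, while the deferred correction recovers higher-order accuracy at convergence.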
Some implementational issues of convection schemes for finite-volume formulations
NASA Technical Reports Server (NTRS)
Thakur, Siddharth; Shyy, Wei
1993-01-01
Two higher-order upwind schemes - second-order upwind and QUICK - are examined in terms of their interpretation, implementation, as well as performance for a recirculating flow in a lid-driven cavity, in the context of a control-volume formulation using the SIMPLE algorithm. The present formulation of these schemes is based on a unified framework wherein the first-order upwind scheme is chosen as the basis, with the remaining terms being assigned to the source term. The performance of these schemes is contrasted with the first-order upwind and second-order central difference schemes. Also addressed in this study is the issue of boundary treatment associated with these higher-order upwind schemes. Two different boundary treatments - one that uses a two-point scheme consistently within a given control volume at the boundary, and the other that maintains consistency of flux across the interior face between the adjacent control volumes - are formulated and evaluated.
NASA Astrophysics Data System (ADS)
Ramezanpour, Abolfazl; Mashaghi, Alireza
2017-07-01
A fundamental problem in medicine and biology is to assign states, e.g. healthy or diseased, to cells, organs or individuals. State assignment or making a diagnosis is often a nontrivial and challenging process and, with the advent of omics technologies, the diagnostic challenge is becoming more and more serious. The challenge lies not only in the increasing number of measured properties and dynamics of the system (e.g. cell or human body) but also in the co-evolution of multiple states and overlapping properties, and degeneracy of states. We develop, from first principles, a generic rational framework for state assignment in cell biology and medicine, and demonstrate its applicability with a few simple theoretical case studies from medical diagnostics. We show how disease-related statistical information can be used to build a comprehensive model that includes the relevant dependencies between clinical and laboratory findings (signs) and diseases. In particular, we include disease-disease and sign-sign interactions and study how one can infer the probability of a disease in a patient with given signs. We perform comparative analysis with simple benchmark models to check the performance of our models. We find that including interactions can significantly change the statistical importance of the signs and diseases. This first-principles approach, as we show, facilitates the early diagnosis of disease by taking interactions into account, and enables the construction of consensus diagnostic flow charts. Additionally, we envision that our approach will find applications in systems biology, and in particular, in characterizing the phenome via the metabolome, the proteome, the transcriptome, and the genome.
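The effect of disease-disease interactions described above can be illustrated with a toy exponential-family model for two binary diseases; the functional form, names, and numbers below are assumptions for illustration, not the authors' actual model:

```python
import math

def disease_posterior(signs, field, coupling, interaction):
    """Marginal disease probabilities in a toy interaction-aware model with
    two binary diseases d0, d1 and binary signs:
      P(d0, d1 | signs) ~ exp( sum_a field[a]*d_a
                               + sum_{a,i} coupling[a][i]*d_a*signs[i]
                               + interaction*d0*d1 ),
    evaluated by brute-force enumeration of the four disease states."""
    weight = {}
    for d0 in (0, 1):
        for d1 in (0, 1):
            score = field[0] * d0 + field[1] * d1 + interaction * d0 * d1
            for a, d in ((0, d0), (1, d1)):
                score += d * sum(coupling[a][i] * s
                                 for i, s in enumerate(signs))
            weight[(d0, d1)] = math.exp(score)
    z = sum(weight.values())
    return [sum(w for d, w in weight.items() if d[a] == 1) / z
            for a in (0, 1)]

# Sign 0 points only to disease 0, yet a positive disease-disease
# interaction still raises the probability of disease 1 (comorbidity).
p_indep = disease_posterior([1, 0], [-1.0, -1.0],
                            [[2.0, 0.0], [0.0, 2.0]], 0.0)
p_inter = disease_posterior([1, 0], [-1.0, -1.0],
                            [[2.0, 0.0], [0.0, 2.0]], 2.0)
```

Even this toy case shows how interactions reshape the statistical importance of signs, the effect the abstract reports.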
Fair sharing of resources in a supply network with constraints.
Carvalho, Rui; Buzna, Lubos; Just, Wolfram; Helbing, Dirk; Arrowsmith, David K
2012-04-01
This paper investigates the effect of network topology on the fair allocation of network resources among a set of agents, an all-important issue for the efficiency of transportation networks all around us. We analyze a generic mechanism that distributes network capacity fairly among existing flow demands. The problem can be solved by semianalytical methods on a nearest-neighbor graph with one source and sink pair, when transport occurs over shortest paths. For this setup, we uncover a broad range of patterns of intersecting shortest paths as a function of the distance between the source and the sink. When the number of intersections is the maximum and the distance between the source and the sink is large, we find that a fair allocation implies a decrease of at least 50% from the maximum throughput. We also find that the histogram of the flow allocations assigned to the agents decays as a power law with exponent -1. Our semianalytical framework suggests possible explanations for the well-known reduction of the throughput in fair allocations. It also suggests that the combination of network topology and routing rules can lead to highly uneven (but fair) distributions of resources, a remark of caution to network designers.
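The fair-allocation mechanism analyzed above is, in essence, max-min fairness, which can be computed by progressive filling. A minimal sketch on an invented three-flow, two-link topology (flow names, links, and capacities are illustrative):

```python
def max_min_fair(flows, capacity):
    """Progressive filling for max-min fairness: raise all unfrozen flow
    rates at the same speed; whenever a link saturates, freeze every flow
    crossing it. flows: id -> set of links; capacity: link -> capacity."""
    rate = {f: 0.0 for f in flows}
    frozen = set()
    cap = dict(capacity)
    while len(frozen) < len(flows):
        users = {link: [f for f in flows
                        if link in flows[f] and f not in frozen]
                 for link in cap}
        active = {link: u for link, u in users.items() if u}
        if not active:
            break
        # smallest uniform increment that saturates some link
        inc = min(cap[link] / len(u) for link, u in active.items())
        for f in rate:
            if f not in frozen:
                rate[f] += inc
        for link, u in active.items():
            cap[link] -= inc * len(u)
            if cap[link] <= 1e-12:
                frozen.update(u)
    return rate

# B shares link l1 with A and link l2 with C:
rates = max_min_fair({"A": {"l1"}, "B": {"l1", "l2"}, "C": {"l2"}},
                     {"l1": 1.0, "l2": 2.0})
```

Flow B, which crosses the bottleneck, is frozen early while C absorbs the slack on l2, the kind of uneven-but-fair outcome the paper highlights.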
Bremerhaven's Groundwater under Climate Change - A FREEWAT Case Study
NASA Astrophysics Data System (ADS)
Panteleit, Björn; Jensen, Sven; Seiter, Katherina; Siebert, Yvonne
2018-01-01
A 3D structural model was created for the state of Bremen based on an extensive borehole database. Parameters were assigned to the model by interpretation and interpolation of the borehole descriptions. This structural model was transferred into a flow model via the FREEWAT platform, an open-source plug-in of the free QGIS software, with connection to the MODFLOW code. This groundwater management tool is intended for long-term use. As a case study for the FREEWAT Project, possible effects of climate change on groundwater levels in the Bremerhaven area have been simulated. In addition to the calibration year 2010, scenarios with a sea-level rise and decreasing groundwater recharge were simulated for the years 2040, 2070 and 2100. In addition to seawater intrusion in the coastal area, declining groundwater levels are also a concern. Possibilities for future groundwater management already include active control of the water level of a lake and the harbor basin. With the help of a focused groundwater monitoring program based on the model results, the planned flow model can become an important forecasting tool for groundwater management within the framework of the planned continuous model management and for representing the effects of changing climatic conditions and mitigation measures.
Fair sharing of resources in a supply network with constraints
NASA Astrophysics Data System (ADS)
Carvalho, Rui; Buzna, Lubos; Just, Wolfram; Helbing, Dirk; Arrowsmith, David K.
2012-04-01
This paper investigates the effect of network topology on the fair allocation of network resources among a set of agents, an all-important issue for the efficiency of transportation networks all around us. We analyze a generic mechanism that distributes network capacity fairly among existing flow demands. The problem can be solved by semianalytical methods on a nearest-neighbor graph with one source and sink pair, when transport occurs over shortest paths. For this setup, we uncover a broad range of patterns of intersecting shortest paths as a function of the distance between the source and the sink. When the number of intersections is the maximum and the distance between the source and the sink is large, we find that a fair allocation implies a decrease of at least 50% from the maximum throughput. We also find that the histogram of the flow allocations assigned to the agents decays as a power law with exponent -1. Our semianalytical framework suggests possible explanations for the well-known reduction of the throughput in fair allocations. It also suggests that the combination of network topology and routing rules can lead to highly uneven (but fair) distributions of resources, a remark of caution to network designers.
Information Transfer in the Brain: Insights from a Unified Approach
NASA Astrophysics Data System (ADS)
Marinazzo, Daniele; Wu, Guorong; Pellicoro, Mario; Stramaglia, Sebastiano
Measuring directed interactions in the brain in terms of information flow is a promising approach, mathematically treatable and amenable to encompass several methods. In this chapter we propose some approaches rooted in this framework for the analysis of neuroimaging data. First we will explore how the transfer of information depends on the network structure, showing how for hierarchical networks the information flow pattern is characterized by exponential distribution of the incoming information and a fat-tailed distribution of the outgoing information, as a signature of the law of diminishing marginal returns. This was reported to be true also for effective connectivity networks from human EEG data. Then we address the problem of partial conditioning to a limited subset of variables, chosen as the most informative ones for the driver node. We will then propose a formal expansion of the transfer entropy to put in evidence irreducible sets of variables which provide information for the future state of each assigned target. Multiplets characterized by a large contribution to the expansion are associated to informational circuits present in the system, with an informational character (synergetic or redundant) which can be associated to the sign of the contribution. Applications are reported for EEG and fMRI data.
A framework for estimating potential fluid flow from digital imagery
NASA Astrophysics Data System (ADS)
Luttman, Aaron; Bollt, Erik M.; Basnayake, Ranil; Kramer, Sean; Tufillaro, Nicholas B.
2013-09-01
Given image data of a fluid flow, the flow field, ⟨u,v⟩, governing the evolution of the system can be estimated using a variational approach to optical flow. Assuming that the flow field governing the advection is the symplectic gradient of a stream function or the gradient of a potential function—both falling under the category of a potential flow—it is natural to re-frame the optical flow problem to reconstruct the stream or potential function directly rather than the components of the flow individually. There are several advantages to this framework. Minimizing a functional based on the stream or potential function rather than based on the components of the flow will ensure that the computed flow is a potential flow. Next, this approach allows a more natural method for imposing scientific priors on the computed flow, via regularization of the optical flow functional. Also, this paradigm shift gives a framework—rather than an algorithm—and can be applied to nearly any existing variational optical flow technique. In this work, we develop the mathematical formulation of the potential optical flow framework and demonstrate the technique on synthetic flows that represent important dynamics for mass transport in fluid flows, as well as a flow generated by a satellite data-verified ocean model of temperature transport.
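The symplectic-gradient relationship cited above, u = ∂ψ/∂y and v = -∂ψ/∂x, can be sketched on a discrete grid to show why reconstructing ψ directly guarantees a divergence-free field (a minimal finite-difference illustration, not the authors' variational solver):

```python
def flow_from_stream(psi, h):
    """Velocity field from a stream function psi on a uniform grid with
    spacing h: u = d(psi)/dy, v = -d(psi)/dx (the symplectic gradient),
    via central differences at interior points. Any such field is
    divergence-free by construction, which is the appeal of estimating
    psi directly rather than u and v separately."""
    ny, nx = len(psi), len(psi[0])
    u = [[0.0] * nx for _ in range(ny)]
    v = [[0.0] * nx for _ in range(ny)]
    for i in range(1, ny - 1):        # i indexes y, j indexes x
        for j in range(1, nx - 1):
            u[i][j] = (psi[i + 1][j] - psi[i - 1][j]) / (2.0 * h)
            v[i][j] = -(psi[i][j + 1] - psi[i][j - 1]) / (2.0 * h)
    return u, v

# psi = x*y gives the saddle flow u = x, v = -y, whose divergence is zero.
psi = [[i * j for j in range(5)] for i in range(5)]
u, v = flow_from_stream(psi, 1.0)
```

A potential-function flow (gradient of a scalar) works the same way with the signs changed; in either case the scientific prior enters through regularization of ψ rather than of the velocity components.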
Geologic Map of the Middle East Rift Geothermal Subzone, Kilauea Volcano, Hawaii
Trusdell, Frank A.; Moore, Richard B.
2006-01-01
Kīlauea is an active shield volcano in the southeastern part of the Island of Hawai'i. The middle east rift zone (MERZ) map includes about 27 square kilometers of the MERZ and shows the distribution of the products of 37 separate eruptions during late Holocene time. Lava flows erupted during 1983-96 have reached the mapped area. The subaerial part of the MERZ is 3-4 km wide and about 18 km long. It is a constructional ridge, 50-150 m above the adjoining terrain, marked by low spatter ramparts and cones as high as 60 m. Lava typically flowed either northeast or southeast, depending on vent location relative to the topographic crest of the rift zone. The MERZ receives more than 100 in. of rainfall annually and is covered by tropical rain forest. Vegetation begins to grow on lava a few months after its eruption. Relative heights of trees can be a guide to relative ages of underlying lava flows, but proximity to faults, presence of easily weathered cinders, and human activity also affect the rate of growth. The rocks have been grouped into five basic age groups. The framework for the ages assigned is provided by eight radiocarbon ages from previous mapping by the authors and a single date from the current mapping effort. The numerical ages are supplemented by observations of stratigraphic relations, degree of weathering, soil development, and vegetative cover.
2004-12-01
handling using the X10 home automation protocol. Each 3D graphics client renders its scene according to an assigned virtual camera position. By having...control protocol. DMX is a versatile and robust framework which overcomes limitations of the X10 home automation protocol which we are currently using
Ecological subregion codes by county, coterminous United States
Victor A. Rudis
1999-01-01
This publication presents the National Hierarchical Framework of Ecological Units (ECOMAP 1993) by county for the coterminous United States. Assignment of the framework to individual counties is based on the predominant area by province and section to facilitate integration of county-referenced information with areas of uniform ecological potential. Included are maps...
Locating Community Action Outreach Projects in the Scholarship of Media Literacy Pedagogy
ERIC Educational Resources Information Center
Crandall, Heather
2016-01-01
This paper compares frameworks in recent critical media literacy scholarship with trends found in eight semesters of media literacy community action outreach assignments to explore how these frameworks can function as curricular tools for media literacy practitioners. Besides potential tools for media literacy pedagogy, this examination of recent…
Effects of Early Writing Intervention Delivered within a Data-Based Instruction Framework
ERIC Educational Resources Information Center
Jung, Pyung-Gang; McMaster, Kristen L.; delMas, Robert C.
2017-01-01
We examined effects of research-based early writing intervention delivered within a data-based instruction (DBI) framework for children with intensive needs. We randomly assigned 46 students with and without disabilities in Grades 1 to 3 within classrooms to either treatment or control. Treatment students received research-based early writing…
Activities with Parents on the Computer: An Ecological Framework
ERIC Educational Resources Information Center
Paiva, João C.; Morais, Carla; Moreira, Luciano
2017-01-01
This paper proposes an ecological framework "Activities with Parents on the Computer" (APC) to bridge home and school contexts by involving parents and students in digital media based assignments. An exploratory case-study was conducted based on ten parent-child dyads that engaged in an APC at home. Attitudes were assessed through a…
Flow assignment model for quantitative analysis of diverting bulk freight from road to railway
Liu, Chang; Wang, Jiaxi; Xiao, Jie; Liu, Siqi; Wu, Jianping; Li, Jian
2017-01-01
Since railway transport possesses the advantages of high volume and low carbon emissions, diverting some freight from road to railway will help reduce the negative environmental impacts associated with transport. This paper develops a flow assignment model for quantitative analysis of diverting truck freight to railway. First, a general network which considers road transportation, railway transportation, handling and transferring is established according to all the steps in the whole transportation process. Then general cost functions are formulated to embody the factors that shippers consider when choosing mode and path; these functions capture road congestion costs and the capacity constraints of railways and freight stations. Based on the general network and general cost functions, a user equilibrium flow assignment model is developed to simulate the flow distribution on the general network under the condition that all shippers choose transportation mode and path independently. Since the model is nonlinear and challenging to solve, we adopt a linearization method that uses tangent lines to construct an envelope curve. Finally, a numerical example is presented to test the model and show the method of making a quantitative analysis of bulk freight modal shift between road and railway. PMID:28771536
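The user-equilibrium condition invoked above means that all used routes (or modes) end up with equal generalized cost. As a minimal sketch of that concept on two parallel routes, the method of successive averages can stand in for the paper's tangent-line linearization (the cost coefficients and demand are invented):

```python
def msa_equilibrium(demand, route_costs, iterations=500):
    """Method of successive averages on parallel routes: repeatedly send
    all demand to the currently cheapest route, then average with step
    size 1/k. At user equilibrium every used route has equal cost."""
    n = len(route_costs)
    flow = [demand / n] * n
    for k in range(1, iterations + 1):
        costs = [c(x) for c, x in zip(route_costs, flow)]
        best = costs.index(min(costs))
        flow = [x + ((demand if i == best else 0.0) - x) / k
                for i, x in enumerate(flow)]
    return flow

# Two-route toy network with linear congestion costs:
routes = [lambda x: 10 + 0.10 * x,   # shorter, but congests faster
          lambda x: 15 + 0.05 * x]
flow = msa_equilibrium(100.0, routes)
```

At equilibrium roughly two-thirds of the demand takes the shorter route, at which point the two generalized costs coincide, the same fixed point the paper's linearized model solves for directly.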
Yi, Chucai; Tian, Yingli
2012-09-01
In this paper, we propose a novel framework to extract text regions from scene images with complex backgrounds and multiple text appearances. This framework consists of three main steps: boundary clustering (BC), stroke segmentation, and string fragment classification. In BC, we propose a new bigram-color-uniformity-based method to model both text and attachment surface, and cluster edge pixels based on color pairs and spatial positions into boundary layers. Then, stroke segmentation is performed at each boundary layer by color assignment to extract character candidates. We propose two algorithms to combine the structural analysis of text stroke with color assignment and filter out background interferences. Further, we design a robust string fragment classification based on Gabor-based text features. The features are obtained from feature maps of gradient, stroke distribution, and stroke width. The proposed framework of text localization is evaluated on scene images, born-digital images, broadcast video images, and images of handheld objects captured by blind persons. Experimental results on respective datasets demonstrate that the framework outperforms state-of-the-art localization algorithms.
Novel, Effective, Whole: Toward a NEW Framework for Evaluations of Creative Products
ERIC Educational Resources Information Center
Henriksen, Danah; Mishra, Punya; Mehta, Rohit
2015-01-01
Creativity is increasingly viewed as an important 21st century skill that needs to be taught in schools. This emphasis on creativity is often reflected by having students engage in open-ended, project based activities and assignments. A key challenge faced by educators is how such assignments are to be evaluated. An in-depth review of existing…
ERIC Educational Resources Information Center
Finkenstaedt-Quinn, Solaire A.; Halim, Audrey S.; Chambers, Timothy G.; Moon, Alena; Goldman, R. S.; Gere, Anne Ruggles; Shultz, Ginger V.
2017-01-01
We conducted a study to examine how a writing-to-learn assignment influenced student learning of polymer behavior. In particular, we examined the role of specific content and a rhetorical framework as well as a structured writing process including peer review and revision. The student-generated writing was analyzed via a content-directed rubric.…
A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics
NASA Astrophysics Data System (ADS)
Haaf, Ezra; Heudorfer, Benedikt; Stahl, Kerstin; Barthel, Roland
2017-04-01
Fluctuation patterns in groundwater hydrographs are generally assumed to contain information on aquifer characteristics, climate and environmental controls. However, attempts to disentangle this information and map the dominant controls have been few. This is due to the substantial heterogeneity and complexity of groundwater systems, which is reflected in the abundance of morphologies of groundwater time series. To describe the structure and shape of hydrographs, descriptive terms like "slow"/"fast" or "flashy"/"inert" are frequently used, which are subjective, irreproducible and limited. This lack of objective and refined concepts limits approaches for regionalization of hydrogeological characteristics as well as our understanding of the dominant processes controlling groundwater dynamics. Therefore, we propose a novel framework for groundwater hydrograph characterization in an attempt to categorize morphologies explicitly and quantitatively based on perceptual concepts of aspects of the dynamics. This quantitative framework is inspired by the existing and operational eco-hydrological classification frameworks for streamflow. The need for a new framework for groundwater systems is justified by the fundamental differences between the state variable groundwater head and the flow variable streamflow. Conceptually, we extracted exemplars of specific dynamic patterns, attaching descriptive terms as a means of systematisation. Metrics, primarily taken from the streamflow literature, were subsequently adapted to groundwater and assigned to the described patterns as a means of quantification. In this study, we focused on the particularities of groundwater as a state variable. Furthermore, we investigated the descriptive skill of individual metrics as well as their usefulness for groundwater hydrographs. The ensemble of categorized metrics results in a framework which can be used to describe and quantify groundwater dynamics.
It is a promising tool for the setup of a successful similarity classification framework for groundwater hydrographs. However, the overabundance of metrics available calls for a systematic redundancy analysis of the metrics, which we describe in a second study (Heudorfer et al., 2017). Heudorfer, B., Haaf, E., Barthel, R., Stahl, K., 2017. A framework for quantification of groundwater dynamics - redundancy and transferability of hydro(geo-)logical metrics. EGU General Assembly 2017, Vienna, Austria.
Adaptive protection algorithm and system
Hedrick, Paul [Pittsburgh, PA; Toms, Helen L [Irwin, PA; Miller, Roger M [Mars, PA
2009-04-28
An adaptive protection algorithm and system for protecting electrical distribution systems traces the flow of power through a distribution system, assigns a value (or rank) to each circuit breaker in the system and then determines the appropriate trip set points based on the assigned rank.
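The ranking idea in this abstract can be sketched in a few lines. The sketch below is illustrative only, not taken from the patent: the tree representation of a radial feeder, the function names, and the 25% trip margin are all assumptions. It ranks each breaker by the total load it serves downstream and sets its trip point proportionally above that load.

```python
# Hypothetical sketch of flow-tracing breaker ranking (all names and the
# margin factor are illustrative assumptions, not the patented method).

def downstream_load(tree, loads, breaker):
    """Sum the load served at and below `breaker` in a radial feeder tree."""
    total = loads.get(breaker, 0.0)
    for child in tree.get(breaker, []):
        total += downstream_load(tree, loads, child)
    return total

def all_breakers(tree, root):
    """Enumerate every breaker reachable from the feeder root."""
    seen, stack = [], [root]
    while stack:
        b = stack.pop()
        seen.append(b)
        stack.extend(tree.get(b, []))
    return seen

def assign_trip_points(tree, loads, root, margin=1.25):
    """Rank breakers by downstream load; trip set point = rank * margin."""
    ranks = {b: downstream_load(tree, loads, b) for b in all_breakers(tree, root)}
    return {b: rank * margin for b, rank in ranks.items()}
```

Because the rank of an upstream breaker includes everything below it, trip points automatically coordinate: a fault on a branch trips the local breaker before the main breaker.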
Hohashi, Naohiro; Honda, Junko
2011-11-01
Although the number of employees on overseas assignments accompanied by their families has increased steadily, little is known about the effects of this experience on family functioning. Japanese families on family-accompanied assignments living in Hong Kong were compared with families living in Japan (consisting of 135 and 248 paired partners, respectively). Applying an ecological framework, family functioning was examined using the Feetham Family Functioning Survey-Japanese (FFFS-J). Japanese wives living in Hong Kong rated family functioning lower, particularly in the area of "relationship between family and family members." Between paired marital partners living in Hong Kong, the level of satisfaction in the area of "relationship between family and society" was significantly lower for wives than for husbands. This study provides an application of the family ecological framework to families in a multicultural environment and identifies potential areas for family assessment and intervention that may be of interest to health care professionals who care for families living away from their home countries.
A Conceptual Framework for Improving Critical Care Patient Flow and Bed Use.
Mathews, Kusum S; Long, Elisa F
2015-06-01
High demand for intensive care unit (ICU) services and limited bed availability have prompted hospitals to address capacity planning challenges. Simulation modeling can examine ICU bed assignment policies, accounting for patient acuity, to reduce ICU admission delays. To provide a framework for data-driven modeling of ICU patient flow, identify key measurable outcomes, and present illustrative analysis demonstrating the impact of various bed allocation scenarios on outcomes. A description of key inputs for constructing a queuing model was outlined, and an illustrative simulation model was developed to reflect current triage protocol within the medical ICU and step-down unit (SDU) at a single tertiary-care hospital. Patient acuity, arrival rate, and unit length of stay, consisting of a "service time" and "time to transfer," were estimated from 12 months of retrospective data (n = 2,710 adult patients) for 36 ICU and 15 SDU staffed beds. Patient priority was based on acuity and whether the patient originated in the emergency department. The model simulated the following hypothetical scenarios: (1) varied ICU/SDU sizes, (2) reserved ICU beds as a triage strategy, (3) lower targets for time to transfer out of the ICU, and (4) ICU expansion by up to four beds. Outcomes included ICU admission wait times and unit occupancy. With current bed allocation, simulated wait time averaged 1.13 (SD, 1.39) hours. Reallocating all SDU beds as ICU decreased overall wait times by 7.2% to 1.06 (SD, 1.39) hours and increased bed occupancy from 80 to 84%. Reserving the last available bed for acute patients reduced wait times for acute patients from 0.84 (SD, 1.12) to 0.31 (SD, 0.30) hours, but tripled subacute patients' wait times from 1.39 (SD, 1.81) to 4.27 (SD, 5.44) hours. Setting transfer times to wards for all ICU/SDU patients to 1 hour decreased wait times for incoming ICU patients, comparable to building one to two additional ICU beds. 
Hospital queuing and simulation modeling with empiric data inputs can evaluate how changes in ICU bed assignment could impact unit occupancy levels and patient wait times. Trade-offs associated with dedicating resources for acute patients versus expanding capacity for all patients can be examined.
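A minimal queuing sketch conveys how such a simulation estimates admission wait times. This is not the authors' model: it omits acuity classes, SDU beds, triage priority and transfer times, all of which the paper includes, and every parameter value below is invented for illustration. It simulates only a FIFO multi-bed queue with exponential arrivals and lengths of stay.

```python
# Toy FIFO bed-queue simulation (illustrative only; parameters invented).
import heapq
import random

def simulate_icu(num_beds, interarrivals, stays):
    """Each patient takes the earliest-free bed; wait = bed-free time - arrival."""
    free_at = [0.0] * num_beds        # min-heap of next-free times per bed
    heapq.heapify(free_at)
    waits, t = [], 0.0
    for gap, stay in zip(interarrivals, stays):
        t += gap                      # patient arrival time
        bed_free = heapq.heappop(free_at)
        start = max(t, bed_free)      # wait only if all beds are busy
        waits.append(start - t)
        heapq.heappush(free_at, start + stay)
    return sum(waits) / len(waits)    # mean wait (hours)

random.seed(1)
interarrivals = [random.expovariate(3.0) for _ in range(5000)]  # ~3 arrivals/hr
stays = [random.expovariate(0.1) for _ in range(5000)]          # ~10 hr mean stay
```

With an offered load of roughly 30 busy beds, the sketch reproduces the qualitative finding above: wait times fall sharply as capacity is added beyond the saturation point.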
Ryan A. McManamay; Donald J. Orth; Charles A. Dolloff; Emmanuel A. Frimpong
2012-01-01
River regulation has resulted in substantial losses in habitat connectivity, biodiversity and ecosystem services. River managers are faced with a growing need to protect the key aspects of the natural flow regime. A practical approach to providing environmental flow standards is to create a regional framework by classifying unregulated streams into groups of similar...
Quantum probability assignment limited by relativistic causality.
Han, Yeong Deok; Choi, Taeseung
2016-03-14
Quantum theory exhibits nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. In the standard quantum framework, correlation for a shared quantum state manifests itself in joint probability distributions obtained by applying state reduction and the probability assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if a different probability assignment rule is applied. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saenz, Juan A.; Chen, Qingshan; Ringler, Todd
Recent work has shown that taking the thickness-weighted average (TWA) of the Boussinesq equations in buoyancy coordinates results in exact equations governing the prognostic residual mean flow where eddy–mean flow interactions appear in the horizontal momentum equations as the divergence of the Eliassen–Palm flux tensor (EPFT). It has been proposed that, given the mathematical tractability of the TWA equations, the physical interpretation of the EPFT, and its relation to potential vorticity fluxes, the TWA is an appropriate framework for modeling ocean circulation with parameterized eddies. The authors test the feasibility of this proposition and investigate the connections between the TWA framework and the conventional framework used in models, where Eulerian mean flow prognostic variables are solved for. Using the TWA framework as a starting point, this study explores the well-known connections between vertical transfer of horizontal momentum by eddy form drag and eddy overturning by the bolus velocity, used by Greatbatch and Lamb and Gent and McWilliams to parameterize eddies. After implementing the TWA framework in an ocean general circulation model, we verify our analysis by comparing the flows in an idealized Southern Ocean configuration simulated using the TWA and conventional frameworks with the same mesoscale eddy parameterization.
Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trebotich, D
We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.
Modeling complex biological flows in multi-scale systems using the APDEC framework
NASA Astrophysics Data System (ADS)
Trebotich, David
2006-09-01
We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.
NASA Astrophysics Data System (ADS)
Yeh, Cheng-Ta; Lin, Yi-Kuei; Yang, Jo-Yun
2018-07-01
Network reliability is an important performance index for many real-life systems, such as electric power systems, computer systems and transportation systems. These systems can be modelled as stochastic-flow networks (SFNs) composed of arcs and nodes. Most system supervisors pursue network reliability maximization by finding the optimal multi-state resource assignment, in which one resource is assigned to each arc. However, a disaster may cause correlated failures among the assigned resources, affecting the network reliability. This article focuses on determining the optimal resource assignment with maximal network reliability for SFNs. To solve the problem, this study proposes a hybrid algorithm integrating the genetic algorithm and tabu search to determine the optimal assignment, called the hybrid GA-TS algorithm (HGTA), and integrates minimal paths, a recursive sum of disjoint products and the correlated binomial distribution to calculate network reliability. Several practical numerical experiments demonstrate that HGTA has better computational quality than several popular soft computing algorithms.
Sensor Based Framework for Secure Multimedia Communication in VANET
Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon
2010-01-01
Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and danger situations. In this paper we proposed a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462
NASA Technical Reports Server (NTRS)
Mielke, Roland V. (Inventor); Stoughton, John W. (Inventor)
1990-01-01
Computationally complex primitive operations of an algorithm are executed concurrently in a plurality of functional units under the control of an assignment manager. The algorithm is preferably defined as a computational marked graph containing data status edges (paths) corresponding to each of the data flow edges. The assignment manager assigns primitive operations to the functional units and monitors completion of the primitive operations to determine data availability using the computational marked graph of the algorithm. All data accessing of the primitive operations is performed by the functional units independently of the assignment manager.
A test of geographic assignment using isotope tracers in feathers of known origin
Wunder, Michael B.; Kester, C.L.; Knopf, F.L.; Rye, R.O.
2005-01-01
We used feathers of known origin collected from across the breeding range of a migratory shorebird to test the use of isotope tracers for assigning breeding origins. We analyzed δD, δ13C, and δ15N in feathers from 75 mountain plover (Charadrius montanus) chicks sampled in 2001 and from 119 chicks sampled in 2002. We estimated parameters for continuous-response inverse regression models and for discrete-response Bayesian probability models from data for each year independently. We evaluated model predictions with both the training data and by using the alternate year as an independent test dataset. Our results provide weak support for modeling latitude and isotope values as monotonic functions of one another, especially when data are pooled over known sources of variation such as sample year or location. We were unable to make even qualitative statements, such as north versus south, about the likely origin of birds using both δD and δ13C in inverse regression models; results were no better than random assignment. Probability models provided better results and a more natural framework for the problem. Correct assignment rates were highest when considering all three isotopes in the probability framework, but the use of even a single isotope was better than random assignment. The method appears relatively robust to temporal effects and is most sensitive to the isotope discrimination gradients over which samples are taken. We offer that the problem of using isotope tracers to infer geographic origin is best framed as one of assignment, rather than prediction.
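The discrete-response Bayesian assignment the authors favor can be sketched as follows. The site names, isotope labels, Gaussian likelihood form and all parameter values below are illustrative assumptions; the paper estimates site-specific parameters from the feather data itself.

```python
# Sketch of discrete Bayesian geographic assignment from isotope values.
# Assumes independent Gaussian likelihoods per isotope per candidate site
# (an illustrative simplification, not the authors' fitted model).
import math

def gauss_loglik(x, mu, sigma):
    """Log density of N(mu, sigma^2) at x."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def assign_origin(sample, site_params, prior=None):
    """Posterior P(site | isotope values); flat prior unless one is given."""
    logpost = {}
    for site, params in site_params.items():
        lp = math.log(prior[site]) if prior else 0.0
        for iso, x in sample.items():
            mu, sigma = params[iso]
            lp += gauss_loglik(x, mu, sigma)
        logpost[site] = lp
    m = max(logpost.values())                       # stabilize before exp
    w = {s: math.exp(lp - m) for s, lp in logpost.items()}
    z = sum(w.values())
    return {s: v / z for s, v in w.items()}
```

Adding further isotopes (δ13C, δ15N) just adds more log-likelihood terms per site, which mirrors the paper's finding that assignment rates improve as isotopes are combined.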
Evolution of 3-D geologic framework modeling and its application to groundwater flow studies
Blome, Charles D.; Smith, David V.
2012-01-01
In this Fact Sheet, the authors discuss the evolution of project 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and methodologies used to link geologic and groundwater flow models.
On the Conditioning of Machine-Learning-Assisted Turbulence Modeling
NASA Astrophysics Data System (ADS)
Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng
2017-11-01
Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By improving the prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
Cell transmission model of dynamic assignment for urban rail transit networks.
Xu, Guangming; Zhao, Shuo; Shi, Feng; Zhang, Feilian
2017-01-01
For urban rail transit network, the space-time flow distribution can play an important role in evaluating and optimizing the space-time resource allocation. For obtaining the space-time flow distribution without the restriction of schedules, a dynamic assignment problem is proposed based on the concept of continuous transmission. To solve the dynamic assignment problem, the cell transmission model is built for urban rail transit networks. The priority principle, queuing process, capacity constraints and congestion effects are considered in the cell transmission mechanism. Then an efficient method is designed to solve the shortest path for an urban rail network, which decreases the computing cost for solving the cell transmission model. The instantaneous dynamic user optimal state can be reached with the method of successive average. Many evaluation indexes of passenger flow can be generated, to provide effective support for the optimization of train schedules and the capacity evaluation for urban rail transit network. Finally, the model and its potential application are demonstrated via two numerical experiments using a small-scale network and the Beijing Metro network.
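The transmission mechanism can be illustrated with the classic cell-transmission update (Daganzo's minimum of sending and receiving flows), which this abstract adapts to rail networks. The sketch below is a generic single-line CTM step, not the authors' rail-specific model with priority rules, schedules and congestion effects.

```python
# Generic cell-transmission step (illustrative; not the paper's rail model).
# n[i]: occupancy of cell i; Q[i]: capacity of the boundary into cell i;
# N[i]: holding capacity of cell i; wv: backward-wave/free-flow speed ratio.

def ctm_step(n, Q, N, inflow=0.0, wv=1.0):
    """Flow into each cell = min(upstream demand, boundary capacity, space left)."""
    k = len(n)
    y = [0.0] * (k + 1)
    y[0] = min(inflow, Q[0], wv * (N[0] - n[0]))     # flow entering the line
    for i in range(1, k):
        y[i] = min(n[i - 1], Q[i], wv * (N[i] - n[i]))
    y[k] = n[k - 1]                                  # free exit at the end
    return [n[i] + y[i] - y[i + 1] for i in range(k)], y
```

Each update conserves flow: the change in total occupancy equals inflow minus outflow, which is what lets a network of such cells produce a consistent space-time flow distribution.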
Ant colony optimization for solving university facility layout problem
NASA Astrophysics Data System (ADS)
Mohd Jani, Nurul Hafiza; Mohd Radzi, Nor Haizan; Ngadiman, Mohd Salihin
2013-04-01
Quadratic Assignment Problems (QAP) are classified as NP-hard. The QAP has been used to model many problems in areas such as operational research, combinatorial data analysis, and parallel and distributed computing, as well as optimization problems such as graph partitioning and the Travelling Salesman Problem (TSP). In the literature, researchers use exact algorithms, heuristics and metaheuristic approaches to solve the QAP. The QAP is widely applied to the facility layout problem (FLP). In this paper we used the QAP to model a university facility layout problem, in which 8 facilities must be assigned to 8 locations. Hence we modeled a QAP instance with n ≤ 10 and developed an Ant Colony Optimization (ACO) algorithm to solve the university facility layout problem. The objective is to assign n facilities to n locations such that the total product of flows and distances is minimized, where flow is the movement from one facility to another and distance is the distance between the locations of two facilities. In this application, the objective of the QAP is to minimize the total walking (flow) of lecturers from one destination to another (distance).
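The QAP objective described above is simple to state in code. The sketch below evaluates the flow-weighted distance of an assignment and, for a tiny instance, finds the optimum by exhaustive search; the instance data are invented for illustration, and the paper itself uses ACO precisely because exhaustive search is infeasible beyond very small n.

```python
# QAP objective and brute-force baseline (toy data; the paper uses ACO).
from itertools import permutations

def qap_cost(flow, dist, assign):
    """Total cost: sum over facility pairs (a, b) of flow[a][b] * distance
    between the locations assign[a] and assign[b]."""
    n = len(assign)
    return sum(flow[a][b] * dist[assign[a]][assign[b]]
               for a in range(n) for b in range(n))

def qap_brute_force(flow, dist):
    """Exhaustively search all n! assignments (only viable for tiny n)."""
    n = len(flow)
    best = min(permutations(range(n)), key=lambda p: qap_cost(flow, dist, p))
    return best, qap_cost(flow, dist, best)
```

For n = 8 this already means 8! = 40,320 permutations, and the factorial growth is why metaheuristics such as ACO are used at realistic sizes.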
Poff, N.L.; Richter, B.D.; Arthington, A.H.; Bunn, S.E.; Naiman, R.J.; Kendy, E.; Acreman, M.; Apse, C.; Bledsoe, B.P.; Freeman, Mary C.; Henriksen, J.; Jacobson, R.B.; Kennen, J.G.; Merritt, D.M.; O'Keeffe, J. H.; Olden, J.D.; Rogers, K.; Tharme, R.E.; Warner, A.
2010-01-01
1. The flow regime is a primary determinant of the structure and function of aquatic and riparian ecosystems for streams and rivers. Hydrologic alteration has impaired riverine ecosystems on a global scale, and the pace and intensity of human development greatly exceeds the ability of scientists to assess the effects on a river-by-river basis. Current scientific understanding of hydrologic controls on riverine ecosystems and experience gained from individual river studies support development of environmental flow standards at the regional scale. 2. This paper presents a consensus view from a group of international scientists on a new framework for assessing environmental flow needs for many streams and rivers simultaneously to foster development and implementation of environmental flow standards at the regional scale. This framework, the ecological limits of hydrologic alteration (ELOHA), is a synthesis of a number of existing hydrologic techniques and environmental flow methods that are currently being used to various degrees and that can support comprehensive regional flow management. The flexible approach allows scientists, water-resource managers and stakeholders to analyse and synthesise available scientific information into ecologically based and socially acceptable goals and standards for management of environmental flows. 3. The ELOHA framework includes the synthesis of existing hydrologic and ecological databases from many rivers within a user-defined region to develop scientifically defensible and empirically testable relationships between flow alteration and ecological responses. These relationships serve as the basis for the societally driven process of developing regional flow standards. This is to be achieved by first using hydrologic modelling to build a 'hydrologic foundation' of baseline and current hydrographs for stream and river segments throughout the region.
Second, using a set of ecologically relevant flow variables, river segments within the region are classified into a few distinctive flow regime types that are expected to have different ecological characteristics. These river types can be further subclassified according to important geomorphic features that define hydraulic habitat features. Third, the deviation of current-condition flows from baseline-condition flow is determined. Fourth, flow alteration-ecological response relationships are developed for each river type, based on a combination of existing hydroecological literature, expert knowledge and field studies across gradients of hydrologic alteration. 4. Scientific uncertainty will exist in the flow alteration-ecological response relationships, in part because of the confounding of hydrologic alteration with other important environmental determinants of river ecosystem condition (e.g. temperature). Application of the ELOHA framework should therefore occur in a consensus context where stakeholders and decision-makers explicitly evaluate acceptable risk as a balance between the perceived value of the ecological goals, the economic costs involved and the scientific uncertainties in functional relationships between ecological responses and flow alteration. 5. The ELOHA framework also should proceed in an adaptive management context, where collection of monitoring data or targeted field sampling data allows for testing of the proposed flow alteration-ecological response relationships. This empirical validation process allows for a fine-tuning of environmental flow management targets. The ELOHA framework can be used both to guide basic research in hydroecology and to further implementation of more comprehensive environmental flow management of freshwater sustainability on a global scale.
Flow status of three transboundary rivers in Northern Greece as a tool for hydro-diplomacy
NASA Astrophysics Data System (ADS)
Hatzigiannakis, Eyaggelos; Hatzispiroglou, Ioannis; Arampatzis, Georgios; Ilia, Andreas; Pantelakis, Dimitrios; Filintas, Agathos; Panagopoulos, Andreas
2015-04-01
The aim of this paper is to examine how river flow monitoring constitutes a tool for hydro-diplomacy. Management of transboundary catchments and demand for shared water resources are often causes of conflict and tension threatening the peaceful coexistence of nations. The Water Framework Directive 2000/60/EU sets a base for water management, contributing common approaches, goals and principles as well as providing new definitions and measures for Europe's water resources. In northern Greece the main renewable resources are "imported" (over 25% of its water reserves), and for this reason the implementation of continuous flow measurements throughout the year is necessary, even though difficult to achieve. This paper focuses on the three largest transboundary rivers in Northern Greece. The Axios and Strymonas rivers flow across the region of Central Macedonia in Northern Greece: the Axios flows from FYROM to Greece, and the Strymonas from Bulgaria to Greece. The Nestos river flows from Bulgaria to Greece; its Greek part is in the region of Eastern Macedonia and Thrace. Significant productive agricultural areas around these rivers are irrigated from them, so they are very important for the local society. Measurements of the river flow velocity and the flow depth have been made at bridges. The frequency of the measurements is roughly monthly, because significant changes in flow depth and discharge are expected. A series of continuous flow measurements was performed during 2013 and 2014 using flowmeters (Valeport and OTT type). The cross-section characteristics, the flow velocity of river segments, and the mean water flow velocity and discharge over the total profile were measured and calculated, respectively.
Measurements are conducted in the framework of the national water resources monitoring network, which is realised in compliance with the Water Framework Directive under the supervision and coordination of the Hellenic Ministry for the Environment and Climate Change. This project is elaborated in the framework of the operational program "Environment and Sustainable Development", which is co-funded by the National Strategic Reference Framework (NSRF) and the Public Investment Program (PIP).
On Maximizing the Lifetime of Wireless Sensor Networks by Optimally Assigning Energy Supplies
Asorey-Cacheda, Rafael; García-Sánchez, Antonio Javier; García-Sánchez, Felipe; García-Haro, Joan; Gonzalez-Castaño, Francisco Javier
2013-01-01
The extension of the network lifetime of Wireless Sensor Networks (WSN) is an important issue that has not been appropriately solved yet. This paper addresses this concern and proposes some techniques to plan an arbitrary WSN. To this end, we suggest a hierarchical network architecture, similar to realistic scenarios, where nodes with renewable energy sources (denoted as primary nodes) carry out most message delivery tasks, and nodes equipped with conventional chemical batteries (denoted as secondary nodes) are those with less communication demands. The key design issue of this network architecture is the development of a new optimization framework to calculate the optimal assignment of renewable energy supplies (primary node assignment) to maximize network lifetime, obtaining the minimum number of energy supplies and their node assignment. We also conduct a second optimization step to additionally minimize the number of packet hops between the source and the sink. In this work, we present an algorithm that approaches the results of the optimization framework, but with much faster execution speed, which is a good alternative for large-scale WSN networks. Finally, the network model, the optimization process and the designed algorithm are further evaluated and validated by means of computer simulation under realistic conditions. The results obtained are discussed comparatively. PMID:23939582
On maximizing the lifetime of Wireless Sensor Networks by optimally assigning energy supplies.
Asorey-Cacheda, Rafael; García-Sánchez, Antonio Javier; García-Sánchez, Felipe; García-Haro, Joan; González-Castano, Francisco Javier
2013-08-09
The extension of the network lifetime of Wireless Sensor Networks (WSN) is an important issue that has not been appropriately solved yet. This paper addresses this concern and proposes some techniques to plan an arbitrary WSN. To this end, we suggest a hierarchical network architecture, similar to realistic scenarios, where nodes with renewable energy sources (denoted as primary nodes) carry out most message delivery tasks, and nodes equipped with conventional chemical batteries (denoted as secondary nodes) are those with less communication demands. The key design issue of this network architecture is the development of a new optimization framework to calculate the optimal assignment of renewable energy supplies (primary node assignment) to maximize network lifetime, obtaining the minimum number of energy supplies and their node assignment. We also conduct a second optimization step to additionally minimize the number of packet hops between the source and the sink. In this work, we present an algorithm that approaches the results of the optimization framework, but with much faster execution speed, which is a good alternative for large-scale WSN networks. Finally, the network model, the optimization process and the designed algorithm are further evaluated and validated by means of computer simulation under realistic conditions. The results obtained are discussed comparatively.
ERIC Educational Resources Information Center
Lupart, Judy L.; Mulcahy, Robert F.
Memory performance differences of mental age matched (9-12 years) educable mentally retarded (EMR) (n=56) and normal (n=56) children were examined in two experiments using the F. Craik and R. Lockhart levels of processing framework. In experiment 1, Ss were randomly assigned to an incidental, intentional, or planned intentional learning condition,…
ERIC Educational Resources Information Center
Caton, Hope; Greenhill, Darrel
2014-01-01
This paper describes how a gamified rewards and penalties framework was used to increase attendance and engagement in a level six undergraduate computing module teaching game production. The framework was applied to the same module over two consecutive years: a control year and a trial year. In both years the tutor, assignments and assessment…
Towards a comprehensive assessment and framework for low and high flow water risks
NASA Astrophysics Data System (ADS)
Motschmann, Alina; Huggel, Christian; Drenkhan, Fabian; León, Christian
2017-04-01
Driven by international organizations such as the Intergovernmental Panel on Climate Change (IPCC) the past years have seen a move from a vulnerability concept of climate change impacts towards a risk framework. Risk is now conceived at the intersection of climate-driven hazard and socioeconomic-driven vulnerability and exposure. The concept of risk so far has been mainly adopted for sudden-onset events. However, for slow-onset and cumulative climate change impacts such as changing water resources there is missing clarity and experience how to apply a risk framework. Research has hardly dealt with the challenge of how to integrate both low and high flow risks in a common framework. Comprehensive analyses of risks related to water resources considering climate change within multi-dimensional drivers across different scales are complex and often missing in climate-sensitive mountain regions where data scarcity and inconsistencies represent important limitations. Here we review existing vulnerability and risk assessments of low and high flow water conditions and identify critical conceptual and practical gaps. Based on this, we develop an integrated framework for low and high flow water risks which is applicable to both past and future conditions. The framework explicitly considers a water balance model simulating both water supply and demand on a daily basis. We test and apply this new framework in the highly glacierized Santa River catchment (SRC, Cordillera Blanca, Peru), representative for many developing mountain regions with both low and high flow water risks and poor data availability. In fact, in the SRC, both low and high flow hazards, such as droughts and floods, play a central role especially for agricultural, hydropower, domestic and mining use. During the dry season (austral winter) people are increasingly affected by water scarcity due to shrinking glaciers supplying melt water. 
On the other hand, during the wet season (austral summer), high flow water risks are associated with hazards such as floods and debris flows and with high socioeconomic vulnerability and exposure of, e.g., infrastructure. Nonetheless, comprehensive water resource risk studies have barely been developed in the SRC and other developing high-mountain regions. To consider all components of risks as well as the economic and social conditions for different processes, a comprehensive risk assessment is needed. The urgency of this matter is emphasized by recent social conflicts in the SRC and the tropical Andes in general, related to prevailing drought conditions in combination with weak state institutions and unequal decision-making as well as differentiated perspectives on low flow versus high flow risks.
Unsupervised Framework to Monitor Lake Dynamics
NASA Technical Reports Server (NTRS)
Chen, Xi C. (Inventor); Boriah, Shyam (Inventor); Khandelwal, Ankush (Inventor); Kumar, Vipin (Inventor)
2016-01-01
A method of reducing processing time when assigning geographic areas to land cover labels using satellite sensor values includes a processor receiving a feature value for each pixel in a time series of frames of satellite sensor values, each frame containing multiple pixels and each frame covering a same geographic location. For each sub-area of the geographic location, the sub-area is assigned to one of at least three land cover labels. The processor determines a fraction function for a first sub-area assigned to a first land cover label. The sub-areas that were assigned to the first land cover label are reassigned to one of the second land cover label and the third land cover label based on the fraction functions of the sub-areas.
A Detailed Study and Synthesis of Flow Observables in the IP-Glasma+MUSIC+UrQMD Framework
NASA Astrophysics Data System (ADS)
McDonald, Scott; Shen, Chun; Fillion-Gourdeau, François; Jeon, Sangyong; Gale, Charles
2017-11-01
In this work we use the IP-Glasma+MUSIC+UrQMD framework to systematically study a wide range of hadronic flow observables at 2.76 TeV, in addition to the single particle spectra and anisotropic flow coefficients vn previously studied in S. McDonald, C. Shen, F. Fillion-Gourdeau, S. Jeon and C. Gale, arXiv:1609.02958 [hep-ph].
Paper recycling framework, the "Wheel of Fiber".
Ervasti, Ilpo; Miranda, Ruben; Kauranen, Ilkka
2016-06-01
At present, there is no reliable method in use that unequivocally describes paper industry material flows and makes it possible to compare geographical regions with each other. A functioning paper industry Material Flow Account (MFA) that uses uniform terminology and standard definitions for terms and structures is necessary. Many of the presently used general-level MFAs, which are called frameworks in this article, stress the importance of input and output flows but do not provide a uniform picture of material recycling. The paper industry is an example of a field in which recycling plays a key role. Additionally, terms related to paper industry recycling, such as collection rate, recycling rate, and utilization rate, are not defined uniformly across regions and time. Thus, reliably comparing material recycling activity between geographical regions or calculating any regional summaries is difficult or even impossible. The objective of this study is to give a partial solution to this problem by introducing a new material flow framework for the paper industry in which the flow and stage structure supports the use of uniform definitions for terms related to paper recycling. This new framework is termed the Detailed Wheel of Fiber. Copyright © 2016 Elsevier Ltd. All rights reserved.
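To make the terminology issue concrete, the snippet below computes the three ratios under one common (but not universal) set of definitions. Both the definitions and the tonnage figures are illustrative assumptions, not the article's Detailed Wheel of Fiber.

```python
# Hypothetical regional figures (tonnes per year); invented for illustration.
collected = 8.0e6        # paper collected for recycling
consumption = 10.0e6     # paper and board consumed in the region
recovered_used = 7.5e6   # recovered paper used by the region's mills
production = 9.0e6       # paper and board produced in the region

# One common convention; other regions define these ratios differently,
# which is exactly the comparability problem the article addresses.
collection_rate = collected / consumption        # collected vs. consumed
recycling_rate = recovered_used / consumption    # reused vs. consumed
utilization_rate = recovered_used / production   # reused vs. produced

print(f"collection {collection_rate:.2f}, recycling {recycling_rate:.2f}, "
      f"utilization {utilization_rate:.2f}")
```

Because the denominators differ (consumption vs. production), the same physical flows yield different "rates", which is why a uniform flow-and-stage structure matters.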
Robust boundary treatment for open-channel flows in divergence-free incompressible SPH
NASA Astrophysics Data System (ADS)
Pahar, Gourabananda; Dhar, Anirban
2017-03-01
A robust Incompressible Smoothed Particle Hydrodynamics (ISPH) framework is developed to simulate specified inflow and outflow boundary conditions for open-channel flow. Being purely divergence-free, the framework offers a smooth and structured pressure distribution. An implicit treatment of the Pressure Poisson Equation with a Dirichlet boundary condition on the free surface minimizes the error in velocity divergence. Beyond the inflow and outflow thresholds, multiple layers of dummy particles are created according to the specified boundary condition. The inflow boundary acts as a soluble wave-maker. Fluid particles beyond the outflow threshold are removed and replaced with dummy particles with the specified boundary velocity. The framework is validated against several cases of open-channel flow with different boundary conditions. The model can efficiently capture flow evolution and vortex generation for arbitrary geometry and variable boundary conditions.
49 CFR 173.121 - Class 3-Assignment of packing group.
Code of Federal Regulations, 2012 CFR
2012-10-01
...) The viscosity and flash point are in accordance with the following table: Flow time t in seconds Jet... shall be performed are as follows: (i) Viscosity test. The flow time in seconds is determined at 23 °C...
Multi-Source Autonomous Response for Targeting and Monitoring of Volcanic Activity
NASA Technical Reports Server (NTRS)
Davies, Ashley G.; Doubleday, Joshua R.; Tran, Daniel Q.
2014-01-01
The study of volcanoes is important for both purely scientific and human survival reasons. From a scientific standpoint, volcanic gas and ash emissions contribute significantly to the terrestrial atmosphere. Ash depositions and lava flows can also greatly affect local environments. From a human survival standpoint, many people live within the reach of active volcanoes, and therefore can be endangered by both atmospheric (ash, debris) toxicity and lava flow. There are many potential information sources that can be used to determine how to best monitor volcanic activity worldwide. These are of varying temporal frequency, spatial regard, method of access, and reliability. The problem is how to incorporate all of these inputs in a general framework to assign/task/reconfigure assets to monitor events in a timely fashion. In situ sensing can provide a valuable range of complementary information such as seismographic, discharge, acoustic, and other data. However, many volcanoes are not instrumented with in situ sensors, and those that have sensor networks are restricted to a relatively small number of point sensors. Consequently, an ideal volcanic study synergistically combines space and in situ measurements. This work demonstrates an effort to integrate spaceborne sensing from MODIS (Terra and Aqua), ALI (EO-1), Worldview-2, and in situ sensing in an automated scheme to improve global volcano monitoring. Specifically, it is a "sensor web" concept in which a number of volcano monitoring systems are linked together to monitor volcanic activity more accurately, and this activity measurement automatically tasks space assets to acquire further satellite imagery of ongoing volcanic activity.
A general framework was developed for evidence combination that accounts for multiple information sources in a scientist-directed fashion to weigh inputs and allocate observations based on the confidence of an event's occurrence, the rarity of the event at that location, and other scientists' inputs. The software framework uses multiple source languages and is a general framework for combining inputs and incrementally submitting observation requests/reconfigurations, accounting for prior requests. The autonomous aspect of operations is unique, especially in the context of the wide range of inputs that includes manually entered electronic reports (such as the Air Force Weather Advisories), automated satellite-based detection methods (such as MODVOLC and GOESVOLC), and in situ sensor networks.
49 CFR 173.124 - Class 4, Divisions 4.1, 4.2 and 4.3-Definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... material must be determined using the testing protocol from Figure 14.2 (Flow Chart for Assigning Self... HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION HAZARDOUS MATERIALS REGULATIONS... Assignments and Exceptions for Hazardous Materials Other Than Class 1 and Class 7 § 173.124 Class 4, Divisions...
Drag Reduction of an Airfoil Using Deep Learning
NASA Astrophysics Data System (ADS)
Jiang, Chiyu; Sun, Anzhu; Marcus, Philip
2017-11-01
We reduced the drag of a 2D airfoil using deep learning methods, starting from a NACA-0012 airfoil. We created a database consisting of simulations of 2D external flow over randomly generated shapes. We then developed a machine learning framework for external flow field inference given input shapes. Past work that utilized machine learning in Computational Fluid Dynamics focused on estimation of specific flow parameters; this work is novel in inferring entire flow fields. We further showed that learned flow patterns are transferable to cases that share certain similarities. This study illustrates the prospects of deeper integration of data-based modeling into current CFD simulation frameworks for faster flow inference and more accurate flow modeling.
Distributed Computing Framework for Synthetic Radar Application
NASA Technical Reports Server (NTRS)
Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael
2006-01-01
We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data flow graph to be devised for a specific problem to be solved. The Pyre framework, a Python-based software framework developed at the California Institute of Technology (Caltech), is now being used as the basis for next-generation radar processing at JPL. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
USDA-ARS?s Scientific Manuscript database
An improved modeling framework for capturing the effects of dynamic resistance to overland flow is developed for intensively managed landscapes. The framework builds on the WEPP model but it removes the limitations of the “equivalent” plane and static roughness assumption. The enhanced model therefo...
Chimaera simulation of complex states of flowing matter
2016-01-01
We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro–meso–micro levels through suitable ‘mutations’ of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698031
An Excel Workbook for Identifying Redox Processes in Ground Water
Jurgens, Bryant C.; McMahon, Peter B.; Chapelle, Francis H.; Eberts, Sandra M.
2009-01-01
The reduction/oxidation (redox) condition of ground water affects the concentration, transport, and fate of many anthropogenic and natural contaminants. The redox state of a ground-water sample is defined by the dominant type of reduction/oxidation reaction, or redox process, occurring in the sample, as inferred from water-quality data. However, because of the difficulty in defining and applying a systematic redox framework to samples from diverse hydrogeologic settings, many regional water-quality investigations do not attempt to determine the predominant redox process in ground water. Recently, McMahon and Chapelle (2008) devised a redox framework that was applied to a large number of samples from 15 principal aquifer systems in the United States to examine the effect of redox processes on water quality. This framework was expanded by Chapelle and others (in press) to use measured sulfide data to differentiate between iron(III)- and sulfate-reducing conditions. These investigations showed that a systematic approach to characterize redox conditions in ground water could be applied to datasets from diverse hydrogeologic settings using water-quality data routinely collected in regional water-quality investigations. This report describes the Microsoft Excel workbook, RedoxAssignment_McMahon&Chapelle.xls, that assigns the predominant redox process to samples using the framework created by McMahon and Chapelle (2008) and expanded by Chapelle and others (in press). Assignment of redox conditions is based on concentrations of dissolved oxygen (O2), nitrate (NO3-), manganese (Mn2+), iron (Fe2+), sulfate (SO42-), and sulfide (sum of dihydrogen sulfide [aqueous H2S], hydrogen sulfide [HS-], and sulfide [S2-]). The logical arguments for assigning the predominant redox process to each sample are performed by a program written in Microsoft Visual Basic for Applications (VBA). The program is called from buttons on the main worksheet. 
The number of samples that can be analyzed is only limited by the number of rows in Excel (65,536 for Excel 2003 and XP; and 1,048,576 for Excel 2007), and is therefore appropriate for large datasets.
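The workbook's threshold logic can be sketched in a few lines. The cutoff values and the function name below are illustrative placeholders, not the authoritative criteria published by McMahon and Chapelle (2008).

```python
# Sketch of a threshold-based redox assignment in the spirit of the workbook.
# Thresholds here are assumed for demonstration only; consult the published
# framework for the actual criteria.

def assign_redox(o2, no3, mn, fe, so4, sulfide):
    """Infer the dominant redox process for one ground-water sample.

    Assumed units: o2, no3, so4, sulfide in mg/L; mn, fe in ug/L.
    """
    if o2 >= 0.5:
        return "oxic (O2 reduction)"
    if no3 >= 0.5:
        return "nitrate-reducing"
    if mn >= 50:
        return "manganese-reducing"
    if fe >= 100:
        # Measured sulfide is what differentiates iron(III)- from
        # sulfate-reducing conditions in the expanded framework.
        if so4 >= 0.5 and sulfide > 0.0:
            return "mixed iron/sulfate-reducing"
        return "iron-reducing"
    if so4 >= 0.5:
        return "sulfate-reducing"
    return "methanogenic"

print(assign_redox(o2=6.1, no3=1.2, mn=4, fe=10, so4=20, sulfide=0.0))
```

Implementing the decision tree as one pure function per sample is what makes the VBA version easy to apply row by row across a large worksheet.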
Shera, E. Brooks
1988-01-01
A detection system is provided for identifying individual particles or molecules having characteristic emission in a flow train of the particles in a flow cell. A position sensitive sensor is located adjacent the flow cell in a position effective to detect the emissions from the particles within the flow cell and to assign spatial and temporal coordinates for the detected emissions. A computer is then enabled to predict spatial and temporal coordinates for the particle in the flow train as a function of a first detected emission. Comparison hardware or software then compares subsequent detected spatial and temporal coordinates with the predicted spatial and temporal coordinates to determine whether subsequently detected emissions originate from a particle in the train of particles. In one embodiment, the particles include fluorescent dyes which are excited to fluoresce a spectrum characteristic of the particular particle. Photons are emitted adjacent at least one microchannel plate sensor to enable spatial and temporal coordinates to be assigned. The effect of comparing detected coordinates with predicted coordinates is to define a moving sample volume which effectively precludes the effects of background emissions.
Using parentage analysis to examine gene flow and spatial genetic structure.
Kane, Nolan C; King, Matthew G
2009-04-01
Numerous approaches have been developed to examine recent and historical gene flow between populations, but few studies have used empirical data sets to compare different approaches. Some methods are expected to perform better under particular scenarios, such as high or low gene flow, but this, too, has rarely been tested. In this issue of Molecular Ecology, Saenz-Agudelo et al. (2009) apply assignment tests and parentage analysis to microsatellite data from five geographically proximal (2-6 km) and one much more distant (1500 km) panda clownfish populations, showing that parentage analysis performed better in situations of high gene flow, while their assignment tests did better with low gene flow. This unusually complete data set is comprised of multiple exhaustively sampled populations, including nearly all adults and large numbers of juveniles, enabling the authors to ask questions that in many systems would be impossible to answer. Their results emphasize the importance of selecting the right analysis to use, based on the underlying model and how well its assumptions are met by the populations to be analysed.
Shera, E.B.
1987-10-07
A detection system is provided for identifying individual particles or molecules having characteristic emission in a flow train of the particles in a flow cell. A position sensitive sensor is located adjacent the flow cell in a position effective to detect the emissions from the particles within the flow cell and to assign spatial and temporal coordinates for the detected emissions. A computer is then enabled to predict spatial and temporal coordinates for the particle in the flow train as a function of a first detected emission. Comparison hardware or software then compares subsequent detected spatial and temporal coordinates with the predicted spatial and temporal coordinates to determine whether subsequently detected emissions originate from a particle in the train of particles. In one embodiment, the particles include fluorescent dyes which are excited to fluoresce a spectrum characteristic of the particular particle. Photons are emitted adjacent at least one microchannel plate sensor to enable spatial and temporal coordinates to be assigned. The effect of comparing detected coordinates with predicted coordinates is to define a moving sample volume which effectively precludes the effects of background emissions. 3 figs.
Procrastination, Flow, and Academic Performance in Real Time Using the Experience Sampling Method.
Sumaya, Isabel C; Darling, Emily
2018-01-01
The authors' aim was to first provide an alternative methodology in the assessment of procrastination and flow that would not rely on retrospective or prospective self-reports. Using real-time assessment of both procrastination and flow, the authors investigated how these factors impact academic performance by using the Experience Sampling Method. They assessed flow by measuring student self-reported skill versus challenge, and procrastination by measuring the days to completion of an assignment. Procrastination and flow were measured for six days before a writing assignment due date while students (n = 14) were enrolled in a research methods course. Regardless of status of flow, both the nonflow and flow groups showed high levels of procrastination. Students who experienced flow as they worked on their paper, in real time, earned significantly higher grades (M = 3.05 ± 0.30: an average grade of B) as compared with the nonflow group (M = 1.16 ± 0.33: an average grade of D; p = .007). Additionally, students experiencing flow were more accurate in predicting their grade (difference scores, flow M = 0.12 ± 0.33 vs. nonflow M = 1.39 ± 0.29; p = .015). Students in the nonflow group were nearly a grade and a half off in their prediction of their grade on the paper. To the authors' knowledge, the study is the first to provide experimental evidence showing differences in academic performance between students experiencing flow and nonflow students.
Bertlich, Mattis; Ihler, Fritz; Sharaf, Kariem; Weiss, Bernhard G; Strupp, Michael; Canis, Martin
2014-10-01
Betahistine is a histamine-like drug that is used in the treatment of Ménière's disease. It is commonly believed that betahistine increases cochlear blood flow and thus decreases the endolymphatic hydrops that is the cause of Ménière's. Despite common clinical use, there is little understanding of the kinetics or effects of its metabolites. This study investigated the effect of the betahistine metabolites aminoethylpyridine, hydroxyethylpyridine, and pyridylacetic acid on cochlear microcirculation. Guinea pigs were randomly assigned to one of the groups: placebo, betahistine, or equimolar amounts of aminoethylpyridine, hydroxyethylpyridine, or pyridylacetic acid. Cochlear blood flow and mean arterial pressure were recorded for three minutes before and 15 minutes after treatment. Thirty Dunkin-Hartley guinea pigs were assigned to one of five groups, with six guinea pigs per group. Betahistine, aminoethylpyridine, and hydroxyethylpyridine caused a significant increase in cochlear blood flow in comparison to placebo. The effect seen under aminoethylpyridine was greatest. The group treated with pyridylacetic acid showed no significant effect on cochlear blood flow. Aminoethylpyridine and hydroxyethylpyridine are, like betahistine, able to increase cochlear blood flow significantly. The effect of aminoethylpyridine was greatest. Pyridylacetic acid had no effect on cochlear microcirculation.
Fragment assignment in the cloud with eXpress-D
2013-01-01
Background Probabilistic assignment of ambiguously mapped fragments produced by high-throughput sequencing experiments has been demonstrated to greatly improve accuracy in the analysis of RNA-Seq and ChIP-Seq, and is an essential step in many other sequence census experiments. A maximum likelihood method using the expectation-maximization (EM) algorithm for optimization is commonly used to solve this problem. However, batch EM-based approaches do not scale well with the size of sequencing datasets, which have been increasing dramatically over the past few years. Thus, current approaches to fragment assignment rely on heuristics or approximations for tractability. Results We present an implementation of a distributed EM solution to the fragment assignment problem using Spark, a data analytics framework that can scale by leveraging compute clusters within datacenters ("the cloud"). We demonstrate that our implementation easily scales to billions of sequenced fragments, while providing the exact maximum likelihood assignment of ambiguous fragments. The accuracy of the method is shown to be an improvement over the most widely used tools available and can be run in a constant amount of time when cluster resources are scaled linearly with the amount of input data. Conclusions The cloud offers one solution for the difficulties faced in the analysis of massive high-throughput sequencing data, which continue to grow rapidly. Researchers in bioinformatics must follow developments in distributed systems, such as new frameworks like Spark, for ways to port existing methods to the cloud and help them scale to the datasets of the future. Our software, eXpress-D, is freely available at: http://github.com/adarob/express-d. PMID:24314033
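The batch EM computation that eXpress-D distributes can be illustrated on a single machine. This is a minimal sketch under simplified assumptions (every candidate mapping is equally compatible; names like `em_assign` are invented); real implementations also weight mappings by alignment and fragment-length probabilities.

```python
# Minimal single-machine EM for fragment assignment. The contribution of the
# paper is distributing exactly this kind of computation with Spark; this
# sketch only shows the E/M structure being distributed.
def em_assign(fragments, n_targets, n_iters=50):
    """fragments: list of candidate-target index lists, one per fragment.
    Returns estimated target abundances (sum to 1)."""
    theta = [1.0 / n_targets] * n_targets
    for _ in range(n_iters):
        counts = [0.0] * n_targets
        for cands in fragments:
            # E-step: split each ambiguous fragment across its candidate
            # targets in proportion to current abundance estimates.
            z = sum(theta[t] for t in cands)
            for t in cands:
                counts[t] += theta[t] / z
        # M-step: abundances are the normalized expected counts.
        total = sum(counts)
        theta = [c / total for c in counts]
    return theta

# Two targets: two unambiguous fragments for target 0, one for target 1,
# and one fragment mapping ambiguously to both.
theta = em_assign([[0], [0], [1], [0, 1]], n_targets=2)
```

For this toy input the fixed point is theta[0] = 2/3: the ambiguous fragment is ultimately credited to the targets in proportion to their converged abundances.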
A Conceptual Framework for Adaptive Project Management in the Department of Defense
2016-04-30
schedule work) established a core set of principles that went unchallenged until the start of the 21st century. This belief that managing...detailed planning, task decomposition and assignment of hours at the start of a project as unnecessary, often wasted effort that sacrifices accuracy...with the illusion of precision. Work, at the task level, is best assigned by the team performing the work as close as possible to the actual start
Take-Home Experiments in Undergraduate Fluid Mechanics Education
NASA Astrophysics Data System (ADS)
Cimbala, John
2007-11-01
Hands-on take-home experiments, assigned as homework, are useful as supplements to traditional in-class demonstrations and laboratories. Students borrow the equipment from the department's equipment room, and perform the experiment either at home or in the student lounge or student shop work area. Advantages include: (1) easy implementation, especially for large classes, (2) low cost and easy duplication of multiple units, (3) no loss of lecture time since the take-home experiment is self-contained with all necessary instructions, and (4) negligible increase in student or teaching assistant work load since the experiment is assigned as a homework problem in place of a traditional pen and paper problem. As an example, a pump flow take-home experiment was developed, implemented, and assessed in our introductory junior-level fluid mechanics course at Penn State. The experimental apparatus consists of a bucket, tape measure, submersible aquarium pump, tubing, measuring cup, and extension cord. We put together twenty sets at a total cost of less than 20 dollars per set. Students connect the tube to the pump outlet, submerge the pump in water, and measure the volume flow rate produced at various outflow elevations. They record and plot volume flow rate as a function of outlet elevation, and compare with predictions based on the manufacturer's pump performance curve (head versus volume flow rate) and flow losses. The homework assignment includes an online pre-test and post-test to assess the change in students' understanding of the principles of pump performance. The results of the assessment support a significant learning gain following the completion of the take-home experiment.
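The prediction step students carry out can be sketched by intersecting a pump curve with the required head. The quadratic pump-curve and tube-loss coefficients below are hypothetical stand-ins, not the manufacturer's data for the aquarium pump.

```python
import math

# Assumed quadratic pump curve H(Q) = H0 - k*Q^2 intersected with the static
# lift z plus a simple quadratic tube-loss term c*Q^2. All three coefficients
# are invented for illustration.
H0 = 1.2     # shutoff head, m (assumed)
k = 2.0e5    # pump-curve coefficient, m per (m^3/s)^2 (assumed)
c = 0.5e5    # tube-loss coefficient, m per (m^3/s)^2 (assumed)

def predicted_flow(z):
    """Predicted volume flow rate (m^3/s) at outlet elevation z (m)."""
    if z >= H0:
        return 0.0          # outlet above shutoff head: no flow
    # Operating point: H0 - k*Q^2 = z + c*Q^2  =>  Q = sqrt((H0 - z)/(k + c))
    return math.sqrt((H0 - z) / (k + c))

for z in (0.2, 0.6, 1.0):
    print(f"z = {z:.1f} m -> Q = {predicted_flow(z) * 1000:.2f} L/s")
```

Plotting measured flow rate against this prediction at each elevation mirrors the comparison the assignment asks students to make with the real pump performance curve and flow losses.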
ERIC Educational Resources Information Center
de Pablos, Patricia Ordonez
2006-01-01
Purpose: The purpose of this paper is to analyse knowledge transfers in transnational corporations. Design/methodology/approach: The paper develops a conceptual framework for the analysis of knowledge flow transfers in transnationals. Based on this theoretical framework, the paper proposes research hypotheses and builds a causal model that links…
Achieving Agility and Stability in Large-Scale Software Development
2013-01-16
temporary team is assigned to prepare layers and frameworks (presentation layer, domain layer, data access layer) for future feature teams.
Modelling information flow along the human connectome using maximum flow.
Lyoo, Youngwook; Kim, Jieun E; Yoon, Sujung
2018-01-01
The human connectome is a complex network that transmits information between interlinked brain regions. Using graph theory, previously well-known network measures of integration between brain regions have been constructed under the key assumption that information flows strictly along the shortest paths possible between two nodes. However, it is now apparent that information does flow through non-shortest paths in many real-world networks such as cellular networks, social networks, and the internet. In the current hypothesis, we present a novel framework using the maximum flow to quantify information flow along all possible paths within the brain, so as to implement an analogy to network traffic. We hypothesize that the connection strengths of brain networks represent a limit on the amount of information that can flow through the connections per unit of time. This allows us to compute the maximum amount of information flow between two brain regions along all possible paths. Using this novel framework of maximum flow, previous network topological measures are expanded to account for information flow through non-shortest paths. The most important advantage of the current approach using maximum flow is that it can integrate the weighted connectivity data in a way that better reflects the real information flow of the brain network. The current framework and its concept regarding maximum flow provides insight on how network structure shapes information flow in contrast to graph theory, and suggests future applications such as investigating structural and functional connectomes at a neuronal level. Copyright © 2017 Elsevier Ltd. All rights reserved.
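The capacity-limited view of information flow described above can be illustrated with a standard Edmonds-Karp maximum-flow computation. The 4-node "connectome" and its connection strengths below are invented for demonstration.

```python
from collections import deque

# Edmonds-Karp max-flow: repeatedly find a shortest augmenting path by BFS
# in the residual graph and push the bottleneck capacity along it.
def max_flow(cap, s, t):
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:          # no augmenting path left
            return total
        # Bottleneck along the path, then augment forward/backward flows.
        v, push = t, float("inf")
        while v != s:
            u = parent[v]
            push = min(push, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += push
            flow[v][u] -= push
            v = u
        total += push

# Toy network: region 0 reaches region 3 via two parallel routes whose
# capacities (connection strengths) limit the total information throughput.
cap = [[0, 3, 2, 0],
       [0, 0, 0, 3],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
print(max_flow(cap, 0, 3))  # prints 5: both routes contribute
```

Unlike a shortest-path measure, which would credit only one route, the max-flow value sums the contribution of every edge-disjoint route, which is the behavior the hypothesis exploits.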
Optimal erasure protection for scalably compressed video streams with limited retransmission.
Taubman, David; Thie, Johnson
2005-08-01
This paper shows how the priority encoding transmission (PET) framework may be leveraged to exploit both unequal error protection and limited retransmission for RD-optimized delivery of streaming media. Previous work on scalable media protection with PET has largely ignored the possibility of retransmission. Conversely, the PET framework has not been harnessed by the substantial body of previous work on RD optimized hybrid forward error correction/automatic repeat request schemes. We limit our attention to sources which can be modeled as independently compressed frames (e.g., video frames), where each element in the scalable representation of each frame can be transmitted in one or both of two transmission slots. An optimization algorithm determines the level of protection which should be assigned to each element in each slot, subject to transmission bandwidth constraints. To balance the protection assigned to elements which are being transmitted for the first time with those which are being retransmitted, the proposed algorithm formulates a collection of hypotheses concerning its own behavior in future transmission slots. We show how the PET framework allows for a decoupled optimization algorithm with only modest complexity. Experimental results obtained with Motion JPEG2000 compressed video demonstrate that substantial performance benefits can be obtained using the proposed framework.
Agent-based Large-Scale Emergency Evacuation Using Real-Time Open Government Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Wei; Liu, Cheng; Bhaduri, Budhendra L
The open government initiatives have provided tremendous data resources for the transportation system and emergency services in urban areas. This paper proposes a traffic simulation framework using high temporal resolution demographic data and real-time open government data for evacuation planning and operation. A comparison study using real-world data in Seattle, Washington is conducted to evaluate the framework's accuracy and evacuation efficiency. The successful simulations of the selected area prove the concept of taking advantage of open government data, open-source data, and high-resolution demographic data in the emergency management domain. Two aspects of parameters are considered in this study: user equilibrium (UE) conditions of the traffic assignment model (simple non-UE vs. iterative UE) and data temporal resolution (daytime vs. nighttime). Evacuation arrival rate, average travel time, and computation time are adopted as Measures of Effectiveness (MOE) for evacuation performance analysis. The temporal resolution of demographic data has significant impacts on urban transportation dynamics during evacuation scenarios. Better evacuation performance estimation can be approached by integrating both non-UE and UE scenarios. The new framework shows flexibility in implementing different evacuation strategies and accuracy in evacuation performance. The use of this framework can be extended to day-to-day traffic assignment to support daily traffic operations.
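The non-UE vs. iterative UE contrast can be sketched on a toy two-route network using the method of successive averages (MSA) with BPR volume-delay functions. The link parameters and demand below are invented; this is not the paper's Seattle model.

```python
# Standard BPR volume-delay function: t0 * (1 + 0.15 * (v/c)^4).
def bpr(t0, flow, capacity):
    return t0 * (1.0 + 0.15 * (flow / capacity) ** 4)

def msa_assign(demand, links, n_iters=200):
    """links: list of (free_flow_time, capacity) per parallel route.
    Starts from an even (non-UE) split and iterates toward user equilibrium
    with the method of successive averages."""
    flows = [demand / len(links)] * len(links)
    for k in range(1, n_iters + 1):
        times = [bpr(t0, f, c) for (t0, c), f in zip(links, flows)]
        # All-or-nothing load onto the currently fastest route...
        best = min(range(len(links)), key=lambda i: times[i])
        target = [demand if i == best else 0.0 for i in range(len(links))]
        # ...then blend with step size 1/k (MSA).
        step = 1.0 / k
        flows = [f + step * (t - f) for f, t in zip(flows, target)]
    return flows

links = [(10.0, 400.0), (15.0, 600.0)]   # (minutes, veh/h), assumed
flows = msa_assign(1000.0, links)
times = [bpr(t0, f, c) for (t0, c), f in zip(links, flows)]
```

At convergence the two route travel times are nearly equal (Wardrop's first principle), whereas the initial even split, a stand-in for the simple non-UE case, leaves one route markedly faster.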
Aiewsakun, Pakorn; Simmonds, Peter
2018-02-20
The International Committee on Taxonomy of Viruses (ICTV) classifies viruses into families, genera and species and provides a regulated system for their nomenclature that is universally used in virus descriptions. Virus taxonomic assignments have traditionally been based upon virus phenotypic properties such as host range, virion morphology and replication mechanisms, particularly at family level. However, gene sequence comparisons provide a clearer guide to their evolutionary relationships and provide the only information that may guide the incorporation of viruses detected in environmental (metagenomic) studies that lack any phenotypic data. The current study sought to determine whether the existing virus taxonomy could be reproduced by examination of genetic relationships through the extraction of protein-coding gene signatures and genome organisational features. We found large-scale consistency between genetic relationships and taxonomic assignments for viruses of all genome configurations and genome sizes. The analysis pipeline that we have called 'Genome Relationships Applied to Virus Taxonomy' (GRAViTy) was highly effective at reproducing the current assignments of viruses at family level as well as inter-family groupings into orders. Its ability to correctly differentiate assigned viruses from unassigned viruses, and classify them into the correct taxonomic group, was evaluated by a threefold cross-validation technique. This predicted family membership of eukaryotic viruses with close to 100% accuracy and specificity, potentially enabling the algorithm to predict assignments for the vast corpus of metagenomic sequences consistently with ICTV taxonomy rules. In an evaluation run of GRAViTy, over one half (460/921) of (near-)complete genome sequences from several large published metagenomic eukaryotic virus datasets were assigned to 127 novel family-level groupings.
If corroborated by other analysis methods, these would potentially more than double the number of eukaryotic virus families in the ICTV taxonomy. A rapid and objective means to explore metagenomic viral diversity and make informed recommendations for their assignments at each taxonomic layer is essential. GRAViTy provides one means to make rule-based assignments at family and order levels in a manner that preserves the integrity and underlying organisational principles of the current ICTV taxonomy framework. Such methods are increasingly required as the vast virosphere is explored.
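GRAViTy's classification accuracy was evaluated by threefold cross-validation. The general mechanics of such an evaluation can be sketched with a toy nearest-neighbour classifier on invented two-dimensional "genome feature" vectors; none of the data or the classifier below comes from GRAViTy itself.

```python
import math
import random

def k_fold_indices(n, k):
    # shuffle indices reproducibly, then deal them into k disjoint folds
    idx = list(range(n))
    random.Random(0).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def nearest_neighbor_predict(train, query):
    # 1-NN on Euclidean distance over feature vectors; train items are (features, label)
    return min(train, key=lambda t: math.dist(t[0], query))[1]

def cross_validate(data, k=3):
    folds = k_fold_indices(len(data), k)
    correct = 0
    for fold in folds:
        test = [data[i] for i in fold]
        train = [data[i] for i in range(len(data)) if i not in fold]
        correct += sum(nearest_neighbor_predict(train, x) == y for x, y in test)
    return correct / len(data)

# two well-separated toy "virus families"
data = ([((float(i), 0.0), "A") for i in range(6)]
        + [((float(i) + 100.0, 100.0), "B") for i in range(6)])
acc = cross_validate(data, k=3)
```

Each fold is held out in turn, so every item is classified exactly once by a model that never saw it during training, which is what makes the accuracy estimate honest.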
ERIC Educational Resources Information Center
Stanford Univ., CA. School Mathematics Study Group.
This is the second unit of a 15-unit School Mathematics Study Group (SMSG) mathematics text for high school students. Topics presented in the first chapter (Informal Algorithms and Flow Charts) include: changing a flat tire; algorithms, flow charts, and computers; assignment and variables; input and output; using a variable as a counter; decisions…
Empirical model for the volume-change behavior of debris flows
Cannon, S.H.
1993-01-01
The potential travel distance of a debris flow depends on the volume-change behavior of flows as they travel down hillsides; movement stops where the volume of actively flowing debris becomes negligible. The average change in volume over distance for 26 recent debris flows in the Honolulu area was assumed to be a function of the slope over which the debris flow traveled, the degree of flow confinement by the channel, and an assigned value for the type of vegetation through which the debris flow traveled. Analysis of the data yielded a relation that can be incorporated into digital elevation models to characterize debris-flow travel on Oahu.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vernon, Christopher R.; Arntzen, Evan V.; Richmond, Marshall C.
Assessing the environmental benefits of proposed flow modifications to large rivers provides invaluable insight into future hydropower project operations and relicensing activities. Providing a means to quantitatively define flow-ecology relationships is integral to establishing flow regimes that are mutually beneficial to power production and ecological needs. To complement this effort, an opportunity has arisen to create versatile tools that can be applied to broad geographic areas. In particular, integration with efforts standardized within the ecological limits of hydrologic alteration (ELOHA) framework is highly advantageous (Poff et al. 2010). This paper presents a geographic information system (GIS) framework for large river classification that houses a base geomorphic classification that is both flexible and accurate, allowing for full integration with other hydrologic models focused on addressing ELOHA efforts. A case study is also provided that integrates publicly available National Hydrography Dataset Plus Version 2 (NHDPlusV2) data, Modular Aquatic Simulation System two-dimensional (MASS2) hydraulic data, and field-collected data into the framework to produce a suite of flow-ecology related outputs. The case study objective was to establish areas of optimal juvenile salmonid rearing habitat under varying flow regimes throughout an impounded portion of the lower Snake River, USA (Figure 1) as an indicator to determine sites where the potential exists to create additional shallow-water habitat. Additionally, an alternative hydrologic classification usable throughout the contiguous United States, which can be coupled with the geomorphic aspect of this framework, is also presented. This framework provides the user with the ability to integrate hydrologic and ecologic data into its base geomorphic classification within a GIS to output spatiotemporally variable flow-ecology relationship scenarios.
Towards Automated Structure-Based NMR Resonance Assignment
NASA Astrophysics Data System (ADS)
Jang, Richard; Gao, Xin; Li, Ming
We propose a general framework for solving the structure-based NMR backbone resonance assignment problem. The core is a novel 0-1 integer programming model that can start from a complete or partial assignment, generate multiple assignments, and model not only the assignment of spins to residues, but also pairwise dependencies consisting of pairs of spins assigned to pairs of residues. It is still a challenge for automated resonance assignment systems to perform the assignment directly from spectra without any manual intervention. To test the feasibility of this for structure-based assignment, we integrated our system with our automated peak picking and sequence-based resonance assignment system to obtain an assignment for the protein TM1112 with 91% recall and 99% precision without manual intervention. Since using a known structure has the potential to allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data, we work towards the goal of automated structure-based assignment using only such labeled data. Our system reduced the assignment error of Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which to our knowledge is the most error-tolerant method for this problem, fivefold on average. By using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for ubiquitin, where the type prediction accuracy is 83%, we achieved 91% assignment accuracy, compared to the 59% accuracy obtained without correcting for typing errors.
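The core of the method is a 0-1 integer programming model assigning spins to residues. As a much-simplified sketch of that idea (ignoring the pairwise dependencies and using exhaustive search instead of an IP solver), one can minimize total assignment cost over all one-to-one mappings; the cost matrix below is hypothetical.

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustive 0-1 assignment: cost[i][j] is the penalty of mapping
    spin i to residue j; returns (min_cost, mapping) where mapping[i]
    is the residue assigned to spin i. Feasible only for small n."""
    n = len(cost)
    best = (float("inf"), None)
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best[0]:
            best = (total, perm)
    return best

# hypothetical 3x3 cost matrix (low cost = good spectral match)
cost = [[1, 9, 9],
        [9, 9, 1],
        [9, 1, 9]]
total, mapping = best_assignment(cost)  # → spins 0,1,2 map to residues 0,2,1
```

A real solver replaces the factorial enumeration with branch-and-bound over 0-1 variables, which is what makes the pairwise spin-pair/residue-pair terms in the paper's model tractable.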
Machine learning in a graph framework for subcortical segmentation
NASA Astrophysics Data System (ADS)
Guo, Zhihui; Kashyap, Satyananda; Sonka, Milan; Oguz, Ipek
2017-02-01
Automated and reliable segmentation of subcortical structures from human brain magnetic resonance images is of great importance for volumetric and shape analyses in quantitative neuroimaging studies. However, poor boundary contrast and the variable shape of these structures make automated segmentation a challenging task. We propose a 3D graph-based machine learning method, called LOGISMOS-RF, to segment the caudate and the putamen from brain MRI scans in a robust and accurate way. An atlas-based tissue classification and bias-field correction method is applied to the images to generate an initial segmentation for each structure. A 3D graph framework is then utilized to construct a geometric graph for each initial segmentation. A locally trained random forest classifier is used to assign a cost to each graph node, and the max-flow algorithm is applied to solve the segmentation problem. Evaluation was performed on a dataset of T1-weighted MRIs of 62 subjects, with 42 images used for training and 20 images for testing. For comparison, the FreeSurfer, FSL and BRAINSCut approaches were also evaluated using the same dataset. Dice overlap coefficients and surface-to-surface distances between the automated and expert manual segmentations indicate that the results of our method are statistically significantly more accurate than those of the three other methods, for both the caudate (Dice: 0.89 +/- 0.03) and the putamen (0.89 +/- 0.03).
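The segmentation above is solved by assigning node costs and running a max-flow algorithm. A minimal Edmonds-Karp max-flow sketch on a toy four-node graph illustrates the underlying computation; the graph and capacities are invented and unrelated to LOGISMOS-RF's actual graph construction.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly find a shortest augmenting path by BFS
    on the residual capacity matrix and push the bottleneck flow."""
    n = len(cap)
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in range(n):
                if v not in parent and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:          # no augmenting path left: done
            return flow
        path, v = [], t              # walk parents back to the source
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][v] for u, v in path)
        for u, v in path:            # update residual capacities
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
        flow += bottleneck

# toy graph: node 0 = source, node 3 = sink
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 4],
       [0, 0, 0, 0]]
total = max_flow(cap, 0, 3)
```

By the max-flow/min-cut theorem, the value returned equals the cheapest cut separating source from sink, which is why min-cut formulations are a natural fit for boundary-finding in segmentation.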
Vulnerability detection using data-flow graphs and SMT solvers
2016-10-31
concerns. The framework is modular and pipelined to allow scalable analysis on distributed systems. Our vulnerability detection framework employs machine...Design We designed the framework to be modular to enable flexible reuse and extendibility. In its current form, our framework performs the following
A Level-set based framework for viscous simulation of particle-laden supersonic flows
NASA Astrophysics Data System (ADS)
Das, Pratik; Sen, Oishik; Jacobs, Gustaaf; Udaykumar, H. S.
2017-06-01
Particle-laden supersonic flows are important in natural and industrial processes such as volcanic eruptions, explosions, and pneumatic conveyance of particles in material processing. Numerical study of such high-speed particle-laden flows at the mesoscale calls for a numerical framework that allows simulation of supersonic flow around multiple moving solid objects. Only a few efforts have been made toward the development of numerical frameworks for viscous simulation of particle-fluid interaction in the supersonic flow regime. The current work presents a Cartesian grid based sharp-interface method for viscous simulations of the interaction between supersonic flows and moving rigid particles. The no-slip boundary condition is imposed at the solid-fluid interfaces using a modified ghost fluid method (GFM). The current method is validated against the similarity solution of the compressible boundary layer over a flat plate and a benchmark numerical solution for steady supersonic flow over a cylinder. Further validation is carried out against benchmark numerical results for shock-induced lift-off of a cylinder in a shock tube. A 3D simulation of steady supersonic flow over a sphere is performed to compare the numerically obtained drag coefficient with experimental results. A particle-resolved viscous simulation of shock interaction with a cloud of particles is performed to demonstrate that the current method is suitable for large-scale particle-resolved simulations of particle-laden supersonic flows.
A Study of Flow Theory in the Foreign Language Classroom.
ERIC Educational Resources Information Center
Egbert, Joy
2003-01-01
Focuses on the relationship between flow experiences and language learning. Flow theory suggests that flow experiences can lead to optimal learning. Findings suggest flow does exist in the foreign language classroom and that flow theory offers an interesting and useful framework for conceptualizing and evaluating language learning activities.…
A Conceptual Framework for the Indirect Method of Reporting Net Cash Flow from Operating Activities
ERIC Educational Resources Information Center
Wang, Ting J.
2010-01-01
This paper describes the fundamental concept of the reconciliation behind the indirect method of the statement of cash flows. A conceptual framework is presented to demonstrate how accrual and cash-basis accounting methods relate to each other and to illustrate the concept of reconciling these two accounting methods. The conceptual framework…
Reframing landscape fragmentation's effects on ecosystem services.
Mitchell, Matthew G E; Suarez-Castro, Andrés F; Martinez-Harms, Maria; Maron, Martine; McAlpine, Clive; Gaston, Kevin J; Johansen, Kasper; Rhodes, Jonathan R
2015-04-01
Landscape structure and fragmentation have important effects on ecosystem services, with a common assumption being that fragmentation reduces service provision. This is based on fragmentation's expected effects on ecosystem service supply, but ignores how fragmentation influences the flow of services to people. Here we develop a new conceptual framework that explicitly considers the links between landscape fragmentation, the supply of services, and the flow of services to people. We argue that fragmentation's effects on ecosystem service flow can be positive or negative, and use our framework to construct testable hypotheses about the effects of fragmentation on final ecosystem service provision. Empirical efforts to apply and test this framework are critical to improving landscape management for multiple ecosystem services. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Gas-Kinetic Scheme for Reactive Flows
NASA Technical Reports Server (NTRS)
Lian, Yong-Sheng; Xu, Kun
1998-01-01
In this paper, the gas-kinetic BGK scheme for the compressible flow equations is extended to chemically reactive flow. The mass fraction of the unburnt gas is implemented into the gas-kinetic equation by assigning a new internal degree of freedom to the particle distribution function. The new variable can also be used to describe fluid trajectories in nonreactive flows. Due to the gas-kinetic BGK model, the current scheme essentially solves the Navier-Stokes chemically reactive flow equations. Numerical tests validate the accuracy and robustness of the current kinetic method.
NASA Astrophysics Data System (ADS)
Li, Ni; Huai, Wenqing; Wang, Shaodan
2017-08-01
C2 (command and control) is understood to be a critical military component in meeting an increasing demand for rapid information gathering and real-time decision-making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework is proposed that specifies a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (coalition battle management language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Unlike most WTA problem descriptions, here sensors are considered to be available detection resources and the relationship constraints between weapons and sensors are also taken into account, which brings the formulation much closer to actual applications. A modified differential evolution (MDE) algorithm was developed to solve this high-dimensional optimisation problem and obtained an optimal assignment plan with high efficiency. In a case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification. The new optimisation solution was also used to solve the WTA problem efficiently and successfully.
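The WTA problem above is solved with a modified differential evolution (MDE) algorithm. The paper's modifications are not described here, so the sketch below shows only a standard DE/rand/1/bin loop minimizing a toy continuous objective; the population size, F, CR, and bounds are illustrative defaults, and a real WTA solver would additionally use a discrete weapon-target encoding.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=0):
    """Plain DE/rand/1/bin: mutate with a scaled difference of two random
    members, binomially crossover with the current member, keep the better."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            jrand = rng.randrange(dim)  # force at least one mutated coordinate
            trial = [
                min(max(a[d] + F * (b[d] - c[d]), bounds[d][0]), bounds[d][1])
                if (rng.random() < CR or d == jrand) else pop[i][d]
                for d in range(dim)
            ]
            if f(trial) <= f(pop[i]):   # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

# minimize the sphere function; optimum at the origin
best = differential_evolution(lambda x: sum(v * v for v in x),
                              bounds=[(-5.0, 5.0)] * 3)
```

The greedy selection guarantees the population's best objective value never worsens, which is the property MDE-style variants build on.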
Sensor assignment to mission in AI-TECD
NASA Astrophysics Data System (ADS)
Ganger, Robert; de Mel, Geeth; Pham, Tien; Rudnicki, Ronald; Schreiber, Yonatan
2016-05-01
Sensor-mission assignment involves the allocation of sensors and other information-providing resources to missions in order to cover the information needs of the individual tasks within each mission. The importance of efficient and effective means to find appropriate resources for tasks is exacerbated in the coalition context where the operational environment is dynamic and a multitude of critically important tasks need to achieve their collective goals to meet the objectives of the coalition. The Sensor Assignment to Mission (SAM) framework—a research product of the International Technology Alliance in Network and Information Sciences (NIS-ITA) program—provided the first knowledge intensive resource selection approach for the sensor network domain so that contextual information could be used to effectively select resources for tasks in coalition environments. Recently, CUBRC, Inc. was tasked with operationalizing the SAM framework through the use of the I2WD Common Core Ontologies for the Communications-Electronics Research, Development and Engineering Center (CERDEC) sponsored Actionable Intelligence Technology Enabled Capabilities Demonstration (AI-TECD). The demonstration event took place at Fort Dix, New Jersey during July 2015, and this paper discusses the integration and the successful demonstration of the SAM framework within the AI-TECD, lessons learned, and its potential impact in future operations.
Surface phenomena revealed by in situ imaging: studies from adhesion, wear and cutting
NASA Astrophysics Data System (ADS)
Viswanathan, Koushik; Mahato, Anirban; Yeung, Ho; Chandrasekar, Srinivasan
2017-03-01
Surface deformation and flow phenomena are ubiquitous in mechanical processes. In this work we present an in situ imaging framework for studying a range of surface mechanical phenomena at high spatial resolution and across a range of time scales. The in situ framework is capable of resolving deformation and flow fields quantitatively in terms of surface displacements, velocities, strains and strain rates. Three case studies are presented demonstrating the power of this framework for studying surface deformation. In the first, the origin of stick-slip motion at adhesive polymer interfaces is investigated, revealing an intimate link between stick-slip and surface wave propagation. Second, the role of flow in mediating the formation of surface defects and wear particles in metals is analyzed using a prototypical sliding process. It is shown that conventional post-mortem observation and inference can lead to erroneous conclusions with regard to the formation of surface cracks and wear particles. The in situ framework is shown to unambiguously capture delamination wear in sliding. Third, material flow and surface deformation in a typical cutting process are analyzed. It is shown that a long-standing problem in the cutting of annealed metals is resolved by the imaging, with other benefits such as estimation of energy dissipation and power from the flow fields. In closing, guidelines are provided for profitably exploiting in situ observations to study large-strain deformation, flow and friction phenomena at surfaces that display a variety of time scales.
Villa, Stefano; Prenestini, Anna; Giusepi, Isabella
2014-04-01
Through a comparative study of six Italian hospitals, the paper develops and tests a framework to analyze hospital-wide patient flow performance. The framework adopts a system-wide approach to patient flow management and is structured around three different levels: (1) the hospital, (2) the pipelines (possible patient journeys within the hospital) and (3) the production units (physical spaces, such as operating rooms, where service delivery takes place). The focus groups and the data analysis conducted within the study confirm that the model is a useful tool to investigate hospital-wide implications of patient flows. The paper also provides evidence about the causes of hospital patient flow problems. In particular, while shortage of capacity does not seem to be a relevant driver, our data show that patient flow variability caused by inadequate allocation of capacity does represent a key problem. Results also show that the lack of coordination between different pipelines and production units is critical. Finally, the problem of overlap between elective and unscheduled cases can be solved by setting aside a certain level of capacity for unexpected peaks. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Axiomatic Design of a Framework for the Comprehensive Optimization of Patient Flows in Hospitals
Matt, Dominik T.
2017-01-01
Lean Management and Six Sigma are nowadays applied not only to the manufacturing industry but also to service industry and public administration. The manifold variables affecting the Health Care system minimize the effect of a narrow Lean intervention. Therefore, this paper aims to discuss a comprehensive, system-based approach to achieve a factual holistic optimization of patient flows. This paper debates the efficacy of Lean principles applied to the optimization of patient flows and related activities, structures, and resources, developing a theoretical framework based on the principles of the Axiomatic Design. The demand for patient-oriented and efficient health services leads to use these methodologies to improve hospital processes. In the framework, patients with similar characteristics are clustered in families to achieve homogeneous flows through the value stream. An optimization checklist is outlined as the result of the mapping between Functional Requirements and Design Parameters, with the right sequence of the steps to optimize the patient flow according to the principles of Axiomatic Design. The Axiomatic Design-based top-down implementation of Health Care evidence, according to Lean principles, results in a holistic optimization of hospital patient flows, by reducing the complexity of the system. PMID:29065578
The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.
ERIC Educational Resources Information Center
New York Association of Training and Employment Professionals, Albany.
This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…
Improvement of a 2D numerical model of lava flows
NASA Astrophysics Data System (ADS)
Ishimine, Y.
2013-12-01
I propose an improved procedure that reduces an improper dependence of lava flow directions on the orientation of the Digital Elevation Model (DEM) in two-dimensional simulations based on Ishihara et al. (in Lava Flows and Domes, Fink, JH eds., 1990). The numerical model for lava flow simulations proposed by Ishihara et al. (1990) is based on a two-dimensional shallow-water model combined with a constitutive equation for a Bingham fluid. It is simple but useful because it properly reproduces the distributions of actual lava flows. It is thus regarded as one of the pioneering works on numerical simulation of lava flows, and it is still widely used in practical hazard prediction maps for civil defense officials in Japan. However, the model includes an improper dependence of lava flow directions on the orientation of the DEM, because it separately applies the condition for the lava flow to stop due to yield stress along each of the two orthogonal axes of the rectangular calculation grid based on the DEM. This procedure produces a diamond-shaped distribution, as shown in Fig. 1, when calculating a lava flow supplied from a point source on a virtual flat plane, although the distribution should be circular. To remedy this drawback, I propose a modified procedure that uses the absolute value of the yield stress derived from both components of the two orthogonal slope directions to assign the condition for lava flows to stop. This yields a better result, as shown in Fig. 2. Fig. 1. (a) Contour plots calculated with the original model of Ishihara et al. (1990). (b) Contour plots calculated with the proposed model.
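The drawback described above, applying the stopping criterion separately along each grid axis rather than to the slope magnitude, can be illustrated numerically. The sketch below uses a Bingham-style driving stress rho*g*h*|slope| with invented values; it is not Ishihara et al.'s actual formulation, only a caricature of the directional bias.

```python
import math

def stops_per_axis(sx, sy, rho, g, h, yield_stress):
    """Axis-by-axis scheme: test each orthogonal slope component separately."""
    tx = rho * g * h * abs(sx)   # driving stress along the x grid axis
    ty = rho * g * h * abs(sy)   # driving stress along the y grid axis
    return tx <= yield_stress and ty <= yield_stress

def stops_magnitude(sx, sy, rho, g, h, yield_stress):
    """Magnitude scheme: test the modulus of the slope vector."""
    t = rho * g * h * math.hypot(sx, sy)
    return t <= yield_stress

# a diagonally oriented slope: each component is below the threshold,
# but the true driving stress (along the steepest descent) is not
rho, g, h, ys = 2500.0, 9.8, 1.0, 20000.0   # illustrative values
sx = sy = 0.7
per_axis = stops_per_axis(sx, sy, rho, g, h, ys)
magnitude = stops_magnitude(sx, sy, rho, g, h, ys)
```

On a diagonal slope the per-axis test declares the flow stopped while the magnitude test does not, which is exactly the grid-orientation bias that produces diamond-shaped deposits on a flat plane.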
NASA Astrophysics Data System (ADS)
Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia
2015-04-01
Climate uncertainty can affect water resources availability and management decisions. Sustainable water resources management therefore requires evaluation of policy and management decisions under a wide range of possible future water supply conditions. This study proposes a risk-based framework to integrate water supply uncertainty into a forward-looking decision-making context. To apply this framework, a stochastic reconstruction scheme is used to generate a large ensemble of flow series. For the Rocky Mountain basins considered here, two key characteristics of the annual hydrograph are the annual flow volume and the timing of the seasonal flood peak. These are perturbed to represent natural randomness and potential changes due to future climate. 30-year series of perturbed flows are used as input to the SWAMP model, an integrated water resources model that simulates the regional water supply-demand system and estimates the economic productivity of water and other sustainability indicators, including system vulnerability and resilience. The simulation results are used to construct 2D maps of the net revenue of a particular water sector, e.g., hydropower, or of all sectors combined. Each map cell represents a risk scenario of net revenue based on a particular annual flow volume, timing of the peak flow, and 200 stochastic realizations of flow series. This framework is demonstrated for a water resources system in the Saskatchewan River Basin (SaskRB) in Saskatchewan, Canada. Critical historical drought sequences, derived from tree-ring reconstructions of several hundred years of annual river flows, are used to evaluate the system's performance (net revenue risk) under extremely low flow conditions and also to locate them on the previously produced 2D risk maps.
This simulation and analysis framework is repeated under various reservoir operation strategies (e.g., maximizing flood protection or maximizing water supply security), development proposals such as irrigation expansion, and changes in energy prices. Such risk-based analysis demonstrates the relative reduction or increase in risk associated with management and policy decisions and allows decision makers to explore the relative importance of policy versus natural water supply change in a water resources system.
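The 2D risk maps described above, one net-revenue estimate per (annual volume, peak timing) cell aggregated over stochastic realizations, can be sketched as follows; the revenue function and every number below are hypothetical stand-ins for the SWAMP model outputs, not results from the study.

```python
import random

def revenue(volume, peak_week, rng):
    # hypothetical net-revenue model: more water and a later peak earn more,
    # with multiplicative noise standing in for one stochastic flow realization
    noise = rng.gauss(1.0, 0.1)
    return max(0.0, volume * (0.5 + 0.05 * peak_week) * noise)

def risk_map(volumes, peak_weeks, n_realizations=200, seed=1):
    """Each cell holds the mean revenue over the stochastic realizations,
    mirroring one scenario cell of the paper's 2D risk maps."""
    rng = random.Random(seed)
    return [[sum(revenue(v, w, rng) for _ in range(n_realizations)) / n_realizations
             for w in peak_weeks]
            for v in volumes]

# 3x3 grid: rows = annual volumes, columns = peak-timing scenarios
grid = risk_map(volumes=[5.0, 10.0, 15.0], peak_weeks=[0, 4, 8])
```

A decision maker would then overlay observed or reconstructed drought years onto such a grid to read off the revenue risk of comparable supply conditions.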
USDA-ARS?s Scientific Manuscript database
The objective was to examine the effect of maternal nutrient restriction followed by realimentation during mid-gestation on uterine blood flow (BF). On Day 30 of pregnancy, lactating, multiparous Simmental beef cows were assigned randomly to treatments: control (CON; 100% National Research Council; ...
50 CFR 679.93 - Amendment 80 Program recordkeeping, permits, monitoring, and catch accounting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED... space to accommodate a minimum of 10 observer sampling baskets. This space must be within or adjacent to... observers assigned to the vessel. (8) Belt and flow operations. The vessel operator stops the flow of fish...
76 FR 30322 - Notice of Availability of Government-Owned Inventions; Available for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-25
... below are assigned to the United States Government as represented by the Secretary of the Navy. U.S... ``Automatic Clock Synchronization and Distribution Circuit for Counter Clock Flow Pipelined Systems'' issued... Flow and Metallic Conformal Coating of Conductive Templates'' issued on October 12, 2010; U.S. Patent...
Neuhaus, Philipp; Doods, Justin; Dugas, Martin
2015-01-01
Automatic coding of medical terms is an important but highly complicated and laborious task. To compare and evaluate different strategies, a framework with a standardized web interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web API is feasible. This framework can be easily enhanced due to its modular design.
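One of the two compared strategies is similarity search in a table of terminology codes. A toy string-similarity mapper conveys the idea; the mapping table, threshold, and concept codes below are invented examples for illustration, not actual UMLS content or the framework's API.

```python
from difflib import SequenceMatcher

CODES = {  # hypothetical terminology table (labels are lowercase)
    "myocardial infarction": "C0027051",
    "diabetes mellitus": "C0011849",
    "hypertension": "C0020538",
}

def map_term(term, threshold=0.6):
    """Return (code, score) for the most similar known label,
    or (None, score) when nothing clears the similarity threshold."""
    best_label, best_score = max(
        ((label, SequenceMatcher(None, term.lower(), label).ratio())
         for label in CODES),
        key=lambda pair: pair[1])
    if best_score >= threshold:
        return CODES[best_label], best_score
    return None, best_score

code, score = map_term("myocardial infarct")
```

Serving `map_term` behind an HTTP endpoint that returns JSON would mirror the framework's standardized web interface; the low accuracy reported in the abstract is a reminder that such fuzzy matches still need specialist review.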
Kozachuk, Olesia; Yusenko, Kirill; Noei, Heshmat; Wang, Yuemin; Walleck, Stephan; Glaser, Thorsten; Fischer, Roland A
2011-08-14
Phase-pure crystalline thin films of a mixed-valence Ru(2)(II,III) metal-organic framework with 1,3,5-benzenetricarboxylate (btc) as a linker were solvothermally grown on amorphous alumina and silica surfaces. Based on the Rietveld refinement, the structure of Ru-MOF was assigned to be analogous to [Cu(3)(btc)(2)] (HKUST-1). This journal is © The Royal Society of Chemistry 2011
DOT National Transportation Integrated Search
1997-01-01
The success of Advanced Traveler Information Systems (ATIS) and Advanced Traffic Management Systems (ATMS) depends on the availability and dissemination of timely and accurate estimates of current and emerging traffic network conditions. Real-time Dy...
DOT National Transportation Integrated Search
2011-01-01
This study develops an enhanced transportation planning framework by augmenting the sequential four-step : planning process with post-processing techniques. The post-processing techniques are incorporated through a feedback : mechanism and aim to imp...
Application and Validation of Remaining Service Interval Framework for Pavements
DOT National Transportation Integrated Search
2016-10-01
The pavement remaining service interval (RSI) terminology was developed to remove confusion caused by the multitude of meanings assigned to the various forms of pavement remaining service life (RSL). The RSI concept considers the complete maintenance...
Hoffman, Robert A; Wang, Lili; Bigos, Martin; Nolan, John P
2012-09-01
Results from a standardization study cosponsored by the International Society for Advancement of Cytometry (ISAC) and the US National Institute of Standards and Technology (NIST) are reported. The study evaluated the variability of assigning intensity values to fluorophore standard beads by bead manufacturers and the variability of cross-calibrating the standard beads to stained polymer beads (hard-dyed beads) using different flow cytometers. Hard-dyed beads are generally not spectrally matched to the fluorophores used to stain cells, and spectral response varies among flow cytometers. Thus, if hard-dyed beads are used as fluorescence calibrators, one expects calibration for specific fluorophores (e.g., FITC or PE) to vary among different instruments. Using standard beads surface-stained with specific fluorophores (FITC, PE, APC, and Pacific Blue™), the study compared the measured intensity of fluorophore standard beads to that of hard-dyed beads through cross-calibration on 133 different flow cytometers. Using robust CV as a measure of variability, the variation of cross-calibrated values was typically 20% or more for a particular hard-dyed bead in a specific detection channel. The variation across different instrument models was often greater than the variation within a particular instrument model. As a separate part of the study, NIST and four bead manufacturers used a NIST-supplied protocol and calibrated fluorophore solution standards to assign intensity values to the fluorophore beads. Values assigned to the reference beads by different groups varied by orders of magnitude in most cases, reflecting differences in the instrumentation used to perform the calibration. The study concluded that the use of any spectrally unmatched hard-dyed bead as a general fluorescence calibrator must be verified and characterized for every particular instrument model. Close interaction between bead manufacturers and NIST is recommended to achieve reliably and uniformly assigned fluorescence standard beads. Copyright © 2012 International Society for Advancement of Cytometry.
Analysis of Complex Valve and Feed Systems
NASA Technical Reports Server (NTRS)
Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Cavallo, Peter; Dash, Sanford
2007-01-01
A numerical framework for the analysis of complex valve systems supports testing of propulsive systems by simulating key valve and control-system components in the test loop. In particular, it is designed to enhance analysis capability in terms of identifying system transients and quantifying the valve response to them. The system can simulate valve motion in complex systems operating in diverse flow regimes ranging from compressible gases to cryogenic liquids. A key feature is the hybrid, unstructured framework with sub-models for grid movement and phase change, including cryogenic cavitation. The multi-element unstructured framework offers improved predictions of valve performance characteristics under steady conditions for structurally complex valves such as the pressure-regulator valve. Unsteady simulations of valve motion using this computational approach have been carried out for various valves in operation at Stennis Space Center, such as the split-body valve, the 10-in. (approx. 25.4-cm) LOX (liquid oxygen) valve, and the 4-in. (approx. 10-cm) Y-pattern valve (liquid nitrogen). Such simulations make use of variable grid topologies, thereby maintaining solution accuracy and resolving important flow physics in the seat region of the moving valve. Advantages of this software include a possible reduction in testing costs incurred due to disruptions relating to unexpected flow transients or the functioning of valve/flow control systems. Prediction of the flow anomalies leading to system vibrations, flow resonance, and valve stall can help in valve scheduling and significantly reduce the need for activation tests. This framework has been evaluated for its ability to predict performance metrics such as the flow coefficient for cavitating venturis and valve coefficient curves, and could be a valuable tool in predicting and understanding anomalous behavior of system components at rocket propulsion testing and design sites.
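One performance metric named above, the valve flow coefficient, can be illustrated with the conventional incompressible-flow sizing relation Cv = Q·sqrt(SG/ΔP). This is a generic textbook formula, not code from the framework described in the abstract, and the numeric values are invented for illustration.

```python
import math

def flow_coefficient(q_gpm, dp_psi, specific_gravity=1.0):
    """Valve flow coefficient Cv = Q * sqrt(SG / dP) for incompressible flow
    (Q in US gpm, dP in psi, SG = specific gravity relative to water)."""
    return q_gpm * math.sqrt(specific_gravity / dp_psi)

# 100 gpm of water across a 4 psi drop gives Cv = 50.
cv = flow_coefficient(q_gpm=100.0, dp_psi=4.0)
```

A CFD framework like the one described would predict Cv from simulated flow fields rather than assume it; this sketch only shows the definition being predicted.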
47 CFR 32.6532 - Network administration expense.
Code of Federal Regulations, 2013 CFR
2013-10-01
... includes such activities as controlling traffic flow, administering traffic measuring and monitoring devices, assigning equipment and load balancing, collecting and summarizing traffic data, administering...
47 CFR 32.6532 - Network administration expense.
Code of Federal Regulations, 2012 CFR
2012-10-01
... includes such activities as controlling traffic flow, administering traffic measuring and monitoring devices, assigning equipment and load balancing, collecting and summarizing traffic data, administering...
47 CFR 32.6532 - Network administration expense.
Code of Federal Regulations, 2014 CFR
2014-10-01
... includes such activities as controlling traffic flow, administering traffic measuring and monitoring devices, assigning equipment and load balancing, collecting and summarizing traffic data, administering...
iNJclust: Iterative Neighbor-Joining Tree Clustering Framework for Inferring Population Structure.
Limpiti, Tulaya; Amornbunchornvej, Chainarong; Intarapanich, Apichart; Assawamakin, Anunchai; Tongsima, Sissades
2014-01-01
Understanding genetic differences among populations is one of the most important issues in population genetics. Genetic variations, e.g., single nucleotide polymorphisms, are used to characterize the commonality and difference of individuals from various populations. This paper presents iNJclust, an efficient graph-based clustering framework that operates iteratively on the Neighbor-Joining (NJ) tree. The framework uses well-known genetic measurements, namely the allele-sharing distance, the neighbor-joining tree, and the fixation index. The behavior of the fixation index is utilized in the algorithm's stopping criterion. The algorithm provides an estimated number of populations, individual assignments, and relationships between populations as outputs. The clustering result is reported in the form of a binary tree, whose terminal nodes represent the final inferred populations and whose structure preserves the genetic relationships among them. The clustering performance and the robustness of the proposed algorithm are tested extensively using simulated and real data sets from bovine, sheep, and human populations. The results indicate that the number of populations within each data set is reasonably estimated, the individual assignment is robust, and the structure of the inferred population tree corresponds to the intrinsic relationships among populations within the data.
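One ingredient named above, the allele-sharing distance, can be sketched for biallelic SNPs coded 0/1/2 (count of the alternate allele). The function names and genotype values below are illustrative and are not taken from the iNJclust implementation.

```python
def allele_sharing_distance(g1, g2):
    """Allele-sharing distance between two individuals: per locus,
    |g1 - g2| / 2 under 0/1/2 genotype coding, averaged over loci."""
    assert len(g1) == len(g2)
    return sum(abs(a - b) for a, b in zip(g1, g2)) / (2 * len(g1))

def distance_matrix(genotypes):
    """Pairwise distance matrix for a list of genotype vectors, the
    usual input to neighbor-joining tree construction."""
    n = len(genotypes)
    return [[allele_sharing_distance(genotypes[i], genotypes[j])
             for j in range(n)] for i in range(n)]

# Identical individuals are at distance 0; opposite homozygotes at 1.
pop = [[0, 0, 2, 1], [0, 0, 2, 1], [2, 2, 0, 1]]
d = distance_matrix(pop)
```

A framework like iNJclust would then build the NJ tree from this matrix and cut it using a fixation-index criterion; that machinery is beyond this sketch.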
A computational fluid dynamics simulation framework for ventricular catheter design optimization.
Weisenberg, Sofy H; TerMaath, Stephanie C; Barbier, Charlotte N; Hill, Judith C; Killeffer, James A
2017-11-10
OBJECTIVE Cerebrospinal fluid (CSF) shunts are the primary treatment for patients suffering from hydrocephalus. While proven effective in symptom relief, these shunt systems are plagued by high failure rates and often require repeated revision surgeries to replace malfunctioning components. One of the leading causes of CSF shunt failure is obstruction of the ventricular catheter by aggregations of cells, proteins, blood clots, or fronds of choroid plexus that occlude the catheter's small inlet holes or even the full internal catheter lumen. Such obstructions can disrupt CSF diversion out of the ventricular system or impede it entirely. Previous studies have suggested that altering the catheter's fluid dynamics may help to reduce the likelihood of complete ventricular catheter failure caused by obstruction. However, systematic correlation between a ventricular catheter's design parameters and its performance, specifically its likelihood to become occluded, still remains unknown. Therefore, an automated, open-source computational fluid dynamics (CFD) simulation framework was developed for use in the medical community to determine optimized ventricular catheter designs and to rapidly explore parameter influence for a given flow objective. METHODS The computational framework was developed by coupling a 3D CFD solver and an iterative optimization algorithm and was implemented in a high-performance computing environment. The capabilities of the framework were demonstrated by computing an optimized ventricular catheter design that provides uniform flow rates through the catheter's inlet holes, a common design objective in the literature. The baseline computational model was validated using 3D nuclear imaging to provide flow velocities at the inlet holes and through the catheter. 
RESULTS The optimized catheter design achieved through use of the automated simulation framework improved significantly on previous attempts to reach a uniform inlet flow rate distribution using the standard catheter hole configuration as a baseline. While the standard ventricular catheter design featuring uniform inlet hole diameters and hole spacing has a standard deviation of 14.27% for the inlet flow rates, the optimized design has a standard deviation of 0.30%. CONCLUSIONS This customizable framework, paired with high-performance computing, provides a rapid method of design testing to solve complex flow problems. While a relatively simplified ventricular catheter model was used to demonstrate the framework, the computational approach is applicable to any baseline catheter model, and it is easily adapted to optimize catheters for the unique needs of different patients as well as for other fluid-based medical devices.
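The uniformity metric quoted in the results, the standard deviation of per-hole inlet flow rates expressed as a percentage of the mean, can be computed as follows; the flow values here are invented for illustration and are not the study's data.

```python
import statistics

def flow_uniformity_pct(flow_rates):
    """Population standard deviation of inlet flow rates,
    expressed as a percentage of the mean rate."""
    mean = statistics.fmean(flow_rates)
    return 100.0 * statistics.pstdev(flow_rates) / mean

# A perfectly uniform hole-flow distribution scores 0;
# any spread across holes raises the percentage.
uniform = [1.0, 1.0, 1.0, 1.0]
skewed = [0.5, 0.8, 1.2, 1.5]
u_pct = flow_uniformity_pct(uniform)
s_pct = flow_uniformity_pct(skewed)
```

Under this metric the abstract's comparison (14.27% for the standard design vs. 0.30% for the optimized one) is a direct reduction in relative spread across inlet holes.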
Documentation for the MODFLOW 6 framework
Hughes, Joseph D.; Langevin, Christian D.; Banta, Edward R.
2017-08-10
MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. Growing interest in surface-water and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion has led to the development of numerous MODFLOW versions. Often, there are incompatibilities between these different versions. This report describes a new MODFLOW framework called MODFLOW 6 that is designed to support multiple models and multiple types of models. The framework is written in Fortran using a modular object-oriented design. The primary framework components include the simulation (or main program), Timing Module, Solutions, Models, Exchanges, and Utilities. The first version of the framework focuses on numerical solutions, numerical models, and numerical exchanges. This focus on numerical models allows multiple numerical models to be tightly coupled at the matrix level.
Alternatives to the Randomized Controlled Trial
West, Stephen G.; Duan, Naihua; Pequegnat, Willo; Gaist, Paul; Des Jarlais, Don C.; Holtgrave, David; Szapocznik, José; Fishbein, Martin; Rapkin, Bruce; Clatts, Michael; Mullen, Patricia Dolan
2008-01-01
Public health researchers are addressing new research questions (e.g., effects of environmental tobacco smoke, Hurricane Katrina) for which the randomized controlled trial (RCT) may not be a feasible option. Drawing on the potential outcomes framework (Rubin Causal Model) and Campbellian perspectives, we consider alternative research designs that permit relatively strong causal inferences. In randomized encouragement designs, participants are randomly invited to participate in one of the treatment conditions, but are allowed to decide whether to receive treatment. In quantitative assignment designs, treatment is assigned on the basis of a quantitative measure (e.g., need, merit, risk). In observational studies, treatment assignment is unknown and presumed to be nonrandom. Major threats to the validity of each design and statistical strategies for mitigating those threats are presented. PMID:18556609
Simulation of Groundwater Flow in the Coastal Plain Aquifer System of Virginia
Heywood, Charles E.; Pope, Jason P.
2009-01-01
The groundwater model documented in this report simulates the transient evolution of water levels in the aquifers and confining units of the Virginia Coastal Plain and adjacent portions of Maryland and North Carolina since 1890. Groundwater withdrawals have lowered water levels in Virginia Coastal Plain aquifers and have resulted in drawdown in the Potomac aquifer exceeding 200 feet in some areas. The discovery of the Chesapeake Bay impact crater and a revised conceptualization of the Potomac aquifer are two major changes to the hydrogeologic framework that have been incorporated into the groundwater model. The spatial scale of the model was selected on the basis of the model's primary function: assessing the regional water-level responses of the confined aquifers beneath the Coastal Plain. Local horizontal groundwater flow through the surficial aquifer is not intended to be accurately simulated. Representation of recharge, evapotranspiration, and interaction with surface-water features, such as major rivers, lakes, the Chesapeake Bay, and the Atlantic Ocean, enables simulation of shallow flow-system details that influence locations of recharge to and discharge from the deeper confined flow system. The increased density of groundwater associated with the transition from fresh to salty groundwater near the Atlantic Ocean affects regional groundwater flow and was simulated with the Variable Density Flow Process of SEAWAT (a U.S. Geological Survey program for simulation of three-dimensional variable-density groundwater flow and transport). The groundwater density distribution was generated by a separate 108,000-year simulation of Pleistocene freshwater flushing around the Chesapeake Bay impact crater during transient sea-level changes. Specified-flux boundaries simulate increasing groundwater underflow out of the model domain into Maryland and minor underflow from the Piedmont Province into the model domain. 
Reported withdrawals accounted for approximately 75 percent of the total groundwater withdrawn from Coastal Plain aquifers during the year 2000. Unreported self-supplied withdrawals were simulated in the groundwater model by specifying their probable locations, magnitudes, and aquifer assignments on the basis of a separate study of domestic-well characteristics in Virginia. The groundwater flow model was calibrated to 7,183 historic water-level observations from 497 observation wells with the parameter-estimation codes UCODE-2005 and PEST. Most water-level observations were from the Potomac aquifer system, which permitted a more complex spatial distribution of simulated hydraulic conductivity within the Potomac aquifer than was possible for other aquifers. Zone, function, and pilot-point approaches were used to distribute assigned hydraulic properties within the aquifer system. The good fit (root mean square error = 3.6 feet) of simulated to observed water levels and reasonableness of the estimated parameter values indicate the model is a good representation of the physical groundwater flow system. The magnitudes and temporal and spatial distributions of residuals indicate no appreciable model bias. The model is intended to be useful for predicting changes in regional groundwater levels in the confined aquifer system in response to future pumping. Because the transient release of water stored in low-permeability confining units is simulated, drawdowns resulting from simulated pumping stresses may change substantially through time before reaching steady state. Consequently, transient simulations of water levels at different future times will be more accurate than a steady-state simulation for evaluating probable future aquifer-system responses to proposed pumping.
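The calibration-fit statistic reported above, root mean square error between simulated and observed water levels, can be sketched as follows. The sample values are invented for illustration; the actual calibration used UCODE-2005 and PEST against 7,183 observations.

```python
import math

def rmse(simulated, observed):
    """Root mean square error over paired simulated/observed values."""
    residuals = [s - o for s, o in zip(simulated, observed)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Hypothetical water levels (feet) at four observation wells.
sim = [10.2, 8.7, 15.1, 9.9]
obs = [10.0, 9.0, 15.0, 10.4]
fit = rmse(sim, obs)
```

A small RMSE relative to the range of observed heads (3.6 feet in the study) is one line of evidence for model adequacy, alongside residual bias checks.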
Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B
2008-01-01
Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense: it is capable of accommodating the modeling of multiple risk factors considered in offshore operations and can deal with different types of data that may come from different sources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and a Bayesian Network (BN) is tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels are the Root causes level, Trigger events level, Incidents level, Accidents level, and Consequences level. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities regarding the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BN can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs, which are capable of providing a graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. The Bayesian inference mechanism also makes it possible to monitor how a safety situation changes as information flows forwards and backwards within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific. 
"Swiss cheese" model is such a theoretic framework that it is based on solid behavioral theory and therefore can be used to provide industry with a roadmap for BN modeling and implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
A QoS Framework with Traffic Request in Wireless Mesh Network
NASA Astrophysics Data System (ADS)
Fu, Bo; Huang, Hejiao
In this paper, we consider major issues in ensuring greater Quality-of-Service (QoS) in Wireless Mesh Networks (WMNs), specifically with regard to reliability and delay. To this end, we use traffic requests to record the QoS requirements of data flows. In order to achieve the required QoS for all data flows efficiently and with high portability, we develop a Network State Update Algorithm. All assumptions, definitions, and algorithms are made exclusively with WMNs in mind, guaranteeing the portability of our framework to various WMN environments. The simulation results demonstrate that our framework is correct.
NASA Astrophysics Data System (ADS)
2018-05-01
Eigenvalues and eigenvectors together constitute the eigenstructure of a system. The design of vibrating systems aimed at satisfying specifications on eigenvalues and eigenvectors, commonly known as eigenstructure assignment, has drawn increasing interest over recent years. The most natural mathematical framework for such problems is constituted by inverse eigenproblems, which consist of determining the system model that features a desired set of eigenvalues and eigenvectors. Although such a problem is intrinsically challenging, several solutions have been proposed in the literature. The approaches to eigenstructure assignment can basically be divided into passive control and active control.
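A minimal inverse-eigenproblem sketch under strong simplifying assumptions: for a 2×2 first-order system with a full set of independent eigenvectors, the system matrix with the desired eigenstructure is M = V Λ V⁻¹. The helper names and numbers are illustrative; real eigenstructure assignment for vibrating systems works with mass/stiffness/damping matrices and feasibility constraints far beyond this sketch.

```python
def inv2(m):
    """Inverse of a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul2(x, y):
    """Product of two 2x2 matrices."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def assign_eigenstructure(eigvals, eigvecs):
    """Return M = V diag(eigvals) V^-1, so that M v_i = lambda_i v_i.
    eigvecs are the desired eigenvectors, used as the columns of V."""
    V = [[eigvecs[0][0], eigvecs[1][0]],
         [eigvecs[0][1], eigvecs[1][1]]]
    L = [[eigvals[0], 0.0], [0.0, eigvals[1]]]
    return matmul2(matmul2(V, L), inv2(V))

# Desired: eigenvalue 2 along (1, 0) and eigenvalue 3 along (1, 1).
M = assign_eigenstructure([2.0, 3.0], [(1.0, 0.0), (1.0, 1.0)])
```

The construction fails when the desired eigenvectors are linearly dependent (V is singular), which is one reason practical assignment methods must treat attainability of the eigenstructure explicitly.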
Williams, Lester J.; Kuniansky, Eve L.
2015-04-08
The hydrogeologic framework for the Floridan aquifer system has been revised throughout its extent in Florida and parts of Georgia, Alabama, and South Carolina. The updated framework generally conforms to the original framework established by the U.S. Geological Survey in the 1980s, except for adjustments made to the internal boundaries of the Upper and Lower Floridan aquifers and the individual higher-permeability and contrasting lower-permeability zones within these aquifers. The system behaves as one aquifer over much of its extent, although it is subdivided vertically into two aquifer units, the Upper and Lower Floridan aquifers. In the previous framework, discontinuous numbered middle confining units (MCU I–VII) were used to subdivide the system. In areas where less-permeable rocks do not occur within the middle part of the system, the system was previously considered one aquifer and named the Upper Floridan aquifer. In intervening years, more detailed data have been collected in local areas, resulting in some of the same lithostratigraphic units in the Floridan aquifer system being assigned to the Upper or Lower Floridan aquifer in different parts of the State of Florida. Additionally, some of the numbered middle confining units are found to have hydraulic properties within the same order of magnitude as the aquifers. A new term, "composite unit," is introduced for lithostratigraphic units that cannot be defined as either a confining or aquifer unit over their entire extent. This naming convention is a departure from the previous framework, in that stratigraphy is used to consistently subdivide the aquifer system into upper and lower aquifers across the State of Florida. This lithostratigraphic mapping approach does not change the concept of flow within the system. The revised boundaries of the Floridan aquifer system were mapped by considering results from local studies and regional correlations of lithostratigraphic and hydrogeologic units or zones. 
Additional zones within the aquifers have been incorporated into the framework to allow finer delineation of permeability variations within the aquifer system. These additional zones can be used to progressively divide the system for assessing groundwater and surface-water interaction, saltwater intrusion, and offshore movement of groundwater at greater detail if necessary. The lateral extent of the updip boundary of the Floridan aquifer system is modified from previous work based on newer data and inclusion of parts of the updip clastic facies. The carbonate and clastic facies form a gradational sequence, generally characterized by limestone of successively younger units that extend progressively farther updip. Because of the gradational nature of the carbonate-clastic sequence, some of the updip clastic aquifers have been included in the Floridan aquifer system, the Southeastern Coastal Plain aquifer system, or both. Thus, the revised updip limit includes some of these clastic facies. Additionally, the updip limit of the most productive part of the Floridan aquifer system was revised and indicates the approximate updip limit of the carbonate facies. The extent and altitude of the freshwater-saltwater interface in the aquifer system has been mapped to define the freshwater part of the flow system.
Bacles, C F E; Ennos, R A
2008-10-01
Paternity analysis based on microsatellite marker genotyping was used to infer contemporary genetic connectivity by pollen of three population remnants of the wind-pollinated, wind-dispersed tree Fraxinus excelsior, in a deforested Scottish landscape. By deterministically accounting for genotyping error and comparing a range of assignment methods, individual-based paternity assignments were used to derive population-level estimates of gene flow. Pollen immigration into a 300 ha landscape represents between 43 and 68% of effective pollination, depending mostly on the assignment method. Individual male reproductive success is unequal, with 31 of 48 trees fertilizing one seed or more, but only three trees fertilizing more than ten seeds. Spatial analysis suggests a fat-tailed pollen dispersal curve with 85% of detected pollination occurring within 100 m, and 15% spreading between 300 and 1900 m from the source. Identification of immigrating pollen sourced from two neighbouring remnants indicates further effective dispersal at 2900 m. Pollen exchange among remnants is driven by population size rather than geographic distance, with larger remnants acting predominantly as pollen donors, and smaller remnants as pollen recipients. Enhanced wind dispersal of pollen in a barren landscape ensures that the seed produced within the catchment includes genetic material from a wide geographic area. However, gene flow estimates based on analysis of non-dispersed seeds were shown to underestimate realized gene immigration into the remnants by a factor of two, suggesting that predictive landscape conservation requires integrated estimates of post-recruitment gene flow occurring via both pollen and seed.
Interregional flows of ecosystem services: Concepts, typology and four cases
Schröter, Matthias; Koellner, Thomas; Alkemade, Rob; Arnhold, Sebastian; Bagstad, Kenneth J.; Frank, Karin; Erb, Karl-Heinz; Kastner, Thomas; Kissinger, Meidad; Liu, Jianguo; López-Hoffman, Laura; Maes, Joachim; Marques, Alexandra; Martín-López, Berta; Meyer, Carsten; Schulp, Catharina J. E.; Thober, Jule; Wolff, Sarah; Bonn, Aletta
2018-01-01
Conserving and managing global natural capital requires an understanding of the complexity of flows of ecosystem services across geographic boundaries. Failing to understand and to incorporate these flows into national and international ecosystem assessments leads to incomplete and potentially skewed conclusions, impairing society’s ability to identify sustainable management and policy choices. In this paper, we synthesise existing knowledge and develop a conceptual framework for analysing interregional ecosystem service flows. We synthesise the types of such flows, the characteristics of sending and receiving socio-ecological systems, and the impacts of ecosystem service flows on interregional sustainability. Using four cases (trade of certified coffee, migration of northern pintails, flood protection in the Danube watershed, and information on giant pandas), we test the conceptual framework and show how an enhanced understanding of interregional telecouplings in socio-ecological systems can inform ecosystem service-based decision making and governance with respect to sustainability goals.
Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems
Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia; ...
2017-09-05
Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical data, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of a focal mechanism (physics-based approach) and clustering analysis (statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. In conclusion, the efficacy of this inversion is demonstrated through a representative example.
An, Ming-Wen; Lu, Xin; Sargent, Daniel J; Mandrekar, Sumithra J
2015-01-01
A phase II design with an option for direct assignment (stop randomization and assign all patients to experimental treatment based on interim analysis, IA) for a predefined subgroup was previously proposed. Here, we illustrate the modularity of the direct assignment option by applying it to the setting of two predefined subgroups and testing for separate subgroup main effects. We power the 2-subgroup direct assignment option design with 1 IA (DAD-1) to test for separate subgroup main effects, with assessment of power to detect an interaction in a post-hoc test. Simulations assessed the statistical properties of this design compared to the 2-subgroup balanced randomized design with 1 IA, BRD-1. Different response rates for treatment/control in subgroup 1 (0.4/0.2) and in subgroup 2 (0.1/0.2, 0.4/0.2) were considered. The 2-subgroup DAD-1 preserves power and type I error rate compared to the 2-subgroup BRD-1, while exhibiting reasonable power in a post-hoc test for interaction. The direct assignment option is a flexible design component that can be incorporated into broader design frameworks, while maintaining desirable statistical properties, clinical appeal, and logistical simplicity.
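The kind of simulation study described above can be sketched as an empirical power estimate for a two-arm binomial comparison at the quoted subgroup-1 response rates (treatment 0.4 vs. control 0.2). The interim analysis and direct-assignment step are omitted, and the sample size, simulation count, and function names are illustrative, not the paper's actual design parameters.

```python
import random

def simulate_power(p_trt, p_ctl, n_per_arm, n_sims=2000, z_crit=1.96, seed=7):
    """Empirical power of a one-sided two-proportion z-test comparing
    treatment vs. control response rates, by Monte Carlo simulation."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sims):
        trt = sum(rng.random() < p_trt for _ in range(n_per_arm))
        ctl = sum(rng.random() < p_ctl for _ in range(n_per_arm))
        p1, p2 = trt / n_per_arm, ctl / n_per_arm
        pooled = (trt + ctl) / (2 * n_per_arm)
        se = (2 * pooled * (1 - pooled) / n_per_arm) ** 0.5
        if se > 0 and (p1 - p2) / se > z_crit:
            rejections += 1
    return rejections / n_sims

# Subgroup-1 rates from the abstract: 0.4 (treatment) vs. 0.2 (control).
power = simulate_power(0.4, 0.2, n_per_arm=100)
```

Evaluating a design like DAD-1 would layer the interim look and the switch to direct assignment on top of this basic machinery, then compare power and type I error against the balanced randomized design.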
Unifying Temporal and Structural Credit Assignment Problems
NASA Technical Reports Server (NTRS)
Agogino, Adrian K.; Tumer, Kagan
2004-01-01
Single-agent reinforcement learners in time-extended domains and multi-agent systems share a common dilemma known as the credit assignment problem. Multi-agent systems have the structural credit assignment problem of determining the contributions of a particular agent to a common task. In contrast, time-extended single-agent systems have the temporal credit assignment problem of determining the contribution of a particular action to the quality of the full sequence of actions. Traditionally these two problems are considered different and are handled in separate ways. In this article we show how these two forms of the credit assignment problem are equivalent. In this unified framework, a single-agent Markov decision process can be broken down into a single-time-step multi-agent process. Furthermore, we show that Monte-Carlo estimation or Q-learning (depending on whether the values of resulting actions in the episode are known at the time of learning) are equivalent to different agent utility functions in a multi-agent system. This equivalence shows how an often neglected issue in multi-agent systems is equivalent to a well-known deficiency in multi-time-step learning and lays the basis for solving time-extended multi-agent problems, where both credit assignment problems are present.
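The equivalence sketched above can be illustrated in miniature: treat each time step of one episode as an "agent," and note that Monte-Carlo estimation credits every such agent with the same global episode return, the analogue of a team-game utility in the multi-agent view. The rewards below are invented, and this sketch shows only the credit-assignment side, not the learning updates.

```python
def episode_return(rewards, gamma=1.0):
    """Discounted return of one episode (undiscounted by default)."""
    return sum(r * gamma**t for t, r in enumerate(rewards))

def monte_carlo_credit(rewards):
    """Assign each per-timestep 'agent' the full episode return,
    mirroring the shared team utility in the multi-agent framing."""
    g = episode_return(rewards)
    return [g] * len(rewards)

# One episode of four steps; every step is credited with the whole return.
credits = monte_carlo_credit([0.0, 1.0, 0.0, 2.0])
```

The known deficiency this exposes is that a step's credit says nothing about its individual contribution; Q-learning-style bootstrapped targets correspond to a different, more localized utility function in the same framing.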
Marchigiano, Gail; Eduljee, Nina; Harvey, Kimberly
2011-01-01
Clinical assignments in nursing education provide opportunities for students to develop thinking skills vital to the effective delivery of patient care. The purpose of the present study was to examine students' perceived levels of confidence for using thinking skills when completing two types of clinical assignments. Clinical educators and managers are challenged to develop teaching and learning strategies that help students think critically and reflectively and transfer these skills into sound nursing practice. This study is based on the theoretical framework of critical thinking within the nursing process framework. Undergraduate nursing students (n=51) completed surveys indicating their confidence in using seven thinking skills for nursing care. Students indicated significantly more confidence when implementing the journal format as compared with the care plan format when analysing information, determining relevance, making connections, selecting appropriate information, applying relevant knowledge and evaluating outcomes. The findings of the present study propose a new approach for enhancing students' thinking skills. Journaling is an effective strategy for enhancing students' thinking skills. Nursing managers are in key organisational positions for supporting and promoting the use of the journal format and building supportive and collaborative learning environments for students to develop thinking skills for managing patient care. © 2010 The Authors. Journal compilation © 2010 Blackwell Publishing Ltd.
Inferential Framework for Autonomous Cryogenic Loading Operations
NASA Technical Reports Server (NTRS)
Luchinsky, Dmitry G.; Khasin, Michael; Timucin, Dogan; Sass, Jared; Perotti, Jose; Brown, Barbara
2017-01-01
We address the problem of autonomous cryogenic management of loading operations on the ground and in space. As a step towards a solution of this problem, we develop a probabilistic framework for inferring the correlation parameters of two-fluid cryogenic flow. The simulation of two-phase cryogenic flow is performed using a nearly implicit scheme. A concise set of cryogenic correlations is introduced. The proposed approach is applied to an analysis of the cryogenic flow in the experimental Propellant Loading System built at NASA KSC. An efficient simultaneous optimization of a large number of model parameters is demonstrated, and good agreement with the experimental data is obtained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Orth, Dr. Donald J; Dolloff, Dr. Charles A
2013-01-01
In order for habitat restoration in regulated rivers to be effective at large scales, broadly applicable frameworks are needed that provide measurable objectives and contexts for management. The Ecological Limits of Hydrologic Alteration (ELOHA) framework was created as a template to assess hydrologic alterations, develop relationships between altered streamflow and ecology, and establish environmental flow standards. We tested the utility of ELOHA in informing flow restoration applications for fish and riparian communities in regulated rivers in the Upper Tennessee River Basin (UTRB). We followed the steps of ELOHA to generate flow alteration-ecological response relationships and then determined whether those relationships could predict fish and riparian responses to flow restoration in the Cheoah River, a regulated system within the UTRB. Although ELOHA provided a robust template to construct hydrologic information and predict hydrology for ungaged locations, our results do not support the assertion that over-generalized univariate relationships between flow and ecology can produce results sufficient to guide management in regulated rivers. After constructing multivariate models, we successfully developed predictive relationships between flow alterations and fish/riparian responses. In accordance with model predictions, riparian encroachment displayed consistent decreases with increases in flow magnitude in the Cheoah River; however, fish richness did not increase as predicted four years post-restoration. Our results suggest that altered temperature and substrate and the current disturbance regime may have reduced opportunities for fish species colonization. Our case study highlights the need for interdisciplinary science in defining environmental flows for regulated rivers and the need for adaptive management approaches once flows are restored.
NASA Astrophysics Data System (ADS)
Kellogg, Kevin; Liu, Peiyuan; Lamarche, Casey; Hrenya, Christine
2017-11-01
In flows of cohesive particles, agglomerates will readily form and break. These agglomerates are expected to complicate how particles interact with the surrounding fluid in multiphase flows, and consequently how the solids flow. In this work, a dilute flow of particles driven by gas against gravity is studied. A continuum framework, composed of a population balance to predict the formation of agglomerates and kinetic-theory-based balances, is used to predict the flow of particles. The closures utilized for the birth and death rates due to aggregation and breakage in the population balance take into account how the impact velocity (the granular temperature) affects the outcome of a collision as aggregation, rebound, or breakage. The agglomerate size distribution and solids velocity predicted by the continuum framework are compared to discrete element method (DEM) simulations, as well as to experimental results of particles being entrained from the riser of a fluidized bed. Work supported by Dow Corning Corporation.
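A minimal sketch of the birth/death bookkeeping in a population balance, reduced to two size classes with constant aggregation and breakage rates (the paper's closures instead make these rates depend on granular temperature; all numbers here are made up):

```python
# Two-class population balance: singlets aggregate into doublets at
# rate k_agg * n1**2 and doublets break back at rate k_br * n2.
k_agg, k_br = 0.5, 0.2
n1, n2 = 1.0, 0.0          # number densities of singlets, doublets
dt = 0.001
for _ in range(20000):     # forward-Euler march to steady state
    agg = k_agg * n1 * n1  # collisions of two singlets
    br = k_br * n2
    n1 += dt * (-2.0 * agg + 2.0 * br)
    n2 += dt * (agg - br)
mass = n1 + 2.0 * n2       # total particle count, conserved by construction
print(n1, n2, mass)
```

At steady state aggregation and breakage balance, so the agglomerate fraction is set by the ratio of the two rates, while total particle mass stays at its initial value.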
Delay Banking for Managing Air Traffic
NASA Technical Reports Server (NTRS)
Green, Steve
2008-01-01
Delay banking has been invented to enhance air-traffic management in a way that would increase the degree of fairness in assigning arrival, departure, and en-route delays and trajectory deviations to aircraft impacted by congestion in the national airspace system. In delay banking, an aircraft operator (airline, military, general aviation, etc.) would be assigned a numerical credit when any of their flights are delayed because of an air-traffic flow restriction. The operator could subsequently bid against other operators competing for access to congested airspace to utilize part or all of its accumulated credit. Operators utilize credits to obtain higher priority for the same flight, or other flights operating at the same time, or later, in the same airspace, or elsewhere. Operators could also trade delay credits, according to market rules that would be determined by stakeholders in the national airspace system. Delay banking would be administered by an independent third party who would use delay banking automation to continually monitor flights, allocate delay credits, maintain accounts of delay credits for participating airlines, mediate bidding and the consumption of credits of winning bidders, analyze potential transfers of credits within and between operators, implement accepted transfers, and ensure fair treatment of all participating operators. A flow restriction can manifest itself in the form of a delay in assigned takeoff time, a reduction in assigned airspeed, a change in the position for the aircraft in a queue of all aircraft in a common stream of traffic (e.g., similar route), a change in the planned altitude profile for an aircraft, or change in the planned route for the aircraft. Flow restrictions are typically imposed to mitigate traffic congestion at an airport or in a region of airspace, particularly congestion due to inclement weather, or the unavailability of a runway or region of airspace. 
A delay credit would be allocated to an operator of a flight that has accepted, or upon which was imposed, a flow restriction. The amount of the credit would increase with the amount of delay caused by the flow restriction, the exact amount depending on which of several candidate formulas is eventually chosen. For example, according to one formula, there would be no credit for a delay smaller than some threshold value (e.g., 30 seconds) and the amount of the credit for a longer delay would be set at the amount of the delay minus the threshold value. Optionally, the value of a delay credit could be made to decay with time according to a suitable formula (e.g., an exponential decay). Also, optionally, a transaction charge could be assessed against the value of a delay credit that an operator used on a flight different from the one for which the delay originated or that was traded with a different operator. The delay credits accumulated by a given airline could be utilized in various ways. For example, an operator could enter a bid for priority handling in a new flow restriction that impacts one or more of the operator's flights; if the bid were unsuccessful, all or a portion of the credit would be returned to the bidder. If the bid pertained to a single aircraft that was in a queue, delay credits could be consumed in moving the aircraft to an earlier position within the queue. In the case of a flow restriction involving a choice of alternate routes, planned altitude profile, aircraft spacing, or other non-queue flow restrictions, delay credits could be used to bid for an alternative assignment.
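The candidate credit formula above (threshold, optional exponential decay, optional transaction charge) is easy to sketch; the threshold, half-life, and fee below are illustrative stand-ins, not values from the source:

```python
THRESHOLD_S = 30.0       # no credit for delays below this (assumed)
HALF_LIFE_S = 3600.0     # credit value halves every hour (assumed)
TRANSFER_FEE = 0.10      # charge when a credit is used elsewhere (assumed)

def credit_for_delay(delay_s):
    """Credit earned: the delay minus the threshold, floored at zero."""
    return max(0.0, delay_s - THRESHOLD_S)

def credit_value(initial_credit, age_s, transferred=False):
    """Current value after exponential decay and an optional transfer charge."""
    value = initial_credit * 0.5 ** (age_s / HALF_LIFE_S)
    if transferred:
        value *= 1.0 - TRANSFER_FEE
    return value

c = credit_for_delay(330.0)          # 300 units of creditable delay
print(credit_value(c, 3600.0))       # one hour later: 150.0
print(credit_value(c, 3600.0, transferred=True))   # minus the transfer fee
```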
Fleet Assignment Using Collective Intelligence
NASA Technical Reports Server (NTRS)
Antoine, Nicolas E.; Bieniawski, Stefan R.; Kroo, Ilan M.; Wolpert, David H.
2004-01-01
Airline fleet assignment involves the allocation of aircraft to a set of flight legs in order to meet passenger demand, while satisfying a variety of constraints. Over the course of the day, the routing of each aircraft is determined in order to minimize the number of required flights for a given fleet. The associated flow continuity and aircraft count constraints have led researchers to focus on obtaining quasi-optimal solutions, especially at larger scales. In this paper, the authors propose the application of an agent-based integer optimization algorithm to a "cold start" fleet assignment problem. Results show that the optimizer can successfully solve such highly-constrained problems (129 variables, 184 constraints).
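A "cold start" assignment at toy scale can be solved by brute-force enumeration, which makes the demand-coverage and aircraft-count constraints concrete (this is not the paper's agent-based optimizer; the fleets, costs, and demands are invented):

```python
from itertools import product

# Assign one of two fleet types to each of three flight legs, minimizing
# cost while covering demand and respecting the available aircraft count.
legs_demand = [120, 80, 150]                        # passengers per leg
fleet = {"A320": (150, 9.0), "E190": (100, 6.0)}    # type: (seats, cost units)
available = {"A320": 2, "E190": 2}

best = None
for choice in product(fleet, repeat=len(legs_demand)):
    if any(fleet[f][0] < d for f, d in zip(choice, legs_demand)):
        continue                                    # demand not covered
    if any(choice.count(f) > available[f] for f in fleet):
        continue                                    # not enough aircraft
    cost = sum(fleet[f][1] for f in choice)
    if best is None or cost < best[0]:
        best = (cost, choice)
print(best)    # cheapest feasible assignment and its cost
```

Enumeration grows exponentially with the number of legs, which is exactly why problems at the 129-variable scale need integer-programming or agent-based methods.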
Evolutionary squeaky wheel optimization: a new framework for analysis.
Li, Jingpeng; Parkes, Andrew J; Burke, Edmund K
2011-01-01
Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration SWO does a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding does mean that SWO is generally effective at diversification but can suffer from a relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve the intensification by keeping the good components of solutions and only using SWO to reconstruct other poorer components of the solution. In such algorithms a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used in local search (such as simulated annealing) which only move through complete assignments. Generally, the exact details of ESWO will depend on various heuristics; so we focus our approach on a case of ESWO that we call ESWO-II and that has probabilistic as opposed to heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary probability distribution over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
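The stationary distribution that the ESWO-II analysis computes explicitly can be obtained numerically for any small Markov chain as the left eigenvector of the transition matrix for eigenvalue 1; the matrix below is made up for illustration, not taken from the paper's instance:

```python
import numpy as np

# Row-stochastic transition matrix over three search states.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Stationary distribution pi satisfies pi @ P = pi, i.e. pi is the
# left eigenvector of P (eigenvector of P.T) for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()          # normalize to a probability distribution
print(pi)
```

Checking whether such probabilities increase monotonically with state fitness is then a matter of sorting `pi` by the fitness values of the states.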
PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*
Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh
2016-01-01
Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314
Benninger, Elizabeth; Savahl, Shazly
2016-01-01
This study aimed to explore how children construct and assign meaning to the "self" within two urban communities of Cape Town in South Africa. Using a child participation methodological framework data were collected using Photovoice and community maps with 54 participants between the ages of 9 and 12. Feelings of safety, social connectedness, and children's spaces were found to be central to the ways in which the participants constructed and assigned meaning to the "self." The study provides implications for intervention programmes aimed at improving children's well-being to be inclusive of activities aimed at improving children's self-concept, including the construction of safe spaces for children to play, learn, and form meaningful relationships.
Using MODFLOW drains to simulate groundwater flow in a karst environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, J.; Tomasko, D.; Glennon, M.A.
1998-07-01
Modeling groundwater flow in a karst environment is both numerically challenging and highly uncertain because of potentially complex flowpaths and a lack of site-specific information. This study presents the results of MODFLOW numerical modeling in which drain cells in a finite-difference model are used as analogs for preferential flowpaths or conduits in karst environments. In this study, conduits in mixed-flow systems are simulated by assigning connected pathways of drain cells from the locations of tracer releases, sinkholes, or other karst features to outlet springs along inferred flowpaths. These paths are determined by the locations of losing stream segments, ephemeral stream beds, geophysical surveys, fracture lineaments, or other surficial characteristics, combined with the results of dye traces. The elevations of the drains at the discharge ends of the inferred flowpaths are estimated from field data and are adjusted when necessary during model calibration. To simulate flow in a free-flowing conduit, a high conductance is assigned to each drain to eliminate the need for drain-specific information that would be very difficult to obtain. Calculations were performed for a site near Hohenfels, Germany. The potentiometric surface produced by the simulations agreed well with field data. The head contours in the vicinity of the karst features behaved in a manner consistent with a flow system having both diffuse and conduit components, and the sum of the volumetric flow out of the drain cells agreed closely with spring discharges and stream flows. Because of the success of this approach, it is recommended for regional studies in which little site-specific information (e.g., location, number, size, and conductivity of fractures and conduits) is available, and general flow characteristics are desired.
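The drain-cell behaviour relied on here follows MODFLOW's drain (DRN) package: discharge is proportional to the head above the drain elevation and zero when the head falls below it. A sketch with illustrative numbers:

```python
# MODFLOW DRN-style drain: removes water at a rate proportional to the
# head above the drain elevation, and nothing when head is below it.
def drain_discharge(head, drain_elev, conductance):
    """Flow out of the aquifer cell [L^3/T]; zero if head < drain elevation."""
    return conductance * max(0.0, head - drain_elev)

print(drain_discharge(head=105.0, drain_elev=100.0, conductance=1e5))  # 500000.0
print(drain_discharge(head=99.0, drain_elev=100.0, conductance=1e5))   # 0.0
```

With a very high conductance, as assigned in the study, even a small excess head drives a large discharge, so the simulated head stays pinned near the drain elevation and the cell behaves like a free-flowing conduit.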
Smart licensing and environmental flows: Modeling framework and sensitivity testing
NASA Astrophysics Data System (ADS)
Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.
2011-12-01
Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost-effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands-off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
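A "smarter" licence of the kind described (a hands-off flow plus rising-block abstraction rules) can be sketched as a step function of river flow; the thresholds and block sizes below are invented, not the River Itchen values:

```python
# Hands-off flow: no abstraction allowed below this river flow.
HANDS_OFF = 2.0                                  # m3/s (assumed)
# Rising blocks: (flow threshold, allowed abstraction) pairs, ascending.
BLOCKS = [(2.0, 0.1), (4.0, 0.3), (8.0, 0.6)]    # m3/s (assumed)

def allowed_abstraction(river_flow):
    """Permitted abstraction rate for the current river flow."""
    allowed = 0.0
    for threshold, take in BLOCKS:
        if river_flow >= threshold:
            allowed = take                       # highest block reached wins
    return allowed if river_flow >= HANDS_OFF else 0.0

print(allowed_abstraction(1.5))    # 0.0 (below the hands-off flow)
print(allowed_abstraction(5.0))    # 0.3
print(allowed_abstraction(10.0))   # 0.6
```

Compared with a fixed seasonal limit, this rule automatically protects e-flows in droughts while letting abstractors take more when the river can spare it.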
Ansara, Y Gavriel
2015-10-01
Recent Australian legislative and policy changes can benefit people of trans and/or non-binary experience (e.g. men assigned female with stereotypically 'female' bodies, women assigned male with stereotypically 'male' bodies, and people who identify as genderqueer, agender [having no gender], bi-gender [having two genders] or another gender option). These populations often experience cisgenderism, which previous research defined as 'the ideology that invalidates people's own understanding of their genders and bodies'. Some documented forms of cisgenderism include pathologising (treating people's genders and bodies as disordered) and misgendering (disregarding people's own understanding and classifications of their genders and bodies). This system of classifying people's lived experiences of gender and body invalidation is called the cisgenderism framework. Applying the cisgenderism framework in the ageing and aged care sector can enhance service providers' ability to meet the needs of older people of trans and/or non-binary experience. © 2015 AJA Inc.
NASA Astrophysics Data System (ADS)
Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.
2018-05-01
Russian higher education institutions' tradition of teaching large-enrollment classes impairs students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance the similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and the chance of oversight. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.
LAV@HAZARD: a Web-GIS Framework for Real-Time Forecasting of Lava Flow Hazards
NASA Astrophysics Data System (ADS)
Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.
2014-12-01
Crucial to lava flow hazard assessment is the development of tools for real-time prediction of flow paths, flow advance rates, and final flow lengths. Accurate prediction of flow paths and advance rates requires not only rapid assessment of eruption conditions (especially effusion rate) but also improved models of lava flow emplacement. Here we present the LAV@HAZARD web-GIS framework, which combines spaceborne remote sensing techniques and numerical simulations for real-time forecasting of lava flow hazards. By using satellite-derived discharge rates to drive a lava flow emplacement model, LAV@HAZARD allows timely definition of parameters and maps essential for hazard assessment, including the propagation time of lava flows and the maximum run-out distance. We take advantage of the flexibility of the HOTSAT thermal monitoring system to process satellite images coming from sensors with different spatial, temporal and spectral resolutions. HOTSAT was designed to ingest infrared satellite data acquired by the MODIS and SEVIRI sensors to output hot spot location, lava thermal flux and discharge rate. We use LAV@HAZARD to merge this output with the MAGFLOW physics-based model to simulate lava flow paths and to update, in a timely manner, flow simulations. Thus, any significant changes in lava discharge rate are included in the predictions. A significant benefit in terms of computational speed was obtained thanks to the parallel implementation of MAGFLOW on graphic processing units (GPUs). All this useful information has been gathered into the LAV@HAZARD platform which, due to the high degree of interactivity, allows generation of easily readable maps and a fast way to explore alternative scenarios. We will describe and demonstrate the operation of this framework using a variety of case studies pertaining to Mt Etna, Sicily. Although this study was conducted on Mt Etna, the approach used is designed to be applicable to other volcanic areas around the world.
Remais, Justin V; Xiao, Ning; Akullian, Adam; Qiu, Dongchuan; Blair, David
2011-04-01
For many pathogens with environmental stages, or those carried by vectors or intermediate hosts, disease transmission is strongly influenced by pathogen, host, and vector movements across complex landscapes, and thus quantitative measures of movement rate and direction can reveal new opportunities for disease management and intervention. Genetic assignment methods are a set of powerful statistical approaches useful for establishing population membership of individuals. Recent theoretical improvements allow these techniques to be used to cost-effectively estimate the magnitude and direction of key movements in infectious disease systems, revealing important ecological and environmental features that facilitate or limit transmission. Here, we review the theory, statistical framework, and molecular markers that underlie assignment methods, and we critically examine recent applications of assignment tests in infectious disease epidemiology. Research directions that capitalize on use of the techniques are discussed, focusing on key parameters needing study for improved understanding of patterns of disease.
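The core of a genetic assignment test is choosing the candidate population under which an individual's multilocus genotype is most likely. A minimal sketch assuming Hardy-Weinberg proportions and known allele frequencies (all frequencies and genotypes here are invented):

```python
import math

def genotype_loglik(genotype, freqs):
    """Log-likelihood of a diploid genotype under per-locus allele frequencies.

    genotype: list of (allele1, allele2) per locus
    freqs:    list of {allele: frequency} dicts, one per locus
    """
    ll = 0.0
    for (a1, a2), f in zip(genotype, freqs):
        p = f[a1] * f[a2] * (1 if a1 == a2 else 2)  # Hardy-Weinberg probability
        ll += math.log(p)
    return ll

pops = {
    "upstream":   [{"A": 0.9, "a": 0.1}, {"B": 0.8, "b": 0.2}],
    "downstream": [{"A": 0.2, "a": 0.8}, {"B": 0.3, "b": 0.7}],
}
individual = [("A", "A"), ("B", "b")]
assigned = max(pops, key=lambda p: genotype_loglik(individual, pops[p]))
print(assigned)    # the population that best explains the genotype
```

Epidemiological applications repeat this over many sampled vectors or hosts and summarize the inferred origins as movement rates and directions between sites.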
A regional-scale ecological risk framework for environmental flow evaluations
NASA Astrophysics Data System (ADS)
O'Brien, Gordon C.; Dickens, Chris; Hines, Eleanor; Wepener, Victor; Stassen, Retha; Quayle, Leo; Fouchy, Kelly; MacKenzie, James; Graham, P. Mark; Landis, Wayne G.
2018-02-01
Environmental flow (E-flow) frameworks advocate holistic, regional-scale, probabilistic E-flow assessments that consider flow and non-flow drivers of change in a socio-ecological context as best practice. Regional-scale ecological risk assessments of multiple stressors to social and ecological endpoints, which address ecosystem dynamism, have been undertaken internationally at different spatial scales using the relative-risk model since the mid-1990s. With the recent incorporation of Bayesian belief networks into the relative-risk model, a robust regional-scale ecological risk assessment approach is available that can contribute to achieving the best practice recommendations of E-flow frameworks. PROBFLO is a holistic E-flow assessment method that incorporates the relative-risk model and Bayesian belief networks (BN-RRM) into a transparent probabilistic modelling tool that addresses uncertainty explicitly. PROBFLO has been developed to evaluate the socio-ecological consequences of historical, current and future water resource use scenarios and generate E-flow requirements on regional spatial scales. The approach has been implemented in two regional-scale case studies in Africa where its flexibility and functionality has been demonstrated. In both case studies the evidence-based outcomes facilitated informed environmental management decision making, with trade-off considerations in the context of social and ecological aspirations. This paper presents the PROBFLO approach as applied to the Senqu River catchment in Lesotho and further developments and application in the Mara River catchment in Kenya and Tanzania. The 10 BN-RRM procedural steps incorporated in PROBFLO are demonstrated with examples from both case studies. PROBFLO can contribute to the adaptive management of water resources, informing the allocation of resources for their sustainable use while addressing protection requirements.
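How a Bayesian network propagates stressor probabilities to an endpoint risk can be shown with a toy two-stressor fragment; the structure and conditional probabilities below are illustrative, not PROBFLO's model:

```python
# Scenario beliefs for two stressor nodes (illustrative values).
p_flow_altered = 0.7           # P(flow regime altered)
p_quality_poor = 0.4           # P(water quality poor)

# Conditional probability table: P(endpoint degraded | flow, quality).
cpt = {(True, True): 0.9, (True, False): 0.6,
       (False, True): 0.5, (False, False): 0.1}

# Marginal risk: sum the CPT over all stressor states, weighted by
# their scenario probabilities.
risk = sum(
    cpt[(f, q)]
    * (p_flow_altered if f else 1.0 - p_flow_altered)
    * (p_quality_poor if q else 1.0 - p_quality_poor)
    for f in (True, False) for q in (True, False)
)
print(round(risk, 3))
```

Re-running such a calculation under alternative water-use scenarios is what lets a BN-RRM tool compare risks and expose trade-offs explicitly.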
NASA Technical Reports Server (NTRS)
Wendel, Thomas R.; Boland, Joseph R.; Hahne, David E.
1991-01-01
Flight-control laws are developed for a wind-tunnel aircraft model flying at a high angle of attack by using a synthesis technique called direct eigenstructure assignment. The method employs flight guidelines and control-power constraints to develop the control laws, and gain schedules and nonlinear feedback compensation provide a framework for considering the nonlinear nature of the attack angle. Linear and nonlinear evaluations show that the control laws are effective, a conclusion that is further confirmed by a scale model used for free-flight testing.
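Eigenstructure assignment generalizes pole placement; for a 2-state, single-input toy model the feedback gain can be found by matching characteristic-polynomial coefficients. The dynamics and desired poles below are invented for illustration and are not the wind-tunnel model:

```python
import numpy as np

# Toy open-loop dynamics x' = A x + B u.
A = np.array([[0.0, 1.0],
              [0.0, -1.0]])
B = np.array([[0.0],
              [1.0]])

# With K = [k1, k2], A - B K = [[0, 1], [-k1, -1 - k2]], whose
# characteristic polynomial is s^2 + (1 + k2) s + k1. Placing the
# closed-loop poles at -2 +/- 1j (i.e. s^2 + 4s + 5) gives:
k1, k2 = 5.0, 3.0
K = np.array([[k1, k2]])
poles = np.linalg.eigvals(A - B @ K)
print(np.sort_complex(poles))    # poles at -2 +/- 1j
```

Full eigenstructure assignment additionally shapes the eigenvectors, which is what lets the control-power and flying-qualities constraints be folded into the gain design.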
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.
Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited.
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. - Highlights: • Proposed a physics-informed framework to quantify uncertainty in RANS simulations. • Framework incorporates physical prior knowledge and observation data. • Based on a rigorous Bayesian framework yet fully utilizes physical model. • Applicable for many complex physical systems beyond turbulent flows.
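The ensemble Kalman analysis step at the heart of such frameworks can be sketched for a single scalar parameter (the paper uses an iterative variant acting on Reynolds-stress parameterizations; everything below is a toy stand-in):

```python
import numpy as np

# Prior ensemble of a scalar parameter, deliberately centered away from
# the "truth" that the observation reflects.
rng = np.random.default_rng(1)
ens = rng.normal(0.0, 2.0, size=200)       # prior ensemble members
obs, obs_var = 3.0, 0.25                   # observation and its variance

# Kalman gain from ensemble statistics, then update each member against
# a perturbed copy of the observation (stochastic EnKF).
P = ens.var()
K = P / (P + obs_var)
perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=ens.size)
analysis = ens + K * (perturbed - ens)
print(analysis.mean())                     # pulled toward obs = 3.0
```

The analysis ensemble is both shifted toward the data and narrower than the prior; iterating this step with a forward model in the loop is what handles the nonlinear parameter-to-velocity map.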
Cultural Pedagogy and Bridges to Literacy: Home and Kindergarten.
ERIC Educational Resources Information Center
Korat, Ofra
2001-01-01
Presents five key aspects of cultural pedagogy theory: social interactions, self-identity, externalization of inner thought, educational institutions, and narratives. Views these aspects as critical vehicles to fostering the child's literacy development. Notes that within this framework, great importance has been assigned to the cooperative link…
Causal Responsibility and Counterfactuals
ERIC Educational Resources Information Center
Lagnado, David A.; Gerstenberg, Tobias; Zultan, Ro'i
2013-01-01
How do people attribute responsibility in situations where the contributions of multiple agents combine to produce a joint outcome? The prevalence of over-determination in such cases makes this a difficult problem for counterfactual theories of causal responsibility. In this article, we explore a general framework for assigning responsibility in…
ERIC Educational Resources Information Center
Mathena, Traci Johnson
2000-01-01
Middle school teacher describes a framework that gives inexperienced, anxious writers the confidence to write. The process, called doing prompts, stems from analyzing prompts or writing assignments that outline the topic for a piece of writing. The process involves analyzing the prompt being called for, completing a graphic organizer, composing…
Social Work Internship in Public Housing: An Interdisciplinary Experience
ERIC Educational Resources Information Center
Kurren, Oscar; Lister, Paul
1976-01-01
Principles shaping the focus of the social work internship program at the University of Hawaii included: an interdisciplinary framework providing for faculty and student development from the Schools of Public Health, Engineering, Architecture, Business Administration, and Social Work; and total responsibility for task assignment, affording…
Methods for compressible multiphase flows and their applications
NASA Astrophysics Data System (ADS)
Kim, H.; Choe, Y.; Kim, H.; Min, D.; Kim, C.
2018-06-01
This paper presents an efficient and robust numerical framework to deal with multiphase real-fluid flows and their broad spectrum of engineering applications. A homogeneous mixture model incorporated with a real-fluid equation of state and a phase change model is considered to calculate complex multiphase problems. As robust and accurate numerical methods to handle multiphase shocks and phase interfaces over a wide range of flow speeds, the AUSMPW+_N and RoeM_N schemes with a system preconditioning method are presented. These methods are assessed by extensive validation problems with various types of equation of state and phase change models. Representative realistic multiphase phenomena, including the flow inside a thermal vapor compressor, pressurization in a cryogenic tank, and unsteady cavitating flow around a wedge, are then investigated as application problems. With appropriate physical modeling followed by robust and accurate numerical treatments, compressible multiphase flow physics such as phase changes, shock discontinuities, and their interactions are well captured, confirming the suitability of the proposed numerical framework to wide engineering applications.
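One widely used "real-fluid" closure in compressible multiphase solvers is the stiffened-gas equation of state; the paper does not state that it uses exactly this form, so take the sketch below, with textbook liquid-water constants, as illustrative:

```python
# Stiffened-gas EOS: p = (gamma - 1) * rho * e - gamma * p_inf.
# Constants are typical textbook values for liquid water (assumed).
GAMMA = 4.4
P_INF = 6.0e8      # Pa

def pressure(rho, e):
    """Pressure from density [kg/m^3] and specific internal energy [J/kg]."""
    return (GAMMA - 1.0) * rho * e - GAMMA * P_INF

def sound_speed(rho, p):
    """Speed of sound, c = sqrt(gamma * (p + p_inf) / rho)."""
    return (GAMMA * (p + P_INF) / rho) ** 0.5

c = sound_speed(1000.0, 101325.0)   # liquid water at ~1 atm
print(c)                            # ~1.6 km/s, near water's sound speed
```

The p_inf term keeps the sound speed finite and large in the liquid, which is what makes flux schemes and preconditioning for low-speed multiphase flow nontrivial.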
Brewer, Shannon; McManamay, Ryan A.; Miller, Andrew D.; ...
2016-05-13
Environmental flows represent a legal mechanism to balance existing and future water uses and sustain non-use values. Here, we identify current challenges, provide examples where they are important, and suggest research advances that would benefit environmental flow science. Specifically, environmental flow science would benefit by (1) developing approaches to address streamflow needs in highly modified landscapes where historic flows do not provide reasonable comparisons, (2) integrating water quality needs where interactions are apparent with quantity but not necessarily the proximate factor of the ecological degradation, especially as frequency and magnitudes of inflows to bays and estuaries, (3) providing a better understanding of the ecological needs of native species to offset the often unintended consequences of benefiting non-native species or their impact on flows, (4) improving our understanding of the non-use economic value to balance consumptive economic values, and (5) increasing our understanding of the stakeholder socioeconomic spatial distribution of attitudes and perceptions across the landscape. Environmental flow science is still an emerging interdisciplinary field and by integrating socioeconomic disciplines and developing new frameworks to accommodate our altered landscapes, we should help advance environmental flow science and likely increase successful implementation of flow standards.
Fully nonlinear theory of transcritical shallow-water flow past topography
NASA Astrophysics Data System (ADS)
El, Gennady; Grimshaw, Roger; Smyth, Noel
2010-05-01
In this talk, recent results on the generation of undular bores in one-dimensional fully nonlinear shallow-water flows past localised topographies will be presented. The description is made in the framework of the forced Su-Gardner (a.k.a. 1D Green-Naghdi) system of equations, with a primary focus on the transcritical regime when the Froude number of the oncoming flow is close to unity. A combination of the local transcritical hydraulic solution over the localized topography, which produces upstream and downstream hydraulic jumps, and unsteady undular bore solutions describing the resolution of these hydraulic jumps, is used to describe various flow regimes depending on the combination of the topography height and the Froude number. We take advantage of the recently developed modulation theory of Su-Gardner undular bores to derive the main parameters of transcritical fully nonlinear shallow-water flow, such as the leading solitary wave amplitudes for the upstream and downstream undular bores, the speeds of the undular bore edges, and the drag force. Our results confirm that most of the features of the previously developed description in the framework of the uni-directional forced KdV model hold up qualitatively for finite amplitude waves, while the quantitative description can be obtained in the framework of the bi-directional forced Su-Gardner system.
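For reference, the transcritical regime invoked above is defined through the depth Froude number of the oncoming flow, where U is the flow speed, h the undisturbed water depth, and g the gravitational acceleration:

```latex
% Depth Froude number and the transcritical condition for
% shallow-water flow past topography:
F = \frac{U}{\sqrt{g\,h}}, \qquad F \approx 1 \quad \text{(transcritical regime)}
```

When F is close to unity the flow cannot adjust smoothly over the obstacle, producing the upstream and downstream hydraulic jumps that the undular bore solutions resolve.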
Transport induced by mean-eddy interaction: II. Analysis of transport processes
NASA Astrophysics Data System (ADS)
Ide, Kayo; Wiggins, Stephen
2015-03-01
We present a framework for the analysis of transport processes resulting from the mean-eddy interaction in a flow. The framework is based on the Transport Induced by the Mean-Eddy Interaction (TIME) method presented in a companion paper (Ide and Wiggins, 2014) [1]. The TIME method estimates the (Lagrangian) transport across stationary (Eulerian) boundaries defined by chosen streamlines of the mean flow. Our framework begins with a sequence of preparatory steps that link the flow dynamics to the transport processes. This includes the construction of the so-called "instantaneous flux" as a Hovmöller diagram. Transport processes are studied by linking the signals of the instantaneous flux field to the dynamical variability of the flow. This linkage also reveals how the variability of the flow contributes to the transport. The spatio-temporal analysis of the flux diagram can be used to assess the efficiency of the variability in transport processes. We apply the method to the double-gyre ocean circulation model in the situation where the Rossby-wave mode dominates the dynamic variability. The spatio-temporal analysis shows that the inter-gyre transport is controlled by the circulating eddy vortices in the fast eastward jet region, whereas the basin-scale Rossby waves have very little impact.
QUICR-learning for Multi-Agent Coordination
NASA Technical Reports Server (NTRS)
Agogino, Adrian K.; Tumer, Kagan
2006-01-01
Coordinating multiple agents that need to perform a sequence of actions to maximize a system-level reward requires solving two distinct credit assignment problems. First, credit must be assigned for an action taken at time step t that results in a reward at a later time step t′ > t. Second, credit must be assigned for the contribution of agent i to the overall system performance. The first credit assignment problem is typically addressed with temporal difference methods such as Q-learning. The second credit assignment problem is typically addressed by creating custom reward functions. To address both credit assignment problems simultaneously, we propose "Q Updates with Immediate Counterfactual Rewards-learning" (QUICR-learning), designed to improve both the convergence properties and performance of Q-learning in large multi-agent problems. QUICR-learning is based on previous work on single-time-step counterfactual rewards described by the collectives framework. Results on a traffic congestion problem show that QUICR-learning is significantly better than a Q-learner using collectives-based (single-time-step counterfactual) rewards. In addition, QUICR-learning provides significant gains over conventional and local Q-learning. Additional results on a multi-agent grid-world problem show that the improvements due to QUICR-learning are not domain specific and can provide up to a tenfold increase in performance over existing methods.
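The single-time-step counterfactual reward from the collectives framework, which QUICR-learning extends to the temporal credit-assignment problem, can be sketched as follows. The congestion-style global reward, the default action, and all parameter values below are hypothetical illustrations, not the paper's experimental setup.

```python
# Sketch of the single-time-step counterfactual ("difference") reward
# that QUICR-learning builds on, driving an ordinary tabular Q-update.
# The toy congestion reward is a hypothetical illustration.

def difference_reward(global_reward, joint_action, i, default_action=0):
    """Agent i's counterfactual reward: G(z) minus G evaluated with
    agent i's action replaced by a fixed default action."""
    counterfactual = list(joint_action)
    counterfactual[i] = default_action
    return global_reward(joint_action) - global_reward(counterfactual)

def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """Standard Q-learning update applied to the shaped reward."""
    best_next = max(Q[next_state]) if next_state in Q else 0.0
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

def total_congestion(actions):
    """Hypothetical system-level reward: quadratic congestion cost on route 1."""
    return -actions.count(1) ** 2

# Agent 0's marginal impact: G([1,1,0]) - G([0,1,0]) = -4 - (-1) = -3
d0 = difference_reward(total_congestion, [1, 1, 0], i=0)
Q = {"s": [0.0, 0.0]}
q_update(Q, "s", action=1, reward=d0, next_state="s")
```

Because the counterfactual subtracts out what the system would have earned anyway, each agent's reward is sensitive mainly to its own action, which is the property the paper exploits for learning in large multi-agent systems.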
Marine methane cycle simulations for the period of early global warming
NASA Astrophysics Data System (ADS)
Elliott, Scott; Maltrud, Mathew; Reagan, Matthew; Moridis, George; Cameron-Smith, Philip
2011-03-01
Geochemical environments, fates, and effects are modeled for methane released into seawater by the decomposition of climate-sensitive clathrates. A contemporary global background cycle is first constructed, within the framework of the Parallel Ocean Program. Input from organics in the upper thermocline is related to oxygen levels, and microbial consumption is parameterized from available rate measurements. Seepage into bottom layers is then superimposed, representing typical seabed fluid flow. The resulting CH4 distribution is validated against surface saturation ratios, vertical sections, and slope plume studies. Injections of clathrate-derived methane are explored by distributing a small number of point sources around the Arctic continental shelf, where stocks are extensive and susceptible to instability during the first few decades of global warming. Isolated bottom cells are assigned dissolved gas fluxes from porous-media simulation. Given the present bulk removal pattern, methane does not penetrate far from emission sites. Accumulated effects, however, spread to the regional scale following the modeled current system. Both hypoxification and acidification are documented. Sensitivity studies illustrate a potential for material restrictions to broaden the perturbations, since methanotrophic consumers require nutrients and trace metals. When such factors are considered, methane buildup within the Arctic basin is enhanced. However, freshened polar surface waters act as a barrier to atmospheric transfer, diverting products into the deep return flow. Uncertainties in the logic and calculations are enumerated, including those inherent in high-latitude clathrate abundance, buoyant effluent rise through the column, representation of the general circulation, and bacterial growth kinetics.
A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition
NASA Technical Reports Server (NTRS)
Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.
2012-01-01
A simulation framework based on the Memory-Mapped-Files technique was created to operate multiple numerical processes in locked time steps and exchange I/O data synchronously among them to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of Mode-Transition Control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate the scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.
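The memory-mapped-file exchange underlying the framework can be illustrated in miniature: a writer process packs its time-step index and output variables into a shared mapping, and a reader mapping the same file unpacks them. The record layout and file name below are hypothetical sketch details, not the NASA framework's actual protocol; both roles are shown in one process for brevity.

```python
import mmap
import struct

# Minimal sketch of data exchange through a memory-mapped file, the
# mechanism used to pass I/O between lock-stepped simulation processes.
# Record layout (step index + one flow variable) is illustrative only.

RECORD = struct.Struct("<id")          # int32 step index, float64 value

with open("channel.bin", "wb") as f:   # back the mapping with a file
    f.write(b"\x00" * RECORD.size)

with open("channel.bin", "r+b") as f:
    mm = mmap.mmap(f.fileno(), RECORD.size)
    # "writer" side: publish step 3 with, say, a Mach number of 0.85
    mm[:RECORD.size] = RECORD.pack(3, 0.85)
    # "reader" side: a second process mapping the same file sees this
    step, mach = RECORD.unpack(mm[:RECORD.size])
    mm.close()
```

A real lock-step scheme would add a synchronization element (e.g., a ready flag or semaphore) so the reader only consumes a record after the writer has finished the time step.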
Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support
NASA Astrophysics Data System (ADS)
Djokic, D.; Noman, N.; Kopp, S.
2015-12-01
Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open-source Python tools that build on core ArcGIS functionality and use geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications. The poster presents the use of this framework to downscale global hydrologic models to the local hydraulic scale, post-process the hydraulic modeling results, and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based, scale-dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available, they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of the potential floods on the local community through simple, end-user products.
This framework has been successfully used both in data-rich environments and in locales with minimal available spatial and hydrographic data.
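The rating-curve step above, relating a routed flow to a flood stage, amounts to a table lookup with interpolation. A minimal sketch follows, with hypothetical (flow, stage) pairs standing in for a real gauge's curve:

```python
# Converting a routed flow to a flood stage via a rating curve, as in
# the toolkit's floodplain step. The (flow, stage) pairs are hypothetical
# placeholders, not values from the actual framework.

def stage_from_flow(flow_cms, rating_flows, rating_stages):
    """Piecewise-linear rating-curve lookup: flow [m^3/s] -> stage [m].
    Clamps to the curve's endpoints outside the tabulated range."""
    if flow_cms <= rating_flows[0]:
        return rating_stages[0]
    if flow_cms >= rating_flows[-1]:
        return rating_stages[-1]
    for q0, q1, s0, s1 in zip(rating_flows, rating_flows[1:],
                              rating_stages, rating_stages[1:]):
        if q0 <= flow_cms <= q1:
            return s0 + (s1 - s0) * (flow_cms - q0) / (q1 - q0)

flows  = [10.0, 50.0, 200.0, 800.0]   # hypothetical discharges, m^3/s
stages = [0.5, 1.2, 2.8, 5.0]         # hypothetical stages, m

depth = stage_from_flow(125.0, flows, stages)
```

Subtracting the resulting stage from terrain elevations on the local DEM then yields the depth and extent grid for each forecast time step.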
An entropy-based analysis of lane changing behavior: An interactive approach.
Kosun, Caglar; Ozdemir, Serhan
2017-05-19
As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs lane changing behavior in traffic flow in accordance with long-range vehicular interactions and traffic safety. The nonadditive entropy framework is the new generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach to drivers' lane changing behavior is illustrated through the traffic flow scenarios presented in the article. According to the traffic flow scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to the nonadditive entropy domain; the rest are involved in the additive entropy domain. Driving behaviors are extracted, and the scenarios depict that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in the nonadditive domain, where the long-range interactions are present. However, the uncooperative traffic system falls into the additivity domain. The analyses also state that there would be possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both the long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain.
It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas the additive entropy region, whether extensive or nonextensive, would match discretionary lane changing behavior. This article states that driver behaviors would need to lie in the nonadditive entropy domain to provide a safe traffic stream, with vehicle accident prevention in mind.
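The nonadditive entropy in question is the Tsallis entropy, S_q = (1 − Σ_i p_i^q)/(q − 1), which reduces to the additive Boltzmann-Gibbs/Shannon entropy in the limit q → 1. A minimal numeric sketch (the distribution and q values are illustrative):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1): nonadditive for
    q != 1, reducing to the Shannon (additive) entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.5]
s_shannon = tsallis_entropy(p, 1.0)   # ln 2
s_tsallis = tsallis_entropy(p, 2.0)   # (1 - 0.5) / 1 = 0.5
```

For independent subsystems the q != 1 entropy composes as S_q(A+B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B), which is the nonadditivity the article associates with long-range vehicular interactions.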
D'Agnese, Frank A.; O'Brien, G. M.; Faunt, C.C.; Belcher, W.R.; San Juan, C.
2002-01-01
In the early 1990s, two numerical models of the Death Valley regional ground-water flow system were developed by the U.S. Department of Energy. In general, the two models were based on the same basic hydrogeologic data set. In 1998, the U.S. Department of Energy requested that the U.S. Geological Survey develop and maintain a ground-water flow model of the Death Valley region in support of U.S. Department of Energy programs at the Nevada Test Site. The purpose of developing this 'second-generation' regional model was to enhance the knowledge and understanding of the ground-water flow system as new information and tools are developed. The U.S. Geological Survey also was encouraged by the U.S. Department of Energy to cooperate to the fullest extent with other Federal, State, and local entities in the region to take advantage of the benefits of their knowledge and expertise. The short-term objective of the Death Valley regional ground-water flow system project was to develop a steady-state representation of the predevelopment conditions of the ground-water flow system utilizing the two geologic interpretations used to develop the previous numerical models. The long-term objective of this project was to construct and calibrate a transient model that simulates the ground-water conditions of the study area over the historical record and utilizes a newly interpreted hydrogeologic conceptual model. This report describes the results of the predevelopment steady-state model construction and calibration. The Death Valley regional ground-water flow system is situated within the southern Great Basin, a subprovince of the Basin and Range physiographic province, bounded by latitudes 35 degrees north and 38 degrees 15 minutes north and by longitudes 115 and 118 degrees west. Hydrology in the region is a result of both the arid climatic conditions and the complex geology.
Ground-water flow generally can be described as dominated by interbasinal flow and may be conceptualized as having two main components: a series of relatively shallow and localized flow paths that are superimposed on deeper regional flow paths. A significant component of the regional ground-water flow is through a thick Paleozoic carbonate rock sequence. Throughout the flow system, ground water flows through zones of high transmissivity that have resulted from regional faulting and fracturing. The conceptual model of the Death Valley regional ground-water flow system used for this study is adapted from the two previous ground-water modeling studies. The three-dimensional digital hydrogeologic framework model developed for the region also contains elements of both of the hydrogeologic framework models used in the previous investigations. As dictated by project scope, very little reinterpretation and refinement were made where these two framework models disagree; therefore, limitations in the hydrogeologic representation of the flow system exist. Despite limitations, the framework model provides the best representation to date of the hydrogeologic units and structures that control regional ground-water flow and serves as an important information source used to construct and calibrate the predevelopment, steady-state flow model. In addition to the hydrogeologic framework, a complex array of mechanisms accounts for flow into, through, and out of the regional ground-water flow system. Natural discharges from the regional ground-water flow system occur by evapotranspiration, springs, and subsurface outflow. In this study, evapotranspiration rates were adapted from a related investigation that developed maps of evapotranspiration areas and computed rates from micrometeorological data collected within the local area over a multiyear period. In some cases, historical spring flow records were used to derive ground-water discharge rates for isolated regional springs. 
For this investigation, a process-based numerical model was developed to estimate…
Prediction of the low-velocity distribution from the pore structure in simple porous media
NASA Astrophysics Data System (ADS)
de Anna, Pietro; Quaife, Bryan; Biros, George; Juanes, Ruben
2017-12-01
The macroscopic properties of fluid flow and transport through porous media are a direct consequence of the underlying pore structure. However, precise relations that characterize flow and transport from the statistics of pore-scale disorder have remained elusive. Here we investigate the relationship between pore structure and the resulting fluid flow and asymptotic transport behavior in two-dimensional geometries of nonoverlapping circular posts. We derive an analytical relationship between the pore throat size distribution f_λ ∼ λ^(−β) and the distribution of the low fluid velocities f_u ∼ u^(−β/2), based on a conceptual model of porelets (the flow established within each pore throat, here a Hagen-Poiseuille flow). Our model allows us to make predictions, within a continuous-time random-walk framework, for the asymptotic statistics of the spreading of fluid particles along their own trajectories. These predictions are confirmed by high-fidelity simulations of Stokes flow and advective transport. The proposed framework can be extended to other configurations which can be represented as a collection of known flow distributions.
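The continuous-time random-walk picture invoked above can be sketched as particles that traverse fixed-length throats at speeds drawn from a heavy-tailed low-velocity distribution, so each step contributes a waiting time ℓ/u; low velocities then dominate the spreading statistics. All parameters below (β, velocity bounds, step length) are illustrative assumptions, not values from the study.

```python
import random

def ctrw_arrival_times(n_particles, n_steps, ell=1.0, beta=1.5,
                       u_min=1e-3, u_max=1.0, seed=0):
    """Continuous-time random walk: each step covers distance ell at a
    speed u drawn from a truncated power law f(u) ~ u^(-beta) on
    [u_min, u_max] (inverse-CDF sampling); the step takes time ell / u.
    Returns the total travel time of each particle. Illustrative only."""
    rng = random.Random(seed)
    a = 1.0 - beta                     # exponent appearing in the CDF
    times = []
    for _ in range(n_particles):
        t = 0.0
        for _ in range(n_steps):
            r = rng.random()
            u = ((1.0 - r) * u_min ** a + r * u_max ** a) ** (1.0 / a)
            t += ell / u
        times.append(t)
    return times

times = ctrw_arrival_times(200, 50)
```

The heavy low-velocity tail produces occasional very long waiting times, which is the mechanism behind the anomalous (non-Fickian) spreading the authors predict and confirm with Stokes-flow simulations.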
Benninger, Elizabeth; Savahl, Shazly
2016-01-01
This study aimed to explore how children construct and assign meaning to the “self” within two urban communities of Cape Town in South Africa. Using a child participation methodological framework, data were collected using Photovoice and community maps with 54 participants between the ages of 9 and 12. Feelings of safety, social connectedness, and children's spaces were found to be central to the ways in which the participants constructed and assigned meaning to the “self.” The study provides implications for intervention programmes aimed at improving children's well-being to be inclusive of activities aimed at improving children's self-concept, including the construction of safe spaces for children to play, learn, and form meaningful relationships. PMID:27291161
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
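The core of the data-augmentation idea above, carrying the observer's elicited cause-of-death probabilities through the analysis instead of fixing the most likely cause, can be sketched with a simple Monte Carlo. This is a simplified stand-in for the authors' full Bayesian hierarchical model, and the elicited probability vectors are hypothetical.

```python
import random

def cause_fraction_draws(elicited, n_draws=2000, seed=1):
    """For each death, 'elicited' holds the observer's probability that
    the death was due to each candidate cause. Instead of fixing the
    most likely cause, sample cause labels from those probabilities and
    return Monte Carlo draws of the cause-specific mortality fractions.
    (Simplified sketch of the data-augmentation idea, not the authors'
    full hierarchical model.)"""
    rng = random.Random(seed)
    n_causes = len(elicited[0])
    draws = []
    for _ in range(n_draws):
        counts = [0] * n_causes
        for probs in elicited:
            counts[rng.choices(range(n_causes), weights=probs)[0]] += 1
        draws.append([c / len(elicited) for c in counts])
    return draws

# three deaths, two candidate causes (e.g., predation vs. harvest);
# the elicited probabilities are hypothetical illustrations
elicited = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]
draws = cause_fraction_draws(elicited)
mean_frac = [sum(d[k] for d in draws) / len(draws) for k in (0, 1)]
```

The spread across draws, unlike a single hard assignment, reflects the observer's uncertainty, which is the extra variability the simulations above show is needed for honest inference.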
ERIC Educational Resources Information Center
Hein, Nancy Porras; Miller, Barbara A.
2004-01-01
Using an authentic assessment approach, two California State University, Fullerton, faculty members developed instructional strategies in support of an assignment that requires students to situate their families within a historical framework. We describe our efforts to provide students with the research skills to successfully complete the…
The Science ELF: Assessing the Enquiry Levels Framework as a Heuristic for Professional Development
ERIC Educational Resources Information Center
Wheeler, Lindsay B.; Bell, Randy L.; Whitworth, Brooke A.; Maeng, Jennifer L.
2015-01-01
This study utilized an explanatory sequential mixed methods approach to explore randomly assigned treatment and control participants' frequency of inquiry instruction in secondary science classrooms. Eleven treatment participants received professional development (PD) that emphasized a structured approach to inquiry instruction, while 10 control…
Choosers, Obstructed Choosers, and Nonchoosers: A Framework for Defaulting in Schooling Choices
ERIC Educational Resources Information Center
Delale-O'Connor, Lori
2018-01-01
Background/Context: Prior research overlooks the importance of drawing distinctions within the category of defaulters or "nonchoosers" in schooling choices. Defaulters are both a theoretically and empirically interesting population, and understanding the processes by which families come to or are assigned the default school offers…
Using Wikis to Develop Collaborative Communities in an Environmental Chemistry Course
ERIC Educational Resources Information Center
Pence, Laura E.; Pence, Harry E.
2015-01-01
Group construction of wikis in an environmental chemistry course provided an effective framework for students to develop and to manage collaborative communities, characterized by interactive projects designed to deepen learning. A sequence of assignments facilitated improvement of the students' wiki construction and editing skills and these…
USDA-ARS?s Scientific Manuscript database
Prokaryotic taxonomy is the underpinning of microbiology, providing a framework for the proper identification and naming of organisms. The 'gold standard' of bacterial species delineation is the overall genome similarity as determined by DNA-DNA hybridization (DDH), a technically rigorous yet someti...
ERIC Educational Resources Information Center
Nageotte, Nichole; Buck, Gayle; Kirk, Holly
2018-01-01
Imagine saving just one of the 23,000 species threatened with extinction. Students studying endangered species in a general life science course faced the decision of which species to save as a summative assignment in a unit on scientific explanation and argumentation. They used the claim, evidence, and reasoning (CER) framework in which students…
ERIC Educational Resources Information Center
Dabach, Dafney Blanca
2015-01-01
This qualitative study examined how secondary teachers were assigned to teach courses intended to expand English learners' (ELs') access to academic subjects. Theoretically, this research extends the "contexts of reception" framework from immigration studies into the educational realm by investigating how teachers--as one important…
Technology Acceptance Predictors among Student Teachers and Experienced Classroom Teachers
ERIC Educational Resources Information Center
Smarkola, Claudia
2007-01-01
This study investigated 160 student teachers' and 158 experienced teachers' self-reported computer usage and their future intentions to use computer applications for school assignments. The Technology Acceptance Model (TAM) was used as the framework to determine computer usage and intentions. Statistically significant results showed that after…
Etched Impressions: Student Writing as Engaged Pedagogy in the Graduate Sport Management Classroom
ERIC Educational Resources Information Center
Veri, Maria J.; Barton, Kenny; Burgee, David; Davis, James A., Jr.; Eaton, Pamela; Frazier, Cathy; Gray, Stevie; Halsey, Christine; Thurman, Richard
2006-01-01
This article illustrates the pedagogical value of employing student narrative writing assignments in the graduate sport management classroom and advocates for cultural studies and critical pedagogy approaches to teaching sport management. The article considers students' autobiographical narratives within a theoretical framework of cultural…
Key Determinants of Student Satisfaction When Undertaking Group Work
ERIC Educational Resources Information Center
Pang, Elvy; Tong, Canon; Wong, Anthony
2011-01-01
The increasing popularity of team structures in business environment coupled with the common practice of including group projects/assignments in university curricula means that business schools should direct efforts towards maximizing team as well as personal results. Yet, most frameworks for studying teams center exclusively on team level…
The Transformative Experience in Engineering Education
NASA Astrophysics Data System (ADS)
Goodman, Katherine Ann
This research evaluates the usefulness of transformative experience (TE) in engineering education. With TE, students 1) apply ideas from coursework to everyday experiences without prompting (motivated use); 2) see everyday situations through the lens of course content (expanded perception); and 3) value course content in new ways because it enriches everyday affective experience (affective value). In a three-part study, we examine how engineering educators can promote student progress toward TE and reliably measure that progress. For the first study, we select a mechanical engineering technical elective, Flow Visualization, that had shown evidence of promoting expanded perception of fluid physics. Through student surveys and interviews, we compare this elective to the required Fluid Mechanics course. We found that student interest in fluids fell into four categories: complexity, application, ubiquity, and aesthetics. Fluid Mechanics promotes interest through application, while Flow Visualization promotes interest based on ubiquity and aesthetics. Coding for expanded perception, we found it associated with students' engineering identity rather than with a specific course. In our second study, we replicate atypical teaching methods from Flow Visualization in a new design course, Aesthetics of Design. Coding of surveys and interviews reveals that open-ended assignments and supportive teams lead to increased ownership of projects, which fuels risk-taking and produces increased confidence as an engineer. The third study seeks to establish parallels between expanded perception and measurable perceptual expertise. Our visual expertise experiment uses fluid flow images with both novices and experts (students who had passed fluid mechanics). After training, subjects sort images into laminar and turbulent categories. The results demonstrate that novices learned to sort the flow stimuli in ways similar to subjects in prior perceptual expertise studies.
In contrast, the experts' significantly better results suggest they are accessing conceptual fluids knowledge to perform this new, visual task. The ability to map concepts onto visual information is likely a necessary step toward expanded perception. Our findings suggest that open-ended aesthetic experiences with engineering content unexpectedly support engineering identity development, and that visual tasks could be developed to measure conceptual understanding, promoting expanded perception. Overall, we find TE a productive theoretical framework for engineering education research.
Fournier, Auriel M. V.; Sullivan, Alexis R.; Bump, Joseph K.; Perkins, Marie; Shieldcastle, Mark C.; King, Sammy L.
2017-01-01
Stable hydrogen isotope (δD) methods for tracking animal movement are widely used yet often produce low-resolution assignments. Incorporating prior knowledge of abundance, distribution or movement patterns can ameliorate this limitation, but data are lacking for most species. We demonstrate how observations reported by citizen scientists can be used to develop robust estimates of species distributions and to constrain δD assignments. We developed a Bayesian framework to refine isotopic estimates of migrant animal origins conditional on species distribution models constructed from citizen scientist observations. To illustrate this approach, we analysed the migratory connectivity of the Virginia rail Rallus limicola, a secretive and declining migratory game bird in North America. Citizen science observations enabled both estimation of sampling bias and construction of bias-corrected species distribution models. Conditioning δD assignments on these species distribution models yielded comparably high-resolution assignments. Most Virginia rails wintering across five Gulf Coast sites spent the previous summer near the Great Lakes, although a considerable minority originated from the Chesapeake Bay watershed or Prairie Pothole region of North Dakota. Conversely, the majority of migrating Virginia rails from a site in the Great Lakes most likely spent the previous winter on the Gulf Coast between Texas and Louisiana. Synthesis and applications. In this analysis, Virginia rail migratory connectivity does not fully correspond to the administrative flyways used to manage migratory birds. This example demonstrates that with the increasing availability of citizen science data to create species distribution models, our framework can produce high-resolution estimates of migratory connectivity for many animals, including cryptic species.
Empirical evidence of links between seasonal habitats will help enable effective habitat management, hunting quotas and population monitoring and also highlight critical knowledge gaps.
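The Bayesian conditioning described above can be sketched in a few lines: the posterior over candidate origin regions is the isotopic likelihood weighted by a species-distribution-model prior. All region names, δD values, and prior weights below are invented for illustration; the actual framework operates over gridded isoscapes, not three discrete regions.

```python
import math

# Hypothetical candidate origin regions: predicted feather δD (per mil)
# and a relative-abundance prior from a species distribution model (SDM).
cells = {
    "Great Lakes":      {"dD_pred": -95.0,  "sdm_prior": 0.55},
    "Chesapeake Bay":   {"dD_pred": -60.0,  "sdm_prior": 0.25},
    "Prairie Potholes": {"dD_pred": -110.0, "sdm_prior": 0.20},
}

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def assign_origin(dD_obs, sigma=12.0):
    """Posterior P(region | δD) ∝ P(δD | region) × SDM prior."""
    post = {name: normal_pdf(dD_obs, c["dD_pred"], sigma) * c["sdm_prior"]
            for name, c in cells.items()}
    z = sum(post.values())
    return {name: p / z for name, p in post.items()}

# An observed feather δD of -92 per mil points strongly to the Great Lakes.
posterior = assign_origin(dD_obs=-92.0)
```

Without the SDM prior, the likelihood alone would spread probability over every region with a similar predicted δD; the prior is what sharpens the assignment.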
Transnationalization of Television in West Europe. Working Paper No. 13.
ERIC Educational Resources Information Center
Sepstrup, Preben
Based primarily on data from public service broadcasting, this study had two major purposes: to develop a framework for understanding, conceptualizing, and measuring international television flows and the effects associated with these flows; and to establish a background of facts on international television flows in Western Europe. Secondary…
Conceptualizing Group Flow: A Framework
ERIC Educational Resources Information Center
Duncan, Jana; West, Richard E.
2018-01-01
This literature review discusses the similarities in main themes between Csikszentmihályi's theory of individual flow and Sawyer's theory of group flow, and compares Sawyer's theory with existing concepts in the literature on group work both in education and business. Because much creativity and innovation occurs within groups, understanding group…
A conceptual framework that links pollinator foraging behavior to gene flow
USDA-ARS?s Scientific Manuscript database
In insect-pollinated crops such as alfalfa, a better understanding of how pollinator foraging behavior affects gene flow could lead to the development of management strategies to reduce gene flow and facilitate the coexistence of distinct seed-production markets. Here, we introduce a conceptual fram...
NASA Astrophysics Data System (ADS)
James, C. M.; Gildfind, D. E.; Lewis, S. W.; Morgan, R. G.; Zander, F.
2018-03-01
Expansion tubes are an important type of test facility for the study of planetary entry flow-fields, being the only type of impulse facility capable of simulating the aerothermodynamics of superorbital planetary entry conditions from 10 to 20 km/s. However, the complex flow processes involved in expansion tube operation make it difficult to fully characterise flow conditions, with two-dimensional full facility computational fluid dynamics simulations often requiring tens or hundreds of thousands of computational hours to complete. In an attempt to simplify this problem and provide a rapid flow condition prediction tool, this paper presents a validated and comprehensive analytical framework for the simulation of an expansion tube facility. It identifies central flow processes and models them from state to state through the facility using established compressible and isentropic flow relations, and equilibrium and frozen chemistry. How the model simulates each section of an expansion tube is discussed, as well as how the model can be used to simulate situations where flow conditions diverge from ideal theory. The model is then validated against experimental data from the X2 expansion tube at the University of Queensland.
Implementation of density-based solver for all speeds in the framework of OpenFOAM
NASA Astrophysics Data System (ADS)
Shen, Chun; Sun, Fengxian; Xia, Xinlin
2014-10-01
In the framework of the open-source CFD code OpenFOAM, a density-based solver for all-speed flow fields is developed. In this solver the preconditioned all-speeds AUSM+(P) scheme is adopted, and the dual-time scheme is implemented to handle the unsteady process. Parallel computation can be employed to accelerate the solving process. Different interface reconstruction algorithms are implemented, and their accuracy with respect to convection is compared. Three benchmark tests of lid-driven cavity flow, flow crossing over a bump, and flow over a forward-facing step are presented to show the accuracy of the AUSM+(P) solver for low-speed incompressible flow, transonic flow, and supersonic/hypersonic flow. Firstly, for the lid-driven cavity flow, the computational results obtained by different interface reconstruction algorithms are compared. The results indicate that the one-dimensional reconstruction scheme adopted in this solver possesses high accuracy and that the solver can effectively capture the features of low-speed incompressible flow. Then, via the test cases of flow crossing over a bump and over a forward-facing step, the ability to capture characteristics of transonic and supersonic/hypersonic flows is confirmed. The forward-facing step proves to be the most challenging for the preconditioned solvers with and without the dual-time scheme. Nonetheless, the solvers described in this paper reproduce the main features of this flow, including the evolution of the initial transient.
A metal–organic framework immobilised iridium pincer complex
Rimoldi, Martino; Nakamura, Akitake; Vermeulen, Nicolaas A.; ...
2016-05-10
An iridium pincer complex has been immobilised in the metal–organic framework NU-1000. The stable Ir-pincer modified NU-1000 is catalytically active in the hydrogenation of alkenes in condensed phase and under flow conditions.
NASA Astrophysics Data System (ADS)
Mortensen, Mikael; Langtangen, Hans Petter; Wells, Garth N.
2011-09-01
Finding an appropriate turbulence model for a given flow case usually calls for extensive experimentation with both models and numerical solution methods. This work presents the design and implementation of a flexible, programmable software framework for assisting with numerical experiments in computational turbulence. The framework targets Reynolds-averaged Navier-Stokes models, discretized by finite element methods. The novel implementation makes use of Python and the FEniCS package, the combination of which leads to compact and reusable code, where model- and solver-specific code resemble closely the mathematical formulation of equations and algorithms. The presented ideas and programming techniques are also applicable to other fields that involve systems of nonlinear partial differential equations. We demonstrate the framework in two applications and investigate the impact of various linearizations on the convergence properties of nonlinear solvers for a Reynolds-averaged Navier-Stokes model.
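The abstract's observation that the choice of linearization affects nonlinear-solver convergence can be illustrated on a toy scalar fixed-point problem. This is a stand-in for the Picard-versus-Newton choice in a RANS solver, not the FEniCS implementation itself: fixed-point (Picard) iteration converges linearly, while Newton's linearization converges quadratically and needs far fewer iterations.

```python
import math

TOL = 1e-10

def picard(u0=1.0, max_iter=100):
    # Fixed-point (Picard) iteration for u = cos(u): linear convergence.
    u, n = u0, 0
    while abs(u - math.cos(u)) > TOL and n < max_iter:
        u = math.cos(u)
        n += 1
    return u, n

def newton(u0=1.0, max_iter=100):
    # Newton linearization of f(u) = u - cos(u): quadratic convergence.
    u, n = u0, 0
    while abs(u - math.cos(u)) > TOL and n < max_iter:
        u -= (u - math.cos(u)) / (1.0 + math.sin(u))
        n += 1
    return u, n

u_p, iters_picard = picard()
u_n, iters_newton = newton()
```

Both iterations reach the same root near 0.739, but Newton does so in a handful of steps where Picard needs dozens, which is exactly the kind of trade-off a programmable turbulence framework lets one experiment with.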
Using the red/yellow/green discharge tool to improve the timeliness of hospital discharges.
Mathews, Kusum S; Corso, Philip; Bacon, Sandra; Jenq, Grace Y
2014-06-01
As part of Yale-New Haven Hospital (Connecticut)'s Safe Patient Flow Initiative, the physician leadership developed the Red/Yellow/Green (RYG) Discharge Tool, an electronic medical record-based prompt to identify likelihood of patients' next-day discharge: green (very likely), yellow (possibly), and red (unlikely). The tool's purpose was to enhance communication with nursing/care coordination and trigger earlier discharge steps for patients identified as "green" or "yellow." Data on discharge assignments, discharge dates/ times, and team designation were collected for all adult medicine patients discharged in October-December 2009 (Study Period 1) and October-December 2011 (Study Period 2), between which the tool's placement changed from the sign-out note to the daily progress note. In Study Period 1, 75.9% of the patients had discharge assignments, compared with 90.8% in Period 2 (p < .001). The overall 11 A.M. discharge rate improved from 10.4% to 21.2% from 2007 to 2011. "Green" patients were more likely to be discharged before 11 A.M. than "yellow" or "red" patients (p < .001). Patients with RYG assignments discharged by 11 A.M. had a lower length of stay than those without assignments and did not have an associated increased risk of readmission. Discharge prediction accuracy worsened after the change in placement, decreasing from 75.1% to 59.1% for "green" patients (p < .001), and from 34.5% to 29.2% (p < .001) for "yellow" patients. In both periods, hospitalists were more accurate than house staff in discharge predictions, suggesting that education and/or experience may contribute to discharge assignment. The RYG Discharge Tool helped facilitate earlier discharges, but accuracy depends on placement in daily work flow and experience.
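The study's headline metrics (prediction accuracy per color, fraction of discharges before 11 A.M.) are straightforward to compute from discharge records. The sketch below uses invented records and field names; it is not the hospital's actual data model.

```python
# Hypothetical discharge records: RYG color assigned the prior day,
# whether the patient was actually discharged the next day, and the hour.
records = [
    {"color": "green",  "discharged_next_day": True,  "discharge_hour": 10},
    {"color": "green",  "discharged_next_day": False, "discharge_hour": None},
    {"color": "yellow", "discharged_next_day": True,  "discharge_hour": 14},
    {"color": "red",    "discharged_next_day": False, "discharge_hour": None},
]

def accuracy(records, color):
    """Fraction of patients flagged `color` whose next-day outcome matched
    the tool's prediction (green -> discharged, red -> not discharged)."""
    flagged = [r for r in records if r["color"] == color]
    expected = (color == "green")
    hits = sum(r["discharged_next_day"] == expected for r in flagged)
    return hits / len(flagged) if flagged else None

def before_11am_rate(records):
    """Share of actual discharges that happened before 11 A.M."""
    discharged = [r for r in records if r["discharged_next_day"]]
    early = sum(r["discharge_hour"] < 11 for r in discharged)
    return early / len(discharged)
```

On the toy records, green accuracy is 0.5 and half of the discharges occur before 11 A.M.; the published study computed the same quantities over all adult medicine discharges in each period.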
A vision framework for the localization of soccer players and ball on the pitch using Handycams
NASA Astrophysics Data System (ADS)
Vilas, Tiago; Rodrigues, J. M. F.; Cardoso, P. J. S.; Silva, Bruno
2015-03-01
The current performance requirements in soccer make imperative the use of new technologies for game observation and analysis, such that detailed information about the teams' actions is provided. This paper summarizes a framework to collect the positions of the soccer players and ball using one or more Full HD Handycams, placed no more than 20 cm apart in the stands, as well as how this framework connects to the FootData project. The system is based on four main modules: the detection and delimitation of the soccer pitch; the detection of the ball and players and the assignment of players to their teams; the tracking of players and ball; and finally the computation of their localization (in meters) on the pitch.
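The final module, converting pixel positions to metric pitch coordinates, is typically done with a planar homography fitted from known landmarks (e.g. the pitch corners). The sketch below estimates such a homography from four correspondences by the direct linear transform; the pixel coordinates are invented, and the paper does not state that it uses exactly this calibration procedure.

```python
import numpy as np

def fit_homography(px_pts, pitch_pts):
    """Estimate a planar homography H (pixel -> pitch metres) from four
    point correspondences via the direct linear transform, fixing h33 = 1."""
    A, b = [], []
    for (x, y), (X, Y) in zip(px_pts, pitch_pts):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y]); b.append(X)
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y]); b.append(Y)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def to_pitch(H, x, y):
    """Map a pixel coordinate to pitch metres (perspective division)."""
    X, Y, w = H @ np.array([x, y, 1.0])
    return X / w, Y / w

# Hypothetical calibration: image corners of the pitch (pixels) matched to
# the standard 105 m x 68 m pitch coordinate frame.
px = [(100, 600), (1800, 620), (1500, 200), (250, 180)]
metres = [(0, 0), (105, 0), (105, 68), (0, 68)]
H = fit_homography(px, metres)
```

Once H is known, every tracked player or ball detection can be mapped to metres with a single matrix-vector product per frame.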
Walling, Bendangtola; Chaudhary, Shushobhit; Dhanya, C T; Kumar, Arun
2017-05-01
Environmental flows (Eflow, hereafter) are the flows to be maintained in a river for its healthy functioning and for the sustenance and protection of aquatic ecosystems. Estimation of Eflow in any river stretch demands consideration of various factors such as flow regime, ecosystem, and river health. However, most Eflow estimation studies have neglected the water quality factor. This study urges the need to consider a water quality criterion in the estimation of Eflow and proposes a framework for estimating Eflow incorporating water quality variations under present and hypothetical future scenarios of climate change and pollution load. The proposed framework is applied to the polluted stretch of the Yamuna River passing through Delhi, India. Required Eflow at various locations along the stretch is determined by considering possible variations in future water quantity and quality. Eflow values satisfying minimum quality requirements for different river water usage classes (classes A, B, C, and D as specified by the Central Pollution Control Board, India) are found to be between 700 and 800 m³/s. The estimated Eflow values may aid policymakers in deriving upstream storage-release policies or effluent restrictions. The generalized nature of this framework will aid its implementation on other river systems.
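The core of a quality-constrained Eflow estimate is a mixing mass balance: the upstream flow must be large enough that the fully mixed concentration meets the class standard. A minimal sketch under conservative-mixing assumptions (no decay, instantaneous mixing); all numbers are hypothetical, not the Yamuna values.

```python
def required_eflow(Q_eff, C_eff, C_up, C_std):
    """Minimum upstream flow (m^3/s) so that the fully mixed concentration
    (Q_up*C_up + Q_eff*C_eff) / (Q_up + Q_eff) meets the standard C_std.
    Steady-state mass balance; assumes a conservative pollutant."""
    if C_up >= C_std:
        raise ValueError("upstream water already violates the standard")
    return Q_eff * (C_eff - C_std) / (C_std - C_up)

# Hypothetical numbers: 50 m^3/s of effluent at 30 mg/L BOD, upstream
# water at 2 mg/L, and a 3 mg/L class standard.
Q = required_eflow(Q_eff=50.0, C_eff=30.0, C_up=2.0, C_std=3.0)
```

Solving the balance for each usage class (A through D) at each location along the stretch yields the class-wise Eflow values the framework reports.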
NASA Astrophysics Data System (ADS)
Kovanen, Dori J.; Slaymaker, Olav
2008-07-01
Active debris flow fans in the North Cascade Foothills of Washington State constitute a natural hazard of importance to land managers, private property owners and personal security. In the absence of measurements of the sediment fluxes involved in debris flow events, a morphological-evolutionary systems approach, emphasizing stratigraphy, dating, fan morphology and debris flow basin morphometry, was used. Using the stratigraphic framework and 47 radiocarbon dates, frequency of occurrence and relative magnitudes of debris flow events have been estimated for three spatial scales of debris flow systems: the within-fan site scale (84 observations); the fan meso-scale (six observations) and the lumped fan, regional or macro-scale (one fan average and adjacent lake sediments). In order to characterize the morphometric framework, plots of basin area v. fan area, basin area v. fan gradient and the Melton ruggedness number v. fan gradient for the 12 debris flow basins were compared with those documented for semi-arid and paraglacial fans. Basin area to fan area ratios were generally consistent with the estimated level of debris flow activity during the Holocene as reported below. Terrain analysis of three of the most active debris flow basins revealed the variety of modes of slope failure and sediment production in the region. Micro-scale debris flow event systems indicated a range of recurrence intervals for large debris flows from 106-3645 years. The spatial variation of these rates across the fans was generally consistent with previously mapped hazard zones. At the fan meso-scale, the range of recurrence intervals for large debris flows was 273-1566 years and at the regional scale, the estimated recurrence interval of large debris flows was 874 years (with undetermined error bands) during the past 7290 years. 
Dated lake sediments from the adjacent Lake Whatcom gave recurrence intervals for large sediment producing events ranging from 481-557 years over the past 3900 years and clearly discernible sedimentation events in the lacustrine sediments had a recurrence interval of 67-78 years over that same period.
Cunningham, K.J.; Renken, R.A.; Wacker, M.A.; Zygnerski, M.R.; Robinson, E.; Shapiro, A.M.; Wingard, G.L.
2006-01-01
Combined analyses of cores, borehole geophysical logs, and cyclostratigraphy produced a new conceptual hydrogeologic framework for the triple-porosity (matrix, touching-vug, and conduit porosity) karst limestone of the Biscayne aquifer in a 0.65 km² study area, SE Florida. Vertical lithofacies successions, which have recurrent stacking patterns, fit within high-frequency cycles. We define three ideal high-frequency cycles as: (1) upward-shallowing subtidal cycles, (2) upward-shallowing paralic cycles, and (3) aggradational subtidal cycles. Digital optical borehole images, tracers, and flow meters indicate that there is a predictable vertical pattern of porosity and permeability within the three ideal cycles, because the distribution of porosity and permeability is related to lithofacies. Stratiform zones of high permeability commonly occur just above flooding surfaces in the lower part of upward-shallowing subtidal and paralic cycles, forming preferential groundwater flow zones. Aggradational subtidal cycles are either mostly high-permeability zones or leaky, low-permeability units. In the study area, groundwater flow within stratiform high-permeability zones is through a secondary pore system of touching-vug porosity principally related to molds of burrows and pelecypods and to interburrow vugs. Movement of a dye-tracer pulse observed using a borehole fluid-temperature tool during a conservative tracer test indicates heterogeneous permeability. Advective movement of the tracer appears to be most concentrated within a thin stratiform flow zone contained within the lower part of a high-frequency cycle, indicating a distinctly high relative permeability for this zone. Borehole flow-meter measurements corroborate the relatively high permeability of the flow zone. Identification and mapping of such high-permeability flow zones is crucial to conceptualization of karst groundwater flow within a cyclostratigraphic framework.
Many karst aquifers are included in cyclic platform carbonates. Clearly, a cyclostratigraphic approach that translates carbonate aquifer heterogeneity into a consistent framework of correlative units will improve simulation of karst groundwater flow. © 2006 Geological Society of America.
A GPU-Parallelized Eigen-Based Clutter Filter Framework for Ultrasound Color Flow Imaging.
Chee, Adrian J Y; Yiu, Billy Y S; Yu, Alfred C H
2017-01-01
Eigen-filters with attenuation response adapted to clutter statistics in color flow imaging (CFI) have shown improved flow detection sensitivity in the presence of tissue motion. Nevertheless, their practical adoption in clinical use is not straightforward due to the high computational cost of solving eigendecompositions. Here, we provide a pedagogical description of how a real-time computing framework for eigen-based clutter filtering can be developed through a single-instruction, multiple-data (SIMD) computing approach that can be implemented on a graphics processing unit (GPU). Emphasis is placed on the single-ensemble-based eigen-filtering approach (Hankel singular value decomposition), since it is algorithmically compatible with GPU-based SIMD computing. The key algebraic principles and the corresponding SIMD algorithm are explained, and annotations on how such an algorithm can be rationally implemented on the GPU are presented. Real-time efficacy of our framework was experimentally investigated on a single GPU device (GTX Titan X), and the computing throughput for varying scan depths and slow-time ensemble lengths was studied. Using our eigen-processing framework, real-time video-range throughput (24 frames/s) can be attained for CFI frames with full view in the azimuth direction (128 scanlines), up to a scan depth of 5 cm (λ pixel axial spacing) for a slow-time ensemble length of 16 samples. The corresponding CFI image frames, with respect to the ones derived from non-adaptive polynomial regression clutter filtering, yielded enhanced flow detection sensitivity in vivo, as demonstrated in a carotid imaging case example. These findings indicate that GPU-enabled eigen-based clutter filtering can improve CFI flow detection performance in real time.
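The Hankel-SVD eigen-filter at the heart of this framework can be sketched compactly in NumPy: embed the slow-time ensemble in a Hankel matrix, drop the largest singular components (the high-energy, slowly varying tissue clutter), and reconstruct by anti-diagonal averaging. This is a didactic CPU sketch of the algebra only, with an invented two-tone ensemble; the paper's contribution is the GPU/SIMD realization.

```python
import numpy as np

def hankel_svd_filter(ensemble, n_clutter=1):
    """Single-ensemble eigen-filter: suppress the n_clutter largest
    singular components of the Hankel embedding, then reconstruct the
    slow-time signal by averaging anti-diagonals."""
    N = len(ensemble)
    L = N // 2 + 1
    K = N - L + 1
    H = np.array([[ensemble[i + j] for j in range(K)] for i in range(L)])
    U, s, Vh = np.linalg.svd(H, full_matrices=False)
    s[:n_clutter] = 0.0                      # zero out clutter components
    Hf = (U * s) @ Vh
    out = np.zeros(N, complex)
    counts = np.zeros(N)
    for i in range(L):                       # anti-diagonal averaging
        for j in range(K):
            out[i + j] += Hf[i, j]
            counts[i + j] += 1
    return out / counts

# Hypothetical ensemble: strong slow clutter plus a weak fast flow signal.
n = np.arange(16)
clutter = 10.0 * np.exp(1j * 2 * np.pi * 0.01 * n)
flow = 0.5 * np.exp(1j * 2 * np.pi * 0.30 * n)
filtered = hankel_svd_filter(clutter + flow)
```

Because each scanline's ensemble is filtered independently, the per-pixel SVDs map naturally onto the SIMD parallelism the paper exploits.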
NASA Astrophysics Data System (ADS)
Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.
2012-12-01
"From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. 
We assume an instantaneous development of steady-state flow conditions after each temporal change in the driving forces, following an idea by Festger and Walter, 2002. These quasi-steady-state flow fields are cast into a geostatistical Monte Carlo framework to admit and evaluate the influence of parameter uncertainty on the delineation process. Furthermore, this framework enables conditioning on observed data with any conditioning scheme, such as rejection sampling, Ensemble Kalman Filters, etc. To further reduce the computational load, we use the reverse formulation of advective-dispersive transport. We simulate the reverse transport with a particle-tracking random walk, which avoids numerical dispersion, in order to account for well arrival times.
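The Monte Carlo side of probabilistic delineation reduces to counting: for each random conductivity realization, test whether a map point falls inside the capture zone, and report the hit fraction as its capture probability. The toy below uses the analytic capture-zone width of a single well in uniform regional flow and a lognormal conductivity; all parameter values are invented, and the real framework uses full flow simulations rather than this closed-form criterion.

```python
import math
import random

random.seed(42)

def in_capture(y, K, Q=0.01, b=10.0, i=1e-3):
    """Toy criterion: a point far upstream of a fully penetrating well is
    captured when its lateral offset y (m) lies within half the analytic
    capture-zone width Q / (b * K * i) in uniform regional flow."""
    half_width = Q / (2.0 * b * K * i)
    return abs(y) < half_width

def capture_probability(y, n_real=2000):
    """Fraction of Monte Carlo conductivity realizations (lognormal K,
    median 1e-3 m/s) for which the point at offset y is captured."""
    hits = 0
    for _ in range(n_real):
        K = random.lognormvariate(math.log(1e-3), 0.5)
        hits += in_capture(y, K)
    return hits / n_real
```

Points close to the well axis are captured in nearly every realization, while points near the uncertain capture boundary receive intermediate probabilities, which is exactly where rationally chosen safety margins matter.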
Apaydin, Mehmet Serkan; Çatay, Bülent; Patrick, Nicholas; Donald, Bruce R
2011-05-01
Nuclear magnetic resonance (NMR) spectroscopy is an important experimental technique that allows one to study protein structure and dynamics in solution. An important bottleneck in NMR protein structure determination is the assignment of NMR peaks to the corresponding nuclei. Structure-based assignment (SBA) aims to solve this problem with the help of a template protein which is homologous to the target, and has applications in the study of structure-activity relationships and of protein-protein and protein-ligand interactions. We formulate SBA as a linear assignment problem with additional nuclear Overhauser effect constraints, which can be solved within nuclear vector replacement's (NVR) framework (Langmead, C., Yan, A., Lilien, R., Wang, L. and Donald, B. (2003) A Polynomial-Time Nuclear Vector Replacement Algorithm for Automated NMR Resonance Assignments. Proc. 7th Annual Int. Conf. on Research in Computational Molecular Biology (RECOMB), Berlin, Germany, April 10-13, pp. 176-187. ACM Press, New York, NY; J. Comp. Bio. (2004), 11, pp. 277-298; Langmead, C. and Donald, B. (2004) An expectation/maximization nuclear vector replacement algorithm for automated NMR resonance assignments. J. Biomol. NMR, 29, 111-138). Our approach uses NVR's scoring function and data types, and also gives the option of using CH and NH residual dipolar couplings (RDCs) instead of the NH RDCs which NVR requires. We test our technique on NVR's data set as well as on four new proteins. Our results are comparable to NVR's assignment accuracy on NVR's test set, but higher on the novel proteins. Our approach allows partial assignments. It is also complete and can return the optimum as well as near-optimum assignments. Furthermore, it allows us to analyze the information content of each data type and is easily extendable to accept new forms of input data, such as additional RDCs.
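Stripped of the NOE constraints, the core formulation is a standard linear assignment problem: map each peak to a distinct residue so that the total match cost is minimal. The toy below solves a 3×3 instance exhaustively with an invented score matrix; real SBA instances are far larger and would use the Hungarian algorithm (or, with the NOE side constraints, an integer program) instead.

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustive solution of a small linear assignment problem: map each
    NMR peak (row) to a distinct residue (column), minimising total cost."""
    n = len(cost)
    best, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best:
            best, best_perm = total, perm
    return best, best_perm

# Hypothetical peak-to-residue score matrix (lower = better match).
cost = [
    [4.0, 1.0, 3.0],
    [2.0, 0.5, 5.0],
    [3.0, 2.0, 2.0],
]
total, assignment = best_assignment(cost)  # peak i -> residue assignment[i]
```

Note that the optimum (peak 0 to residue 1, peak 1 to residue 0) does not simply take each peak's individually cheapest residue; the coupling through the one-to-one constraint is what makes assignment a combinatorial problem.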
Perceptual analysis of vibrotactile flows on a mobile device.
Seo, Jongman; Choi, Seungmoon
2013-01-01
"Vibrotactile flow" refers to a continuously moving sensation of vibrotactile stimulation applied by a few actuators directly onto the skin or through a rigid medium. Research demonstrated the effectiveness of vibrotactile flow for conveying intuitive directional information on a mobile device. In this paper, we extend previous research by investigating the perceptual characteristics of vibrotactile flows rendered on a mobile device and proposing a synthesis framework for vibrotactile flows with desired perceptual properties.
NASA Astrophysics Data System (ADS)
Yang, Z. L.; Wu, W. Y.; Lin, P.; Maidment, D. R.
2017-12-01
Extreme water events such as catastrophic floods and severe droughts have increased in recent decades. Mitigating the risk these extreme events pose to lives, food security, infrastructure, energy supplies, and numerous other industries requires informed decision-making and planning based on sound science. We are developing a global water modeling capability by building models that will provide total operational water predictions (evapotranspiration, soil moisture, groundwater, channel flow, inundation, snow) at unprecedented spatial resolutions and update frequencies. Toward this goal, this talk presents an integrated global hydrological modeling framework that takes advantage of gridded meteorological forcing, land surface modeling, channel flow modeling, ground observations, and satellite remote sensing. Launched in August 2016, the National Water Model successfully incorporates weather forecasts to predict river flows for more than 2.7 million rivers across the continental United States, operationally translating a "synoptic weather map" into a "synoptic river flow map". In this study, we apply a similar framework to a high-resolution global river network database, developed from a hierarchical Dominant River Tracing (DRT) algorithm, and feed runoff output from the Global Land Data Assimilation System (GLDAS) to a vector-based river routing model (the Routing Application for Parallel Computation of Discharge, RAPID) to produce river flows from 2001 to 2016 using the Message Passing Interface (MPI) on the Texas Advanced Computing Center's Stampede system. In this simulation, global river discharges for more than 177,000 rivers are computed every 30 minutes. The modeling framework's performance is evaluated with various observations, including river flows at more than 400 gauge stations globally. Overall, the model exhibits reasonably good performance in simulating the averaged patterns of terrestrial water storage, evapotranspiration and runoff.
The system is appropriate for monitoring and studying floods and droughts. Directions for future research will be outlined and discussed.
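Vector-based routing models such as RAPID are built on Muskingum-type channel routing, which propagates an inflow hydrograph down a reach with attenuation and lag. A single-reach sketch with invented K, X, and inflow values (RAPID solves the coupled matrix form over the whole network):

```python
def muskingum_route(inflow, K=3600.0, X=0.2, dt=1800.0):
    """Muskingum channel routing for one reach:
    O[t] = C1*I[t] + C2*I[t-1] + C3*O[t-1], with C1 + C2 + C3 = 1.
    K is the reach travel time (s), X the weighting factor, dt the step."""
    denom = K * (1 - X) + 0.5 * dt
    C1 = (0.5 * dt - K * X) / denom
    C2 = (0.5 * dt + K * X) / denom
    C3 = (K * (1 - X) - 0.5 * dt) / denom
    out = [inflow[0]]                 # assume initial steady state
    for t in range(1, len(inflow)):
        out.append(C1 * inflow[t] + C2 * inflow[t - 1] + C3 * out[-1])
    return out

# Hypothetical inflow hydrograph (m^3/s) at 30-minute steps.
hydrograph = [10, 10, 80, 160, 120, 60, 30, 15, 10, 10]
routed = muskingum_route(hydrograph)
```

The routed peak is lower and arrives later than the inflow peak; applied reach-by-reach over 177,000 river segments, this is the computation the MPI-parallel framework performs every 30 simulated minutes.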
Ontological Problem-Solving Framework for Dynamically Configuring Sensor Systems and Algorithms
Qualls, Joseph; Russomanno, David J.
2011-01-01
The deployment of ubiquitous sensor systems and algorithms has led to many challenges, such as matching sensor systems to compatible algorithms which are capable of satisfying a task. Compounding the challenges is the lack of the requisite knowledge models needed to discover sensors and algorithms and to subsequently integrate their capabilities to satisfy a specific task. A novel ontological problem-solving framework has been designed to match sensors to compatible algorithms to form synthesized systems, which are capable of satisfying a task and then assigning the synthesized systems to high-level missions. The approach designed for the ontological problem-solving framework has been instantiated in the context of a persistence surveillance prototype environment, which includes profiling sensor systems and algorithms to demonstrate proof-of-concept principles. Even though the problem-solving approach was instantiated with profiling sensor systems and algorithms, the ontological framework may be useful with other heterogeneous sensing-system environments. PMID:22163793
NASA Astrophysics Data System (ADS)
Murali, R. V.; Puri, A. B.; Fathi, Khalid
2010-10-01
This paper presents an extended version of a study already undertaken on the development of an artificial neural network (ANN) model for assigning workforce to virtual cells under virtual cellular manufacturing system (VCMS) environments. Previously, the same authors introduced this concept and applied it to virtual cells of two-cell configuration, and the results demonstrated that ANNs could be a worthwhile tool for carrying out workforce assignments. In this attempt, three-cell configuration problems are considered for the worker assignment task. Virtual cells are formed under a dual resource constraint (DRC) context, in which the number of available workers is less than the total number of machines. Since worker assignment tasks are quite non-linear and highly dynamic in nature under varying inputs and conditions, and since ANNs have the ability to model complex relationships between inputs and outputs and to find similar patterns effectively, an attempt was earlier made to employ ANNs for the above task. In this paper, the multilayered perceptron with feed forward (MLP-FF) neural network model has been reused for worker assignment tasks of three-cell configurations under the DRC context, and its performance at different time periods has been analyzed. The previously proposed worker assignment model has been reconfigured, and cell formation solutions available for the three-cell configuration in the literature are used in combination to generate datasets for training the ANN framework. Finally, results of the study are presented and discussed.
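The forward pass of an MLP-FF scorer of the kind described above can be sketched in a few lines; the layer sizes and weights below are arbitrary placeholders for illustration, not the trained network from the study.

```python
import math

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """Single-hidden-layer feed-forward pass with sigmoid activations.

    x: input features (e.g. encoded worker skills and cell workload);
    returns one score per output unit, each in (0, 1).
    """
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    h = [sig(sum(wi * xi for wi, xi in zip(w_row, x)) + b)
         for w_row, b in zip(w_hidden, b_hidden)]
    return [sig(sum(wi * hi for wi, hi in zip(w_row, h)) + b)
            for w_row, b in zip(w_out, b_out)]

# Placeholder weights: two inputs, two hidden units, one suitability score.
score = mlp_forward(
    x=[0.6, 0.4],
    w_hidden=[[0.5, -0.2], [0.3, 0.8]], b_hidden=[0.1, -0.1],
    w_out=[[1.0, -1.0]], b_out=[0.0],
)
```

In an assignment setting, such a score would be computed for each feasible worker/cell pairing and the highest-scoring pairings selected subject to the DRC limit on available workers.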
Quantifying water flow and retention in an unsaturated fracture-facial domain
Nimmo, John R.; Malek-Mohammadi, Siamak
2015-01-01
Hydrologically significant flow and storage of water occur in macropores and fractures that are only partially filled. To accommodate such processes in flow models, we propose a three-domain framework. Two of the domains correspond to water flow and water storage in a fracture-facial region, in addition to the third domain of matrix water. The fracture-facial region, typically within a fraction of a millimeter of the fracture wall, includes a flowing phase whose fullness is determined by the availability and flux of preferentially flowing water, and a static storage portion whose fullness is determined by the local matric potential. The flow domain can be modeled with the source-responsive preferential flow model, and the roughness-storage domain can be modeled with capillary relations applied on the fracture-facial area. The matrix domain is treated using traditional unsaturated flow theory. We tested the model with application to the hydrology of the Chalk formation in southern England, coherently linking hydrologic information including recharge estimates, streamflow, water table fluctuation, imaging by electron microscopy, and surface roughness. The quantitative consistency of the three-domain matrix-microcavity-film model with this body of diverse data supports the hypothesized distinctions and active mechanisms of the three domains and establishes the usefulness of this framework.
Performance Mapping Studies in Redox Flow Cells
NASA Technical Reports Server (NTRS)
Hoberecht, M. A.; Thaller, L. H.
1981-01-01
Pumping power requirements in any flow battery system constitute a direct parasitic energy loss. It is therefore useful to determine the practical lower limit for reactant flow rates. Through the use of a theoretical framework based on electrochemical first principles, two different experimental flow mapping techniques were developed to evaluate and compare electrodes as a function of flow rate. For the carbon felt electrodes presently used in NASA-Lewis Redox cells, a flow rate 1.5 times greater than the stoichiometric rate seems to be the required minimum.
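The stoichiometric rate referenced above follows from Faraday's law: reactant must be supplied at least as fast as the cell current consumes it. A minimal sketch of that arithmetic (the 20 A current, one-electron couple, and 1 M concentration are assumed example values, not figures from the report):

```python
F = 96485.0  # Faraday constant, C/mol

def stoichiometric_flow(current_a, n_electrons, conc_mol_per_l):
    """Reactant flow (L/s) that exactly balances charge transfer at a given current."""
    return current_a / (n_electrons * F * conc_mol_per_l)

# Example: 20 A cell, one-electron redox couple, 1 M reactant.
q_stoich = stoichiometric_flow(20.0, 1, 1.0)  # L/s
q_min = 1.5 * q_stoich  # practical minimum suggested by the flow-mapping result
```

Running below q_stoich starves the electrode; the abstract's finding is that roughly a 50% excess over this rate suffices for the felt electrodes tested, keeping parasitic pumping losses low.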
Harmonising Nursing Terminologies Using a Conceptual Framework.
Jansen, Kay; Kim, Tae Youn; Coenen, Amy; Saba, Virginia; Hardiker, Nicholas
2016-01-01
The International Classification for Nursing Practice (ICNP®) and the Clinical Care Classification (CCC) System are standardised nursing terminologies that identify discrete elements of nursing practice, including nursing diagnoses, interventions, and outcomes. While CCC uses a conceptual framework or model with 21 Care Components to classify these elements, ICNP, built on a formal Web Ontology Language (OWL) description logic foundation, uses a logical hierarchical framework that is useful for computing and maintenance of ICNP. Since the logical framework of ICNP may not always align with the needs of nursing practice, an informal framework may be a more useful organisational tool to represent nursing content. The purpose of this study was to classify ICNP nursing diagnoses using the 21 Care Components of the CCC as a conceptual framework to facilitate usability and inter-operability of nursing diagnoses in electronic health records. Findings resulted in all 521 ICNP diagnoses being assigned to one of the 21 CCC Care Components. Further research is needed to validate the resulting product of this study with practitioners and develop recommendations for improvement of both terminologies.
Modelling ecosystem service flows under uncertainty with stochastic SPAN
Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.
2012-01-01
Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
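The deterministic-to-stochastic move described above can be illustrated with a toy Monte Carlo wrapper: a deterministic flow routine is fed samples from a probabilistic input, yielding an output distribution (and hence a mappable uncertainty) instead of a point value. The decay model and the uniform source distribution are invented placeholders, not the SPAN algorithms themselves.

```python
import random

def route_service(source, decay, steps):
    """Deterministic transport: a service flow decays along a flow path."""
    flow = source
    for _ in range(steps):
        flow *= (1.0 - decay)
    return flow

def stochastic_route(source_dist, decay, steps, n_samples=5000, seed=42):
    """Monte Carlo wrapper: propagate input uncertainty to an output distribution."""
    rng = random.Random(seed)
    samples = [route_service(source_dist(rng), decay, steps)
               for _ in range(n_samples)]
    mean = sum(samples) / n_samples
    var = sum((s - mean) ** 2 for s in samples) / n_samples
    return mean, var ** 0.5

# Uncertain source strength: uniform between 80 and 120 units.
mean, sd = stochastic_route(lambda rng: rng.uniform(80.0, 120.0),
                            decay=0.1, steps=5)
```

Repeating this per map cell gives a spatial field of means and standard deviations, which is what makes the uncertainty itself visualizable alongside the service estimate.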
A framework for the modeling of gut blood flow regulation and postprandial hyperaemia
Jeays, Adam David; Lawford, Patricia Veronica; Gillott, Richard; Spencer, Paul A; Bardhan, Karna Dev; Hose, David Rodney
2007-01-01
After a meal the activity of the gut increases markedly as digestion takes place. Associated with this increase in activity is an increase in blood flow, which has been shown to be dependent on factors such as caloric content and constitution of the meal. Much qualitative work has been carried out regarding mechanisms for the presence of food in a section of gut producing increased blood flow to that section, but there are still many aspects of this process that are not fully understood. In this paper we briefly review current knowledge on several relevant areas relating to gut blood flow, focusing on quantitative data where available and highlighting areas where further research is needed. We then present new data on the effect of feeding on flow in the superior mesenteric artery. Finally, we describe a framework for combining this data to produce a single model describing the mechanisms involved in postprandial hyperaemia. For a section of the model, where appropriate data are available, preliminary results are presented. PMID:17457971
The thermodynamics of dense granular flow and jamming
NASA Astrophysics Data System (ADS)
Lu, Shih Yu
The scope of the thesis is to propose, based on experimental evidence and theoretical validation, a quantifiable connection between systems that exhibit the jamming phenomenon. When jammed, some materials that flow are able to resist deformation so that they appear solid-like on the laboratory scale. But unlike ordinary fusion, which has a critically defined criterion in pressure and temperature, jamming occurs under a wide range of conditions. These conditions have been rigorously investigated, but at the moment no self-consistent framework can apply to grains, foams and colloids that may have suddenly ceased to flow. To quantify the jamming behavior, a constitutive model of dense granular flows is deduced from shear-flow experiments. The empirical equations are then generalized, via a thermodynamic approach, into an equation of state for jamming. Notably, the unifying theory also predicts the experimental data on the behavior of molecular glassy liquids. This analogy provides a crucial road map toward a unifying theoretical framework in condensed matter, ranging, for example, from sand to fire retardants to toothpaste.
Patient-specific CFD simulation of intraventricular haemodynamics based on 3D ultrasound imaging.
Bavo, A M; Pouch, A M; Degroote, J; Vierendeels, J; Gorman, J H; Gorman, R C; Segers, P
2016-09-09
The goal of this paper is to present a computational fluid dynamic (CFD) model with moving boundaries to study the intraventricular flows in a patient-specific framework. Starting from the segmentation of real-time transesophageal echocardiographic images, a CFD model including the complete left ventricle and the moving 3D mitral valve was realized. Their motion, known as a function of time from the segmented ultrasound images, was imposed as a boundary condition in an Arbitrary Lagrangian-Eulerian framework. The model allowed for a realistic description of the displacement of the structures of interest and for an effective analysis of the intraventricular flows throughout the cardiac cycle. The model provides detailed intraventricular flow features, and highlights the importance of the 3D valve apparatus for the vortex dynamics and apical flow. The proposed method could describe the haemodynamics of the left ventricle during the cardiac cycle. The methodology might therefore be of particular importance in patient treatment planning to assess the impact of mitral valve treatment on intraventricular flow dynamics.
Noise Production of an Idealized Two-Dimensional Fish School
NASA Astrophysics Data System (ADS)
Wagenhoffer, Nathan; Moored, Keith; Jaworski, Justin
2017-11-01
The analysis of quiet bio-inspired propulsive concepts requires a rapid, unified computational framework that integrates the coupled fluid-solid dynamics of swimmers and their wakes with the resulting noise generation. Such a framework is presented for two-dimensional flows, where the fluid motion is modeled by an unsteady boundary element method with a vortex-particle wake. The unsteady surface forces from the potential flow solver are then passed to an acoustic boundary element solver to predict the radiated sound in low-Mach-number flows. The coupled flow-acoustic solver is validated against canonical vortex-sound problems. A diamond arrangement of four airfoils is subjected to traveling-wave kinematics representing a known idealized pattern for a school of fish, and the airfoil motion and inflow values are derived from the range of Strouhal numbers common to many natural swimmers. The coupled flow-acoustic solver estimates and analyzes the hydrodynamic performance and noise production of the idealized school of swimmers.
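The Strouhal range invoked above is a simple kinematic ratio (natural swimmers typically cruise near St of 0.2 to 0.4). A sketch of how prescribed kinematics might be parameterized from it, with example values chosen for illustration:

```python
def strouhal(freq_hz, peak_to_peak_amp_m, speed_m_s):
    """Strouhal number St = f*A/U for oscillatory propulsion."""
    return freq_hz * peak_to_peak_amp_m / speed_m_s

def flap_frequency(st, peak_to_peak_amp_m, speed_m_s):
    """Invert St to recover the heave frequency driving the kinematics."""
    return st * speed_m_s / peak_to_peak_amp_m

# Example: 0.1 m peak-to-peak heave at 1 m/s inflow, targeting St = 0.2.
f = flap_frequency(0.2, 0.1, 1.0)  # Hz
```

Sweeping St over the biological range while holding amplitude and inflow fixed is one way to generate the family of traveling-wave kinematics the study applies to the four-airfoil school.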
A lava flow simulation model for the development of volcanic hazard maps for Mount Etna (Italy)
NASA Astrophysics Data System (ADS)
Damiani, M. L.; Groppelli, G.; Norini, G.; Bertino, E.; Gigliuto, A.; Nucita, A.
2006-05-01
Volcanic hazard assessment is of paramount importance for safeguarding the resources exposed to volcanic hazards. In this paper we present ELFM, a lava flow simulation model for the evaluation of lava flow hazard on Mount Etna (Sicily, Italy), the most important active volcano in Europe. The major contributions of the paper are: (a) a detailed specification of the lava flow simulation model and of an algorithm implementing it; (b) the definition of a methodological framework for applying the model to the specific volcano. Concerning the former, we propose an extended version of an existing stochastic model that has so far been applied only to the assessment of volcanic hazard on Lanzarote and Tenerife (Canary Islands). Concerning the methodological framework, we claim that model validation is definitely needed for assessing the effectiveness of the lava flow simulation model. To that end, a strategy has been devised for the generation of simulation experiments and the evaluation of their outcomes.
Page, Lindsay C
2012-04-01
Results from MDRC's longitudinal, random-assignment evaluation of career-academy high schools reveal that several years after high-school completion, those randomized to receive the academy opportunity realized a $175 (11%) increase in monthly earnings, on average. In this paper, I investigate the impact of duration of actual academy enrollment, as nearly half of treatment group students either never enrolled or participated for only a portion of high school. I capitalize on data from this experimental evaluation and utilize a principal stratification framework and Bayesian inference to investigate the causal impact of academy participation. This analysis focuses on a sample of 1,306 students across seven sites in the MDRC evaluation. Participation is measured by number of years of academy enrollment, and the outcome of interest is average monthly earnings in the period of four to eight years after high school graduation. I estimate an average causal effect of treatment assignment on subsequent monthly earnings of approximately $588 among males who remained enrolled in an academy throughout high school and more modest impacts among those who participated only partially. Different from an instrumental variables approach to treatment non-compliance, which allows for the estimation of linear returns to treatment take-up, the more general framework of principal stratification allows for the consideration of non-linear returns, although at the expense of additional model-based assumptions.
Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol
2009-01-01
Web portals function as a single point of access to information on the World Wide Web (WWW). A web portal always contacts the portal's gateway for information flow, which causes network traffic over the Internet. Moreover, it provides real-time/dynamic access to stored information, but not access to real-time information. This inherent functionality of web portals limits their role for resource-constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We introduce the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than routed over the Internet. Moreover, our framework enables one-to-one device communication for real-time information flow. To provide an in-depth analysis, we first provide an analytical model of query processing at the servers for our framework-oriented web portal. Finally, we deployed a testbed, one of the world's largest IP-based wireless sensor network testbeds, and observed real-time measurements that demonstrate the efficacy and workability of the proposed framework. PMID:22346693
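The paper's analytical model is not reproduced here, but the intuition behind Local Regions (queries resolved locally never load the gateway) can be illustrated with a toy M/M/1 queueing comparison; the arrival and service rates below are invented for illustration.

```python
def mm1_response(arrival_rate, service_rate):
    """Mean M/M/1 response time W = 1/(mu - lambda); requires mu > lambda."""
    assert service_rate > arrival_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

def mean_latency(total_rate, local_fraction, mu_local, mu_gateway):
    """Average query latency when a fraction of queries is served in the Local Region."""
    w_local = mm1_response(local_fraction * total_rate, mu_local)
    w_remote = mm1_response((1.0 - local_fraction) * total_rate, mu_gateway)
    return local_fraction * w_local + (1.0 - local_fraction) * w_remote

# 8 queries/s; local server faster (20/s) than the Internet gateway path (10/s).
with_regions = mean_latency(8.0, 0.7, 20.0, 10.0)
without_regions = mean_latency(8.0, 0.0, 20.0, 10.0)
```

Diverting local queries both shortens their own path and unloads the gateway queue for the remaining remote queries, which is why the benefit compounds as the local fraction grows.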
Applied Empiricism: Ensuring the Validity of Causal Response to Intervention Decisions
ERIC Educational Resources Information Center
Kilgus, Stephen P.; Collier-Meek, Melissa A.; Johnson, Austin H.; Jaffery, Rose
2014-01-01
School personnel make a variety of decisions within multitiered problem-solving frameworks, including the decision to assign a student to group-based support, to design an individualized support plan, or classify a student as eligible for special education. Each decision is founded upon a judgment regarding whether the student has responded to…
Framework for Disciplinary Writing in Science Grades 6-12: A National Survey
ERIC Educational Resources Information Center
Drew, Sally Valentino; Olinghouse, Natalie G.; Faggella-Luby, Michael; Welsh, Megan E.
2017-01-01
This study investigated the current state of writing instruction in science classes (Grades 6-12). A random sample of certified science teachers from the United States (N = 287) was electronically surveyed. Participants reported on their purposes for teaching writing, the writing assignments most often given to students, use of evidence-based…
Evaluating Intervention Effects in a Diagnostic Classification Model Framework
ERIC Educational Resources Information Center
Madison, Matthew J.; Bradshaw, Laine
2018-01-01
The evaluation of intervention effects is an important objective of educational research. One way to evaluate the effectiveness of an intervention is to conduct an experiment that assigns individuals to control and treatment groups. In the context of pretest/posttest designed studies, this is referred to as a control-group pretest/posttest design.…
Designing a Digital Story Assignment for Basic Writers Using the TPCK Framework
ERIC Educational Resources Information Center
Bandi-Rao, Shoba; Sepp, Mary
2014-01-01
The process of digital storytelling allows basic writers to take a personal narrative and translate it into a multimodal and multidimensional experience, motivating a diverse group of writers with different learning styles to engage more creatively and meaningfully in the writing process. Digital storytelling has the capacity to contextualize…
A Framework for Semantic Group Formation in Education
ERIC Educational Resources Information Center
Ounnas, Asma; Davis, Hugh C.; Millard, David E.
2009-01-01
Collaboration has long been considered an effective approach to learning. However, forming optimal groups can be a time consuming and complex task. Different approaches have been developed to assist teachers allocate students to groups based on a set of constraints. However, existing tools often fail to assign some students to groups creating a…
Enhancing Critical Thinking across the Undergraduate Experience: An Exemplar from Engineering
ERIC Educational Resources Information Center
Ralston, Patricia A.; Bays, Cathy L.
2013-01-01
Faculty in a large, urban school of engineering designed a longitudinal study to assess the critical thinking skills of undergraduate students as they progressed through the engineering program. The Paul-Elder critical thinking framework was used to design course assignments and develop a holistic assessment rubric. The curriculum was re-designed…
ERIC Educational Resources Information Center
Corbi, Alberto; Burgos, Daniel
2017-01-01
This paper presents how virtual containers enhance the implementation of STEAM (science, technology, engineering, arts, and math) subjects as Open Educational Resources (OER). The publication initially summarizes the limitations of delivering open rich learning contents and corresponding assignments to students in college level STEAM areas. The…
Employability, Managerialism, and Performativity in Higher Education: A Relational Perspective
ERIC Educational Resources Information Center
Kalfa, Senia; Taksa, Lucy
2017-01-01
This article combines Bourdieu's concepts of field, habitus and cultural capital with Lyotard's account of performativity to construct a three-tiered framework in order to explore how managerialism has affected the academic habitus. Specifically, this article examines the adoption of group assignments as a means of developing teamwork skills in…
Framework for Sustaining Innovation at Baker Library, Harvard Business School
ERIC Educational Resources Information Center
Dolan, Meghan; Hemment, Michael; Oliver, Stephanie
2017-01-01
Baker Library at Harvard Business School is increasingly asked by the school's faculty to create custom digital information products to enhance course assignments and to find novel ways of electronically disseminating faculty research. In order to prioritize these requests, as well as facilitate, manage, and track the resulting projects, the…
Information Literacy Instruction and Assessment in a Community College: A Collaborative Design
ERIC Educational Resources Information Center
Argüelles, Carlos
2015-01-01
This article describes practical steps taken in the planning of an integrated information literacy instruction linked to a course assignment for community health majors at Kingsborough Community College of the City University of New York. The library sessions integrated the Association of College and Research Libraries Framework for Information…
Tuning: A Guide for Creating Discipline-Specific Frameworks to Foster Meaningful Change
ERIC Educational Resources Information Center
Marshall, David W.
2017-01-01
Tuning, as a methodology, implies a philosophy of curriculum design, pedagogy, and assignment design. It implies that successful study in a discipline depends on intentional construction of learning experiences for students. Intentional construction of learning experiences requires an understanding of the learning goals set forth by faculty for…
Research Knowledge Transfer through Business-Driven Student Assignment
ERIC Educational Resources Information Center
Sas, Corina
2009-01-01
Purpose: The purpose of this paper is to present a knowledge transfer method that capitalizes on both research and teaching dimensions of academic work. It also aims to propose a framework for evaluating the impact of such a method on the involved stakeholders. Design/methodology/approach: The case study outlines and evaluates the six-stage…
The Student as Interpreter: What Do We Mean When We Ask Who Did It?
ERIC Educational Resources Information Center
Steele, Jack
One version of a first year seminar in rhetoric examines the President Kennedy assassination controversy as seen by several writers in a rhetorical framework that stresses the difference, particularly in regard to the writers' approaches to truth, in intellectual and imaginative discourses. The assignments, three major writing projects, introduce…
Promoting Self-Regulation through School-Based Martial Arts Training
ERIC Educational Resources Information Center
Lakes, Kimberley D.; Hoyt, William T.
2004-01-01
The impact of school-based Tae Kwon Do training on self-regulatory abilities was examined. A self-regulation framework including three domains (cognitive, affective, and physical) was presented. Children (N = 207) from kindergarten through Grade 5 were randomly assigned by homeroom class to either the intervention (martial arts) group or a…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-30
... reduce or control overabundant resident Canada geese populations. Increasing the daily bag limit from 8 to 15 geese may help both States reduce or control existing high populations of resident Canada geese... our Migratory Bird Surveys and assigned control number 1018-0023 (expires 4/30/2014). This information...
The Structure of Federal Policy: Deciphering the United States Code
ERIC Educational Resources Information Center
Staller, Karen M.
2004-01-01
CSWE mandates the study of social welfare policy, its history, and its evaluation; thus, assignments that require students to select policy for study are common. Educators provide frameworks for analysis but may not address the prerequisite step of locating policy for study, so students flounder in government documents without the tools to…
Challenges and Opportunities for the School System
ERIC Educational Resources Information Center
Capogna, Stefania
2015-01-01
Educational agencies are going through a difficult transition, with important consequences in terms of de-legitimacy of the social mandate historically assigned to the school. The reason for this widespread unease can be traced to the fact that today we live in a complex system. This paper arises within this problematic framework with the intent…
Perspectives of Fitness and Health in College Men and Women
ERIC Educational Resources Information Center
Waldron, Jennifer J.; Dieser, Rodney B.
2010-01-01
Because many college students engage in low levels of physical activity, the current study used a qualitative framework to interview 11 college students to examine the meaning physically active college students assign to the practice of fitness and health. Students discussed the importance of healthy eating, but that it was difficult to accomplish…
A Unified Framework for Analyzing and Designing for Stationary Arterial Networks
DOT National Transportation Integrated Search
2017-05-17
This research aims to develop a unified theoretical and simulation framework for analyzing and designing signals for stationary arterial networks. Existing traffic flow models used in design and analysis of signal control strategies are either too si...
NASA Astrophysics Data System (ADS)
Destro, Elisa; Amponsah, William; Nikolopoulos, Efthymios I.; Marchi, Lorenzo; Marra, Francesco; Zoccatelli, Davide; Borga, Marco
2018-03-01
The concurrence of flash floods and debris flows is of particular concern, because it may amplify the hazard corresponding to the individual generative processes. This paper presents a coupled modelling framework for the prediction of flash flood response and of the occurrence of debris flows initiated by channel bed mobilization. The framework combines a spatially distributed flash flood response model and a debris flow initiation model to define a threshold value for the peak flow which permits identification of channelized debris flow initiation. The threshold is defined over the channel network as a function of the upslope area and of the local channel bed slope, and it is based on assumptions concerning the properties of the channel bed material and the morphology of the channel network. The model is validated using data from an extreme rainstorm that impacted the 140 km2 Vizze basin in the Eastern Italian Alps on August 4-5, 2012. The results show that the proposed methodology has improved skill in identifying the catchments where debris flows are triggered, compared to the use of simpler thresholds based on rainfall properties.
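The threshold idea, a peak-flow limit that varies over the channel network with upslope area and local bed slope, can be sketched with an illustrative power law. The functional form and coefficients below are hypothetical stand-ins, not the calibrated threshold of the study.

```python
def debris_flow_threshold(upslope_area_km2, slope, k=2.0, a=0.5, b=1.0):
    """Hypothetical threshold peak flow (m^3/s) for bed mobilization.

    Larger upslope area raises the threshold; steeper channels mobilize
    at lower flows (slope enters in the denominator).
    """
    return k * (upslope_area_km2 ** a) / (slope ** b)

def debris_flow_flags(peak_flows, areas_km2, slopes):
    """Flag each channel reach whose simulated peak flow exceeds its threshold."""
    return [q > debris_flow_threshold(a, s)
            for q, a, s in zip(peak_flows, areas_km2, slopes)]
```

Coupling works by running the distributed flood response model to get peak flows per reach, then applying such a threshold map to flag reaches where channelized debris flows may initiate.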
NASA Astrophysics Data System (ADS)
Li, Chang-Feng; Sureshkumar, Radhakrishna; Khomami, Bamin
2015-10-01
Self-consistent direct numerical simulations of turbulent channel flows of dilute polymer solutions exhibiting friction drag reduction (DR) show that an effective Deborah number defined as the ratio of polymer relaxation time to the time scale of fluctuations in the vorticity in the mean flow direction remains O(1) from the onset of DR to the maximum drag reduction (MDR) asymptote. However, the ratio of the convective time scale associated with streamwise vorticity fluctuations to the vortex rotation time decreases with increasing DR, and the maximum drag reduction asymptote is achieved when these two time scales become nearly equal. Based on these observations, a simple framework is proposed that adequately describes the influence of polymer additives on the extent of DR from the onset of DR to MDR as well as the universality of the MDR in wall-bounded turbulent flows with polymer additives.
NASA Technical Reports Server (NTRS)
Venkatachari, Balaji Shankar; Streett, Craig L.; Chang, Chau-Lyan; Friedlander, David J.; Wang, Xiao-Yen; Chang, Sin-Chung
2016-01-01
Despite decades of development of unstructured mesh methods, high-fidelity time-accurate simulations are still predominantly carried out on structured or unstructured hexahedral meshes by using high-order finite-difference, weighted essentially non-oscillatory (WENO), or hybrid schemes formed by their combinations. In this work, the space-time conservation element solution element (CESE) method is used to simulate several flow problems, including supersonic jet/shock interaction and its impact on launch vehicle acoustics, and direct numerical simulations of turbulent flows using tetrahedral meshes. This paper provides a status report for the continuing development of the CESE numerical and software framework under the Revolutionary Computational Aerosciences (RCA) project. Solution accuracy and large-scale parallel performance of the numerical framework are assessed with the goal of providing a viable paradigm for future high-fidelity flow physics simulations.
Swept shock/boundary-layer interactions: Scaling laws, flowfield structure, and experimental methods
NASA Technical Reports Server (NTRS)
Settles, Gary S.
1993-01-01
A general review is given of several decades of research on the scaling laws and flowfield structures of swept shock wave/turbulent boundary layer interactions. Attention is further restricted to the experimental study and physical understanding of the steady-state aspects of these flows. The interaction produced by a sharp, upright fin mounted on a flat plate is taken as an archetype. An overall framework of quasiconical symmetry describing such interactions is first developed. Boundary-layer separation, the interaction footprint, Mach number scaling, and Reynolds number scaling are then considered, followed by a discussion of the quasiconical similarity of interactions produced by geometrically-dissimilar shock generators. The detailed structure of these interaction flowfields is next reviewed, and is illustrated by both qualitative visualizations and quantitative flow images in the quasiconical framework. Finally, the experimental techniques used to investigate such flows are reviewed, with emphasis on modern non-intrusive optical flow diagnostics.
Embedded Systems and TensorFlow Frameworks as Assistive Technology Solutions.
Mulfari, Davide; Palla, Alessandro; Fanucci, Luca
2017-01-01
This paper presents the design of a deep-learning-based wearable computer vision system for visually impaired users. The Assistive Technology solution exploits a powerful single-board computer and smart glasses with a camera to allow the user to explore objects in the surrounding environment, while it employs the Google TensorFlow machine learning framework to classify the acquired stills in real time. The proposed aid can therefore increase awareness of the explored environment, and it interacts with its user by means of audio messages.
Versatile, High Quality and Scalable Continuous Flow Production of Metal-Organic Frameworks
Rubio-Martinez, Marta; Batten, Michael P.; Polyzos, Anastasios; Carey, Keri-Constanti; Mardel, James I.; Lim, Kok-Seng; Hill, Matthew R.
2014-01-01
Further deployment of Metal-Organic Frameworks in applied settings requires their ready preparation at scale. Expansion of typical batch processes can lead to unsuccessful or low quality synthesis for some systems. Here we report how continuous flow chemistry can be adapted as a versatile route to a range of MOFs, by emulating conditions of lab-scale batch synthesis. This delivers ready synthesis of three different MOFs, with surface areas that closely match theoretical maxima, with production rates of 60 g/h at extremely high space-time yields. PMID:24962145
Araujo, Reno R; Ginther, O J
2009-01-01
To assess the vascular effects of detomidine and xylazine in pony mares and heifers, respectively, as determined in a major artery and by extent of vascular perfusion of reproductive organs. 10 pony mares and 10 Holstein heifers. Pony mares were assigned to receive physiologic saline (0.9% NaCl) solution (n = 5) or detomidine (3.0 mg/mare, IV; 5). Heifers were assigned to receive saline solution (5) or xylazine (14 mg/heifer, IM; 5). Color Doppler ultrasonographic examinations were performed immediately before and 10 minutes after administration of saline solution or sedative. In spectral Doppler mode, a spectral graph of blood flow velocities during a cardiac cycle was obtained at the internal iliac artery and at the ovarian pedicle. In color-flow mode, color signals of blood flow in vessels of the corpus luteum and endometrium were assessed. Systemic effects of sedation in the 2 species were evident as a decrease in heart rate; increase in duration of systole, diastole, or both; decrease in volume of blood flow; and decrease in velocity of blood flow within the internal iliac artery. However, an effect of sedatives on local vascular perfusion in the ovaries and endometrium was not detected. Sedation with detomidine in pony mares and xylazine in heifers did not affect vascular perfusion in reproductive organs. These sedatives can be used in experimental and clinical color Doppler evaluations of vascular perfusion of the corpus luteum and endometrium.
Derivation of aerodynamic kernel functions
NASA Technical Reports Server (NTRS)
Dowell, E. H.; Ventres, C. S.
1973-01-01
The method of Fourier transforms is used to determine the kernel function which relates the pressure on a lifting surface to the prescribed downwash within the framework of Dowell's (1971) shear flow model. This model is intended to improve upon the potential flow aerodynamic model by allowing for the aerodynamic boundary layer effects neglected in the potential flow model. For simplicity, incompressible, steady flow is considered. The proposed method is illustrated by deriving known results from potential flow theory.
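In generic lifting-surface form, the relation referred to above between the prescribed downwash w and the unknown lifting pressure difference Δp is an integral over the planform S (a schematic statement: normalization constants and flow-condition dependence are absorbed into the kernel K):

```latex
w(x, y) = \iint_S K(x - \xi,\; y - \eta)\,\Delta p(\xi, \eta)\; d\xi\, d\eta
```

Dowell's shear flow model modifies K so that the boundary-layer effects neglected by the potential flow kernel are retained.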
Liu, Ruolin; Dickerson, Julie
2017-11-01
We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data sets. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracies. In an evaluation on a real data set, transcript expression estimated by Strawberry had the highest correlation with Nanostring probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
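Strawberry's assembly step selects likely transcripts as paths through a splicing graph. As a toy illustration of that idea (not Strawberry's actual algorithm or data structures), the sketch below scores each source-to-sink path of a small splicing DAG by its bottleneck junction read support and picks the best-supported path:

```python
def all_paths(graph, src, dst, path=None):
    """Enumerate every src->dst path in a small DAG given as {node: {succ: weight}}."""
    path = (path or []) + [src]
    if src == dst:
        yield path
        return
    for succ in graph.get(src, {}):
        yield from all_paths(graph, succ, dst, path)

def best_transcript(graph, src, dst):
    """Pick the path with the largest bottleneck read support (min edge weight)."""
    def support(path):
        return min(graph[a][b] for a, b in zip(path, path[1:]))
    return max(all_paths(graph, src, dst), key=support)

# Toy splicing graph: nodes are exons, edge weights are junction read counts.
splice = {
    "e1": {"e2": 30, "e3": 5},
    "e2": {"e4": 25},
    "e3": {"e4": 4},
    "e4": {},
}
print(best_transcript(splice, "e1", "e4"))  # -> ['e1', 'e2', 'e4']
```

A full assembler would report several paths with percent-spliced-in weighting; the bottleneck criterion here only illustrates why a flow formulation is natural on splicing graphs.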
NASA Astrophysics Data System (ADS)
Aemisegger, Franziska; Piaget, Nicolas
2017-04-01
A new weather-system oriented classification framework of extreme precipitation events leading to large-scale floods in Switzerland is presented on this poster. Thirty-six high-impact floods in the last 130 years are assigned to three representative categories of atmospheric moisture origin and transport patterns. The methodology underlying this moisture source classification combines information on the airmass history in the twenty days preceding the precipitation event with humidity variations along the large-scale atmospheric transport systems in a Lagrangian approach. The classification scheme is defined using the 33-year ERA-Interim reanalysis dataset (1979-2011) and is then applied to extreme precipitation events in the Twentieth Century Reanalysis (1871-2011) as well as to the 36 selected floods. The three defined categories are characterised by different dominant moisture uptake regions, including the North Atlantic, the Mediterranean and continental Europe. Furthermore, distinct anomalies in the large-scale atmospheric flow are associated with the different categories. The temporal variations in the relative importance of the three categories over the last 130 years provide new insights into the impact of changing climate conditions on the dynamical mechanisms leading to heavy precipitation in Switzerland.
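Lagrangian moisture-source diagnostics of this kind typically follow the specific humidity q of an air parcel along a backward trajectory and treat increases of q between trajectory steps as moisture uptakes at those locations. A simplified sketch of that bookkeeping (the trajectory data and the uptake threshold are invented for illustration; operational diagnostics add corrections for en-route precipitation):

```python
def moisture_uptakes(trajectory, dq_min=0.0002):
    """Identify moisture uptake locations along an air-parcel trajectory.

    trajectory: list of (lon, lat, q) tuples ordered forward in time, with q
    the specific humidity in kg/kg. A step where q increases by more than
    dq_min (here an assumed 0.2 g/kg threshold) is counted as an uptake.
    """
    uptakes = []
    for (lon0, lat0, q0), (lon1, lat1, q1) in zip(trajectory, trajectory[1:]):
        dq = q1 - q0
        if dq > dq_min:
            uptakes.append({"lon": lon1, "lat": lat1, "dq": dq})
    return uptakes

# Invented 4-step trajectory ending over Switzerland: one clear uptake over
# the North Atlantic (q rises from 4 to 7 g/kg), then moisture loss en route.
traj = [(-40.0, 45.0, 0.004), (-25.0, 46.0, 0.007),
        (-5.0, 46.5, 0.006), (8.0, 46.8, 0.005)]
print(moisture_uptakes(traj))  # one uptake, located at (-25.0, 46.0)
```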
Framework and tools for agricultural landscape assessment relating to water quality protection.
Gascuel-Odoux, Chantal; Massa, Florence; Durand, Patrick; Merot, Philippe; Troccaz, Olivier; Baudry, Jacques; Thenail, Claudine
2009-05-01
While many scientific studies show the influence of agricultural landscape patterns on the water cycle and water quality, only a few have proposed scientifically based and operational methods to improve water management. Territ'eau is a framework developed to adapt agricultural landscapes for water quality protection, using components such as farmers' fields, seminatural areas, and human infrastructures, which can act as sources, sinks, or buffers with respect to water quality. This framework allows us to delimit active areas contributing to water quality, defined by the following three characteristics: (i) the dominant hydrological processes and their flow pathways, (ii) the characteristics of each considered pollutant, and (iii) the main landscape features. These areas are delineated by analyzing the flow connectivity from the stream to the croplands and by assessing the buffer functions of seminatural areas according to their flow pathways. Hence, this framework allows us to identify functional seminatural areas in terms of water quality and to assess their limits and functions; it helps in proposing different approaches for changing agricultural landscapes, acting on agricultural practices or systems, and/or conserving or rebuilding seminatural areas in controversial landscapes. Finally, it allows us to objectively characterize the functions of the landscape components, in order to adapt them to new environmental constraints.
A framework to simulate small shallow inland water bodies in semi-arid regions
NASA Astrophysics Data System (ADS)
Abbasi, Ali; Ohene Annor, Frank; van de Giesen, Nick
2017-12-01
In this study, a framework for simulating the flow field and heat transfer processes in small shallow inland water bodies has been developed. As the dynamics and thermal structure of these water bodies are crucial for studying the quality of the stored water, and for assessing the heat fluxes from their surfaces as well, heat transfer and temperature were modeled. The proposed model is able to simulate the full 3-D water flow and heat transfer in the water body under complex and time-varying boundary conditions. In this model, the continuity, momentum and temperature equations, together with turbulence equations that include the buoyancy effect, are solved. The model is built on the Reynolds-Averaged Navier-Stokes (RANS) equations with the widely used Boussinesq approach to model turbulence in the flow field. Micrometeorological data were obtained from an Automatic Weather Station (AWS) installed on the site and combined with field bathymetric measurements for the model. In the framework developed, a simple, applicable and generalizable approach is proposed for preparing the geometry of small shallow water bodies from coarsely measured bathymetry. All parts of the framework are based on open-source tools, which is essential for developing countries.
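The Boussinesq approach cited above for the RANS closure is the eddy-viscosity hypothesis, which models the Reynolds stresses in analogy to viscous stresses:

```latex
-\overline{u'_i u'_j} \;=\; \nu_t \left( \frac{\partial \bar{u}_i}{\partial x_j} + \frac{\partial \bar{u}_j}{\partial x_i} \right) - \frac{2}{3}\, k\, \delta_{ij}
```

where ν_t is the eddy viscosity and k = ½ \overline{u'_i u'_i} is the turbulent kinetic energy; buoyancy then typically enters the turbulence equations as an additional production term driven by the vertical temperature (density) gradient.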
A numerical framework for the direct simulation of dense particulate flow under explosive dispersal
NASA Astrophysics Data System (ADS)
Mo, H.; Lien, F.-S.; Zhang, F.; Cronin, D. S.
2018-05-01
In this paper, we present a Cartesian grid-based numerical framework for the direct simulation of dense particulate flow under explosive dispersal. This numerical framework is established through the integration of the following numerical techniques: (1) operator splitting for partitioned fluid-solid interaction in the time domain, (2) the second-order SSP Runge-Kutta method and third-order WENO scheme for temporal and spatial discretization of governing equations, (3) the front-tracking method for evolving phase interfaces, (4) a field function proposed for low-memory-cost multimaterial mesh generation and fast collision detection, (5) an immersed boundary method developed for treating arbitrarily irregular and changing boundaries, and (6) a deterministic multibody contact and collision model. Employing the developed framework, this paper further studies particle jet formation under explosive dispersal by considering the effects of particle properties, particulate payload morphologies, and burster pressures. By the simulation of the dispersal processes of dense particle systems driven by pressurized gas, in which the driver pressure reaches 1.01325 × 10^10 Pa (10^5 times the ambient pressure) and particles are impulsively accelerated from stationary to a speed of more than 12000 m/s within 15 μs, it is demonstrated that the presented framework is able to effectively resolve coupled shock-shock, shock-particle, and particle-particle interactions in complex fluid-solid systems with shocked flow conditions, arbitrarily irregular particle shapes, and realistic multibody collisions.
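Of the building blocks listed above, the second-order strong-stability-preserving (SSP) Runge-Kutta integrator is simple to state: it is a convex combination of two forward-Euler stages. A minimal sketch for a generic ODE du/dt = L(u) (the test problem and step size are illustrative, not from the paper):

```python
import math

def ssp_rk2_step(L, u, dt):
    """One step of the second-order SSP Runge-Kutta (Heun/Shu-Osher) scheme."""
    u1 = u + dt * L(u)                        # first forward-Euler stage
    return 0.5 * u + 0.5 * (u1 + dt * L(u1))  # convex combination with second stage

# Illustrative test problem: du/dt = -u, exact solution u(t) = exp(-t).
u = 1.0
dt = 0.01
for _ in range(100):                          # integrate to t = 1
    u = ssp_rk2_step(lambda x: -x, u, dt)
print(abs(u - math.exp(-1.0)) < 1e-4)         # True: the scheme is second order
```

The convex-combination form is what makes the scheme "strong-stability-preserving": any stability bound satisfied by a single forward-Euler step (as in the WENO spatial discretization) carries over to the full step.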
PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.
Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh
2016-08-01
Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. 
© 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
Petri net modeling of encrypted information flow in federated cloud
NASA Astrophysics Data System (ADS)
Khushk, Abdul Rauf; Li, Xiaozhong
2017-08-01
Cost-effective cloud systems typically combine secure private clouds with less secure public clouds. The need to locate applications in different clouds poses a security risk to the information flow of the entire system. This study addresses that risk by assigning security levels of a given lattice to the entities of a federated cloud system. A dynamic, flow-sensitive security model featuring Bell-LaPadula procedures is explored that tracks and authenticates secure information flow in federated clouds. Additionally, a Petri net model is considered as a case study to represent the proposed system and further validate its performance.
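The Bell-LaPadula procedures referenced above reduce to two lattice comparisons: a subject may read an object only if the subject's level dominates the object's ("no read up"), and may write only if the object's level dominates the subject's ("no write down"). A minimal sketch over a totally ordered lattice (the level names are illustrative, not from the paper):

```python
# Illustrative linear security lattice, low to high.
LEVELS = {"public": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def dominates(a, b):
    """True if level a is at least as high as level b in the lattice."""
    return LEVELS[a] >= LEVELS[b]

def may_read(subject_level, object_level):
    """Bell-LaPadula simple security property: no read up."""
    return dominates(subject_level, object_level)

def may_write(subject_level, object_level):
    """Bell-LaPadula *-property: no write down."""
    return dominates(object_level, subject_level)

print(may_read("secret", "public"))   # True: reading down is allowed
print(may_write("secret", "public"))  # False: writing down would leak
```

In a federated cloud these checks would be evaluated at each Petri net transition that moves data between entities with different assigned levels.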
Freight Transportation Energy Use : Volume 2. Methodology and Program Documentation.
DOT National Transportation Integrated Search
1978-07-01
The structure and logic of the transportation network model component of the TSC Freight Energy Model are presented. The model assigns given origin-destination commodity flows to specific transport modes and routes, thereby determining the traffic lo...
Propagation of Disturbances in Traffic Flow
DOT National Transportation Integrated Search
1977-09-01
The system-optimized static traffic-assignment problem in a freeway corridor network is the problem of choosing a distribution of vehicles in the network to minimize average travel time. It is of interest to know how sensitive the optimal steady-stat...
Jiménez-Mejías, Pedro; Martinetto, Edoardo
2013-08-01
Despite growing interest in the systematics and evolution of the hyperdiverse genus Carex, few studies have focused on its evolution using an absolute time framework. This is partly due to the limited knowledge of the fossil record. However, Carex fruits are not rare in certain sediments. We analyzed carpological features of modern materials from Carex sect. Phacocystis to characterize the fossil record taxonomically. We studied 374 achenes from modern materials (18 extant species), as well as representatives from related groups, to establish the main traits within and among species. We also studied 99 achenes from sediments of living populations to assess their modification process after decay. Additionally, we characterized 145 fossil achenes from 10 different locations (from 4-0.02 mya), whose taxonomic assignment we discuss. Five main characters were identified for establishing morphological groups of species (epidermis morphology, achene-utricle attachment, achene base, style robustness, and pericarp section). Eleven additional characters allowed the discrimination at species level of most of the taxa. Fossil samples were assigned to two extant species and one unknown, possibly extinct species. The analysis of fruit characters allows the distinction of groups, even up to species level. Carpology is revealed as an accurate tool in Carex paleotaxonomy, which could allow the characterization of Carex fossil fruits and assign them to subgeneric or sectional categories, or to certain species. Our conclusions could be crucial for including a temporal framework in the study of the evolution of Carex.
McGowan, Erin L; Prapavessis, Harry
2010-12-01
Using a Protection Motivation Theory (PMT) framework, this study examined whether factual colon cancer information is a meaningful source of exercise motivation for relatives of patients with colon cancer. One hundred sixty-six inactive relatives were randomly assigned to one of two treatment conditions: PMT group (intervention); and non-PMT group (attention control). At baseline (T1) participants completed demographic information, a questionnaire designed to assess their beliefs toward exercise and colon cancer as well as their exercise intentions. At T2 (one week following T1) participants watched one of two DVD videos that were created for the study. The intervention DVD contained exercise and colon cancer information that was yoked within the four major components of PMT: perceived vulnerability (PV); perceived severity (PS); response efficacy (RE); and self-efficacy (SE), while the attention control DVD contained general diet and cancer information. Immediately following watching the DVD, participants completed the same measures as in T1. Participants assigned to the PMT intervention group showed significant improvement in PV, RE, SE and exercise intentions, whereas participants assigned to the attention control group showed significant improvement only in RE. RE, SE, and PS made significant and unique contributions to prediction of exercise intention. Overall, the results of the present study demonstrate that a single exposure media intervention grounded in a PMT framework can change individuals' exercise and colon cancer beliefs, as well as change their exercise intentions. Implications of these findings and direction for future research are discussed.
Comparison of PIV with 4D-Flow in a physiological accurate flow phantom
NASA Astrophysics Data System (ADS)
Sansom, Kurt; Balu, Niranjan; Liu, Haining; Aliseda, Alberto; Yuan, Chun; Canton, Maria De Gador
2016-11-01
Validation of 4D MRI flow sequences with planar particle image velocimetry (PIV) is performed in a physiologically accurate flow phantom. A patient-specific phantom of a carotid artery is connected to a pulsatile flow loop to simulate the 3D unsteady flow in the cardiovascular anatomy. Cardiac-cycle-synchronized MRI provides time-resolved 3D blood velocity measurements, a promising clinical tool that nonetheless lacks a robust validation framework. PIV at three different Reynolds numbers (540, 680, and 815, chosen based on +/- 20% of the average velocity from the patient-specific CCA waveform) and four different Womersley numbers (3.30, 3.68, 4.03, and 4.35, chosen to reflect a physiological range of heart rates) is compared to 4D-MRI measurements. An accuracy assessment of raw velocity measurements and a comparison of estimated and measurable flow parameters, such as wall shear stress, fluctuating velocity rms, and Lagrangian particle residence time, will be presented, with justification of their biomechanical relevance to the pathophysiology of arterial disease: atherosclerosis and intimal hyperplasia. Lastly, the framework is applied to a new 4D-Flow MRI sequence and post-processing techniques to provide a quantitative assessment against the benchmarked data. Department of Education GAANN Fellowship.
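The two dimensionless groups swept in the study above are straightforward to compute. The sketch below uses illustrative blood-like property values and an invented vessel geometry, not the phantom's actual parameters:

```python
import math

def reynolds(rho, U, D, mu):
    """Re = rho*U*D/mu: ratio of inertial to viscous forces."""
    return rho * U * D / mu

def womersley(R, omega, rho, mu):
    """Wo = R*sqrt(omega*rho/mu): pulsatile vs viscous time scales."""
    return R * math.sqrt(omega * rho / mu)

# Illustrative values: blood-like fluid in a 6 mm carotid at 60 beats/min.
rho, mu = 1060.0, 3.5e-3   # density (kg/m^3), dynamic viscosity (Pa*s)
D, U = 6e-3, 0.4           # vessel diameter (m), mean velocity (m/s)
omega = 2 * math.pi * 1.0  # angular heart rate (rad/s at 1 Hz)

print(round(reynolds(rho, U, D, mu)))             # ~727
print(round(womersley(D / 2, omega, rho, mu), 2))  # ~4.14
```

These illustrative values land inside the paper's swept ranges (Re 540-815, Wo 3.30-4.35), which is why varying mean velocity sweeps Re while varying heart rate sweeps Wo.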
Gukov, Sergei
2016-01-05
Here, interpreting renormalization group flows as solitons interpolating between different fixed points, we ask various questions that are normally asked in soliton physics but not in renormalization theory. Can one count RG flows? Are there different "topological sectors" for RG flows? What is the moduli space of an RG flow, and how does it compare to familiar moduli spaces of (supersymmetric) domain walls? Analyzing these questions in a wide variety of contexts -- from counting RG walls to the AdS/CFT correspondence -- will not only provide favorable answers, but will also lead us to a unified general framework that is powerful enough to account for peculiar RG flows and predict new physical phenomena. Namely, using Bott's version of Morse theory we relate the topology of conformal manifolds to certain properties of RG flows that can be used as precise diagnostics and "topological obstructions" for the strong form of the C-theorem in any dimension. Moreover, this framework suggests a precise mechanism for how the violation of the strong C-theorem happens and predicts "phase transitions" along the RG flow when the topological obstruction is non-trivial. Along the way, we also find new conformal manifolds in well-known 4d CFTs and point out connections with the superconformal index and classifying spaces of global symmetry groups.
Real Time Global Tests of the ALICE High Level Trigger Data Transport Framework
NASA Astrophysics Data System (ADS)
Becker, B.; Chattopadhyay, S.; Cicalo, C.; Cleymans, J.; de Vaux, G.; Fearick, R. W.; Lindenstruth, V.; Richter, M.; Rohrich, D.; Staley, F.; Steinbeck, T. M.; Szostak, A.; Tilsner, H.; Weis, R.; Vilakazi, Z. Z.
2008-04-01
The High Level Trigger (HLT) system of the ALICE experiment is an online event filter and trigger system designed for input bandwidths of up to 25 GB/s at event rates of up to 1 kHz. The system is designed as a scalable PC cluster, comprising several hundred nodes. The transport of data in the system is handled by an object-oriented data flow framework operating on the basis of the publisher-subscriber principle, designed to be fully pipelined with minimal processing overhead and communication latency in the cluster. In this paper, we report the latest measurements in which this framework was operated on five different sites over a global north-south link extending more than 10,000 km, processing a "real-time" data flow.
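The publisher-subscriber principle underlying the HLT data-transport framework decouples data producers from consumers: a publisher announces data blocks, and any number of subscribers receive them, so processing stages can be chained across nodes. A minimal in-process sketch of the pattern (the ALICE framework itself is a distributed C++ system; this toy shows only the principle):

```python
class Publisher:
    """Minimal publisher: pushes announced events to every subscriber."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, event):
        for callback in self._subscribers:
            callback(event)

# Chain two processing stages, as the HLT chains cluster nodes.
raw = Publisher()
filtered = Publisher()
received = []

raw.subscribe(lambda ev: filtered.publish({**ev, "filtered": True}))
filtered.subscribe(received.append)

raw.publish({"event_id": 1})
print(received)  # [{'event_id': 1, 'filtered': True}]
```

Because publishers never name their consumers, the same topology can be pipelined: each stage starts on the next event as soon as it hands the current one off.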
Scott-Hamilton, John; Schutte, Nicola S; Brown, Rhonda F
2016-03-01
This study investigated whether mindfulness training increases athletes' mindfulness and flow experience and decreases sport-specific anxiety and sport-specific pessimism. Cyclists were assigned to an eight-week mindfulness intervention, which incorporated a mindful spin-bike training component, or a wait-list control condition. Participants completed baseline and post-test measures of mindfulness, flow, sport-anxiety, and sport-related pessimistic attributions. Analyses of covariance showed significant positive effects on mindfulness, flow, and pessimism for the 27 cyclists in the mindfulness intervention condition compared with the 20 cyclists in the control condition. Changes in mindfulness experienced by the intervention participants were positively associated with changes in flow. Results suggest that mindfulness-based interventions tailored to specific athletic pursuits can be effective in facilitating flow experiences. © 2016 The International Association of Applied Psychology.
NASA Astrophysics Data System (ADS)
de Jong, Floor; van Hillegersberg, Jos; van Eck, Pascal; van der Kolk, Feiko; Jorissen, Rene
The lack of effective IT governance is widely recognized as a key inhibitor to successful global IT outsourcing relationships. In this study we present the development and application of a governance framework to improve outsourcing relationships. The approach used to develop an IT governance framework includes a meta model and a customization process to fit the framework to the target organization. The IT governance framework consists of four elements: (1) organisational structures, (2) joint processes between in- and outsourcer, (3) responsibilities that link roles to processes, and (4) a diverse set of control indicators to measure the success of the relationship. The IT governance framework was put into practice in Shell GFIT BAM, a part of Shell that had concluded it lacked management control over at least one of its outsourcing relationships. In a workshop, the governance framework was used to perform a gap analysis between the current and desired governance. Several gaps were identified in the way roles and responsibilities are assigned and joint processes are set up. Moreover, this workshop also showed the usefulness and usability of the IT governance framework in structuring, providing input to, and managing stakeholders in discussions around IT governance.
Knowles, Judie M; Gray, Morag A
2011-11-01
This paper commences with affirmation of the importance of research critique within academic programmes of study, and the context of this skill within the nursing profession. Judie (student) shares an experience from a Professional Doctorate in Education (EdD) assignment that involved selecting and critiquing a piece of published research. "The qualities of an effective mentor" (Gray and Smith, 2000) was critiqued using the Critical Appraisal Skills Programme (CASP, 2006) framework. Morag was the researcher and co-author (Gray and Smith, 2000) and was subsequently contacted by Judie for the purposes of validating her critique assignment. On the tenth anniversary of the publication of her PhD research findings, Morag reflects on the original article in the light of Judie's critique and shares evaluative comments. Some of the assignment critique is validated by Morag, while some of the evaluation demonstrates unreliability in Judie's critique. A discussion follows on whether systematic examination of a published article, as opposed to an original research report such as a thesis, is sufficient for research critique. The student and researcher/author reveal their learning from this collaborative experience and conclude with recommendations for setting critique assignments, for authors publishing their research findings, and for students undertaking critique assignments. Copyright © 2011 Elsevier Ltd. All rights reserved.
RSTensorFlow: GPU Enabled TensorFlow for Deep Learning on Commodity Android Devices
Alzantot, Moustafa; Wang, Yingnan; Ren, Zhengshuang; Srivastava, Mani B.
2018-01-01
Mobile devices have become an essential part of our daily lives. By virtue of both their increasing computing power and the recent progress made in AI, mobile devices have evolved to act as intelligent assistants in many tasks rather than a mere way of making phone calls. However, popular and commonly used tools and frameworks for machine intelligence still lack the ability to make proper use of the available heterogeneous computing resources on mobile devices. In this paper, we study the benefits of utilizing the heterogeneous (CPU and GPU) computing resources available on commodity Android devices while running deep learning models. We leveraged the heterogeneous computing framework RenderScript to accelerate the execution of deep learning models on commodity Android devices. Our system is implemented as an extension to the popular open-source framework TensorFlow. By integrating our acceleration framework tightly into TensorFlow, machine learning engineers can now easily benefit from the heterogeneous computing resources on mobile devices without the need for any extra tools. We evaluate our system on different Android phone models to study the trade-offs of running different neural network operations on the GPU. We also compare the performance of running different model architectures, such as convolutional and recurrent neural networks, on the CPU only versus using heterogeneous computing resources. Our results show that the GPUs on the phones are capable of offering substantial performance gains in matrix multiplication. Therefore, models that involve multiplication of large matrices can run much faster (approx. 3 times faster in our experiments) with GPU support. PMID:29629431
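The RenderScript integration described above amounts to dispatching heavy operations, chiefly matrix multiplication, to the best available backend and falling back to the CPU otherwise. A minimal backend-dispatch sketch in plain Python (the "gpu" entry here is a stand-in stub, not RenderScript or TensorFlow code):

```python
def matmul_cpu(a, b):
    """Naive pure-Python matrix multiply over row-major lists of lists."""
    cols_b = list(zip(*b))
    return [[sum(x * y for x, y in zip(row, col)) for col in cols_b]
            for row in a]

# Registry of available backends; a real system would probe the device and
# register a GPU (e.g. RenderScript) kernel here instead of the stub.
BACKENDS = {"cpu": matmul_cpu, "gpu": matmul_cpu}  # "gpu" is a stand-in

def matmul(a, b, prefer="gpu"):
    """Dispatch to the preferred backend, falling back to the CPU."""
    return BACKENDS.get(prefer, BACKENDS["cpu"])(a, b)

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

The dispatch table is the essential design choice: per-operation backend selection lets matrix-heavy layers use the GPU while operations that do not benefit stay on the CPU.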
Saver, J L; Jahan, R; Levy, E I; Jovin, T G; Baxter, B; Nogueira, R; Clark, W; Budzik, R; Zaidat, O O
2014-07-01
Self-expanding stent retrievers are a promising new device class designed for rapid flow restoration in acute cerebral ischaemia. The SOLITAIRE™ Flow Restoration device (SOLITAIRE) has shown high rates of recanalization in preclinical models and in uncontrolled clinical series. (1) To demonstrate non-inferiority of SOLITAIRE compared with a legally marketed device, the MERCI Retrieval System®; (2) To demonstrate safety, feasibility, and efficacy of SOLITAIRE in subjects requiring mechanical thrombectomy diagnosed with acute ischaemic stroke. DESIGN: Multicenter, randomized, prospective, controlled trial with blinded primary end-point ascertainment. Key entry criteria include: age 22-85; National Institutes of Health Stroke Scale (NIHSS) score ≥8 and <30; clinical and imaging findings consistent with acute ischaemic stroke; patient ineligible for or failed intravenous tissue plasminogen activator; accessible occlusion in the M1 or M2 middle cerebral artery, internal carotid artery, basilar artery, or vertebral artery; and patient able to be treated within 8 h of onset. Sites first participate in a roll-in phase, treating two patients with the SOLITAIRE device, before proceeding to the randomized phase. In patients unresponsive to the initially assigned therapy, after the angiographic component of the primary end-point is ascertained (reperfusion with the initially assigned device), rescue therapy with other reperfusion techniques is permitted. The primary efficacy end-point is successful recanalization with the assigned study device (no use of rescue therapy) and with no symptomatic intracranial haemorrhage. Successful recanalization is defined as achieving Thrombolysis In Myocardial Infarction (TIMI) grade 2 or 3 flow in all treatable vessels. The primary safety end-point is the incidence of device-related and procedure-related serious adverse events. A major secondary efficacy end-point is time to achieve initial recanalization.
Additional secondary end-points include clinical outcomes at 90 days and radiologic haemorrhagic transformation. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
A semi-supervised learning framework for biomedical event extraction based on hidden topics.
Zhou, Deyu; Zhong, Dayou
2015-05-01
Scientists have devoted decades of effort to understanding interactions between proteins or RNA production. This information might enrich current knowledge of drug reactions or of the development of certain diseases. Nevertheless, the lack of explicit structure in life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely; however, such corpora are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning is therefore a feasible solution for biomedical event extraction and has attracted growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in them are used to describe the distance. The sentences and their newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction when compared to the state-of-the-art approach.
The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely described by hidden topics and structures of the sentences. Copyright © 2015 Elsevier B.V. All rights reserved.
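The distance-based annotation step can be sketched as follows (a minimal illustration, not the paper's exact formulation: the feature vectors, the cosine similarity, and the threshold are hypothetical stand-ins for the structural and hidden-topic measures described above):

```python
import numpy as np

def assign_event_annotations(unlab_vecs, lab_vecs, lab_events, threshold=0.8):
    """Assign each un-annotated sentence the event annotation of its
    nearest annotated sentence, where each sentence is represented by
    a vector combining structural features and hidden-topic proportions.
    Sentences whose best similarity falls below `threshold` stay unlabeled."""
    assigned = []
    for u in unlab_vecs:
        sims = lab_vecs @ u / (np.linalg.norm(lab_vecs, axis=1) * np.linalg.norm(u))
        j = int(np.argmax(sims))
        assigned.append(lab_events[j] if sims[j] >= threshold else None)
    return assigned

# Toy feature vectors (illustrative only)
labeled = np.array([[1.0, 0.0, 0.2], [0.1, 1.0, 0.0]])
events = ["Gene_expression", "Phosphorylation"]
unlabeled = np.array([[0.9, 0.1, 0.2], [0.0, 0.1, 1.0]])
print(assign_event_annotations(unlabeled, labeled, events))
# ['Gene_expression', None]
```

The newly labeled sentences would then be pooled with the annotated corpus for training, as described above.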
Applying the payoff time framework to carotid artery disease management.
Yuo, Theodore H; Roberts, Mark S; Braithwaite, R Scott; Chang, Chung-Chou H; Kraemer, Kevin L
2013-11-01
Asymptomatic stenosis of the carotid arteries is associated with stroke. Carotid revascularization can reduce the future risk of stroke but can also trigger an immediate stroke. The objective was to model the generic relationship between immediate risk, long-term benefit, and life expectancy for any one-time prophylactic treatment and then apply the model to the use of revascularization in the management of asymptomatic carotid disease. In the "payoff time" framework, the possibility of losing quality-adjusted life-years (QALYs) because of revascularization failure is conceptualized as an "investment" that is, on average, eventually recouped over time. Using this framework, we developed simple mathematical forms that define relationships between the following: perioperative probability of stroke (P); annual stroke rate without revascularization (r0); annual stroke rate after revascularization, conditional on not having suffered perioperative stroke (r1); utility levels assigned to the asymptomatic state (ua) and stroke state (us); and mortality rates (λ). In patients whose life expectancy is below a critical life expectancy (CLE = P/((1-P)r0 - r1)), the "investment" will never pay off, and revascularization will lead to loss of QALYs, on average. CLE is independent of the utilities assigned to the health states as long as a rank ordering exists in which ua > us. For clinically relevant values (P = 3%, r0 = 1%, r1 = 0.5%), the CLE is approximately 6.4 years, which is longer than published guidelines regarding patient selection for revascularization. In managing asymptomatic carotid disease, the payoff time framework specifies a CLE beneath which patients, on average, will not benefit from revascularization. This formula is suitable for clinical use at the patient's bedside and can account for patient variability, the skill of the clinicians who perform revascularization, and the particular revascularization technology that is chosen.
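The critical-life-expectancy formula is simple enough to compute at the bedside; a quick sketch using the abstract's clinically relevant values:

```python
def critical_life_expectancy(p, r0, r1):
    """Critical life expectancy (years) below which prophylactic
    revascularization does not pay off, per the payoff-time framework:
        CLE = P / ((1 - P) * r0 - r1)
    p  : perioperative probability of stroke
    r0 : annual stroke rate without revascularization
    r1 : annual stroke rate after revascularization (no perioperative stroke)
    """
    return p / ((1 - p) * r0 - r1)

# Clinically relevant values from the abstract
cle = critical_life_expectancy(p=0.03, r0=0.01, r1=0.005)
print(round(cle, 1))  # 6.4
```

Patients with a life expectancy below roughly 6.4 years would, on average, lose QALYs from the procedure under these parameter values.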
NASA Astrophysics Data System (ADS)
Rodríguez, Estiven; Salazar, Juan Fernando; Villegas, Juan Camilo; Mercado-Bettín, Daniel
2018-07-01
Extreme flows are key components of river flow regimes that affect manifold hydrological, geomorphological and ecological processes with societal relevance. One fundamental characteristic of extreme flows in river basins is that they exhibit scaling properties which can be identified through scaling (power) laws. Understanding the physical mechanisms behind such scaling laws is a continuing challenge in hydrology, with potential implications for the prediction of river flow regimes in a changing environment and ungauged basins. After highlighting that the scaling properties are sensitive to environmental change, we develop a physical interpretation of how temporal changes in scaling exponents relate to the capacity of river basins to regulate extreme river flows. Regulation is defined here as the basins' capacity to either dampen high flows or to enhance low flows. Further, we use this framework to infer temporal changes in the regulation capacity of five large basins in tropical South America. Our results indicate that, during the last few decades, the Amazon river basin has been reducing its capacity to enhance low flows, likely as a consequence of pronounced environmental change in its south and south-eastern sub-basins. The proposed framework is widely applicable to different basins, and provides foundations for using scaling laws as empirical tools for inferring temporal changes of hydrological regulation, particularly relevant for identifying and managing hydrological consequences of environmental change.
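The scaling (power) laws referred to above have the general form Q = c·A^θ, where Q is a flow statistic, A is basin area, and the scaling exponent θ is typically estimated by regression in log-log space. A minimal sketch with synthetic data (the areas, flows, and exponent are illustrative, not values from the study):

```python
import numpy as np

def scaling_exponent(areas, flows):
    """Fit the scaling law  Q = c * A**theta  by linear regression
    in log-log space; returns (theta, c)."""
    theta, log_c = np.polyfit(np.log(areas), np.log(flows), 1)
    return theta, np.exp(log_c)

# Hypothetical sub-basin data following an exact power law with theta = 0.8
areas = np.array([1e3, 5e3, 2e4, 1e5, 6e5])   # km^2
flows = 0.02 * areas**0.8                      # m^3/s
theta, c = scaling_exponent(areas, flows)
print(round(theta, 2))  # 0.8
```

Re-estimating θ over moving time windows is one way such a framework can reveal temporal changes in a basin's regulation capacity.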
NASA Astrophysics Data System (ADS)
Fan, Linfeng; Lehmann, Peter; McArdell, Brian; Or, Dani
2017-03-01
Debris flows and landslides induced by heavy rainfall represent a ubiquitous and destructive natural hazard in steep mountainous regions. For debris flows initiated by shallow landslides, prediction of the resulting pathways and associated hazard is often hindered by uncertainty in determining initiation locations, volumes, and the mechanical state of the mobilized debris (and by model parameterization). We propose a framework for linking a simplified physically-based debris flow runout model with a novel Landslide Hydro-mechanical Triggering (LHT) model to obtain a coupled landslide-debris flow susceptibility and hazard assessment. We first compared the simplified debris flow model of Perla (1980) with a state-of-the-art continuum-based model (RAMMS) and with the empirical model of Rickenmann (1999) at the catchment scale. The results indicate that runout distances predicted by the Perla model are in reasonable agreement with inventory measurements and with the other models. Prediction of localized shallow landslides by the LHT model provides information on the water content of the released mass. To incorporate the effects of water content and flow viscosity as provided by LHT on debris flow runout, we adapted the Perla model. The proposed integral link between landslide triggering susceptibility quantified by LHT and subsequent debris flow runout hazard calculation using the adapted Perla model provides a spatially and temporally resolved framework for real-time hazard assessment at the catchment scale or along critical infrastructure (roads, railroad lines).
National Ecosystem Services Classification System (NESCS): Framework Design and Policy Application
Understanding the ways in which ecosystems provide flows of “services” to humans is critical for decision making in many contexts; however, relationships between natural and human systems are complex. A well-defined framework for classifying ecosystem services is essential for sy...
Nelson, Kurt; James, Scott C.; Roberts, Jesse D.; ...
2017-06-05
A modelling framework identifies deployment locations for current-energy-capture devices that maximise power output while minimising potential environmental impacts. The framework, based on the Environmental Fluid Dynamics Code, can incorporate site-specific environmental constraints. Over a 29-day period, energy outputs were estimated for three array layouts: (1) the preliminary configuration (baseline), (2) an updated configuration that accounted for environmental constraints, and (3) an improved configuration subject to no environmental constraints. Of these layouts, the array placement that did not consider environmental constraints extracted the most energy from the flow (4.38 MW-hr/day), 19% higher than output from the baseline configuration (3.69 MW-hr/day). Array placement that considered environmental constraints removed 4.27 MW-hr/day of energy (16% more than baseline). In conclusion, this analysis framework accounts for bathymetry and flow-pattern variations that typical experimental studies cannot, demonstrating that it is a valuable tool for identifying improved array layouts for field deployments.
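The reported percentages follow directly from the daily energy figures; a quick arithmetic check:

```python
baseline = 3.69        # MW-hr/day, preliminary configuration
unconstrained = 4.38   # MW-hr/day, no environmental constraints
constrained = 4.27     # MW-hr/day, with environmental constraints

pct = lambda x: 100 * (x - baseline) / baseline  # percent above baseline
print(round(pct(unconstrained)))  # 19
print(round(pct(constrained)))    # 16
```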
A FSI computational framework for vascular physiopathology: A novel flow-tissue multiscale strategy.
Bianchi, Daniele; Monaldo, Elisabetta; Gizzi, Alessio; Marino, Michele; Filippi, Simonetta; Vairo, Giuseppe
2017-09-01
A novel fluid-structure computational framework for vascular applications is herein presented. It is developed by combining the double multiscale nature of vascular physiopathology in terms of both tissue properties and blood flow. Arterial tissues are modelled via a nonlinear multiscale constitutive rationale, based only on parameters with a clear histological and biochemical meaning. Moreover, blood flow is described by coupling a three-dimensional fluid domain (undergoing physiological inflow conditions) with a zero-dimensional model, which makes it possible to reproduce the influence of the downstream vasculature, furnishing a realistic description of the proximal outflow pressure. The fluid-structure interaction is managed through an explicit time-marching approach, able to accurately describe tissue nonlinearities within each computational step of the fluid problem. A case study based on a patient-specific abdominal aortic aneurysm geometry is numerically investigated, highlighting the advantages gained from the proposed multiscale strategy and showing the soundness and effectiveness of the established framework for assessing useful clinical quantities and risk indexes. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.
2009-12-01
Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of former and ongoing national and European projects, is based on a dynamic Content Management System (CMS) and was developed to host and present numerical models of the main volcanic processes, including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, and simulation datasets, as well as model validation and calibration test cases, are also part of the gateway material.
Eigenspace perturbations for uncertainty estimation of single-point turbulence closures
NASA Astrophysics Data System (ADS)
Iaccarino, Gianluca; Mishra, Aashwin Ananda; Ghili, Saman
2017-02-01
Reynolds-averaged Navier-Stokes (RANS) models represent the workhorse for predicting turbulent flows in complex industrial applications. However, RANS closures introduce a significant degree of epistemic uncertainty in predictions, owing to the potential invalidity of the assumptions used in model formulation. Estimating this uncertainty is a fundamental requirement for building confidence in such predictions. We outline a methodology to estimate this structural uncertainty, incorporating perturbations to both the eigenvalues and the eigenvectors of the modeled Reynolds stress tensor. The mathematical foundations of this framework are derived and explicated. The framework is then applied to a set of separated turbulent flows, compared against numerical and experimental data, and contrasted with the predictions of the eigenvalue-only perturbation methodology. For separated flows, this framework yields significant enhancement over the established eigenvalue perturbation methodology in explaining the discrepancy against experimental observations and high-fidelity simulations. Furthermore, uncertainty bounds of potential engineering utility can be estimated by performing five specific RANS simulations, limiting the computational expenditure of such an exercise.
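A minimal sketch of the eigenvalue part of such a perturbation (the anisotropy decomposition and the one-component limiting state follow the standard eigenspace-perturbation idea; the function name and the linear blending scheme are illustrative, not the authors' exact formulation):

```python
import numpy as np

def perturb_reynolds_stress(R, delta, target=np.array([2/3, -1/3, -1/3])):
    """Move the eigenvalues of the anisotropy tensor of a modeled
    Reynolds stress R a fraction `delta` toward a limiting state
    (`target`, here the 1-component limit), keeping the eigenvectors
    and the turbulent kinetic energy fixed."""
    k = 0.5 * np.trace(R)                # turbulent kinetic energy
    b = R / (2 * k) - np.eye(3) / 3      # anisotropy tensor (trace-free)
    lam, V = np.linalg.eigh(b)           # eigh returns ascending eigenvalues
    idx = np.argsort(lam)[::-1]          # reorder descending to match target
    lam_pert = (1 - delta) * lam[idx] + delta * target
    Vp = V[:, idx]
    b_pert = Vp @ np.diag(lam_pert) @ Vp.T
    return 2 * k * (b_pert + np.eye(3) / 3)

R = np.diag([0.5, 0.3, 0.2])             # toy modeled Reynolds stresses
Rp = perturb_reynolds_stress(R, delta=0.3)
print(np.isclose(np.trace(Rp), np.trace(R)))  # True: k is preserved
```

Running a handful of RANS simulations with perturbations toward the different limiting states is what yields the uncertainty bounds mentioned above.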
Peak Stress Testing Protocol Framework
Treatment of peak flows during wet weather is a common challenge across the country for municipal wastewater utilities with separate and/or combined sewer systems. Increases in wastewater flow resulting from infiltration and inflow (I/I) during wet weather events can result in op...
A Novel Uncertainty Framework for Improving Discharge Data Quality Using Hydraulic Modelling.
NASA Astrophysics Data System (ADS)
Mansanarez, V.; Westerberg, I.; Lyon, S. W.; Lam, N.
2017-12-01
Flood risk assessments rely on accurate discharge data records. Establishing a reliable stage-discharge (SD) rating curve for calculating discharge from stage at a gauging station normally takes years of data collection. Estimation of high flows is particularly difficult because they occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived from as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions, meaning that a reliable rating curve can potentially be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. We introduce an uncertainty framework using hydraulic modelling for developing SD rating curves and estimating their uncertainties. The proposed framework incorporates information from both the hydraulic configuration (bed slope, roughness, vegetation) and the stage-discharge observation data (gaugings), and provides a direct estimation of the hydraulic configuration (slope, bed roughness and vegetation roughness). Discharge time series are estimated by propagating stage records through the posterior rating curve results. We applied this novel method to two Swedish hydrometric stations, accounting for uncertainties in the gaugings used by the hydraulic model, and compared the results to discharge measurements and official discharge estimates. A sensitivity analysis was performed, focused on high-flow uncertainty and the factors that could reduce it; in particular, we investigated which data uncertainties were most important and at what flow conditions the gaugings should preferably be taken.
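For contrast with the hydraulically-modelled approach, the traditional rating curve that such gaugings feed is usually a power law, Q = a·(h − h0)^b. A minimal fitting sketch on synthetic gaugings (h0, the cease-to-flow stage, is assumed known here; in practice it must itself be estimated, which is part of what makes traditional curves uncertain):

```python
import numpy as np

def fit_rating_curve(stage, discharge, h0=0.0):
    """Fit a classical power-law rating curve  Q = a * (h - h0)**b
    to stage-discharge gaugings via log-log linear regression."""
    b, log_a = np.polyfit(np.log(stage - h0), np.log(discharge), 1)
    return np.exp(log_a), b

# Hypothetical gaugings (stage in m, discharge in m^3/s)
h = np.array([0.4, 0.7, 1.1, 1.6, 2.3])
q = 12.0 * h**1.8                     # exact synthetic rating law
a, b = fit_rating_curve(h, q)
print(round(a, 1), round(b, 1))       # 12.0 1.8
```

The framework described above replaces this purely empirical fit with a hydraulic model whose posterior parameters carry the uncertainty through to the discharge series.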
ERIC Educational Resources Information Center
Mori, Junko
2004-01-01
Using the methodological framework of conversation analysis (CA) as a central tool for analysis, this study examines a peer interactive task that occurred in a Japanese as a foreign language classroom. During the short segment of interaction, the students shifted back and forth between the development of an assigned task and the management of…
ERIC Educational Resources Information Center
Prasad, V. Kanti; And Others
This study investigated, in a laboratory experimental framework, the relative influences of television commercials and parental counter-commercial advocacy on children's consumption choice behavior. Sixty-four 8- to 10-year-old boys were randomly assigned to one of three treatment groups: (1) no counter commercial advocacy (control group), (2)…
ERIC Educational Resources Information Center
Bigby, Christine; Beadle-Brown, Julie
2018-01-01
Background: The quality of life (QOL) of people with intellectual disability living in supported accommodation services is variable, influenced by many possible factors. Various frameworks have attempted to identify these factors without assigning value, direction of influence or relative impact on outcomes. Methods: A realist review of the…
Weaving Imagination into an Academic Framework: Attitudes, Assignments, and Assessments
ERIC Educational Resources Information Center
Miller, Jeanetta
2009-01-01
The author believes that imagination is alive in the high school classroom, but it is pale and sickly, suffering from a long decline in which teachers have confined it to its most decorous forms of expression--inference and interpretation--and become ambiguous about whether or not it is truly welcome. To rouse imagination in the high school…
The Curriculum in School External Evaluation Frameworks in Portugal and England
ERIC Educational Resources Information Center
Figueiredo, Carla; Leite, Carlinda; Fernandes, Preciosa
2016-01-01
The curriculum has been target of social and political demands due to its central role in school education and to the changes that occurred in education over the 20th century. The changes include more autonomy assigned to schools and teachers and the establishment of educational standards. These raised concerns that led European bodies to…
49 CFR 173.121 - Class 3-Assignment of packing group.
Code of Federal Regulations, 2013 CFR
2013-10-01
... than 100 L (26.3 gallons); and (iv) The viscosity and flash point are in accordance with the following... paragraph (b)(1) of this section shall be performed are as follows: (i) Viscosity test. The flow time in...
49 CFR 173.121 - Class 3-Assignment of packing group.
Code of Federal Regulations, 2014 CFR
2014-10-01
... than 100 L (26.3 gallons); and (iv) The viscosity and flash point are in accordance with the following... paragraph (b)(1) of this section shall be performed are as follows: (i) Viscosity test. The flow time in...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... in the patent referred to below to Flow Applications, Inc., having a place of business in Okawville, Illinois. The patent rights in these inventions have been assigned to the government of the United States...
A Fully Magnetically Levitated Circulatory Pump for Advanced Heart Failure.
Mehra, Mandeep R; Naka, Yoshifumi; Uriel, Nir; Goldstein, Daniel J; Cleveland, Joseph C; Colombo, Paolo C; Walsh, Mary N; Milano, Carmelo A; Patel, Chetan B; Jorde, Ulrich P; Pagani, Francis D; Aaronson, Keith D; Dean, David A; McCants, Kelly; Itoh, Akinobu; Ewald, Gregory A; Horstmanshof, Douglas; Long, James W; Salerno, Christopher
2017-02-02
Continuous-flow left ventricular assist systems increase the rate of survival among patients with advanced heart failure but are associated with the development of pump thrombosis. We investigated the effects of a new magnetically levitated centrifugal continuous-flow pump that was engineered to avert thrombosis. We randomly assigned patients with advanced heart failure to receive either the new centrifugal continuous-flow pump or a commercially available axial continuous-flow pump. Patients could be enrolled irrespective of the intended goal of pump support (bridge to transplantation or destination therapy). The primary end point was a composite of survival free of disabling stroke (with disabling stroke indicated by a modified Rankin score >3; scores range from 0 to 6, with higher scores indicating more severe disability) or survival free of reoperation to replace or remove the device at 6 months after implantation. The trial was powered for noninferiority testing of the primary end point (noninferiority margin, -10 percentage points). Of 294 patients, 152 were assigned to the centrifugal-flow pump group and 142 to the axial-flow pump group. In the intention-to-treat population, the primary end point occurred in 131 patients (86.2%) in the centrifugal-flow pump group and in 109 (76.8%) in the axial-flow pump group (absolute difference, 9.4 percentage points; 95% lower confidence boundary, -2.1 [P<0.001 for noninferiority]; hazard ratio, 0.55; 95% confidence interval [CI], 0.32 to 0.95 [two-tailed P=0.04 for superiority]). There were no significant between-group differences in the rates of death or disabling stroke, but reoperation for pump malfunction was less frequent in the centrifugal-flow pump group than in the axial-flow pump group (1 [0.7%] vs. 11 [7.7%]; hazard ratio, 0.08; 95% CI, 0.01 to 0.60; P=0.002). 
Suspected or confirmed pump thrombosis occurred in no patients in the centrifugal-flow pump group and in 14 patients (10.1%) in the axial-flow pump group. Among patients with advanced heart failure, implantation of a fully magnetically levitated centrifugal-flow pump was associated with better outcomes at 6 months than was implantation of an axial-flow pump, primarily because of the lower rate of reoperation for pump malfunction. (Funded by St. Jude Medical; MOMENTUM 3 ClinicalTrials.gov number, NCT02224755 .).
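The headline event rates and the absolute difference can be reproduced from the reported counts (a sketch only; the trial's −2.1-point lower confidence boundary comes from its own prespecified analysis and is not recomputed here):

```python
n_c, e_c = 152, 131   # centrifugal-flow pump: patients, primary end-point events
n_a, e_a = 142, 109   # axial-flow pump

p_c, p_a = e_c / n_c, e_a / n_a
diff = 100 * (p_c - p_a)              # absolute difference, percentage points
print(round(100 * p_c, 1))            # 86.2
print(round(100 * p_a, 1))            # 76.8
print(round(diff, 1))                 # 9.4
# Noninferiority margin was -10 points: any lower confidence bound
# above -10 (the trial reported -2.1) establishes noninferiority.
```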
Uncertainty quantification in LES of channel flow
Safta, Cosmin; Blaylock, Myra; Templeton, Jeremy; ...
2016-07-12
In this paper, we present a Bayesian framework for estimating joint densities of large eddy simulation (LES) sub-grid scale model parameters from canonical forced isotropic turbulence direct numerical simulation (DNS) data. The framework accounts for noise in the independent variables, and we present alternative formulations for accounting for discrepancies between model and data. To generate probability densities for flow characteristics, posterior densities for sub-grid scale model parameters are propagated forward through LES of channel flow and compared with DNS data. Synthesis of the calibration and prediction results demonstrates that the model parameters have an explicit filter-width dependence and are highly correlated. Discrepancies between DNS and calibrated LES results point to additional model-form inadequacies that need to be accounted for.
Nonlinear problems in flight dynamics
NASA Technical Reports Server (NTRS)
Chapman, G. T.; Tobak, M.
1984-01-01
A comprehensive framework is proposed for the description and analysis of nonlinear problems in flight dynamics. Emphasis is placed on the aerodynamic component as the major source of nonlinearities in the flight dynamic system. Four aerodynamic flows are examined to illustrate the richness and regularity of the flow structures and the nature of the resulting nonlinear aerodynamic forces and moments. A framework to facilitate the study of the aerodynamic system is proposed, having parallel observational and mathematical components. In the observational component, flow structure is described in the language of topology. Changes in flow structure are described via bifurcation theory. Chaos or turbulence is related to the analogous chaotic behavior of nonlinear dynamical systems characterized by the existence of strange attractors with fractal dimensionality. Scales of the flow are considered in the light of ideas from group theory. Several one- and two-degree-of-freedom dynamical systems with various mathematical models of the nonlinear aerodynamic forces and moments are examined to illustrate the resulting types of dynamical behavior. The mathematical ideas that proved useful in the description of fluid flows are shown to be similarly useful in the description of flight dynamic behavior.
DEVELOPMENT OF A DECISION SUPPORT FRAMEWORK FOR PLACEMENT OF BMPS IN URBAN-WATERSHEDS
This paper will present an on-going development of an integrated decision support framework (IDSF) for cost-effective placement of best management practices (BMPs) for managing wet weather flows (WWF) in urban watersheds. This decision tool will facilitate the selection and plac...
AN INTEGRATED DECISION SUPPORT FRAMEWORK FOR PLACEMENT OF BMPS IN URBAN-WATERSHEDS
This paper will present an on-going development of an integrated decision support framework (IDSF) for cost-effective placement of best management practices (BMPs) for managing wet weather flows (WWF) in urban watersheds. This decision tool will facilitate the selection and plac...
DNA Detection by Flow Cytometry using PNA-Modified Metal-Organic Framework Particles.
Mejia-Ariza, Raquel; Rosselli, Jessica; Breukers, Christian; Manicardi, Alex; Terstappen, Leon W M M; Corradini, Roberto; Huskens, Jurriaan
2017-03-23
A DNA-sensing platform is developed by exploiting the easy surface functionalization of metal-organic framework (MOF) particles and their highly parallelized fluorescence detection by flow cytometry. Two strategies were employed to functionalize the surface of MIL-88A, using either covalent or non-covalent interactions, resulting in alkyne-modified and biotin-modified MIL-88A, respectively. Covalent surface coupling of an azide-dye and the alkyne-MIL-88A was achieved by means of a click reaction. Non-covalent streptavidin-biotin interactions were employed to link biotin-PNA to biotin-MIL-88A particles mediated by streptavidin. Characterization by confocal imaging and flow cytometry demonstrated that DNA can be bound selectively to the MOF surface. Flow cytometry provided quantitative data of the interaction with DNA. Making use of the large numbers of particles that can be simultaneously processed by flow cytometry, this MOF platform was able to discriminate between fully complementary, single-base mismatched, and randomized DNA targets. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
DOT National Transportation Integrated Search
2018-04-01
Consistent efforts with dense sensor deployment and data gathering processes for bridge big data have accumulated profound information regarding bridge performance, associated environments, and traffic flows. However, direct applications of bridge bi...
The Site-Scale Saturated Zone Flow Model for Yucca Mountain
NASA Astrophysics Data System (ADS)
Al-Aziz, E.; James, S. C.; Arnold, B. W.; Zyvoloski, G. A.
2006-12-01
This presentation provides a reinterpreted conceptual model of the Yucca Mountain site-scale flow system, subject to all quality assurance procedures. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain, which is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. This effort started from the ground up with a revised and updated hydrogeologic framework model, which incorporates the latest lithology data, and increased grid resolution that better resolves the hydrogeologic framework throughout the model domain. In addition, faults are much better represented using the 250 × 250 m grid spacing (compared with the previous model's 500 × 500 m spacing). Data collected since the previous model calibration effort have been included; they comprise all Nye County water-level data through Phase IV of the Early Warning Drilling Program. Target boundary fluxes are derived from the newest (2004) Death Valley Regional Flow System model from the U.S. Geological Survey. A consistent weighting scheme assigns importance to each measured water-level datum and each boundary flux extracted from the regional model. The numerical model is calibrated by matching these weighted water-level measurements and boundary fluxes using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM v2.24 and parameter estimation software PEST v5.5) and model setup facilitate efficient calibration of multiple conceptual models. Analyses evaluate the impact of these updates and additional data on the modeled potentiometric surface and on the flowpaths emanating from below the repository.
After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the proposed repository and compare them to those from the previous model calibration. Specific discharge at a point 5 km from the repository is also examined and found to be within acceptable uncertainty. The results show that the updated model yields a calibration with smaller residuals than the previous model revision while ensuring that flowpaths follow measured gradients and paths derived from hydrochemical analyses. This work was supported by the Yucca Mountain Site Characterization Office as part of the Civilian Radioactive Waste Management Program, which is managed by the U.S. Department of Energy, Yucca Mountain Site Characterization Project. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
Health workers at the core of the health system: framework and research issues.
Anand, Sudhir; Bärnighausen, Till
2012-05-01
This paper presents a framework for the health system with health workers at the core. We review existing health-system frameworks and the role they assign to health workers. Earlier frameworks either do not include health workers as a central feature of system functioning or treat them as one among several components of equal importance. As every function of the health system is either undertaken by or mediated through the health worker, we place the health worker at the center of the health system. Our framework is useful for structuring research on the health workforce and for identifying health-worker research issues. We describe six research issues on the health workforce: metrics to measure the capacity of a health system to deliver healthcare; the contribution of public- vs. private-sector health workers in meeting healthcare needs and demands; the appropriate size, composition and distribution of the health workforce; approaches to achieving health-worker requirements; the adoption and adaptation of treatments by health workers; and the training of health workers for horizontally vs. vertically structured health systems. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
François, Marianne M.
2015-05-28
A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method, has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and the modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, multimaterial interface reconstruction via power diagrams, curvature estimation via heights and mean values, and the balanced-force algorithm for surface tension are highlighted.
Comparing and Contrasting Siblings: Defining the Self.
ERIC Educational Resources Information Center
Schachter, Frances Fuchs; Stone, Richard K.
1987-01-01
Deidentification is the phenomenon whereby siblings are defined as different or contrasting. In pathological deidentification, the natural flow of sibling conflict and reconciliation seems obstructed as one sibling is assigned the fixed identity of "devil," who constantly harasses the other, "angel," sibling. A clinical…
Collusion-resistant multimedia fingerprinting: a unified framework
NASA Astrophysics Data System (ADS)
Wu, Min; Trappe, Wade; Wang, Z. Jane; Liu, K. J. Ray
2004-06-01
Digital fingerprints are unique labels inserted in different copies of the same content before distribution. Each digital fingerprint is assigned to an intended recipient, and can be used to trace the culprits who use their content for unintended purposes. Attacks mounted by multiple users, known as collusion attacks, provide a cost-effective method for attenuating the identifying fingerprint from each colluder; collusion thus poses a real challenge to protecting digital media data and enforcing usage policies. This paper examines a few major design methodologies for collusion-resistant fingerprinting of multimedia, and presents a unified framework that helps highlight the common issues and the uniqueness of different fingerprinting techniques.
A hydromorphological framework for the evaluation of e-flows
NASA Astrophysics Data System (ADS)
Bussettini, Martina; Rinaldi, Massimo; Grant, Gordon
2017-04-01
Anthropogenic alteration of hydromorphological processes in rivers is a major factor that diminishes river health and undermines environmental objectives envisaged by river protection policies. Specifying environmental flows to address those impacts can be a key strategy for the maintenance of functional river processes and the achievement of those objectives. Environmental flows are determined by various methods and approaches, based primarily on hydrological and/or hydraulic evaluations, although holistic methodologies, considering the many interacting factors that structure aquatic ecosystems, including sediments, are increasingly used. Hydrological and geomorphological processes are highly coupled and any change in one typically affects the other. The coupling varies over different spatial and temporal scales, and changing either hydrological or geomorphological processes can result in alteration of river habitats, ultimately impacting ecological processes. In spite of these linkages, current restoration approaches typically focus only on changes in the hydrological regime as a means of promoting ecological enhancements. Neglecting sediment transport and its interaction with flow in shaping riverine habitats is likely to result not only in minor or no ecological enhancement, but may also increase the costs of water use. A more integrated view of how human activities jointly affect sediment regime, river morphology and river flows is therefore needed in order to determine the most effective actions to rehabilitate river processes to desired states. These states involve considerations of the combination of intrinsic ("natural") conditions (e.g. river sensitivity and morphological potential, off-site conditions) and socio-economic constraints. The evaluation of such factors, the analysis of different scenarios, and the selection of appropriate actions require the contextualization of river reaches within a wider spatial-temporal hydromorphological framework.
Here we present such a general multiscale, process-based hydromorphological framework, and discuss its application to the problem of how best to analyse and estimate e-flows.
Wilkinson, M E; Quinn, P F; Barber, N J; Jonczyk, J
2014-01-15
Intense farming plays a key role in increasing local scale runoff and erosion rates, resulting in water quality issues and flooding problems. There is potential for agricultural management to become a major part of improved strategies for controlling runoff. Here, a Catchment Systems Engineering (CSE) approach has been explored to solve the above problem. CSE is an interventionist approach to altering the catchment scale runoff regime through the manipulation of hydrological flow pathways throughout the catchment. By targeting hydrological flow pathways at source, such as overland flow, field drain and ditch function, a significant component of the runoff generation can be managed, in turn reducing soil nutrient losses. The Belford catchment (5.7 km(2)) is a catchment scale study for which a CSE approach has been used to tackle a number of environmental issues. A variety of Runoff Attenuation Features (RAFs) have been implemented throughout the catchment to address diffuse pollution and flooding issues. The RAFs include bunds disconnecting flow pathways, diversion structures in ditches to spill and store high flows, large woody debris structures within the channel, and riparian zone management. Here a framework for applying a CSE approach to the catchment is shown as a step-by-step guide to implementing mitigation measures in the Belford Burn catchment. The framework is based around engagement with catchment stakeholders and uses evidence arising from field science. Using the framework, the flooding issue has been addressed at the catchment scale by altering the runoff regime. Initial findings suggest that RAFs have functioned as designed to reduce/attenuate runoff locally. However, evidence suggested that some RAFs needed modification, and that new RAFs needed to be created to address diffuse pollution issues during storm events.
Initial findings from these modified RAFs are showing improvements in sediment trapping capacities and reductions in phosphorus, nitrate and suspended sediment losses during storm events. © 2013.
NASA Astrophysics Data System (ADS)
Taner, M. U.; Ray, P.; Brown, C.
2016-12-01
Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing strategies that are flexible or adaptive holds intuitive appeal, development of well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign probabilities to future climate states are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and rather employs them after optimization to aid selection amongst competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each under the associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and are most likely given the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.
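The final selection step described here, evaluating the 'optimal' pathways and keeping the ones least sensitive to the scenario probabilities, can be illustrated with a toy calculation. All scores and probability vectors below are made-up placeholders, not values from the Niger Basin study:

```python
import numpy as np

# scores[a, s]: performance of decision pathway a under climate scenario s
# (wet / median / dry). Hypothetical numbers for illustration only.
scores = np.array([
    [8.0, 6.0, 2.0],   # pathway A: great if wet, poor if dry
    [5.0, 5.0, 4.5],   # pathway B: robust middle ground
    [6.5, 5.5, 3.0],   # pathway C
])
# Candidate probability vectors over the scenarios, standing in for
# different probabilistic assumptions about future climate.
prob_sets = np.array([
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])

expected = scores @ prob_sets.T          # expected score per (pathway, prob set)
best_per_set = expected.argmax(axis=0)   # 'optimal' pathway under each prob set
sensitivity = expected.std(axis=1)       # spread across probability assumptions
robust_choice = sensitivity.argmin()     # least sensitive to the probabilities
```

Here pathway A wins under wet-leaning probabilities while B wins under dry-leaning ones, but B's expected performance barely moves across the probability sets, so it is the least-sensitive choice in the spirit of the framework's final phase.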
Passenger flow analysis of Beijing urban rail transit network using fractal approach
NASA Astrophysics Data System (ADS)
Li, Xiaohong; Chen, Peiwen; Chen, Feng; Wang, Zijia
2018-04-01
To quantify the spatiotemporal distribution of passenger flow and the characteristics of an urban rail transit network, we introduce four radius fractal dimensions and two branch fractal dimensions by combining a fractal approach with a passenger flow assignment model. These fractal dimensions can numerically describe the complexity of passenger flow in the urban rail transit network and its change characteristics. On this basis, we establish a fractal quantification method to measure the fractal characteristics of passenger flow in the rail transit network. Finally, we validate the reasonability of our proposed method using actual data from the Beijing subway network. It is shown that our proposed method can effectively measure the scale-free range of the urban rail transit network, network development and the fractal characteristics of time-varying passenger flow, which further provides a reference for network planning and analysis of passenger flow.
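A radius fractal dimension of the kind used here is typically estimated by counting network mass (stations, or passenger flow) within growing radii of a centre and fitting a log-log slope, N(r) ~ r^D. A minimal sketch with synthetic station coordinates (not Beijing data; for a uniform planar scatter the slope should come out near 2):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical station coordinates scattered around a network centre;
# each point could also be weighted by boardings (here uniform weight).
stations = rng.uniform(-1.0, 1.0, size=(4000, 2))
centre = np.zeros(2)

radii = np.linspace(0.1, 0.9, 9)
dist = np.linalg.norm(stations - centre, axis=1)
counts = np.array([(dist <= r).sum() for r in radii])

# Radius fractal dimension D from N(r) ~ r^D: slope of log N vs log r.
D = np.polyfit(np.log(radii), np.log(counts), 1)[0]
```

On real network data the interesting content is where the log-log plot is straight (the scale-free range) and how D changes through the day as passenger flow shifts.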
Faculty Descriptions of Simulation Debriefing in Traditional Baccalaureate Nursing Programs.
Waznonis, Annette R
A study was conducted to describe simulation debriefing practices of faculty in accredited, traditional, baccalaureate nursing programs in the United States. Best debriefing practices include debriefing by a competent facilitator in a safe environment using a structured framework. Yet, structured frameworks and evaluation of debriefing are lacking in nursing education. This article reports the interview findings from the qualitative component of a large-scale mixed-methods study. Twenty-three full-time faculty members with an average of 6 years of simulation debriefing experience participated in interviews. Three themes emerged with subthemes: a) having the student's best interest at heart, b) getting over the emotional hurdle, and c) intentional debriefing evolves into learning. Gaps were found in faculty development, use of a structured framework, and evaluation. Research is warranted on use of video, postdebriefing assignments, cofacilitation, and debriefing effectiveness.
Cashman, Katherine V.; Mangan, Margaret T.; Poland, Michael P.; Takahashi, T. Jane; Landowski, Claire M.
2014-01-01
The Hawaiian Volcano Observatory (HVO) was established as a natural laboratory to study volcanic processes. Since the most frequent form of volcanic activity in Hawai‘i is effusive, a major contribution of the past century of research at HVO has been to describe and quantify lava flow emplacement processes. Lava flow research has taken many forms; first and foremost it has been a collection of basic observational data on active lava flows from both Mauna Loa and Kīlauea volcanoes that have occurred over the past 100 years. Both the types and quantities of observational data have changed with changing technology; thus, another important contribution of HVO to lava flow studies has been the application of new observational techniques. Also important has been a long-term effort to measure the physical properties (temperature, viscosity, crystallinity, and so on) of flowing lava. Field measurements of these properties have both motivated laboratory experiments and presaged the results of those experiments, particularly with respect to understanding the rheology of complex fluids. Finally, studies of the dynamics of lava flow emplacement have combined detailed field measurements with theoretical models to build a framework for the interpretation of lava flows in numerous other terrestrial, submarine, and planetary environments. Here, we attempt to review all these aspects of lava flow studies and place them into a coherent framework that we hope will motivate future research.
Watershed and stormwater managers need modeling tools to evaluate alternative plans for water quality management and flow abatement techniques in urban and developing areas. A watershed-scale, decision-support framework that is based on cost optimization is needed to support gov...
Climate change adaptation frameworks: an evaluation of plans for coastal Suffolk, UK
NASA Astrophysics Data System (ADS)
Armstrong, J.; Wilby, R.; Nicholls, R. J.
2015-11-01
This paper asserts that three principal frameworks for climate change adaptation can be recognised in the literature: scenario-led (SL), vulnerability-led (VL) and decision-centric (DC) frameworks. A criterion is developed to differentiate these frameworks in recent adaptation projects. The criterion features six key hallmarks as follows: (1) use of climate model information; (2) analysis of metrics/units; (3) socio-economic knowledge; (4) stakeholder engagement; (5) adaptation of implementation mechanisms; (6) tier of adaptation implementation. The paper then tests the validity of this approach using adaptation projects on the Suffolk coast, UK. Fourteen adaptation plans were identified in an online survey. They were analysed in relation to the hallmarks outlined above and assigned to an adaptation framework. The results show that while some adaptation plans are primarily SL, VL or DC, the majority are hybrid, showing a mixture of DC/VL and DC/SL characteristics. Interestingly, the SL/VL combination is not observed, perhaps because the DC framework is intermediate and attempts to overcome weaknesses of both SL and VL approaches. The majority (57 %) of adaptation projects generated a risk assessment or advice notes. Further development of this type of framework analysis would allow better guidance on approaches for organisations when implementing climate change adaptation initiatives, and other similar proactive long-term planning.
Climate change adaptation frameworks: an evaluation of plans for coastal Suffolk, UK
NASA Astrophysics Data System (ADS)
Armstrong, J.; Wilby, R.; Nicholls, R. J.
2015-06-01
This paper asserts that three principal frameworks for climate change adaptation can be recognised in the literature: Scenario-Led (SL), Vulnerability-Led (VL) and Decision-Centric (DC) frameworks. A criterion is developed to differentiate these frameworks in recent adaptation projects. The criterion features six key hallmarks as follows: (1) use of climate model information; (2) analysis of metrics/units; (3) socio-economic knowledge; (4) stakeholder engagement; (5) adaptation implementation mechanisms; (6) tier of adaptation implementation. The paper then tests the validity of this approach using adaptation projects on the Suffolk coast, UK. Fourteen adaptation plans were identified in an online survey. They were analysed in relation to the hallmarks outlined above and assigned to an adaptation framework. The results show that while some adaptation plans are primarily SL, VL or DC, the majority are hybrid, showing a mixture of DC/VL and DC/SL characteristics. Interestingly, the SL/VL combination is not observed, perhaps because the DC framework is intermediate and attempts to overcome weaknesses of both SL and VL approaches. The majority (57 %) of adaptation projects generated a risk assessment or advice notes. Further development of this type of framework analysis would allow better guidance on approaches for organisations when implementing climate change adaptation initiatives, and other similar proactive long-term planning.
NASA Astrophysics Data System (ADS)
Li, Xiaohui; Sun, Zhenping; Cao, Dongpu; Liu, Daxue; He, Hangen
2017-03-01
This study proposes a novel integrated local trajectory planning and tracking control (ILTPTC) framework for autonomous vehicles driving along a reference path with obstacles avoidance. For this ILTPTC framework, an efficient state-space sampling-based trajectory planning scheme is employed to smoothly follow the reference path. A model-based predictive path generation algorithm is applied to produce a set of smooth and kinematically-feasible paths connecting the initial state with the sampling terminal states. A velocity control law is then designed to assign a speed value at each of the points along the generated paths. An objective function considering both safety and comfort performance is carefully formulated for assessing the generated trajectories and selecting the optimal one. For accurately tracking the optimal trajectory while overcoming external disturbances and model uncertainties, a combined feedforward and feedback controller is developed. Both simulation analyses and vehicle testing are performed to verify the effectiveness of the proposed ILTPTC framework, and future research is also briefly discussed.
Wise, Robert A; Bartlett, Susan J; Brown, Ellen D; Castro, Mario; Cohen, Rubin; Holbrook, Janet T; Irvin, Charles G; Rand, Cynthia S; Sockrider, Marianna M; Sugar, Elizabeth A
2009-09-01
Information that enhances expectations about drug effectiveness improves the response to placebos for pain. Although asthma symptoms often improve with placebo, it is not known whether the response to placebo or active treatment can be augmented by increasing expectation of benefit. The study objective was to determine whether response to placebo or a leukotriene antagonist (montelukast) can be augmented by messages that increase expectation of benefit. A randomized 20-center controlled trial enrolled 601 asthmatic patients with poor symptom control who were assigned to one of 5 study groups. Participants were randomly assigned to one of 4 treatment groups in a factorial design (ie, placebo with enhanced messages, placebo with neutral messages, montelukast with enhanced messages, or montelukast with neutral messages) or to usual care. Assignment to study drug was double masked, assignment to message content was single masked, and usual care was not masked. The enhanced message aimed to increase expectation of benefit from the drug. The primary outcome was mean change in daily peak flow over 4 weeks. Secondary outcomes included lung function and asthma symptom control. Peak flow and other lung function measures were not improved in participants assigned to the enhanced message groups versus the neutral messages groups for either montelukast or placebo; no differences were noted between the neutral placebo and usual care groups. Placebo-treated participants had improved asthma control with the enhanced message but not montelukast-treated participants; the neutral placebo group did have improved asthma control compared with the usual care group after adjusting for baseline difference. Headaches were more common in participants provided messages that mentioned headache as a montelukast side effect. Optimistic drug presentation augments the placebo effect for patient-reported outcomes (asthma control) but not lung function. 
However, the effect of montelukast was not enhanced by optimistic messages regarding treatment effectiveness.
Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease
NASA Astrophysics Data System (ADS)
Marsden, Alison
2009-11-01
Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
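The core idea of stochastic collocation, recovering output statistics from a handful of deterministic solves at quadrature nodes of the input distribution, can be sketched in a few lines. This is a plain (non-adaptive) Gauss-Hermite sketch; the quadratic "model" is a stand-in for an expensive blood-flow simulation, and all parameter values are illustrative:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def model(x):
    # Placeholder for a simulation output as a function of an uncertain input.
    return x**2 + 3.0 * x + 1.0

# Probabilists' Gauss-Hermite rule: exact for polynomials up to degree 9
# against the standard normal weight, so 5 model evaluations suffice here.
nodes, weights = hermegauss(5)
weights = weights / weights.sum()       # normalise to a probability measure

# Uncertain input X ~ N(mu, sigma^2); evaluate the model only at the nodes.
mu, sigma = 2.0, 0.5
outputs = model(mu + sigma * nodes)
mean = weights @ outputs                # E[model(X)]
var = weights @ (outputs - mean) ** 2   # Var[model(X)]
```

The adaptive version in the abstract refines the node set where the output responds most strongly, but the propagation step is the same: a weighted sum over deterministic solves.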
Theoretical study of the transonic lift of a double-wedge profile with detached bow wave
NASA Technical Reports Server (NTRS)
Vincenti, Walter G; Wagoner, Cleo B
1954-01-01
A theoretical study is described of the aerodynamic characteristics at small angle of attack of a thin, double-wedge profile in the range of supersonic flight speed in which the bow wave is detached. The analysis is carried out within the framework of the transonic (nonlinear) small-disturbance theory, and the effects of angle of attack are regarded as a small perturbation on the flow previously calculated at zero angle. The mixed flow about the front half of the profile is calculated by relaxation solution of a suitably defined boundary-value problem for the transonic small-disturbance equation in the hodograph plane (i.e., the Tricomi equation). The purely supersonic flow about the rear half is found by an extension of the usual numerical method of characteristics. Analytical results are also obtained, within the framework of the same theory, for the range of speed in which the bow wave is attached and the flow is completely supersonic.
Unifying Suspension and Granular flows near Jamming
NASA Astrophysics Data System (ADS)
DeGiuli, Eric; Wyart, Matthieu
2017-06-01
Rheological properties of dense flows of hard particles are singular as one approaches the jamming threshold where flow ceases, both for granular flows dominated by inertia, and for over-damped suspensions. Concomitantly, the lengthscale characterizing velocity correlations appears to diverge at jamming. Here we review a theoretical framework that gives a scaling description of stationary flows of frictionless particles. Our analysis applies both to suspensions and inertial flows of hard particles. We report numerical results in support of the theory, and show the phase diagram that results when friction is added, delineating the regime of validity of the frictionless theory.
The nurse scheduling problem: goal programming and nonlinear optimization approaches
NASA Astrophysics Data System (ADS)
Hakim, L.; Bakhtiar, T.; Jaharuddin
2017-01-01
Nurse scheduling is the activity of allocating nurses to a set of tasks in particular rooms at a hospital or health centre within a certain period. One of the obstacles in nurse scheduling is the lack of resources to fulfil the needs of the hospital. Nurse scheduling undertaken manually risks violating some of the nursing rules set by the hospital. Therefore, this study aimed to develop scheduling models that satisfy all the specific rules set by the management of Bogor State Hospital. We have developed three models to meet the scheduling needs. Model 1 is designed to schedule nurses who are solely assigned to a certain inpatient unit, and Model 2 is constructed to manage nurses who are assigned to an inpatient room as well as to the Polyclinic room as conjunct nurses. As the assignment of nurses on each shift is uneven, we propose Model 3 to minimize the variance of the workload in order to achieve equitable assignment on every shift. The first two models are formulated in a goal programming framework, while the last model is in nonlinear optimization form.
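The workload-balancing objective of Model 3, minimizing the variance of workload across shifts, can be illustrated with a simple greedy stand-in. This is our own sketch, not the paper's goal-programming or nonlinear formulation, and all task data are made up:

```python
# Spread tasks across shifts so the workload variance stays small.
# A longest-processing-time-first greedy heuristic stands in for the
# paper's nonlinear program.
tasks = [4, 3, 3, 2, 2, 1, 1]           # task durations (hours), hypothetical
shifts = {"morning": [], "evening": [], "night": []}

for t in sorted(tasks, reverse=True):    # place big tasks first
    lightest = min(shifts, key=lambda s: sum(shifts[s]))
    shifts[lightest].append(t)           # assign to the least-loaded shift

loads = [sum(v) for v in shifts.values()]
mean = sum(loads) / len(loads)
variance = sum((l - mean) ** 2 for l in loads) / len(loads)
```

A real schedule additionally carries hard constraints (rest rules, skill mix, room coverage), which is why the paper formulates the problem as goal programming and nonlinear optimization rather than a greedy pass.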
Arsenic-based Life: An active learning assignment for teaching scientific discourse.
Jeremy Johnson, R
2017-01-02
Among recent high-profile scientific debates was the proposal that life could exist with arsenic in place of phosphorus in its nucleic acids and other biomolecules. Soon after its initial publication, scientists across diverse disciplines began to question this extraordinary claim. Using the original article, its claims, its scientific support, and the ensuing counterarguments, a two-day, active learning classroom exercise was developed focusing on the presentation, evaluation, and discussion of scientific argumentation and discourse. In this culminating assignment of a first-semester biochemistry course, undergraduate students analyze the scientific support from the original research articles and then present and discuss multiple scientific rebuttals in a lively, civil classroom debate. Through this assignment, students develop a sense of skepticism, especially for the original arsenic-based life claims, and learn to clearly articulate their counterarguments with scientific support and critical reasoning. With its direct integration into the first-semester biochemistry curriculum and the excitement surrounding arsenic-based life, this assignment provides a robust, simple, and stimulating framework for introducing scientific discourse and active learning into the undergraduate molecular science curriculum. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(1):40-45, 2017. © 2016 The International Union of Biochemistry and Molecular Biology.
NASA Astrophysics Data System (ADS)
Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.
2012-12-01
Snow water equivalent (SWE) estimation is a key factor in producing reliable streamflow simulations and forecasts in snow-dominated areas. However, measuring or predicting SWE has significant uncertainty. Sequential data assimilation, which updates states using both observed and modeled data based on error estimation, has been shown to reduce streamflow simulation errors but has had limited testing for forecasting applications. In the current study, a snow data assimilation framework integrated with the National Weather Service River Forecast System (NWSRFS) is evaluated for use in ensemble streamflow prediction (ESP). Seasonal water supply ESP hindcasts are generated for the North Fork of the American River Basin (NFARB) in northern California. Parameter sets from the California Nevada River Forecast Center (CNRFC), the Differential Evolution Adaptive Metropolis (DREAM) algorithm and the Multistep Automated Calibration Scheme (MACS) are tested both with and without sequential data assimilation. The traditional ESP method considers uncertainty in future climate conditions using historical temperature and precipitation time series to generate future streamflow scenarios conditioned on the current basin state. We include data uncertainty analysis in the forecasting framework through the DREAM-based parameter set, which is part of a recently developed Integrated Uncertainty and Ensemble-based data Assimilation framework (ICEA). Extensive verification of all tested approaches is undertaken using traditional forecast verification measures, including root mean square error (RMSE), Nash-Sutcliffe efficiency coefficient (NSE), volumetric bias, joint distribution, rank probability score (RPS), and discrimination and reliability plots. In comparison to the RFC parameters, the DREAM and MACS sets show significant improvement in volumetric bias in flow.
Use of assimilation improves hindcasts of higher flows but does not significantly improve performance in the mid flow and low flow categories.
Poiseuille equation for steady flow of fractal fluid
NASA Astrophysics Data System (ADS)
Tarasov, Vasily E.
2016-07-01
Fractal fluid is considered in the framework of continuous models with noninteger dimensional spaces (NIDS). A recently proposed vector calculus in NIDS is used to get a description of fractal fluid flow in pipes with circular cross-sections. The Navier-Stokes equations of fractal incompressible viscous fluids are used to derive a generalization of the Poiseuille equation for steady flow of fractal media in a pipe.
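For reference, the classical Poiseuille law that this work generalizes: for steady laminar flow of an ordinary (integer-dimensional) incompressible viscous fluid through a circular pipe of radius $R$ and length $L$ under pressure drop $\Delta P$,

```latex
u(r) = \frac{\Delta P}{4\mu L}\left(R^{2} - r^{2}\right),
\qquad
Q = \int_{0}^{R} u(r)\, 2\pi r \, dr = \frac{\pi \,\Delta P\, R^{4}}{8 \mu L},
```

where $\mu$ is the dynamic viscosity. In the noninteger-dimensional setting, the radial exponent and prefactor of $Q$ pick up the noninteger dimension of the fractal fluid, with the classical law recovered when the dimension is an integer.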
NASA Astrophysics Data System (ADS)
Mudunuru, M. K.; Karra, S.; Nakshatrala, K. B.
2016-12-01
Fundamental to the enhancement and control of macroscopic spreading, mixing, and dilution of solute plumes in porous media structures is the topology of the flow field and the underlying heterogeneity and anisotropy contrast of the porous media. Traditionally, the literature has focused on the shearing effects of the flow field (i.e., flow with zero helical density, meaning that flow is always perpendicular to the vorticity vector) on scalar mixing [2]. However, the combined effect of the anisotropy of the porous media and the helical (or chaotic) structure of the flow field on species reactive-transport and mixing has rarely been studied. Recently, it has been shown experimentally that there is irrefutable evidence that chaotic advection and helical flows are inherent in porous media flows [1,2]. In this poster presentation, we present a non-intrusive physics-based model-order reduction framework to quantify the effects of species mixing in terms of reduced-order models (ROMs) and scaling laws. The ROM framework is constructed based on recent advancements in non-negative formulations for reactive-transport in heterogeneous anisotropic porous media [3] and non-intrusive ROM methods [4]. The objective is to generate computationally efficient and accurate ROMs for species mixing for different values of input data and reactive-transport model parameters. This is achieved by using multiple ROMs, which is a way to determine the robustness of the proposed framework. Sensitivity analysis is performed to identify the important parameters. Representative numerical examples from reactive-transport are presented to illustrate the importance of the proposed ROMs to accurately describe mixing processes in porous media. [1] Lester, Metcalfe, and Trefry, "Is chaotic advection inherent to porous media flow?," PRL, 2013. [2] Ye, Chiogna, Cirpka, Grathwohl, and Rolle, "Experimental evidence of helical flow in porous media," PRL, 2015.
[3] Mudunuru, and Nakshatrala, "On enforcing maximum principles and achieving element-wise species balance for advection-diffusion-reaction equations under the finite element method," JCP, 2016. [4] Quarteroni, Manzoni, and Negri. "Reduced Basis Methods for Partial Differential Equations: An Introduction," Springer, 2016.
NASA Astrophysics Data System (ADS)
Kotulla, Ralf; Gopu, Arvind; Hayashi, Soichi
2016-08-01
Processing astronomical data to science readiness was and remains a challenge, in particular in the case of multi-detector instruments such as wide-field imagers. One such instrument, the WIYN One Degree Imager, is available to the astronomical community at large and, in order to be scientifically useful to its varied user community on a short timescale, provides its users fully calibrated data in addition to the underlying raw data. However, time-efficient re-processing of the often large datasets with improved calibration data and/or software requires more than just a large number of CPU cores and disk space. This is particularly relevant if all computing resources are general purpose and shared with a large number of users in a typical university setup. Our approach to address this challenge is a flexible framework combining the best of both high-performance (large number of nodes, internal communication) and high-throughput (flexible/variable number of nodes, no dedicated hardware) computing. Based on the Advanced Message Queuing Protocol, we developed a Server-Manager-Worker framework. In addition to the server directing the work flow and the workers executing the actual work, the manager maintains a list of available workers, adds and/or removes individual workers from the worker pool, and re-assigns workers to different tasks. This provides the flexibility of optimizing the worker pool for the current task and workload, improves load balancing, and makes the most efficient use of the available resources. We present performance benchmarks and scaling tests showing that, today and using existing, commodity shared-use hardware, we can process data with throughputs (including data reduction and calibration) approaching those expected in the early 2020s for future observatories such as the Large Synoptic Survey Telescope.
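The Server/Manager/Worker division of labour described here can be sketched in-process, with a shared queue standing in for the AMQP broker and threads standing in for distributed workers. This is our own minimal illustration, not the framework's code; names and the squaring "work" are placeholders:

```python
import queue
import threading

task_queue = queue.Queue()    # stands in for the AMQP message broker
results = []
results_lock = threading.Lock()

def worker():
    """Worker: pull tasks until handed a poison pill by the manager."""
    while True:
        frame = task_queue.get()
        if frame is None:                 # poison pill: leave the pool
            task_queue.task_done()
            return
        with results_lock:                # do one unit of "processing"
            results.append(frame * frame)
        task_queue.task_done()

# Manager: maintain a (here fixed-size, in principle resizable) worker pool.
pool = [threading.Thread(target=worker) for _ in range(4)]
for t in pool:
    t.start()

# Server: direct the workflow by enqueuing work items.
for frame in range(100):
    task_queue.put(frame)
task_queue.join()                         # wait until every task is done

for _ in pool:                            # manager shuts the pool down
    task_queue.put(None)
for t in pool:
    t.join()
```

Growing or shrinking `pool` at runtime, the manager's job in the paper's framework, only requires starting more threads or enqueuing more poison pills, which is what makes the pattern flexible under shared, variable resources.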
ERIC Educational Resources Information Center
Murdock, Margaret; Holman, R. W.; Slade, Tyler; Clark, Shelley L. D.; Rodnick, Kenneth J.
2014-01-01
A unique homework assignment has been designed as a review exercise to be implemented near the end of the one-year undergraduate organic chemistry sequence. Within the framework of the exercise, students derive potential mechanisms for glucose ring opening in the aqueous mutarotation process. In this endeavor, 21 general review principles are…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
... Nasdaq Securities within the existing DMM and SLP framework used to trade its listed securities. The... substantially similar to the Exchange's current SLP procedures in Rule 107B--NYSE Amex Equities. See proposed..., reassign one or more Nasdaq Securities to a different DMM Unit or to a different SLP or SLPs. a. Assignment...
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2013-01-01
Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…
2012-02-09
Including suggestions for reducing the burden, to the Department of Defense, Executive Service Directorate (0704-0188). Respondents should be aware...benchmark problem we contacted Bertrand LeCun, who in their project CHOC from 2005-2008 had applied their parallel B&B framework BOB++ to the RLT1
ERIC Educational Resources Information Center
Moranski, Kara; Kim, Frederic
2016-01-01
Flipped or inverted classroom (IC) models are promising for foreign language instruction in that they appear to promote well-regarded practices that bridge both sociocultural and cognitive theoretical frameworks, such as allowing for higher degrees of learner agency and facilitating deeper levels of processing. To date, the majority of work on IC…
ERIC Educational Resources Information Center
Keirn, Tim; Luhr, Eileen; Escobar, Miguel; Choudhary, Manoj
2012-01-01
Given California's role in the Pacific economy, its historic Asian heritage, and the strong and growing presence of Asian communities and businesses in the state, it is imperative that students statewide understand the history of Asia. Unfortunately, the California state curricular framework and standards in history and social science limit the…
DOT National Transportation Integrated Search
2010-02-01
This project developed a methodology to couple a new pollutant dispersion model with a traffic : assignment process to contain air pollution while maximizing mobility. The overall objective of the air : quality modeling part of the project is to deve...
Random Assignment: Practical Considerations from Field Experiments.
ERIC Educational Resources Information Center
Dunford, Franklyn W.
1990-01-01
Seven qualitative issues associated with randomization that have the potential to weaken or destroy otherwise sound experimental designs are reviewed and illustrated via actual field experiments. Issue areas include ethics and legality, liability risks, manipulation of randomized outcomes, hidden bias, design intrusiveness, case flow, and…
Wisconsin Recertification Manual for Public Librarians.
ERIC Educational Resources Information Center
Fox, Robert; And Others
Designed to assist public librarians certified after May 1, 1979, this manual explains Wisconsin recertification requirements based on continuing education. It provides continuing education guidelines, a flow chart of the recertification process, an individual learning activity form, an annual report form, a conversion chart for assignment of…
Efficient Trajectory Options Allocation for the Collaborative Trajectory Options Program
NASA Technical Reports Server (NTRS)
Rodionova, Olga; Arneson, Heather; Sridhar, Banavar; Evans, Antony
2017-01-01
The Collaborative Trajectory Options Program (CTOP) is a Traffic Management Initiative (TMI) intended to control air traffic flow rates at multiple specified Flow Constrained Areas (FCAs), where demand exceeds capacity. CTOP allows flight operators to submit a desired Trajectory Options Set (TOS) for each affected flight, with an associated Relative Trajectory Cost (RTC) for each option. CTOP then creates a feasible schedule that complies with capacity constraints by assigning routes and departure delays to affected flights in such a way as to minimize the total cost while maintaining equity across flight operators. The current version of CTOP implements a Ration-by-Schedule (RBS) scheme, which assigns the best available options to flights based on a First-Scheduled-First-Served heuristic. In the present study, an alternative flight scheduling approach is developed based on linear optimization. Results suggest that such an approach can significantly reduce flight delays in the deterministic case, while maintaining equity as defined using a Max-Min fairness scheme.
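The Ration-by-Schedule baseline can be sketched in a few lines: walk the flights in scheduled order and give each its cheapest option whose slot still has capacity. The data layout and the ground-hold fallback here are illustrative assumptions, not CTOP's actual interface.

```python
def ration_by_schedule(flights, capacity):
    """First-Scheduled-First-Served sketch.
    flights: list of {"id", "sched", "options": [(slot, cost), ...]}
    capacity: dict slot -> remaining entries at the constrained area.
    Returns flight id -> (slot, cost), or (None, None) for a ground hold."""
    cap = dict(capacity)
    result = {}
    for f in sorted(flights, key=lambda f: f["sched"]):
        for slot, cost in sorted(f["options"], key=lambda o: o[1]):
            if cap.get(slot, 0) > 0:   # cheapest option with room left
                cap[slot] -= 1
                result[f["id"]] = (slot, cost)
                break
        else:
            result[f["id"]] = (None, None)
    return result
```

The linear-optimization alternative studied in the paper would instead choose all assignments jointly to minimize total cost, rather than committing greedily flight by flight.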
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nettesheim, D.G.; Klevit, R.E.; Drobny, G.
1989-02-21
The authors report the sequential assignment of resonances to specific residues in the proton nuclear magnetic resonance spectrum of the variant-3 neurotoxin from the scorpion Centruroides sculpturatus Ewing (range southwestern U.S.A.). A combination of two-dimensional NMR experiments such as 2D-COSY, 2D-NOESY, and single- and double-RELAY coherence transfer spectroscopy has been employed on samples of the protein dissolved in D{sub 2}O and in H{sub 2}O for assignment purposes. These studies provide a basis for the determination of the solution-phase conformation of this protein and for undertaking detailed structure-function studies of these neurotoxins, which modulate the flow of sodium current by binding to the sodium channels of excitable membranes.
Game theoretic sensor management for target tracking
NASA Astrophysics Data System (ADS)
Shen, Dan; Chen, Genshe; Blasch, Erik; Pham, Khanh; Douville, Philip; Yang, Chun; Kadar, Ivan
2010-04-01
This paper develops and evaluates a game-theoretic approach to distributed sensor-network management for target tracking via sensor-based negotiation. We present a distributed sensor-based negotiation game model of sensor management for multi-sensor multi-target tracking situations. In our negotiation framework, each negotiation agent represents a sensor, and each sensor maximizes its utility using a game approach. The greediness of each sensor is limited by the fact that sensor-to-target assignment efficiency decreases if too many sensor resources are assigned to the same target. This is similar to market behavior in the real world, such as agreements between buyers and sellers in an auction market. Sensors are willing to switch targets so that they can obtain the highest utility and apply their resources most efficiently. Our sub-game perfect equilibrium-based negotiation strategies assign sensors to targets dynamically and in a distributed fashion. Numerical simulations are performed to demonstrate our sensor-based negotiation approach for distributed sensor management.
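The crowding-limited greediness can be illustrated with a toy best-response loop: each sensor's marginal utility on a target shrinks as more sensors pile onto it, so sensors spread out. The utility form gain/(1 + others) is a hypothetical choice for illustration, not the paper's payoff function.

```python
def assign_sensors(gains, rounds=10):
    """gains[s][t]: base tracking gain of sensor s on target t.
    Utility of joining target t is gains[s][t] / (1 + others already on t),
    so crowding a target lowers the payoff. Sensors take turns switching to
    their best target until no one wants to move (a simple equilibrium)."""
    n_sensors, n_targets = len(gains), len(gains[0])
    assign = [0] * n_sensors
    for _ in range(rounds):
        moved = False
        for s in range(n_sensors):
            counts = [assign.count(t) for t in range(n_targets)]
            counts[assign[s]] -= 1   # exclude sensor s itself
            def util(t):
                return gains[s][t] / (1.0 + counts[t])
            best = max(range(n_targets), key=util)
            if util(best) > util(assign[s]):
                assign[s] = best
                moved = True
        if not moved:
            break
    return assign
```

With two identical sensors and gains [[10, 6], [10, 6]], both starting on target 0, one sensor switches to target 1 because 6/1 beats 10/2, after which neither wants to move.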
Argueta, Edwin; Shaji, Jeena; Gopalan, Arun; Liao, Peilin; Snurr, Randall Q; Gómez-Gualdrón, Diego A
2018-01-09
Metal-organic frameworks (MOFs) are porous crystalline materials with attractive properties for gas separation and storage. Their remarkable tunability makes it possible to create millions of MOF variations but creates the need for fast material screening to identify promising structures. Computational high-throughput screening (HTS) is a possible solution, but its usefulness is tied to accurate predictions of MOF adsorption properties. Accurate adsorption simulations often require an accurate description of electrostatic interactions, which depend on the electronic charges of the MOF atoms. HTS-compatible methods to assign charges to MOF atoms need to accurately reproduce electrostatic potentials (ESPs) and be computationally affordable, but current methods present an unsatisfactory trade-off between computational cost and accuracy. We illustrate a method to assign charges to MOF atoms based on ab initio calculations on MOF molecular building blocks. A library of building blocks with built-in charges is thus created and used by an automated MOF construction code to create hundreds of MOFs with charges "inherited" from the constituent building blocks. The molecular building block-based (MBBB) charges are similar to REPEAT charges (charges that reproduce ESPs obtained from ab initio calculations on crystallographic unit cells of nanoporous crystals), and thus similar predictions of adsorption loadings, heats of adsorption, and Henry's constants are obtained with either method. The presented results indicate that the MBBB method to assign charges to MOF atoms is suitable for use in computational high-throughput screening of MOFs for applications that involve adsorption of molecules such as carbon dioxide.
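The charge "inheritance" step can be sketched as pure bookkeeping: every atom of the assembled framework copies the charge precomputed for its building block, and a small uniform correction restores overall neutrality. The data shapes and the uniform-correction step are illustrative assumptions, not the paper's exact procedure.

```python
def assemble_charges(blocks, counts):
    """blocks: block name -> list of per-atom charges (from hypothetical
    ab initio runs on the isolated building block).
    counts: block name -> number of copies in the unit cell.
    Each atom inherits its block's charge; any residual net charge is
    spread uniformly so the assembled framework stays neutral."""
    charges = []
    for name, n in counts.items():
        charges.extend(blocks[name] * n)
    residual = sum(charges) / len(charges)
    return [q - residual for q in charges]
```

Because isolated-block charges need not sum exactly to zero once blocks are combined, some neutralization scheme is required; spreading the residual uniformly is the simplest possible choice.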
NASA Astrophysics Data System (ADS)
Szemis, J. M.; Maier, H. R.; Dandy, G. C.
2012-08-01
Rivers, wetlands, and floodplains are in need of management, as they have been altered from natural conditions and are at risk of vanishing because of river development. One method to mitigate these impacts involves the scheduling of environmental flow management alternatives (EFMAs); however, this is a complex task, as there are generally a large number of ecological assets (e.g., wetlands) that need to be considered, each hosting species with competing flow requirements. Hence, this problem becomes an optimization problem: maximize an ecological benefit within constraints imposed by human needs and the physical layout of the system. This paper presents a novel optimization framework that uses ant colony optimization to enable optimal scheduling of EFMAs, given constraints on the environmental water that is available. This optimization algorithm is selected because, unlike other currently popular algorithms, it is able to account for all aspects of the problem. The approach is validated by comparing it to a heuristic approach, and its utility is demonstrated using a case study based on the Murray River in South Australia to investigate (1) the trade-off between plant recruitment (i.e., promoting germination) and maintenance (i.e., maintaining habitat) flow requirements, (2) the trade-off between flora and fauna flow requirements, and (3) a hydrograph inversion case. The results demonstrate the usefulness and flexibility of the proposed framework, as it is able to determine EFMA schedules that provide optimal or near-optimal trade-offs between the competing needs of species under a range of operating conditions, as well as valuable insight for managers.
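A minimal ant colony optimization loop for this kind of problem picks one flow alternative per asset, rejects schedules that exceed the water budget, and deposits pheromone on the best schedule found. The benefit/cost encoding and parameters below are a toy construction, not the paper's formulation.

```python
import random

def aco_schedule(benefit, cost, budget, n_ants=30, n_iter=60, rho=0.1, seed=1):
    """Tiny ACO sketch: choose one option per asset.
    benefit[a][o], cost[a][o]: ecological benefit and water cost of option o
    at asset a. Schedules whose total cost exceeds the budget are infeasible."""
    rng = random.Random(seed)
    n_assets = len(benefit)
    tau = [[1.0] * len(benefit[a]) for a in range(n_assets)]  # pheromone
    best, best_val = None, -1.0
    for _ in range(n_iter):
        for _ in range(n_ants):
            sol, spent = [], 0.0
            for a in range(n_assets):
                opts = list(range(len(benefit[a])))
                w = [tau[a][o] * (1.0 + benefit[a][o]) for o in opts]
                o = rng.choices(opts, weights=w)[0]   # probabilistic choice
                sol.append(o)
                spent += cost[a][o]
            if spent <= budget:
                val = sum(benefit[a][sol[a]] for a in range(n_assets))
                if val > best_val:
                    best, best_val = sol, val
        for a in range(n_assets):                      # evaporation
            for o in range(len(tau[a])):
                tau[a][o] *= (1 - rho)
        if best:                                       # reinforce best-so-far
            for a in range(n_assets):
                tau[a][best[a]] += 1.0
    return best, best_val
```

Pheromone reinforcement is what lets later ants concentrate on schedule components that appeared in good feasible schedules, which is the property the abstract credits for handling all aspects of the problem.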
Heat flow, seismic cut-off depth and thermal modeling of the Fennoscandian Shield
NASA Astrophysics Data System (ADS)
Veikkolainen, Toni; Kukkonen, Ilmo T.; Tiira, Timo
2017-12-01
Being far from plate boundaries but covered with seismograph networks, the Fennoscandian Shield is an ideal natural laboratory for studies of intraplate seismicity. For this purpose, this study uses 4190 earthquake events from the years 2000-2015, with magnitudes ranging from 0.10 to 5.22, drawn from the Finnish and Swedish national catalogues. In addition, 223 heat flow determinations from both countries and their immediate vicinity were used to analyse the potential correlation between earthquake focal depths and the spatially interpolated heat flow field. Separate subset analyses were performed for five areas of notable seismic activity: the southern Gulf of Bothnia coast of Sweden (area 1), the northern Gulf of Bothnia coast of Sweden (area 2), Swedish Norrbotten and western Finnish Lapland (area 3), the Kuusamo region of Finland (area 4), and southernmost Sweden (area 5). In total, the subsets incorporated 3619 earthquake events. No obvious relation between heat flow and focal depth exists, implying that variations in heat flow are primarily caused by shallow-lying heat-producing units rather than deeper sources. This allows for the construction of generic geotherms for the range of representative palaeoclimatically corrected (steady-state) surface heat flow values (40-60 mW m-2). The 1-D geotherms, constructed for a three-layer crust and lithospheric upper mantle, are based on mantle heat flow constrained with the aid of mantle xenolith thermobarometry (9-15 mW m-2), upper crustal heat production values (3.3-1.1 μW m-3), and the brittle-ductile transition temperature (350 °C) assigned to the cut-off depth of seismicity (28 ± 4 km). For the middle and lower crust, heat production values of 0.6 and 0.2 μW m-3 were assigned, respectively. The models suggest a Moho temperature range of 460-500 °C.
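A 1-D steady-state conductive geotherm with layered heat production, of the kind the study constructs, follows from integrating Fourier's law twice per layer. The layer values in the usage below mirror the abstract's ranges but are illustrative picks, not the paper's exact model.

```python
def geotherm(t_surface, q_surface, layers):
    """1-D steady-state conductive geotherm.
    layers: list of (thickness_m, conductivity_W_mK, heat_production_W_m3).
    Within a layer: T(z) = T0 + (q0/k) z - A z^2 / (2k), and the heat flow
    decreases downward as q(z) = q0 - A z.
    Returns (temperatures at layer bottoms, heat flow at the base)."""
    T, q, temps = t_surface, q_surface, []
    for h, k, A in layers:
        T += (q / k) * h - A * h * h / (2 * k)
        q -= A * h
        temps.append(T)
    return temps, q
```

With a 50 mW m-2 surface heat flow, 1.5 μW m-3 over a 15 km upper crust, and 0.6 μW m-3 over a 13 km middle crust (k = 2.5 W m-1 K-1 throughout), the temperature at 28 km comes out near 355 °C, consistent with the ~350 °C brittle-ductile transition the study assigns to the seismic cut-off depth.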
Does Non-Compliance with Route/Destination Assignment Compromise Evacuation Efficiency?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Fang; Han, Lee; Chin, Shih-Miao
2007-01-01
This paper documents studies of two real-world network evacuation cases, each with a different, but proven, simulation software package. The purpose of these studies was to examine whether the rate of evacuees' compliance with predetermined route/destination assignments would have an impact on the efficiency of evacuation operations. Results from both cases suggest that a rate of less than 100% compliance does not compromise evacuation efficiency. In fact, although this is counter-intuitive, evacuation efficiency would actually improve as a result of "sensible" non-compliance on the part of the evacuees. A closer observation of the results revealed that the somewhat unexpected improvement results from a reduction in congestion along designated evacuation routes as evacuees spread out to less prominent parallel streets and other non-congested outbound routes. This suggests that by being limited by the zone-to-zone and one-to-one assignment framework, conventional evacuation plans may have fallen short of providing the most efficient guidance to evacuees. To address this issue, some systematic means, perhaps simulation-based, should be performed to assess the zone partitions, route designations, and destination assignments in existing evacuation plans. Thus, evacuation planning with route/destination assignments based on origin zones may be flawed and may deserve reconsideration. After all, once en route, where an evacuee is coming from is of far less consequence than where he or she is going.
Belmar, Oscar; Velasco, Josefa; Martinez-Capel, Francisco
2011-05-01
Hydrological classification constitutes the first step of a new holistic framework for developing regional environmental flow criteria: the "Ecological Limits of Hydrologic Alteration" (ELOHA). The aim of this study was to develop a classification for 390 stream sections of the Segura River Basin based on 73 hydrological indices that characterize their natural flow regimes. The hydrological indices were calculated with 25 years of natural monthly flows (1980/81-2005/06) derived from a rainfall-runoff model developed by the Spanish Ministry of Environment and Public Works. These indices included, at a monthly or annual basis, measures of duration of droughts and of central tendency and dispersion of flow magnitude (average, low, and high flow conditions). Principal Component Analysis (PCA) indicated high redundancy among most hydrological indices, as well as two gradients: flow magnitude for mainstream rivers and temporal variability for tributary streams. A classification with eight flow-regime classes was chosen as the most easily interpretable in the Segura River Basin, which was supported by ANOSIM analyses. These classes can be simplified into 4 broader groups with different seasonal discharge patterns: large rivers, perennial stable streams, perennial seasonal streams, and intermittent and ephemeral streams. They showed a high degree of spatial cohesion, following a gradient associated with climatic aridity from NW to SE, and were well defined in terms of the fundamental variables in Mediterranean streams: magnitude and temporal variability of flows. Therefore, this classification is a fundamental tool to support water management and planning in the Segura River Basin. Future research will allow us to study the flow alteration-ecological response relationship for each river type, and lay the basis for designing scientifically credible environmental flows following the ELOHA framework.
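A toy classifier conveys the flavor of bucketing streams into the four broad groups from two simple flow-regime indices. The index choices (zero-flow fraction, coefficient of variation) and all thresholds below are hypothetical illustrations, not the paper's PCA/ANOSIM-supported classes.

```python
from statistics import mean, pstdev

def classify_regime(monthly_flows):
    """Assign a stream to one of the four broad groups named in the study,
    using two toy indices computed from its natural monthly flow series
    (m3/s). Thresholds are illustrative only."""
    zero_frac = sum(f == 0 for f in monthly_flows) / len(monthly_flows)
    if zero_frac > 0.1:                 # frequent dry months
        return "intermittent/ephemeral"
    m = mean(monthly_flows)
    if m > 50:                          # high-magnitude mainstem flow
        return "large river"
    cv = pstdev(monthly_flows) / m      # temporal variability
    return "perennial seasonal" if cv > 0.5 else "perennial stable"
```

This mirrors the two gradients the PCA exposed: magnitude separates the large rivers, while temporal variability separates the tributary stream classes.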
Array distribution in data-parallel programs
NASA Technical Reports Server (NTRS)
Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert; Sheffler, Thomas J.
1994-01-01
We consider distribution at compile time of the array data in a distributed-memory implementation of a data-parallel program written in a language like Fortran 90. We allow dynamic redistribution of data and define a heuristic algorithmic framework that chooses distribution parameters to minimize an estimate of program completion time. We represent the program as an alignment-distribution graph. We propose a divide-and-conquer algorithm for distribution that initially assigns a common distribution to each node of the graph and successively refines this assignment, taking computation, realignment, and redistribution costs into account. We explain how to estimate the effect of distribution on computation cost and how to choose a candidate set of distributions. We present the results of an implementation of our algorithms on several test problems.
Compression-based integral curve data reuse framework for flow visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Fan; Bi, Chongke; Guo, Hanqi
Currently, by default, integral curves are repeatedly re-computed in different flow visualization applications, such as FTLE field computation, source-destination queries, etc., leading to unnecessary resource cost. We present a compression-based data reuse framework for integral curves, to greatly reduce their retrieval cost, especially in a resource-limited environment. In our design, a hierarchical and hybrid compression scheme is proposed to balance three objectives, including high compression ratio, controllable error, and low decompression cost. Specifically, we use and combine digitized curve sparse representation, floating-point data compression, and octree space partitioning to adaptively achieve the objectives. Results have shown that our data reuse framework could acquire tens of times acceleration in the resource-limited environment compared to on-the-fly particle tracing, and keep controllable information loss. Moreover, our method could provide fast integral curve retrieval for more complex data, such as unstructured mesh data.
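The "digitized curve representation plus floating-point compression" idea can be sketched with stdlib tools: quantize each curve point to fixed-point integers over its bounding box, then deflate the integer stream. This toy omits the paper's hierarchical scheme and octree partitioning, and the quantization step bounds the reconstruction error.

```python
import struct
import zlib

def compress_curve(points, bits=16):
    """Quantize a 2-D polyline to unsigned fixed-point and deflate it.
    Returns (bbox_min, bbox_max, point count, compressed payload)."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    lo, hi = (min(xs), min(ys)), (max(xs), max(ys))
    scale = (1 << bits) - 1
    q = []
    for x, y in points:
        q.append(round((x - lo[0]) / (hi[0] - lo[0] or 1) * scale))
        q.append(round((y - lo[1]) / (hi[1] - lo[1] or 1) * scale))
    payload = zlib.compress(struct.pack("<%dH" % len(q), *q))
    return lo, hi, len(points), payload

def decompress_curve(lo, hi, n, payload, bits=16):
    """Inverse of compress_curve: inflate and rescale to the bounding box."""
    scale = (1 << bits) - 1
    q = struct.unpack("<%dH" % (2 * n), zlib.decompress(payload))
    return [(lo[0] + q[2 * i] / scale * (hi[0] - lo[0]),
             lo[1] + q[2 * i + 1] / scale * (hi[1] - lo[1]))
            for i in range(n)]
```

Reusing the decompressed curve costs one inflate-and-rescale pass instead of a full particle-tracing integration, which is the trade the framework exploits.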
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Kurt; James, Scott C.; Roberts, Jesse D.
A modelling framework identifies deployment locations for current-energy-capture devices that maximise power output while minimising potential environmental impacts. The framework, based on the Environmental Fluid Dynamics Code, can incorporate site-specific environmental constraints. Over a 29-day period, energy outputs from three array layouts were estimated for: (1) the preliminary configuration (baseline), (2) an updated configuration that accounted for environmental constraints, and (3) an improved configuration subject to no environmental constraints. Of these layouts, array placement that did not consider environmental constraints extracted the most energy from flow (4.38 MW-hr/day), 19% higher than output from the baseline configuration (3.69 MW-hr/day). Array placement that considered environmental constraints removed 4.27 MW-hr/day of energy (16% more than baseline). In conclusion, this analysis framework accounts for bathymetry and flow-pattern variations that typical experimental studies cannot, demonstrating that it is a valuable tool for identifying improved array layouts for field deployments.
Unbiased multi-fidelity estimate of failure probability of a free plane jet
NASA Astrophysics Data System (ADS)
Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin
2017-11-01
Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low-fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions on the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low-fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high-fidelity model. In the presence of multiple low-fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact on fluid flow applications. This work was funded by DARPA.
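The importance-sampling core of this construction can be shown on a 1-D toy problem: sample from a biasing distribution concentrated near the failure region (which in the paper is chosen using a cheap low-fidelity model), and weight the failure indicator by the ratio of nominal to biasing densities. The Gaussian setup below is an illustrative assumption; the multi-model fusion step is omitted.

```python
import math
import random

def importance_estimate(hi_model, bias_mean, n, seed=0):
    """Unbiased importance-sampling estimate of P(failure) under a nominal
    N(0,1) input, sampling from a biasing N(bias_mean, 1) instead.
    hi_model(x) -> True on failure (stands in for the expensive model)."""
    rng = random.Random(seed)
    def pdf(x, mu):  # unit-variance normal density
        return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(bias_mean, 1.0)
        if hi_model(x):
            total += pdf(x, 0.0) / pdf(x, bias_mean)  # likelihood ratio
    return total / n
```

For the rare event x > 3 (true probability about 1.35e-3), plain Monte Carlo would see only a handful of failures in 20,000 draws, while the biased sampler hits the failure region in roughly a third of them and remains unbiased through the weights.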
2010-01-01
Background Computerized ICUs rely on software services to convey the medical condition of their patients as well as assisting the staff in taking treatment decisions. Such services are useful for following clinical guidelines quickly and accurately. However, the development of services is often time-consuming and error-prone. Consequently, many care-related activities are still conducted based on manually constructed guidelines. These are often ambiguous, which leads to unnecessary variations in treatments and costs. The goal of this paper is to present a semi-automatic verification and translation framework capable of turning manually constructed diagrams into ready-to-use programs. This framework combines the strengths of the manual and service-oriented approaches while decreasing their disadvantages. The aim is to close the gap in communication between the IT and the medical domain. This leads to a less time-consuming and error-prone development phase and a shorter clinical evaluation phase. Methods A framework is proposed that semi-automatically translates a clinical guideline, expressed as an XML-based flow chart, into a Drools Rule Flow by employing semantic technologies such as ontologies and SWRL. An overview of the architecture is given and all the technology choices are thoroughly motivated. Finally, it is shown how this framework can be integrated into a service-oriented architecture (SOA). Results The applicability of the Drools Rule language to express clinical guidelines is evaluated by translating an example guideline, namely the sedation protocol used for the anaesthetization of patients, to a Drools Rule Flow and executing and deploying this Rule-based application as a part of a SOA. The results show that the performance of Drools is comparable to other technologies such as Web Services and increases with the number of decision nodes present in the Rule Flow. Most delays are introduced by loading the Rule Flows. 
Conclusions The framework is an effective solution for computerizing clinical guidelines as it allows for quick development, evaluation and human-readable visualization of the Rules and has a good performance. By monitoring the parameters of the patient to automatically detect exceptional situations and problems and by notifying the medical staff of tasks that need to be performed, the computerized sedation guideline improves the execution of the guideline. PMID:20082700
Ongenae, Femke; De Backere, Femke; Steurbaut, Kristof; Colpaert, Kirsten; Kerckhove, Wannes; Decruyenaere, Johan; De Turck, Filip
2010-01-18
Simulations of a Liquid Hydrogen Inducer at Low-Flow Off-Design Flow Conditions
NASA Technical Reports Server (NTRS)
Hosangadi, A.; Ahuja, V.; Ungewitter, R. J.
2005-01-01
The ability to accurately model details of inlet back flow for inducers operating at low-flow, off-design conditions is evaluated. A sub-scale version of a three-bladed liquid hydrogen inducer, tested in water with detailed velocity and pressure measurements, is used as a numerical test bed. Under low-flow, off-design conditions, the length of the separation zone as well as the swirl velocity magnitude was under-predicted with a standard k-ε model. When the turbulent viscosity coefficient was reduced, good comparison was obtained at all the flow conditions examined, with both the magnitude and shape of the profile matching well with the experimental data taken half a diameter upstream of the leading edge. The velocity profiles and incidence angles at the leading edge itself were less sensitive to the back flow length predictions, indicating that single-phase performance may be well predicted even if the details of the modeled flow separation are incorrect. However, for cavitating flow situations, the prediction of the correct swirl in the back flow and the pressure depression in the core becomes critical, since it leads to vapor formation. The simulations have been performed using the CRUNCH CFD(R) code, which has a generalized multi-element unstructured framework and an advanced multi-phase formulation for cryogenic fluids. The framework has been validated rigorously for predictions of temperature and pressure depression in cryogenic fluid cavities and has also been shown to predict the cavitation breakdown point for inducers at design conditions.
CLARA: CLAS12 Reconstruction and Analysis Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo
2016-11-01
In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data-stream processing, and a service-oriented architecture (SOA) within a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data-processing applications. The CLARA framework offers solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
Ductal carcinoma of breast: nuclear grade as a predictor of S-phase fraction.
Dabbs, D J
1993-06-01
Nuclear grade (NG) and S-phase fraction (SPF) are established independent prognostic variables for ductal breast carcinomas. Nuclear grade can be assigned by a pathologist in a simple fashion during histopathologic evaluation of the tumor, while SPF requires flow cytometric evaluation of tumor samples. This prospective study was undertaken to determine whether elevated SPF could be predicted from NG alone and how NG and SPF correlate with c-erbB-2 expression. Eighty-two breast carcinomas of ductal type were assigned an NG of low (grade 1 or grade 2) or high (grade 3). S-phase fraction was recorded initially from fresh-frozen tissue samples and was designated as either low SPF (below the value designated as the cutoff for elevated SPF) or high SPF (a value at or greater than the cutoff value). On fresh tissue the NG predicted the range of SPF (low or high) in 89% of cases. Four percent of the cases that did not correlate could definitely be attributed to sample error. The remaining 7% that did not correlate could have been due to sample error, specimen quality, or tumor heterogeneity, as demonstrated by reversal of SPF range as performed on paraffin blocks of tumor. Eighty-eight percent of the tumors positive for c-erbB-2 were NG 3 and 12% were NG 2. All c-erbB-2 tumors were aneuploid. This study demonstrates the importance of carefully assigning NGs on tissue and indicates the importance of reviewing flow cytometric data side by side with histopathologic parameters to detect discrepancies between these two modalities. Careful nuclear grading assignment can accurately predict the range of SPF.
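The 89% figure is a simple concordance count over the 2x2 cross-classification of NG band against SPF band; as a sketch (the case counts in the usage below are hypothetical, chosen only to reproduce an 89% rate):

```python
def concordance(cases):
    """cases: list of (ng_high, spf_high) booleans per tumor.
    Returns the fraction of tumors where the nuclear-grade band (low/high)
    correctly predicts the S-phase-fraction band."""
    agree = sum(ng == spf for ng, spf in cases)
    return agree / len(cases)
```

In the study, the 11% of discordant cases were further examined for sample error, specimen quality, and tumor heterogeneity, which is why reviewing flow cytometric data alongside histopathology matters.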
Interaction Between Strategic and Local Traffic Flow Controls
NASA Technical Reports Server (NTRS)
Grabbe, Son; Sridhar, Banavar; Mukherjee, Avijit; Morando, Alexander
2010-01-01
The loosely coordinated sets of traffic flow management initiatives that are operationally implemented at the national- and local-levels have the potential to under, over, and inconsistently control flights. This study is designed to explore these interactions through fast-time simulations with an emphasis on identifying inequitable situations in which flights receive multiple uncoordinated delays. Two operationally derived scenarios were considered in which flights arriving into the Dallas/Fort Worth International Airport were first controlled at the national-level, either with a Ground Delay Program or a playbook reroute. These flights were subsequently controlled at the local level. The Traffic Management Advisor assigned them arrival scheduling delays. For the Ground Delay Program scenarios, between 51% and 53% of all arrivals experience both pre-departure delays from the Ground Delay Program and arrival scheduling delays from the Traffic Management Advisor. Of the subset of flights that received multiple delays, between 5.7% and 6.4% of the internal departures were first assigned a pre-departure delay by the Ground Delay Program, followed by a second pre-departure delay as a result of the arrival scheduling. For the playbook reroute scenario, Dallas/Fort Worth International Airport arrivals were first assigned pre-departure reroutes based on the MW_2_DALLAS playbook plan, and were subsequently assigned arrival scheduling delays by the Traffic Management Advisor. Since the airport was operating well below capacity when the playbook reroute was in effect, only 7% of the arrivals were observed to receive both rerouting and arrival scheduling delays. Findings from these initial experiments confirm field observations that Ground Delay Programs operated in conjunction with arrival scheduling can result in inequitable situations in which flights receive multiple uncoordinated delays.
76 FR 34658 - The Internet Assigned Numbers Authority (IANA) Functions
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-14
... raised concerns that short-term contracts create instability in the IANA functions process and would... political sustainability of an Internet that supports the free flow of information, goods, and services... account security and stability issues. Commenters were divided on whether the IANA functions should be...
A scheduling model for the aerial relay system
NASA Technical Reports Server (NTRS)
Ausrotas, R. A.; Liu, E. W.
1980-01-01
The ability of the Aerial Relay System to handle the U.S. transcontinental large hub passenger flow was analyzed with a flexible, interactive computer model. The model incorporated city pair time of day demand and a demand allocation function which assigned passengers to their preferred flights.
Development of a dynamic traffic assignment model to evaluate lane-reversal plans for I-65.
DOT National Transportation Integrated Search
2010-05-01
This report presents the methodology and results from a project that studied contra-flow operations in support of : hurricane evacuations in the state of Alabama. As part of this effort, a simulation model was developed using the : VISTA platform for...
Code of Federal Regulations, 2010 CFR
2010-10-01
... organizational level (e.g., designations and delegations of authority, assignments of responsibilities, work-flow....) as implemented in 5 CFR part 1320 (see 1.105) and the Regulatory Flexibility Act (5 U.S.C. 601, et seq.). Normally, when a law requires publication of a proposed regulation, the Regulatory Flexibility...
Design Document. EKG Interpretation Program.
ERIC Educational Resources Information Center
Webb, Sandra M.
This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing…
NASA Technical Reports Server (NTRS)
Chalot, F.; Hughes, T. J. R.; Johan, Z.; Shakib, F.
1991-01-01
A finite element method for the compressible Navier-Stokes equations is introduced. The discretization is based on entropy variables. The methodology is developed within the framework of a Galerkin/least-squares formulation to which a discontinuity-capturing operator is added. Results for four test cases selected among those of the Workshop on Hypersonic Flows for Reentry Problems are presented.
NASA Technical Reports Server (NTRS)
Chalot, F.; Hughes, T. J. R.; Johan, Z.; Shakib, F.
1991-01-01
An FEM for the compressible Navier-Stokes equations is introduced. The discretization is based on entropy variables. The methodology is developed within the framework of a Galerkin/least-squares formulation to which a discontinuity-capturing operator is added. Results for three test cases selected among those of the Workshop on Hypersonic Flows for Reentry Problems are presented.
A Characteristics-Based Approach to Radioactive Waste Classification in Advanced Nuclear Fuel Cycles
NASA Astrophysics Data System (ADS)
Djokic, Denia
The radioactive waste classification system currently used in the United States primarily relies on a source-based framework. This has led to numerous issues, such as wastes that are not categorized by their intrinsic risk, or wastes that do not fall under a category within the framework and therefore are without a legal imperative for responsible management. Furthermore, in the possible case that advanced fuel cycles were to be deployed in the United States, the shortcomings of the source-based classification system would be exacerbated: advanced fuel cycles implement processes such as the separation of used nuclear fuel, which introduce new waste streams of varying characteristics. To be able to manage and dispose of these potential new wastes properly, the development of a classification system that would assign an appropriate level of management to each type of waste based on its physical properties is imperative. This dissertation explores how characteristics from wastes generated from potential future nuclear fuel cycles could be coupled with a characteristics-based classification framework. A static mass flow model developed under the Department of Energy's Fuel Cycle Research & Development program, called the Fuel-cycle Integration and Tradeoffs (FIT) model, was used to calculate the composition of waste streams resulting from different nuclear fuel cycle choices: two modified open fuel cycle cases (recycle in MOX reactor) and two different continuous-recycle fast reactor recycle cases (oxide and metal fuel fast reactors). This analysis focuses on the impact of waste heat load on waste classification practices, although future work could involve coupling waste heat load with metrics of radiotoxicity and longevity. The value of separation of heat-generating fission products and actinides in different fuel cycles and how it could inform long- and short-term disposal management is discussed.
It is shown that the benefits of reducing the short-term fission-product heat load of waste destined for geologic disposal are neglected under the current source-based radioactive waste classification system, and that it is useful to classify waste streams based on how favorable the impact of interim storage is on increasing repository capacity. The need for a more diverse set of waste classes is discussed, and it is shown that the characteristics-based IAEA classification guidelines could accommodate wastes created from advanced fuel cycles more comprehensively than the U.S. classification framework.
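The heat-load metric at the center of this analysis can be illustrated with a back-of-the-envelope decay-power calculation. The isotopes, inventories, and per-decay energies below are illustrative assumptions, not outputs of the FIT model:

```python
import math

# Hedged sketch: short-term waste heat load is dominated by fission
# products such as Cs-137 and Sr-90. Per-isotope decay power is
# P = N * lambda * E. All masses and energies here are illustrative.
N_A = 6.02214076e23          # 1/mol
MEV_TO_J = 1.602176634e-13   # J/MeV
YEAR = 365.25 * 24 * 3600    # s

def decay_power(mass_g, mass_number, half_life_yr, E_mev):
    lam = math.log(2) / (half_life_yr * YEAR)   # decay constant, 1/s
    N = mass_g / mass_number * N_A              # number of atoms
    return N * lam * E_mev * MEV_TO_J           # watts

# (mass / g, A, half-life / yr, assumed energy per decay / MeV)
inventory = [
    (100.0, 137, 30.08, 1.176),  # Cs-137 (+ Ba-137m gamma)
    (100.0, 90, 28.79, 2.826),   # Sr-90 (+ Y-90 daughter)
]
total = sum(decay_power(*row) for row in inventory)
# A few hundred watts that halve roughly every 30 years -- the reason
# interim storage before geologic disposal raises repository capacity.
```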
Optimal concentrations in transport systems
Jensen, Kaare H.; Kim, Wonjung; Holbrook, N. Michele; Bush, John W. M.
2013-01-01
Many biological and man-made systems rely on transport systems for the distribution of material, for example matter and energy. Material transfer in these systems is determined by the flow rate and the concentration of material. While the most concentrated solutions offer the greatest potential in terms of material transfer, impedance typically increases with concentration, thus making them the most difficult to transport. We develop a general framework for describing systems for which impedance increases with concentration, and consider material flow in four different natural systems: blood flow in vertebrates, sugar transport in vascular plants, and two modes of nectar drinking in birds and insects. The model provides a simple method for determining the optimum concentration c_opt in these systems. The model further suggests that the impedance at the optimum concentration μ_opt may be expressed in terms of the impedance of the pure (c = 0) carrier medium μ_0 as μ_opt ∼ 2^α μ_0, where the power α is prescribed by the specific flow constraints, for example constant pressure for blood flow (α = 1) or constant work rate for certain nectar-drinking insects (α = 6). Comparing the model predictions with experimental data from more than 100 animal and plant species, we find that the simple model rationalizes the observed concentrations and impedances. The model provides a universal framework for studying flows impeded by concentration, and yields insight into optimization in engineered systems, such as traffic flow. PMID:23594815
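The trade-off the model captures, that flux grows with concentration while impedance grows faster, can be sketched numerically. The exponential impedance law and its parameters below are assumptions for illustration, not the paper's fitted forms:

```python
import numpy as np

# Hedged sketch: material flux under a constant driving pressure is
# J(c) ~ c / mu(c). Assume viscosity rises exponentially with
# concentration, mu(c) = mu0 * exp(c / c_star) (an assumed form).
mu0, c_star = 1.0, 0.2            # hypothetical parameters

c = np.linspace(1e-4, 1.0, 10_000)
J = c / (mu0 * np.exp(c / c_star))

c_opt = c[np.argmax(J)]
# For this mu(c), setting dJ/dc = 0 gives c_opt = c_star exactly, so the
# numerical maximum lands at c ~ 0.2: dilute enough to flow, concentrated
# enough to carry material.
```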
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sossoe, K.S., E-mail: kwami.sossoe@irt-systemx.fr; Lebacque, J-P., E-mail: jean-patrick.lebacque@ifsttar.fr
2015-03-10
We present in this paper a model of vehicular traffic flow for a multimodal transportation road network. We introduce the notion of class of vehicles to refer to vehicles of different transport modes. Our model describes the traffic on highways (which may contain several lanes) and network transit for public transportation. The model is drafted with Eulerian and Lagrangian coordinates and uses a Logit model to describe the traffic assignment of our multiclass vehicular flow description on shared roads. The paper also discusses traffic streams on dedicated lanes for specific classes of vehicles with event-based traffic laws. An Euler-Lagrangian-remap scheme is introduced to numerically approximate the model's flow equations.
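The Logit assignment step mentioned above can be sketched in a few lines. The route costs and sensitivity parameter are hypothetical values for illustration:

```python
import math

def logit_split(costs, theta=0.5):
    """Multinomial Logit assignment: share of demand on each route.

    costs: travel costs per route; theta: cost-sensitivity parameter.
    Both are hypothetical values for illustration.
    """
    weights = [math.exp(-theta * c) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

# Three parallel routes between an origin-destination pair
# (assumed costs in minutes): cheaper routes attract larger shares,
# and the shares sum to one.
shares = logit_split([10.0, 12.0, 15.0], theta=0.5)
```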
Lee, James S; Franc, Jeffrey M
2015-08-01
A high influx of patients during a mass-casualty incident (MCI) may disrupt patient flow in an already overcrowded emergency department (ED) that is functioning beyond its operating capacity. This pilot study examined the impact of a two-step ED triage model using Simple Triage and Rapid Treatment (START) for pre-triage, followed by triage with the Canadian Triage and Acuity Scale (CTAS), on patient flow during an MCI simulation exercise. Hypothesis/Problem: It was hypothesized that there would be no difference in time intervals or patient volumes at each patient-flow milestone. Physicians and nurses participated in a computer-based tabletop disaster simulation exercise. Physicians were randomized into the intervention group using START, then CTAS, or the control group using START alone. Patient-flow milestones including time intervals and patient volumes from ED arrival to triage, ED arrival to bed assignment, ED arrival to physician assessment, and ED arrival to disposition decision were compared. Triage accuracy was compared for secondary purposes. There were no significant differences in the time interval from ED arrival to triage (mean difference 108 seconds; 95% CI, -353 to 596 seconds; P=1.0), ED arrival to bed assignment (mean difference 362 seconds; 95% CI, -1,269 to 545 seconds; P=1.0), ED arrival to physician assessment (mean difference 31 seconds; 95% CI, -1,104 to 348 seconds; P=0.92), and ED arrival to disposition decision (mean difference 175 seconds; 95% CI, -1,650 to 1,300 seconds; P=1.0) between the two groups. There were no significant differences in the volume of patients to be triaged (32% vs 34%; 95% CI for the difference -16% to 21%; P=1.0), assigned a bed (16% vs 21%; 95% CI for the difference -11% to 20%; P=1.0), assessed by a physician (20% vs 22%; 95% CI for the difference -14% to 19%; P=1.0), and with a disposition decision (20% vs 9%; 95% CI for the difference -25% to 4%; P=.34) between the two groups.
The accuracy of triage was similar in both groups (57% vs 70%; 95% CI for the difference -15% to 41%; P=.46). Experienced triage nurses were able to apply CTAS effectively during a MCI simulation exercise. A two-step ED triage model using START, then CTAS, had similar patient flow and triage accuracy when compared to START alone.
Controllability of flow-conservation networks
NASA Astrophysics Data System (ADS)
Zhao, Chen; Zeng, An; Jiang, Rui; Yuan, Zhengzhong; Wang, Wen-Xu
2017-07-01
The ultimate goal of exploring complex networks is to control them. As such, controllability of complex networks has been intensively investigated. Despite recent advances in studying the impact of a network's topology on its controllability, a comprehensive understanding of the synergistic impact of network topology and dynamics on controllability is still lacking. Here, we explore the controllability of flow-conservation networks, trying to identify the minimal number of driver nodes that can guide the network to any desirable state. We develop a method to analyze the controllability of flow-conservation networks based on exact controllability theory, transforming the original analysis on the adjacency matrix to the Laplacian matrix. With this framework, we systematically investigate the impact of some key factors of networks, including link density, link directionality, and link polarity, on the controllability of these networks. We also obtain analytical equations by investigating the networks' structural properties approximately, and design efficient tools. Finally, we consider some real networks with flow dynamics, finding that their controllability is significantly different from that predicted by only considering the topology. These findings deepen our understanding of network controllability with flow-conservation dynamics and provide a general framework to incorporate real dynamics in the analysis of network controllability.
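The exact-controllability computation described above, taking the minimum number of driver nodes as the largest geometric multiplicity over the system matrix's eigenvalues, can be sketched as follows; the star-graph Laplacian is a toy example, not one of the paper's networks:

```python
import numpy as np

def min_drivers(L, tol=1e-8):
    """Exact controllability sketch: N_D = max over eigenvalues lam of
    N - rank(lam*I - L), i.e. the largest geometric multiplicity."""
    n = L.shape[0]
    nd = 1
    for lam in np.linalg.eigvals(L):
        mult = n - np.linalg.matrix_rank(lam * np.eye(n) - L, tol=tol)
        nd = max(nd, mult)
    return nd

# Laplacian of a 4-node star graph (assumed toy network, center node 0).
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], float)
L = np.diag(A.sum(1)) - A
# The star's Laplacian spectrum is {0, 1, 1, 4}: eigenvalue 1 has
# multiplicity 2, so two driver nodes are needed.
n_drivers = min_drivers(L)
```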
Path Flow Estimation Using Time Varying Coefficient State Space Model
NASA Astrophysics Data System (ADS)
Jou, Yow-Jen; Lan, Chien-Lun
2009-08-01
Dynamic path flow information is crucial in the field of transportation operation and management, e.g., dynamic traffic assignment, scheduling plans, and signal timing. Time-dependent path information, which is important in many aspects, is nearly impossible to obtain directly. Consequently, researchers have been seeking estimation methods for deriving valuable path flow information from less expensive traffic data, primarily link traffic counts from surveillance systems. This investigation considers a path flow estimation problem involving a time varying coefficient state space model, the Gibbs sampler, and the Kalman filter. Numerical examples with part of a real network of the Taipei Mass Rapid Transit with real O-D matrices are presented to demonstrate the accuracy of the proposed model. Results of this study show that the time-varying coefficient state space model is very effective in the estimation of path flow compared to the time-invariant model.
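The core estimation idea, recovering unobserved path flows from link counts with a Kalman filter, can be sketched with a toy time-invariant model (the paper's time-varying coefficients and Gibbs sampling are omitted). The network, noise covariances, and seed are assumptions:

```python
import numpy as np

# Hedged sketch (hypothetical network): two paths share links, observed
# link counts are y_t = H x_t + noise, and path flows x_t follow a
# random walk. A standard Kalman filter recovers x_t from y_t.
rng = np.random.default_rng(0)
H = np.array([[1.0, 0.0],    # link 1 carries path 1
              [1.0, 1.0],    # link 2 carries both paths
              [0.0, 1.0]])   # link 3 carries path 2
Q, R = 1.0 * np.eye(2), 4.0 * np.eye(3)   # assumed noise covariances

x_true = np.array([100.0, 50.0])          # unobserved path flows
x_hat, P = np.zeros(2), 1e3 * np.eye(2)   # diffuse initial estimate
for _ in range(200):
    x_true = x_true + rng.normal(0, 1.0, 2)        # random-walk truth
    y = H @ x_true + rng.normal(0, 2.0, 3)         # noisy link counts
    P = P + Q                                      # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    x_hat = x_hat + K @ (y - H @ x_hat)            # measurement update
    P = (np.eye(2) - K @ H) @ P

# x_hat tracks the unobserved path flows using only link counts.
```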
Coupled Thermo-Hydro-Mechanical Numerical Framework for Simulating Unconventional Formations
NASA Astrophysics Data System (ADS)
Garipov, T. T.; White, J. A.; Lapene, A.; Tchelepi, H.
2016-12-01
Unconventional deposits are found in all world oil provinces. Modeling these systems is challenging, however, due to complex thermo-hydro-mechanical processes that govern their behavior. As a motivating example, we consider in situ thermal processing of oil shale deposits. When oil shale is heated to sufficient temperatures, kerogen can be converted to oil and gas products over a relatively short timespan. This phase change dramatically impacts both the mechanical and hydrologic properties of the rock, leading to strongly coupled THMC interactions. Here, we present a numerical framework for simulating tightly-coupled chemistry, geomechanics, and multiphase flow within a reservoir simulator (the AD-GPRS General Purpose Research Simulator). We model changes in the constitutive behavior of the rock using a thermoplasticity model that accounts for microstructural evolution. The multi-component, multiphase flow and transport processes of both mass and heat are modeled at the macroscopic (e.g., Darcy) scale. The phase compositions and properties are described by a cubic equation of state; Arrhenius-type chemical reactions are used to represent kerogen conversion. The system of partial differential equations is discretized using a combination of finite-volumes and finite-elements, respectively, for the flow and mechanics problems. Fully implicit and sequentially implicit methods are used to solve the resulting nonlinear problem. The proposed framework is verified against available analytical and numerical benchmark cases. We demonstrate the efficiency, performance, and capabilities of the proposed simulation framework by analyzing near-well deformation in an oil shale formation.
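The Arrhenius-type kerogen-conversion kinetics can be illustrated with a minimal first-order model; the pre-exponential factor and activation energy below are illustrative, not the simulator's calibrated values:

```python
import math

# Hedged sketch: first-order Arrhenius kinetics for kerogen conversion,
# dX/dt = k(T) * (1 - X), with k(T) = A * exp(-Ea / (R * T)).
# A and Ea are illustrative, not values from the paper.
A, Ea, R = 1.0e13, 2.0e5, 8.314   # 1/s, J/mol, J/(mol K)

def rate(T):
    """Arrhenius rate constant at absolute temperature T (K)."""
    return A * math.exp(-Ea / (R * T))

# Explicit-Euler integration of conversion X at a constant 700 K.
T, X, dt = 700.0, 0.0, 1.0        # K, -, s
for _ in range(100_000):
    X += dt * rate(T) * (1.0 - X)

# X approaches 1 (full conversion); at a higher temperature the rate
# constant, and hence conversion over the same horizon, is larger.
```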
Bio-Inspired Sampling and Reconstruction Framework for Scientific Visualization
2015-09-17
Framework for Scientific Visualization. Grant/Contract Number FA9550-12-1... This project investigated sampling theoretic issues that arise in visualization of 3-D (e.g., in simulation or biomedical) datasets. As sampling and reconstruction are key elements in the visualization pipeline, their mathematical modeling and analysis are foundational to reliability of the resulting visualizations.
ERIC Educational Resources Information Center
Guarino, Cassandra; Dieterle, Steven G.; Bargagliotti, Anna E.; Mason, William M.
2013-01-01
This study investigates the impact of teacher characteristics and instructional strategies on the mathematics achievement of students in kindergarten and first grade and tackles the question of how best to use longitudinal survey data to elicit causal inference in the face of potential threats to validity due to nonrandom assignment to treatment.…
ERIC Educational Resources Information Center
Wissinger, Daniel R.
2012-01-01
The purpose of this study was to explore the effects of Walton, Reed, and Macagno's (2008) dialectical framework on middle school students' historical discussions and written arguments. To do this,151 middle school students from six classrooms were randomly assigned to one of two conditions and asked to participate in a three-week…
ERIC Educational Resources Information Center
Halat, Erdogan
2009-01-01
The aim of this study was to examine the views of pre-service mathematics teachers on the use of webquests in teaching and learning geometry with reference to a theoretical framework developed by Dodge in 1995. For this study the researcher identified four groups containing nineteen pre-service mathematics teachers, which were then assigned to…
ERIC Educational Resources Information Center
Gerzel-Short, Lydia
2013-01-01
This dissertation examined the importance of family involvement in student learning and achievement within the Response to Intervention framework. This study built upon the premise that family involvement in a child's education is paramount if educational gaps are to be closed. Families included in this study were randomly assigned into a…
Improving Long-term Post-wildfire hydrologic simulations using ParFlow
NASA Astrophysics Data System (ADS)
Lopez, S. R.; Kinoshita, A. M.
2015-12-01
Wildfires alter the natural hydrologic processes within a watershed. After vegetation is burned, the combustion of organic material and debris settles into the soil, creating a hydrophobic layer beneath the soil surface with varying degrees of thickness and depth. Vegetation regrowth rates vary as a function of radiative exposure, burn severity, and precipitation patterns. Hydrologic models used by the Burned Area Emergency Response (BAER) teams rely on input data and model calibration constraints that are generally either one-dimensional, empirically-based models, or two-dimensional, conceptually-based models with lumped parameter distributions. These models estimate runoff at the watershed outlet; however, they do not provide a distributed hydrologic simulation at each point within the watershed. This work uses ParFlow, a three-dimensional, distributed hydrologic model, to (1) correlate burn severity with hydrophobicity, (2) evaluate the effect of vegetation recovery rate on water components, and (3) improve flood prediction for managers to help with resource allocation and management operations in burned watersheds. ParFlow is applied to Devil Canyon (43 km2) in San Bernardino, California, which was 97% burned in the 2003 Old Fire. The model set-up uses a 30 m cell size resolution over a 6.7 km by 6.4 km lateral extent. The subsurface reaches 30 m and is assigned a variable cell thickness. Variable subsurface thickness allows users to explicitly consider the degree of recovery throughout the stages of regrowth. Burn severity maps from remotely sensed imagery are used to assign initial hydrophobic layer parameters and thickness. Vegetation regrowth is represented with a satellite-derived Enhanced Vegetation Index. Pre- and post-fire hydrologic response is evaluated using runoff measurements at the watershed outlet and water component (overland flow, lateral flow, baseflow) measurements.
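The Enhanced Vegetation Index used to represent regrowth is a standard per-pixel combination of surface reflectances; the MODIS-style coefficients are standard, while the sample reflectance values below are illustrative:

```python
# Hedged sketch: Enhanced Vegetation Index from per-pixel surface
# reflectances, with the usual MODIS-style coefficients. The sample
# reflectances are illustrative, not values from the study area.
def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """EVI = G * (NIR - Red) / (NIR + C1*Red - C2*Blue + L)."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Dense vegetation reflects strongly in the near-infrared, so EVI is
# high; a recently burned surface gives a much lower value.
dense = evi(nir=0.45, red=0.05, blue=0.03)    # healthy canopy
burned = evi(nir=0.15, red=0.10, blue=0.06)   # post-fire surface
```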
Assessing effects of water abstraction on fish assemblages in Mediterranean streams
Benejam, Lluis; Angermeier, Paul L.; Munne, Antoni; García-Berthou, Emili
2010-01-01
1. Water abstraction strongly affects streams in arid and semiarid ecosystems, particularly where there is a Mediterranean climate. Excessive abstraction reduces the availability of water for human uses downstream and impairs the capacity of streams to support native biota. 2. We investigated the flow regime and related variables in six river basins of the Iberian Peninsula and show that they have been strongly altered, with declining flows (autoregressive models) and groundwater levels during the 20th century. These streams had lower flows and more frequent droughts than predicted by the official hydrological model used in this region. Three of these rivers were sometimes dry, whereas they were predicted by the model to be permanently flowing. Meanwhile, there has been no decrease in annual precipitation. 3. We also investigated the fish assemblage of a stream in one of these river basins (Tordera) for 6 years and show that sites more affected by water abstraction display significant differences in four fish metrics (catch per unit effort, number of benthic species, number of intolerant species and proportional abundance of intolerant individuals) commonly used to assess the biotic condition of streams. 4. We discuss the utility of these metrics in assessing impacts of water abstraction and point out the need for detailed characterisation of the natural flow regime (and hence drought events) prior to the application of biotic indices in streams severely affected by water abstraction. In particular, in cases of artificially dry streams, it is more appropriate for regulatory agencies to assign index scores that reflect biotic degradation than to assign ‘missing’ scores, as is presently customary in assessments of Iberian streams.
Flow line asymmetric nonimaging concentrating optics
NASA Astrophysics Data System (ADS)
Jiang, Lun; Winston, Roland
2016-09-01
Nonimaging optics achieves the theoretical limits of concentration by utilizing thermodynamic principles rather than conventional optics. Hence in this paper the conditions of the "best" design are both defined and fulfilled in the framework of thermodynamic arguments, which we believe has profound consequences for the design of thermal and even photovoltaic systems, as well as illumination and optical communication tasks. This new way of looking at the problem of efficient concentration depends on probabilities, the geometric flux field, and radiative heat transfer, while "optics" in the conventional sense recedes into the background. Some new developments of flow line designs are introduced, and the connection between thermodynamics and flow line design is formally formulated in the framework of the geometric flux field. A new way of using geometric flux to design nonimaging optics is introduced. Finally, we discuss the possibility of 3D ideal nonimaging optics.
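The thermodynamic limit that motivates this view of concentration can be stated in two lines; this is the standard étendue bound, with the 30-degree acceptance angle chosen purely for illustration:

```python
import math

# Hedged sketch: the thermodynamic (etendue) limit that ideal nonimaging
# designs approach. For acceptance half-angle theta in a medium of
# refractive index n, the maximum concentration is (n / sin(theta))^2
# in 3D and n / sin(theta) for a 2D trough.
def c_max_3d(theta_deg, n=1.0):
    return (n / math.sin(math.radians(theta_deg))) ** 2

def c_max_2d(theta_deg, n=1.0):
    return n / math.sin(math.radians(theta_deg))

# A concentrator accepting +/-30 degrees in air can at best concentrate
# 4x in 3D (2x in 2D); no imaging system does better.
ratio_3d = c_max_3d(30.0)
ratio_2d = c_max_2d(30.0)
```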
Programmable multi-node quantum network design and simulation
NASA Astrophysics Data System (ADS)
Dasari, Venkat R.; Sadlier, Ronald J.; Prout, Ryan; Williams, Brian P.; Humble, Travis S.
2016-05-01
Software-defined networking offers a device-agnostic programmable framework to encode new network functions. Externally centralized control plane intelligence allows programmers to write network applications and to build functional network designs. OpenFlow is a key protocol widely adopted to build programmable networks because of its programmability, flexibility and ability to interconnect heterogeneous network devices. We simulate the functional topology of a multi-node quantum network that uses programmable network principles to manage quantum metadata for protocols such as teleportation, superdense coding, and quantum key distribution. We first show how the OpenFlow protocol can manage the quantum metadata needed to control the quantum channel. We then use numerical simulation to demonstrate robust programmability of a quantum switch via the OpenFlow network controller while executing an application of superdense coding. We describe the software framework implemented to carry out these simulations and we discuss near-term efforts to realize these applications.
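The superdense-coding application executed over the simulated quantum switch can itself be sketched as a small state-vector simulation, using plain linear algebra rather than the paper's OpenFlow framework:

```python
import numpy as np

# Hedged sketch of superdense coding: two classical bits ride on one
# qubit of a shared Bell pair. Alice applies a Pauli to her qubit; Bob's
# Bell measurement recovers both bits.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], complex)
Z = np.array([[1, 0], [0, -1]], complex)

bell = np.array([1, 0, 0, 1], complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

# Alice's encoding acts on her (first) qubit only.
encode = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): X @ Z}

# The four Bell states form Bob's decoding measurement basis.
bells = {
    (0, 0): np.array([1, 0, 0, 1], complex) / np.sqrt(2),
    (0, 1): np.array([0, 1, 1, 0], complex) / np.sqrt(2),
    (1, 0): np.array([1, 0, 0, -1], complex) / np.sqrt(2),
    (1, 1): np.array([0, 1, -1, 0], complex) / np.sqrt(2),
}

def send(bits):
    state = np.kron(encode[bits], I) @ bell
    # Ideal channel: the matching Bell state occurs with probability 1.
    probs = {b: abs(v.conj() @ state) ** 2 for b, v in bells.items()}
    return max(probs, key=probs.get)

# Every two-bit message is recovered exactly.
```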
Direct simulation Monte Carlo method for gas flows in micro-channels with bends with added curvature
NASA Astrophysics Data System (ADS)
Tisovský, Tomáš; Vít, Tomáš
Gas flows in micro-channels are simulated using an open source Direct Simulation Monte Carlo (DSMC) code, dsmcFOAM, for general application to rarefied gas flows, written within the framework of the open source C++ toolbox called OpenFOAM. The aim of this paper is to investigate the flow in a micro-channel with a bend with added curvature. Results are compared with flows in a channel without added curvature and an equivalent straight channel. Effects of a micro-channel bend were already thoroughly investigated by White et al.; the geometry proposed by White is also used here for reference.
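Whether DSMC is warranted is usually judged from the Knudsen number; the sketch below uses rough hard-sphere air properties and an assumed 1 μm channel height, not the paper's geometry:

```python
import math

# Hedged sketch: the Knudsen number Kn = lambda / L decides whether a
# continuum (Navier-Stokes) description still holds; for Kn above ~0.01
# the flow enters the slip/transition regime where DSMC is appropriate.
# Gas properties are roughly air at ambient conditions (assumed).
k_B = 1.380649e-23   # Boltzmann constant, J/K

def mean_free_path(T, p, d):
    """Hard-sphere mean free path; d is the molecular diameter (m)."""
    return k_B * T / (math.sqrt(2) * math.pi * d**2 * p)

lam = mean_free_path(T=300.0, p=101325.0, d=3.7e-10)   # ~67 nm
Kn = lam / 1e-6   # assumed channel height of 1 micrometre
# Kn is of order 0.07: slip/transition regime, so DSMC is justified.
```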
On the implicit density based OpenFOAM solver for turbulent compressible flows
NASA Astrophysics Data System (ADS)
Fürst, Jiří
The contribution deals with the development of a coupled implicit density based solver for compressible flows in the framework of the open source package OpenFOAM. Although the standard distribution of OpenFOAM contains several ready-made segregated solvers for compressible flows, the performance of those solvers is rather weak in the case of transonic flows. Therefore we extend the work of Shen [15] and develop an implicit semi-coupled solver. The main flow field variables are updated using the lower-upper symmetric Gauss-Seidel method (LU-SGS), whereas the turbulence model variables are updated using the implicit Euler method.
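The symmetric Gauss-Seidel sweeps at the heart of LU-SGS can be illustrated on a toy linear system; the actual solver applies them to the linearized flow equations with lower/upper flux-Jacobian splittings:

```python
import numpy as np

# Hedged sketch: one symmetric Gauss-Seidel (SGS) iteration is a forward
# sweep using the lower triangle followed by a backward sweep using the
# upper triangle. Toy system, not the solver's implicit operator.
def sgs_solve(A, b, iters=50):
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(n):             # forward (lower) sweep
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
        for i in reversed(range(n)):   # backward (upper) sweep
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

# Diagonally dominant test matrix: sweeps converge rapidly.
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = sgs_solve(A, b)
# After convergence, A @ x matches b to solver tolerance.
```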
NASA Astrophysics Data System (ADS)
Saye, Robert
2017-09-01
In this two-part paper, a high-order accurate implicit mesh discontinuous Galerkin (dG) framework is developed for fluid interface dynamics, facilitating precise computation of interfacial fluid flow in evolving geometries. The framework uses implicitly defined meshes-wherein a reference quadtree or octree grid is combined with an implicit representation of evolving interfaces and moving domain boundaries-and allows physically prescribed interfacial jump conditions to be imposed or captured with high-order accuracy. Part one discusses the design of the framework, including: (i) high-order quadrature for implicitly defined elements and faces; (ii) high-order accurate discretisation of scalar and vector-valued elliptic partial differential equations with interfacial jumps in ellipticity coefficient, leading to optimal-order accuracy in the maximum norm and discrete linear systems that are symmetric positive (semi)definite; (iii) the design of incompressible fluid flow projection operators, which except for the influence of small penalty parameters, are discretely idempotent; and (iv) the design of geometric multigrid methods for elliptic interface problems on implicitly defined meshes and their use as preconditioners for the conjugate gradient method. Also discussed is a variety of aspects relating to moving interfaces, including: (v) dG discretisations of the level set method on implicitly defined meshes; (vi) transferring state between evolving implicit meshes; (vii) preserving mesh topology to accurately compute temporal derivatives; (viii) high-order accurate reinitialisation of level set functions; and (ix) the integration of adaptive mesh refinement. 
In part two, several applications of the implicit mesh dG framework in two and three dimensions are presented, including examples of single phase flow in nontrivial geometry, surface tension-driven two phase flow with phase-dependent fluid density and viscosity, rigid body fluid-structure interaction, and free surface flow. A class of techniques known as interfacial gauge methods is adopted to solve the corresponding incompressible Navier-Stokes equations, which, compared to archetypical projection methods, have a weaker coupling between fluid velocity, pressure, and interface position, and allow high-order accurate numerical methods to be developed more easily. Convergence analyses conducted throughout the work demonstrate high-order accuracy in the maximum norm for all of the applications considered; for example, fourth-order spatial accuracy in fluid velocity, pressure, and interface location is demonstrated for surface tension-driven two phase flow in 2D and 3D. Specific application examples include: vortex shedding in nontrivial geometry, capillary wave dynamics revealing fine-scale flow features, falling rigid bodies tumbling in unsteady flow, and free surface flow over a submersed obstacle, as well as high Reynolds number soap bubble oscillation dynamics and vortex shedding induced by a type of Plateau-Rayleigh instability in water ripple free surface flow. These last two examples compare numerical results with experimental data and serve as an additional means of validation; they also reveal physical phenomena not visible in the experiments, highlight how small-scale interfacial features develop and affect macroscopic dynamics, and demonstrate the wide range of spatial scales often at play in interfacial fluid flow.
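The implicit interface representation underlying the framework can be illustrated with a minimal level set example: the interface is the zero contour of a grid function, here advected with first-order upwinding (far below the paper's high-order dG accuracy). The grid, velocity, and time step are assumptions:

```python
import numpy as np

# Hedged sketch of the implicit-interface idea: an evolving interface is
# the zero set of a level set function phi on a fixed grid. A circle is
# transported under a uniform velocity with first-order upwinding.
n, L = 128, 2.0
x = np.linspace(-L / 2, L / 2, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt(X**2 + Y**2) - 0.5          # signed distance to a circle

u, dt, h = 0.3, 0.005, x[1] - x[0]        # velocity in +x (assumed)
for _ in range(200):
    # upwind difference for u > 0: backward difference in x
    dphi_dx = (phi - np.roll(phi, 1, axis=0)) / h
    phi = phi - dt * u * dphi_dx

# After t = 1.0 the zero contour has translated by u*t = 0.3 in x:
# phi is negative at (0.3, 0) (now inside) and positive at (-0.5, 0).
```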
Effects of individualized assignments on biology achievement
NASA Astrophysics Data System (ADS)
Kremer, Philip L.
A pretest-posttest, randomized, two-group, experimental, factorial design compared the effects of detailed and nondetailed assignments on biology achievement over seven and a half months. Detailed assignments (favoring field independence and induction) employed block diagrams and stepwise directions; nondetailed assignments (favoring field dependence and deduction) virtually lacked these. The accessible population was 101 tenth-grade preparatory school male students; the 95 students enrolled in first-year biology constituted the sample. A two-by-three ANOVA was performed on the residualized posttest score means of the students. Overall, the detailed students achieved significantly higher than the nondetailed students. After the breakdown into upper, middle, and lower thirds of intellectual capability (ability and achievement), this significantly higher achievement held only for detailed students in the middle thirds of the deviation intelligence quotient (DIQ) range and of the grade point average (G.P.A.) range. The upper-third detailed DIQ grouping indirectly achieved higher than its peers, whereas the lower-third detailed DIQ grouping achieved lower than its peers. Thus, high-capability students apparently benefit from flow and block diagrams, induction, field independence, and high structure, whereas low-capability students may be hindered by these.
Kuhn, Stefan; Schlörer, Nils E
2015-08-01
With its laboratory information management system, nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. On the one hand, this freely available system allows lab users to submit orders for measurement, transfers recorded data automatically or manually, and enables the download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database. On the other hand, the staff and lab administration can supervise the flow of all orders; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as the front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.
Geologic and hydrogeologic frameworks of the Biscayne aquifer in central Miami-Dade County, Florida
Wacker, Michael A.; Cunningham, Kevin J.; Williams, John H.
2014-01-01
Evaluations of the lithostratigraphy, lithofacies, paleontology, ichnology, depositional environments, and cyclostratigraphy from 11 test coreholes were linked to geophysical interpretations, and to results of hydraulic slug tests of six test coreholes at the Snapper Creek Well Field (SCWF), to construct geologic and hydrogeologic frameworks for the study area in central Miami-Dade County, Florida. The resulting geologic and hydrogeologic frameworks are consistent with those recently described for the Biscayne aquifer in the nearby Lake Belt area in Miami-Dade County and link the Lake Belt area frameworks with those developed for the SCWF study area. The hydrogeologic framework is characterized by a triple-porosity pore system of (1) matrix porosity (mainly mesoporous interparticle porosity, moldic porosity, and mesoporous to megaporous separate vugs), which, under dynamic conditions, produces limited flow; (2) megaporous, touching-vug porosity that commonly forms stratiform groundwater passageways; and (3) conduit porosity, including bedding-plane vugs, decimeter-scale diameter vertical solution pipes, and meter-scale cavernous vugs. The various pore types and associated permeabilities generally have a predictable vertical spatial distribution related to the cyclostratigraphy. The Biscayne aquifer within the study area can be described as two major flow units separated by a single middle semiconfining unit. The upper Biscayne aquifer flow unit is present mainly within the Miami Limestone at the top of the aquifer and has the greatest hydraulic conductivity values, with a mean of 8,200 feet per day.
The middle semiconfining unit, mainly within the upper Fort Thompson Formation, comprises continuous to discontinuous zones with (1) matrix porosity; (2) leaky, low permeability layers that may have up to centimeter-scale vuggy porosity with higher vertical permeability than horizontal permeability; and (3) stratiform flow zones composed of fossil moldic porosity, burrow related vugs, or irregular vugs. Flow zones with a mean hydraulic conductivity of 2,600 feet per day are present within the middle semiconfining unit, but none of the flow zones are continuous across the study area. The lower Biscayne aquifer flow unit comprises a group of flow zones in the lower part of the aquifer. These flow zones are present in the lower part of the Fort Thompson Formation and in some cases within the limestone or sandstone or both in the uppermost part of the Pinecrest Sand Member of the Tamiami Formation. The mean hydraulic conductivity of major flow zones within the lower Biscayne aquifer flow unit is 5,900 feet per day, and the mean value for minor flow zones is 2,900 feet per day. A semiconfining unit is present beneath the Biscayne aquifer. The boundary between the two hydrologic units is at the top or near the top of the Pinecrest Sand Member of the Tamiami Formation. The lower semiconfining unit has a hydraulic conductivity of less than 350 feet per day. The most productive zones of groundwater flow within the two Biscayne aquifer flow units have a characteristic pore system dominated by stratiform megaporosity related to selective dissolution of an Ophiomorpha-dominated ichnofabric. In the upper flow unit, decimeter-scale vertical solution pipes that are common in some areas of the SCWF study area contribute to high vertical permeability compared to that in areas without the pipes. 
Cross-hole flowmeter data collected from the SCWF test coreholes show that the distribution of vuggy porosity, matrix porosity, and permeability within the Biscayne aquifer of the SCWF is highly heterogeneous and anisotropic. Groundwater withdrawals from production well fields in southeastern Florida may be inducing recharge of the Biscayne aquifer from canals near the well fields that are used for water-management functions, such as flood control and well-field pumping. The SCWF was chosen as a location within Miami-Dade County to study the potential for such recharge to the Biscayne aquifer from the C–2 (Snapper Creek) canal that roughly divides the well field in half. Geologic, hydrogeologic, and hydraulic information on the aquifer collected during construction of monitoring wells within the SCWF could be used to evaluate the groundwater flow budget at the well-field scale.
50 CFR 679.93 - Amendment 80 Program recordkeeping, permits, monitoring, and catch accounting.
Code of Federal Regulations, 2012 CFR
2012-10-01
... CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED... storage. There is sufficient space to accommodate a minimum of 10 observer sampling baskets. This space... manager, and any observers assigned to the vessel. (8) Belt and flow operations. The vessel operator stops...
Cells competition in tumor growth poroelasticity
NASA Astrophysics Data System (ADS)
Fraldi, Massimiliano; Carotenuto, Angelo R.
2018-03-01
Growth of biological tissues has recently been treated within the framework of Continuum Mechanics, by adopting heterogeneous poroelastic models in which the interaction between the soft matrix and interstitial fluid flow is coupled with inelastic effects introduced ad hoc to simulate the macroscopic volumetric growth produced by cell division, cell growth, and extracellular matrix changes occurring at the micro-scale. These continuum models seem to overcome some limitations intrinsic to alternative approaches based on mass balances in multiphase systems, because the crucial role played by residual stresses accompanying growth and nutrient pathways is preserved. Nevertheless, when these strategies are applied to analyze solid tumors, mass growth is usually assigned in a prescribed form that essentially copies the intrinsic growth rates of the cell species measured in vitro. As a consequence, some important cell-cell dynamics governing mass evolution and the invasion rates of cancer cells, as well as their coupling with feedback mechanisms associated with in situ stresses, are inevitably lost, and the spatial distribution and time evolution of growth inside the tumor (which should be results rather than inputs) are forced to enter the model simply as data. To resolve this paradox, we propose an enhanced multi-scale poroelastic model undergoing large deformations and embodying inelastic growth, in which the net growth terms result directly from the "interspecific" predator-prey (Volterra/Lotka-like) competition occurring at the micro-scale between healthy and abnormal cell species. In this way, a system of fully coupled non-linear PDEs is derived to describe how the fight among cell species to grab the available common resources, the stress field, pressure gradients, and the interstitial fluid flows driving nutrients, together with inhomogeneous growth, all simultaneously interact to determine the fate of the tumor.
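The predator-prey growth terms are not written out in the abstract, but Volterra/Lotka-like interspecific competition between a healthy and an abnormal (tumor) cell fraction can be illustrated with a minimal sketch. All rates, the carrying capacity, and the competition coefficients below are hypothetical placeholders, not the paper's calibrated values.

```python
def competition_step(h, t, dt=0.01,
                     r_h=1.0, r_t=1.2,     # intrinsic growth rates (hypothetical)
                     K=1.0,                # shared carrying capacity (common resources)
                     a_ht=1.5, a_th=0.6):  # interspecific competition coefficients
    """One explicit Euler step of Lotka-Volterra competition:
       dh/dt = r_h h (1 - (h + a_ht t) / K)
       dt/dt = r_t t (1 - (t + a_th h) / K)"""
    dh = r_h * h * (1.0 - (h + a_ht * t) / K)
    dt_abn = r_t * t * (1.0 - (t + a_th * h) / K)
    return h + dt * dh, t + dt * dt_abn

# With a_ht > 1 > a_th the abnormal species competitively excludes
# the healthy one, mimicking tumor invasion of the host tissue.
h, t = 0.9, 0.01
for _ in range(20000):
    h, t = competition_step(h, t)
```

In the paper these net growth terms feed the inelastic growth of the poroelastic continuum; here they are integrated in isolation simply to show the competitive-exclusion behavior.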
Parkinson's Law quantified: three investigations on bureaucratic inefficiency
NASA Astrophysics Data System (ADS)
Klimek, Peter; Hanel, Rudolf; Thurner, Stefan
2009-03-01
We recast three of Parkinson's famous descriptive essays on bureaucratic inefficiency in a quantifiable and dynamical socio-physical framework. In the first model we show how recent opinion formation models for small groups can be used to understand Parkinson's observation that decision-making bodies such as cabinets or boards become highly inefficient once their size exceeds a critical 'Coefficient of Inefficiency', typically around 20. A second observation of Parkinson, sometimes referred to as Parkinson's Law, is that the growth of bureaucratic or administrative bodies usually goes hand in hand with a drastic decrease in their overall efficiency. In our second model we view a bureaucratic body as a flow system of workers, who enter, are promoted over time to various internal levels within the system, and leave after having served for a certain time. Promotion is usually associated with an increase in subordinates. Within the proposed model it becomes possible to work out the phase diagram of the conditions under which bureaucratic growth can be confined. In our last model we assign individual efficiency curves to workers throughout their life in administration and compute the optimal time to grant them the old-age pension, in order to ensure maximum efficiency within the body; in Parkinson's words, we compute the 'Pension Point'.
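The second model's worker-flow picture (entry, promotion that brings new subordinates, attrition) suggests a one-line toy recursion. This sketch is not the authors' model; the promotion fraction, subordinates per promotion, and attrition rate below are invented for illustration.

```python
def staff_next_year(N, p=0.10, s=2, q=0.15):
    """Toy promotion-flow recursion for total staff N:
    a fraction p is promoted each year, each promotion triggers the hiring
    of s subordinates, and a fraction q leaves the body.
    Growth is confined exactly when p * s < q (the 'phase boundary')."""
    return N * (1.0 + p * s - q)

# p*s = 0.20 > q = 0.15: the body grows without bound
N = 100.0
for _ in range(20):
    N = staff_next_year(N)
```

Setting s=1 puts the same body on the other side of the phase boundary (p*s = 0.10 < q), and the staff count shrinks instead of growing.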
Chopik, A; Pasechnik, S; Semerenko, D; Shmeliova, D; Dubtsov, A; Srivastava, A K; Chigrinov, V
2014-03-15
The results of investigation of electro-optical properties of porous polyethylene terephthalate films filled with a nematic liquid crystal (5 CB) are presented. It is established that the optical response of the samples on the applied voltage drastically depends on the frequency range. At low frequencies of applied electrical field (f
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Bevelhimer, Mark S; Frimpong, Dr. Emmanuel A,
2014-01-01
Classification systems are valuable to ecological management in that they organize information into consolidated units, thereby providing efficient means to achieve conservation objectives. Of the many ways classifications benefit management, hypothesis generation has been discussed as the most important. However, in order to provide templates for developing and testing ecologically relevant hypotheses, classifications created using environmental variables must be linked to ecological patterns. Herein, we develop associations between a recent US hydrologic classification and fish traits in order to form a template for generating flow ecology hypotheses and supporting environmental flow standard development. Tradeoffs in adaptive strategies for fish were observed across a spectrum of stable, perennial flow to unstable, intermittent flow. In accordance with theory, periodic strategists were associated with stable, predictable flow, whereas opportunistic strategists were more affiliated with intermittent, variable flows. We developed linkages between the uniqueness of hydrologic character and ecological distinction among classes, which may translate into predictions between losses in hydrologic uniqueness and ecological community response. Comparisons of classification strength between hydrologic classifications and other frameworks suggested that spatially contiguous classifications with higher regionalization will tend to explain more variation in ecological patterns. Despite explaining less ecological variation than other frameworks, we contend that hydrologic classifications are still useful because they provide a conceptual linkage between hydrologic variation and ecological communities to support flow ecology relationships. Mechanistic associations among fish traits and hydrologic classes support the presumption that environmental flow standards should be developed uniquely for stream classes and the ecological communities therein.
Flow-permeability feedbacks and the development of segregation pipes in volcanic materials
NASA Astrophysics Data System (ADS)
Rust, Alison
2014-05-01
Flow and transformation in volcanic porous media is important for the segregation of melts and aqueous fluids from magmas as well as elutriation of fine ash from pyroclastic flows and vents. The general topic will be discussed in the framework of understanding sets of vertical pipes found in two very different types of volcanic deposits: 1) vesicular (bubbly) cylinders in basalt lava flows and 2) gas escape pipes in pyroclastic flow deposits. In both cases the cylinders can be explained by a flow-permeability feedback where perturbations in porosity and thus permeability cause locally higher flow speeds that in turn locally increase the permeability. For vesicular cylinders in lava flows, the porous medium is a framework of crystals within the magma. Above a critical crystallinity, which depends on the shape and size distribution of the crystals, the crystals form a touching framework. As the water-saturated magma continues to cool, it crystallizes anhydrous minerals, resulting in the exsolution of water vapour bubbles that can drive flow of bubbly melt through the crystal network. It is common to find sets of vertical cylinders of bubbly melt in solidified lava flows, with compositions that match the residual melt from 35-50% crystallization of the host basalt. These cylinders resemble chimneys in experiments of crystallizing ammonium chloride solution that are explained by reactive flow with porous medium convection. The Rayleigh number for the magmatic case is too low for convection, but the growth of steam bubbles as the magma crystallizes induces pore fluid flow up through the permeable crystal pile even if there is no convective instability. This bubble-growth-driven upward flow is reactive and can lead to channelization because of a feedback between velocity and permeability.
For the gas escape pipes in pyroclastic flows, the porous medium is a very poorly sorted granular material composed of fragments of solid magma with a huge range of grain sizes, from ash (microns to 2 mm) to clasts of decimeters or greater. The vertical gas escape pipes are distinguished from the surrounding pyroclastic flow deposit by the lack of fine ash in the pipes; this missing ash was transported up out of the pyroclastic flow by gas flow, a process called elutriation. Laboratory experiments with beds of binary mixtures of spheres aerated through a porous plate at the base demonstrate that the size ratio, density ratio, and proportions of the two populations of spheres all affect the pattern and efficiency of segregation. Decompaction of the upper portion of the bed separates the grains and thus facilitates the elutriation of the finer particles, which must be transported up through the spaces between the larger particles. A variety of segregation features are found, including pipes lacking fines that grow down from the top of the bed. These could be explained by channelization of gas flow due to a feedback in which a local reduction in fines increases the local permeability and gas velocity.
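The flow-permeability feedback invoked for both kinds of pipes can be caricatured with a handful of parallel flow pathways: permeability rises steeply with porosity (a Kozeny-Carman-like k ∝ φ³ is assumed here), flux follows permeability, and flux in turn enlarges porosity by removing fines or dissolving matrix. The pathway count, erosion rate, and size of the initial perturbation are arbitrary illustrative choices.

```python
import random

def channelize(phi, steps=200, erode=0.01):
    """Positive feedback between porosity, permeability, and flux.
    Each step: k_i ~ phi_i**3 (Kozeny-Carman-like), the fixed total flux
    partitions in proportion to k_i, and each pathway's porosity grows in
    proportion to the flux it carries (fines removal / dissolution)."""
    for _ in range(steps):
        k = [p ** 3 for p in phi]                       # permeabilities
        total = sum(k)
        q = [ki / total for ki in k]                    # flux fractions
        phi = [min(p + erode * qi, 1.0) for p, qi in zip(phi, q)]
    return phi

random.seed(0)
phi0 = [0.30 + 0.01 * random.random() for _ in range(10)]  # ~1% perturbations
phi_final = channelize(phi0)
# the perturbations are amplified: the initially most porous pathway
# captures ever more of the flow and becomes the 'pipe'
```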
Planning in context: A situated view of children's management of science projects
NASA Astrophysics Data System (ADS)
Marshall, Susan Katharine
This study investigated children's collaborative planning of a complex, long-term software design project. Using sociocultural methods, it examined over time the development of design teams' planning negotiations and tools to document the co-construction of cultural frameworks that organize teams' shared understanding of what and how to plan. Results indicated that student teams developed frameworks to address a set of common planning functions that included design planning, project metaplanning (matters such as division of labor or sharing of computer resources), and team collaboration management planning. There were also some between-team variations in planning frameworks, within a bandwidth of options. Teams engaged in opportunistic planning, which reflected shifts in strategies in response to new circumstances over time. Team members with past design project experience ("old-timers") demonstrated the transfer of their planning framework to the current design task, and they supported the developing participation of "newcomers." Teams constructed physical tools (e.g., planning boards) that acted as visual representations of teams' planning frameworks and inscriptions of team thinking. The assigned functions of the tools also shifted over time with changing project circumstances. The discussion reexamines current approaches to the study of planning and discusses their educational implications.
NASA Astrophysics Data System (ADS)
Machetel, P.; Yuen, D. A.
2012-12-01
In this work, we propose to use Open Thermodynamic System (OTS) frameworks to assess the temperatures and discharges of underground flows in fluviokarstic systems. The theoretical formulation is built on the first and second laws of thermodynamics. However, such assumptions would require steady states in the Control Volume (CV) to cancel the heat exchanges between underground water and embedding rocks. This situation is obviously never perfectly reached in nature, where flow discharges and temperatures vary with rainfall, recession, and seasonal or diurnal fluctuations. First, we briefly show that the results of a pumping test campaign on the Cent-Fonts (Hérault, France) fluviokarst during summer 2005 are consistent with this theoretical approach. Second, we present the theoretical formalism of the OTS framework, which leads to equation systems involving the temperatures and/or the discharges of the underground and surface flows. Third, this approach is applied to the White (2003) conceptual model of fluviokarst, and we present the numerical model built to assess the applicability of these assumptions. The first-order field hydrologic properties observed at the Cent-Fonts resurgence are well described by the calculations based on this OTS framework. Although this agreement is necessary, it is not sufficient to validate the method. In order to test its applicability, the mixing process has been modeled as a cooling reaction in a Continuous Stirred Tank Reactor (CSTR), in which matrix and intrusive flows are introduced continuously while effluent water is recovered at the output. The enthalpy of the various flows is conserved except for the part that exchanges heat with the embedding rocks. However, the numerical model shows that in the water-saturated part of the conduit system (CS), the matrix flow sweeps heat by convective-advective processes while temporal heat fluctuations from intrusive flows cross the CV walls.
The numerical model shows that the convective flow from the matrix damps the diurnal fluctuations on very short space and time scales. The case of seasonal temperature fluctuations depends on the relative global space and time scales between the global transport properties of the fluviokarst and the fluctuations. This work shows that, within this framework, temperature can be considered a conservative tracer because most of the heat exchanged with the embedding rocks during non-steady periods is brought back by the convergence of matrix flows toward the CV. This mechanism cancels the effects of the heat exchanges for the diurnal fluctuations and also reduces those due to seasonal variations of temperature. The OTS approach may therefore provide new tools for assessing underground fluid temperatures and discharges, and may also offer potential applications for geothermal studies. The mixing process in the fluviokarst Conduit System is analogous to a chemical reaction in a Continuous Stirred Tank Reactor (CSTR).
Flow Charts: Visualization of Vector Fields on Arbitrary Surfaces
Li, Guo-Shi; Tricoche, Xavier; Weiskopf, Daniel; Hansen, Charles
2009-01-01
We introduce a novel flow visualization method called Flow Charts, which uses a texture atlas approach for the visualization of flows defined over curved surfaces. In this scheme, the surface and its associated flow are segmented into overlapping patches, which are then parameterized and packed in the texture domain. This scheme allows accurate particle advection across multiple charts in the texture domain, providing a flexible framework that supports various flow visualization techniques. The use of surface parameterization enables flow visualization techniques requiring the global view of the surface over long time spans, such as Unsteady Flow LIC (UFLIC), particle-based Unsteady Flow Advection Convolution (UFAC), or dye advection. It also prevents visual artifacts normally associated with view-dependent methods. Represented as textures, Flow Charts can be naturally integrated into hardware accelerated flow visualization techniques for interactive performance. PMID:18599918
Reverse logistics in the Brazilian construction industry.
Nunes, K R A; Mahler, C F; Valle, R A
2009-09-01
In Brazil, most Construction and Demolition Waste (C&D waste) is not recycled. This situation is expected to change significantly, since new federal regulations oblige municipalities to create and implement sustainable C&D waste management plans that assign an important role to recycling activities. The organizational network for recycling, together with its flows and components, is fundamental to the feasibility of C&D waste recycling, and such networks, flows, and components involve reverse logistics. The aim of this work is to introduce the concepts of reverse logistics and reverse distribution channel networks and to study the Brazilian C&D waste case.
Oosterhuis, W P; van der Horst, M; van Dongen, K; Ulenkate, H J L M; Volmer, M; Wulkan, R W
2007-10-20
To compare the flow diagram for the diagnosis of anaemia from the guideline 'Anaemia' of the Dutch College of General Practitioners (NHG) with a substantive and logistical alternative protocol. Prospective. For evaluation of anaemia, 124 patients from primary care reported to the laboratories of the St. Elisabeth Hospital in Tilburg (n = 94) and the Scheper Hospital in Emmen (n = 30), the Netherlands. Two flow charts were used: the NHG's flow chart and a self-developed chart in which ferritin concentration, rather than mean corpuscular volume, occupies the central position. All the laboratory tests mentioned in both flow charts were carried out in every patient with, for practical reasons, the exception of Hgb electrophoresis and bone marrow investigations. General practitioners were approached and patient dossiers were consulted to obtain further clinical data. According to the NHG protocol, on the grounds of the laboratory investigations, 64 (52%) of patients could not be put in a specific category. The majority were patients with normocytary anaemia who did not fulfil the criteria for iron deficiency anaemia or the anaemia of chronic disease. According to the alternative chart, no diagnosis was made in 36 (29%) patients. These were patients in whom no abnormal laboratory findings were observed other than low haemoglobin values. The majority of the patients had normocytary anaemia; in some cases this was interpreted as the anaemia of chronic disease, but more often the anaemia could not be assigned to a particular category. A large number of patients had a raised creatinine value, which did not appear in the NHG protocol. In 15% of patients, more than one cause of anaemia was found. The NHG protocol did not enable these multiple diagnoses to be made. Accordingly, the NHG protocol was difficult to implement in the laboratory. Using the NHG flow diagram, a large percentage of patients could not be assigned to a particular category.
Using the alternative flow diagram, which is easier to carry out in the laboratory, it was possible to make multiple diagnoses.
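Neither flow chart is reproduced in the abstract, but the contrast between them can be sketched as two branching functions. The thresholds below are generic textbook-style values chosen for illustration, not the NHG's or the alternative protocol's actual cut-offs.

```python
def mcv_first(hgb, mcv):
    """MCV-centred branching (illustrative thresholds only)."""
    if hgb >= 12.0:
        return "no anaemia"
    if mcv < 80:
        return "microcytic: evaluate for iron deficiency"
    if mcv > 100:
        return "macrocytic: evaluate B12/folate"
    return "normocytic: frequently unclassifiable"  # the gap the study reports

def ferritin_first(hgb, ferritin):
    """Ferritin-centred branching, as in the alternative chart (illustrative)."""
    if hgb >= 12.0:
        return "no anaemia"
    if ferritin < 30:
        return "iron deficiency anaemia"
    return "consider anaemia of chronic disease or other causes"
```

The study's finding that the MCV-first chart left 52% of patients uncategorized corresponds to the normocytic branch above, which a ferritin-first chart resolves more often.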
A Framework for Simplifying Educator Tasks Related to the Integration of Games in the Learning Flow
ERIC Educational Resources Information Center
del Blanco, Angel; Torrente, Javier; Marchiori, Eugenio J.; Martinez-Ortiz, Ivan; Moreno-Ger, Pablo; Fernandez-Manjon, Baltasar
2012-01-01
The integration of educational video games in educational settings in general, and e-learning systems in particular, can be challenging for educators. We propose a framework that aims to facilitate educators' participation in the creation and modification of courses that use educational games. Our approach addresses problems identified by previous…
An information theory framework for dynamic functional domain connectivity.
Vergara, Victor M; Miller, Robyn; Calhoun, Vince
2017-06-01
Dynamic functional network connectivity (dFNC) analyzes the time evolution of coherent activity in the brain. In this technique, dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating the bits of information contained in and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. A mutual information measurement is then obtained from probabilities across domains; we named this value the cross-domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensory input, motor control, and the cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole-brain dFNC matrices; in the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. Identified CDMI clusters point to known information pathways in the basal ganglia and also among areas of sensory input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher-level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.
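The abstract defines CDMI as mutual information computed from the probabilities of observed dynamic states in a pair of domains. A plug-in estimator over two aligned discrete state sequences might look like the following; the estimator choice and the toy sequence are assumptions for illustration, not taken from the paper.

```python
from collections import Counter
from math import log2

def cdmi(states_a, states_b):
    """Plug-in mutual information (bits) between two aligned sequences of
    dynamic-state labels, one per functional domain:
    I(A;B) = sum over (a,b) of p(a,b) * log2(p(a,b) / (p(a) p(b)))."""
    n = len(states_a)
    pa, pb = Counter(states_a), Counter(states_b)
    pab = Counter(zip(states_a, states_b))
    # p(a,b) = c/n; p(a,b)/(p(a)p(b)) simplifies to c*n/(count_a*count_b)
    return sum((c / n) * log2(c * n / (pa[a] * pb[b]))
               for (a, b), c in pab.items())

# two domains visiting three dynamic states over eight time windows
seq = [0, 1, 1, 0, 2, 2, 0, 1]
```

Identical state sequences share all of their information (CDMI equals the state entropy), while a domain stuck in a single state shares none.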
Shakhawath Hossain, Md; Bergstrom, D J; Chen, X B
2015-12-01
The in vitro chondrocyte cell culture for cartilage tissue regeneration in a perfusion bioreactor is a complex process. Mathematical modeling and computational simulation can provide important insights into the culture process, which are helpful for selecting culture conditions that improve the quality of the developed tissue constructs. However, simulation of the cell culture process is a challenging task due to the complicated interaction between the cells and the local fluid flow and nutrient transport inside the complex porous scaffolds. In this study, a mathematical model and computational framework have been developed to simulate three-dimensional (3D) cell growth in a porous scaffold placed inside a bi-directional flow perfusion bioreactor. The model was developed by taking into account the two-way coupling between the cell growth and the local flow field and associated glucose concentration, and was then used to perform a resolved-scale simulation based on the lattice Boltzmann method (LBM). The simulation predicts the local shear stress, glucose concentration, and 3D cell growth inside the porous scaffold over a period of 30 days of cell culture. The predicted cell growth rate was in good overall agreement with experimental results available in the literature. This study demonstrates that the bi-directional flow perfusion culture system can enhance the homogeneity of cell growth inside the scaffold. The model and computational framework developed are capable of providing significant insight into the culture process, thus providing a powerful tool for the design and optimization of the cell culture process. © 2015 Wiley Periodicals, Inc.
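A resolved-scale LBM simulation is well beyond a sketch, but the two-way coupling the model is built on (glucose limits cell growth while growing cells deplete glucose) can be illustrated with a lumped ODE. Monod kinetics and every parameter value below are assumptions for illustration, not the paper's model.

```python
def culture_step(cells, glucose, dt=0.1,
                 mu_max=0.05,  # max specific growth rate (hypothetical)
                 K_g=0.5,      # Monod half-saturation constant (hypothetical)
                 Y=0.2):       # yield: cell mass produced per glucose consumed
    """One explicit Euler step of the two-way coupling: cell mass grows at a
    glucose-limited (Monod) rate, and that growth consumes glucose."""
    mu = mu_max * glucose / (K_g + glucose)
    growth = mu * cells * dt
    return cells + growth, max(glucose - growth / Y, 0.0)

# seed the scaffold sparsely and perfuse with glucose-rich medium
cells, glucose = 0.01, 5.0
for _ in range(1000):
    cells, glucose = culture_step(cells, glucose)
```

In the paper this balance is resolved locally in every scaffold pore and coupled to the LBM flow field; the lumped version only shows the saturating growth curve that the coupling produces.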
Shallow groundwater in the Matanuska-Susitna Valley, Alaska—Conceptualization and simulation of flow
Kikuchi, Colin P.
2013-01-01
The Matanuska-Susitna Valley is in the Upper Cook Inlet Basin and is currently undergoing rapid population growth outside of municipal water and sewer service areas. In response to concerns about the effects of increasing water use on future groundwater availability, a study was initiated between the Alaska Department of Natural Resources and the U.S. Geological Survey. The goals of the study were (1) to compile existing data and collect new data to support hydrogeologic conceptualization of the study area, and (2) to develop a groundwater flow model to simulate flow dynamics important at the regional scale. The purpose of the groundwater flow model is to provide a scientific framework for analysis of regional-scale groundwater availability. To address the first study goal, subsurface lithologic data were compiled into a database and were used to construct a regional hydrogeologic framework model describing the extent and thickness of hydrogeologic units in the Matanuska-Susitna Valley. The hydrogeologic framework model synthesizes existing maps of surficial geology and conceptual geochronologies developed in the study area with the distribution of lithologies encountered in hundreds of boreholes. The geologic modeling package Geological Surveying and Investigation in Three Dimensions (GSI3D) was used to construct the hydrogeologic framework model. In addition to characterizing the hydrogeologic framework, major groundwater-budget components were quantified using several different techniques. A land-surface model known as the Deep Percolation Model was used to estimate in-place groundwater recharge across the study area. This model incorporates data on topography, soils, vegetation, and climate. Model-simulated surface runoff was consistent with observed streamflow at U.S. Geological Survey streamgages. Groundwater withdrawals were estimated on the basis of records from major water suppliers during 2004-2010. 
Fluxes between groundwater and surface water were estimated during field investigations on several small streams. Regional groundwater flow patterns were characterized by synthesizing previous water-table maps with a synoptic water-level measurement conducted during 2009. Time-series water-level data were collected at groundwater and lake monitoring stations over the study period (2009–present). Comparison of historical groundwater-level records with time-series groundwater-level data collected during this study showed similar patterns in groundwater-level fluctuation in response to precipitation. Groundwater-age data collected during previous studies show that water moves quickly through the groundwater system, suggesting that the system responds quickly to changes in climate forcing. Similarly, the groundwater system quickly returns to long-term average conditions following variability due to seasonal or interannual changes in precipitation. These analyses indicate that the groundwater system is in a state of dynamic equilibrium, characterized by water-level fluctuation about a constant average state, with no long-term trends in aquifer-system storage. To address the second study goal, a steady-state groundwater flow model was developed to simulate regional groundwater flow patterns. The groundwater flow model was bounded by physically meaningful hydrologic features, and appropriate internal model boundaries were specified on the basis of conceptualization of the groundwater system resulting in a three-layer model. Calibration data included 173 water‑level measurements and 18 measurements of streamflow gains and losses along small streams. Comparison of simulated and observed heads and flows showed that the model accurately simulates important regional characteristics of the groundwater flow system. This model is therefore appropriate for studying regional-scale groundwater availability. 
Mismatch between model-simulated and observed hydrologic quantities is likely because of the coarse grid size of the model and seasonal transient effects. Next steps towards model refinement include the development of a transient groundwater flow model that is suitable for analysis of seasonal variability in hydraulic heads and flows. In addition, several important groundwater budget components remain poorly quantified—including groundwater outflow to the Matanuska River, Little Susitna River, and Knik Arm.
NASA Technical Reports Server (NTRS)
Axdahl, Erik L.
2015-01-01
Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high-fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.
Theadore, Fred; Jellison, James B.
2012-01-01
Currently, public health emergency preparedness (PHEP) is not well defined. Discussions about public health preparedness often make little progress, for lack of a shared understanding of the topic. We present a concise yet comprehensive framework describing PHEP activities. The framework, which was refined for 3 years by state and local health departments, uses terms easily recognized by the public health workforce within an information flow consistent with the National Incident Management System. To assess the framework's completeness, strengths, and weaknesses, we compare it to 4 other frameworks: the RAND Corporation's PREPARE Pandemic Influenza Quality Improvement Toolkit, the National Response Framework's Public Health and Medical Services Functional Areas, the National Health Security Strategy Capabilities List, and the Centers for Disease Control and Prevention's PHEP Capabilities. PMID:22397343
Thermodynamics of urban population flows.
Hernando, A; Plastino, A
2012-12-01
Orderliness, reflected via mathematical laws, is encountered in different frameworks involving social groups. Here we show that a thermodynamics can be constructed that macroscopically describes urban population flows. Microscopic dynamic equations and simulations with random walkers underlie the macroscopic approach. Our results might be regarded, via suitable analogies, as a step towards building an explicit social thermodynamics.
Inferring landscape effects on gene flow: A new model selection framework
A. J. Shirk; D. O. Wallin; S. A. Cushman; C. G. Rice; K. I. Warheit
2010-01-01
Populations in fragmented landscapes experience reduced gene flow, lose genetic diversity over time and ultimately face greater extinction risk. Improving connectivity in fragmented landscapes is now a major focus of conservation biology. Designing effective wildlife corridors for this purpose, however, requires an accurate understanding of how landscapes shape gene...
The Determinants of Interdistrict Open Enrollment Flows: Evidence from Two States
ERIC Educational Resources Information Center
Carlson, Deven; Lavery, Lesley; Witte, John F.
2011-01-01
Interdistrict open enrollment is the most widely used form of school choice in the United States. Through the theoretical lens of a utility maximization framework, this article analyzes the determinants of interdistrict open enrollment flows in Minnesota and Colorado. The authors' empirical analysis employs an original data set that details open…
Preliminary report on geophysical data in Yavapai County, Arizona
Langenheim, V.E.; Hoffmann, J.P.; Blasch, K.W.; DeWitt, Ed; Wirt, Laurie
2002-01-01
Recently acquired geophysical data provide information on the geologic framework and its effect on groundwater flow and on stream/aquifer interaction in Yavapai County, Arizona. High-resolution aeromagnetic data reflect diverse rock types at and below the topographic surface and have permitted a preliminary interpretation of faults and underlying rock types (in particular, volcanic) that will provide new insights on the geologic framework, critical input to future hydrologic investigations. Aeromagnetic data map the western end of the Bear Wallow Canyon fault into the sedimentary fill of Verde Valley. Regional gravity data indicate potentially significant accumulations of low-density basin fill in Big Chino, Verde, and Williamson Valleys. Electrical and seismic data were also collected and help evaluate the approximate depth and extent of recent alluvium overlying Tertiary and Paleozoic sediments. These data will be used to ascertain the potential contribution of shallow ground-water subflow that cannot be measured by gages or flow meters and whether stream flow in losing reaches is moving as subflow or is being lost to the subsurface. The geophysical data will help produce a more robust groundwater flow model of the region.
Martin, Angel; Whiteman, C.D.
1999-01-01
Existing data on water levels, water use, water quality, and aquifer properties were used to construct a multilayer digital model to simulate flow in the aquifer system. The report describes the geohydrologic framework of the aquifer system, and the development, calibration, and sensitivity analysis of the ground-water-flow model, but it is primarily focused on the results of the simulations that show the natural flow of ground water throughout the regional aquifer system and the changes from the natural flow caused by development of ground-water supplies.
The AGCE related studies of baroclinic flows in spherical geometry
NASA Technical Reports Server (NTRS)
Hyun, J. M.
1983-01-01
Steady state, axisymmetric motions of a Boussinesq fluid contained in a rotating spherical annulus are considered. The motions are driven by a latitudinally varying temperature gradient at the shells. Linearized formulations for a narrow gap are derived, and the flow field is divided into the Ekman layers and the geostrophic interior. The Ekman layer flows are consistent with the known results for cylindrical geometries. Within the framework of rather restrictive assumptions, the interior flows are solved by a series of associated Legendre polynomials. The solutions show qualitative features valid at midlatitudes.
NASA Astrophysics Data System (ADS)
Kibler, K. M.; Alipour, M.
2016-12-01
Achieving the universal energy access Sustainable Development Goal will require great investment in renewable energy infrastructure in the developing world. Much growth in the renewable sector will come from new hydropower projects, including small and diversion hydropower in remote and mountainous regions. Yet, human impacts to hydrological systems from diversion hydropower are poorly described. Diversion hydropower is often implemented in ungauged rivers, thus detection of impact requires flow analysis tools suited to prediction in poorly-gauged and human-altered catchments. We conduct a comprehensive analysis of hydrologic alteration in 32 rivers developed with diversion hydropower in southwestern China. As flow data are sparse, we devise an approach for estimating streamflow during pre- and post-development periods, drawing upon a decade of research into prediction in ungauged basins. We apply a rainfall-runoff model, parameterized and forced exclusively with global-scale data, in hydrologically-similar gauged and ungauged catchments. Uncertain "soft" data are incorporated through fuzzy numbers and confidence-based weighting, and a multi-criteria objective function is applied to evaluate model performance. Testing indicates that the proposed framework returns superior performance (NSE = 0.77) as compared to models parameterized by rote calibration (NSE = 0.62). Confident that the models are providing 'the right answer for the right reasons', our analysis of hydrologic alteration based on simulated flows indicates statistically significant hydrologic effects of diversion hydropower across many rivers. Mean annual flows, 7-day minimum and 7-day maximum flows decreased. Frequency and duration of flow exceeding Q25 decreased while duration of flows sustained below the Q75 increased substantially. Hydrograph rise and fall rates and flow constancy increased. The proposed methodology may be applied to improve diversion hydropower design in data-limited regions.
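The Nash-Sutcliffe efficiency (NSE) scores quoted above (0.77 vs. 0.62) compare model error against the variance of the observations. A minimal sketch of the metric, using hypothetical observed and simulated flow series rather than data from the study:

```python
# Nash-Sutcliffe efficiency: 1 - SS_res / SS_tot, where SS_tot is taken
# about the mean of the observations. NSE = 1 indicates a perfect fit;
# NSE <= 0 means the model is no better than the observed mean.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Hypothetical daily flows (m^3/s); purely illustrative.
obs = [3.2, 4.1, 5.0, 2.8, 3.9]
sim = [3.0, 4.3, 4.8, 3.1, 3.7]
score = nse(obs, sim)
```

The same function can score any pair of observed/simulated series of equal length, which is how calibration variants (e.g. rote calibration vs. the proposed framework) are compared.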
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voisin, Nathalie; Leung, Lai-Yung R.; Hejazi, Mohamad I.
A global integrated assessment model, including a water-demand model driven by socio-economics, is coupled in a one-way fashion with a land surface hydrology-routing-water resources management model. The integrated modeling framework is applied to the U.S. Upper Midwest (Missouri, Upper Mississippi, and Ohio) to advance understanding of the regional impacts of climate and socio-economic changes on integrated water resources. Implications for future flow regulation, water supply, and supply deficit are investigated using climate change projections with the B1 and A2 emission scenarios, which affect both natural flow and water demand. Changes in water demand are driven by socio-economic factors, energy and food demands, global markets and prices. The framework identifies the multiple spatial scales of interactions between the drivers of changes (natural flow and water demand) and the managed water resources (regulated flow, supply and supply deficit). The contributions of the different drivers of change are quantified regionally, and also evaluated locally, using covariances. The integrated framework shows that water supply deficit is more predictable over the Missouri than the other regions in the Midwest. The predictability of the supply deficit mostly comes from long-term changes in water demand, although changes in runoff have a greater contribution, comparable to the contribution of changes in demand, over shorter time periods. The integrated framework also shows that spatially, water demand drives local supply deficit. Using elasticity, the sensitivity of supply deficit to drivers of change is established. The supply deficit is found to be more sensitive to changes in runoff than to changes in demand regionally. This contrasts with the covariance analysis, which shows that water demand is the dominant driver of supply deficit over the analysed periods.
The elasticity indicates the level of mitigation needed to control the demand in order to reduce the vulnerability of the integrated system in future periods. The elasticity analyses also emphasize the need to address uncertainty with respect to changes in natural flow in integrated assessment.
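As a rough illustration of the elasticity concept used above (not the study's actual computation), a point elasticity can be approximated from two scenario values of supply deficit and of a driver such as runoff or demand; all numbers below are hypothetical:

```python
# Point elasticity: percent change in supply deficit per percent change
# in a driver. |elasticity| > 1 means the deficit responds more than
# proportionally to the driver. Values are illustrative only.
def elasticity(deficit_base, deficit_new, driver_base, driver_new):
    pct_deficit = (deficit_new - deficit_base) / deficit_base
    pct_driver = (driver_new - driver_base) / driver_base
    return pct_deficit / pct_driver

# Hypothetical scenario: deficit rises 15% while runoff falls 10%.
e = elasticity(100.0, 115.0, 50.0, 45.0)
```

The sign indicates direction (deficit moves opposite to runoff here), and the magnitude hints at the level of demand mitigation needed to offset a given change in natural flow.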
Kotb, Magd A.; Elmahdy, Hesham Nabeh; Khalifa, Nour El Deen Mahmoud; El-Deen, Mohamed Hamed Nasr; Lotfi, Mohamed Amr N.
2015-01-01
Evidence-based medicine (EBM) is delivered through didactic, blended learning, and mixed models. Students are supposed to construct an answerable question in the PICO (patient, intervention, comparison, and outcome) framework, acquire evidence through search of literature, appraise evidence, apply it to the clinical case scenario, and assess the evidence in relation to clinical context. Yet these teaching models have limitations, especially those related to group work, for example, handling uncooperative students, students who fail to contribute, students who domineer, students who have personal conflict, their impact upon progress of their groups, and inconsistent individual acquisition of required skills. At the Pediatrics Department, Faculty of Medicine, Cairo University, we designed a novel undergraduate pediatric EBM assignment online system to overcome shortcomings of the previous didactic method and aimed to assess its effectiveness by prospective follow-up during academic years 2012 to 2013 and 2013 to 2014. The novel web-based online interactive system was tailored to provide sequential single and group assignments for each student. The single assignment addressed a specific case scenario question, while the group assignment was teamwork that addressed different questions of the same case scenario. Assignments comprised scholarly content and skills. We objectively analyzed students' performance by criterion-based assessment and subjectively by anonymous student questionnaire. A total of 2879 students were consecutively enrolled in the 5th year Pediatrics Course, of them 2779 (96.5%) logged in and 2554 (88.7%) submitted their work. They were randomly assigned to 292 groups. A total of 2277 (89.15%) achieved ≥80% of the total mark (4/5), of them 717 (28.1%) achieved a full mark. A total of 2178 (85.27%) and 2359 (92.36%) made evidence-based conclusions and recommendations in single and group assignments, respectively (P < 0.001). 
A total of 1102 (43.1%) answered the student questionnaire; of them, 898 (81.48%) found the e-educational experience satisfactory, 175 (15.88%) disagreed, and 29 (2.6%) could not decide. A total of 964 (87.47%) found the single assignment educational, 913 (82.84%) found the group assignment educational, and 794 (72.3%) enjoyed it. The web-based online interactive undergraduate EBM assignment was found effective in teaching medical students and assured individual student acquisition of the concepts and skills of pediatric EBM. It was effective in mass education and in the data collection and storage essential for system and student assessment. PMID:26200621
Accurate continuous geographic assignment from low- to high-density SNP data.
Guillot, Gilles; Jónsson, Hákon; Hinge, Antoine; Manchih, Nabil; Orlando, Ludovic
2016-04-01
Large-scale genotype datasets can help track the dispersal patterns of epidemiological outbreaks and predict the geographic origins of individuals. Such genetically-based geographic assignments also show a range of possible applications in forensics for profiling both victims and criminals, and in wildlife management, where poaching hotspot areas can be located. They, however, require fast and accurate statistical methods to handle the growing amount of genetic information made available from genotype arrays and next-generation sequencing technologies. We introduce a novel statistical method for geopositioning individuals of unknown origin from genotypes. Our method is based on a geostatistical model trained with a dataset of georeferenced genotypes. Statistical inference under this model can be implemented within the theoretical framework of Integrated Nested Laplace Approximation, which represents one of the major recent breakthroughs in statistics, as it does not require Monte Carlo simulations. We compare the performance of our method and an alternative method for geospatial inference, SPA, in a simulation framework. We highlight the accuracy and limits of continuous spatial assignment methods at various scales by analyzing genotype datasets from a diversity of species, including Florida Scrub-jay birds Aphelocoma coerulescens, Arabidopsis thaliana and humans, representing 41 to 197,146 SNPs. Our method appears to be best suited for the analysis of medium-sized datasets (a few tens of thousands of loci), such as reduced-representation sequencing data that become increasingly available in ecology. The software is available at http://www2.imm.dtu.dk/∼gigu/Spasiba/ (contact: gilles.b.guillot@gmail.com). Supplementary data are available at Bioinformatics online.
Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.
2013-01-01
A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
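One simplified ingredient of such a framework, sketched here with hypothetical peptide sequences, intensities, and probabilities, is weighting each peptide intensity by its prior probability of assignment when aggregating to the protein level:

```python
# Each peptide carries a prior probability of being correctly assigned to
# the protein of interest; shared or unreliable peptides contribute with
# lower weight. All values below are hypothetical, not from the paper.
peptides = [
    ("PEPTIDEA", 12000.0, 0.99),  # (sequence, intensity, P(assignment))
    ("PEPTIDEB", 8000.0, 0.90),
    ("PEPTIDEC", 3000.0, 0.40),   # shared peptide, low assignment probability
]
protein_intensity = sum(intensity * p for _, intensity, p in peptides)
```

The actual algorithm goes much further (Poisson counting statistics, automatic noise estimation, outlier reweighting, credible intervals), but probability-weighted aggregation is the step that lets variable-reliability assignments enter the quantification without hard thresholds.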
Packet Randomized Experiments for Eliminating Classes of Confounders
Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.
2014-01-01
Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088
High-rise architecture in Ufa, Russia, based on crystallography canons
NASA Astrophysics Data System (ADS)
Narimanovich Sabitov, Ildar; Radikovna Kudasheva, Dilara; Yaroslavovich Vdovin, Denis
2018-03-01
The article considers the fundamental steps in the formation of stylistic tendencies in high-rise architecture, based on the studies of C. Willis and M. A. Korotich. Crystallographic shaping is identified as a direction on the basis of M. A. Korotich's classification. This direction is examined in detail, and the main aspects of high-rise architectural form-making based on the forming principles of natural polycrystals are identified. The article describes the transformation of crystal forms into an architectural composition, analyzes structural systems within the framework of the CTBUH (Council on Tall Buildings and Urban Habitat) classification, and selects one of its types as the most suitable for use in crystal-like buildings. The last stage of the research applies these theoretical principles in an experimental design for a high-rise building in Ufa, including a description of its contextual siting.
A Framework for Simulation of Aircraft Flyover Noise Through a Non-Standard Atmosphere
NASA Technical Reports Server (NTRS)
Arntzen, Michael; Rizzi, Stephen A.; Visser, Hendrikus G.; Simons, Dick G.
2012-01-01
This paper describes a new framework for the simulation of aircraft flyover noise through a non-standard atmosphere. Central to the framework is a ray-tracing algorithm which defines multiple curved propagation paths, if the atmosphere allows, between the moving source and listener. Because each path has a different emission angle, synthesis of the sound at the source must be performed independently for each path. The time delay, spreading loss and absorption (ground and atmosphere) are integrated along each path, and applied to each synthesized aircraft noise source to simulate a flyover. A final step assigns each resulting signal to its corresponding receiver angle for the simulation of a flyover in a virtual reality environment. Spectrograms of the results from a straight path and a curved path modeling assumption are shown. When the aircraft is at close range, the straight path results are valid. Differences appear especially when the source is relatively far away at shallow elevation angles. These differences, however, are not significant in common sound metrics. While the framework used in this work performs off-line processing, it is conducive to real-time implementation.
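A minimal sketch of the per-path propagation terms for the straight-path case, assuming spherical spreading and a uniform sound speed (both simplifications of the framework's integrated, path-dependent treatment; the function name and parameter values are illustrative):

```python
import math

# Time delay and spherical spreading loss over a straight path of given
# length. The sound speed and 1 m reference distance are assumptions;
# the paper's framework integrates these quantities along curved paths.
def propagation(distance_m, c_m_s=340.0, ref_m=1.0):
    delay_s = distance_m / c_m_s                       # propagation delay
    spreading_db = 20.0 * math.log10(distance_m / ref_m)  # spherical spreading
    return delay_s, spreading_db

delay, loss = propagation(1000.0)  # 1 km slant range
```

In the ray-traced case these terms differ per path, which is why the source synthesis must be repeated for each emission angle before the signals are assigned to receiver angles.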
NASA Astrophysics Data System (ADS)
Llorens, Pilar; Gallart, Francesc; Latron, Jérôme; Cid, Núria; Rieradevall, Maria; Prat, Narcís
2016-04-01
Aquatic life in temporary streams is strongly conditioned by the temporal variability of the hydrological conditions that control the occurrence and connectivity of diverse mesohabitats. In this context, the software TREHS (Temporary Rivers' Ecological and Hydrological Status) has been developed, in the framework of the LIFE Trivers project, to help managers adequately implement the Water Framework Directive in this type of water body. TREHS, using the methodology described in Gallart et al. (2012), defines six temporal 'aquatic states', based on the hydrological conditions representing different mesohabitats, for a given reach at a particular moment. Nevertheless, hydrological data for assessing the regime of temporary streams are often non-existent or scarce. The scarcity of flow data frequently makes it impossible to characterize the hydrological regimes of temporary streams and, as a consequence, to select the correct periods and methods to determine their ecological status. Because of its qualitative nature, the TREHS approach allows the use of alternative methodologies to assess the regime of temporary streams in the absence of observed flow data. However, to adapt TREHS to such qualitative data, both the temporal scheme (from monthly to seasonal) and the number of aquatic states (from 6 to 3) were modified. Two alternative, complementary methodologies were tested within the TREHS framework to assess the regime of temporary streams: interviews and aerial photographs. All 13 gauging stations belonging to the Catalan Internal Catchments (NE Spain) with recurrent zero-flow periods were selected to validate both methodologies. On one hand, non-structured interviews were carried out with inhabitants of villages and small towns near the gauging stations. Flow permanence metrics for input into TREHS were drawn from the notes taken during the interviews. 
On the other hand, the historical series of available aerial photographs (typically 10) were examined. In this case, flow permanence metrics were estimated as the proportion of photographs presenting stream flow. Results indicate that for streams that are dry more than 25% of the time, interviews systematically underestimated flow, but the qualitative information given by inhabitants was of great interest for understanding river dynamics. On the other hand, the use of aerial photographs gave a good estimation of flow permanence, but the seasonality was conditioned by the capture dates of the aerial photographs. For these reasons, we recommend using both methodologies together.
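The aerial-photograph metric is straightforward to compute; a sketch with hypothetical photograph dates and flow observations (not data from the Catalan stations):

```python
# Flow permanence estimated as the fraction of historical aerial
# photographs in which the reach shows flowing water. Dates and
# True/False flow flags below are hypothetical.
photos = {
    "1956-07": True, "1967-08": True, "1975-06": False, "1984-07": True,
    "1993-08": False, "2001-06": True, "2006-07": True, "2010-08": False,
    "2013-06": True, "2016-07": True,
}
flow_permanence = sum(photos.values()) / len(photos)
```

Note how the estimate inherits the bias discussed above: if most photographs were captured in the same season, the fraction reflects that season rather than the full annual regime.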
NASA Astrophysics Data System (ADS)
Saito, Namiko
Studies in turbulence often focus on two flow conditions, both of which occur frequently in real-world flows and are sought-after for their value in advancing turbulence theory. These are the high Reynolds number regime and the effect of wall surface roughness. In this dissertation, a Large-Eddy Simulation (LES) recreates both conditions over a wide range of Reynolds numbers, Re_tau = O(10^2) to O(10^8), and accounts for roughness by locally modeling the statistical effects of near-wall anisotropic fine scales in a thin layer immediately above the rough surface. A subgrid, roughness-corrected wall model is introduced to dynamically transmit this modeled information from the wall to the outer LES, which uses a stretched-vortex subgrid-scale model operating in the bulk of the flow. Of primary interest is the Reynolds number and roughness dependence of these flows in terms of first- and second-order statistics. The LES is first applied to a fully turbulent uniformly-smooth/rough channel flow to capture the flow dynamics over smooth, transitionally rough and fully rough regimes. Results include a Moody-like diagram for the wall averaged friction factor, believed to be the first of its kind obtained from LES. Confirmation is found for experimentally observed logarithmic behavior in the normalized stream-wise turbulent intensities. Tight logarithmic collapse, scaled on the wall friction velocity, is found for smooth-wall flows when Re_tau ≥ O(10^6) and in fully rough cases. Since the wall model operates locally and dynamically, the framework is used to investigate non-uniform roughness distribution cases in a channel, where the flow adjustments to sudden surface changes are investigated. Recovery of mean quantities and turbulent statistics after transitions are discussed qualitatively and quantitatively at various roughness and Reynolds number levels. 
The internal boundary layer, which is defined as the border between the flow affected by the new surface condition and the unaffected part, is computed, and a collapse of the profiles on a length scale containing the logarithm of friction Reynolds number is presented. Finally, we turn to the possibility of expanding the present framework to accommodate more general geometries. As a first step, the whole LES framework is modified for use in the curvilinear geometry of a fully-developed turbulent pipe flow, with implementation carried out in a spectral element solver capable of handling complex wall profiles. The friction factors have shown favorable agreement with the superpipe data, and the LES estimates of the Karman constant and additive constant of the log-law closely match values obtained from experiment.
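The logarithmic behavior and the "Karman constant and additive constant of the log-law" referred to above follow the classical law of the wall; for reference (standard form, not a result specific to this dissertation):

```latex
u^+ = \frac{1}{\kappa}\ln y^+ + B,
\qquad u^+ \equiv \frac{\bar{u}}{u_\tau},
\quad y^+ \equiv \frac{y\,u_\tau}{\nu},
```

where κ is the Kármán constant and B the additive constant; over a rough wall, B is reduced by a roughness function ΔU⁺ that distinguishes the transitionally rough and fully rough regimes discussed above.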
Thamke, Joanna N.; LeCain, Gary D.; Ryter, Derek W.; Sando, Roy; Long, Andrew J.
2014-01-01
Regionally, water in the lower Tertiary and Upper Cretaceous aquifer systems flows in a northerly or northeasterly direction from the Powder River structural basin to the Williston structural basin. Groundwater flow in the Williston structural basin generally is easterly or northeasterly. In the Williston structural basin, flow in the uppermost hydrogeologic units is more local and more strongly controlled by topography in the unglaciated part than in the glaciated part or in the underlying aquifers. Groundwater flow in the Powder River structural basin generally is northerly, with local variations greatest in the uppermost aquifers. In the underlying aquifers, groundwater is confined and flow is regional.
REVIEWS OF TOPICAL PROBLEMS: Axisymmetric stationary flows in compact astrophysical objects
NASA Astrophysics Data System (ADS)
Beskin, Vasilii S.
1997-07-01
A review is presented of the analytical results available for a large class of axisymmetric stationary flows in the vicinity of compact astrophysical objects. The determination of the two-dimensional structure of the poloidal magnetic field (hydrodynamic flow field) faces severe difficulties, due to the complexity of the trans-field equation for stationary axisymmetric flows. However, an approach exists which enables direct problems to be solved even within the balance law framework. This possibility arises when an exact solution to the equation is available and flows close to it are investigated. As a result, with the use of simple model problems, the basic features of supersonic flows past real compact objects are determined.
Advancing the use of performance evaluation in health care.
Traberg, Andreas; Jacobsen, Peter; Duthiers, Nadia Monique
2014-01-01
The purpose of this paper is to develop a framework for health care performance evaluation that enables decision makers to identify areas indicative of corrective actions. The framework should provide information on strategic pro-/regress in an operational context that justifies the need for organizational adjustments. The study adopts qualitative methods for constructing the framework, subsequently implementing the framework in a Danish magnetic resonance imaging (MRI) unit. Workshops and interviews form the basis of the qualitative construction phase, and two internal and five external databases are used for a quantitative data collection. By aggregating performance outcomes, collective measures of performance are achieved. This enables easy and intuitive identification of areas not strategically aligned. In general, the framework has proven helpful in an MRI unit, where operational decision makers have been struggling with extensive amounts of performance information. The implementation of the framework in a single case in a public and highly political environment restricts the generalizing potential. The authors acknowledge that there may be more suitable approaches in organizations with different settings. The strength of the framework lies in the identification of performance problems prior to decision making. The quality of decisions is directly related to the individual decision maker. The only function of the framework is to support these decisions. The study demonstrates a more refined and transparent use of performance reporting by combining strategic weight assignment and performance aggregation in hierarchies. In this way, the framework accentuates performance as a function of strategic progress or regress, thus assisting decision makers in exerting operational effort in pursuit of strategic alignment.
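The combination of strategic weight assignment and performance aggregation in hierarchies can be sketched as a weighted roll-up of normalized indicators. The following is illustrative only; the indicator names and weights are hypothetical, not the MRI unit's actual measures:

```python
def aggregate(scores, weights):
    """Weighted aggregate of normalized performance scores (0-1 scale).
    Weights encode strategic priority and need not be pre-normalized."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# Hypothetical MRI-unit indicators, each already normalized to [0, 1].
scores = {"scan_throughput": 0.8, "report_turnaround": 0.6, "patient_wait": 0.4}
weights = {"scan_throughput": 3.0, "report_turnaround": 2.0, "patient_wait": 1.0}
print(aggregate(scores, weights))  # collective measure, roughly 0.667
```

A single collective measure like this is what lets decision makers spot areas that are not strategically aligned without wading through the raw performance information.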
2009-04-03
Project, 2002. Bell D.: The Coming of Post-Industrial Society. Basic Books, New York 1976. Blumer H.: Symbolic Interactionism – Perspective and...communication and use by assigned meaning through known conventions used in symbolic representation. 05. The ability to receive, share and transmit information...communicated by symbols (information), i.e., through concepts within the mind that represent reality. 33. The meaning and value of information depends
Reconfiguration in Robust Distributed Real-Time Systems Based on Global Checkpoints
1991-12-01
achieved by utilizing distributed systems in which a single application program executes on multiple processors, connected to a network. The distributed...single application program executes on multiple proces- sors, connected to a network. The distributed nature of such systems make it possible to ...resident at every node. How - ever, the responsibility for execution of a particular function is assigned to only one node in this framework. This function
ERIC Educational Resources Information Center
Branum-Martin, Lee; Patchan, Melissa M.
2016-01-01
Peer learning is often used in classrooms to support knowledge and skill acquisition. One form of peer learning, peer assessment, involves the quantitative (i.e., peer ratings) or qualitative (i.e., peer feedback) evaluation of one learner's performance by another. While we might be concerned about the quality of the…
A Multithreaded Missions And Means Framework (MMF) Concept Report
2012-03-01
Vasconcelos , W.; Gibson, C.; Bar-Noy, A.; Borowiecki, K.; La Porta, T.; Pizzocaro, D.; Rowaihy, H.; Pearson, G.; Pham, T. An Ontology Centric...M.; de Mel, G.; Vasconcelos , W.; Sleeman, D.; Colley, S.; La Porta, T. An Ontology-Based Approach to Sensor-Mission Assignment. Proceedings of the...1st Annual Conference of the International Technology Alliance (ACITA 2007), 2007. Preece, A.; Gomez, M.; de Mel, G.; Vasconcelos , W.; Sleeman, D
Coalition Formation under Uncertainty
2010-03-01
world robotics and demonstrate the algorithm’s scalability. This provides a framework well suited to decentralized task allocation in general collectives...impatience and acquiescence to define a robot allocation to a task in a decentralized manner. The tasks are assigned to the entire collective, and one...20] allocates tasks to robots with a first-price auction method [31]. It announces a task with defined metrics, then the robots issue bids. The task
Kovalchuk, Sergey V; Funkner, Anastasia A; Metsker, Oleg G; Yakovlev, Aleksey N
2018-06-01
An approach to building a hybrid simulation of patient flow is introduced with a combination of data-driven methods for automation of model identification. The approach is described with a conceptual framework and basic methods for combination of different techniques. The implementation of the proposed approach for simulation of the acute coronary syndrome (ACS) was developed and used in an experimental study. A combination of data mining, text mining, and process mining techniques and machine learning approaches for the analysis of electronic health records (EHRs) with discrete-event simulation (DES) and queueing theory for the simulation of patient flow was proposed. The performed analysis of EHRs for ACS patients enabled identification of several classes of clinical pathways (CPs) which were used to implement a more realistic simulation of the patient flow. The developed solution was implemented using Python libraries (SimPy, SciPy, and others). The proposed approach enables a more realistic and detailed simulation of the patient flow within a group of related departments. An experimental study shows an improved simulation of patient length of stay for ACS patient flow obtained from EHRs in Almazov National Medical Research Centre in Saint Petersburg, Russia. The proposed approach, methods, and solutions provide a conceptual, methodological, and programming framework for the implementation of a simulation of complex and diverse scenarios within a flow of patients for different purposes: decision making, training, management optimization, and others. Copyright © 2018 Elsevier Inc. All rights reserved.
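The authors implemented their simulation with SimPy; to illustrate the underlying queueing idea, here is a minimal single-server discrete-event sketch in pure Python, with hypothetical arrival and service parameters rather than the EHR-derived ones:

```python
import random

def simulate_patient_flow(n_patients, mean_interarrival, mean_service, seed=1):
    """Single-server queue (one department): patients arrive with exponential
    interarrival times and are served in order. Returns the mean length of
    stay (waiting time plus service time) over all patients."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    server_free_at = 0.0
    total_stay = 0.0
    for arrive in arrivals:
        start = max(arrive, server_free_at)        # wait if the server is busy
        service = rng.expovariate(1.0 / mean_service)
        server_free_at = start + service
        total_stay += server_free_at - arrive      # length of stay
    return total_stay / n_patients

# Hypothetical unit: a patient every 10 min on average, 6 min mean service.
print(simulate_patient_flow(1000, mean_interarrival=10.0, mean_service=6.0))
```

The paper's contribution is precisely that such generic queueing structure is replaced by mined, class-specific clinical pathways, giving length-of-stay distributions closer to those observed in the EHRs.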
Information Flow in Interaction Networks II: Channels, Path Lengths, and Potentials
Stojmirović, Aleksandar
2012-01-01
In our previous publication, a framework for information flow in interaction networks based on random walks with damping was formulated with two fundamental modes: emitting and absorbing. While many other network analysis methods based on random walks or equivalent notions have been developed before and after our earlier work, one can show that they can all be mapped to one of the two modes. In addition to these two fundamental modes, a major strength of our earlier formalism was its accommodation of context-specific directed information flow that yielded plausible and meaningful biological interpretation of protein functions and pathways. However, the directed flow from origins to destinations was induced via a potential function that was heuristic. Here, with a theoretically sound approach called the channel mode, we extend our earlier work for directed information flow. This is achieved by constructing a potential function facilitating a purely probabilistic interpretation of the channel mode. For each network node, the channel mode combines the solutions of emitting and absorbing modes in the same context, producing what we call a channel tensor. The entries of the channel tensor at each node can be interpreted as the amount of flow passing through that node from an origin to a destination. Similarly to our earlier model, the channel mode encompasses damping as a free parameter that controls the locality of information flow. Through examples involving the yeast pheromone response pathway, we illustrate the versatility and stability of our new framework. PMID:22409812
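The emitting mode's damped random walk can be sketched as accumulating visit mass over successive damped steps from an origin node. The following toy pure-Python version on a small undirected network is illustrative only; it is not the paper's channel-tensor construction:

```python
def emitting_flow(adj, origin, damping=0.85, tol=1e-12):
    """Expected visits to each node by a random walk started at `origin`
    that continues at each step with probability `damping` (else dissipates).
    adj: dict mapping node -> list of neighbours."""
    nodes = list(adj)
    visits = {n: 0.0 for n in nodes}
    current = {n: 0.0 for n in nodes}
    current[origin] = 1.0
    while max(current.values()) > tol:
        for n in nodes:
            visits[n] += current[n]
        nxt = {n: 0.0 for n in nodes}
        for n in nodes:
            mass = current[n] * damping       # damping drains mass each step
            if adj[n]:
                share = mass / len(adj[n])
                for m in adj[n]:
                    nxt[m] += share
        current = nxt
    return visits

# Three-node path a - b - c: all flow from a to c must pass through hub b.
net = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
flows = emitting_flow(net, "a")
print(flows)  # the hub b accumulates the most flow; distant c the least
```

Lowering the damping parameter makes the flow more local around the origin, mirroring the role damping plays in the paper's framework.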
Cartwright, Jennifer M.; Caldwell, Casey; Nebiker, Steven; Knight, Rodney
2017-01-01
This paper presents a conceptual framework to operationalize flow–ecology relationships into decision-support systems of practical use to water-resource managers, who are commonly tasked with balancing multiple competing socioeconomic and environmental priorities. We illustrate this framework with a case study, whereby fish community responses to various water-management scenarios were predicted in a partially regulated river system at a local watershed scale. This case study simulates management scenarios based on interactive effects of dam operation protocols, withdrawals for municipal water supply, effluent discharges from wastewater treatment, and inter-basin water transfers. Modeled streamflow was integrated with flow–ecology relationships relating hydrologic departure from reference conditions to fish species richness, stratified by trophic, reproductive, and habitat characteristics. Adding a hypothetical new water-withdrawal site was predicted to increase the frequency of low-flow conditions with adverse effects for several fish groups. Imposition of new reservoir release requirements was predicted to enhance flow and fish species richness immediately downstream of the reservoir, but these effects were dissipated further downstream. The framework presented here can be used to translate flow–ecology relationships into evidence-based management by developing decision-support systems for conservation of riverine biodiversity while optimizing water availability for human use.
Why are natural disasters not 'natural' for victims?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumagai, Yoshitaka; Edwards, John; Carroll, Matthew S.
Some type of formal or informal social assessment is often carried out in the wake of natural disasters. One often-observed phenomenon in such situations is that disaster victims and their sympathizers tend to focus on those elements of disasters that might have been avoided or mitigated by human intervention and thus assign 'undue' levels of responsibility to human agents. Often the responsibility or blame is directed at the very government agencies charged with helping people cope with and recover from the event. This phenomenon presents particular challenges for those trying to understand the social impacts of such events because of the reflexive nature of such analysis. Often the social analyst or even the government agency manager must sort through such perceptions and behavior and (at least implicitly) make judgments about which assignments of responsibility may have some validity and which are largely the result of the psychology of the disaster itself. This article presents a conceptual framework derived largely from social psychology to help develop a better understanding of such perceptions and behavior. While no 'magic bullet' formula for evaluating the validity of disaster victims' claims is presented, the conceptual framework is presented as a starting point for understanding this particular aspect of the psychology of natural disasters.
Prediction of Complex Aerodynamic Flows with Explicit Algebraic Stress Models
NASA Technical Reports Server (NTRS)
Abid, Ridha; Morrison, Joseph H.; Gatski, Thomas B.; Speziale, Charles G.
1996-01-01
An explicit algebraic stress equation, developed by Gatski and Speziale, is used in the framework of K-epsilon formulation to predict complex aerodynamic turbulent flows. The nonequilibrium effects are modeled through coefficients that depend nonlinearly on both rotational and irrotational strains. The proposed model was implemented in the ISAAC Navier-Stokes code. Comparisons with the experimental data are presented which clearly demonstrate that explicit algebraic stress models can predict the correct response to nonequilibrium flow.
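For context, the baseline K-epsilon closure relates the eddy viscosity to the turbulent kinetic energy k and its dissipation rate ε (standard form, not the specific Gatski-Speziale coefficients):

```latex
\nu_t = C_\mu \frac{k^2}{\epsilon}, \qquad
-\overline{u_i' u_j'} = 2\,\nu_t S_{ij} - \frac{2}{3} k\,\delta_{ij},
```

in the explicit algebraic stress formulation, the constant C_μ is replaced by a function of invariants of the mean strain-rate and rotation tensors, which is how the nonlinear dependence on rotational and irrotational strains described above enters the model.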
NASA Astrophysics Data System (ADS)
Mahmoudzadeh, Javid; Wlodarczyk, Marta; Cassel, Kevin
2017-11-01
Development of excessive intimal hyperplasia (IH) in the cephalic vein of renal failure patients who receive chronic hemodialysis treatment results in vascular access failure and multiple treatment complications. Specifically, cephalic arch stenosis (CAS) is known to exacerbate hypertensive blood pressure, thrombosis, and subsequent cardiovascular incidents that would necessitate costly interventional procedures with low success rates. It has been hypothesized that excessive blood flow rate after access maturation, which strongly violates venous homeostasis, is the main hemodynamic factor that orchestrates the onset and development of CAS. In this article, a computational framework based on a strong coupling of computational fluid dynamics (CFD) and shape optimization is proposed that aims to identify the effective blood flow rate on a patient-specific basis that avoids the onset of CAS while providing the adequate blood flow rate required to facilitate hemodialysis. This effective flow rate can be achieved through implementation of Miller's surgical banding method after the maturation of the arteriovenous fistula and is rooted in the relaxation of wall stresses back to a homeostatic target value. The results indicate that this optimized hemodialysis blood flow rate is, in fact, a subject-specific value that can be assessed after vascular access maturation and prior to the initiation of chronic hemodialysis treatment as a mitigative action against CAS-related access failure. This computational technology can be employed for individualized dialysis treatment.
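The idea of relaxing wall stresses back to a homeostatic target can be illustrated with a drastically simplified 1-D surrogate: in steady Poiseuille flow the wall shear stress is τ = 4μQ/(πR³), so the flow rate meeting a target stress follows by direct inversion. This is an illustrative reduction with hypothetical numbers, not the paper's coupled CFD-shape-optimization procedure:

```python
import math

def poiseuille_wss(flow_rate_m3s, radius_m, viscosity_pa_s=3.5e-3):
    """Wall shear stress (Pa) for steady laminar flow in a straight tube."""
    return 4.0 * viscosity_pa_s * flow_rate_m3s / (math.pi * radius_m ** 3)

def flow_rate_for_target_wss(target_wss_pa, radius_m, viscosity_pa_s=3.5e-3):
    """Invert the Poiseuille relation for the flow rate hitting a target WSS."""
    return target_wss_pa * math.pi * radius_m ** 3 / (4.0 * viscosity_pa_s)

# Hypothetical vein-like numbers: 4 mm radius, 1 Pa homeostatic target stress.
q = flow_rate_for_target_wss(1.0, 0.004)
print(q * 6e7)  # flow rate expressed in mL/min (1 m^3/s = 6e7 mL/min)
```

In the actual framework the straight-tube formula is replaced by patient-specific CFD of the cephalic arch, and the inversion by shape optimization, but the control variable, the banded flow rate, plays the same role.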
Stability and sensitivity of ABR flow control protocols
NASA Astrophysics Data System (ADS)
Tsai, Wie K.; Kim, Yuseok; Chiussi, Fabio; Toh, Chai-Keong
1998-10-01
This tutorial paper surveys the important issues in stability and sensitivity analysis of ABR flow control in ATM networks. The stability and sensitivity issues are formulated in a systematic framework. Four main causes of instability in ABR flow control are identified: unstable control laws, temporal variation of available bandwidth under delayed feedback control, misbehaving components, and interactions between higher-layer protocols and ABR flow control. Popular rate-based ABR flow control protocols are evaluated. Stability and sensitivity are shown to be fundamental issues when the network has dynamically varying bandwidth. Simulation results confirming the theoretical studies are provided. Open research problems are discussed.
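The interaction of feedback delay and control gain, one of the causes of instability listed above, can be reproduced with a toy rate-control law: a source adjusts its rate proportionally to the capacity error fed back several steps late, and the loop diverges when the gain is too large for the delay. This is a generic illustration, not any specific ATM ABR protocol:

```python
def simulate_rate_control(gain, delay, steps=200, capacity=100.0, r0=10.0):
    """Source rate driven by delayed proportional feedback:
    r[t+1] = r[t] + gain * (capacity - r[t - delay]).
    Returns the full trajectory of rates."""
    rates = [r0] * (delay + 1)
    for _ in range(steps):
        error = capacity - rates[-1 - delay]   # feedback is `delay` steps old
        rates.append(rates[-1] + gain * error)
    return rates

stable = simulate_rate_control(gain=0.2, delay=2)     # converges to capacity
unstable = simulate_rate_control(gain=1.5, delay=2)   # oscillates and diverges
print(abs(stable[-1] - 100.0), abs(unstable[-1] - 100.0))
```

The same control law is stable at zero delay for any gain below 2; adding feedback delay shrinks the stable gain region, which is exactly the sensitivity phenomenon the survey formalizes.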
Fine bed material in pools of natural gravel bed channels
Thomas E. Lisle; Sue Hilton
1999-01-01
Natural gravel bed channels commonly contain a fine mode of sand and fine gravel that fills voids of the bed framework of coarser gravel. If the supply of fine bed material exceeds the storage capacity of framework voids, excess fine material forms surficial patches, which can be voluminous in pools during low flow. Data collected in 34 natural channels in…
Patient-Specific Simulation of Cardiac Blood Flow From High-Resolution Computed Tomography.
Lantz, Jonas; Henriksson, Lilian; Persson, Anders; Karlsson, Matts; Ebbers, Tino
2016-12-01
Cardiac hemodynamics can be computed from medical imaging data, and results could potentially aid in cardiac diagnosis and treatment optimization. However, simulations are often based on simplified geometries, ignoring features such as papillary muscles and trabeculae due to their complex shape, limitations in image acquisitions, and challenges in computational modeling. This severely hampers the use of computational fluid dynamics in clinical practice. The overall aim of this study was to develop a novel numerical framework that incorporated these geometrical features. The model included the left atrium, ventricle, ascending aorta, and heart valves. The framework used image registration to obtain patient-specific wall motion, automatic remeshing to handle topological changes due to the complex trabeculae motion, and a fast interpolation routine to obtain intermediate meshes during the simulations. Velocity fields and residence time were evaluated, and they indicated that papillary muscles and trabeculae strongly interacted with the blood, which could not be observed in a simplified model. The framework resulted in a model with outstanding geometrical detail, demonstrating the feasibility as well as the importance of a framework that is capable of simulating blood flow in physiologically realistic hearts.
Rattner, Alexander S.; Guillen, Donna Post; Joshi, Alark; ...
2016-03-17
Photo- and physically realistic techniques are often insufficient for visualization of fluid flow simulations, especially for 3D and time-varying studies. Substantial research effort has been dedicated to the development of non-photorealistic and illustration-inspired visualization techniques for compact and intuitive presentation of such complex datasets. However, a great deal of work has been reproduced in this field, as many research groups have developed specialized visualization software. Additionally, interoperability between illustrative visualization software is limited due to diverse processing and rendering architectures employed in different studies. In this investigation, a framework for illustrative visualization is proposed, and implemented in MarmotViz, a ParaView plug-in, enabling its use on a variety of computing platforms with various data file formats and mesh geometries. Region-of-interest identification and feature-tracking algorithms incorporated into this tool are described. Implementations of multiple illustrative effect algorithms are also presented to demonstrate the use and flexibility of this framework. Here, by providing an integrated framework for illustrative visualization of CFD data, MarmotViz can serve as a valuable asset for the interpretation of simulations of ever-growing scale.
Can You Build It? Using Manipulatives to Assess Student Understanding of Food-Web Concepts
ERIC Educational Resources Information Center
Grumbine, Richard
2012-01-01
This article outlines an exercise that assesses student knowledge of food-web and energy-flow concepts. Students work in teams and use manipulatives to build food-web models based on criteria assigned by the instructor. The models are then peer reviewed according to guidelines supplied by the instructor.
Strategic planning for health care management information systems.
Rosenberger, H R; Kaiser, K M
1985-01-01
Using a planning methodology and a structured design technique for analyzing data and data flow, information requirements can be derived to produce a strategic plan for a management information system. Such a long-range plan classifies information groups and assigns them priorities according to the goals of the organization. The approach emphasizes user involvement.
Use of an Interactive General-Purpose Computer Terminal to Simulate Training Equipment Operation.
ERIC Educational Resources Information Center
Lahey, George F.; And Others
Trainees from the Navy Basic Electricity/Electronics School were assigned to receive either computer-assisted instruction (CAI) or conventional individualized instruction in a segment of a course requiring use of a multimeter to measure resistance and current flow. The CAI group used PLATO IV plasma-screen terminals; individualized instruction…
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or timing of cash flows are uncertain and are not fixed under § 436.14, Federal agencies may examine the impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order...
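The sensitivity of life cycle cost to uncertain cash-flow timing contemplated by § 436.24 can be probed by recomputing present value under alternative timings. A minimal sketch with hypothetical figures, not values from the rule:

```python
def present_value(cash_flows, discount_rate):
    """cash_flows: list of (year, amount) pairs. Discounts each amount
    back to year 0 at the given annual rate."""
    return sum(amount / (1.0 + discount_rate) ** year
               for year, amount in cash_flows)

# Baseline: $1,000 maintenance cost in each of years 1-5, discounted at 3%.
baseline = [(year, 1000.0) for year in range(1, 6)]
# Uncertain timing scenario: the same costs occur one year later.
delayed = [(year + 1, 1000.0) for year in range(1, 6)]

pv_base = present_value(baseline, 0.03)
pv_delayed = present_value(delayed, 0.03)
print(pv_base - pv_delayed)  # delaying the costs lowers their present value
```

Comparing such scenarios shows whether the rank order of alternatives (the regulation's concern) is robust to the timing uncertainty.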
Integrating Business Core Knowledge through Upper Division Report Composition
ERIC Educational Resources Information Center
Roach, Joy; Tracy, Daniel; Durden, Kay
2007-01-01
The most ambitious project of many undergraduate business communication courses is the formal report. This assignment typically requires the use of many writing skills nurtured throughout the course. Skills such as proper style, tone, organization, flow, and mechanics are enhanced through the writing of memos and various types of letters. While…
NASA Technical Reports Server (NTRS)
Smith, Greg
2003-01-01
Schedule risk assessments determine the likelihood of finishing on time. Each task in a schedule has a varying probability of being finished on time, and a schedule risk assessment quantifies these probabilities by assigning values to each task. This viewgraph presentation contains a flow chart for conducting a schedule risk assessment and profiles several applicable methods of data analysis.
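Assigning duration estimates to each task and rolling them up into a completion probability is commonly done by Monte Carlo sampling; a minimal sketch for a serial chain of tasks with triangular three-point estimates, using hypothetical numbers rather than any from the presentation:

```python
import random

def schedule_risk(tasks, deadline, trials=20000, seed=42):
    """tasks: list of (optimistic, most_likely, pessimistic) duration
    estimates for a serial chain of tasks. Returns the estimated
    probability of finishing by `deadline`."""
    rng = random.Random(seed)
    on_time = 0
    for _ in range(trials):
        total = sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        if total <= deadline:
            on_time += 1
    return on_time / trials

# Three tasks, each as (optimistic, most likely, pessimistic) in days.
tasks = [(2, 3, 6), (1, 2, 4), (4, 5, 9)]
print(schedule_risk(tasks, deadline=12.0))
```

Real schedules also have parallel paths and merge bias, so a production tool samples the whole network, but the per-task probability assignment the presentation describes is the same.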