The mutual causality analysis between the stock and futures markets
NASA Astrophysics Data System (ADS)
Yao, Can-Zhong; Lin, Qing-Wen
2017-07-01
In this paper we employ the conditional Granger causality model to estimate information flow, and find that the improved model outperforms the standard Granger causality model in revealing the asymmetric correlation between stocks and futures in the Chinese market. First, we find that information flows estimated by Granger causality tests from futures to stocks are greater than those from stocks to futures. Additionally, average correlation coefficients capture some important characteristics of the relationship between stock prices and information flows over time. Further, we find that direct information flows estimated by conditional Granger causality tests from stocks to futures are greater than those from futures to stocks. Moreover, the substantial increases in information flows and direct information flows are roughly synchronous with the occurrence of important events. Finally, a comparative analysis using the asymmetric ratio and the bootstrap technique demonstrates a slight asymmetry of information flows and a significant asymmetry of direct information flows: information flows from futures to stocks are slightly greater than those in the reverse direction, while direct information flows from stocks to futures are significantly greater than those in the reverse direction.
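As a rough illustration of the kind of test behind these estimates, the sketch below runs a bivariate Granger-causality F-test and, when a third series is supplied, a conditional variant that adds its lags to the restricted model. It is a minimal stand-in, not the authors' estimator; the lag order, significance test, and synthetic data are assumptions.

    # Minimal sketch of a (conditional) Granger-causality test between two series,
    # assuming stationary data and an arbitrary lag order p. Illustrative only.
    import numpy as np
    from scipy import stats

    def lagged(x, p):
        """Stack columns x[t-1], ..., x[t-p] for t = p..len(x)-1."""
        return np.column_stack([x[p - k:len(x) - k] for k in range(1, p + 1)])

    def granger_f(y, x, p=2, z=None):
        """F-test that x Granger-causes y; optionally condition on series z."""
        Y = y[p:]
        X_restricted = lagged(y, p)                      # own lags of y
        if z is not None:                                # conditioning series
            X_restricted = np.hstack([X_restricted, lagged(z, p)])
        X_full = np.hstack([X_restricted, lagged(x, p)]) # add lags of x

        def rss(X):
            X1 = np.column_stack([np.ones(len(Y)), X])
            beta, *_ = np.linalg.lstsq(X1, Y, rcond=None)
            resid = Y - X1 @ beta
            return resid @ resid, X1.shape[1]

        rss_r, _ = rss(X_restricted)
        rss_f, k_full = rss(X_full)
        df1, df2 = p, len(Y) - k_full
        F = ((rss_r - rss_f) / df1) / (rss_f / df2)
        return F, 1.0 - stats.f.cdf(F, df1, df2)

    # Synthetic example in which x drives y with one period of delay:
    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    y = np.r_[0.0, 0.6 * x[:-1]] + 0.1 * rng.normal(size=500)
    print(granger_f(y, x, p=2))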
Cross Flow Parameter Calculation for Aerodynamic Analysis
NASA Technical Reports Server (NTRS)
Norman, David, Jr. (Inventor)
2014-01-01
A system and method for determining a cross flow angle for a feature on a structure. A processor unit receives location information identifying a location of the feature on the structure, determines an angle of the feature, identifies flow information for the location, determines a flow angle using the flow information, and determines the cross flow angle for the feature using the flow angle and the angle of the feature. The flow information describes a flow of fluid across the structure. The flow angle comprises an angle of the flow of fluid across the structure for the location of the feature.
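The patent abstract does not give the formula; assuming the cross flow angle is simply the wrapped angular offset between the local flow direction and the feature orientation, a minimal sketch might look like this.

    # Hedged sketch: the cross flow angle taken as the signed offset between the
    # flow direction and the feature orientation (an assumption; the abstract
    # does not state the exact relation used).
    def cross_flow_angle(feature_angle_deg: float, flow_angle_deg: float) -> float:
        """Signed difference between flow and feature angles, wrapped to (-180, 180]."""
        diff = (flow_angle_deg - feature_angle_deg) % 360.0
        return diff - 360.0 if diff > 180.0 else diff

    # A fastener oriented at 10 degrees in a local flow running at 35 degrees:
    print(cross_flow_angle(10.0, 35.0))  # 25.0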
Information-Systems Data-Flow Diagram
NASA Technical Reports Server (NTRS)
Blosiu, J. O.
1983-01-01
Single form presents clear picture of entire system. Form giving relational review of data flow well suited to information system planning, analysis, engineering, and management. Used to review data flow for developing system or one already in use.
Information Flow Analysis of Level 4 Payload Processing Operations
NASA Technical Reports Server (NTRS)
Danz, Mary E.
1991-01-01
The Level 4 Mission Sequence Test (MST) was studied to develop strategies and recommendations to facilitate information flow. Recommendations developed as a result of this study include revised format of the Test and Assembly Procedure (TAP) document and a conceptualized software based system to assist in the management of information flow during the MST.
Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling
NASA Astrophysics Data System (ADS)
Liu, D.; Guo, S.; Lian, Y.
2014-12-01
Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, have been modified by climate change and human activities, and conventional frequency analysis that ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, and an abrupt change point in 1936 was identified by the Pettitt test. Climate-informed low flow frequency analysis and the divided-and-combined method were employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which accounts for the dynamic behavior relevant to reservoir management and provides more accurate and reliable designs for infrastructure and water supply.
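A stationary, simplified version of the distribution-fitting step (a GEV fit to annual low flows and a 10-percent non-exceedance quantile) can be sketched as below; the paper's climate-informed, nonstationary local-likelihood model is not reproduced, and the data are synthetic.

    # Stationary illustration only: fit a GEV to annual low flows and read off a
    # design quantile. The paper's climate-informed nonstationary model is not
    # reproduced here; the flow values are synthetic stand-ins.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(1)
    annual_low_flows = rng.gamma(shape=8.0, scale=400.0, size=110)  # synthetic, m^3/s

    shape, loc, scale = genextreme.fit(annual_low_flows)
    q10 = genextreme.ppf(0.10, shape, loc=loc, scale=scale)  # 10% non-exceedance low flow
    print(f"GEV shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}; "
          f"10% non-exceedance low flow {q10:.1f} m^3/s")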
Information flow to assess cardiorespiratory interactions in patients on weaning trials.
Vallverdú, M; Tibaduisa, O; Clariá, F; Hoyer, D; Giraldo, B; Benito, S; Caminal, P
2006-01-01
Nonlinear processes of the autonomic nervous system (ANS) can produce breath-to-breath variability in the pattern of breathing. In order to assess these nonlinear processes, nonlinear statistical dependencies between heart rate variability and respiratory pattern variability are analyzed. For this purpose, auto-mutual information and cross-mutual information concepts are applied. This information flow analysis is presented as a short-term nonlinear analysis method to investigate the information flow interactions in patients on weaning trials. 78 patients being weaned from mechanical ventilation were studied: Group A comprised 28 patients who failed to maintain spontaneous breathing and were reconnected; Group B comprised 50 patients with successful trials. The results show lower complexity with an increase of information flow in group A compared with group B. Furthermore, a more weakly coupled nonlinear oscillator behavior is observed in the series of group A than in those of group B.
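A plug-in sketch of the auto- and cross-mutual information measures referred to here; the histogram binning, lag, and coupled surrogate series are illustrative choices only, since the abstract does not specify the estimator settings.

    # Histogram (plug-in) estimates of auto- and cross-mutual information between
    # a heart-rate-variability series and a respiratory-pattern series.
    import numpy as np

    def mutual_information(x, y, bins=16):
        """I(X;Y) in nats from a 2-D histogram estimate of the joint distribution."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    def auto_mi(x, lag, bins=16):
        """Auto-mutual information of a series with a lagged copy of itself."""
        return mutual_information(x[:-lag], x[lag:], bins)

    rng = np.random.default_rng(2)
    resp = np.sin(np.linspace(0, 60 * np.pi, 3000)) + 0.2 * rng.normal(size=3000)
    hrv = 0.5 * resp + 0.5 * rng.normal(size=3000)   # coupled surrogate series
    print(auto_mi(resp, lag=10), mutual_information(hrv, resp))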
Medina, K.D.; Tasker, Gary D.
1985-01-01
The surface-water data network in Kansas was analyzed using generalized least squares regression for its effectiveness in providing regional streamflow information. The correlation and time-sampling error of the streamflow characteristic are considered in the generalized least squares method. Unregulated medium-flow, low-flow, and high-flow characteristics were selected to be representative of the regional information that can be obtained from streamflow gaging station records for use in evaluating the effectiveness of continuing the present network stations, discontinuing some stations, and/or adding new stations. The analysis used streamflow records for all currently operated stations that were not affected by regulation and for discontinued stations for which unregulated flow characteristics, as well as physical and climatic characteristics, were available. The State was divided into three network areas, western, northeastern, and southeastern Kansas, and an analysis was made for the three streamflow characteristics in each area, using three planning horizons. The analysis showed that the maximum reduction of sampling mean square error for each cost level could be obtained by adding new stations and discontinuing some of the present network stations. Large reductions in sampling mean square error for low-flow information could be accomplished in all three network areas, with western Kansas having the most dramatic reduction. The addition of new stations would be most beneficial for mean-flow information in western Kansas, and to lesser degrees in the other two areas. The reduction of sampling mean square error for high-flow information would benefit most from the addition of new stations in western Kansas, with the effect diminishing in the other two areas. Southeastern Kansas showed the smallest error reduction in high-flow information. A comparison among all three network areas indicated that funding resources could be most effectively used by discontinuing more stations in northeastern and southeastern Kansas and establishing more new stations in western Kansas. (Author's abstract)
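The generalized-least-squares estimator behind this kind of network analysis weights each station by an error covariance matrix that mixes model error with the time-sampling error and cross-correlation of the records; a minimal sketch with a synthetic covariance matrix is shown below.

    # Sketch of the GLS estimator: beta = (X' V^-1 X)^-1 X' V^-1 y, where V blends
    # model error with time-sampling error and cross-correlation. The covariance
    # matrix and regressors below are synthetic, not the Kansas network's.
    import numpy as np

    def gls(X, y, V):
        Vinv = np.linalg.inv(V)
        XtVi = X.T @ Vinv
        beta = np.linalg.solve(XtVi @ X, XtVi @ y)
        cov_beta = np.linalg.inv(XtVi @ X)   # sampling covariance of the estimates
        return beta, cov_beta

    rng = np.random.default_rng(3)
    n = 30                                                   # gaged basins
    X = np.column_stack([np.ones(n), rng.normal(size=n)])    # intercept + log drainage area
    V = 0.05 * np.eye(n) + 0.006 * np.ones((n, n))           # error + common correlation
    y = X @ np.array([1.0, 0.8]) + rng.multivariate_normal(np.zeros(n), V)
    print(gls(X, y, V)[0])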
Low-flow characteristics for selected streams in Indiana
Fowler, Kathleen K.; Wilson, John T.
2015-01-01
The management and availability of Indiana’s water resources increase in importance every year. Specifically, information on low-flow characteristics of streams is essential to State water-management agencies. These agencies need low-flow information when working with issues related to irrigation, municipal and industrial water supplies, fish and wildlife protection, and the dilution of waste. Industrial, municipal, and other facilities must obtain National Pollutant Discharge Elimination System (NPDES) permits if their discharges go directly to surface waters. The Indiana Department of Environmental Management (IDEM) requires low-flow statistics in order to administer the NPDES permit program. Low-flow-frequency characteristics were computed for 272 continuous-record stations. The information includes low-flow-frequency analysis, flow-duration analysis, and harmonic mean for the continuous-record stations. For those stations affected by some form of regulation, low-flow frequency curves are based on the longest period of homogeneous record under current conditions. Low-flow-frequency values and harmonic mean flow (if sufficient data were available) were estimated for the 166 partial-record stations. Partial-record stations are ungaged sites where streamflow measurements were made at base flow.
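For a concrete idea of one common low-flow-frequency statistic, the sketch below computes annual 7-day minimum flows from a synthetic daily record and fits a lognormal distribution to estimate a 7Q10-style value; the choice of statistic and distribution is an assumption, not necessarily the procedure used in the Indiana report.

    # Illustrative 7Q10-style computation: annual minima of the 7-day moving mean,
    # then a simple lognormal fit for the 10-percent non-exceedance quantile.
    import numpy as np
    import pandas as pd
    from scipy.stats import lognorm

    dates = pd.date_range("1985-10-01", "2014-09-30", freq="D")
    rng = np.random.default_rng(4)
    flow = pd.Series(np.exp(rng.normal(3.0, 0.6, len(dates))), index=dates)  # synthetic daily cfs

    seven_day = flow.rolling(7).mean()
    annual_min = seven_day.groupby(seven_day.index.year).min().dropna()

    s, loc, scale = lognorm.fit(annual_min, floc=0)
    q7_10 = lognorm.ppf(0.10, s, loc=loc, scale=scale)
    print(f"7Q10 estimate: {q7_10:.1f} cfs from {len(annual_min)} annual minima")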
Flow Analysis Tool White Paper
NASA Technical Reports Server (NTRS)
Boscia, Nichole K.
2012-01-01
Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.
NASA Astrophysics Data System (ADS)
Lu, Mujie; Shang, Wenjie; Ji, Xinkai; Hua, Mingzhuang; Cheng, Kuo
2015-12-01
Nowadays, intelligent transportation systems (ITS) have become the new direction of transportation development. Traffic data, as a fundamental part of intelligent transportation systems, have an increasingly crucial status. In recent years, video observation technology has been widely used in the field of traffic information collection. Traffic flow information contained in video data has many advantages: it is comprehensive and can be stored for a long time. However, there are still many problems, such as low precision and high cost, in the process of collecting this information. Aiming at these problems, this paper proposes a broadly applicable traffic target detection method. Based on three different ways of acquiring video data (aerial photography, fixed cameras, and handheld cameras), we develop intelligent analysis software that can be used to extract the macroscopic and microscopic traffic flow information in the video, and this information can be used for traffic analysis and transportation planning. For road intersections, the system uses the frame-difference method to extract traffic information; for freeway sections, the system uses the optical-flow method to track vehicles. The system was applied in Nanjing, Jiangsu province, and the application shows that the system extracts different types of traffic flow information with high accuracy; it can meet the needs of traffic engineering observations and has good application prospects.
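A hedged sketch of the frame-difference step described for intersections, using OpenCV: difference consecutive grayscale frames, threshold, and keep large blobs as moving-vehicle candidates. The file name and parameter values are illustrative assumptions, not the authors' settings.

    # Frame-difference moving-object detection sketch (OpenCV 4.x API).
    import cv2

    cap = cv2.VideoCapture("intersection.mp4")          # assumed input file name
    ok, prev = cap.read()
    if not ok:
        raise SystemExit("could not open video")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        mask = cv2.dilate(mask, None, iterations=2)      # close small gaps
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        vehicles = [c for c in contours if cv2.contourArea(c) > 500]
        print(f"moving-object candidates: {len(vehicles)}")
        prev_gray = gray

    cap.release()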
Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S
2018-03-01
Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset, apply our technique to the data, and compare the derived trajectories with the originals. Finally, we present spatiotemporal trend analysis for statistical datasets including Twitter data, maritime search and rescue events, and syndromic surveillance.
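The core idea (estimate event density over space at successive time steps and derive flow vectors from the change in density) can be sketched as follows; the paper's specific gravity-model weighting is omitted and the events are synthetic.

    # Sketch: kernel density of events at two time steps, flow field from the
    # spatial gradient of the density change. Simplification of the paper's model.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(5)
    events_t0 = rng.normal([0.0, 0.0], 0.5, size=(400, 2))      # lon, lat at time t
    events_t1 = rng.normal([0.3, 0.1], 0.5, size=(400, 2))      # events drifted north-east

    xx, yy = np.meshgrid(np.linspace(-2, 2, 60), np.linspace(-2, 2, 60))
    grid = np.vstack([xx.ravel(), yy.ravel()])
    d0 = gaussian_kde(events_t0.T)(grid).reshape(xx.shape)
    d1 = gaussian_kde(events_t1.T)(grid).reshape(xx.shape)

    delta = d1 - d0
    gy, gx = np.gradient(delta)            # flow vectors point toward density gain
    print("mean flow direction (deg):", np.degrees(np.arctan2(gy.mean(), gx.mean())))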
Survey of Human Systems Integration (HSI) Tools for USCG Acquisitions
2009-04-01
an IMPRINT HPM. IMPRINT uses task network modeling to represent human performance. As the name implies, task networks use a flowchart-type format... tools; and built-in tutoring support for beginners. A perceptual/motor layer extending ACT-R's theory of cognition to perception and action is also... chisystems.com B.8 Information and Functional Flow Analysis Description: In information flow analysis, a flowchart of the information and decisions
Medina, K.D.; Tasker, Gary D.
1987-01-01
This report documents the results of an analysis of the surface-water data network in Kansas for its effectiveness in providing regional streamflow information. The network was analyzed using generalized least squares regression. The correlation and time-sampling error of the streamflow characteristic are considered in the generalized least squares method. Unregulated medium-, low-, and high-flow characteristics were selected to be representative of the regional information that can be obtained from streamflow-gaging-station records for use in evaluating the effectiveness of continuing the present network stations, discontinuing some stations, and (or) adding new stations. The analysis used streamflow records for all currently operated stations that were not affected by regulation and for discontinued stations for which unregulated flow characteristics, as well as physical and climatic characteristics, were available. The State was divided into three network areas, western, northeastern, and southeastern Kansas, and analysis was made for the three streamflow characteristics in each area, using three planning horizons. The analysis showed that the maximum reduction of sampling mean-square error for each cost level could be obtained by adding new stations and discontinuing some current network stations. Large reductions in sampling mean-square error for low-flow information could be achieved in all three network areas, the reduction in western Kansas being the most dramatic. The addition of new stations would be most beneficial for mean-flow information in western Kansas. The reduction of sampling mean-square error for high-flow information would benefit most from the addition of new stations in western Kansas. Southeastern Kansas showed the smallest error reduction in high-flow information. A comparison among all three network areas indicated that funding resources could be most effectively used by discontinuing more stations in northeastern and southeastern Kansas and establishing more new stations in western Kansas.
Information Flow in the Launch Vehicle Design/Analysis Process
NASA Technical Reports Server (NTRS)
Humphries, W. R., Sr.; Holland, W.; Bishop, R.
1999-01-01
This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.
Static Analysis of Mobile Programs
2017-02-01
information flow analysis has the potential to significantly aid human auditors, but it is handicapped by high false positive rates. Instead, auditors... presents these specifications to a human auditor for validation. We have implemented this framework for a taint analysis of Android apps that relies on... of queries to a human auditor. 6.4 Inferring Library Information Flow Specifications Using Dynamic Analysis: In [15], we present a technique to mine
Rényi’s information transfer between financial time series
NASA Astrophysics Data System (ADS)
Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad
2012-05-01
In this paper, we quantify the statistical coherence between financial time series by means of the Rényi entropy. With the help of Campbell’s coding theorem, we show that the Rényi entropy selectively emphasizes only certain sectors of the underlying empirical distribution while strongly suppressing others. This accentuation is controlled with Rényi’s parameter q. To tackle the issue of the information flow between time series, we formulate the concept of Rényi’s transfer entropy as a measure of information that is transferred only between certain parts of underlying distributions. This is particularly pertinent in financial time series, where the knowledge of marginal events such as spikes or sudden jumps is of crucial importance. We apply the Rényian information flow to stock market time series from 11 world stock indices as sampled at a daily rate in the time period 02.01.1990-31.12.2009. Corresponding heat maps and net information flows are represented graphically. A detailed discussion of the transfer entropy between the DAX and S&P500 indices based on minute tick data gathered in the period 02.04.2008-11.09.2009 is also provided. Our analysis shows that the bivariate information flow between world markets is strongly asymmetric with a distinct information surplus flowing from the Asia-Pacific region to both European and US markets. An important yet less dramatic excess of information also flows from Europe to the US. This is seen particularly clearly in a careful analysis of the Rényi information flow between the DAX and S&P500 indices.
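The building block of the measure, the Rényi entropy H_q = (1/(1-q)) log2(sum_i p_i^q), can be illustrated on a histogram of heavy-tailed returns; the full Rényi transfer entropy with escort distributions, as formulated in the paper, is not reproduced in this short sketch.

    # Renyi entropy of a returns histogram; q < 1 gives more weight to rare
    # (tail) events, q > 1 to the frequent central part of the distribution.
    import numpy as np

    def renyi_entropy(samples, q=0.8, bins=30):
        p, _ = np.histogram(samples, bins=bins)
        p = p[p > 0] / p.sum()
        if np.isclose(q, 1.0):
            return float(-np.sum(p * np.log2(p)))       # Shannon limit
        return float(np.log2(np.sum(p ** q)) / (1.0 - q))

    rng = np.random.default_rng(6)
    returns = rng.standard_t(df=3, size=2000) * 0.01     # heavy-tailed daily returns
    for q in (0.5, 0.8, 1.0, 1.5):
        print(q, renyi_entropy(returns, q))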
Information Flow through a Model of the C. elegans Klinotaxis Circuit
Izquierdo, Eduardo J.; Williams, Paul L.; Beer, Randall D.
2015-01-01
Understanding how information about external stimuli is transformed into behavior is one of the central goals of neuroscience. Here we characterize the information flow through a complete sensorimotor circuit: from stimulus, to sensory neurons, to interneurons, to motor neurons, to muscles, to motion. Specifically, we apply a recently developed framework for quantifying information flow to a previously published ensemble of models of salt klinotaxis in the nematode worm Caenorhabditis elegans. Despite large variations in the neural parameters of individual circuits, we found that the overall information flow architecture of the circuit is remarkably consistent across the ensemble. This suggests that structural connectivity is not necessarily predictive of effective connectivity. It also suggests that information flow analysis captures general principles of operation for the klinotaxis circuit. In addition, information flow analysis reveals several key principles underlying how the models operate: (1) Interneuron class AIY is responsible for integrating information about positive and negative changes in concentration, and exhibits a strong left/right information asymmetry. (2) Gap junctions play a crucial role in the transfer of information responsible for the information symmetry observed in interneuron class AIZ. (3) Neck motor neuron class SMB implements an information gating mechanism that underlies the circuit’s state-dependent response. (4) The neck carries more information about small changes in concentration than about large ones, and more information about positive changes in concentration than about negative ones. Thus, not all directions of movement are equally informative for the worm. Each of these findings corresponds to hypotheses that could potentially be tested in the worm. Knowing the results of these experiments would greatly refine our understanding of the neural circuit underlying klinotaxis. PMID:26465883
Nakagawa, Hiroko; Yuno, Tomoji; Itho, Kiichi
2009-03-01
Recently, a specific detection method for bacteria by flow cytometry using nucleic acid staining was developed as a function of an automated urine formed-elements analyzer for routine urine testing. Here, we performed a basic study of this bacteria analysis method. In addition, we compared it with urine sediment analysis, urine Gram staining, and quantitative urine culture, the conventional methods performed up to now. As a result, the bacteria analysis with the flow cytometry method using nucleic acid staining showed excellent reproducibility and higher sensitivity compared with microscopic urinary sediment analysis. Based on a ROC curve analysis with the urine culture method as the reference standard, a cut-off level of 120/microL was defined, giving a sensitivity of 85.7% and a specificity of 88.2%. In the scattergram analysis, referenced against the urine culture method, 80% of the dots in 90% of the rod-positive samples appeared in the area within 30 degrees of the X axis. In addition, one case even indicated that analysis of bacteria by flow cytometry and a time-series scattergram analysis might be helpful in tracing the progress of the causative bacteria, so this information is expected to be clinically significant. Reporting bacteria information with the nucleic acid staining flow cytometry method is expected to contribute to rapid diagnosis and treatment of urinary tract infections, and its contribution to screening examinations in microbiology and clinical chemistry will deliver a more efficient approach to urine analysis.
Information flow dynamics in the brain
NASA Astrophysics Data System (ADS)
Rabinovich, Mikhail I.; Afraimovich, Valentin S.; Bick, Christian; Varona, Pablo
2012-03-01
The timing and dynamics of information in the brain is a hot field in modern neuroscience. The analysis of the temporal evolution of brain information is crucially important for the understanding of higher cognitive mechanisms in normal and pathological states. From the perspective of information dynamics, in this review we discuss working memory capacity, language dynamics, goal-dependent behavior programming and other functions of brain activity. In contrast with the classical description of information theory, which is mostly algebraic, brain information flow dynamics deals with problems such as the stability/instability of information flows, their quality, the timing of sequential processing, the top-down cognitive control of perceptual information, and information creation. In this framework, different types of information flow instabilities correspond to different cognitive disorders. On the other hand, the robustness of cognitive activity is related to the control of the information flow stability. We discuss these problems using both experimental and theoretical approaches, and we argue that brain activity is better understood by considering information flows in the phase space of the corresponding dynamical model. In particular, we show how theory helps to understand intriguing experimental results in this matter, and how recent knowledge inspires new theoretical formalisms that can be tested with modern experimental techniques.
Content analysis in information flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grusho, Alexander A.; Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow; Grusho, Nick A.
The paper deals with the architecture of a content recognition system. To analyze the problem, a stochastic model of content recognition in information flows was built. We proved that under certain conditions it is possible to correctly solve a part of the problem with probability 1 by viewing a finite section of the information flow. This means that a good architecture consists of two steps: the first step correctly determines certain subsets of contents, while the second step may demand much more time to reach a true decision.
Improved analysis of transient temperature data from permanent down-hole gauges (PDGs)
NASA Astrophysics Data System (ADS)
Zhang, Yiqun; Zheng, Shiyi; Wang, Qi
2017-08-01
With the installation of permanent down-hole gauges (PDGs) during oil field development, large volumes of high-resolution, continuous down-hole information are obtainable. The interpretation of these real-time temperature and pressure data can optimize well performance, provide information about the reservoir, and continuously calibrate the reservoir model. Although dynamic temperature data have been interpreted in practice to predict flow profiles and provide characteristic information about the reservoir, almost all of the approaches rely on established non-isothermal models which depend on thermodynamic parameters. Another problem is that temperature transient analysis (TTA) is underutilized compared with pressure transient analysis (PTA). In this study, several model-independent methods of TTA were applied. The entire set of PDG data consists of many flow events. By utilizing the wavelet transform, the exact points of flow-rate changes can be located. Within every transient period with constant flow rate, the flow regime changes, for example, from early-time linear flow to later-time pseudo-radial flow. For the early time region (ETR) that is caused by flow-rate change operations, TTA along with PTA can greatly reduce the uncertainties in flow regime diagnosis. The temperature variations during the ETR were then examined to infer the true reservoir temperature history, and the relationships between the wavelet detail coefficients and the flow-rate changes were analysed. For scenarios with constant reservoir-well parameters, the detailed flow-rate history can be generated by calculating the coefficient of relationship in advance. For later times, the flow regime changes to pseudo-radial flow, and an analytical solution was introduced to describe the sand-face temperature. The formation parameters, such as permeability and skin factor, were estimated with the previously calculated flow rate. It is necessary to analyse temperature variation to overcome data limitation problems when information from other down-hole tools (e.g. expensive but unstable flow meters) is insufficient. This study demonstrates success in wellbore-storage regime diagnosis, flow-rate history reconstruction, and formation parameter estimation using transient temperature data.
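A hedged sketch of the first step described, locating flow-rate change points from wavelet detail coefficients of a long PDG record; the wavelet family, decomposition level, and threshold are illustrative choices and the record is synthetic.

    # Flag abrupt flow-rate change points via stationary wavelet detail coefficients.
    import numpy as np
    import pywt

    rng = np.random.default_rng(7)
    t = np.arange(20000)
    temperature = 80.0 + 0.002 * t + 0.05 * rng.normal(size=t.size)  # drifting record
    temperature[8000:] += 1.5          # step caused by a flow-rate change
    temperature[15000:] -= 0.8         # second operational change

    coeffs = pywt.swt(temperature, "haar", level=1)
    detail = coeffs[0][1]                                  # shift-invariant detail coefficients
    threshold = 8 * np.median(np.abs(detail)) / 0.6745     # robust noise-scaled threshold
    breaks = np.where(np.abs(detail) > threshold)[0]
    print("candidate flow-rate change points near samples:", breaks[:10])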
Describing and Modeling Workflow and Information Flow in Chronic Disease Care
Unertl, Kim M.; Weinger, Matthew B.; Johnson, Kevin B.; Lorenzi, Nancy M.
2009-01-01
Objectives The goal of the study was to develop an in-depth understanding of work practices, workflow, and information flow in chronic disease care, to facilitate development of context-appropriate informatics tools. Design The study was conducted over a 10-month period in three ambulatory clinics providing chronic disease care. The authors iteratively collected data using direct observation and semi-structured interviews. Measurements The authors observed all aspects of care in three different chronic disease clinics for over 150 hours, including 157 patient-provider interactions. Observation focused on interactions among people, processes, and technology. Observation data were analyzed through an open coding approach. The authors then developed models of workflow and information flow using Hierarchical Task Analysis and Soft Systems Methodology. The authors also conducted nine semi-structured interviews to confirm and refine the models. Results The study had three primary outcomes: models of workflow for each clinic, models of information flow for each clinic, and an in-depth description of work practices and the role of health information technology (HIT) in the clinics. The authors identified gaps between the existing HIT functionality and the needs of chronic disease providers. Conclusions In response to the analysis of workflow and information flow, the authors developed ten guidelines for design of HIT to support chronic disease care, including recommendations to pursue modular approaches to design that would support disease-specific needs. The study demonstrates the importance of evaluating workflow and information flow in HIT design and implementation. PMID:19717802
Chamberlin, Kent; Smith, Wayne; Chirgwin, Christopher; Appasani, Seshank; Rioux, Paul
2014-12-01
The purpose of this study was to investigate "earthing" from an electrical perspective through measurement and analysis of the naturally occurring electron flow between the human body or a control and ground as this relates to the magnitude of the charge exchange, the relationship between the charge exchange and body functions (respiration and heart rate), and the detection of other information that might be contained in the charge exchange. Sensitive, low-noise instrumentation was designed and fabricated to measure low-level current flow at low frequencies. This instrumentation was used to record current flow between human subjects or a control and ground, and these measurements were performed approximately 40 times under varied circumstances. The results of these measurements were analyzed to determine if information was contained in the current exchange. The currents flowing between the human body and ground were small (nanoamperes), and they correlated with subject motion. There did not appear to be any information contained in this exchange except for information about subject motion. This study showed that currents flow between the environment (earth) and a grounded human body; however, these currents are small (nanoamperes) and do not appear to contain information other than information about subject motion.
The Tacitness of Tacitus. A Methodological Approach to European Thought. No. 46.
ERIC Educational Resources Information Center
Bierschenk, Bernhard
This study addressed the analysis of verbal flows by means of volume-elasticity measures and the analysis of information flow structures and their representation in the form of a metaphysical cube. A special-purpose system of computer programs (PERTEX) was used to establish the language space in which the textual flow patterns occurred containing…
Sando, Steven K.; McCarthy, Peter M.
2018-05-10
This report documents the methods for peak-flow frequency (hereinafter “frequency”) analysis and reporting for streamgages in and near Montana following implementation of the Bulletin 17C guidelines. The methods are used to provide estimates of peak-flow quantiles for 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities for selected streamgages operated by the U.S. Geological Survey Wyoming-Montana Water Science Center (WY–MT WSC). These annual exceedance probabilities correspond to 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals, respectively. Standard procedures specific to the WY–MT WSC for implementing the Bulletin 17C guidelines include (1) the use of the Expected Moments Algorithm analysis for fitting the log-Pearson Type III distribution, incorporating historical information where applicable; (2) the use of weighted skew coefficients (based on weighting at-site station skew coefficients with generalized skew coefficients from the Bulletin 17B national skew map); and (3) the use of the Multiple Grubbs-Beck Test for identifying potentially influential low flows. For some streamgages, the peak-flow records are not well represented by the standard procedures and require user-specified adjustments informed by hydrologic judgement. The specific characteristics of peak-flow records addressed by the informed-user adjustments include (1) regulated peak-flow records, (2) atypical upper-tail peak-flow records, and (3) atypical lower-tail peak-flow records. In all cases, the informed-user adjustments use the Expected Moments Algorithm fit of the log-Pearson Type III distribution using the at-site station skew coefficient, a manual potentially influential low flow threshold, or both. Appropriate methods can be applied to at-site frequency estimates to provide improved representation of long-term hydroclimatic conditions. The methods for improving at-site frequency estimates by weighting with regional regression equations and by Maintenance of Variance Extension Type III record extension are described. Frequency analyses were conducted for 99 example streamgages to indicate various aspects of the frequency-analysis methods described in this report. The frequency analyses and results for the example streamgages are presented in a separate data release associated with this report consisting of tables and graphical plots that are structured to include information concerning the interpretive decisions involved in the frequency analyses. Further, the separate data release includes the input files to the PeakFQ program, version 7.1, including the peak-flow data file and the analysis specification file that were used in the peak-flow frequency analyses. Peak-flow frequencies are also reported in separate data releases for selected streamgages in the Beaverhead River and Clark Fork Basins and also for selected streamgages in the Ruby, Jefferson, and Madison River Basins.
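For orientation, a simplified log-Pearson Type III fit by method of moments is sketched below; the report's actual procedure uses the Expected Moments Algorithm with weighted skew and the Multiple Grubbs-Beck test, which are not reproduced, and the peak-flow data are synthetic.

    # Simplified LP3 frequency curve: method-of-moments fit in log10 space.
    import numpy as np
    from scipy.stats import pearson3

    rng = np.random.default_rng(8)
    annual_peaks = np.exp(rng.normal(7.0, 0.8, size=85))      # synthetic peak flows, cfs

    logq = np.log10(annual_peaks)
    mean, std = logq.mean(), logq.std(ddof=1)
    n = len(logq)
    skew = (n * np.sum((logq - mean) ** 3)) / ((n - 1) * (n - 2) * std ** 3)  # at-site skew

    for aep in (0.5, 0.1, 0.01):                               # 2-, 10-, 100-year events
        k = pearson3.ppf(1.0 - aep, skew)                      # frequency factor
        print(f"AEP {aep:>5}: {10 ** (mean + k * std):,.0f} cfs")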
Predicting Information Flows in Network Traffic.
ERIC Educational Resources Information Center
Hinich, Melvin J.; Molyneux, Robert E.
2003-01-01
Discusses information flow in networks and predicting network traffic and describes a study that uses time series analysis on a day's worth of Internet log data. Examines nonlinearity and traffic invariants, and suggests that prediction of network traffic may not be possible with current techniques. (Author/LRW)
Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood
NASA Astrophysics Data System (ADS)
Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver
2016-09-01
Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.
Unsteady RANS/DES analysis of flow around helicopter rotor blades at forward flight conditions
NASA Astrophysics Data System (ADS)
Zhang, Zhenyu; Qian, Yaoru
2018-05-01
In this paper, the complex flows around forward-flying helicopter blades are numerically investigated. Both the Reynolds-averaged Navier-Stokes (RANS) and Detached Eddy Simulation (DES) methods are used to analyze characteristics such as local dynamic flow separation, effects of radial sweeping, and reversed flow. The flow was solved by a highly efficient finite volume solver with multi-block structured grids. Focusing on the complexity of advance-ratio effects, the above properties are fully characterized. The current results show good agreement between the RANS and DES methods during phases with attached flow. Detailed information on separating flow near the retreating phases is given by the DES results. The flow analysis of these blades under reversed flow reveals a significant interaction between the reversed flow and span-wise sweeping.
Determination of Reaction Stoichiometries by Flow Injection Analysis.
ERIC Educational Resources Information Center
Rios, Angel; And Others
1986-01-01
Describes a method of flow injection analysis intended for calculation of complex-formation and redox reaction stoichiometries based on a closed-loop configuration. The technique is suitable for use in undergraduate laboratories. Information is provided for equipment, materials, procedures, and sample results. (JM)
On the Imbalance of International Communication: An Analysis, a Review and Some Solutions.
ERIC Educational Resources Information Center
Hsia, H. J.
Current international communication is typified by flow of information from the northern to the southern hemisphere, dominated by the developed nations in information gathering and dissemination, and intensified by technological advances. The imbalance of communication flow, considered by developing nations as responsible for political, economic,…
Relative Influence of Professional Counseling Journals
ERIC Educational Resources Information Center
Fernando, Delini M.; Barrio Minton, Casey A.
2011-01-01
The authors used social network analysis of citation data to study the flow of information and relative influence of 17 professional counseling journals. Although the "Journal of Counseling & Development" ranked very highly in all measures of journal influence, several division journals emerged as key players in the flow of information within the…
Effects of background stimulation upon eye-movement information.
Nakamura, S
1996-04-01
To investigate the effects of background stimulation upon eye-movement information (EMI), the perceived deceleration of the target motion during pursuit eye movement (Aubert-Fleischl paradox) was analyzed. In the experiment, a striped pattern was used as a background stimulus, with various brightness contrasts and spatial frequencies serving to serially manipulate the attributes of the background stimulus. Analysis showed that the retinal-image motion of the background stimulus (optic flow) affected eye-movement information and that the effects of optic flow became stronger when high-contrast, low-spatial-frequency stripes were presented as the background stimulus. In conclusion, optic flow is one source of eye-movement information in determining real object motion, and the effectiveness of optic flow depends on the attributes of the background stimulus.
Analysis of internal flow characteristics of a smooth-disk water-brake dynamometer
NASA Technical Reports Server (NTRS)
Evans, D. G.
1973-01-01
The principle of absorbing power with an enclosed, partially submerged rotating disk through the turbulent viscous shearing of water is discussed. Reference information is used to develop a flow model of the water brake. A method is then presented that uses vector diagrams to relate the effects of rotational flow, through-flow, and secondary flow to power absorption. The method is used to describe the operating characteristics of an example 111-cm (43.7-in.) diameter water brake. Correlating performance parameters are developed in a dimensional analysis.
DOT National Transportation Integrated Search
2014-12-01
The report documents policy considerations for the Intelligent Network Flow Optimization (INFLO) connected vehicle applications bundle. INFLO aims to optimize network flow on freeways and arterials by informing motorists of existing and impendi...
Meridional Flow Measurements: Comparisons Between Ring Diagram Analysis and Fourier-Hankel Analysis
NASA Astrophysics Data System (ADS)
Zaatri, A.; Roth, M.
2008-09-01
The meridional circulation is a weak flow with an amplitude of the order of 10 m/s at the solar surface. As this flow could be responsible for the transport of magnetic flux during the solar cycle, it has become a crucial ingredient in some dynamo models. However, little is known about the overall structure of the meridional circulation. Helioseismology is able to provide information on the structure of this flow in the solar interior. One widely used helioseismic technique for measuring frequency shifts due to horizontal flows in the subsurface layers of the Sun is ring diagram analysis (Corbard et al. 2003). It is based on the analysis of frequency shifts in the solar oscillation power spectrum as a function of the orientation of the wave vector, which in turn allows conclusions to be drawn about the strength of the meridional flow. Ring diagram analysis is currently limited to the analysis of the wave field in only a small region of the solar surface; consequently, information on the solar interior can only be inferred down to a depth of about 16 Mm. Another helioseismic method that promises to estimate the meridional flow strength down to greater depths is Fourier-Hankel analysis (Krieger et al. 2007). This technique is based on a decomposition of the wave field into poleward and equatorward propagating waves; a frequency shift between them is then due to the meridional flow. This motivated us to carry out a comparative study between the two techniques for measuring the meridional flow. We investigate the degree of coherence between the two methods by analyzing the same data sets recorded by the SOHO-MDI and GONG instruments.
2010-01-01
Thermodynamic Analysis and Optimization Based on Exergy Flow for a Two-Staged Pulse Tube Refrigerator. A. Razani, T. Fraser, C. Dodson, and T. Roberts. AIP Conference Proceedings.
Tan, Chao; Zhao, Jia; Dong, Feng
2015-03-01
Flow behavior characterization is important for understanding gas-liquid two-phase flow mechanics and for further establishing its description model. Electrical Resistance Tomography (ERT) provides information regarding flow conditions in the different directions where the sensing electrodes are implemented. We extracted the multivariate sample entropy (MSampEn) by treating the ERT data as a multivariate time series. The dynamic experimental results indicate that MSampEn is sensitive to changes in the complexity of flow patterns including bubbly flow, stratified flow, plug flow, and slug flow. MSampEn can characterize the flow behavior in different directions of two-phase flow and reveal the transition between flow patterns when the flow velocity changes. The proposed method is effective for analyzing two-phase flow pattern transitions by incorporating information from different scales and different spatial directions.
Cooley, Richard L.
1982-01-01
Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.
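The idea of blending observations with uncertain prior parameter estimates can be illustrated with a penalized least-squares estimator that shrinks toward the prior values; this is a sketch of the general idea only, not Cooley's formulation or the optimization of its two auxiliary parameters, and all data below are synthetic.

    # Ridge-toward-prior least squares: minimize ||y - Xb||^2/obs_var + ||b - b_prior||^2/prior_var.
    import numpy as np

    def regression_with_prior(X, y, b_prior, obs_var=1.0, prior_var=10.0):
        """Closed-form estimate that shrinks toward b_prior as prior_var decreases."""
        A = X.T @ X / obs_var + np.eye(X.shape[1]) / prior_var
        return np.linalg.solve(A, X.T @ y / obs_var + b_prior / prior_var)

    rng = np.random.default_rng(9)
    X = rng.normal(size=(40, 3))                 # sensitivities of observed heads to parameters
    true_b = np.array([2.0, -1.0, 0.5])          # e.g. log-transmissivity zone parameters
    y = X @ true_b + 0.3 * rng.normal(size=40)
    b_prior = np.array([1.5, -0.5, 0.0])         # best available prior estimates

    print(regression_with_prior(X, y, b_prior, obs_var=0.09, prior_var=0.5))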
#FluxFlow: Visual Analysis of Anomalous Information Spreading on Social Media.
Zhao, Jian; Cao, Nan; Wen, Zhen; Song, Yale; Lin, Yu-Ru; Collins, Christopher
2014-12-01
We present FluxFlow, an interactive visual analysis system for revealing and analyzing anomalous information spreading in social media. Every day, millions of messages are created, commented on, and shared by people on social media websites, such as Twitter and Facebook. This provides valuable data for researchers and practitioners in many application domains, such as marketing, to inform decision-making. Distilling valuable social signals from the huge crowd's messages, however, is challenging, due to the heterogeneous and dynamic crowd behaviors. The challenge is rooted in data analysts' capability of discerning the anomalous information behaviors, such as the spreading of rumors or misinformation, from more conventional patterns, such as popular topics and newsworthy events, in a timely fashion. FluxFlow incorporates advanced machine learning algorithms to detect anomalies, and offers a set of novel visualization designs for presenting the detected threads for deeper analysis. We evaluated FluxFlow with real datasets containing the Twitter feeds captured during significant events such as Hurricane Sandy. Through quantitative measurements of the algorithmic performance and qualitative interviews with domain experts, the results show that the back-end anomaly detection model is effective in identifying anomalous retweeting threads, and its front-end interactive visualizations are intuitive and useful for analysts to discover insights in data and comprehend the underlying analytical model.
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
In this study, we introduced an alternative approach for analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference with conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km2 to 238 km2 in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, when given flow amounts are accumulated in shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
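A minimal sketch of the inter-amount-time sampling itself: accumulate flow volume and record the time needed for each successive fixed amount to pass. The flow series and the amount increment below are illustrative values, not those used for the North Carolina basins.

    # Convert a fixed-time-step flow series into inter-amount times (IATs).
    import numpy as np

    def inter_amount_times(flow, dt, amount):
        """Times (same unit as dt) to accumulate each successive `amount` of flow."""
        volume = np.cumsum(flow) * dt                      # cumulative volume
        targets = np.arange(amount, volume[-1], amount)    # fixed-amount sampling levels
        t = np.arange(len(flow)) * dt
        crossing_times = np.interp(targets, volume, t)     # time each level is reached
        return np.diff(np.concatenate(([0.0], crossing_times)))

    rng = np.random.default_rng(10)
    flow = np.clip(1.0 + 5.0 * rng.pareto(3.0, size=5000), 0.1, None)  # flashy discharge, m^3/s
    iats = inter_amount_times(flow, dt=900.0, amount=5.0e4)            # 15-min data, 50,000 m^3 steps
    # Short IATs mark high-flow periods; long IATs mark low-flow periods.
    print(f"{len(iats)} inter-amount times; median {np.median(iats):.0f} s, min {iats.min():.0f} s")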
Global Qualitative Flow-Path Modeling for Local State Determination in Simulation and Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Fleming, Land D. (Inventor)
1998-01-01
For qualitative modeling and analysis, a general qualitative abstraction of power transmission variables (flow and effort) for elements of flow paths is discussed; it includes information on resistance, net flow, permissible directions of flow, and qualitative potential. Each type of component model has flow-related variables and an associated internal flow map, connected into an overall flow network of the system. For storage devices, the implicit power transfer to the environment is represented by "virtual" circuits that include an environmental junction. A heterogeneous aggregation method simplifies the path structure. A method determines global flow-path changes during dynamic simulation and analysis, and identifies corresponding local flow state changes that are effects of global configuration changes. Flow-path determination is triggered by any change in a flow-related device variable in a simulation or analysis. Components (path elements) that may be affected are identified, and flow-related attributes favoring flow in the two possible directions are collected for each of them. Next, flow-related attributes are determined for each affected path element, based on possibly conflicting indications of flow direction. Spurious qualitative ambiguities are minimized by using relative magnitudes and permissible directions of flow, and by favoring flow sources over effort sources when comparing flow tendencies. The results are output to local flow states of affected components.
Information transmission and signal permutation in active flow networks
NASA Astrophysics Data System (ADS)
Woodhouse, Francis G.; Fawcett, Joanna B.; Dunkel, Jörn
2018-03-01
Recent experiments show that both natural and artificial microswimmers in narrow channel-like geometries will self-organise to form steady, directed flows. This suggests that networks of flowing active matter could function as novel autonomous microfluidic devices. However, little is known about how information propagates through these far-from-equilibrium systems. Through a mathematical analogy with spin-ice vertex models, we investigate here the input–output characteristics of generic incompressible active flow networks (AFNs). Our analysis shows that information transport through an AFN is inherently different from conventional pressure or voltage driven networks. Active flows on hexagonal arrays preserve input information over longer distances than their passive counterparts and are highly sensitive to bulk topological defects, whose presence can be inferred from marginal input–output distributions alone. This sensitivity further allows controlled permutations on parallel inputs, revealing an unexpected link between active matter and group theory that can guide new microfluidic mixing strategies facilitated by active matter and aid the design of generic autonomous information transport networks.
Zattoni, Andrea; Melucci, Dora; Reschiglian, Pierluigi; Sanz, Ramsés; Puignou, Lluís; Galceran, Maria Teresa
2004-10-29
Yeasts are widely used in several areas of the food industry, e.g. baking, beer brewing, and wine production. Interest in new analytical methods for quality control and characterization of yeast cells is thus increasing. The biophysical properties of yeast cells, among which is cell size, are related to yeast cell capabilities to produce primary and secondary metabolites during the fermentation process. Biophysical properties of winemaking yeast strains can be screened by field-flow fractionation (FFF). In this work we present the use of flow FFF (FlFFF) with turbidimetric multi-wavelength detection for the number-size distribution analysis of different commercial winemaking yeast varieties. The use of a diode-array detector allows the recently developed method for number-size (or mass-size) analysis in flow-assisted separation techniques to be applied to dispersed samples such as yeast cells. Results for six commercial winemaking yeast strains are compared with data obtained by a standard method for cell sizing (Coulter counter). The proposed method gives, within a short analysis time, accurate information on the number of cells of a given size, as well as on the total number of cells.
A PROPOSED CHEMICAL INFORMATION AND DATA SYSTEM. VOLUME I.
Descriptors: CHEMICAL COMPOUNDS, DATA PROCESSING, INFORMATION RETRIEVAL, CHEMICAL ANALYSIS, INPUT OUTPUT DEVICES, COMPUTER PROGRAMMING, CLASSIFICATION, CONFIGURATIONS, DATA STORAGE SYSTEMS, ATOMS, MOLECULES, PERFORMANCE (ENGINEERING), MAINTENANCE, SUBJECT INDEXING, MAGNETIC TAPE, AUTOMATIC, MILITARY REQUIREMENTS, TYPEWRITERS, OPTICS, TOPOLOGY, STATISTICAL ANALYSIS, FLOW CHARTING.
Wang, Cuicui; Vieito, João Paulo; Ma, Qingguo
2015-01-01
This investigation is among the first to analyze the neural basis of an investment process using money flow information from the financial market, with a simplified task in which volunteers had to choose to buy or not to buy stocks based on the display of positive or negative money flow information. After choosing “to buy” or “not to buy,” participants were presented with feedback. At the same time, event-related potentials (ERPs) were used to record investors' brain activity and capture the event-related negativity (ERN) and feedback-related negativity (FRN) components. The ERN results suggested that there might be higher risk and more conflict when buying stocks with negative net money flow information than with positive net money flow information, and the inverse was also true for the “not to buy” option. The FRN component evoked by the bad outcome of a decision was more negative than that evoked by the good outcome, reflecting the difference between the values of the actual and expected outcomes. From this research, we can further understand how investors perceive money flow information in the financial market and the neural cognitive effects involved in the investment process. PMID:26557139
2010-03-01
[Front-matter and figure-caption fragments from a report on network security monitoring; recoverable titles include "Employ NetFlow on Edge Router," "Implement an Integrated Vulnerability Assessment," "NetFlow Information on Unauthorized Connections," "Algorithm for Detecting...," and "Information on Traffic Generated by Suspicious Host," with a fragment noting that an attack had been initiated from a suspicious port.]
Information Processing in Living Systems
NASA Astrophysics Data System (ADS)
Tkačik, Gašper; Bialek, William
2016-03-01
Life depends as much on the flow of information as on the flow of energy. Here we review the many efforts to make this intuition precise. Starting with the building blocks of information theory, we explore examples where it has been possible to measure, directly, the flow of information in biological networks, or more generally where information-theoretic ideas have been used to guide the analysis of experiments. Systems of interest range from single molecules (the sequence diversity in families of proteins) to groups of organisms (the distribution of velocities in flocks of birds), and all scales in between. Many of these analyses are motivated by the idea that biological systems may have evolved to optimize the gathering and representation of information, and we review the experimental evidence for this optimization, again across a wide range of scales.
Lei, Shufei; Iles, Alastair; Kelly, Maggi
2015-07-01
Some of the factors that can contribute to the success of collaborative adaptive management--such as social learning, open communication, and trust--are built upon a foundation of the open exchange of information about science and management between participants and the public. Despite the importance of information transparency, the use and flow of information in collaborative adaptive management has not been characterized in detail in the literature, and there currently exist opportunities to develop strategies for increasing the exchange of information, as well as to track information flow in such contexts. As digital information channels and networks have expanded over the last decade, powerful new information monitoring tools have also evolved, allowing for the complete characterization of information products through their production, transport, use, and monitoring. This study uses these tools to investigate the use of various science and management information products in a case study--the Sierra Nevada Adaptive Management Project--using a mixed-method (citation analysis, web analytics, and content analysis) research approach borrowed from the information processing and management field. The results from our case study show that information technologies greatly facilitate the flow and use of digital information, leading to multiparty collaborations such as knowledge transfer and public participation in science research. We conclude with recommendations for expanding information exchange in collaborative adaptive management by taking advantage of available information technologies and networks.
Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark
2015-01-01
This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open-source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to these hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
Simultaneous Neutron and X-ray Tomography for Quantitative analysis of Geological Samples
NASA Astrophysics Data System (ADS)
LaManna, J.; Hussey, D. S.; Baltic, E.; Jacobson, D. L.
2016-12-01
Multiphase flow is a critical area of research for shale gas, oil recovery, underground CO2 sequestration, geothermal power, and aquifer management. It is essential to understand the porous structure of the geological formations in addition to the fluid/pore and fluid/fluid interactions. The main difficulties in analyzing the flow characteristics of rock cores are obtaining 3D information on the fluid distribution and maintaining the cores in a state suitable for other analysis methods. Two powerful non-destructive methods for obtaining 3D structural and compositional information are X-ray and neutron tomography. X-ray tomography produces information on density and structure, while neutrons excel at detecting the liquid phase and produce compositional information. These two methods offer strongly complementary information but are typically conducted at separate times and often at different facilities. This poses issues for obtaining dynamic and stochastic information, as the sample will change between analysis modes. To address this, NIST has developed a system that allows for multimodal, simultaneous tomography using thermal neutrons and X-rays by placing a 90 keVp micro-focus X-ray tube 90° to the neutron beam. High-pressure core holders that simulate underground conditions have been developed to facilitate simultaneous tomography. These cells allow for the control of confining pressure, axial load, temperature, and fluid flow through the core. This talk will give an overview of the simultaneous neutron and X-ray tomography capabilities at NIST, the benefits of multimodal imaging, environmental equipment for geology studies, and several case studies that have been conducted at NIST.
An Approach and Instrumentation for Management System Analysis
1974-10-01
Keywords: Benefit Analysis, Systems Analysis, Manpower Planning, Resource Planning, Information Theory. Abstract (OCR-degraded fragment): "...participants the data necessary to trace both formal and informal information flows and make cost-benefit judgments about specific communications. The ...network within a management structure and to provide a basis for preliminary cost-benefit evaluations. This objective was in response to Phase I of the..."
Understanding virtual water flows: A multiregion input-output case study of Victoria
NASA Astrophysics Data System (ADS)
Lenzen, Manfred
2009-09-01
This article explains and interprets virtual water flows from the well-established perspective of input-output analysis. Using a case study of the Australian state of Victoria, it demonstrates that input-output analysis can enumerate virtual water flows without systematic and unknown truncation errors, an issue which has been largely absent from the virtual water literature. Whereas a simplified flow analysis from a producer perspective would portray Victoria as a net virtual water importer, enumerating the water embodiments across the full supply chain using input-output analysis shows Victoria as a significant net virtual water exporter. This study has succeeded in informing government policy in Australia, which is an encouraging sign that input-output analysis will be able to contribute much value to other national and international applications.
Turbulence measurements using the laser Doppler velocimeter
NASA Technical Reports Server (NTRS)
Dunning, J. W., Jr.; Berman, N. S.
1971-01-01
The photomultiplier signal representing the axial velocity of water within a glass pipe is examined. It is shown that, with proper analysis of the photomultiplier signal, the turbulence information that can be obtained in liquid flows is equivalent to that obtained in recent hot-film studies. In shear flows the signal from the laser Doppler velocimeter contains additional information which may be related to the average shear.
SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series
Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory
2018-03-07
This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
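As a rough illustration of the 7Q10 statistic mentioned above, the sketch below computes annual minima of the 7-day moving-average flow from a daily series and takes the 10-year (0.1 non-exceedance) quantile of a fitted log-normal distribution. The log-normal choice and the synthetic data are assumptions for illustration only; SWToolbox offers several distributions and options not reproduced here.

    # Illustrative 7Q10 estimate from a daily discharge series.
    import numpy as np
    import pandas as pd
    from scipy import stats

    def seven_q_ten(daily_flow: pd.Series) -> float:
        q7 = daily_flow.rolling(7).mean()                    # 7-day moving average
        annual_min = q7.groupby(q7.index.year).min().dropna()
        logs = np.log(annual_min[annual_min > 0])
        mu, sigma = logs.mean(), logs.std(ddof=1)
        # A 10-year recurrence interval for a low flow is the 0.1 non-exceedance quantile.
        return float(np.exp(stats.norm.ppf(0.1, loc=mu, scale=sigma)))

    # Synthetic 30-year record with a seasonal signal plus noise.
    idx = pd.date_range("1990-01-01", "2019-12-31", freq="D")
    flow = pd.Series(50 + 30*np.sin(2*np.pi*idx.dayofyear/365)
                     + np.random.default_rng(0).gamma(2.0, 5.0, len(idx)), index=idx)
    print(round(seven_q_ten(flow), 2))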
RF-photonic chirp encoder and compressor for seamless analysis of information flow.
Zalevsky, Zeev; Shemer, Amir; Zach, Shlomo
2008-05-26
In this paper we realize an RF-photonic chirp compression system that compresses a continuous stream of incoming RF data (modulated on top of an optical carrier) into a train of short temporal pulses. Each pulse in the train can be separated and treated individually while being sampled by a low-rate optical switch and without temporal losses of the incoming flow of information. Each such pulse can be filtered and analyzed differently. The main advantage of the proposed system is its capability to handle, seamlessly, a high-rate information flow with all-optical means and with low-rate optical switches.
Smooth information flow in temperature climate network reflects mass transport
NASA Astrophysics Data System (ADS)
Hlinka, Jaroslav; Jajcay, Nikola; Hartman, David; Paluš, Milan
2017-03-01
A directed climate network is constructed by Granger causality analysis of air temperature time series from a regular grid covering the whole Earth. Using winner-takes-all network thresholding approach, a structure of a smooth information flow is revealed, hidden to previous studies. The relevance of this observation is confirmed by comparison with the air mass transfer defined by the wind field. Their close relation illustrates that although the information transferred due to the causal influence is not a physical quantity, the information transfer is tied to the transfer of mass and energy.
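The pairwise step behind such a directed climate network can be sketched with ordinary least squares: regress each grid-point series on its own past, then on its own past plus the other series' past, and use the log ratio of residual variances as the directed edge weight before winner-takes-all thresholding. The lag order, the "keep the strongest outgoing link per node" rule, and the toy data below are illustrative assumptions, not the settings of the study.

    # Bivariate Granger-causality strength (x -> y) via OLS, then a simple
    # winner-takes-all thresholding over all ordered node pairs.
    import numpy as np

    def granger_strength(x, y, p=3):
        n = len(y)
        Y = y[p:]
        own = np.column_stack([y[p-k:n-k] for k in range(1, p+1)])
        other = np.column_stack([x[p-k:n-k] for k in range(1, p+1)])
        ones = np.ones((n - p, 1))
        restricted = np.column_stack([ones, own])
        full = np.column_stack([ones, own, other])
        res_r = Y - restricted @ np.linalg.lstsq(restricted, Y, rcond=None)[0]
        res_f = Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]
        return float(np.log(res_r.var() / res_f.var()))   # > 0: x's past helps predict y

    def wta_network(series, keep=1):
        """series: (n_nodes, n_times); keep the strongest `keep` outgoing links per node."""
        n = series.shape[0]
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i, j] = granger_strength(series[i], series[j])
        A = np.zeros_like(W)
        for i in range(n):
            A[i, np.argsort(W[i])[-keep:]] = 1.0
        return A

    rng = np.random.default_rng(1)
    data = rng.standard_normal((5, 400))
    data[1, 1:] += 0.6 * data[0, :-1]        # node 0 drives node 1
    print(wta_network(data))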
Perceptual analysis of vibrotactile flows on a mobile device.
Seo, Jongman; Choi, Seungmoon
2013-01-01
"Vibrotactile flow" refers to a continuously moving sensation of vibrotactile stimulation applied by a few actuators directly onto the skin or through a rigid medium. Research demonstrated the effectiveness of vibrotactile flow for conveying intuitive directional information on a mobile device. In this paper, we extend previous research by investigating the perceptual characteristics of vibrotactile flows rendered on a mobile device and proposing a synthesis framework for vibrotactile flows with desired perceptual properties.
A demonstration of the instream flow incremental methodology, Shenandoah River
Zappia, Humbert; Hayes, Donald C.
1998-01-01
Current and projected demands on the water resources of the Shenandoah River have increased concerns for the potential effect of these demands on the natural integrity of the Shenandoah River system. The Instream Flow Incremental Method (IFIM) process attempts to integrate concepts of water-supply planning, analytical hydraulic engineering models, and empirically derived habitat versus flow functions to address water-use and instream-flow issues and questions concerning life-stage specific effects on selected species and the general well being of aquatic biological populations. The demonstration project also sets the stage for the identification and compilation of the major instream-flow issues in the Shenandoah River Basin, development of the required multidisciplinary technical team to conduct more detailed studies, and development of basin specific habitat and flow requirements for fish species, species assemblages, and various water uses in the Shenandoah River Basin. This report presents the results of an IFIM demonstration project, conducted on the main stem Shenandoah River in Virginia, during 1996 and 1997, using the Physical Habitat Simulation System (PHABSIM) model. Output from PHABSIM is used to address the general flow requirements for water supply and recreation and habitat for selected life stages of several fish species. The model output is only a small part of the information necessary for effective decision making and management of river resources. The information by itself is usually insufficient for formulation of recommendations regarding instream-flow requirements. Additional information, for example, can be obtained by analysis of habitat time-series data, habitat duration data, and habitat bottlenecks. Alternative-flow analysis and habitat-duration curves are presented.
Representational dynamics of object recognition: Feedforward and feedback information flows.
Goddard, Erin; Carlson, Thomas A; Dermody, Nadene; Woolgar, Alexandra
2016-03-01
Object perception involves a range of visual and cognitive processes, and is known to include both a feedforward flow of information from early visual cortical areas to higher cortical areas and feedback from areas such as prefrontal cortex. Previous studies have found that low and high spatial frequency information regarding object identity may be processed over different timescales. Here we used the high temporal resolution of magnetoencephalography (MEG) combined with multivariate pattern analysis to measure information specifically related to object identity in peri-frontal and peri-occipital areas. Using stimuli closely matched in their low-level visual content, we found that activity in peri-occipital cortex could be used to decode object identity from ~80ms post stimulus onset, and activity in peri-frontal cortex could also be used to decode object identity from a later time (~265ms post stimulus onset). Low spatial frequency information related to object identity was present in the MEG signal at an earlier time than high spatial frequency information for peri-occipital cortex, but not for peri-frontal cortex. We additionally used Granger causality analysis to compare feedforward and feedback influences on representational content, and found evidence of both an early feedforward flow and a later feedback flow of information related to object identity. We discuss our findings in relation to existing theories of object processing and propose how the methods we use here could be used to address further questions of the neural substrates underlying object perception. Copyright © 2016 Elsevier Inc. All rights reserved.
Information Graph Flow: A Geometric Approximation of Quantum and Statistical Systems
NASA Astrophysics Data System (ADS)
Vanchurin, Vitaly
2018-05-01
Given a quantum (or statistical) system with a very large number of degrees of freedom and a preferred tensor product factorization of the Hilbert space (or of a space of distributions) we describe how it can be approximated with a very low-dimensional field theory with geometric degrees of freedom. The geometric approximation procedure consists of three steps. The first step is to construct weighted graphs (we call information graphs) with vertices representing subsystems (e.g., qubits or random variables) and edges representing mutual information (or the flow of information) between subsystems. The second step is to deform the adjacency matrices of the information graphs to that of a (locally) low-dimensional lattice using the graph flow equations introduced in the paper. (Note that the graph flow produces very sparse adjacency matrices and thus might also be used, for example, in machine learning or network science where the task of graph sparsification is of a central importance.) The third step is to define an emergent metric and to derive an effective description of the metric and possibly other degrees of freedom. To illustrate the procedure we analyze (numerically and analytically) two information graph flows with geometric attractors (towards locally one- and two-dimensional lattices) and metric perturbations obeying a geometric flow equation. Our analysis also suggests a possible approach to (a non-perturbative) quantum gravity in which the geometry (a secondary object) emerges directly from a quantum state (a primary object) due to the flow of the information graphs.
High speed digital holographic interferometry for hypersonic flow visualization
NASA Astrophysics Data System (ADS)
Hegde, G. M.; Jagdeesh, G.; Reddy, K. P. J.
2013-06-01
Optical imaging techniques have played a major role in understanding the flow dynamics of a variety of fluid flows, particularly in the study of hypersonic flows. Schlieren and shadowgraph techniques have been the flow diagnostic tools for the investigation of compressible flows for more than a century. However, these techniques provide only qualitative information about the flow field. Other optical techniques such as holographic interferometry and laser-induced fluorescence (LIF) have been used extensively for extracting quantitative information about high-speed flows. In this paper we present the application of the digital holographic interferometry (DHI) technique, integrated with a short-duration hypersonic shock tunnel facility having a 1 ms test time, for quantitative flow visualization. The dynamics of the flow fields at hypersonic/supersonic speeds around different test models is visualized with DHI using a high-speed digital camera (0.2 million fps). These visualization results are compared with schlieren visualization and CFD simulation results. Fringe analysis is carried out to estimate the density of the flow field.
Ploner, Stefan B; Moult, Eric M; Choi, WooJhon; Waheed, Nadia K; Lee, ByungKun; Novais, Eduardo A; Cole, Emily D; Potsaid, Benjamin; Husvogt, Lennart; Schottenhamml, Julia; Maier, Andreas; Rosenfeld, Philip J; Duker, Jay S; Hornegger, Joachim; Fujimoto, James G
2016-12-01
Currently available optical coherence tomography angiography systems provide information about blood flux but only limited information about blood flow speed. The authors develop a method for mapping the previously proposed variable interscan time analysis (VISTA) algorithm into a color display that encodes relative blood flow speed. Optical coherence tomography angiography was performed with a 1,050 nm, 400 kHz A-scan rate, swept-source optical coherence tomography system using a 5 repeated B-scan protocol. Variable interscan time analysis was used to compute the optical coherence tomography angiography signal from B-scan pairs having 1.5-millisecond and 3.0-millisecond interscan times. The resulting VISTA data were then mapped to a color space for display. The authors evaluated the VISTA visualization algorithm in normal eyes (n = 2), nonproliferative diabetic retinopathy eyes (n = 6), proliferative diabetic retinopathy eyes (n = 3), geographic atrophy eyes (n = 4), and exudative age-related macular degeneration eyes (n = 2). All eyes showed blood flow speed variations, and all eyes with pathology showed abnormal blood flow speeds compared with controls. The authors developed a novel method for mapping VISTA into a color display, allowing visualization of relative blood flow speeds. The method was found useful, in a small case series, for visualizing blood flow speeds in a variety of ocular diseases and serves as a step toward quantitative optical coherence tomography angiography.
Measuring flow velocity and flow direction by spatial and temporal analysis of flow fluctuations.
Chagnaud, Boris P; Brücker, Christoph; Hofmann, Michael H; Bleckmann, Horst
2008-04-23
If exposed to bulk water flow, fish lateral line afferents respond only to flow fluctuations (AC) and not to the steady (DC) component of the flow. Consequently, a single lateral line afferent can encode neither bulk flow direction nor velocity. It is possible, however, for a fish to obtain bulk flow information using multiple afferents that respond only to flow fluctuations. We show by means of particle image velocimetry that, if a flow contains fluctuations, these fluctuations propagate with the flow. A cross-correlation of water motion measured at an upstream point with that at a downstream point can then provide information about flow velocity and flow direction. In this study, we recorded from pairs of primary lateral line afferents while a fish was exposed to either bulk water flow, or to the water motion caused by a moving object. We confirm that lateral line afferents responded to the flow fluctuations and not to the DC component of the flow, and that responses of many fiber pairs were highly correlated, if they were time-shifted to correct for gross flow velocity and gross flow direction. To prove that a cross-correlation mechanism can be used to retrieve the information about gross flow velocity and direction, we measured the flow-induced bending motions of two flexible micropillars separated in a downstream direction. A cross-correlation of the bending motions of these micropillars did indeed produce an accurate estimate of the velocity vector along the direction of the micropillars.
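The cross-correlation mechanism described above is straightforward to sketch numerically: the lag at which the correlation between the upstream and downstream signals peaks, divided into the sensor spacing, estimates bulk flow speed, and its sign gives the direction. The signal model, sampling rate, and spacing below are invented for illustration.

    # Estimate bulk flow speed and direction from two fluctuation signals
    # recorded a known distance apart along the flow.
    import numpy as np

    def flow_from_xcorr(up, down, dt, spacing):
        up = up - up.mean()
        down = down - down.mean()
        corr = np.correlate(down, up, mode="full")
        lag = np.argmax(corr) - (len(up) - 1)   # samples by which `down` trails `up`
        delay = lag * dt
        if delay == 0:
            return np.inf, 0
        speed = spacing / abs(delay)
        direction = int(np.sign(delay))         # +1: flow runs from upstream to downstream sensor
        return speed, direction

    # Synthetic test: broadband fluctuations advected by 12 samples between sensors.
    rng = np.random.default_rng(0)
    dt, spacing = 1e-3, 0.05                    # 1 kHz sampling, 5 cm sensor spacing
    up = np.convolve(rng.standard_normal(5000), np.ones(20)/20, mode="same")
    down = np.roll(up, 12) + 0.05*rng.standard_normal(5000)
    print(flow_from_xcorr(up, down, dt, spacing))   # approx (4.17 m/s, +1)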
Straub, D.E.
1998-01-01
The streamflow-gaging station network in Ohio was evaluated for its effectiveness in providing regional streamflow information. The analysis involved application of the principles of generalized least squares regression between streamflow and climatic and basin characteristics. Regression equations were developed for three flow characteristics: (1) the instantaneous peak flow with a 100-year recurrence interval (P100), (2) the mean annual flow (Qa), and (3) the 7-day, 10-year low flow (7Q10). All active and discontinued gaging stations with 5 or more years of unregulated-streamflow data with respect to each flow characteristic were used to develop the regression equations. The gaging-station network was evaluated for the current (1996) condition of the network and for estimated conditions under various network strategies if an additional 5 and 20 years of streamflow data were collected. Any active or discontinued gaging station with (1) less than 5 years of unregulated-streamflow record, (2) previously defined basin and climatic characteristics, and (3) the potential for collection of more unregulated-streamflow record was included in the network strategies involving the additional 5 and 20 years of data. The network analysis involved use of the regression equations, in combination with location, period of record, and cost of operation, to determine the contribution of the data for each gaging station to regional streamflow information. The contribution of each gaging station was based on a cost-weighted reduction of the mean square error (average sampling-error variance) associated with each regional estimating equation. All gaging stations included in the network analysis were then ranked according to their contribution to the regional information for each flow characteristic. The predictive ability of the regression equations developed from the gaging-station network could be improved for all three flow characteristics with the collection of additional streamflow data. The addition of new gaging stations to the network would result in an even greater improvement of the accuracy of the regional regression equations. Typically, continued data collection at stations with unregulated streamflow for all flow conditions, less than 11 years of record, and drainage areas smaller than 200 square miles contributed the largest cost-weighted reduction to the average sampling-error variance of the regional estimating equations. The results of the network analyses can be used to prioritize the continued operation of active gaging stations or the reactivation of discontinued gaging stations if the objective is to maximize the regional information content in the streamflow-gaging station network.
Typing Local Control and State Using Flow Analysis
NASA Astrophysics Data System (ADS)
Guha, Arjun; Saftoiu, Claudiu; Krishnamurthi, Shriram
Programs written in scripting languages employ idioms that confound conventional type systems. In this paper, we highlight one important set of related idioms: the use of local control and state to reason informally about types. To address these idioms, we formalize run-time tags and their relationship to types, and use these to present a novel strategy to integrate typing with flow analysis in a modular way. We demonstrate that in our separation of typing and flow analysis, each component remains conventional, their composition is simple, but the result can handle these idioms better than either one alone.
The epidemic spreading model and the direction of information flow in brain networks.
Meier, J; Zhou, X; Hillebrand, A; Tewarie, P; Stam, C J; Van Mieghem, P
2017-05-15
The interplay between structural connections and emerging information flow in the human brain remains an open research problem. A recent study observed global patterns of directional information flow in empirical data using the measure of transfer entropy. For higher frequency bands, the overall direction of information flow was from posterior to anterior regions whereas an anterior-to-posterior pattern was observed in lower frequency bands. In this study, we applied a simple Susceptible-Infected-Susceptible (SIS) epidemic spreading model on the human connectome with the aim to reveal the topological properties of the structural network that give rise to these global patterns. We found that direct structural connections induced higher transfer entropy between two brain regions and that transfer entropy decreased with increasing distance between nodes (in terms of hops in the structural network). Applying the SIS model, we were able to confirm the empirically observed opposite information flow patterns and posterior hubs in the structural network seem to play a dominant role in the network dynamics. For small time scales, when these hubs acted as strong receivers of information, the global pattern of information flow was in the posterior-to-anterior direction and in the opposite direction when they were strong senders. Our analysis suggests that these global patterns of directional information flow are the result of an unequal spatial distribution of the structural degree between posterior and anterior regions and their directions seem to be linked to different time scales of the spreading process. Copyright © 2017 Elsevier Inc. All rights reserved.
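A discrete-time SIS simulation on a weighted network, of the kind used above to probe directional spreading, can be sketched as follows. The infection and recovery probabilities and the random toy network standing in for the human connectome are illustrative assumptions.

    # Discrete-time SIS dynamics: each infected node recovers with probability
    # delta; susceptible node i escapes infection from each infected neighbor j
    # with probability (1 - beta * W[i, j]).
    import numpy as np

    def simulate_sis(W, beta, delta, steps, seed=0):
        rng = np.random.default_rng(seed)
        n = W.shape[0]
        state = rng.random(n) < 0.1                   # ~10% initially infected
        history = np.zeros((steps, n), dtype=bool)
        for t in range(steps):
            p_escape = np.prod(1.0 - beta * W * state[None, :], axis=1)
            new_infect = (~state) & (rng.random(n) > p_escape)
            recover = state & (rng.random(n) < delta)
            state = (state | new_infect) & ~recover
            history[t] = state
        return history

    rng = np.random.default_rng(1)
    W = rng.random((50, 50)) * (rng.random((50, 50)) < 0.1)   # sparse weighted toy network
    np.fill_diagonal(W, 0.0)
    hist = simulate_sis(W, beta=0.3, delta=0.2, steps=200)
    print("mean prevalence:", round(float(hist.mean()), 3))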
Multimodel Simulation of Water Flow: Uncertainty Analysis
USDA-ARS?s Scientific Manuscript database
Simulations of soil water flow require measurements of soil hydraulic properties which are particularly difficult at the field scale. Laboratory measurements provide hydraulic properties at scales finer than the field scale, whereas pedotransfer functions (PTFs) integrate information on hydraulic pr...
Sequential Ideal-Observer Analysis of Visual Discriminations.
ERIC Educational Resources Information Center
Geisler, Wilson S.
1989-01-01
A new analysis, based on the concept of the ideal observer in signal detection theory, is described. It allows: tracing of the flow of discrimination information through the initial physiological stages of visual processing for arbitrary spatio-chromatic stimuli, and measurement of the information content of said visual stimuli. (TJH)
An annotation system for 3D fluid flow visualization
NASA Technical Reports Server (NTRS)
Loughlin, Maria M.; Hughes, John F.
1995-01-01
Annotation is a key activity of data analysis. However, current systems for data analysis focus almost exclusively on visualization. We propose a system which integrates annotations into a visualization system. Annotations are embedded in 3D data space, using the Post-it metaphor. This embedding allows contextual-based information storage and retrieval, and facilitates information sharing in collaborative environments. We provide a traditional database filter and a Magic Lens filter to create specialized views of the data. The system has been customized for fluid flow applications, with features which allow users to store parameters of visualization tools and sketch 3D volumes.
Gao, Zhong-Ke; Dang, Wei-Dong; Li, Shan; Yang, Yu-Xuan; Wang, Hong-Tao; Sheng, Jing-Ran; Wang, Xiao-Fan
2017-07-14
Numerous irregular flow structures exist in complicated multiphase flows and result in many disparate spatial dynamical flow behaviors. Vertical oil-water slug flow continues to attract substantial research interest because of its practical importance. Based on the spatial transient flow information acquired through our designed double-layer distributed-sector conductance sensor, we construct a multilayer modality-based network to encode the intricate spatial flow behavior. In particular, we calculate the PageRank versatility and the multilayer weighted clustering coefficient to quantitatively explore the inferred multilayer modality-based networks. Our analysis allows us to characterize the complicated evolution of oil-water slug flow, from the initial formation of oil slugs, to the subsequent inter-collision and coalescence among oil slugs, and then to dispersed oil bubbles. These properties render our developed method particularly powerful for mining the essential flow features from the multilayer sensor measurements.
On an interface of the online system for a stochastic analysis of the varied information flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorshenin, Andrey K.; MIREA, MGUPI; Kuzmin, Victor Yu.
The article describes a possible approach to the construction of an interface for an online asynchronous system that allows researchers to analyse varied information flows. The implemented stochastic methods are based on mixture models and the method of moving separation of mixtures. The general ideas of the system's functionality are demonstrated with an example involving the moments of a finite normal mixture.
Connectomics-based analysis of information flow in the Drosophila brain.
Shih, Chi-Tin; Sporns, Olaf; Yuan, Shou-Li; Su, Ta-Shun; Lin, Yen-Jen; Chuang, Chao-Chun; Wang, Ting-Yuan; Lo, Chung-Chuang; Greenspan, Ralph J; Chiang, Ann-Shyn
2015-05-18
Understanding the overall patterns of information flow within the brain has become a major goal of neuroscience. In the current study, we produced a first draft of the Drosophila connectome at the mesoscopic scale, reconstructed from 12,995 images of neuron projections collected in FlyCircuit (version 1.1). Neuron polarities were predicted according to morphological criteria, with nodes of the network corresponding to brain regions designated as local processing units (LPUs). The weight of each directed edge linking a pair of LPUs was determined by the number of neuron terminals that connected one LPU to the other. The resulting network showed hierarchical structure and small-world characteristics and consisted of five functional modules that corresponded to sensory modalities (olfactory, mechanoauditory, and two visual) and the pre-motor center. Rich-club organization was present in this network and involved LPUs in all sensory centers, and rich-club members formed a putative motor center of the brain. Major intra- and inter-modular loops were also identified that could play important roles for recurrent and reverberant information flow. The present analysis revealed whole-brain patterns of network structure and information flow. Additionally, we propose that the overall organizational scheme showed fundamental similarities to the network structure of the mammalian brain. Copyright © 2015 Elsevier Ltd. All rights reserved.
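Constructing such a directed, weighted region-level network from neuron-terminal counts and ranking candidate hub regions can be sketched with networkx. The LPU names and edge weights below are invented; in the study the weights come from terminal counts between local processing units.

    # Toy mesoscopic connectome: nodes are LPUs, edge weight = number of neuron
    # terminals linking one LPU to another; rank regions by weighted strength.
    import networkx as nx

    edges = [
        ("AL", "MB", 120), ("MB", "LH", 80), ("LH", "CX", 40),
        ("ME", "LO", 200), ("LO", "CX", 60), ("CX", "VLP", 90),
        ("AMMC", "VLP", 70), ("VLP", "CX", 50),
    ]
    G = nx.DiGraph()
    G.add_weighted_edges_from(edges)

    in_strength = dict(G.in_degree(weight="weight"))
    out_strength = dict(G.out_degree(weight="weight"))
    hubs = sorted(G.nodes,
                  key=lambda n: in_strength.get(n, 0) + out_strength.get(n, 0),
                  reverse=True)
    print("strongest LPUs:", hubs[:3])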
Short-time Lyapunov exponent analysis and the transition to chaos in Taylor-Couette flow
NASA Technical Reports Server (NTRS)
Vastano, John A.; Moser, Robert D.
1991-01-01
The physical mechanism driving the weakly chaotic Taylor-Couette flow is investigated using the short-time Liapunov exponent analysis. In this procedure, the transition from quasi-periodicity to chaos is studied using direct numerical 3D simulations of axially periodic Taylor-Couette flow, and a partial Liapunov exponent spectrum for the flow is computed by simultaneously advancing the full solution and a set of perturbations. It is shown that the short-time Liapunov exponent analysis yields more information on the exponents and dimension than that obtained from the common Liapunov exponent calculations. Results show that the chaotic state studied here is caused by a Kelvin-Helmholtz-type instability of the outflow boundary jet of Taylor vortices.
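The idea of advancing the full solution together with a perturbation and measuring its short-time growth can be illustrated in a low-dimensional setting with the classic renormalization procedure; the Lorenz system and all parameter values below merely stand in for the axially periodic Taylor-Couette simulation of the paper.

    # Leading short-time (finite-time) Lyapunov exponents by integrating a base
    # trajectory and a perturbed copy, renormalizing the separation periodically.
    import numpy as np

    def lorenz(s, sigma=10.0, rho=28.0, beta=8.0/3.0):
        x, y, z = s
        return np.array([sigma*(y - x), x*(rho - z) - y, x*y - beta*z])

    def rk4_step(f, s, dt):
        k1 = f(s); k2 = f(s + 0.5*dt*k1); k3 = f(s + 0.5*dt*k2); k4 = f(s + dt*k3)
        return s + dt*(k1 + 2*k2 + 2*k3 + k4)/6.0

    def short_time_lyapunov(s0, dt=0.01, renorm_every=10, n_renorm=500, eps=1e-8):
        s = np.asarray(s0, dtype=float)
        p = s + eps*np.array([1.0, 0.0, 0.0])        # perturbed copy of the state
        exps = []
        for _ in range(n_renorm):
            for _ in range(renorm_every):
                s = rk4_step(lorenz, s, dt)
                p = rk4_step(lorenz, p, dt)
            d = np.linalg.norm(p - s)
            exps.append(np.log(d/eps)/(renorm_every*dt))   # exponent over this short window
            p = s + (p - s)*eps/d                          # renormalize the perturbation
        return np.array(exps)

    lams = short_time_lyapunov([1.0, 1.0, 1.0])
    print("average of short-time exponents:", round(lams[100:].mean(), 2))  # roughly 0.9 here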
FLUT - A program for aeroelastic stability analysis. [of aircraft structures in subsonic flow
NASA Technical Reports Server (NTRS)
Johnson, E. H.
1977-01-01
A computer program (FLUT) that can be used to evaluate the aeroelastic stability of aircraft structures in subsonic flow is described. The algorithm synthesizes data from a structural vibration analysis with an unsteady aerodynamics analysis and then performs a complex eigenvalue analysis to assess the system stability. The theoretical basis of the program is discussed with special emphasis placed on some innovative techniques which improve the efficiency of the analysis. User information needed to efficiently and successfully utilize the program is provided. In addition to identifying the required input, the flow of the program execution and some possible sources of difficulty are included. The use of the program is demonstrated with a listing of the input and output for a simple example.
Ultrafast Microfluidic Cellular Imaging by Optical Time-Stretch.
Lau, Andy K S; Wong, Terence T W; Shum, Ho Cheung; Wong, Kenneth K Y; Tsia, Kevin K
2016-01-01
There is an unmet need in biomedicine for measuring a multitude of parameters of individual cells (i.e., high content) in a large population efficiently (i.e., high throughput). This is particularly driven by the emerging interest in bringing Big-Data analysis into this arena, encompassing pathology, drug discovery, rare cancer cell detection, and emulsion microdroplet assays, to name a few. This momentum is particularly evident in recent advancements in flow cytometry. They include scaling of the number of measurable colors from the labeled cells and incorporation of imaging capability to access the morphological information of the cells. However, an unspoken predicament appears in the current technologies: higher content comes at the expense of lower throughput, and vice versa. For example, to access additional spatial information about individual cells, imaging flow cytometers achieve an imaging throughput of only ~1000 cells/s, orders of magnitude slower than non-imaging flow cytometers. In this chapter, we introduce an entirely new imaging platform, namely optical time-stretch microscopy, for ultrahigh-speed and high-contrast label-free single-cell imaging and analysis (in an ultrafast microfluidic flow up to 10 m/s) with an ultra-fast imaging line-scan rate as high as tens of MHz. Based on this technique, not only can morphological information about individual cells be obtained in an ultrafast manner, but quantitative evaluation of cellular information (e.g., cell volume, mass, refractive index, stiffness, membrane tension) at the nanometer scale, based on the optical phase, is also possible. The technology can also be integrated with the conventional fluorescence measurements widely adopted in non-imaging flow cytometers. Therefore, these two combinatorial and complementary measurement capabilities make this, in the long run, an attractive platform for addressing the pressing need to expand the "parameter space" in high-throughput single-cell analysis. This chapter provides general guidelines for constructing the optical system for time-stretch imaging, for fabricating and designing the microfluidic chip for ultrafast fluidic flow, and for image acquisition and processing.
A Management Information System in a Library Environment.
ERIC Educational Resources Information Center
Sutton, Michael J.; Black, John B.
More effective use of diminishing resources was needed to provide the best possible services at the University of Guelph (Ontario, Canada) library. This required the improved decision-making processes of a Library Management Information System (LMIS) to provide systematic information analysis. An information flow model was created, and an…
Nakamura, Shinichiro; Kondo, Yasushi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya
2011-02-01
Identification of the flow of materials and substances associated with a product system provides useful information for Life Cycle Analysis (LCA), and contributes to extending the scope of complementarity between LCA and Materials Flow Analysis/Substances Flow Analysis (MFA/SFA), the two major tools of industrial ecology. This paper proposes a new methodology based on input-output analysis for identifying the physical input-output flow of individual materials that is associated with the production of a unit of a given product, the unit physical input-output by materials (UPIOM). While the Sankey diagram has been a standard tool for the visualization of MFA/SFA, with an increase in the complexity of the flows under consideration, which will be the case when economy-wide intersectoral flows of materials are involved, the Sankey diagram may become too complex for effective visualization. An alternative way to visually represent material flows is proposed which makes use of triangulation of the flow matrix based on degrees of fabrication. The proposed methodology is applied to the flow of pig iron and iron and steel scrap associated with the production of a passenger car in Japan. Its usefulness in identifying a specific MFA pattern from the original IO table is demonstrated.
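The core input-output step behind such an analysis can be sketched with the Leontief inverse: the sectoral output induced by one unit of final demand for the chosen product is x = (I - A)^(-1) f, and material flows follow by scaling per-unit material coefficients by x. All coefficients below are toy values, not the Japanese IO table or the full UPIOM construction used by the authors.

    # Toy unit input-output calculation: output induced by one unit of final
    # demand for sector 2 (a stand-in "car" sector) and the material it embodies.
    import numpy as np

    A = np.array([[0.10, 0.05, 0.20],      # technical coefficients (inputs per unit output)
                  [0.30, 0.10, 0.25],
                  [0.05, 0.15, 0.10]])
    material_per_output = np.array([0.8, 0.4, 0.1])   # e.g. tonnes of pig iron per unit output

    f = np.array([0.0, 0.0, 1.0])                     # one unit of final demand for sector 2
    x = np.linalg.solve(np.eye(3) - A, f)             # x = (I - A)^(-1) f
    material_flow = material_per_output * x           # material use attributed to each sector

    print("induced outputs:", np.round(x, 3))
    print("material use by sector:", np.round(material_flow, 3))
    print("total embodied material:", round(float(material_flow.sum()), 3))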
Theoretical analysis of tsunami generation by pyroclastic flows
Watts, P.; Waythomas, C.F.
2003-01-01
Pyroclastic flows are a common product of explosive volcanism and have the potential to initiate tsunamis whenever thick, dense flows encounter bodies of water. We evaluate the process of tsunami generation by pyroclastic flow by decomposing the pyroclastic flow into two components, the dense underflow portion, which we term the pyroclastic debris flow, and the plume, which includes the surge and coignimbrite ash cloud parts of the flow. We consider five possible wave generation mechanisms. These mechanisms consist of steam explosion, pyroclastic debris flow, plume pressure, plume shear, and pressure impulse wave generation. Our theoretical analysis of tsunami generation by these mechanisms provides an estimate of tsunami features such as a characteristic wave amplitude and wavelength. We find that in most situations, tsunami generation is dominated by the pyroclastic debris flow component of a pyroclastic flow. This work presents information sufficient to construct tsunami sources for an arbitrary pyroclastic flow interacting with most bodies of water. Copyright 2003 by the American Geophysical Union.
Chen, Si; Weddell, Jared; Gupta, Pavan; Conard, Grace; Parkin, James; Imoukhuede, Princess I
2017-01-01
Nanosensor-based detection of biomarkers can improve medical diagnosis; however, a critical factor in nanosensor development is deciding which biomarker to target, as most diseases present several biomarkers. Biomarker-targeting decisions can be informed via an understanding of biomarker expression. Currently, immunohistochemistry (IHC) is the accepted standard for profiling biomarker expression. While IHC provides a relative mapping of biomarker expression, it does not provide cell-by-cell readouts of biomarker expression or absolute biomarker quantification. Flow cytometry overcomes both these IHC challenges by offering biomarker expression on a cell-by-cell basis, and when combined with calibration standards, providing quantitation of biomarker concentrations: this is known as qFlow cytometry. Here, we outline the key components for applying qFlow cytometry to detect biomarkers within the angiogenic vascular endothelial growth factor receptor family. The key aspects of the qFlow cytometry methodology include: antibody specificity testing, immunofluorescent cell labeling, saturation analysis, fluorescent microsphere calibration, and quantitative analysis of both ensemble and cell-by-cell data. Together, these methods enable high-throughput quantification of biomarker expression.
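The microsphere calibration step can be sketched as a simple linear fit of measured bead fluorescence against the beads' stated molecule-binding capacities, after which cell fluorescence is converted into receptors per cell. The bead capacities, intensities, and cell values below are fabricated for illustration and do not reflect any particular calibration kit.

    # Quantitative flow cytometry calibration sketch: fluorescence -> molecules per cell.
    import numpy as np

    bead_capacity = np.array([0.0, 4e3, 2.5e4, 1.5e5, 5e5])            # molecules per bead (vendor-stated)
    bead_fluorescence = np.array([12.0, 55.0, 300.0, 1800.0, 6000.0])  # measured median intensities

    # Linear calibration: intensity = slope * molecules + background.
    slope, background = np.polyfit(bead_capacity, bead_fluorescence, 1)

    def receptors_per_cell(intensity):
        return np.maximum(intensity - background, 0.0) / slope

    cell_intensities = np.array([150.0, 900.0, 2500.0])
    print(np.round(receptors_per_cell(cell_intensities), 0))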
Fast interactive exploration of 4D MRI flow data
NASA Astrophysics Data System (ADS)
Hennemuth, A.; Friman, O.; Schumann, C.; Bock, J.; Drexl, J.; Huellebrand, M.; Markl, M.; Peitgen, H.-O.
2011-03-01
1- or 2-directional MRI blood flow mapping sequences are an integral part of standard MR protocols for diagnosis and therapy control in heart diseases. Recent progress in rapid MRI has made it possible to acquire volumetric, 3-directional cine images in reasonable scan time. In addition to flow and velocity measurements relative to arbitrarily oriented image planes, the analysis of 3-dimensional trajectories enables the visualization of flow patterns, local features of flow trajectories or possible paths into specific regions. The anatomical and functional information allows for advanced hemodynamic analysis in different application areas like stroke risk assessment, congenital and acquired heart disease, aneurysms or abdominal collaterals and cranial blood flow. The complexity of the 4D MRI flow datasets and the flow related image analysis tasks makes the development of fast comprehensive data exploration software for advanced flow analysis a challenging task. Most existing tools address only individual aspects of the analysis pipeline such as pre-processing, quantification or visualization, or are difficult to use for clinicians. The goal of the presented work is to provide a software solution that supports the whole image analysis pipeline and enables data exploration with fast intuitive interaction and visualization methods. The implemented methods facilitate the segmentation and inspection of different vascular systems. Arbitrary 2- or 3-dimensional regions for quantitative analysis and particle tracing can be defined interactively. Synchronized views of animated 3D path lines, 2D velocity or flow overlays and flow curves offer a detailed insight into local hemodynamics. The application of the analysis pipeline is shown for 6 cases from clinical practice, illustrating the usefulness for different clinical questions. Initial user tests show that the software is intuitive to learn and even inexperienced users achieve good results within reasonable processing times.
Muñoz-Cobo, José Luis; Chiva, Sergio; Méndez, Santos; Monrós, Guillem; Escrivá, Alberto; Cuadros, José Luis
2017-05-10
This paper describes all the procedures and methods currently used at UPV (Universitat Politécnica de Valencia) and UJI (University Jaume I) for the development and use of sensors for multi-phase flow analysis in vertical pipes. This paper also describes the methods that we use to obtain the values of the two-phase flow magnitudes from the sensor signals and the validation and cross-verification methods developed to check the consistency of the results obtained for these magnitudes with the sensors. First, we provide information about the procedures used to build the multi-sensor conductivity probes and some of the tests performed with different materials to avoid sensor degradation issues. In addition, we provide information about the characteristics of the electric circuits that feed the sensors. We then show in operation the data acquisition of the conductivity probe, the signal conditioning, and the data processing, including the device designed to automate the whole measurement process of moving the sensors inside the channels by means of computer-controlled stepper motors. Next, we explain the methods used for bubble identification and categorization. Finally, we describe the methodology used to obtain the two-phase flow information from the sensor signals, which includes the following items: void fraction, gas velocity, Sauter mean diameter, and interfacial area concentration. The last part of this paper is devoted to the conductance probes developed for annular flow analysis, which includes the analysis of the interfacial waves produced in annular flow and requires a different type of sensor.
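Part of the signal-processing chain described above can be illustrated with a single-tip sketch: classify each sample as gas or liquid with a conductivity threshold, identify bubbles as contiguous gas runs, and report the local void fraction and mean gas residence time. The threshold and synthetic signal are assumptions; the actual UPV/UJI pipeline performs bubble-by-bubble categorization and multi-sensor processing not reproduced here.

    # Basic bubble identification from a thresholded conductivity signal.
    import numpy as np

    def identify_bubbles(signal, threshold, dt):
        """Return (void fraction, bubble count, mean gas residence time in s)."""
        gas = signal < threshold                    # the probe reads low conductivity in gas
        void_fraction = float(gas.mean())
        edges = np.diff(gas.astype(int))            # +1 at gas-run starts, -1 at ends
        starts = np.flatnonzero(edges == 1) + 1
        ends = np.flatnonzero(edges == -1) + 1
        if gas[0]:
            starts = np.r_[0, starts]
        if gas[-1]:
            ends = np.r_[ends, len(gas)]
        durations = (ends - starts) * dt
        mean_res = float(durations.mean()) if len(durations) else 0.0
        return void_fraction, len(durations), mean_res

    # Synthetic signal: ~1.0 in liquid, ~0.1 inside bubbles, plus noise.
    rng = np.random.default_rng(0)
    n, dt = 20000, 1e-4
    gas_state = (rng.random(n) < 0.002).cumsum() % 2
    sig = 1.0 - 0.9*gas_state + 0.02*rng.standard_normal(n)
    print(identify_bubbles(sig, threshold=0.5, dt=dt))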
A study of information flow in hospice interdisciplinary team meetings
DEMIRIS, GEORGE; WASHINGTON, KARLA; OLIVER, DEBRA PARKER; WITTENBERG-LYLES, ELAINE
2009-01-01
The aim of this study was to explore the information flow of hospice interdisciplinary meetings focusing on information access, exchange and documentation. The study participants were members of four hospice interdisciplinary teams in the Midwestern United States. Team members included a diverse range of professionals including physicians, nurses, social workers, bereavement counselors, and others. A total of 81 patient care discussions were videotaped and transcribed. A content analysis revealed several themes that needed to be addressed to improve the overall information flow, such as access to and recording of information, documentation of services, obtaining information from absent team members, data redundancy and updating of recorded information. On average, 5% of all utterances when discussing a patient case were focused on soliciting information from the member who had access to the patient chart. In 12.3% of all discussions, members referred to an absent member who could have provided additional information. In 8.6% of all discussions the same facts were repeated three times or more. Based on the findings we propose guidelines that can address potential informational gaps and enhance team communication in hospice. PMID:19012142
Iavicoli, Patrizia; Urbán, Patricia; Bella, Angelo; Ryadnov, Maxim G; Rossi, François; Calzolai, Luigi
2015-11-27
Asymmetric Flow Field-Flow Fractionation (AF4) combined with multidetector analysis forms a promising technique in the field of nanoparticle characterization. This system is able to measure the dimensions and physicochemical properties of nanoparticles with unprecedented accuracy and precision. Here, for the first time, this technique is optimized to characterize the interaction between an archetypal antimicrobial peptide and synthetic membranes. By using charged and neutral liposomes it is possible to mimic some of the charge characteristics of biological membranes. The use of the AF4 system allows determining, in a single analysis, the selectivity of the peptides, the quantity of peptide bound to each liposome, and the induced change in the size distribution and morphology of the liposomes. The results obtained provide relevant information for the study of structure-activity relationships in the context of membrane-induced antimicrobial action. This information will contribute to the rational design of potent antimicrobial agents in the future. Moreover, the application of this method to other liposome systems is straightforward and would be extremely useful for a comprehensive characterization with regard to size distribution and protein interaction in the nanomedicine field. Copyright © 2015. Published by Elsevier B.V.
Jenson, Susan K.; Domingue, Julia O.
1988-01-01
The first phase of analysis is a conditioning phase that generates three data sets: the original DEM with depressions filled, a data set indicating the flow direction for each cell, and a flow accumulation data set in which each cell receives a value equal to the total number of cells that drain to it. The original DEM and these three derivative data sets can then be processed in a variety of ways to optionally delineate drainage networks, overland paths, watersheds for user-specified locations, sub-watersheds for the major tributaries of a drainage network, or pour point linkages between watersheds. The computer-generated drainage lines and watershed polygons and the pour point linkage information can be transferred to vector-based geographic information systems for further analysis. Comparisons between these computer-generated features and their manually delineated counterparts generally show close agreement, indicating that these software tools will save analyst time spent in manual interpretation and digitizing.
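As a rough illustration of the conditioning phase described above, the sketch below computes a D8-style flow direction grid and a recursive flow accumulation count on a small, already depressionless elevation grid. It is written in Python for illustration only; the function names are invented here and this is not the software evaluated in the study.

    import numpy as np

    # D8 neighbour offsets: each cell drains to its steepest downslope neighbour.
    OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

    def flow_direction(dem):
        """Index (0-7) of the steepest downslope neighbour, or -1 where none exists."""
        rows, cols = dem.shape
        fdir = np.full((rows, cols), -1, dtype=int)
        for r in range(rows):
            for c in range(cols):
                best_drop, best_k = 0.0, -1
                for k, (dr, dc) in enumerate(OFFSETS):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                        if drop > best_drop:
                            best_drop, best_k = drop, k
                fdir[r, c] = best_k
        return fdir

    def flow_accumulation(fdir):
        """Number of upstream cells draining into each cell (small grids only)."""
        rows, cols = fdir.shape
        acc = np.full((rows, cols), -1, dtype=int)

        def upstream(r, c):
            if acc[r, c] >= 0:
                return acc[r, c]
            total = 0
            for dr, dc in OFFSETS:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and fdir[rr, cc] >= 0:
                    tr, tc = rr + OFFSETS[fdir[rr, cc]][0], cc + OFFSETS[fdir[rr, cc]][1]
                    if (tr, tc) == (r, c):        # that neighbour drains into (r, c)
                        total += 1 + upstream(rr, cc)
            acc[r, c] = total
            return total

        for r in range(rows):
            for c in range(cols):
                upstream(r, c)
        return acc

    dem = np.array([[9., 8., 7.],
                    [8., 6., 5.],
                    [7., 5., 3.]])               # already depressionless
    print(flow_accumulation(flow_direction(dem)))   # the lowest cell accumulates the rest

A real implementation would first fill depressions and resolve flat areas, as the conditioning phase described above does.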
Information Leakage Analysis by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Zanioli, Matteo; Cortesi, Agostino
Protecting the confidentiality of information stored in a computer system or transmitted over a public network is a relevant problem in computer security. The approach of information flow analysis involves performing a static analysis of the program with the aim of proving that there will not be leaks of sensitive information. In this paper we propose a new domain that combines variable dependency analysis, based on propositional formulas, and variables' value analysis, based on polyhedra. The resulting analysis is strictly more accurate than the state of the art abstract interpretation based analyses for information leakage detection. Its modular construction makes it possible to deal with the tradeoff between efficiency and accuracy by tuning the granularity of the abstraction and the complexity of the abstract operators.
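The dependency half of such an analysis can be sketched very simply: propagate, through straight-line assignments, the set of secret inputs each variable may depend on, and report a leak when a public output depends on a secret. The toy program and variable names below are invented for illustration, and the sketch omits the propositional-formula and polyhedra domains that the paper actually combines.

    # Toy dependency (taint) propagation over straight-line assignments.
    SECRETS = {"password"}

    program = [
        ("h", ["password"]),      # h := f(password)
        ("x", ["user_input"]),    # x := user_input
        ("y", ["h", "x"]),        # y := g(h, x)
        ("out", ["y"]),           # out := y  (public sink)
    ]

    deps = {}                      # variable -> set of secrets it may depend on
    for target, sources in program:
        flowing = set()
        for s in sources:
            flowing |= deps.get(s, {s} if s in SECRETS else set())
        deps[target] = flowing

    print("possible leak to 'out':", bool(deps["out"] & SECRETS))   # True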
Local statistics of retinal optic flow for self-motion through natural sceneries.
Calow, Dirk; Lappe, Markus
2007-12-01
Image analysis in the visual system is well adapted to the statistics of natural scenes. Investigations of natural image statistics have so far mainly focused on static features. The present study is dedicated to the measurement and the analysis of the statistics of optic flow generated on the retina during locomotion through natural environments. Natural locomotion includes bouncing and swaying of the head and eye movement reflexes that stabilize gaze onto interesting objects in the scene while walking. We investigate the dependencies of the local statistics of optic flow on the depth structure of the natural environment and on the ego-motion parameters. To measure these dependencies we estimate the mutual information between correlated data sets. We analyze the results with respect to the variation of the dependencies over the visual field, since the visual motions in the optic flow vary depending on visual field position. We find that retinal flow direction and retinal speed show only minor statistical interdependencies. Retinal speed is statistically tightly connected to the depth structure of the scene. Retinal flow direction is statistically mostly driven by the relation between the direction of gaze and the direction of ego-motion. These dependencies differ at different visual field positions such that certain areas of the visual field provide more information about ego-motion and other areas provide more information about depth. The statistical properties of natural optic flow may be used to tune the performance of artificial vision systems based on human imitating behavior, and may be useful for analyzing properties of natural vision systems.
Information Flow in Interaction Networks II: Channels, Path Lengths, and Potentials
Stojmirović, Aleksandar
2012-01-01
In our previous publication, a framework for information flow in interaction networks based on random walks with damping was formulated with two fundamental modes: emitting and absorbing. While many other network analysis methods based on random walks or equivalent notions have been developed before and after our earlier work, one can show that they can all be mapped to one of the two modes. In addition to these two fundamental modes, a major strength of our earlier formalism was its accommodation of context-specific directed information flow that yielded plausible and meaningful biological interpretation of protein functions and pathways. However, the directed flow from origins to destinations was induced via a potential function that was heuristic. Here, with a theoretically sound approach called the channel mode, we extend our earlier work for directed information flow. This is achieved by constructing a potential function facilitating a purely probabilistic interpretation of the channel mode. For each network node, the channel mode combines the solutions of emitting and absorbing modes in the same context, producing what we call a channel tensor. The entries of the channel tensor at each node can be interpreted as the amount of flow passing through that node from an origin to a destination. Similarly to our earlier model, the channel mode encompasses damping as a free parameter that controls the locality of information flow. Through examples involving the yeast pheromone response pathway, we illustrate the versatility and stability of our new framework. PMID:22409812
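As a loose illustration of the emitting-mode idea, the sketch below computes the expected number of visits of a damped random walk started at an origin node via the geometric series (I - d P)^-1. The channel-tensor construction that combines emitting and absorbing solutions is not reproduced here, and the adjacency matrix is a made-up toy.

    import numpy as np

    def emitting_visits(adjacency, source, damping=0.85):
        """Expected visits to each node for a random walk that starts at `source`
        and dissipates (terminates) with probability 1 - damping at every step."""
        A = np.asarray(adjacency, dtype=float)
        P = A / A.sum(axis=1, keepdims=True)   # row-stochastic transitions (every node needs an out-edge)
        n = A.shape[0]
        s = np.zeros(n)
        s[source] = 1.0
        # visits^T = s^T (I - damping * P)^-1, i.e. the sum over all path lengths
        return np.linalg.solve((np.eye(n) - damping * P).T, s)

    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 1],
                  [0, 1, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    print(emitting_visits(A, source=0))

Lowering the damping parameter localizes the flow around the origin, which mirrors the role the abstract assigns to damping.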
Flow Cytometric Analysis of Hepatocytes from Normal, PFDA, and PH/DEN/ PB-Treated Rats
1989-12-31
Subject terms: perfluorodecanoic acid (PFDA); hepatocarcinogenesis; preneoplastic lesions; flow cytometry; immunotoxicity. Abstract: ... was designed to generate preliminary information regarding the toxic and potential carcinogenic effects of perfluorodecanoic acid (PFDA) on rat ... Flow cytometric evaluation of hepatocytes from PFDA-treated rats revealed an increase in size and granularity...
NASA Astrophysics Data System (ADS)
Woodward, Simon J. R.; Wöhling, Thomas; Stenger, Roland
2016-03-01
Understanding the hydrological and hydrogeochemical responses of hillslopes and other small scale groundwater systems requires mapping the velocity and direction of groundwater flow relative to the controlling subsurface material features. Since point observations of subsurface materials and groundwater head are often the basis for modelling these complex, dynamic, three-dimensional systems, considerable uncertainties are inevitable, but are rarely assessed. This study explored whether piezometric head data measured at high spatial and temporal resolution over six years at a hillslope research site provided sufficient information to determine the flow paths that transfer nitrate leached from the soil zone through the shallow saturated zone into a nearby wetland and stream. Transient groundwater flow paths were modelled using MODFLOW and MODPATH, with spatial patterns of hydraulic conductivity in the three material layers at the site being estimated by regularised pilot point calibration using PEST, constrained by slug test estimates of saturated hydraulic conductivity at several locations. Subsequent Null Space Monte Carlo uncertainty analysis showed that this data was not sufficient to definitively determine the spatial pattern of hydraulic conductivity at the site, although modelled water table dynamics matched the measured heads with acceptable accuracy in space and time. Particle tracking analysis predicted that the saturated flow direction was similar throughout the year as the water table rose and fell, but was not aligned with either the ground surface or subsurface material contours; indeed the subsurface material layers, having relatively similar hydraulic properties, appeared to have little effect on saturated water flow at the site. Flow path uncertainty analysis showed that, while accurate flow path direction or velocity could not be determined on the basis of the available head and slug test data alone, the origin of well water samples relative to the material layers and site contour could still be broadly deduced. This study highlights both the challenge of collecting suitably informative field data with which to characterise subsurface hydrology, and the power of modern calibration and uncertainty modelling techniques to assess flow path uncertainty in hillslopes and other small scale systems.
Automated flow cytometric analysis across large numbers of samples and cell types.
Chen, Xiaoyi; Hasan, Milena; Libri, Valentina; Urrutia, Alejandra; Beitz, Benoît; Rouilly, Vincent; Duffy, Darragh; Patin, Étienne; Chalmond, Bernard; Rogge, Lars; Quintana-Murci, Lluis; Albert, Matthew L; Schwikowski, Benno
2015-04-01
Multi-parametric flow cytometry is a key technology for characterization of immune cell phenotypes. However, robust high-dimensional post-analytic strategies for automated data analysis in large numbers of donors are still lacking. Here, we report a computational pipeline, called FlowGM, which minimizes operator input, is insensitive to compensation settings, and can be adapted to different analytic panels. A Gaussian Mixture Model (GMM)-based approach was utilized for initial clustering, with the number of clusters determined using Bayesian Information Criterion. Meta-clustering in a reference donor permitted automated identification of 24 cell types across four panels. Cluster labels were integrated into FCS files, thus permitting comparisons to manual gating. Cell numbers and coefficient of variation (CV) were similar between FlowGM and conventional gating for lymphocyte populations, but notably FlowGM provided improved discrimination of "hard-to-gate" monocyte and dendritic cell (DC) subsets. FlowGM thus provides rapid high-dimensional analysis of cell phenotypes and is amenable to cohort studies. Copyright © 2015. Published by Elsevier Inc.
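A minimal sketch of the clustering step described above, using a Gaussian mixture whose number of components is chosen by the Bayesian Information Criterion; the synthetic event matrix stands in for compensated, transformed cytometry data, and this is not the FlowGM code itself.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # stand-in for flow cytometry events (rows = cells, columns = markers)
    events = np.vstack([rng.normal(0.0, 1.0, (500, 4)),
                        rng.normal(4.0, 1.0, (300, 4))])

    # fit GMMs with increasing numbers of components and keep the lowest BIC
    models = [GaussianMixture(n_components=k, random_state=0).fit(events) for k in range(1, 7)]
    best = min(models, key=lambda m: m.bic(events))
    labels = best.predict(events)
    print("selected components:", best.n_components)
    print("cluster sizes:", np.bincount(labels))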
NASA Astrophysics Data System (ADS)
Xu, Wenbo; Jing, Shaocai; Yu, Wenjuan; Wang, Zhaoxian; Zhang, Guoping; Huang, Jianxi
2013-11-01
In this study, the high-risk debris flow areas of Sichuan Province, Panzhihua and Liangshan Yi Autonomous Prefecture, were taken as the study areas. Using rainfall and environmental factors as the predictors, and based on different prior probability combinations of debris flows, the prediction of debris flows in these areas was compared for two statistical methods: logistic regression (LR) and Bayes discriminant analysis (BDA). The comprehensive analysis shows that (a) with a mid-range prior probability, the overall predicting accuracy of BDA is higher than that of LR; (b) with equal and extreme prior probabilities, the overall predicting accuracy of LR is higher than that of BDA; and (c) regional debris flow prediction models that use rainfall factors only perform worse than those that also introduce environmental factors, and the predicting accuracies for occurrence and non-occurrence of debris flows change in opposite directions as information is supplemented.
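The comparison described above can be sketched with off-the-shelf classifiers on synthetic data; here scikit-learn's LinearDiscriminantAnalysis, with its prior probabilities set explicitly, stands in for the paper's Bayes discriminant analysis, and the predictors are random stand-ins for the rainfall and environmental factors.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(400, 5))                       # rainfall + environmental factors (synthetic)
    y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=1.0, size=400) > 0.7).astype(int)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    bda = LinearDiscriminantAnalysis(priors=[0.5, 0.5]).fit(X_tr, y_tr)   # equal prior probabilities

    print("LR  accuracy:", round(lr.score(X_te, y_te), 3))
    print("BDA accuracy:", round(bda.score(X_te, y_te), 3))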
Breakdown parameter for kinetic modeling of multiscale gas flows.
Meng, Jianping; Dongari, Nishanth; Reese, Jason M; Zhang, Yonghao
2014-06-01
Multiscale methods built purely on the kinetic theory of gases provide information about the molecular velocity distribution function. It is therefore both important and feasible to establish new breakdown parameters for assessing the appropriateness of a fluid description at the continuum level by utilizing kinetic information rather than macroscopic flow quantities alone. We propose a new kinetic criterion to indirectly assess the errors introduced by a continuum-level description of the gas flow. The analysis, which includes numerical demonstrations, focuses on the validity of the Navier-Stokes-Fourier equations and corresponding kinetic models and reveals that the new criterion can consistently indicate the validity of continuum-level modeling in both low-speed and high-speed flows at different Knudsen numbers.
DOT National Transportation Integrated Search
2011-03-01
Midwest FreightView and the Great Lakes Maritime Information Delivery System is a comprehensive data repository and information : clearinghouse in support of Great Lakes maritime commerce. This multifunctional resource integrated in a geographic info...
Groundwater Data Package for the 2004 Composite Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thorne, Paul D.
2004-08-11
This report presents data and information that supports the groundwater module. The conceptual model of groundwater flow and transport at the Hanford Site is described and specific information applied in the numerical implementation module is provided.
Statistical analysis on the signals monitoring multiphase flow patterns in pipeline-riser system
NASA Astrophysics Data System (ADS)
Ye, Jing; Guo, Liejin
2013-07-01
The signals monitoring petroleum transmission pipelines in the offshore oil industry usually contain abundant information about the multiphase flow that is relevant to flow assurance, which includes the avoidance of the most undesirable flow patterns. Therefore, extracting reliable features from these signals for analysis is an alternative way to examine the potential risks to an oil platform. This paper focuses on characterizing multiphase flow patterns in the pipeline-riser system that often appears in the offshore oil industry and on finding an objective criterion to describe the transition of flow patterns. Statistical analysis of the pressure signal at the riser top is proposed, instead of the usual prediction methods based on inlet and outlet flow conditions, which cannot be easily determined in most situations. In addition, a machine learning method (least squares support vector machine) is applied to classify the different flow patterns automatically. The experimental results from a small-scale loop show that the proposed method is effective for analyzing the multiphase flow pattern.
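A rough sketch of the idea: reduce each riser-top pressure record to a few statistical features and train a kernel classifier on them. The synthetic records below caricature severe slugging versus stable flow, and scikit-learn's SVC is used as a stand-in for the least squares support vector machine named in the abstract.

    import numpy as np
    from scipy import stats
    from sklearn.svm import SVC

    def features(pressure):
        """Simple statistical features of one pressure record."""
        return [np.mean(pressure), np.std(pressure),
                stats.skew(pressure), stats.kurtosis(pressure)]

    rng = np.random.default_rng(2)
    slugging = [2.0 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.2, 500) for _ in range(30)]
    stable = [rng.normal(0, 0.3, 500) for _ in range(30)]

    X = np.array([features(p) for p in slugging + stable])
    y = np.array([1] * 30 + [0] * 30)                   # 1 = undesirable flow pattern

    clf = SVC(kernel="rbf").fit(X, y)
    print("training accuracy:", clf.score(X, y))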
NASA Astrophysics Data System (ADS)
Dufoyer, A.; Lecoq, N.; Massei, N.; Marechal, J. C.
2017-12-01
Physics-based modeling of karst systems remains almost impossible without enough accurate information about the inner physical characteristics. Usually, the only available hydrodynamic information is the flow rate at the karst outlet. Numerous works in the past decades have used and proven the usefulness of time-series analysis and spectral techniques applied to spring flow, precipitation or even physico-chemical parameters, for interpreting karst hydrological functioning. However, identifying or interpreting the karst systems physical features that control statistical or spectral characteristics of spring flow variations is still challenging, not to say sometimes controversial. The main objective of this work is to determine how the statistical and spectral characteristics of the hydrodynamic signal at karst springs can be related to inner physical and hydraulic properties. In order to address this issue, we undertake an empirical approach based on the use of both distributed and physics-based models, and on synthetic systems responses. The first step of the research is to conduct a sensitivity analysis of time-series/spectral methods to karst hydraulic and physical properties. For this purpose, forward modeling of flow through several simple, constrained and synthetic cases in response to precipitations is undertaken. It allows us to quantify how the statistical and spectral characteristics of flow at the outlet are sensitive to changes (i) in conduit geometries, and (ii) in hydraulic parameters of the system (matrix/conduit exchange rate, matrix hydraulic conductivity and storativity). The flow differential equations are solved by MARTHE, a computer code developed by the BRGM that allows karst conduits to be modeled. Using Fourier series and multi-resolution analysis of the simulated spring responses, we hope to determine whether specific frequencies are consistently modified. We also hope to quantify, with auto-correlation analysis, which parameters are the most influential: first results seem to show larger variations due to conduit conductivity than due to the matrix/conduit exchange rate. Future steps will use another computer code, based on a double-continuum approach that allows turbulent conduit flow, and will model a natural system.
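One of the signal-processing steps mentioned above, the auto-correlation analysis of a simulated spring response, can be sketched as follows; the synthetic rainfall and the linear-reservoir-like response are stand-ins for the MARTHE simulations, and the "memory effect" indicator is just one classical way to summarize the ACF.

    import numpy as np

    def autocorrelation(q, max_lag=120):
        """Sample autocorrelation function of a discharge series."""
        q = np.asarray(q, dtype=float) - np.mean(q)
        var = np.dot(q, q) / len(q)
        return np.array([np.dot(q[:len(q) - k], q[k:]) / (len(q) * var)
                         for k in range(max_lag + 1)])

    def memory_effect(acf, threshold=0.2):
        """Lag at which the ACF first drops below a threshold (karst 'memory')."""
        below = np.where(acf < threshold)[0]
        return int(below[0]) if below.size else None

    rng = np.random.default_rng(3)
    rain = rng.exponential(1.0, 2000) * (rng.random(2000) < 0.2)        # intermittent rainfall
    q = np.convolve(rain, np.exp(-np.arange(200) / 30.0))[:2000]        # reservoir-like spring response

    print("memory effect (lags):", memory_effect(autocorrelation(q)))
    # a periodogram, np.abs(np.fft.rfft(q))**2, gives the corresponding spectral view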
Pattern analysis of fraud case in Taiwan, China and Indonesia
NASA Astrophysics Data System (ADS)
Kusumo, A. H.; Chi, C.-F.; Dewi, R. S.
2017-11-01
The current study analyzed 125 successful fraud cases that happened in Taiwan, China, and Indonesia from 2008 to 2012, as published in English online newspapers. Each case report was coded in terms of scam principle, information medium (the information exchange between fraudsters and victims), money medium (the medium used by fraudsters to obtain unauthorized financial benefit) and other additional information judged to be relevant. The Chi-square Automatic Interaction Detector (CHAID) was applied to the coded data of information medium, scam principle and money medium to find a subset of predictors that might yield meaningful classifications. A series of flow diagrams was constructed based on the CHAID result to illustrate the flow of information (the scam) travelling from information media to money media.
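CHAID grows its tree by repeatedly applying chi-square tests of independence between a predictor and the outcome; the fragment below only illustrates that splitting criterion on a made-up contingency table of coded cases (information medium versus money medium), not the study's actual data or a full CHAID implementation.

    import numpy as np
    from scipy.stats import chi2_contingency

    # hypothetical counts: rows = information medium, columns = money medium
    table = np.array([[18,  4,  3],    # phone call
                      [ 6, 12,  2],    # text message
                      [ 2,  3, 15]])   # online advertisement

    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3g}")
    # CHAID would pick, at each node, the predictor whose split is most significant.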
Flow experience among information and communication technology users.
Rodríguez-Sánchez, Alma M; Schaufeli, Wilmar B; Salanova, Marisa; Cifre, Eva
2008-02-01
The use of technology is increasingly common in daily life; working with technologies might be associated with positive experiences such as flow. However, there is little empirical research on flow experiences in technology settings. The main aim of this study was to confirm the three-dimensional construct of flow, i.e., absorption, enjoyment, and intrinsic interest, among 517 Information and Communication Technology users [234 students whose mean age was 23 yr. (SD = 3.8)] from different areas of study, mainly Law, Public Administration, Chemistry, and Psychology, and 283 employees [whose mean age was 33 yr. (SD = 7.8)] of 21 different companies from various sectors of production, namely, public administration, industrial production, and services. Analysis showed that, as expected, flow is a three-dimensional psychological construct and is invariant among samples of technology users. Practical and theoretical implications as well as further research are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Ananta P.; Mohapatra, Ranjita K.; Saumia, P. S.
2010-03-15
Recently we have shown that there are crucial similarities in the physics of cosmic microwave background radiation (CMBR) anisotropies and the flow anisotropies in relativistic heavy-ion collision experiments (RHICE). We also argued that, following CMBR anisotropy analysis, a plot of root-mean-square values of the flow coefficients, calculated in a laboratory-fixed frame for RHICE, can yield important information about the nature of initial state anisotropies and their evolution. Here we demonstrate the strength of this technique by showing that elliptic flow for noncentral collisions can be directly determined from such a plot without any need for the determination of the event plane.
Application of effective discharge analysis to environmental flow decision-making
McKay, S. Kyle; Freeman, Mary C.; Covich, A.P.
2016-01-01
Well-informed river management decisions rely on an explicit statement of objectives, repeatable analyses, and a transparent system for assessing trade-offs. These components may then be applied to compare alternative operational regimes for water resource infrastructure (e.g., diversions, locks, and dams). Intra- and inter-annual hydrologic variability further complicates these already complex environmental flow decisions. Effective discharge analysis (developed in studies of geomorphology) is a powerful tool for integrating temporal variability of flow magnitude and associated ecological consequences. Here, we adapt the effectiveness framework to include multiple elements of the natural flow regime (i.e., timing, duration, and rate-of-change) as well as two flow variables. We demonstrate this analytical approach using a case study of environmental flow management based on long-term (60 years) daily discharge records in the Middle Oconee River near Athens, GA, USA. Specifically, we apply an existing model for estimating young-of-year fish recruitment based on flow-dependent metrics to an effective discharge analysis that incorporates hydrologic variability and multiple focal taxa. We then compare three alternative methods of environmental flow provision. Percentage-based withdrawal schemes outcompete other environmental flow methods across all levels of water withdrawal and ecological outcomes.
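The core of an effectiveness calculation can be sketched in a few lines: bin the daily flow record into discharge classes, weight each class frequency by an effectiveness curve, and take the class where the product peaks. The power-law rating used below is a placeholder assumption; the study instead uses flow-dependent fish recruitment models and additional flow-regime elements.

    import numpy as np

    def effective_discharge(daily_q, rating=lambda q: q ** 2.0, n_bins=25):
        """Discharge class maximizing frequency x effectiveness."""
        counts, edges = np.histogram(daily_q, bins=n_bins)
        mids = 0.5 * (edges[:-1] + edges[1:])
        product = counts * rating(mids)          # frequency weighted by effectiveness
        return mids[np.argmax(product)], product

    rng = np.random.default_rng(4)
    daily_q = rng.lognormal(mean=3.0, sigma=0.8, size=60 * 365)   # stand-in for 60 yr of daily flows
    q_eff, _ = effective_discharge(daily_q)
    print(f"effective discharge ~ {q_eff:.0f} (input units)")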
Incorporating seismic observations into 2D conduit flow modeling
NASA Astrophysics Data System (ADS)
Collier, L.; Neuberg, J.
2006-04-01
Conduit flow modeling aims to understand the conditions of magma at depth, and to provide insight into the physical processes that occur inside the volcano. Low-frequency events, characteristic to many volcanoes, are thought to contain information on the state of magma at depth. Therefore, by incorporating information from low-frequency seismic analysis into conduit flow modeling a greater understanding of magma ascent and its interdependence on magma conditions and physical processes is possible. The 2D conduit flow model developed in this study demonstrates the importance of lateral pressure and parameter variations on overall magma flow dynamics, and the substantial effect bubbles have on magma shear viscosity and on magma ascent. The 2D nature of the conduit flow model developed here allows in depth investigation into processes which occur at, or close to the wall, such as magma cooling and brittle failure of melt. These processes are shown to have a significant effect on magma properties and therefore, on flow dynamics. By incorporating low-frequency seismic information, an advanced conduit flow model is developed including the consequences of brittle failure of melt, namely friction-controlled slip and gas loss. This model focuses on the properties and behaviour of magma at depth within the volcano, and their interaction with the formation of seismic events by brittle failure of melt.
Information flow and causality as rigorous notions ab initio
NASA Astrophysics Data System (ADS)
Liang, X. San
2016-11-01
Information flow, or information transfer, the widely applicable general physics notion, can be rigorously derived from first principles, rather than axiomatically proposed as an ansatz. Its logical association with causality is firmly rooted in the dynamical system that lies beneath. The principle of nil causality, which reads that an event is not causal to another if the evolution of the latter is independent of the former, and which transfer entropy analysis and the Granger causality test fail to verify in many situations, turns out to be a proven theorem here. Established in this study are the information flows among the components of time-discrete mappings and time-continuous dynamical systems, both deterministic and stochastic. They have been obtained explicitly in closed form, and put to applications with benchmark systems such as the Kaplan-Yorke map, Rössler system, baker transformation, Hénon map, and stochastic potential flow. Besides unraveling the causal relations as expected from the respective systems, some of the applications show that the information flow structure underlying a complex trajectory pattern could be tractable. For linear systems, the resulting remarkably concise formula asserts analytically that causation implies correlation, while correlation does not imply causation, providing a mathematical basis for the long-standing philosophical debate over causation versus correlation.
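For orientation, a bivariate linear estimator of this kind of information flow is often attributed to earlier work by the same author (Liang, 2014); the sketch below implements that commonly cited formula as an assumption for illustration, and it is only a special case of the closed-form results the abstract refers to.

    import numpy as np

    def liang_information_flow(x1, x2, dt=1.0):
        """Estimated rate of information flowing from series x2 to series x1
        under a linear model; values near zero suggest no causal influence."""
        x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
        dx1 = (x1[1:] - x1[:-1]) / dt            # Euler-forward tendency of x1
        x1, x2 = x1[:-1], x2[:-1]
        C = np.cov(x1, x2)
        c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
        c1d1 = np.cov(x1, dx1)[0, 1]
        c2d1 = np.cov(x2, dx1)[0, 1]
        return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / (c11 ** 2 * c22 - c11 * c12 ** 2)

    rng = np.random.default_rng(5)
    n = 20000
    x1, x2 = np.zeros(n), np.zeros(n)
    for t in range(n - 1):                       # x2 drives x1, not the other way around
        x2[t + 1] = 0.7 * x2[t] + rng.normal()
        x1[t + 1] = 0.5 * x1[t] + 0.4 * x2[t] + rng.normal()

    print("T(2->1):", liang_information_flow(x1, x2))   # clearly nonzero
    print("T(1->2):", liang_information_flow(x2, x1))   # near zero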
DOT National Transportation Integrated Search
1993-01-01
This 2-CD set presents data and information from the 1993 Commodity Flow Survey (CFS) on the movement of goods and products shipped by manufacturing, mining, wholesale, and selected retail establishments in the United States. The data cover domestic ...
Information security system quality assessment through the intelligent tools
NASA Astrophysics Data System (ADS)
Trapeznikov, E. V.
2018-04-01
The development of technology has shown the need for a comprehensive analysis of information security in automated systems. An analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing the quality of an information security system based on intelligent tools. The basis of the methodology is a model that assesses the information security of an information system by means of a neural network. The paper presents the security assessment model and its algorithm. The results of the practical implementation of the methodology are presented in the form of a software flow diagram. The practical significance of the model being developed is noted in the conclusions.
Baladrón, Carlos; Khrennikov, Andrei
2016-12-01
The similarities between biological and physical systems as respectively defined in quantum information biology (QIB) and in a Darwinian approach to quantum mechanics (DAQM) have been analysed. In both theories the processing of information is a central feature characterising the systems. The analysis highlights mutual support for the theses contended by each theory. On the one hand, DAQM provides a physical basis that might explain the key role played by quantum information at the macroscopic level for bio-systems in QIB. On the other hand, QIB offers the possibility, acting as a macroscopic testing ground, to analyse the emergence of quantumness from classicality in the terms held by DAQM. As an added result of the comparison, a tentative definition of quantum information in terms of classical information flows has been proposed. The quantum formalism would appear from this comparative analysis between QIB and DAQM as an optimal information scheme that would maximise the stability of biological and physical systems at any scale. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Understanding Information Flow Interaction along Separable Causal Paths in Environmental Signals
NASA Astrophysics Data System (ADS)
Jiang, P.; Kumar, P.
2017-12-01
Multivariate environmental signals reflect the outcome of complex inter-dependencies, such as those in ecohydrologic systems. Transfer entropy and information partitioning approaches have been used to characterize such dependencies. However, these approaches capture the net information flow occurring through a multitude of pathways involved in the interaction and, as a result, mask our ability to discern the causal interaction within a subsystem of interest through specific pathways. We build on recent developments of momentary information transfer along causal paths proposed by Runge [2015] to develop a framework for quantifying information decomposition along separable causal paths. Momentary information transfer along causal paths captures the amount of information flow between any two variables lagged at two specific points in time. Our approach expands this concept to characterize the causal interaction in terms of synergistic, unique and redundant information flow through separable causal paths. Multivariate analysis using this novel approach reveals a precise understanding of causality and feedback. We illustrate our approach with synthetic and observed time series data. We believe the proposed framework helps better delineate the internal structure of complex systems in geoscience where huge amounts of observational datasets exist, and it will also help the modeling community by providing a new way to look at the complexity of real and modeled systems. Runge, Jakob. "Quantifying information transfer and mediation along causal pathways in complex systems." Physical Review E 92.6 (2015): 062829.
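The building block behind such path-based measures is a conditional mutual information estimate; the plug-in, histogram-based sketch below is not Runge's momentary information transfer itself, only an illustration of conditioning away an indirect pathway on synthetic data.

    import numpy as np

    def conditional_mutual_information(x, y, z, bins=8):
        """Plug-in estimate of I(X; Y | Z) from discretized samples."""
        def H(*cols):
            joint, _ = np.histogramdd(np.column_stack(cols), bins=bins)
            p = joint.ravel() / joint.sum()
            p = p[p > 0]
            return -np.sum(p * np.log(p))
        return H(x, z) + H(y, z) - H(x, y, z) - H(z)

    rng = np.random.default_rng(6)
    z = rng.normal(size=5000)
    x = z + 0.5 * rng.normal(size=5000)
    y_indirect = z + 0.5 * rng.normal(size=5000)        # linked to x only through z
    y_direct = 0.8 * x + 0.5 * rng.normal(size=5000)    # directly driven by x

    print("indirect path, I(X;Y|Z):", conditional_mutual_information(x, y_indirect, z))  # ~0 up to bias
    print("direct path,   I(X;Y|Z):", conditional_mutual_information(x, y_direct, z))    # clearly larger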
NASA Technical Reports Server (NTRS)
Penny, M. M.; Smith, S. D.; Anderson, P. G.; Sulyma, P. R.; Pearson, M. L.
1976-01-01
A computer program written in conjunction with the numerical solution of the flow of chemically reacting gas-particle mixtures was documented. The solution to the set of governing equations was obtained by utilizing the method of characteristics. The equations cast in characteristic form were shown to be formally the same for ideal, frozen, chemical equilibrium and chemical non-equilibrium reacting gas mixtures. The characteristic directions for the gas-particle system are found to be the conventional gas Mach lines, the gas streamlines and the particle streamlines. The basic mesh construction for the flow solution is along streamlines and normals to the streamlines for axisymmetric or two-dimensional flow. The analysis gives detailed information of the supersonic flow and provides for a continuous solution of the nozzle and exhaust plume flow fields. Boundary conditions for the flow solution are either the nozzle wall or the exhaust plume boundary.
Olson, Scott A.
2003-01-01
The stream-gaging network in New Hampshire was analyzed for its effectiveness in providing regional information on peak-flood flow, mean-flow, and low-flow frequency. The data available for analysis were from stream-gaging stations in New Hampshire and selected stations in adjacent States. The principles of generalized-least-squares regression analysis were applied to develop regional regression equations that relate streamflow-frequency characteristics to watershed characteristics. Regression equations were developed for (1) the instantaneous peak flow with a 100-year recurrence interval, (2) the mean-annual flow, and (3) the 7-day, 10-year low flow. Active and discontinued stream-gaging stations with 10 or more years of flow data were used to develop the regression equations. Each stream-gaging station in the network was evaluated and ranked on the basis of how much the data from that station contributed to the cost-weighted sampling-error component of the regression equation. The potential effect of data from proposed and new stream-gaging stations on the sampling error also was evaluated. The stream-gaging network was evaluated for conditions in water year 2000 and for estimated conditions under various network strategies if an additional 5 years and 20 years of streamflow data were collected. The effectiveness of the stream-gaging network in providing regional streamflow information could be improved for all three flow characteristics with the collection of additional flow data, both temporally and spatially. With additional years of data collection, the greatest reduction in the average sampling error of the regional regression equations was found for the peak- and low-flow characteristics. In general, additional data collection at stream-gaging stations with unregulated flow, relatively short-term record (less than 20 years), and drainage areas smaller than 45 square miles contributed the largest cost-weighted reduction to the average sampling error of the regional estimating equations. The results of the network analyses can be used to prioritize the continued operation of active stations, the reactivation of discontinued stations, or the activation of new stations to maximize the regional information content provided by the stream-gaging network. Final decisions regarding altering the New Hampshire stream-gaging network would require the consideration of the many uses of the streamflow data serving local, State, and Federal interests.
Managing Multi-center Flow Cytometry Data for Immune Monitoring
White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn
2014-01-01
With the recent results of promising cancer vaccines and immunotherapy [1-5], immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning and information visualization [21-23], as these methods directly address the subjectivity, operator dependence, labor intensiveness and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in processing, analysis and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API). In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges. PMID:26085786
NASA Astrophysics Data System (ADS)
Hughes, Allen A.
1994-12-01
Public safety can be enhanced through the development of a comprehensive medical device risk management system. This can be accomplished through case studies using a framework that incorporates cost-benefit analysis in the evaluation of risk management attributes. This paper presents a framework for evaluating the risk management system for regulatory Class III medical devices. The framework consists of the following sixteen attributes of a comprehensive medical device risk management system: fault/failure analysis, premarket testing/clinical trials, post-approval studies, manufacturer-sponsored hospital studies, product labeling, establishment inspections, problem reporting program, mandatory hospital reporting, medical literature surveillance, device/patient registries, device performance monitoring, returned product analysis, autopsy program, emergency treatment funds/interim compensation, product liability, and alternative compensation mechanisms. Review of the performance histories of several medical devices can reveal the value of information for many attributes, and also the inter-dependencies of the attributes in generating risk information flow. Such an information flow network is presented as a starting point for enhancing medical device risk management by focusing on attributes with high net benefit values and potential to spur information dissemination.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tralshawala, Nilesh; Howard, Don; Knight, Bryon
2008-02-28
In conventional infrared thermography, determination of thermal diffusivity requires thickness information. Recently GE has been experimenting with the use of lateral heat flow to determine thermal diffusivity without thickness information. This work builds on previous work at NASA Langley and Wayne State University but we incorporate thermal time of flight (tof) analysis rather than curve fitting to obtain quantitative information. We have developed appropriate theoretical models and a tof based data analysis framework to experimentally determine all components of thermal diffusivity from the time-temperature measurements. Initial validation was carried out using finite difference simulations. Experimental validation was done using anisotropic carbon fiber reinforced polymer (CFRP) composites. We found that in the CFRP samples used, the in-plane component of diffusivity is about eight times larger than the through-thickness component.
Data-Flow Based Model Analysis
NASA Technical Reports Server (NTRS)
Saad, Christian; Bauer, Bernhard
2010-01-01
The concept of (meta) modeling combines an intuitive way of formalizing the structure of an application domain with a high expressiveness that makes it suitable for a wide variety of use cases and has therefore become an integral part of many areas in computer science. While the definition of modeling languages through the use of meta models, e.g. in the Unified Modeling Language (UML), is a well-understood process, their validation and the extraction of behavioral information is still a challenge. In this paper we present a novel approach for dynamic model analysis along with several fields of application. Examining the propagation of information along the edges and nodes of the model graph makes it possible to extend and simplify the definition of semantic constraints in comparison to the capabilities offered by e.g. the Object Constraint Language. Performing a flow-based analysis also enables the simulation of dynamic behavior, thus providing an "abstract interpretation"-like analysis method for the modeling domain.
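A minimal, generic sketch of the propagation idea: attach sets of facts to model elements and push them along edges until a fixpoint is reached. The graph, node names and facts below are invented, and the real approach would plug in the semantic constraints of the modeling language rather than a plain set union.

    from collections import defaultdict

    edges = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}      # toy model graph
    initial = {"A": {"init"}, "B": set(), "C": {"configured"}, "D": set()}

    facts = defaultdict(set, {n: set(v) for n, v in initial.items()})
    worklist = list(edges)
    while worklist:
        node = worklist.pop()
        for succ in edges[node]:
            merged = facts[succ] | facts[node]       # union as the merge operator
            if merged != facts[succ]:
                facts[succ] = merged
                worklist.append(succ)                # re-visit nodes whose facts changed

    print({n: sorted(f) for n, f in facts.items()})  # D collects facts from both paths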
NASA Astrophysics Data System (ADS)
Beskow, Samuel; de Mello, Carlos Rogério; Vargas, Marcelle M.; Corrêa, Leonardo de L.; Caldeira, Tamara L.; Durães, Matheus F.; de Aguiar, Marilton S.
2016-10-01
Information on stream flows is essential for water resources management. The stream flow that is equaled or exceeded 90% of the time (Q90) is one of the most used low stream flow indicators in many countries, and its determination is made from the frequency analysis of stream flows considering a historical series. However, the stream flow gauging network is generally not spatially sufficient to meet the necessary demands of technicians, thus the most plausible alternative is the use of hydrological regionalization. The objective of this study was to couple the artificial intelligence (AI) techniques K-means, Partitioning Around Medoids (PAM), K-harmonic means (KHM), Fuzzy C-means (FCM) and Genetic K-means (GKA) with measures of low stream flow seasonality, to verify their potential to delineate hydrologically homogeneous regions for the regionalization of Q90. For the performance analysis of the proposed methodology, location attributes from 108 watersheds situated in southern Brazil, and attributes associated with their seasonality of low stream flows, were considered in this study. It was concluded that: (i) AI techniques have the potential to delineate hydrologically homogeneous regions in the context of Q90 in the study region, especially the FCM method based on fuzzy logic, and GKA, based on genetic algorithms; (ii) the attributes related to seasonality of low stream flows added important information that increased the accuracy of the grouping; and (iii) the adjusted mathematical models have excellent performance and can be used to estimate Q90 in locations lacking monitoring.
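As an illustration of the grouping step only, the sketch below clusters standardized watershed attributes with K-means, one of the five techniques compared in the study; the attribute matrix is random stand-in data, not the 108 Brazilian watersheds.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    # per-watershed attributes: e.g. longitude, latitude, two low-flow seasonality descriptors
    attributes = rng.normal(size=(108, 4))

    X = StandardScaler().fit_transform(attributes)       # common scale before clustering
    regions = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
    print("watersheds per candidate homogeneous region:", np.bincount(regions))
    # regional Q90 regression models would then be fitted within each region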
Natural Analogs for the Unsaturated Zone
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Simmons; A. Unger; M. Murrell
2000-03-08
The purpose of this Analysis/Model Report (AMR) is to document natural and anthropogenic (human-induced) analog sites and processes that are applicable to flow and transport processes expected to occur at the potential Yucca Mountain repository in order to build increased confidence in modeling processes of Unsaturated Zone (UZ) flow and transport. This AMR was prepared in accordance with ''AMR Development Plan for U0135, Natural Analogs for the UZ'' (CRWMS 1999a). Knowledge from analog sites and processes is used as corroborating information to test and build confidence in flow and transport models of Yucca Mountain, Nevada. This AMR supports the Unsaturated Zone (UZ) Flow and Transport Process Model Report (PMR) and the Yucca Mountain Site Description. The objectives of this AMR are to test and build confidence in the representation of UZ processes in numerical models utilized in the UZ Flow and Transport Model. This is accomplished by: (1) applying data from Boxy Canyon, Idaho in simulations of UZ flow using the same methodologies incorporated in the Yucca Mountain UZ Flow and Transport Model to assess the fracture-matrix interaction conceptual model; (2) providing a preliminary basis for analysis of radionuclide transport at Pena Blanca, Mexico as an analog of radionuclide transport at Yucca Mountain; and (3) synthesizing existing information from natural analog studies to provide corroborating evidence for representation of ambient and thermally coupled UZ flow and transport processes in the UZ Model.
The report gives results of a materials flow analysis performed for composting municipal solid waste (MSW) and specific biodegradable organic components of MSW. (NOTE: This work is part of an overall U.S. EPA project providing cost, energy, and materials flow information on diffe...
Flow cytogenetics and chromosome sorting.
Cram, L S
1990-06-01
This review of flow cytogenetics and chromosome sorting provides an overview of general information in the field and describes recent developments in more detail. From the early developments of chromosome analysis involving single parameter or one color analysis to the latest developments in slit scanning of single chromosomes in a flow stream, the field has progressed rapidly and most importantly has served as an important enabling technology for the human genome project. Technological innovations that advanced flow cytogenetics are described and referenced. Applications in basic cell biology, molecular biology, and clinical investigations are presented. The necessary characteristics for large number chromosome sorting are highlighted. References to recent review articles are provided as a starting point for locating individual references that provide more detail. Specific references are provided for recent developments.
Juan-García, Ana; Manyes, Lara; Ruiz, María-José; Font, Guillermina
2013-06-01
This review gives an overview of flow cytometry applications to toxicological studies of several physiological target sites of mycotoxins on different mammalian cell lines. Mycotoxins are secondary metabolites of fungi that may be present in food, feed, air and water. The increasing presence of mycotoxins in crops, their wide distribution in the food chain, and their potential for toxicity demonstrate the need for further knowledge. Flow cytometry has become a valuable tool in mycotoxin studies in recent years for the rapid analysis of single cells in a mixture. In toxicology, the power of these methods lies in the possibility of determining a wide range of cell parameters, providing valuable information to elucidate cell growth and viability, metabolic activity, mitochondrial membrane potential and membrane integrity mechanisms. There are studies using flow cytometry technique on Alternaria, Aspergillus, Fusarium and Penicillium mycotoxins including information about cell type, assay conditions and functional parameters. Most of the studies collected in the literature are on deoxynivalenol and zearalenone mycotoxins. Cell cycle analysis and apoptosis are the processes more widely investigated. Copyright © 2013 Elsevier Ltd. All rights reserved.
Information flow on social networks: from empirical data to situation understanding
NASA Astrophysics Data System (ADS)
Roy, Heather; Abdelzaher, Tarek; Bowman, Elizabeth K.; Al Amin, Md. Tanvir
2017-05-01
This paper describes characteristics of information flow on social channels, as a function of content type and relations among individual sources, distilled from analysis of Twitter data as well as human subject survey results. The working hypothesis is that individuals who propagate content on social media act (e.g., decide whether to relay information or not) in accordance with their understanding of the content, as well as their own beliefs and trust relations. Hence, the resulting aggregate content propagation pattern encodes the collective content interpretation of the underlying group, as well as their relations. Analysis algorithms are described to recover such relations from the observed propagation patterns as well as improve our understanding of the content itself in a language agnostic manner simply from its propagation characteristics. An example is to measure the degree of community polarization around contentious topics, identify the factions involved, and recognize their individual views on issues. The analysis is independent of the language of discourse itself, making it valuable for multilingual media, where the number of languages used may render language-specific analysis less scalable.
Genomic Instability at Premalignant and Early Stages of Breast Cancer Development
1999-08-01
... by routine DNA flow cytometry to determine DNA index (DI). ...vation. ERBB2 expression was detected with a commercially available antibody (Oncogene Sci...). ...ic microsatellite primers. We observed that the DNA so obtained ... preserved the proportionality of the different alleles as found in the original sample. ... supplements the information gained from ploidy analysis by DNA flow cytometry alone. In many cases where flow cytometry could not be performed because the ...
A Four-Level Hierarchy for Organizing Wildland Stream Resource Information
Harry Parrott; Daniel A. Marion; R. Douglas Perkinson
1989-01-01
An analysis of current USDA Forest Service methods of collecting and using wildland stream resource data indicates that required information can be organized into a four-level hierarchy. Information at each level is tiered with information at the preceding level. Level 1 is the ASSOCIATION, which is differentiated by stream size and flow regime. Level 2, STREAM TYPE,...
Małyska, Aleksandra; Maciąg, Kamil; Twardowski, Tomasz
2014-03-25
The issue of GMOs constantly arouses strong emotions in public discourse. At the same time, the opinions of people particularly interested in these issues, such as researchers or potential users of this technology (e.g. farmers), are rarely subjected to analysis. Moreover, lack of knowledge about the flow of information "from the laboratory to the consumer" hinders the implementation of any changes in this field. By using triangulation (combining quantitative and qualitative research and the use of various research tools) we explored the attitudes of Polish scientists, agricultural advisers and farmers (large-scale agricultural producers) to the use of GMOs in the economy. On the basis of the performed research we diagnosed the effectiveness of the information flow among these groups about transgenic organisms. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Galanti, Eli; Durante, Daniele; Finocchiaro, Stefano; Iess, Luciano; Kaspi, Yohai
2017-07-01
The upcoming Juno spacecraft measurements have the potential of improving our knowledge of Jupiter’s gravity field. The analysis of the Juno Doppler data will provide a very accurate reconstruction of spatial gravity variations, but these measurements will be very accurate only over a limited latitudinal range. In order to deduce the full gravity field of Jupiter, additional information needs to be incorporated into the analysis, especially regarding the Jovian flow structure and its depth, which can influence the measured gravity field. In this study we propose a new iterative method for the estimation of the Jupiter gravity field, using a simulated Juno trajectory, a trajectory estimation model, and an adjoint-based inverse model for the flow dynamics. We test this method both for zonal harmonics only and with a full gravity field including tesseral harmonics. The results show that this method can fit some of the gravitational harmonics better to the “measured” harmonics, mainly because of the added information from the dynamical model, which includes the flow structure. Thus, it is suggested that the method presented here has the potential of improving the accuracy of the expected gravity harmonics estimated from the Juno and Cassini radio science experiments.
The effect of hydrodynamic conditions on the phenotype of Pseudomonas fluorescens biofilms.
Simões, Manuel; Pereira, Maria O; Sillankorva, Sanna; Azeredo, Joana; Vieira, Maria J
2007-01-01
This study investigated the phenotypic characteristics of monoculture P. fluorescens biofilms grown under turbulent and laminar flow, using flow cell reactors with stainless steel substrata. The cellular physiology and the overall biofilm activity, structure and composition were characterized, and compared, between the hydrodynamically distinct conditions. The results indicate that turbulent flow-generated biofilm cells were significantly less extensive, with decreased metabolic activity and a lower protein and polysaccharide composition per cell than those from laminar flow-generated biofilms. The effect of flow regime did not cause significantly different outer membrane protein expression. From the analysis of biofilm activity, structure and composition, turbulent flow-generated biofilms were metabolically more active, had twice as much mass per cm(2), and higher cellular density and protein content (mainly cellular) than laminar flow-generated biofilms. Conversely, laminar flow-generated biofilms presented higher total and matrix polysaccharide contents. Direct visualisation and scanning electron microscopy analysis showed that these different flows generate structurally different biofilms, corroborating the quantitative results. The combination of applied methods provided useful information regarding a broad spectrum of biofilm parameters, which can contribute to controlling and modeling biofilm processes.
ERIC Educational Resources Information Center
Yang, Qinghua; Yang, Fan; Zhou, Chun
2015-01-01
Purpose: The purpose of this paper is to investigate how the information about haze, a term used in China to describe the air pollution problem, is portrayed on Chinese social media by different types of organizations using the theoretical framework of the health belief model (HBM). Design/methodology/approach: A content analysis was conducted…
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
Computer programs for large systems of normal equations, an interactive digital signal process, structural analysis of cylindrical thrust chambers, swirling turbulent axisymmetric recirculating flows in practical isothermal combustor geometrics, computation of three dimensional combustor performance, a thermal radiation analysis system, transient response analysis, and a software design analysis are summarized.
Development of image processing techniques for applications in flow visualization and analysis
NASA Technical Reports Server (NTRS)
Disimile, Peter J.; Shoe, Bridget; Toy, Norman; Savory, Eric; Tahouri, Bahman
1991-01-01
A comparison between two flow visualization studies of an axi-symmetric circular jet issuing into still fluid, using two different experimental techniques, is described. In the first case laser induced fluorescence is used to visualize the flow structure, whilst smoke is utilized in the second. Quantitative information was obtained from these visualized flow regimes using two different digital imaging systems. Results are presented of the rate at which the jet expands in the downstream direction and these compare favorably with the more established data.
Entropy and generalized least square methods in assessment of the regional value of streamgages
Markus, M.; Vernon, Knapp H.; Tasker, Gary D.
2003-01-01
The Illinois State Water Survey performed a study to assess the streamgaging network in the State of Illinois. One of the important aspects of the study was to assess the regional value of each station through an assessment of the information transfer among gaging records for low, average, and high flow conditions. This analysis was performed for the main hydrologic regions in the State, and the stations were initially evaluated using a new approach based on entropy analysis. To determine the regional value of each station within a region, several information parameters, including total net information, were defined based on entropy. Stations were ranked based on the total net information. For comparison, the regional value of the same stations was assessed using the generalized least square regression (GLS) method, developed by the US Geological Survey. Finally, a hybrid combination of GLS and entropy was created by including a function of the negative net information as a penalty function in the GLS. The weights of the combined model were determined to maximize the average correlation with the results of GLS and entropy. The entropy and GLS methods were evaluated using the high-flow data from southern Illinois stations. The combined method was compared with the entropy and GLS approaches using the high-flow data from eastern Illinois stations. © 2003 Elsevier B.V. All rights reserved.
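Entropy-based rankings of this kind are typically built from pairwise information measures between gauge records; the sketch below computes histogram-based mutual information for three synthetic stations and sums it per station. It illustrates the ingredient only; the paper's specific "total net information" measure and the GLS hybrid are not reproduced.

    import numpy as np

    def mutual_information(x, y, bins=10):
        """Histogram-based mutual information (nats) between two flow records."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        p = joint / joint.sum()
        px, py = p.sum(axis=1), p.sum(axis=0)
        nz = p > 0
        return float(np.sum(p[nz] * np.log(p[nz] / (px[:, None] * py[None, :])[nz])))

    rng = np.random.default_rng(8)
    base = rng.gamma(2.0, 50.0, size=3650)                              # shared regional signal
    gauges = [base + rng.normal(0.0, s, 3650) for s in (10, 20, 60)]    # three synthetic stations

    for i, g in enumerate(gauges):
        shared = sum(mutual_information(g, h) for j, h in enumerate(gauges) if j != i)
        print(f"station {i}: shared information with the rest = {shared:.2f} nats")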
A pilot study of river flow prediction in urban area based on phase space reconstruction
NASA Astrophysics Data System (ADS)
Adenan, Nur Hamiza; Hamid, Nor Zila Abd; Mohamed, Zulkifley; Noorani, Mohd Salmi Md
2017-08-01
River flow prediction is closely related to urban hydrology and can provide information to address problems such as flooding in urban areas. The daily river flow of the Klang River, Malaysia, was chosen for forecasting in this pilot study, which is based on phase space reconstruction. The reconstruction transforms the single variable of river flow data into an m-dimensional phase space, in which the dimension (m) is chosen from the optimal values of the Cao method. The results of the phase space reconstruction were then used in the forecasting process with the local linear approximation method. Our investigation indicates that river flow at the Klang River is chaotic, based on the Cao method analysis. The overall results give a good correlation coefficient, which is acceptable given that the study area is influenced by many factors. Therefore, this pilot study may be proposed for forecasting daily river flow data with the purpose of providing information about the flow of the river system in urban areas.
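A minimal sketch of the forecasting chain described above: time-delay embedding of a scalar river-flow series followed by local linear approximation on nearest neighbours. In practice the embedding dimension m would come from the Cao method; here it is simply passed in, and the delay, neighbour count, and function names are illustrative assumptions.

```python
# Hedged sketch: phase space reconstruction + local linear one-step forecast.
import numpy as np

def embed(series, m, tau=1):
    """Reconstruct the m-dimensional phase space from a scalar series."""
    n = len(series) - (m - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(m)])

def local_linear_forecast(series, m, tau=1, k=10):
    """One-step-ahead forecast using a linear fit on the k nearest neighbours."""
    X = embed(series, m, tau)
    targets = series[(m - 1) * tau + 1 :]      # value following each embedded vector
    X_hist, y_hist, query = X[:-1], targets, X[-1]
    d = np.linalg.norm(X_hist - query, axis=1)
    idx = np.argsort(d)[:k]
    A = np.column_stack([X_hist[idx], np.ones(k)])   # affine local model
    coef, *_ = np.linalg.lstsq(A, y_hist[idx], rcond=None)
    return float(np.append(query, 1.0) @ coef)
```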
Oil-Water Flow Investigations using Planar-Laser Induced Fluorescence and Particle Velocimetry
NASA Astrophysics Data System (ADS)
Ibarra, Roberto; Matar, Omar K.; Markides, Christos N.
2017-11-01
The study of the complex behaviour of immiscible liquid-liquid flow in pipes requires advanced measurement techniques to extract detailed in situ information. Laser-based diagnostics provide high-resolution, space- and time-resolved phase and velocity information, which helps improve the fundamental understanding of these flows and validate closure relations for advanced multiphase flow models. This work presents novel simultaneous planar laser-induced fluorescence and particle velocimetry measurements in stratified oil-water flows, using two laser light sheets at two different wavelengths for fluids with different refractive indices, at horizontal and slightly upward pipe inclinations (<5°) under stratified flow conditions (i.e. separated layers). Complex flow structures are extracted from 2-D instantaneous velocity fields and are strongly dependent on the pipe inclination at low velocities. The analysis of mean wall-normal velocity profiles and velocity fluctuations suggests the presence of single- and counter-rotating vortices in the azimuthal direction, especially in the oil layer, which can be attributed to the influence of the interfacial waves. Funding from BP and the TMF Consortium is gratefully acknowledged.
PREDICTING TURBINE STAGE PERFORMANCE
NASA Technical Reports Server (NTRS)
Boyle, R. J.
1994-01-01
This program was developed to predict turbine stage performance taking into account the effects of complex passage geometries. The method uses a quasi-3D inviscid-flow analysis iteratively coupled to calculated losses so that changes in losses result in changes in the flow distribution. In this manner the effects of both the geometry on the flow distribution and the flow distribution on losses are accounted for. The flow may be subsonic or shock-free transonic. The blade row may be fixed or rotating, and the blades may be twisted and leaned. This program has been applied to axial and radial turbines, and is helpful in the analysis of mixed flow machines. This program is a combination of the flow analysis programs MERIDL and TSONIC coupled to the boundary layer program BLAYER. The subsonic flow solution is obtained by a finite difference, stream function analysis. Transonic blade-to-blade solutions are obtained using information from the finite difference, stream function solution with a reduced flow factor. Upstream and downstream flow variables may vary from hub to shroud and provision is made to correct for loss of stagnation pressure. Boundary layer analyses are made to determine profile and end-wall friction losses. Empirical loss models are used to account for incidence, secondary flow, disc windage, and clearance losses. The total losses are then used to calculate stator, rotor, and stage efficiency. This program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370/3033 under TSS with a central memory requirement of approximately 4.5 Megs of 8 bit bytes. This program was developed in 1985.
Tan, C; Liu, W L; Dong, F
2016-06-28
Understanding of flow patterns and their transitions is significant to uncover the flow mechanics of two-phase flow. The local phase distribution and its fluctuations contain rich information regarding the flow structures. A wire-mesh sensor (WMS) was used to study the local phase fluctuations of horizontal gas-liquid two-phase flow, which was verified through comparing the reconstructed three-dimensional flow structure with photographs taken during the experiments. Each crossing point of the WMS is treated as a node, so the measurement on each node is the phase fraction in this local area. An undirected and unweighted flow pattern network was established based on connections that are formed by cross-correlating the time series of each node under different flow patterns. The structure of the flow pattern network reveals the relationship of the phase fluctuations at each node during flow pattern transition, which is then quantified by introducing the topological index of the complex network. The proposed analysis method using the WMS not only provides three-dimensional visualizations of the gas-liquid two-phase flow, but is also a thorough analysis for the structure of flow patterns and the characteristics of flow pattern transition. This article is part of the themed issue 'Supersensing through industrial process tomography'. © 2016 The Author(s).
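A minimal sketch of the network construction described above, assuming each wire-mesh crossing point contributes a local phase-fraction time series: edges connect node pairs whose zero-lag cross-correlation exceeds a threshold, and a simple topological index (average clustering) summarises the structure. The threshold value and choice of index are illustrative assumptions, not the paper's exact settings.

```python
# Hedged sketch: flow pattern network from cross-correlated WMS node series.
import numpy as np
import networkx as nx

def flow_pattern_network(node_series, threshold=0.75):
    """node_series: array of shape (n_nodes, n_samples) of local phase fractions."""
    x = node_series - node_series.mean(axis=1, keepdims=True)
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    corr = x @ x.T                      # zero-lag normalised cross-correlation
    g = nx.Graph()
    g.add_nodes_from(range(len(node_series)))
    rows, cols = np.where(np.triu(np.abs(corr), k=1) > threshold)
    g.add_edges_from(zip(rows.tolist(), cols.tolist()))
    return g, nx.average_clustering(g)
```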
Sensory Information Systems Program
2012-03-06
Fragmentary abstract; recoverable topics include cochlear implants developed by Dr. Les Atlas (U. Washington), with Dr. Jay Rebenstein developing commercial applications; AFRL Eglin measurements; wide field-of-view optic flow (http://www.avl.umd.edu/); Microautonomous Systems and Technology; autonomous steering transition to Army MAST; neural analysis of optic flow (Wehling, AFRL/RW); and insect multisensory integration (S. Sane, Tata Institute).
An Exploratory Study of Interactivity in Visualization Tools: "Flow" of Interaction
ERIC Educational Resources Information Center
Liang, Hai-Ning; Parsons, Paul C.; Wu, Hsien-Chi; Sedig, Kamran
2010-01-01
This paper deals with the design of interactivity in visualization tools. There are several factors that can be used to guide the analysis and design of the interactivity of these tools. One such factor is flow, which is concerned with the duration of interaction with visual representations of information--interaction being the actions performed…
NASA Astrophysics Data System (ADS)
Hawkins, T. T.; Brand, B. D.; Sarrochi, D.; Pollock, N.
2016-12-01
One of the greatest challenges volcanologists face is the ability to extrapolate information about eruption dynamics and emplacement conditions from deposits. Pyroclastic density current (PDC) deposits are particularly challenging given the wide range of initial current conditions (e.g., granular, fluidized, concentrated, dilute) and the rapid flow transformations due to interaction with evolving topography. Analysis of particle shape-fabric can be used to determine flow direction and may help to understand the rheological characteristics of the flows. However, extracting shape-fabric information from outcrop (2D) apparent fabric is limited, especially when outcrop exposure is incomplete or lacks context. To better understand and quantify the complex flow dynamics reflected in PDC deposits, we study the complete shape-fabric data in 3D using oriented samples. In the field, the prospective sample is carved from the unconsolidated deposit in blocks, the dimensions of which depend on the average clast size in the sample. The sample is saturated in situ with a water-based sodium silicate solution, then wrapped in plaster-soaked gauze to form a protective cast. The orientation of the sample is recorded on the block faces. The samples dry for five days and are then extracted in intact blocks. In the lab, the sample is vacuum impregnated with sodium silicate and cured in an oven. The fully lithified sample is first cut in plan view to identify the orientations of the long axes of the grains (flow direction), and then cut in the two planes perpendicular to grain elongation. 3D fabric analysis is performed on high-resolution images of the cut faces using computer-assisted image analysis software devoted to shape-fabric analysis. Here we present the results for samples taken from the 18 May 1980 PDC deposit facies, including massive, diffuse-stratified and cross-stratified lapilli tuff. We show a relationship between the strength of iso-orientation of the elongated particles and different facies architectures, which is used to interpret the rheological conditions of the flow. We chose the 18 May PDC deposits because their well-exposed and well-studied outcrops provide context, which allows us to test the method and extract information useful for interpreting ancient deposits that lack context.
Audebert, M; Clément, R; Moreau, S; Duquennoi, C; Loisel, S; Touze-Foltz, N
2016-09-01
Landfill bioreactors are based on an acceleration of in-situ waste biodegradation by performing leachate recirculation. To quantify the water content and to evaluate the leachate injection system, in-situ methods are required to obtain spatially distributed information, usually electrical resistivity tomography (ERT). In a previous study, the MICS (multiple inversions and clustering strategy) methodology was proposed to improve the hydrodynamic interpretation of ERT results by a precise delimitation of the infiltration area. In this study, MICS was applied on two ERT time-lapse data sets recorded on different waste deposit cells in order to compare the hydrodynamic behaviour of leachate flow between the two cells. This comparison is based on an analysis of: (i) the volume of wetted waste assessed by MICS and the wetting rate, (ii) the infiltration shapes and (iii) the pore volume used by the leachate flow. This paper shows that leachate hydrodynamic behaviour is comparable from one waste deposit cell to another with: (i) a high leachate infiltration speed at the beginning of the infiltration, which decreases with time, (ii) a horizontal anisotropy of the leachate infiltration shape and (iii) a very small fraction of the pore volume used by the leachate flow. This hydrodynamic information derived from MICS results can be useful for subsurface flow modelling used to predict leachate flow at the landfill scale. Copyright © 2016 Elsevier Ltd. All rights reserved.
Numerical Analysis of Incipient Separation on 53 Deg Swept Diamond Wing
NASA Technical Reports Server (NTRS)
Frink, Neal T.
2015-01-01
A systematic analysis of incipient separation and subsequent vortex formation from moderately swept blunt leading edges is presented for a 53 deg swept diamond wing. This work contributes to a collective body of knowledge generated within the NATO/STO AVT-183 Task Group titled 'Reliable Prediction of Separated Flow Onset and Progression for Air and Sea Vehicles'. The objective is to extract insights from the experimentally measured and numerically computed flow fields that might enable turbulence experts to further improve their models for predicting swept blunt leading-edge flow separation. Details of vortex formation are inferred from numerical solutions after establishing a good correlation of the global flow field and surface pressure distributions between wind tunnel measurements and computed flow solutions. From this, significant and sometimes surprising insights into the nature of incipient separation and part-span vortex formation are derived from the wealth of information available in the computational solutions.
Effective connectivity of facial expression network by using Granger causality analysis
NASA Astrophysics Data System (ADS)
Zhang, Hui; Li, Xiaoting
2013-10-01
Functional magnetic resonance imaging (fMRI) is an advanced non-invasive data acquisition technique for investigating neural activity in the human brain. In addition to localizing the functional brain regions activated by a specific cognitive task, fMRI can also be used to measure task-related functional interactions among the active regions of interest (ROI) in the brain. Among the variety of analysis tools proposed for modeling the connectivity of brain regions, Granger causality analysis (GCA) measures the directions of information interactions by looking for lagged effects among brain regions. In this study, we use fMRI and Granger causality analysis to investigate the effective connectivity of the brain network induced by viewing several kinds of expressional faces. We focus on four kinds of facial expression stimuli: fearful, angry, happy and neutral faces. Five face-selective regions of interest are localized and the effective connectivity within these regions is measured for the expressional faces. Our results, based on 8 subjects, showed significant effective connectivity from STS to amygdala, from amygdala to OFA, aFFA and pFFA, from STS to aFFA and from pFFA to aFFA. This result suggests that there is an information flow from the STS to the amygdala when viewing expressional faces. This emotional expression information, conveyed by the STS and amygdala, flows back to the face-selective regions in the occipital-temporal lobes, constructing an emotional face processing network.
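A minimal sketch of pairwise Granger causality between two ROI time series, assuming the statsmodels F-test implementation; the variable names, lag choice, and ROI pairing (e.g., STS to amygdala) are illustrative assumptions rather than the study's exact pipeline.

```python
# Hedged sketch: Granger causality p-value between two ROI signals.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def granger_p_value(source, target, maxlag=3):
    """p-value of the F-test that `source` Granger-causes `target`."""
    data = np.column_stack([target, source])   # statsmodels tests column 2 -> column 1
    res = grangercausalitytests(data, maxlag=maxlag, verbose=False)
    return res[maxlag][0]["ssr_ftest"][1]

# Example usage (hypothetical signals): p = granger_p_value(sts_signal, amygdala_signal)
```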
A novel method to measure regional muscle blood flow continuously using NIRS kinetics information
Nioka, Shoko; Kime, Ryotaro; Sunar, Ulas; Im, Joohee; Izzetoglu, Meltem; Zhang, Jun; Alacam, Burak; Chance, Britton
2006-01-01
Background This article introduces a novel method to continuously monitor regional muscle blood flow using Near Infrared Spectroscopy (NIRS). We demonstrate the feasibility of the new method in two ways: (1) by applying this new method of determining blood flow to experimental NIRS data during exercise and ischemia; and (2) by simulating muscle oxygenation and blood flow values using these newly developed equations during recovery from exercise and ischemia. Methods Deoxy- (Hb) and oxyhemoglobin (HbO2), located in the blood of the skeletal muscle, carry two internal relationships between blood flow and oxygen consumption. One is a mass transfer principle and the other describes a relationship between oxygen consumption and Hb kinetics in a two-compartment model. To monitor blood flow continuously, we transfer these two relationships into two equations and calculate the blood flow from the differential information of HbO2 and Hb. In addition, these equations are used to simulate the relationship between blood flow and reoxygenation kinetics after cuff ischemia and a light exercise. Nine healthy subjects volunteered for the cuff ischemia, light arm exercise and arm exercise with cuff ischemia in the experimental study. Results Analysis of experimental data from both cuff ischemia and light exercise using the new equations shows greater blood flow (four to six times more than resting values) during recovery, agreeing with previous findings. Further, the simulation and experimental studies of cuff ischemia and light exercise agree with each other. Conclusion We demonstrate the accuracy of this new method by showing that the blood flow obtained from the method agrees with previous data as well as with simulated data. We conclude that this novel continuous blood flow monitoring method can provide blood flow information non-invasively with NIRS. PMID:16704736
Time series analysis of the Antarctic Circumpolar Wave via symbolic transfer entropy
NASA Astrophysics Data System (ADS)
Oh, Mingi; Kim, Sehyun; Lim, Kyuseong; Kim, Soo Yong
2018-06-01
An attempt to interpret a large-scale climate phenomenon in the Southern Ocean (SO), the Antarctic Circumpolar Wave (ACW), has been made using an information entropy method, symbolic transfer entropy (STE). Over the 50-60°S latitude belt, information flow among four climate variables, sea surface temperature (SST), sea-ice edge (SIE), sea level pressure (SLP) and meridional wind speed (MWS), is examined. We found a tendency for eastward flow of information to be preferred only for the oceanic variables, which is a main characteristic of the ACW, an eastward wave making a circuit around Antarctica. Since the ACW is a coherent pattern in both ocean and atmosphere, it is reasonable to infer that this tendency reflects the Antarctic Circumpolar Current (ACC) encircling Antarctica rather than evidence of the ACW itself. We observed one common feature for all four variables: a strong information flow over the eastern Pacific Ocean, which suggests a signature of the El Niño Southern Oscillation (ENSO).
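A minimal sketch of symbolic transfer entropy, assuming ordinal-pattern (Bandt-Pompe style) symbolization followed by a plug-in transfer entropy estimate with one-step histories; the embedding dimension and history length are illustrative assumptions, not the study's exact settings.

```python
# Hedged sketch: symbolic transfer entropy between two climate series.
import numpy as np
from collections import Counter
from itertools import permutations

def symbolize(x, m=3):
    """Map each length-m window to the index of its ordinal pattern."""
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([patterns[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def transfer_entropy(sym_src, sym_dst):
    """TE(src -> dst) in nats, with one-step symbol histories."""
    trip = list(zip(sym_dst[1:], sym_dst[:-1], sym_src[:-1]))
    n = len(trip)
    p_xyz = Counter(trip)
    p_yz = Counter((y, z) for _, y, z in trip)
    p_xy = Counter((x, y) for x, y, _ in trip)
    p_y = Counter(y for _, y, _ in trip)
    te = 0.0
    for (x, y, z), c in p_xyz.items():
        te += (c / n) * np.log((c * p_y[y]) / (p_xy[(x, y)] * p_yz[(y, z)]))
    return te
```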
Correlations and flow of information between the New York Times and stock markets
NASA Astrophysics Data System (ADS)
García-Medina, Andrés; Sandoval, Leonidas; Bañuelos, Efraín Urrutia; Martínez-Argüello, A. M.
2018-07-01
We use Random Matrix Theory (RMT) and information theory to analyze the correlations and flow of information between 64,939 news items from The New York Times and 40 world financial indices during 10 months over the period 2015-2016. The set of news items is quantified and transformed into daily polarity time series using tools from sentiment analysis. The results show that a common factor influences the world indices and the news, which even share the same dynamics. Furthermore, the global correlation structure is found to be preserved when adding white noise, which indicates that the correlations are not due to sample size effects. Likewise, we find a considerable amount of information flowing from news to world indices at some specific delays, which is of practical interest for trading purposes. Our results suggest a deep relationship between news and world indices, and show a situation where news drive world market movements, giving new evidence to support behavioral finance as the current economic paradigm.
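A minimal sketch of the RMT step implicit in the description above: eigenvalues of the empirical correlation matrix of the polarity and index-return series are compared against the Marchenko-Pastur upper edge expected for pure noise. The input layout and function name are illustrative assumptions.

```python
# Hedged sketch: eigenvalues above the Marchenko-Pastur bound carry structure.
import numpy as np

def informative_eigenvalues(series):
    """series: array (T, N) of standardized daily series (news polarity + indices)."""
    T, N = series.shape
    corr = np.corrcoef(series, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)
    q = N / T
    lam_max = (1 + np.sqrt(q)) ** 2          # Marchenko-Pastur upper edge
    return eigvals[eigvals > lam_max]        # eigenvalues not explained by noise
```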
Using U.S. Geological Survey data in material flow analysis: An introduction
Sibley, S.F.
2009-01-01
A few sources of basic data on worldwide raw materials production and consumption exist that are independently developed and freely available to the public. This column is an introduction to the types of information available from the U.S. Geological Survey (USGS) and explains how the data are assembled. The kind of information prepared by the USGS is essential to U.S. materials flow studies because the data make it possible to conduct these studies within a global context. The data include primary and secondary (scrap) production, consumption and stocks (mostly limited to the United States unless calculated), trade (not readily available for all countries), and prices for more than 80 mineral commodities. Materials flow studies by USGS specialists using these data are continuing (http://minerals.usgs.gov/minerals/mflow/). Figure 1 shows where the data are collected and where they are used. Minerals information was downloaded by users 5.8 million times from USGS minerals information Web pages in 2008.
Debris flow-induced topographic changes: effects of recurrent debris flow initiation.
Chen, Chien-Yuan; Wang, Qun
2017-08-12
Chushui Creek in Shengmu Village, Nantou County, Taiwan, was analyzed for recurrent debris flow using numerical modeling and geographic information system (GIS) spatial analysis. The two-dimensional water flood and mudflow simulation program FLO-2D was used to simulate debris flow induced by rainfall during typhoon Herb in 1996 and typhoon Mindulle in 2004. Changes in topographic characteristics after the debris flows were simulated for the initiation of hydrological characteristics, magnitude, and affected area. Changes in topographic characteristics included those in elevation, slope, aspect, stream power index (SPI), topographic wetness index (TWI), and hypsometric curve integral (HI), all of which were analyzed using GIS spatial analysis. The results show that the SPI and peak discharge in the basin increased after a recurrence of debris flow. The TWI was higher in 2003 than in 2004 and indicated a higher potential for landslide initiation when the slope of the basin was steeper. The HI revealed that the basin was in its mature stage and was shifting toward the old stage. Numerical simulation demonstrated that the mean depth, maximum depth, affected area, mean flow rate, maximum flow rate, and peak flow discharge all increased after recurrent debris flow, and peak discharge occurred quickly.
[The application of Doppler broadening and Doppler shift to spectral analysis].
Xu, Wei; Fang, Zi-shen
2002-08-01
The distinction between Doppler broadening and Doppler shift is analyzed. Doppler broadening results locally from the distribution of velocities of the emitting particles, and the line width gives information on the temperature of the emitting particles. Doppler shift results when the emitting particles have a bulk, non-random flow velocity in a particular direction; the drift of the central wavelength gives information on the flow velocity of the emitting particles, and the Doppler shift only displaces the line profile without changing its width. The difference between Gaussian fitting and the distribution of the chord-integrated line shape has also been discussed. The distribution of the H alpha spectral line shape has been derived from the surface of the limiter in the HT-6M Tokamak with optical spectroscopic multichannel analysis (OSMA); the result of double Gaussian fitting shows that the line shape is made up of two parts, the emission from reflected particles with higher energy and the emission from particles released from the limiter surface. Ion temperature and recycling particle flow velocity have been obtained from the Doppler broadening and Doppler shift.
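A minimal worked sketch of the two quantities discussed above: ion temperature from the Gaussian Doppler width and bulk flow velocity from the Doppler shift of the line centre. The use of the FWHM as the fitted width and the SI constants are assumptions about the fitting output, not details given in the abstract.

```python
# Hedged sketch: temperature from Doppler broadening, velocity from Doppler shift.
import numpy as np

C = 2.998e8          # speed of light, m/s
K_B = 1.381e-23      # Boltzmann constant, J/K

def temperature_from_fwhm(fwhm, lambda0, mass):
    """T = m c^2 / (8 ln2 k_B) * (FWHM / lambda0)^2 for a thermal Gaussian line."""
    return mass * C**2 / (8 * np.log(2) * K_B) * (fwhm / lambda0) ** 2

def flow_velocity_from_shift(delta_lambda, lambda0):
    """Non-relativistic Doppler shift: v = c * delta_lambda / lambda0."""
    return C * delta_lambda / lambda0
```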
Consumables and wastes estimations for the First Lunar Outpost
NASA Technical Reports Server (NTRS)
Theis, Ronald L. A.; Ballin, Mark G.; Evert, Martha F.
1992-01-01
The First Lunar Outpost mission is a design reference mission for the first human return to the moon. This paper describes a set of consumables and waste material estimations made on the basis of the First Lunar Outpost mission scenario developed by the NASA Exploration Programs Office. The study includes the definition of a functional interface framework and a top-level set of consumables and waste materials to be evaluated, the compilation of mass flow information from mission developers supplemented with information from the literature, and the analysis of the resulting mass flow information to gain insight about the possibility of material flow integration between the moon outpost elements. The results of the study of the details of the piloted mission and the habitat are used to identify areas where integration of consumables and wastes across different mission elements could provide possible launch mass savings.
Application guide for AFINCH (Analysis of Flows in Networks of Channels) described by NHDPlus
Holtschlag, David J.
2009-01-01
AFINCH (Analysis of Flows in Networks of CHannels) is a computer application that can be used to generate a time series of monthly flows at stream segments (flowlines) and water yields for catchments defined in the National Hydrography Dataset Plus (NHDPlus) value-added attribute system. AFINCH provides a basis for integrating monthly flow data from streamgages, water-use data, monthly climatic data, and land-cover characteristics to estimate natural monthly water yields from catchments by user-defined regression equations. Images of monthly water yields for active streamgages are generated in AFINCH and provide a basis for detecting anomalies in water yields, which may be associated with undocumented flow diversions or augmentations. Water yields are multiplied by the drainage areas of the corresponding catchments to estimate monthly flows. Flows from catchments are accumulated downstream through the streamflow network described by the stream segments. For stream segments where streamgages are active, ratios of measured to accumulated flows are computed. These ratios are applied to upstream water yields to proportionally adjust estimated flows to match measured flows. Flow is conserved through the NHDPlus network. A time series of monthly flows can be generated for stream segments that average about 1 mile in length, or monthly water yields for catchments that average about 1 square mile. Estimated monthly flows can be displayed within AFINCH, examined for nonstationarity, and tested for monotonic trends. Monthly flows also can be used to estimate flow-duration characteristics at stream segments. AFINCH generates output files of monthly flows and water yields that are compatible with ArcMap, a geographical information system analysis and display environment. Choropleth maps of monthly water yield and flow can be generated and analyzed within ArcMap by joining NHDPlus data structures with AFINCH output. Matlab code for the AFINCH application is presented.
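A minimal sketch of the accumulation-and-adjustment idea described above (AFINCH itself is implemented in Matlab and works on NHDPlus structures): catchment yields are converted to flows, accumulated downstream through a topologically ordered segment network, and proportionally rescaled above gaged segments so that accumulated flow matches the measurement. The data structures and function name here are illustrative assumptions, not the AFINCH/NHDPlus formats.

```python
# Hedged sketch: downstream accumulation with proportional adjustment at gages.
def accumulate_and_adjust(segments, downstream, yields, areas, gaged_flows):
    """
    segments: segment ids in upstream-to-downstream (topological) order
    downstream: dict segment -> next downstream segment (None at outlets)
    yields: dict segment -> estimated monthly water yield (flow per unit area)
    areas: dict segment -> catchment drainage area
    gaged_flows: dict segment -> measured monthly flow at active streamgages
    """
    accum = {s: yields[s] * areas[s] for s in segments}   # catchment contributions
    upstream_of = {s: [] for s in segments}
    for s in segments:                                    # accumulate downstream
        d = downstream.get(s)
        if d is not None:
            accum[d] += accum[s]
            upstream_of[d].append(s)
    for s in segments:                                    # rescale above gages
        if s in gaged_flows and accum[s] > 0:
            ratio = gaged_flows[s] / accum[s]
            stack = [s]
            while stack:
                seg = stack.pop()
                accum[seg] *= ratio
                stack.extend(upstream_of[seg])
    return accum
```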
Effect of current vehicle’s interruption on traffic stability in cooperative car-following theory
NASA Astrophysics Data System (ADS)
Zhang, Geng; Liu, Hui
2017-12-01
To reveal the impact of the current vehicle’s interruption information on traffic flow, a new car-following model with consideration of the current vehicle’s interruption is proposed and the influence of the current vehicle’s interruption on traffic stability is investigated through theoretical analysis and numerical simulation. By linear analysis, the linear stability condition of the new model is obtained and the negative influence of the current vehicle’s interruption on traffic stability is shown in the headway-sensitivity space. Through nonlinear analysis, the modified Korteweg-de Vries (mKdV) equation of the new model near the critical point is derived and it can be used to describe the propagating behavior of the traffic density wave. Finally, numerical simulation confirms the analytical results, which shows that the current vehicle’s interruption information can destabilize traffic flow and should be considered in real traffic.
Kamp, Marcel A; Slotty, Philipp; Turowski, Bernd; Etminan, Nima; Steiger, Hans-Jakob; Hänggi, Daniel; Stummer, Walter
2012-03-01
Intraoperative measurements of cerebral blood flow are of interest during vascular neurosurgery. Near-infrared indocyanine green (ICG) fluorescence angiography was introduced for visualizing vessel patency intraoperatively; however, quantitative information has not been available. Here we report our experience with a microscope with an integrated dynamic ICG fluorescence analysis system supplying semiquantitative information on blood flow. We recorded ICG fluorescence curves of cortex and cerebral vessels using software integrated into the surgical microscope (Flow 800 software; Zeiss Pentero) in 30 patients undergoing surgery for different pathologies. The following hemodynamic parameters were assessed: maximum intensity, rise time, time to peak, time to half-maximal fluorescence, cerebral blood flow index, and transit times from arteries to cortex. For patients without an obvious perfusion deficit, maximum fluorescence intensity was 177.7 arbitrary intensity units (AIs; 5-mg ICG bolus), mean rise time was 5.2 seconds (range, 2.9-8.2 seconds; SD, 1.3 seconds), mean time to peak was 9.4 seconds (range, 4.9-15.2 seconds; SD, 2.5 seconds), mean cerebral blood flow index was 38.6 AI/s (range, 13.5-180.6 AI/s; SD, 36.9 AI/s), and mean transit time was 1.5 seconds (range, 360 milliseconds-3 seconds; SD, 0.73 seconds). For 3 patients with impaired cerebral perfusion, time to peak, rise time, and transit time between arteries and cortex were markedly prolonged (>20, >9, and >5 seconds). In single patients, the degree of perfusion impairment could be quantified by the cerebral blood flow index ratios between normal and ischemic tissue. Transit times also reflected blood flow perturbations in arteriovenous fistulas. Quantification of ICG-based fluorescence angiography appears to be useful for intraoperative monitoring of arterial patency and regional cerebral blood flow.
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; MacMurdy, Dale E.; Kapania, Rakesh K.
1994-01-01
Strong interactions between flow about an aircraft wing and the wing structure can result in aeroelastic phenomena which significantly impact aircraft performance. Time-accurate methods for solving the unsteady Navier-Stokes equations have matured to the point where reliable results can be obtained with reasonable computational costs for complex non-linear flows with shock waves, vortices and separations. The ability to combine such a flow solver with a general finite element structural model is key to an aeroelastic analysis in these flows. Earlier work involved time-accurate integration of modal structural models based on plate elements. A finite element model was developed to handle three-dimensional wing boxes, and incorporated into the flow solver without the need for modal analysis. Static condensation is performed on the structural model to reduce the structural degrees of freedom for the aeroelastic analysis. Direct incorporation of the finite element wing-box structural model with the flow solver requires finding adequate methods for transferring aerodynamic pressures to the structural grid and returning deflections to the aerodynamic grid. Several schemes were explored for handling the grid-to-grid transfer of information. The complex, built-up nature of the wing-box complicated this transfer. Aeroelastic calculations for a sample wing in transonic flow comparing various simple transfer schemes are presented and discussed.
Fukuyama, Atsushi; Isoda, Haruo; Morita, Kento; Mori, Marika; Watanabe, Tomoya; Ishiguro, Kenta; Komori, Yoshiaki; Kosugi, Takafumi
2017-01-01
Introduction: We aim to elucidate the effect of spatial resolution of three-dimensional cine phase contrast magnetic resonance (3D cine PC MR) imaging on the accuracy of the blood flow analysis, and examine the optimal setting for spatial resolution using flow phantoms. Materials and Methods: The flow phantom has five types of acrylic pipes that represent human blood vessels (inner diameters: 15, 12, 9, 6, and 3 mm). The pipes were fixed with 1% agarose containing 0.025 mol/L gadolinium contrast agent. A blood-mimicking fluid with human blood property values was circulated through the pipes at a steady flow. Magnetic resonance (MR) images (three-directional phase images with speed information and magnitude images for information of shape) were acquired using the 3-Tesla MR system and receiving coil. Temporal changes in spatially-averaged velocity and maximum velocity were calculated using hemodynamic analysis software. We calculated the error rates of the flow velocities based on the volume flow rates measured with a flowmeter and examined measurement accuracy. Results: When the acrylic pipe was the size of the thoracicoabdominal or cervical artery and the ratio of pixel size for the pipe was set at 30% or lower, spatially-averaged velocity measurements were highly accurate. When the pixel size ratio was set at 10% or lower, maximum velocity could be measured with high accuracy. It was difficult to accurately measure maximum velocity of the 3-mm pipe, which was the size of an intracranial major artery, but the error for spatially-averaged velocity was 20% or less. Conclusions: Flow velocity measurement accuracy of 3D cine PC MR imaging for pipes with inner sizes equivalent to vessels in the cervical and thoracicoabdominal arteries is good. The flow velocity accuracy for the pipe with a 3-mm-diameter that is equivalent to major intracranial arteries is poor for maximum velocity, but it is relatively good for spatially-averaged velocity. PMID:28132996
Progress in fuel systems to meet new fuel economy and emissions standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-12-31
This publication includes information describing the latest developments within the automotive industry on fuel system hardware and control strategies. Contents include: Slow heating process of a heated pintle-type gasoline fuel injector; Mixture preparation measurements; Study of fuel flow rate change in injector for methanol fueled S.I. engine; Flow and structural analysis for fuel pressure regulator performance; A new method to analyze fuel behavior in a spark ignition engine; Throttle body at engine idle -- tolerance effect on flow rate; and more.
Identification of internal flow dynamics in two experimental catchments
Hansen, D.P.; Jakeman, A.J.; Kendall, C.; Weizu, G.
1997-01-01
Identification of the internal flow dynamics in catchments is difficult because of the lack of information in precipitation-stream discharge time series alone. Two experimental catchments, Hydrohill and Nandadish, near Nanjing in China, have been set up to monitor internal flows reaching the catchment stream at various depths, from the surface runoff to the bedrock. With analysis of the precipitation against these internal discharges, it is possible to quantify the time constants and volumes associated with various flowpaths in both catchments.
Detection of Abnormal Events via Optical Flow Feature Analysis
Wang, Tian; Snoussi, Hichem
2015-01-01
In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and a classification method. The histogram of the optical flow orientation descriptor is detailed as a means of describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
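A minimal sketch of this kind of pipeline: a magnitude-weighted histogram of optical-flow orientations is built per frame, and a one-class SVM trained on normal frames flags abnormal ones. The OpenCV and scikit-learn calls are standard, but the bin count, Farneback parameters, and SVM settings are illustrative assumptions (and the paper's kernel PCA step is omitted here).

```python
# Hedged sketch: histogram-of-optical-flow descriptor + one-class SVM scoring.
import cv2
import numpy as np
from sklearn.svm import OneClassSVM

def hof_descriptor(prev_gray, gray, bins=8):
    """Magnitude-weighted histogram of flow orientations for one frame pair."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

# Train on descriptors from normal frames, then score new frames:
# clf = OneClassSVM(kernel="rbf", nu=0.1).fit(normal_descriptors)
# is_abnormal = clf.predict(test_descriptors) == -1
```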
Flow of GE90 Turbofan Engine Simulated
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
1999-01-01
The objective of this task was to create and validate a three-dimensional model of the GE90 turbofan engine (General Electric) using the APNASA (average passage) flow code. This was a joint effort between GE Aircraft Engines and the NASA Lewis Research Center. The goal was to perform an aerodynamic analysis of the engine primary flow path, in under 24 hours of CPU time, on a parallel distributed workstation system. Enhancements were made to the APNASA Navier-Stokes code to make it faster and more robust and to allow for the analysis of more arbitrary geometry. The resulting simulation exploited the use of parallel computations by using two levels of parallelism, with extremely high efficiency. The primary flow path of the GE90 turbofan consists of a nacelle and inlet, 49 blade rows of turbomachinery, and an exhaust nozzle. Secondary flows entering and exiting the primary flow path-such as bleed, purge, and cooling flows-were modeled macroscopically as source terms to accurately simulate the engine. The information on these source terms came from detailed descriptions of the cooling flow and from thermodynamic cycle system simulations. These provided boundary condition data to the three-dimensional analysis. A simplified combustor was used to feed boundary conditions to the turbomachinery. Flow simulations of the fan, high-pressure compressor, and high- and low-pressure turbines were completed with the APNASA code.
Tailings dam-break flow - Analysis of sediment transport
NASA Astrophysics Data System (ADS)
Aleixo, Rui; Altinakar, Mustafa
2015-04-01
A common solution for storing mining debris is to build tailings dams near the mining site. These dams are usually built with local materials such as mining debris and are more vulnerable than concrete dams (Rico et al. 2008). The tailings and the pond water generally contain heavy metals and various toxic chemicals used in ore extraction. Thus, the release of tailings due to a dam-break can have severe ecological consequences for the environment. A tailings dam-break has many similarities with a common dam-break flow. It is highly transient and can be severely destructive. However, a significant difference is that the released sediment-water mixture behaves as a non-Newtonian flow. Existing numerical models used to simulate dam-break flows do not correctly represent the non-Newtonian behavior of tailings under a dam-break flow and may lead to unrealistic and incorrect results. The need for experiments to extract both qualitative and quantitative information regarding these flows is therefore real and pressing. The present paper explores an existing experimental database presented in Aleixo et al. (2014a,b) to further characterize sediment transport under conditions of a severe transient flow and to extract quantitative information regarding sediment flow rate, sediment velocity, and sediment-sediment interactions, among others. Different features of the flow are also described and analyzed in detail. The analysis is made by means of imaging techniques such as Particle Image Velocimetry and Particle Tracking Velocimetry that allow extracting not only the velocity field but also the Lagrangian description of the sediments. An analysis of the results is presented and the limitations of the experimental approach are discussed. References: Rico, M., Benito, G., Salgueiro, A.R., Diez-Herrero, A. and Pereira, H.G. (2008) Reported tailings dam failures: A review of the European incidents in the worldwide context, Journal of Hazardous Materials, 152, 846-852. Aleixo, R., Ozeren, Y., Altinakar, M. and Wren, D. (2014a) Velocity Measurements using Particle Tracking in Tailings Dam Failure Experiments, Proceedings of the 3rd IAHR-Europe Conference, Porto, Portugal. Aleixo, R., Ozeren, Y., Altinakar, M. (2014b) Tailings Dam-Break Analysis by Means of a Combined PIV-PTV Tool, Proceedings of the River Flow Conference, Lausanne, Switzerland.
Information Flow in an Atmospheric Model and Data Assimilation
ERIC Educational Resources Information Center
Yoon, Young-noh
2011-01-01
Weather forecasting consists of two processes, model integration and analysis (data assimilation). During the model integration, the state estimate produced by the analysis evolves to the next cycle time according to the atmospheric model to become the background estimate. The analysis then produces a new state estimate by combining the background…
2012-12-01
Keywords: flows, diversity, emergence, networks, fusion, strategic planning, information sharing, ecosystem, hierarchy, NJ Regional Operations Intelligence. Recoverable table-of-contents entries: Related Information; Production of Disaster Intelligence; Intelligence for Field Personnel; Focused Collection Efforts to Support FEMA and NJ OEM Operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, E.M.
1977-02-01
Poster sessions were used as a vehicle of information exchange. Of the 101 posters presented, abstracts were received for 71. The 71 abstracts presented are concerned with cell-cycle analysis by flow cytometry, flow microfluorometric DNA measurements, application of microfluorometry to cancer chemotherapy, automated classification of neutrophils, and other aspects of automated cytology. (HLW)
Impact of trucking network flow on preferred biorefinery locations in the southern United States
Timothy M. Young; Lee D. Han; James H. Perdue; Stephanie R. Hargrove; Frank M. Guess; Xia Huang; Chung-Hao Chen
2017-01-01
The impact of the trucking transportation network flow was modeled for the southern United States. The study addresses a gap in existing research by applying a Bayesian logistic regression and Geographic Information System (GIS) geospatial analysis to predict biorefinery site locations. A one-way trucking cost assuming a 128.8 km (80-mile) haul distance was estimated...
Extended shortest path selection for package routing of complex networks
NASA Astrophysics Data System (ADS)
Ye, Fan; Zhang, Lei; Wang, Bing-Hong; Liu, Lu; Zhang, Xing-Yi
The routing strategy plays a very important role in complex networks such as the Internet and peer-to-peer networks. However, most previous work concentrates only on path selection, e.g. Flooding and Random Walk, or on finding the shortest path (SP), and rarely considers local load information, as in SP and Distance Vector Routing. Flow-based Routing mainly considers load balance and still cannot achieve the best optimization. Thus, in this paper, we propose a novel dynamic routing strategy on complex networks that incorporates local load information into the SP algorithm to enhance traffic flow routing optimization. We found that the flow in a network is greatly affected by waiting times in the network, so one should not only choose an optimized path for package transmission but also consider node congestion. As a result, packages should be transmitted along a globally optimized path with smaller congestion and relatively short distance. Analysis and simulation experiments show that the proposed algorithm can largely enhance the network flow, achieving maximum throughput within an acceptable computing time. A detailed analysis of the algorithm is also provided to explain its efficiency.
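A minimal sketch of the idea of folding local load into a shortest-path search: each hop cost is penalised by the queue length at the next node before running Dijkstra. The weighting parameter, queue representation, and graph format are illustrative assumptions, not the paper's exact algorithm.

```python
# Hedged sketch: congestion-aware shortest-path routing.
import networkx as nx

def congestion_aware_path(g, source, target, queue_len, beta=1.0):
    """g: networkx Graph; queue_len: dict node -> current queued packages."""
    def weight(u, v, edge_data):
        # Base hop cost plus a penalty for entering a loaded node.
        return edge_data.get("weight", 1.0) + beta * queue_len.get(v, 0)
    return nx.dijkstra_path(g, source, target, weight=weight)
```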
Analysis of energy flow during playground surface impacts.
Davidson, Peter L; Wilson, Suzanne J; Chalmers, David J; Wilson, Barry D; Eager, David; McIntosh, Andrew S
2013-10-01
The amount of energy dissipated away from or returned to a child falling onto a surface will influence fracture risk but is not considered in current standards for playground impact-attenuating surfaces. A two-mass rheological computer simulation was used to model energy flow within the wrist and surface during hand impact with playground surfaces, and the potential of this approach to provide insights into such impacts and predict injury risk examined. Acceleration data collected on-site from typical playground surfaces and previously obtained data from children performing an exercise involving freefalling with a fully extended arm provided input. The model identified differences in energy flow properties between playground surfaces and two potentially harmful surface characteristics: more energy was absorbed by (work done on) the wrist during both impact and rebound on rubber surfaces than on bark, and rubber surfaces started to rebound (return energy to the wrist) while the upper limb was still moving downward. Energy flow analysis thus provides information on playground surface characteristics and the impact process, and has the potential to identify fracture risks, inform the development of safer impact-attenuating surfaces, and contribute to development of new energy-based arm fracture injury criteria and tests for use in conjunction with current methods.
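A minimal sketch in the spirit of the two-mass approach described above, assuming an effective arm mass landing on a surface mass, each coupled through a spring-damper element, with the energy dissipated in each damper integrated over the impact. All parameter values and the specific rheological layout are illustrative assumptions, not the study's validated model.

```python
# Hedged sketch: two-mass spring-damper impact with damper energy bookkeeping.
def impact_energy(m_arm=3.0, m_surf=5.0, k_wrist=2e4, c_wrist=80.0,
                  k_surf=5e4, c_surf=400.0, v0=3.0, dt=1e-5, t_end=0.1):
    x1 = x2 = 0.0            # arm and surface displacements (m)
    v1, v2 = v0, 0.0         # arm arrives at v0, surface initially at rest
    e_wrist = e_surf = 0.0   # energy dissipated in each damper (J)
    for _ in range(int(t_end / dt)):
        f_wrist = k_wrist * (x1 - x2) + c_wrist * (v1 - v2)   # arm-surface element
        f_surf = k_surf * x2 + c_surf * v2                    # surface-ground element
        a1 = -f_wrist / m_arm
        a2 = (f_wrist - f_surf) / m_surf
        e_wrist += c_wrist * (v1 - v2) ** 2 * dt              # damper power * dt
        e_surf += c_surf * v2 ** 2 * dt
        v1 += a1 * dt; v2 += a2 * dt
        x1 += v1 * dt; x2 += v2 * dt
    return e_wrist, e_surf
```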
NASA Astrophysics Data System (ADS)
Mallast, U.; Gloaguen, R.; Geyer, S.; Rödiger, T.; Siebert, C.
2011-08-01
In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxiliary information and finally evaluated in terms of hydro-geological significance. Using the example of the western catchment of the Dead Sea (Israel/Palestine), the orientation and location of the differentiated lineaments are compared to characteristics of known structural features. We demonstrate that a strong correlation between lineaments and structural features exists. Using Euclidean distances between lineaments and wells provides an assessment criterion to evaluate the hydraulic significance of detected lineaments. Based on this analysis, we suggest that the statistical analysis of lineaments allows a delineation of flow-paths and thus significant information on groundwater movements. To validate the flow-paths we compare them to existing results of groundwater models that are based on well data.
NASA Astrophysics Data System (ADS)
Wang, Yunong; Cheng, Rongjun; Ge, Hongxia
2017-08-01
In this paper, a lattice hydrodynamic model is derived that considers not only the effect of the flow rate difference but also a delayed feedback control signal that includes more comprehensive information. The control method is used to analyze the stability of the model. Furthermore, the critical condition for linearly stable traffic flow is deduced, and numerical simulation is carried out to investigate the advantage of the proposed model with and without the effect of the flow rate difference and the control signal. The results are consistent with the corresponding theoretical analysis.
NASA Astrophysics Data System (ADS)
Wei, Jun; Zhou, Chuan; Chan, Heang-Ping; Chughtai, Aamer; Agarwal, Prachi; Kuriakose, Jean; Hadjiiski, Lubomir; Patel, Smita; Kazerooni, Ella
2015-03-01
We are developing a computer-aided detection system to assist radiologists in detection of non-calcified plaques (NCPs) in coronary CT angiograms (cCTA). In this study, we performed quantitative analysis of arterial flow properties in each vessel branch and extracted flow information to differentiate the presence and absence of stenosis in a vessel segment. Under rest conditions, blood flow in a single vessel branch was assumed to follow Poiseuille's law. For a uniform pressure distribution, two quantitative flow features, the normalized arterial compliance per unit length (Cu) and the normalized volumetric flow (Q) along the vessel centerline, were calculated based on the parabolic Poiseuille solution. The flow features were evaluated for a two-class classification task to differentiate NCP candidates obtained by prescreening as true NCPs and false positives (FPs) in cCTA. For evaluation, a data set of 83 cCTA scans was retrospectively collected from 83 patient files with IRB approval. A total of 118 NCPs were identified by experienced cardiothoracic radiologists. The correlation between the two flow features was 0.32. The discriminatory ability of the flow features evaluated as the area under the ROC curve (AUC) was 0.65 for Cu and 0.63 for Q in comparison with AUCs of 0.56-0.69 from our previous luminal features. With stepwise LDA feature selection, volumetric flow (Q) was selected in addition to three other luminal features. With FROC analysis, the test results indicated a reduction of the FP rates to 3.14, 1.98, and 1.32 FPs/scan at sensitivities of 90%, 80%, and 70%, respectively. The study indicated that quantitative blood flow analysis has the potential to provide useful features for the detection of NCPs in cCTA.
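A minimal sketch of the Poiseuille relation underlying the flow features above. The paper's normalized compliance (Cu) and normalized volumetric flow (Q) are derived from the parabolic Poiseuille solution along the centerline; this example only evaluates the classical volumetric-flow law segment by segment, and all inputs (radii, pressure drops, viscosity) are illustrative assumptions.

```python
# Hedged sketch: Poiseuille volumetric flow evaluated along a vessel centerline.
import numpy as np

def poiseuille_flow(radius, pressure_drop, length, viscosity=3.5e-3):
    """Q = pi * r^4 * dP / (8 * mu * L) for steady laminar flow in a tube."""
    return np.pi * radius**4 * pressure_drop / (8 * viscosity * length)

def flow_along_centerline(radii, pressure_drops, lengths):
    """Evaluate Q segment by segment along an extracted vessel centerline."""
    return [poiseuille_flow(r, dp, l) for r, dp, l in zip(radii, pressure_drops, lengths)]
```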
OpinionFlow: Visual Analysis of Opinion Diffusion on Social Media.
Wu, Yingcai; Liu, Shixia; Yan, Kai; Liu, Mengchen; Wu, Fangzhao
2014-12-01
It is important for many different applications such as government and business intelligence to analyze and explore the diffusion of public opinions on social media. However, the rapid propagation and great diversity of public opinions on social media pose great challenges to effective analysis of opinion diffusion. In this paper, we introduce a visual analysis system called OpinionFlow to empower analysts to detect opinion propagation patterns and glean insights. Inspired by the information diffusion model and the theory of selective exposure, we develop an opinion diffusion model to approximate opinion propagation among Twitter users. Accordingly, we design an opinion flow visualization that combines a Sankey graph with a tailored density map in one view to visually convey diffusion of opinions among many users. A stacked tree is used to allow analysts to select topics of interest at different levels. The stacked tree is synchronized with the opinion flow visualization to help users examine and compare diffusion patterns across topics. Experiments and case studies on Twitter data demonstrate the effectiveness and usability of OpinionFlow.
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1972-01-01
The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions, which are adjusted to simulate the information flow being studied.
2017-01-01
Background Health care conferences present a unique opportunity to network, spark innovation, and disseminate novel information to a large audience, but the dissemination of information typically stays within very specific networks. Social network analysis can be adopted to understand the flow of information between virtual social communities and the role of patients within the network. Objective The purpose of this study is to examine the impact engaged patients bring to health care conference social media information flow and how they expand dissemination and distribution of tweets compared to other health care conference stakeholders such as physicians and researchers. Methods From January 2014 through December 2016, 7,644,549 tweets were analyzed from 1672 health care conferences with at least 1000 tweets each that had registered in Symplur's Health Care Hashtag Project from 2014 to 2016. The tweet content was analyzed to create a list of the top 100 influencers by mention from each conference, who were then categorized by stakeholder group. Multivariate linear regression models were created using stepwise function building to identify factors explaining variability as predictor variables, with conference tweets taken as the dependent variable. Results Inclusion of engaged patients in health care conference social media was low compared to that of physicians and has not significantly changed over the last 3 years. When engaged patient voices are included in health care conferences, they greatly increase information flow as measured by total tweet volume (beta=301.6) compared to physicians (beta=137.3, P<.001), expand propagation of information tweeted during a conference as measured by social media impressions created (beta=1,700,000) compared to physicians (beta=270,000, P<.001), and deepen engagement in the tweet conversation as measured by replies to their tweets (beta=24.4) compared to physicians (beta=5.5, P<.001). Social network analysis of hubs and authorities revealed that patients had statistically significantly higher hub scores (mean 8.26×10^-4, SD 2.96×10^-4) compared to other stakeholder groups' Twitter accounts (mean 7.19×10^-4, SD 3.81×10^-4; t(273.84)=4.302, P<.001). Conclusions Although engaged patients are powerful accelerators of information flow, expanders of tweet propagation, and greatly deepen engagement in tweet conversations on health care conference social media compared to physicians, they represent only 1.4% of the stakeholder mix of the top 100 influencers in the conversation. Health care conferences that fail to engage patients in their proceedings may risk limiting their engagement with the public, disseminating scientific information to a narrow community and slowing the flow of information across social media channels. PMID:28818821
Federal-State Cooperative Program in Kansas, seminar proceedings, July 1985
Huntzinger, T.L.
1985-01-01
During the past few years, water-resource management in Kansas has undergone reorientation with the creation of the Kansas Water Authority and the Kansas Water Office. New thrusts toward long-term goals based on the Kansas State Water Plan demand strong communication and coordination among all water-related agencies within the State. The seminar discussed in this report was an initial step by the Kansas Water Office to assure the continued presence of a technical-coordination process and to provide an opportunity for the U.S. Geological Survey to summarize its technical-informational activities in Kansas for the benefit of State and Federal water agencies within the State. The seminar was held on July 8 and 9, 1985, in Lawrence, Kansas. The agenda included a summary of the data-collection activities and short synopses of projects completed within the past year and those currently underway. The data program discussions described the information obtained at the surface-water, groundwater, water-quality, and sediment sites in Kansas. Interpretive projects summarized included studies in groundwater modeling, areal hydrologic analysis, regional analysis of floods, low-flow, high-flow, and flow-volume characteristics, water quality of groundwater and lakes, and traveltime and transit-loss analysis. (USGS)
Observation of airplane flow fields by natural condensation effects
NASA Technical Reports Server (NTRS)
Campbell, James F.; Chambers, Joseph R.; Rumsey, Christopher L.
1988-01-01
In-flight condensation patterns can illustrate a variety of airplane flow fields, such as attached and separated flows, vortex flows, and expansion and shock waves. These patterns are a unique source of flow visualization that has not been utilized previously. Condensation patterns at full-scale Reynolds number can provide useful information for researchers experimenting in subscale tunnels. It is also shown that computed values of relative humidity in the local flow field provide an inexpensive way to analyze the qualitative features of the condensation pattern, although a more complete theoretical modeling is necessary to obtain details of the condensation process. Furthermore, the analysis revealed that relative humidity is more sensitive to changes in local static temperature than to changes in pressure.
Estimates of Lava Eruption Rates at Alba Patera, Mars
NASA Technical Reports Server (NTRS)
Baloga, S. M.; Pieri, D. C.
1985-01-01
The Martian volcanic complex Alba Patera exhibits a suite of well-defined, long and relatively narrow lava flows qualitatively resembling those found in Hawaii. Even without any information on the duration of the Martian flows, estimates of eruption rates (total volume discharged divided by the duration of the extrusion) are implied by the physical dimensions of the flows and the likely conjecture that Stefan-Boltzmann radiation is the dominant thermal loss mechanism. The ten flows in this analysis emanate radially from the central vent and were recently measured in length, plan area, and average thickness by shadow measurement techniques. The dimensions of interest are shown. Although perhaps morphologically congruent to certain Hawaiian flows, the dramatically expanded physical dimensions of the Martian flows argue for markedly distinct differences in lava flow composition or eruption characteristics.
Spike Code Flow in Cultured Neuronal Networks.
Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime; Kamimura, Takuya; Yagi, Yasushi; Mizuno-Matsumoto, Yuko; Chen, Yen-Wei
2016-01-01
We observed spike trains produced by one-shot electrical stimulation with 8 × 8 multielectrodes in cultured neuronal networks. Each electrode accepted spikes from several neurons. We extracted short codes from the spike trains and obtained a code spectrum with a nominal time accuracy of 1%. We then constructed code flow maps as movies over the electrode array to observe the flow of the codes "1101" and "1011," typical pseudorandom sequences of the kind often encountered in the literature and in our experiments. The codes appeared to flow from one electrode to a neighboring one while maintaining their shape to some extent. To quantify the flow, we calculated the "maximum cross-correlations" among neighboring electrodes to find the direction of maximum flow of the codes with lengths less than 8. Normalized maximum cross-correlations were almost constant irrespective of the code. Furthermore, when the spike trains were shuffled in interval order or across electrodes, the correlations became significantly smaller. Thus, the analysis suggested that local codes of approximately constant shape propagated and conveyed information across the network. Hence, the codes can serve as visible and trackable marks of propagating spike waves as well as a means of evaluating information flow in the neuronal network.
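The "maximum cross-correlation" used above to infer the direction of code flow can be sketched as follows, assuming each electrode's activity has been reduced to a binary time series of code occurrences; the series, lag window, and shift are illustrative rather than the authors' exact procedure.

```python
import numpy as np

def max_crosscorr(x, y, max_lag=20):
    """Maximum normalized cross-correlation between x and y over lags 1..max_lag
    (y delayed relative to x), and the lag at which it occurs."""
    x = (x - x.mean()) / (x.std() + 1e-12)
    y = (y - y.mean()) / (y.std() + 1e-12)
    best_r, best_lag = -np.inf, 0
    for lag in range(1, max_lag + 1):
        r = np.mean(x[:-lag] * y[lag:])   # correlation with y shifted by `lag` samples
        if r > best_r:
            best_r, best_lag = r, lag
    return best_r, best_lag

rng = np.random.default_rng(0)
src = (rng.random(1000) < 0.1).astype(float)       # code occurrences at a source electrode
dst = np.roll(src, 5) * (rng.random(1000) < 0.8)   # neighbor receives the code ~5 samples later
r, lag = max_crosscorr(src, dst)
print(f"max normalized cross-correlation {r:.2f} at lag {lag}")
```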
Alternatives to current flow cytometry data analysis for clinical and research studies.
Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul
2018-02-01
Flow cytometry has well-established methods for data analysis based on traditional data-collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now standard on virtually every new instrument, so users can accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already available to the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge for traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes that are difficult to define without more advanced analytical tools. In settings such as clinical labs where fast and accurate data analysis is a priority, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis, any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.
Fantini, Bernardino
2006-01-01
From its first proposal, the Central Dogma had a graphical form, complete with arrows of different types, and this form quickly became its standard presentation. In different scientific contexts, arrows have different meanings and in this particular case the arrows indicated the flow of information among different macromolecules. A deeper analysis illustrates that the arrows also imply a causal statement, directly connected to the causal role of genetic information. The author suggests a distinction between two different kinds of causal links, defined as 'physical causality' and 'biological determination', both implied in the production of biological specificity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faranda, Davide, E-mail: davide.faranda@cea.fr; Dubrulle, Bérengère; Daviaud, François
We introduce a novel way to extract information from turbulent datasets by applying an Auto Regressive Moving Average (ARMA) statistical analysis. Such analysis goes well beyond the analysis of the mean flow and of the fluctuations and links the behavior of the recorded time series to a discrete version of a stochastic differential equation which is able to describe the correlation structure in the dataset. We introduce a new index Υ that measures the difference between the resulting analysis and the Obukhov model of turbulence, the simplest stochastic model reproducing both the Richardson law and the Kolmogorov spectrum. We test the method on datasets measured in a von Kármán swirling flow experiment. We found that the ARMA analysis is well correlated with spatial structures of the flow, and can discriminate between two different flows with comparable mean velocities, obtained by changing the forcing. Moreover, we show that Υ is highest in regions where shear layer vortices are present, thereby establishing a link between deviations from the Kolmogorov model and coherent structures. These deviations are consistent with the ones observed by computing the Hurst exponents for the same time series. We show that some salient features of the analysis are preserved when considering global instead of local observables. Finally, we analyze flow configurations with multistability features where the ARMA technique is efficient in discriminating different stability branches of the system.
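A minimal sketch of the ARMA-fitting step described above, using statsmodels on a synthetic velocity-like series with BIC-based order selection; the paper's index Υ and the comparison with the Obukhov model are not reproduced here, so this only illustrates the model-identification stage.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# Synthetic stand-in for a single-point velocity record from the experiment.
n = 2000
eps = rng.standard_normal(n)
u = np.zeros(n)
for t in range(2, n):                       # an ARMA(2,1)-like process
    u[t] = 0.6 * u[t-1] - 0.2 * u[t-2] + eps[t] + 0.3 * eps[t-1]

# Select the ARMA(p, q) order by BIC over a small grid.
best = None
for p in range(1, 4):
    for q in range(0, 3):
        res = ARIMA(u, order=(p, 0, q)).fit()
        if best is None or res.bic < best[0]:
            best = (res.bic, p, q)
print(f"selected ARMA order (p, q) = ({best[1]}, {best[2]}), BIC = {best[0]:.1f}")
```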
An open-source solution for advanced imaging flow cytometry data analysis using machine learning.
Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew
2017-01-01
Imaging flow cytometry (IFC) enables the high-throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high-content, information-rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Compensated and corrected image data files from an imaging flow cytometer (the proprietary .cif format, derived from raw image .rif files) are imported into the open-source software CellProfiler, where an image-processing pipeline identifies cells and subcellular compartments, allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches on "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye, including subtle measured differences in label-free detection channels such as bright-field and dark-field imagery. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
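A minimal sketch of the supervised-classification step described above, assuming a per-cell table of morphological features has already been exported by the image-processing pipeline; the synthetic features, labels, and the choice of a random forest are illustrative stand-ins for the classifier trained in CellProfiler Analyst.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Synthetic stand-in for a per-cell feature table: one row per cell, a few numeric
# morphological features, and a manually assigned label (e.g., cell cycle phase).
n = 600
labels = rng.choice(["G1", "S", "G2M"], size=n)
shift = {"G1": 0.0, "S": 1.0, "G2M": 2.0}
features = pd.DataFrame({
    "area":      [rng.normal(100 + 30 * shift[l], 10) for l in labels],
    "perimeter": [rng.normal(40 + 8 * shift[l], 4) for l in labels],
    "intensity": [rng.normal(0.5 + 0.2 * shift[l], 0.05) for l in labels],
})

X_train, X_test, y_train, y_test = train_test_split(
    features.values, labels, test_size=0.25, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```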
Hydrogen fluoride (HF) substance flow analysis for safe and sustainable chemical industry.
Kim, Junbeum; Hwang, Yongwoo; Yoo, Mijin; Chen, Sha; Lee, Ik-Mo
2017-11-01
In this study, the chemical substance flow of hydrogen fluoride (hydrofluoric acid, HF) in domestic chemical industries in 2014 was analyzed in order to provide basic material and information for the establishment of an organized management system to ensure safety during HF applications. A total of 44,751 tons of HF was produced by four domestic companies in 2014; imports amounted to 95,984 tons in 2014, up from 21,579 tons in 2005. The export amount of HF was 2180 tons, of which 2074 tons (China, 1422 tons; the U.S., 524 tons; and Malaysia, 128 tons) was exported for the manufacturing of semiconductors. Based on the export and import amounts, it can be inferred that HF was used for manufacturing semiconductors. The industrial applications of 161,123 tons of HF were as follows: manufacturing of basic inorganic chemical substances (27,937 tons), manufacturing of other chemical products such as detergents (28,208 tons), manufacturing of flat displays (24,896 tons), and manufacturing of glass container packaging (22,002 tons). In this study, an analysis of the chemical substance flow showed that HF was mainly used in the semiconductor industry as well as in glass container manufacturing. Combined with other risk management tools and approaches in the chemical industry, chemical substance flow analysis (CSFA) can be a useful tool and method for assessment and management. The current CSFA results provide useful information for policy making in the chemical industry and national systems. Graphical abstract: Hydrogen fluoride chemical substance flows in 2014 in South Korea.
Which catchment characteristics control the temporal dependence structure of daily river flows?
NASA Astrophysics Data System (ADS)
Chiverton, Andrew; Hannaford, Jamie; Holman, Ian; Corstanje, Ron; Prudhomme, Christel; Bloomfield, John; Hess, Tim
2014-05-01
A hydrological classification system would provide information about the dominant processes in the catchment, enabling information to be transferred between catchments. Currently there is no widely agreed-upon system for classifying river catchments. This paper developed a novel approach to assess the influence that catchment characteristics have on the precipitation-to-flow relationship, using a catchment classification based on the average temporal dependence structure in daily river flow data over the period 1980 to 2010. Temporal dependence in river flow data is driven by the flow pathways, connectivity and storage within the catchment. Temporal dependence was analysed by creating temporally averaged semi-variograms for a set of 116 near-natural catchments (in order to prevent direct anthropogenic disturbances influencing the results) distributed throughout the UK. Cluster analysis, using the variogram, classified the catchments into four well-defined clusters driven by the interaction of catchment characteristics, predominantly characteristics which influence the precipitation-to-flow relationship. Geology, depth to gleyed layer in soils, slope of the catchment and the percentage of arable land were significantly different between the clusters. These characteristics drive the temporal dependence structure by influencing the rate at which water moves through the catchment and/or the storage in the catchment. Arable land is correlated with several other variables, and hence serves as a proxy for the residence time of the water in the catchment. Finally, quadratic discriminant analysis was used to show that a model with five catchment characteristics is able to predict the temporal dependence structure for ungauged catchments. This work demonstrates that a variogram-based approach is a powerful and flexible methodology for grouping catchments based on the precipitation-to-flow relationship which could be applied to any set of catchments with a relatively complete daily river flow record.
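A minimal sketch of the temporally averaged semi-variogram underlying the classification above, gamma(h) = 0.5·E[(Q(t+h) − Q(t))²], computed on a synthetic daily flow series; the standardization step and the 30-day lag range are assumptions, not necessarily the authors' exact preprocessing.

```python
import numpy as np

def temporal_semivariogram(q, max_lag=30):
    """Empirical semi-variogram gamma(h) = 0.5 * E[(q(t+h) - q(t))^2] for h = 1..max_lag days."""
    gamma = np.empty(max_lag)
    for h in range(1, max_lag + 1):
        diff = q[h:] - q[:-h]
        gamma[h - 1] = 0.5 * np.mean(diff ** 2)
    return gamma

rng = np.random.default_rng(2)
# Synthetic daily flow: seasonal cycle plus slowly varying noise, standardized before comparison.
t = np.arange(365 * 5)
q = 10 + 5 * np.sin(2 * np.pi * t / 365) + np.cumsum(rng.standard_normal(t.size)) * 0.05
q = (q - q.mean()) / q.std()

gamma = temporal_semivariogram(q, max_lag=30)
print(np.round(gamma[:10], 3))   # semi-variance at lags 1..10 days
```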
Methodology for CFD Design Analysis of National Launch System Nozzle Manifold
NASA Technical Reports Server (NTRS)
Haire, Scot L.
1993-01-01
The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. An Euler solution was computed with a fully implicit compressible flow solver. Post-processing consisted of full 3D color graphics and mass-averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick-turnaround CFD analysis of the next iteration in the manifold design.
Nursing Information Flow in Long-Term Care Facilities.
Wei, Quan; Courtney, Karen L
2018-04-01
Long-term care (LTC), residential care requiring 24-hour nursing services, plays an important role in the health care service delivery system. The purpose of this study was to identify the needed clinical information and information flow to support LTC Registered Nurses (RNs) in care collaboration and clinical decision making. This descriptive qualitative study combines direct observations and semistructured interviews, conducted at Alberta's LTC facilities between May 2014 and August 2015. The constant comparative method (CCM) of joint coding was used for data analysis. Nine RNs from six LTC facilities participated in the study. The RN practice environment includes two essential RN information management aspects: information resources and information spaces. Ten commonly used information resources by RNs included: (1) RN-personal notes; (2) facility-specific templates/forms; (3) nursing processes/tasks; (4) paper-based resident profile; (5) daily care plans; (6) RN-notebooks; (7) medication administration records (MARs); (8) reporting software application (RAI-MDS); (9) people (care providers); and (10) references (i.e., books). Nurses used a combination of shared information spaces, such as the Nurses Station or RN-notebook, and personal information spaces, such as personal notebooks or "sticky" notes. Four essential RN information management functions were identified: collection, classification, storage, and distribution. Six sets of information were necessary to perform RN care tasks and communication, including: (1) admission, discharge, and transfer (ADT); (2) assessment; (3) care plan; (4) intervention (with two subsets: medication and care procedure); (5) report; and (6) reference. Based on the RN information management system requirements, a graphic information flow model was constructed. This baseline study identified key components of a current LTC nursing information management system. The information flow model may assist health information technology (HIT) developers to consolidate the design of HIT solutions for LTC, and serve as a communication tool between nurses and information technology (IT) staff to refine requirements and support further LTC HIT research. Schattauer GmbH Stuttgart.
HOPE information system review
NASA Astrophysics Data System (ADS)
Suzuki, Yoshiaki; Nishiyama, Kenji; Ono, Shuuji; Fukuda, Kouin
1992-08-01
An overview of the review conducted on the H-2 Orbiting Plane (HOPE) is presented. A prototype model was constructed by inputting various technical information proposed by related laboratories. In particular, an operation flow was constructed that clarifies the correlations between analysis items, judgement criteria, technical data, and interfaces with other systems. Technical information database and retrieval systems were studied. A Macintosh personal computer was selected for information shaping because of its excellent functionality, performance, operability, and software completeness.
Hodgkins, Glenn A.; Norris, J. Michael; Lent, Robert M.
2014-01-01
Long-term streamflow information is critical for use in several water-related areas that are important to humans and wildlife, including water management, computation of flood and drought flows for water infrastructure, and analysis of climate-related trends. Specific uses are many and diverse and range from informing water rights across state and international boundaries to designing dams and bridges.
A Systems Analysis of Strike Naval Aviation Training
2013-06-01
Information flows from external nodes (yellow) through the model design (gray nodes); arrows represent the direction of information flow. Elements used multiple times need to be established as external functions accessible by all subroutines, and variables and constants must be defined up front. In Figure 38, proficiency threshold breaches are highlighted to indicate when the resulting skill proficiency drops below the threshold.
Normalizing the causality between time series.
Liang, X San
2015-08-01
Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
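As a hedged illustration of the quantity being normalized, the sketch below implements the unnormalized linear maximum-likelihood information-flow estimator from Liang's earlier formulation, T₂→₁ = (C₁₁C₁₂C₂,d1 − C₁₂²C₁,d1)/(C₁₁²C₂₂ − C₁₁C₁₂²), where Cᵢⱼ are sample covariances and Cᵢ,d1 is the covariance of series i with the forward difference of series 1; this is only an illustrative approximation, and the normalization step of the paper itself is not reproduced.

```python
import numpy as np

def liang_info_flow(x1, x2, dt=1.0):
    """Unnormalized estimate of the information flow T_{2->1} (from x2 to x1),
    following Liang's linear maximum-likelihood estimator. Assumes unit time step unless dt given."""
    dx1 = (x1[1:] - x1[:-1]) / dt              # forward difference of x1
    x1, x2 = x1[:-1], x2[:-1]
    C = np.cov(np.vstack([x1, x2]))            # C[0,0]=C11, C[0,1]=C12, C[1,1]=C22
    c1d1 = np.cov(x1, dx1)[0, 1]               # C_{1,d1}
    c2d1 = np.cov(x2, dx1)[0, 1]               # C_{2,d1}
    num = C[0, 0] * C[0, 1] * c2d1 - C[0, 1] ** 2 * c1d1
    den = C[0, 0] ** 2 * C[1, 1] - C[0, 0] * C[0, 1] ** 2
    return num / den

rng = np.random.default_rng(3)
n = 5000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(1, n):                          # x2 drives x1, not vice versa
    x2[t] = 0.7 * x2[t-1] + rng.standard_normal()
    x1[t] = 0.5 * x1[t-1] + 0.4 * x2[t-1] + rng.standard_normal()
print("T_{2->1} =", round(liang_info_flow(x1, x2), 4))
print("T_{1->2} =", round(liang_info_flow(x2, x1), 4))
```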
Survey of Media Forms and Information Flow Models in Microsystems Companies
NASA Astrophysics Data System (ADS)
Durugbo, Christopher; Tiwari, Ashutosh; Alcock, Jeffery R.
The paper presents the findings of a survey of 40 microsystems companies that was carried out to determine the use and the purpose of use of media forms and information flow models within these companies. These companies as 'product-service systems' delivered integrated products and services to realise customer solutions. Data collection was carried out by means of an online survey over 3 months. The survey revealed that 42.5% of respondents made use of data flow diagrams and 10% made use of design structure matrices. The survey also suggests that a majority of companies (75%) made use of textual and diagrammatic media forms for communication, analysis, documentation and representation during design and development processes. The paper also discusses the implications of the survey findings to product-service systems.
NASA Astrophysics Data System (ADS)
Rotzoll, K.; Izuka, S. K.; Nishikawa, T.; Fienen, M. N.; El-Kadi, A. I.
2016-12-01
Some of the volcanic-rock aquifers of the islands of Hawaii are substantially developed, leading to concerns related to the effects of groundwater withdrawals on saltwater intrusion and stream base-flow reduction. A numerical modeling analysis using recent available information (e.g., recharge, withdrawals, hydrogeologic framework, and conceptual models of groundwater flow) advances current understanding of groundwater flow and provides insight into the effects of human activity and climate change on Hawaii's water resources. Three island-wide groundwater-flow models (Kauai, Oahu, and Maui) were constructed using MODFLOW 2005 coupled with the Seawater-Intrusion Package (SWI2), which simulates the transition between saltwater and freshwater in the aquifer as a sharp interface. This approach allowed coarse vertical discretization (maximum of two layers) without ignoring the freshwater-saltwater system at the regional scale. Model construction (FloPy3), parameter estimation (PEST), and analysis of results were streamlined using Python scripts. Model simulations included pre-development (1870) and recent (average of 2001-10) scenarios for each island. Additionally, scenarios for future withdrawals and climate change were simulated for Oahu. We present our streamlined approach and results showing estimated effects of human activity on the groundwater resource by quantifying decline in water levels, rise of the freshwater-saltwater interface, and reduction in stream base flow. Water-resource managers can use this information to evaluate consequences of groundwater development that can constrain future groundwater availability.
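The sharp-interface treatment mentioned above can be illustrated in its simplest static form by the Ghyben-Herzberg relation, in which the interface depth below sea level is roughly 40 times the freshwater head above sea level; the heads and densities below are hypothetical, and this is a back-of-the-envelope check rather than the MODFLOW/SWI2 model itself.

```python
# Ghyben-Herzberg relation: z = rho_f / (rho_s - rho_f) * h  (interface depth below sea level)
rho_f, rho_s = 1000.0, 1025.0        # freshwater and seawater densities, kg/m^3
alpha = rho_f / (rho_s - rho_f)      # ~40 for typical densities

for h in [0.5, 1.0, 2.0]:            # hypothetical freshwater heads above sea level, m
    z = alpha * h
    print(f"head {h:.1f} m above sea level -> interface ~{z:.0f} m below sea level")

# Under this approximation, a 0.5 m decline in head implies ~20 m of interface rise,
# which is why groundwater withdrawals raise saltwater-intrusion concerns.
drop = 0.5
print(f"interface rise for a {drop} m head decline: ~{alpha * drop:.0f} m")
```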
NASA Technical Reports Server (NTRS)
Moin, Parviz; Spalart, Philippe R.
1987-01-01
The use of simulation data bases for the examination of turbulent flows is an effective research tool. Studies of the structure of turbulence have been hampered by the limited number of probes and the impossibility of measuring all desired quantities. Also, flow visualization is confined to the observation of passive markers with limited field of view and contamination caused by time-history effects. Computed flow fields are a new resource for turbulence research, providing all the instantaneous flow variables in three-dimensional space. Simulation data bases also provide much-needed information for phenomenological turbulence modeling. Three-dimensional velocity and pressure fields from direct simulations can be used to compute all the terms in the transport equations for the Reynolds stresses and the dissipation rate. However, only a few geometrically simple flows have been computed by direct numerical simulation, and the inventory of simulations does not fully address the current modeling needs in complex turbulent flows. The availability of three-dimensional flow fields also poses challenges in developing new techniques for their analysis, techniques based on experimental methods, some of which are used here for the analysis of direct-simulation data bases in studies of the mechanics of turbulent flows.
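A minimal sketch of the first step in the transport-equation budgets mentioned above: forming the Reynolds-stress tensor ⟨uᵢ′uⱼ′⟩ from instantaneous velocity fields by averaging over homogeneous directions. The field shapes, the random data, and the choice of x and z as homogeneous directions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical instantaneous velocity fields, shape (3, nx, ny, nz): components u, v, w.
# Assume x and z are statistically homogeneous directions (e.g., channel flow).
u = rng.standard_normal((3, 64, 48, 64))

# Mean profile: average over the homogeneous directions, leaving a wall-normal (y) profile.
u_mean = u.mean(axis=(1, 3), keepdims=True)
u_fluct = u - u_mean                           # fluctuating part u_i'

# Reynolds-stress tensor <u_i' u_j'>(y): a 3x3 symmetric tensor at each y location.
ny = u.shape[2]
R = np.einsum('ixyz,jxyz->ijy', u_fluct, u_fluct) / (u.shape[1] * u.shape[3])
print(R.shape)            # (3, 3, ny)
print(R[0, 1, ny // 2])   # <u'v'> at mid-height
```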
Globalization and International Student Mobility: A Network Analysis
ERIC Educational Resources Information Center
Shields, Robin
2013-01-01
This article analyzes changes to the network of international student mobility in higher education over a 10-year period (1999-2008). International student flows have increased rapidly, exceeding 3 million in 2009, and extensive data on mobility provide unique insight into global educational processes. The analysis is informed by three theoretical…
DOT National Transportation Integrated Search
1974-08-01
Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...
Social Network Analysis of the Farabi Exchange Program: Student Mobility
ERIC Educational Resources Information Center
Ugurlu, Zeynep
2016-01-01
Problem Statement: Exchange programs offer communication channels created through student and instructor exchanges; a flow of information takes place through these channels. The Farabi Exchange Program (FEP) is a student and instructor exchange program between institutions of higher education. Through the use of social network analysis and…
APPLYING OPERATIONAL ANALYSIS TO URBAN EDUCATIONAL SYSTEMS, A WORKING PAPER.
ERIC Educational Resources Information Center
SISSON, ROGER L.
Operations research concepts are potentially useful for the study of such large urban school district problems as information flow, physical structure of the district, administrative decision making, board policy functions, and the budget structure. Operational analysis requires (1) identification of the system under study, (2) identification of…
On the theoretical velocity distribution and flow resistance in natural channels
NASA Astrophysics Data System (ADS)
Moramarco, Tommaso; Dingman, S. Lawrence
2017-12-01
The velocity distribution in natural channels is of considerable interest for streamflow measurements to obtain information on discharge and flow resistance. This study focuses on the comparison of theoretical velocity distributions based on 1) entropy theory, and 2) the two-parameter power law. The analysis identifies the correlation between the parameters of the distributions and defines their dependence on the geometric and hydraulic characteristics of the channel. Specifically, we investigate how the parameters are related to the flow resistance in terms of Manning roughness, shear velocity and water surface slope, and several formulae showing their relationships are proposed. Velocity measurements carried out in the past 20 years at Ponte Nuovo gauged section along the Tiber River, central Italy, are the basis for the analysis.
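The entropy-based velocity distribution referred to above is commonly written, following Chiu, as u(ξ) = (u_max/M)·ln[1 + (e^M − 1)ξ], where ξ is a dimensionless coordinate increasing from 0 at the bed to 1 where the velocity is maximum and M is the entropy parameter; the mean-to-maximum velocity ratio is then φ(M) = e^M/(e^M − 1) − 1/M. A minimal sketch with illustrative parameter values (not the Tiber River data) follows.

```python
import numpy as np

def chiu_velocity(xi, u_max, M):
    """Entropy-based (Chiu-type) velocity profile: u = (u_max / M) * ln(1 + (e^M - 1) * xi)."""
    return (u_max / M) * np.log(1.0 + (np.exp(M) - 1.0) * xi)

def mean_to_max_ratio(M):
    """phi(M) = mean velocity / maximum velocity for the entropy distribution."""
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

u_max, M = 2.5, 2.0                      # illustrative values (m/s, dimensionless)
xi = np.linspace(0.0, 1.0, 11)
print(np.round(chiu_velocity(xi, u_max, M), 2))
print(f"mean/max velocity ratio phi(M) = {mean_to_max_ratio(M):.3f}")
```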
Sando, Roy; Sando, Steven K.; McCarthy, Peter M.; Dutton, DeAnn M.
2016-04-05
The U.S. Geological Survey (USGS), in cooperation with the Montana Department of Natural Resources and Conservation, completed a study to update methods for estimating peak-flow frequencies at ungaged sites in Montana based on peak-flow data at streamflow-gaging stations through water year 2011. The methods allow estimation of peak-flow frequencies (that is, peak-flow magnitudes, in cubic feet per second, associated with annual exceedance probabilities of 66.7, 50, 42.9, 20, 10, 4, 2, 1, 0.5, and 0.2 percent) at ungaged sites. The annual exceedance probabilities correspond to 1.5-, 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals, respectively.Regional regression analysis is a primary focus of Chapter F of this Scientific Investigations Report, and regression equations for estimating peak-flow frequencies at ungaged sites in eight hydrologic regions in Montana are presented. The regression equations are based on analysis of peak-flow frequencies and basin characteristics at 537 streamflow-gaging stations in or near Montana and were developed using generalized least squares regression or weighted least squares regression.All of the data used in calculating basin characteristics that were included as explanatory variables in the regression equations were developed for and are available through the USGS StreamStats application (http://water.usgs.gov/osw/streamstats/) for Montana. StreamStats is a Web-based geographic information system application that was created by the USGS to provide users with access to an assortment of analytical tools that are useful for water-resource planning and management. The primary purpose of the Montana StreamStats application is to provide estimates of basin characteristics and streamflow characteristics for user-selected ungaged sites on Montana streams. The regional regression equations presented in this report chapter can be conveniently solved using the Montana StreamStats application.Selected results from this study were compared with results of previous studies. For most hydrologic regions, the regression equations reported for this study had lower mean standard errors of prediction (in percent) than the previously reported regression equations for Montana. The equations presented for this study are considered to be an improvement on the previously reported equations primarily because this study (1) included 13 more years of peak-flow data; (2) included 35 more streamflow-gaging stations than previous studies; (3) used a detailed geographic information system (GIS)-based definition of the regulation status of streamflow-gaging stations, which allowed better determination of the unregulated peak-flow records that are appropriate for use in the regional regression analysis; (4) included advancements in GIS and remote-sensing technologies, which allowed more convenient calculation of basin characteristics and investigation of many more candidate basin characteristics; and (5) included advancements in computational and analytical methods, which allowed more thorough and consistent data analysis.This report chapter also presents other methods for estimating peak-flow frequencies at ungaged sites. Two methods for estimating peak-flow frequencies at ungaged sites located on the same streams as streamflow-gaging stations are described. 
Additionally, envelope curves relating maximum recorded annual peak flows to contributing drainage area for each of the eight hydrologic regions in Montana are presented and compared to a national envelope curve. In addition to providing general information on characteristics of large peak flows, the regional envelope curves can be used to assess the reasonableness of peak-flow frequency estimates determined using the regression equations.
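Two small illustrations of the quantities above: the reciprocal relation between annual exceedance probability and recurrence interval, and the simplest form of a regional peak-flow regression, an ordinary-least-squares fit of log10(Q) against log10(drainage area) on hypothetical stations; the report itself uses generalized or weighted least squares with several basin characteristics, so this is only a sketch.

```python
import numpy as np

# Recurrence interval is the reciprocal of annual exceedance probability (AEP).
for aep in [0.667, 0.5, 0.04, 0.01, 0.002]:
    print(f"AEP {aep*100:5.1f}% -> {1/aep:6.1f}-year recurrence interval")

# Hypothetical stations: drainage area (mi^2) and 1-percent-AEP peak flow (ft^3/s).
area = np.array([12.0, 55.0, 130.0, 480.0, 1500.0])
q100 = np.array([900.0, 2600.0, 4800.0, 12000.0, 26000.0])

# Ordinary least squares in log10 space: log10(Q100) = b0 + b1 * log10(A).
X = np.column_stack([np.ones(area.size), np.log10(area)])
b, *_ = np.linalg.lstsq(X, np.log10(q100), rcond=None)
print(f"log10(Q100) = {b[0]:.2f} + {b[1]:.2f} * log10(A)")
print(f"predicted Q100 for a 250 mi^2 ungaged basin: {10**(b[0] + b[1]*np.log10(250)):,.0f} ft^3/s")
```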
40-Gbps optical backbone network deep packet inspection based on FPGA
NASA Astrophysics Data System (ADS)
Zuo, Yuan; Huang, Zhiping; Su, Shaojing
2014-11-01
In the era of information, big data, which contains huge amounts of information, brings about problems such as high-speed transmission, storage, and real-time analysis and processing. As an important medium for data transmission, the Internet is a significant part of big data processing research. With the large-scale use of the Internet, network data traffic is increasing rapidly. Transmission speeds on the main fiber-optic links have reached 40 Gbps, and even 100 Gbps, so data on the optical backbone network show the features of massive data. Generally, data services are provided via IP packets on the optical backbone network, which is built on SDH (Synchronous Digital Hierarchy). The method of mapping IP packets directly into the SDH payload is therefore named POS (Packet over SDH) technology. Aiming at the problem of real-time processing of high-speed massive data, this paper designs a processing system platform based on ATCA for 40-Gbps POS signal data stream recognition and packet content capture, which employs an FPGA as the processing core. This platform efficiently performs pre-processing for clustering algorithms, service traffic identification, and data mining ahead of subsequent big-data storage and analysis. The operational procedure is also proposed in this paper. Four channels of 10-Gbps POS signals, decomposed by the FPGA-based analysis module, are input to the flow classification module and the TCAM-based pattern-matching component. Based on payload length and network-flow properties, buffer management is added to the platform to retain key flow information. After data-stream analysis, deep packet inspection (DPI), and flow load balancing, the signal is transmitted to the back-end machine through the Gigabit Ethernet ports on the back board. Practice shows that the proposed platform is superior to traditional ASIC- and NP-based implementations.
Effects of background motion on eye-movement information.
Nakamura, S
1997-02-01
The effect of background stimulus on eye-movement information was investigated by analyzing the underestimation of the target velocity during pursuit eye movement (Aubert-Fleishl paradox). In the experiment, a striped pattern with various brightness contrasts and spatial frequencies was used as a background stimulus, which was moved at various velocities. Analysis showed that the perceived velocity of the pursuit target, which indicated the magnitudes of eye-movement information, decreased when the background stripes moved in the same direction as eye movement at higher velocities and increased when the background moved in the opposite direction. The results suggest that the eye-movement information varied as a linear function of the velocity of the motion of the background retinal image (optic flow). In addition, the effectiveness of optic flow on eye-movement information was determined by the attributes of the background stimulus such as the brightness contrast or the spatial frequency of the striped pattern.
Dubelaar, G B; Gerritzen, P L; Beeker, A E; Jonker, R R; Tangen, K
1999-12-01
The high costs of microscopical determination and counting of phytoplankton often limit sampling frequencies below an acceptable level for the monitoring of dynamic ecosystems. Although having a limited discrimination power, flow cytometry allows the analysis of large numbers of samples to a level that is sufficient for many basic monitoring jobs. For this purpose, flow cytometers should not be restricted to research laboratories. We report here on the development of an in situ flow cytometer for autonomous operation inside a small moored buoy or on other platforms. Operational specifications served a wide range of applications in the aquatic field. Specific conditions had to be met with respect to the operation platform and autonomy. A small, battery-operated flow cytometer resulted, requiring no external sheath fluid supply. Because it was designed to operate in a buoy, we call it CytoBuoy. Sampling, analysis, and radio transmission of the data proceed automatically at user-defined intervals. A powerful feature is the acquisition and radio transmission of full detector pulse shapes of each particle. This provides valuable morphological information for particles larger than the 5-microm laser focus. CytoBuoy allows on-line in situ particle analysis, estimation of phytoplankton biomass, and discrimination between different phytoplankton groups. This will increase the applicability of flow cytometry in the field of environmental monitoring. Copyright 1999 Wiley-Liss, Inc.
On-line metabolic pathway analysis based on metabolic signal flow diagram.
Shi, H; Shimizu, K
In this work, an integrated modeling approach based on a metabolic signal flow diagram and cellular energetics was used to perform metabolic pathway analysis for the cultivation of yeast on glucose. This approach enables a clear analysis of the flow direction of the carbon fluxes in the metabolic pathways as well as of the degree of activation of a particular pathway for the synthesis of biomaterials for cell growth. The analyses demonstrate that the main metabolic pathways of Saccharomyces cerevisiae change significantly during batch culture. Carbon flow is directed toward glycolysis to satisfy the increased requirement for precursors and energy. The enzymatic activation of the TCA cycle appears to remain at a normal level, which may result in the overflow of ethanol due to its limited capacity. The advantage of this approach is that it combines the virtues of the metabolic signal flow diagram and of the simple network analysis method, focusing on the investigation of the flow directions of carbon fluxes and the degree of activation of a particular pathway or reaction loop. All of the variables used in the model equations were determined on-line; the information obtained from the calculated metabolic coefficients may result in a better understanding of cell physiology and help to evaluate the state of the cell culture process. Copyright 1998 John Wiley & Sons, Inc.
Holmquist-Johnson, C. L.
2009-01-01
River spanning rock structures are being constructed for water delivery as well as to enable fish passage at barriers and provide or improve the aquatic habitat for endangered fish species. Current design methods are based upon anecdotal information applicable to a narrow range of channel conditions. The complex flow patterns and performance of rock weirs is not well understood. Without accurate understanding of their hydraulics, designers cannot address the failure mechanisms of these structures. Flow characteristics such as jets, near bed velocities, recirculation, eddies, and plunging flow govern scour pool development. These detailed flow patterns can be replicated using a 3D numerical model. Numerical studies inexpensively simulate a large number of cases resulting in an increased range of applicability in order to develop design tools and predictive capability for analysis and design. The analysis and results of the numerical modeling, laboratory modeling, and field data provide a process-based method for understanding how structure geometry affects flow characteristics, scour development, fish passage, water delivery, and overall structure stability. Results of the numerical modeling allow designers to utilize results of the analysis to determine the appropriate geometry for generating desirable flow parameters. The end product of this research will develop tools and guidelines for more robust structure design or retrofits based upon predictable engineering and hydraulic performance criteria. ?? 2009 ASCE.
Information theory and the ethylene genetic network.
González-García, José S; Díaz, José
2011-10-01
The original aim of the Information Theory (IT) was to solve a purely technical problem: to increase the performance of communication systems, which are constantly affected by interferences that diminish the quality of the transmitted information. That is, the theory deals only with the problem of transmitting with the maximal precision the symbols constituting a message. In Shannon's theory messages are characterized only by their probabilities, regardless of their value or meaning. As for its present day status, it is generally acknowledged that Information Theory has solid mathematical foundations and has fruitful strong links with Physics in both theoretical and experimental areas. However, many applications of Information Theory to Biology are limited to using it as a technical tool to analyze biopolymers, such as DNA, RNA or protein sequences. The main point of discussion about the applicability of IT to explain the information flow in biological systems is that in a classic communication channel, the symbols that conform the coded message are transmitted one by one in an independent form through a noisy communication channel, and noise can alter each of the symbols, distorting the message; in contrast, in a genetic communication channel the coded messages are not transmitted in the form of symbols but signaling cascades transmit them. Consequently, the information flow from the emitter to the effector is due to a series of coupled physicochemical processes that must ensure the accurate transmission of the message. In this review we discussed a novel proposal to overcome this difficulty, which consists of the modeling of gene expression with a stochastic approach that allows Shannon entropy (H) to be directly used to measure the amount of uncertainty that the genetic machinery has in relation to the correct decoding of a message transmitted into the nucleus by a signaling pathway. From the value of H we can define a function I that measures the amount of information content in the input message that the cell's genetic machinery is processing during a given time interval. Furthermore, combining Information Theory with the frequency response analysis of dynamical systems we can examine the cell's genetic response to input signals with varying frequencies, amplitude and form, in order to determine if the cell can distinguish between different regimes of information flow from the environment. In the particular case of the ethylene signaling pathway, the amount of information managed by the root cell of Arabidopsis can be correlated with the frequency of the input signal. The ethylene signaling pathway cuts off very low and very high frequencies, allowing a window of frequency response in which the nucleus reads the incoming message as a varying input. Outside of this window the nucleus reads the input message as an approximately non-varying one. This frequency response analysis is also useful to estimate the rate of information transfer during the transport of each new ERF1 molecule into the nucleus. Additionally, application of Information Theory to analysis of the flow of information in the ethylene signaling pathway provides a deeper insight in the form in which the transition between auxin and ethylene hormonal activity occurs during a circadian cycle. An ambitious goal for the future would be to use Information Theory as a theoretical foundation for a suitable model of the information flow that runs at each level and through all levels of biological organization.
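A minimal sketch of the Shannon entropy H and the derived information measure I discussed above, using hypothetical discrete distributions over decoded nuclear messages conditioned on two input regimes; the distributions stand in for the stochastic gene-expression model of the paper.

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p_i log2 p_i, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical conditional distributions over four decoded "messages",
# given two equally likely input regimes of the signaling pathway.
p_given_low  = np.array([0.70, 0.20, 0.05, 0.05])
p_given_high = np.array([0.05, 0.05, 0.20, 0.70])
p_output = 0.5 * p_given_low + 0.5 * p_given_high      # marginal over messages

H_out = shannon_entropy(p_output)                       # uncertainty without knowing the input
H_cond = 0.5 * shannon_entropy(p_given_low) + 0.5 * shannon_entropy(p_given_high)
I = H_out - H_cond                                      # information the input conveys about the output
print(f"H = {H_out:.2f} bits, H(out|in) = {H_cond:.2f} bits, I = {I:.2f} bits")
```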
Streamlining Transportation Corridor Planning Processes: Freight and Traffic Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franzese, Oscar
2010-08-01
The traffic investigation is one of the most important parts of an Environmental Impact Statement of projects involving the construction of new roadway facilities and/or the improvement of existing ones. The focus of the traffic analysis is on the determination of anticipated traffic flow characteristics of the proposed project, by the application of analytical methods that can be grouped under the umbrella of capacity analysis methodologies. In general, the main traffic parameter used in EISs to describe the quality of traffic flow is the Level of Service (LOS). The current state of the practice in terms of the traffic investigations for EISs has two main shortcomings. The first one is related to the information that is necessary to conduct the traffic analysis, and specifically to the lack of integration among the different transportation models and the sources of information that, in general, reside in GIS databases. A discussion of the benefits of integrating CRS&SI technologies and the transportation models used in the EIS traffic investigation is included. The second shortcoming is in the presentation of the results, both in terms of the appearance and formatting, as well as content. The presentation of traffic results (current and proposed) is discussed. This chapter also addresses the need for additional data, in terms of content and coverage. Regarding the former, other traffic parameters (e.g., delays) that are more meaningful to non-transportation experts than LOS, as well as additional information (e.g., freight flows) that can impact traffic conditions and safety, are discussed. Spatial information technologies can decrease the negative effects of, and even eliminate, these shortcomings by making the relevant information that is input to the models more complete and readily available, and by providing the means to communicate the results in a clearer and more efficient manner. The benefits that the application and use of CRS&SI technologies can provide to improve and expedite the traffic investigation part of the EIS process are presented.
Kawasaki, Masahiro; Uno, Yutaka; Mori, Jumpei; Kobata, Kenji; Kitajo, Keiichi
2014-01-01
Electroencephalogram (EEG) phase synchronization analyses can reveal large-scale communication between distant brain areas. However, it is not possible to identify the directional information flow between distant areas using conventional phase synchronization analyses. In the present study, we applied transcranial magnetic stimulation (TMS) to the occipital area in subjects who were resting with their eyes closed, and analyzed the spatial propagation of transient TMS-induced phase resetting by using the transfer entropy (TE), to quantify the causal and directional flow of information. The time-frequency EEG analysis indicated that the theta (5 Hz) phase locking factor (PLF) reached its highest value at the distant area (the motor area in this study), with a time lag that followed the peak of the transient PLF enhancements of the TMS-targeted area at the TMS onset. Phase-preservation index (PPI) analyses demonstrated significant phase resetting at the TMS-targeted area and distant area. Moreover, the TE from the TMS-targeted area to the distant area increased clearly during the delay that followed TMS onset. Interestingly, the time lags were almost coincident between the PLF and TE results (152 vs. 165 ms), which provides strong evidence that the emergence of the delayed PLF reflects the causal information flow. Such tendencies were observed only in the higher-intensity TMS condition, and not in the lower-intensity or sham TMS conditions. Thus, TMS may manipulate large-scale causal relationships between brain areas in an intensity-dependent manner. We demonstrated that single-pulse TMS modulated global phase dynamics and directional information flow among synchronized brain networks. Therefore, our results suggest that single-pulse TMS can manipulate both incoming and outgoing information in the TMS-targeted area associated with functional changes.
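A minimal sketch of transfer entropy for binarized signals with history length 1, using plug-in probability estimates on synthetic data; this generic estimator illustrates the directional measure used above but is not the authors' EEG pipeline.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in transfer entropy TE(source -> target), in bits, for binary series, history length 1."""
    x, y = np.asarray(source, int), np.asarray(target, int)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))        # (y_next, y_past, x_past)
    pairs_yy = Counter(zip(y[1:], y[:-1]))               # (y_next, y_past)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))              # (y_past, x_past)
    singles_y = Counter(y[:-1])                          # y_past
    n = len(y) - 1
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]             # p(y_next | y_past, x_past)
        p_cond_self = pairs_yy[(yn, yp)] / singles_y[yp] # p(y_next | y_past)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(5)
n = 20000
x = (rng.random(n) < 0.5).astype(int)
y = np.zeros(n, int)
for t in range(1, n):                                    # y copies x with 80% reliability
    y[t] = x[t-1] if rng.random() < 0.8 else rng.integers(2)
print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")
print(f"TE(y -> x) = {transfer_entropy(y, x):.3f} bits")
```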
Computer simulation of airflow through a multi-generation tracheobronchial conducting airway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, B.; Cheng, Yung-Sung; Yeh, Hsu-Chi
1995-12-01
Knowledge of airflow patterns in the human lung is important for an analysis of lung diseases and drug delivery of aerosolized medicine for medical treatment. However, very little systematic information is available on the pattern of airflow in the lung and on how this pattern affects the deposition of toxicants in the lung and the efficacy of aerosol drug therapy. Most previous studies have only considered the airflow through a single bifurcating airway. However, the flow in a network of more than one bifurcation is more complicated due to the effect of interrelated lung generations. Because of the variation of airway geometry and flow condition from generation to generation, a single bifurcating airway cannot be taken as representative of the others in different generations. The flow in the network varies significantly with airway generation because of a redistribution of axial momentum by the secondary flow motions. The influence of this redistribution of flow is expected in every generation. Therefore, systematic information on the airflow through a multi-generation tracheobronchial conducting airway is needed, and providing it is the purpose of this study. This study has provided information on airflow in a lung model which is necessary to the study of the deposition of toxicants and therapeutic aerosols.
Debris flow risk mapping on medium scale and estimation of prospective economic losses
NASA Astrophysics Data System (ADS)
Blahut, Jan; Sterlacchini, Simone
2010-05-01
Delimitation of potential zones affected by debris flow hazard, mapping of areas at risk, and estimation of future economic damage provide important information for spatial planners and local administrators in all countries endangered by this type of phenomenon. This study presents a medium-scale (1:25,000-1:50,000) analysis applied in the Consortium of Mountain Municipalities of Valtellina di Tirano (Italian Alps, Lombardy Region). In this area a debris flow hazard map was coupled with information about the elements at risk to obtain monetary values of prospective damage. Two available hazard maps were obtained from GIS medium-scale modelling. Probability estimates of debris flow occurrence were calculated using existing susceptibility maps and two sets of aerial images. Value was assigned to the elements at risk according to the official information on housing costs and land value from the Territorial Agency of the Lombardy Region. In the first risk map, vulnerability values were assumed to be 1. The second risk map uses three classes of vulnerability values qualitatively estimated according to the possible debris flow propagation. Risk curves summarizing the possible economic losses were calculated. Finally, these maps of economic risk were compared with maps derived from qualitative evaluation of the values of the elements at risk.
NASA Astrophysics Data System (ADS)
Vega-Jácome, Fiorella; Lavado-Casimiro, Waldo Sven; Felipe-Obando, Oscar Gustavo
2018-04-01
Hydrological changes were assessed considering possible changes in precipitation and the regulation and hydraulic diversion projects developed in the basin since the 1960s to improve the water supply of the Rimac River, which is the main source of fresh water for Peru's capital. To achieve this objective, a trend analysis of precipitation and flow series was performed using the Mann-Kendall test. Subsequently, the Eco-flow and Indicators of Hydrologic Alteration (IHA) methods were applied for the characterization and quantification of the hydrological change in the basin, considering for the analysis a natural period (1920-1960) and an altered period (1961-2012). Under this focus, daily hydrologic information from the "Chosica R-2" station (from 1920 to 2013) and monthly rainfall information from 14 stations (from 1964 to 2013) were collected. The results show variations in the flow seasonality of the altered period in relation to the natural period and a significant trend toward increasing minimum flows and decreasing maximum flows during the analyzed period. The Eco-flow assessment shows a predominance of Eco-deficit from December to May (rainy season), strongly related to negative anomalies of precipitation. In addition, a predominance of Eco-surplus was found from June to November (dry season), with a behavior opposite to precipitation, attributed to the regulation and diversion in the basin during that period. In terms of magnitude, the IHA assessment identified an increase of 51% in the average flows during the dry season and a reduction of 10% in the average flows during the rainy season (except December and May). Furthermore, the minimum flows increased by 35% with shorter duration and frequency, and maximum flows decreased by 29% with greater frequency but shorter duration. Although regulation and diversion bring benefits for developing anthropic activities, the fact that hydrologic alterations may result in significant modifications of the Rimac River ecosystem must be taken into account.
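A minimal sketch of the Mann-Kendall trend test applied above, in its classic form without tie or serial-correlation corrections, on a synthetic annual low-flow series; operational applications normally add those corrections, so this is illustrative only.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Classic Mann-Kendall trend test (no tie or autocorrelation correction).
    Returns the S statistic, the normal-approximation Z, and a two-sided p-value."""
    x = np.asarray(x, float)
    n = x.size
    s = 0.0
    for i in range(n - 1):
        s += np.sum(np.sign(x[i + 1:] - x[i]))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(6)
years = np.arange(1920, 2013)
low_flows = 20 + 0.05 * (years - 1920) + rng.standard_normal(years.size)  # weak upward trend
s, z, p = mann_kendall(low_flows)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}")
```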
Gerken, Tobias; Ruddell, Benjamin L.; Fuentes, Jose D.; ...
2017-04-29
This work investigates the diurnal and seasonal behavior of the energy balance residual (E) that results from the observed difference between available energy and the turbulent fluxes of sensible heat (H) and latent heat (LE) at the FLUXNET BR-Ma2 site located in the Brazilian central Amazon rainforest. The behavior of E is analyzed by extending the eddy covariance averaging length from 30 min to 4 h and by applying an Information Flow Dynamical Process Network to diagnose processes and conditions affecting E across different seasons. Results show that the seasonal turbulent flux dynamics and the Bowen ratio are primarily driven by net radiation (Rn), with substantial sub-seasonal variability. The Bowen ratio increased from 0.25 in April to 0.4 at the end of September. Extension of the averaging length from 0.5 h (94.6% closure) to 4 h, and thus inclusion of longer-timescale eddies and mesoscale processes, closes the energy balance and leads to an increase in the Bowen ratio, highlighting the importance of additional H to E. Information flow analysis reveals that the components of the energy balance explain between 25 and 40% of the total Shannon entropy, with higher values during the wet season than the dry season. Dry-season information flows from the buoyancy flux to E are 30-50% larger than those from H, indicating the potential importance of buoyancy fluxes to closing E. While the low closure highlights additional sources not captured in the flux data and random measurement errors contributing to E, the findings of the information flow and averaging-length analysis are consistent with the impact of mesoscale circulations, which tend to transport more H than LE, on the lack of closure.
Flow cytometry in the post fluorescence era.
Nolan, Garry P
2011-12-01
While flow cytometry once enabled researchers to examine 10-15 cell surface parameters, new mass flow cytometry technology enables interrogation of up to 45 parameters on a single cell. This new technology has increased understanding of cell expression and how cells differentiate during hematopoiesis. Using this information, knowledge of leukemia cell biology has also increased. Other new technologies, such as SPADE analysis and single cell network profiling (SCNP), are enabling researchers to put different cancers into more biologically similar categories and have the potential to enable more personalized medicine. Copyright © 2011. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khan, Shiraj; Ganguly, Auroop R; Bandyopadhyay, Sharba
Cross-spectrum analysis based on linear correlations in the time domain suggested a coupling between large river flows and the El Niño-Southern Oscillation (ENSO) cycle. A nonlinear measure based on mutual information (MI) reveals extrabasinal connections between ENSO and river flows in the tropics and subtropics that are 20-70% higher than those suggested so far by linear correlations. The enhanced dependence observed for the Nile, Amazon, Congo, Paraná, and Ganges rivers, which affect large, densely populated regions of the world, has significant impacts on inter-annual river flow predictability and, hence, on water resources and agricultural planning.
NASA Technical Reports Server (NTRS)
Creagh, John W. R.; Ginsburg, Ambrose
1948-01-01
An investigation of the XJ-41-V turbojet-engine compressor was conducted to determine the performance of the compressor and to obtain fundamental information on the aerodynamic problems associated with large centrifugal-type compressors. The results of the research conducted on the original compressor indicated that the compressor would not meet the desired engine-design air-flow requirements because of an air-flow restriction in the vaned collector. The compressor air-flow choking point occurred near the entrance to the vaned-collector passage and was caused by a poor mass-flow distribution at the vane entrance and by relatively large negative angles of attack of the air stream along the entrance edges of the vanes at the outer passage wall and large positive angles of attack at the inner passage wall. As a result of the analysis, a design change of the vaned-collector entrance is recommended for improving the maximum flow capacity of the compressor.
Optimal frequency-response sensitivity of compressible flow over roughness elements
NASA Astrophysics Data System (ADS)
Fosas de Pando, Miguel; Schmid, Peter J.
2017-04-01
Compressible flow over a flat plate with two localised and well-separated roughness elements is analysed by global frequency-response analysis. This analysis reveals a sustained feedback loop consisting of a convectively unstable shear-layer instability, triggered at the upstream roughness, and an upstream-propagating acoustic wave, originating at the downstream roughness and regenerating the shear-layer instability at the upstream protrusion. A typical multi-peaked frequency response is recovered from the numerical simulations. In addition, the optimal forcing and response clearly extract the components of this feedback loop and isolate flow regions of pronounced sensitivity and amplification. An efficient parametric-sensitivity framework is introduced and applied to the reference case; it shows that first-order increases in Reynolds number and roughness height have a destabilising effect on the flow, while changes in Mach number or roughness separation cause corresponding shifts in the peak frequencies. This information is gained with negligible effort beyond the reference case and can easily be applied to more complex flows.
Analysis of magnitude and duration of floods and droughts in the context of climate change
NASA Astrophysics Data System (ADS)
Eshetu Debele, Sisay; Bogdanowicz, Ewa; Strupczewski, Witold
2016-04-01
Research and scientific information are key elements of any decision-making process. There is also a strong need for tools to describe and compare in a concise way the regime of hydrological extreme events in the context of presumed climate change. To meet these demands, two complementary methods for estimating high- and low-flow frequency characteristics are proposed. Both methods deal with the duration and magnitude of extreme events. The first one, "flow-duration-frequency" (known as QdF), has already been applied successfully to low-flow analysis, flood flows and rainfall intensity. The second one, called "duration-flow-frequency" (DqF), was proposed by Strupczewski et al. in 2010 for flood frequency analysis. The two methods differ in the treatment of flow and duration. In the QdF method the duration (d consecutive days) is a chosen fixed value and the frequency analysis concerns the annual or seasonal series of mean flows exceeded (in the case of floods) or non-exceeded (in the case of droughts) within the d-day period. In the second method, DqF, the flows are treated as fixed thresholds and the durations of flows exceeding (floods) or non-exceeding (droughts) these thresholds are the subject of frequency analysis. The comparison of characteristics of floods and droughts in the reference period and under future climate conditions for catchments studied within the CHIHE project is presented, and a simple way to present the results to non-professionals and decision-makers is proposed. The work was undertaken within the project "Climate Change Impacts on Hydrological Extremes (CHIHE)", which is supported by the Norway-Poland Grants Program administered by the Norwegian Research Council. The observed time series were provided by the Institute of Meteorology and Water Management (IMGW), Poland. Strupczewski, W. G., Kochanek, K., Markiewicz, I., Bogdanowicz, E., Weglarczyk, S., & Singh, V. P. (2010). On the Tails of Distributions of Annual Peak Flow. Hydrology Research, 42, 171-192. http://dx.doi.org/10.2166/nh.2011.062
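To make the fixed-duration (QdF) treatment concrete, the sketch below builds, for a chosen duration d, the annual series of minima (droughts) or maxima (floods) of d-day mean flows that would feed a standard frequency analysis; the data are synthetic, and the DqF variant with fixed thresholds and variable durations is not shown.

```python
import numpy as np
import pandas as pd

def qdf_annual_series(daily_flow, d=7, kind="drought"):
    """Annual series of d-day mean flows: minima for droughts,
    maxima for floods (QdF-style fixed duration, illustrative only)."""
    rolled = daily_flow.rolling(window=d).mean()
    grouped = rolled.groupby(rolled.index.year)
    return grouped.min() if kind == "drought" else grouped.max()

# Hypothetical daily discharge record with a seasonal cycle plus noise.
idx = pd.date_range("1990-01-01", "1999-12-31", freq="D")
rng = np.random.default_rng(2)
q = pd.Series(50 + 20 * np.sin(2 * np.pi * idx.dayofyear / 365.25)
              + rng.gamma(2.0, 5.0, len(idx)), index=idx)
print(qdf_annual_series(q, d=7, kind="drought").head())
```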
Multiple shooting shadowing for sensitivity analysis of chaotic dynamical systems
NASA Astrophysics Data System (ADS)
Blonigan, Patrick J.; Wang, Qiqi
2018-02-01
Sensitivity analysis methods are important tools for research and design with simulations. Many important simulations exhibit chaotic dynamics, including scale-resolving turbulent fluid flow simulations. Unfortunately, conventional sensitivity analysis methods are unable to compute useful gradient information for long-time-averaged quantities in chaotic dynamical systems. Sensitivity analysis with least squares shadowing (LSS) can compute useful gradient information for a number of chaotic systems, including simulations of chaotic vortex shedding and homogeneous isotropic turbulence. However, this gradient information comes at a very high computational cost. This paper presents multiple shooting shadowing (MSS), a more computationally efficient shadowing approach than the original LSS approach. Through an analysis of the convergence rate of MSS, it is shown that MSS can have lower memory usage and run time than LSS.
Utility accommodation and conflict tracker (UACT) : user manual
DOT National Transportation Integrated Search
2009-02-01
Project 0-5475 performed a comprehensive analysis of utility conflict data/information flows between utility : accommodation stakeholders in the Texas Department of Transportation project development process, : developed data models to accommodate wo...
A new algorithm for grid-based hydrologic analysis by incorporating stormwater infrastructure
NASA Astrophysics Data System (ADS)
Choi, Yosoon; Yi, Huiuk; Park, Hyeong-Dong
2011-08-01
We developed a new algorithm, the Adaptive Stormwater Infrastructure (ASI) algorithm, to incorporate ancillary data sets related to stormwater infrastructure into the grid-based hydrologic analysis. The algorithm simultaneously considers the effects of the surface stormwater collector network (e.g., diversions, roadside ditches, and canals) and underground stormwater conveyance systems (e.g., waterway tunnels, collector pipes, and culverts). The surface drainage flows controlled by the surface runoff collector network are superimposed onto the flow directions derived from a DEM. After examining the connections between inlets and outfalls in the underground stormwater conveyance system, the flow accumulation and delineation of watersheds are calculated based on recursive computations. Application of the algorithm to the Sangdong tailings dam in Korea revealed superior performance to that of a conventional D8 single-flow algorithm in terms of providing reasonable hydrologic information on watersheds with stormwater infrastructure.
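A greatly simplified sketch of the idea of superimposing a stormwater collector network onto DEM-derived flow directions is given below; it is a generic D8 illustration with invented arrays and direction codes, not the published ASI implementation.

```python
import numpy as np

# D8 neighbour offsets (row, col); the index into this list is the direction code.
OFFSETS = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]

def d8_directions(dem):
    """Steepest-descent D8 flow direction (index into OFFSETS, -1 for pits)."""
    rows, cols = dem.shape
    fdir = -np.ones(dem.shape, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best, drop = -1, 0.0
            for k, (dr, dc) in enumerate(OFFSETS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    slope = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if slope > drop:
                        best, drop = k, slope
            fdir[r, c] = best
    return fdir

def apply_collectors(fdir, collector_dirs):
    """Override DEM-derived directions wherever a surface collector is mapped
    (collector_dirs holds -1 where no infrastructure is present)."""
    return np.where(collector_dirs >= 0, collector_dirs, fdir)

dem = np.array([[5.0, 4.5, 4.0],
                [4.8, 4.2, 3.5],
                [4.6, 3.9, 3.0]])
collectors = -np.ones_like(dem, dtype=int)
collectors[0, :] = 2          # hypothetical roadside ditch routing flow eastwards
print(apply_collectors(d8_directions(dem), collectors))
```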
NASA Astrophysics Data System (ADS)
Jakovels, Dainis; Saknite, Inga; Spigulis, Janis
2014-05-01
Laser speckle contrast analysis (LASCA) offers non-contact, full-field, and real-time mapping of capillary blood flow and can be considered an alternative to laser Doppler perfusion imaging. The LASCA technique has been implemented in several commercial instruments; however, these systems are still too expensive and bulky to be widely available. Several optical techniques have found new implementations as connection kits for mobile phones, thus offering low-cost screening devices. In this work we demonstrate a simple implementation of the LASCA imaging technique as a connection kit for a mobile phone for primary low-cost assessment of skin blood flow. Stabilized 650 nm and 532 nm laser diode modules were used for LASCA illumination. Dual-wavelength illumination could provide additional information about skin hemoglobin and oxygenation level. The proposed approach was tested with arterial occlusion and heat tests. In addition, blood flow maps of injured and provoked skin were demonstrated.
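The quantity mapped by LASCA is the local speckle contrast K = σ/⟨I⟩ computed over a small sliding window of the raw speckle image; the sketch below shows only that computation on a synthetic frame and none of the phone-mounted optics described above.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(image, window=7):
    """Local speckle contrast K = std / mean over a sliding window.
    Lower K generally corresponds to faster blood flow (more blurring)."""
    img = image.astype(float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img ** 2, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    return std / np.maximum(mean, 1e-12)

# Synthetic speckle-like frame standing in for a raw camera image.
rng = np.random.default_rng(3)
frame = rng.exponential(scale=100.0, size=(128, 128))
K = speckle_contrast(frame, window=7)
print(f"median contrast: {np.median(K):.2f}")
```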
Hierarchical Spatio-temporal Visual Analysis of Cluster Evolution in Electrocorticography Data
Murugesan, Sugeerth; Bouchard, Kristofer; Chang, Edward; ...
2016-10-02
Here, we present ECoG ClusterFlow, a novel interactive visual analysis tool for the exploration of high-resolution Electrocorticography (ECoG) data. Our system detects and visualizes dynamic high-level structures, such as communities, using the time-varying spatial connectivity network derived from the high-resolution ECoG data. ECoG ClusterFlow provides a multi-scale visualization of the spatio-temporal patterns underlying the time-varying communities using two views: 1) an overview summarizing the evolution of clusters over time and 2) a hierarchical glyph-based technique that uses data aggregation and small multiples techniques to visualize the propagation of clusters in their spatial domain. ECoG ClusterFlow makes it possible 1) to compare the spatio-temporal evolution patterns across various time intervals, 2) to compare the temporal information at varying levels of granularity, and 3) to investigate the evolution of spatial patterns without occluding the spatial context information. Lastly, we present case studies done in collaboration with neuroscientists on our team for both simulated and real epileptic seizure data aimed at evaluating the effectiveness of our approach.
NASA Astrophysics Data System (ADS)
Haavisto, Sanna; Cardona, Maria J.; Salmela, Juha; Powell, Robert L.; McCarthy, Michael J.; Kataja, Markku; Koponen, Antti I.
2017-11-01
A hybrid multi-scale velocimetry method utilizing Doppler optical coherence tomography in combination with either magnetic resonance imaging or ultrasound velocity profiling is used to investigate pipe flow of four rheologically different working fluids under varying flow regimes. These fluids include water, an aqueous xanthan gum solution, a softwood fiber suspension, and a microfibrillated cellulose suspension. The measurement setup enables not only the analysis of the rheological (bulk) behavior of the studied fluids but also simultaneously gives information on their wall-layer dynamics, both of which are needed for analyzing and solving practical fluid flow-related problems. Preliminary novel results on the rheological and boundary layer flow properties of the working fluids are reported and the potential of the hybrid measurement setup is demonstrated.
An Investigation of Ionic Flows in a Sphere-Plate Electrode Gap
NASA Astrophysics Data System (ADS)
Z. Alisoy, H.; Alagoz, S.; T. Alisoy, G.; B. Alagoz, B.
2013-10-01
This paper presents analyses of ion flow characteristics and ion discharge pulses in a sphere-ground plate electrode system. As a result of variation in electric field intensity in the electrode gap, the ion flows towards electrodes generate non-uniform discharging pulses. Inspection of these pulses provides useful information on ionic stream kinetics, the effective thickness of ion cover around electrodes, and the timing of ion clouds discharge pulse sequences. A finite difference time domain (FDTD) based space-charge motion simulation is used for the numerical analysis of the spatio-temporal development of ionic flows following the first Townsend avalanche, and the simulation results demonstrate expansion of the positive ion flow and compression of the negative ion flow, which results in non-uniform discharge pulse characteristics.
NASA Astrophysics Data System (ADS)
Blauhut, Veit; Stölzle, Michael; Stahl, Kerstin
2017-04-01
Drought-induced low flow extremes, despite a variety of management strategies, can cause direct and indirect impacts on the socio-economic and ecological functions of rivers. These negative effects determine local risk and are a function of the regional drought hazard and the river system's vulnerability. Whereas drought risk analysis is known to be essential for drought management, risk analysis for low flow is less common. Where no distributed hydrological models exist, merely the local hazard at gauging stations is available to represent the entire catchment. Vulnerability information is only sparsely available. Hence, a comprehensive understanding of the drivers of low flow risk along the longitudinal river profile is often lacking. For two different rivers in southwestern Germany, this study analysed major low flow events of the past five decades. Applying a transdisciplinary approach, the hazard component is assessed by hydro-climatic analysis, hydrological modelling and forward-looking stress test scenarios; the vulnerability component is estimated by a combination of impact assessment and vulnerability estimation, based on stakeholder workshops, questionnaires and regional characteristics. The results show distinct differences in low flow risk between the catchments and along the river. These differences are due to hydrogeological characteristics that govern groundwater-surface water interaction and to catchment-specific anthropogenic stimuli such as low flow decrease by near-stream groundwater pumping for public water supply or low flow augmentation by treatment plant discharge. Thus, low flow risk is anthropogenically influenced in both ways: positive and negative. Furthermore, the measured longitudinal profiles highlight the impracticability of single gauges to represent the quantitative and qualitative conditions of entire rivers. Hence, this work calls for a comprehensive, spatially variable consideration of flow characteristics and human influences to analyse low flow risk as the basis for adequate low flow management.
Norri-Sederholm, Teija; Paakkonen, Heikki; Kurola, Jouni; Saranto, Kaija
2015-01-16
In prehospital emergency medical services, one of the key factors in the successful delivery of appropriate care is the efficient management and supervision of the area's emergency medical services units. Paramedic field supervisors have an important role in this task. One of the key factors in the daily work of paramedic field supervisors is ensuring that they have enough of the right type of information when co-operating with other authorities and making decisions. However, a gap in information sharing still exists especially due to information overload. The aim of this study was to find out what type of critical information paramedic field supervisors need during multi-authority missions in order to manage their emergency medical services area successfully. The study also investigated both the flow of information, and interactions with the paramedic field supervisors and the differences that occur depending on the incident type. Ten paramedic field supervisors from four Finnish rescue departments participated in the study in January-March 2012. The data were collected using semi-structured interviews based on three progressive real-life scenarios and a questionnaire. Data were analysed using deductive content analysis. Data management and analysis were performed using Atlas.ti 7 software. Five critical information categories were formulated: Incident data, Mission status, Area status, Safety at work, and Tactics. Each category's importance varied depending on the incident and on whether it was about information needed or information delivered by the paramedic field supervisors. The main communication equipment used to receive information was the authority radio network (TETRA). However, when delivering information, mobile phones and TETRA were of equal importance. Paramedic field supervisors needed more information relating to area status. Paramedic field supervisors communicate actively with EMS units and other authorities such as Emergency Medical Dispatch, police, and rescue services during the multi-authority incidents. This study provides knowledge about the critical information categories when receiving and sharing the information to obtain and maintain situational awareness. However, further research is needed to examine more the information flow in prehospital emergency care to enable a better understanding of required communication in situational awareness formation.
Building the Material Flow Networks of Aluminum in the 2007 U.S. Economy.
Chen, Wei-Qiang; Graedel, T E; Nuss, Philip; Ohno, Hajime
2016-04-05
Based on the combination of the U.S. economic input-output table and the stocks and flows framework for characterizing anthropogenic metal cycles, this study presents a methodology for building material flow networks of bulk metals in the U.S. economy and applies it to aluminum. The results, which we term the Input-Output Material Flow Networks (IO-MFNs), achieve a complete picture of aluminum flow in the entire U.S. economy and for any chosen industrial sector (illustrated for the Automobile Manufacturing sector). The results are compared with information from our former study on U.S. aluminum stocks and flows to demonstrate the robustness and value of this new methodology. We find that the IO-MFN approach has the following advantages: (1) it helps to uncover the network of material flows in the manufacturing stage in the life cycle of metals; (2) it provides a method that may be less time-consuming but more complete and accurate in estimating new scrap generation, process loss, domestic final demand, and trade of final products of metals, than existing material flow analysis approaches; and, most importantly, (3) it enables the analysis of the material flows of metals in the U.S. economy from a network perspective, rather than merely that of a life cycle chain.
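The input-output backbone of such an approach can be illustrated with a toy Leontief calculation: total sectoral outputs follow from the Leontief inverse applied to final demand, and a metal-content vector converts those outputs into embodied metal flows; the three-sector coefficients below are invented for illustration and are not the study's data.

```python
import numpy as np

# Toy 3-sector technical coefficient matrix A (input per unit output) and
# final demand vector f; all figures are invented for illustration.
A = np.array([[0.10, 0.05, 0.00],
              [0.20, 0.10, 0.05],
              [0.05, 0.15, 0.10]])
f = np.array([100.0, 50.0, 200.0])

x = np.linalg.solve(np.eye(3) - A, f)              # total output: x = (I - A)^-1 f
metal_per_output = np.array([0.02, 0.30, 0.01])    # hypothetical Al content per unit output
print("sector outputs:", np.round(x, 1))
print("embodied aluminum:", np.round(metal_per_output * x, 1))
```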
Stochastic cycle selection in active flow networks.
Woodhouse, Francis G; Forrow, Aden; Fawcett, Joanna B; Dunkel, Jörn
2016-07-19
Active biological flow networks pervade nature and span a wide range of scales, from arterial blood vessels and bronchial mucus transport in humans to bacterial flow through porous media or plasmodial shuttle streaming in slime molds. Despite their ubiquity, little is known about the self-organization principles that govern flow statistics in such nonequilibrium networks. Here we connect concepts from lattice field theory, graph theory, and transition rate theory to understand how topology controls dynamics in a generic model for actively driven flow on a network. Our combined theoretical and numerical analysis identifies symmetry-based rules that make it possible to classify and predict the selection statistics of complex flow cycles from the network topology. The conceptual framework developed here is applicable to a broad class of biological and nonbiological far-from-equilibrium networks, including actively controlled information flows, and establishes a correspondence between active flow networks and generalized ice-type models.
Stochastic cycle selection in active flow networks
NASA Astrophysics Data System (ADS)
Woodhouse, Francis; Forrow, Aden; Fawcett, Joanna; Dunkel, Jorn
2016-11-01
Active biological flow networks pervade nature and span a wide range of scales, from arterial blood vessels and bronchial mucus transport in humans to bacterial flow through porous media or plasmodial shuttle streaming in slime molds. Despite their ubiquity, little is known about the self-organization principles that govern flow statistics in such non-equilibrium networks. By connecting concepts from lattice field theory, graph theory and transition rate theory, we show how topology controls dynamics in a generic model for actively driven flow on a network. Through theoretical and numerical analysis we identify symmetry-based rules to classify and predict the selection statistics of complex flow cycles from the network topology. Our conceptual framework is applicable to a broad class of biological and non-biological far-from-equilibrium networks, including actively controlled information flows, and establishes a new correspondence between active flow networks and generalized ice-type models.
Stochastic cycle selection in active flow networks
Woodhouse, Francis G.; Forrow, Aden; Fawcett, Joanna B.; Dunkel, Jörn
2016-01-01
Active biological flow networks pervade nature and span a wide range of scales, from arterial blood vessels and bronchial mucus transport in humans to bacterial flow through porous media or plasmodial shuttle streaming in slime molds. Despite their ubiquity, little is known about the self-organization principles that govern flow statistics in such nonequilibrium networks. Here we connect concepts from lattice field theory, graph theory, and transition rate theory to understand how topology controls dynamics in a generic model for actively driven flow on a network. Our combined theoretical and numerical analysis identifies symmetry-based rules that make it possible to classify and predict the selection statistics of complex flow cycles from the network topology. The conceptual framework developed here is applicable to a broad class of biological and nonbiological far-from-equilibrium networks, including actively controlled information flows, and establishes a correspondence between active flow networks and generalized ice-type models. PMID:27382186
Effects of nose bluntness and shock-shock interactions on blunt bodies in viscous hypersonic flows
NASA Technical Reports Server (NTRS)
Singh, D. J.; Tiwari, S. N.
1990-01-01
A numerical study was conducted to investigate the effects of blunt leading edges on the viscous flow field around a hypersonic vehicle such as the proposed National Aero-Space Plane. Attention is focused on two specific regions of the flow field. In the first region, effects of nose bluntness on the forebody flow field are investigated. The second region of the flow considered is around the leading edges of the scramjet inlet. In this region, the interaction of the forebody shock with the shock produced by the blunt leading edges of the inlet compression surfaces is analyzed. Analysis of these flow regions is required to accurately predict the overall flow field as well as to get necessary information on localized zones of high pressure and intense heating. The results for the forebody flow field are discussed first, followed by the results for the shock interaction in the inlet leading edge region.
NASA Astrophysics Data System (ADS)
Demetrius, Olive Joyce
The purpose of this study was to examine the relationships between junior high school students' (8th and 9th grades) background variables (e.g., cognitive factors, prior knowledge, preference for science versus non-science activities, formal and informal activities) and the structure of information recall of biological content. In addition, this study illustrates how flow maps, a graphic display designed to represent the sequential flow and cross-linkage of ideas in information recalled by the learner, can be used as a tool for analyzing science learning data. The participants (46 junior high school students) were taught a lesson on the human digestive system during which they were shown a model of the human torso. Their pattern of information recall was determined by using an interview technique to elicit their understanding of the functional anatomy of the human digestive system. The taped responses were later transcribed for construction of the flow map. The interview was also used to assess knowledge recall of biological content. The flow map, a science interest questionnaire and the cognitive operations (based on content analysis of students' narratives) were used to analyze data from each respondent. This is a case study using individual subjects and interview techniques. The findings of this study are: (1) Based on flow map data, higher academic ability students have more networking of ideas than low ability students. (2) A large percentage of 9th grade low ability students intend to pursue science/applied science course work after leaving school, but they lack well-organized ways of representing science knowledge in memory. (3) Content analysis of the narratives shows that students with more complex ideational networks use higher-order cognitive thought processes compared to those with less networking of ideas. If students are to make a successful transition from low academic performance to high academic performance, it seems that more emphasis should be placed on information networking skills. This is especially likely to be productive for students currently performing at low academic ability levels who nevertheless have high aspirations for pursuing science as a career.
A zonal method for modeling powered-lift aircraft flow fields
NASA Technical Reports Server (NTRS)
Roberts, D. W.
1989-01-01
A zonal method for modeling powered-lift aircraft flow fields is based on the coupling of a three-dimensional Navier-Stokes code to a potential flow code. By minimizing the extent of the viscous Navier-Stokes zones, the zonal method can be a cost-effective flow analysis tool. The successful coupling of the zonal solutions provides the viscous/inviscid interactions that are necessary to achieve convergent and unique overall solutions. The feasibility of coupling the two vastly different codes is demonstrated. The interzone boundaries were overlapped to facilitate the passing of boundary condition information between the codes. Routines were developed to extract the normal velocity boundary conditions for the potential flow zone from the viscous zone solution. Similarly, the velocity vector direction along with the total conditions were obtained from the potential flow solution to provide boundary conditions for the Navier-Stokes solution. Studies were conducted to determine the influence of the overlap of the interzone boundaries and the convergence of the zonal solutions on the convergence of the overall solution. The zonal method was applied to a jet impingement problem to model the suckdown effect that results from the entrainment of the inviscid zone flow by the viscous zone jet. The resultant potential flow solution created a lower pressure on the base of the vehicle, which produces the suckdown load. The feasibility of the zonal method was demonstrated. By enhancing the Navier-Stokes code for powered-lift flow fields and optimizing the convergence of the coupled analysis, a practical flow analysis tool will result.
Hydraulic head applications of flow logs in the study of heterogeneous aquifers
Paillet, Frederick L.
2001-01-01
Permeability profiles derived from high-resolution flow logs in heterogeneous aquifers provide a limited sample of the most permeable beds or fractures determining the hydraulic properties of those aquifers. This paper demonstrates that flow logs can also be used to infer the large-scale properties of aquifers surrounding boreholes. The analysis is based on the interpretation of the hydraulic head values estimated from the flow log analysis. Pairs of quasi-steady flow profiles obtained under ambient conditions and while either pumping or injecting are used to estimate the hydraulic head in each water-producing zone. Although the analysis yields localized estimates of transmissivity for a few water-producing zones, the hydraulic head estimates apply to the farfield aquifers to which these zones are connected. The hydraulic head data are combined with information from other sources to identify the large-scale structure of heterogeneous aquifers. More complicated cross-borehole flow experiments are used to characterize the pattern of connection between large-scale aquifer units inferred from the hydraulic head estimates. The interpretation of hydraulic heads in situ under steady and transient conditions is illustrated by several case studies, including an example with heterogeneous permeable beds in an unconsolidated aquifer, and four examples with heterogeneous distributions of bedding planes and/or fractures in bedrock aquifers.
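The two-point estimation described above (one ambient and one stressed flow measurement per zone) reduces to solving two linear equations for the zone transmissivity and head; the sketch below assumes the simple linear inflow model q = T(h_zone − h_well) and uses invented numbers, not the paper's case-study data.

```python
def zone_head_and_transmissivity(q_ambient, q_stressed, hw_ambient, hw_stressed):
    """Two-point estimate for one water-producing zone from paired flow-log
    measurements (ambient vs. pumped/injected), assuming q = T * (h_zone - h_well);
    units are illustrative (e.g. m^3/d for flow and m for head)."""
    T = (q_ambient - q_stressed) / (hw_stressed - hw_ambient)
    h_zone = hw_ambient + q_ambient / T
    return T, h_zone

# Hypothetical numbers: the zone produces 0.5 m^3/d under ambient conditions
# and 4.0 m^3/d when the well water level is drawn down by 2 m.
T, h = zone_head_and_transmissivity(0.5, 4.0, hw_ambient=10.0, hw_stressed=8.0)
print(f"T = {T:.2f} m^2/d, zone head = {h:.2f} m")
```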
Comprehensive and Critical Literature Review on Insitu Micro-Sensors for Application in Tribology
1994-04-01
Electroosmotic flow provides a pumping method that is convenient for small capillaries. Electrophoretic separation is shown to be useful. On the left hand...analysis systems on glass chips (1 centimeter by 2 centimeters or larger) that utilize electroosmotic pumping to drive fluid flow and electrophoretic...elucidate the interaction mechanism. Additionally, using two types of sensors in a mixed array increases selectivity by providing different information
A Preliminary Analysis of the Theoretical Parameters of Organizational Learning.
1995-09-01
PARAMETERS OF ORGANIZATIONAL LEARNING THESIS Presented to the Faculty of the Graduate School of Logistics and Acquisition Management of the Air...Organizational Learning Parameters in the Knowledge Acquisition Category 2~™ 2-3. Organizational Learning Parameters in the Information Distribution Category...Learning Refined Scale 4-94 4-145. Composition of Refined Scale 4 Knowledge Flow 4-95 4-146. Cronbach’s Alpha Statistics for the Complete Knowledge Flow
Ontological modeling of electronic health information exchange.
McMurray, J; Zhu, L; McKillop, I; Chen, H
2015-08-01
Investments of resources to purposively improve the movement of information between health system providers are currently made with imperfect information. No inventories of system-level electronic health information flows currently exist, nor do measures of inter-organizational electronic information exchange. Using Protégé 4, an open-source OWL Web ontology language editor and knowledge-based framework, we formalized a model that decomposes inter-organizational electronic health information flow into derivative concepts such as diversity, breadth, volume, structure, standardization and connectivity. The ontology was populated with data from a regional health system and the flows were measured. Individual instance's properties were inferred from their class associations as determined by their data and object property rules. It was also possible to visualize interoperability activity for regional analysis and planning purposes. A property called Impact was created from the total number of patients or clients that a health entity in the region served in a year, and the total number of health service providers or organizations with whom it exchanged information in support of clinical decision-making, diagnosis or treatment. Identifying providers with a high Impact but low Interoperability score could assist planners and policy-makers to optimize technology investments intended to electronically share patient information across the continuum of care. Finally, we demonstrated how linked ontologies were used to identify logical inconsistencies in self-reported data for the study. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Cazalbou, J.-B.; Chassaing, P.
2002-02-01
The behavior of Reynolds-stress-transport models at the free-stream edges of turbulent flows is investigated. Current turbulent-diffusion models are found to produce propagative (possibly weak) solutions of the same type as those reported earlier by Cazalbou, Spalart, and Bradshaw [Phys. Fluids 6, 1797 (1994)] for two-equation models. As in the latter study, an analysis is presented that provides qualitative information on the flow structure predicted near the edge if a condition on the values of the diffusion constants is satisfied. In this case, the solution appears to be fairly insensitive to the residual free-stream turbulence levels needed with conventional numerical methods. The main specific result is that, depending on the diffusion model, the propagative solution can force turbulence toward definite and rather extreme anisotropy states at the edge (one- or two-component limit). This is not the case with the model of Daly and Harlow [Phys. Fluids 13, 2634 (1970)]; it may be one of the reasons why this "old" scheme is still the most widely used, even in recent Reynolds-stress-transport models. In addition, the analysis helps us to interpret some difficulties encountered in computing even very simple flows with Lumley's pressure-diffusion model [Adv. Appl. Mech. 18, 123 (1978)]. A new realizability condition, according to which the diffusion model should not globally become "anti-diffusive," is introduced, and a recalibration of Lumley's model satisfying this condition is performed using information drawn from the analysis.
Information processing and dynamics in minimally cognitive agents.
Beer, Randall D; Williams, Paul L
2015-01-01
There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.
Groundwater flow and hydrogeochemical evolution in the Jianghan Plain, central China
NASA Astrophysics Data System (ADS)
Gan, Yiqun; Zhao, Ke; Deng, Yamin; Liang, Xing; Ma, Teng; Wang, Yanxin
2018-05-01
Hydrogeochemical analysis and multivariate statistics were applied to identify flow patterns and major processes controlling the hydrogeochemistry of groundwater in the Jianghan Plain, which is located in central Yangtze River Basin (central China) and characterized by intensive surface-water/groundwater interaction. Although HCO3-Ca-(Mg) type water predominated in the study area, the 457 (21 surface water and 436 groundwater) samples were effectively classified into five clusters by hierarchical cluster analysis. The hydrochemical variations among these clusters were governed by three factors from factor analysis. Major components (e.g., Ca, Mg and HCO3) in surface water and groundwater originated from carbonate and silicate weathering (factor 1). Redox conditions (factor 2) influenced the geogenic Fe and As contamination in shallow confined groundwater. Anthropogenic activities (factor 3) primarily caused high levels of Cl and SO4 in surface water and phreatic groundwater. Furthermore, the factor score 1 of samples in the shallow confined aquifer gradually increased along the flow paths. This study demonstrates that enhanced information on hydrochemistry in complex groundwater flow systems, by multivariate statistical methods, improves the understanding of groundwater flow and hydrogeochemical evolution due to natural and anthropogenic impacts.
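The classification step can be sketched generically: standardise the major-ion concentrations, build a hierarchical linkage, and cut the tree into five clusters; the linkage method, variable names, and synthetic data below are assumptions, not the study's actual workflow.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Hypothetical table of major-ion concentrations (rows = samples); the five
# columns could stand for Ca, Mg, HCO3, Cl and SO4 (illustrative only).
rng = np.random.default_rng(4)
samples = np.column_stack([rng.lognormal(m, 0.4, 200) for m in (4.0, 3.0, 5.0, 2.5, 3.5)])

Z = linkage(zscore(samples, axis=0), method="ward")   # Ward linkage on standardised data
labels = fcluster(Z, t=5, criterion="maxclust")       # cut the tree into five clusters
print(np.bincount(labels)[1:])                        # cluster sizes
```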
Mean Line Pump Flow Model in Rocket Engine System Simulation
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Lavelle, Thomas M.
2000-01-01
A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
NASA Technical Reports Server (NTRS)
Gerber, C. R.
1972-01-01
The computation and logical functions which are performed by the data processing assembly of the modular space station are defined. The subjects discussed are: (1) requirements analysis, (2) baseline data processing assembly configuration, (3) information flow study, (4) throughput simulation, (5) redundancy study, (6) memory studies, and (7) design requirements specification.
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
Planning an Information System for a Small College. AIR Forum Paper 1978.
ERIC Educational Resources Information Center
Toombs, William; Sagaria, Mary Ann
Data collection and analyses of college records and interviewing provided a cross-sectional view of data flow and information transmission in a small college. The micro-analysis of interview data, forms, and reports yielded a picture of functional relationships, clarified loci of decision making, and stipulated functions served by data items.…
ERIC Educational Resources Information Center
Price Waterhouse and Co., New York, NY.
This volume on Phase II of the New York State Educational Information System (NYSEIS) describes the Gross Systems Analysis and Design, which includes the general flow diagram and processing chart for each of the student, personnel, and financial subsystems. Volume II, Functional Specifications, includes input/output requirements and file…
Wavelet analysis methods for radiography of multidimensional growth of planar mixing layers
Merritt, Elizabeth Catherine; Doss, Forrest William
2016-07-06
The counter-propagating shear campaign is examining instability growth and its transition to turbulence in the high-energy-density physics regime using a laser-driven counter-propagating flow platform. In these experiments, we observe consistent complex break-up of and structure growth in a tracer layer placed at the shear flow interface during the instability growth phase. We present a wavelet-transform based analysis technique capable of characterizing the scale- and directionality-resolved average intensity perturbations in static radiographs of the experiment. This technique uses the complete spatial information available in each radiograph to describe the structure evolution. We designed this analysis technique to generate a two-dimensional power spectrum for each radiograph from which we can recover information about structure widths, amplitudes, and orientations. Lastly, the evolution of the distribution of power in the spectra for an experimental series is a potential metric for quantifying the structure size evolution as well as a system's evolution towards isotropy.
Wavelet analysis methods for radiography of multidimensional growth of planar mixing layers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merritt, E. C., E-mail: emerritt@lanl.gov; Doss, F. W.
2016-07-15
The counter-propagating shear campaign is examining instability growth and its transition to turbulence in the high-energy-density physics regime using a laser-driven counter-propagating flow platform. In these experiments, we observe consistent complex break-up of and structure growth in a tracer layer placed at the shear flow interface during the instability growth phase. We present a wavelet-transform based analysis technique capable of characterizing the scale- and directionality-resolved average intensity perturbations in static radiographs of the experiment. This technique uses the complete spatial information available in each radiograph to describe the structure evolution. We designed this analysis technique to generate a two-dimensional power spectrum for each radiograph from which we can recover information about structure widths, amplitudes, and orientations. The evolution of the distribution of power in the spectra for an experimental series is a potential metric for quantifying the structure size evolution as well as a system's evolution towards isotropy.
LFSTAT - Low-Flow Analysis in R
NASA Astrophysics Data System (ADS)
Koffler, Daniel; Laaha, Gregor
2013-04-01
The calculation of characteristic stream flow during dry conditions is a basic requirement for many problems in hydrology, ecohydrology and water resources management. As opposed to floods, a number of different indices are used to characterise low flows and streamflow droughts. Although these indices and methods of calculation are well documented in the WMO Manual on Low-flow Estimation and Prediction [1], comprehensive software was missing that enables fast and standardized calculation of low-flow statistics. We present the new software package lfstat to fill this obvious gap. Our software package is based on the statistical open-source software R and extends it to analyse daily stream flow records with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R Commander, an easy-to-use graphical user interface (GUI) for R based on tcl/tk. The functionality of lfstat includes estimation methods for low-flow indices, extreme value statistics, deficit characteristics, and additional graphical methods to control the computation of complex indices and to illustrate the data. Besides the basic low-flow indices, the baseflow index and recession constants can be computed. For extreme value statistics, state-of-the-art methods for L-moment based local and regional frequency analysis (RFA) are available. The tools for deficit characteristics include various pooling and threshold selection methods to support the calculation of drought duration and deficit indices. The most common graphics for low-flow analysis are available, and the plots can be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, recession diagnostics, flow duration curves as well as double mass curves, and many more. From a technical point of view, the package uses an S3 class called lfobj (low-flow object). These objects are ordinary R data frames including date, flow, hydrological year and possibly baseflow information. Once these objects are created, analyses can be performed by mouse-click and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO Manual on Low-flow Estimation and Prediction [1]. Future plans include a dynamic low-flow report in odt format using odfWeave, which allows automatic updates if data or analyses change. We hope to offer a tool that eases and structures the analysis of stream flow data with a focus on low flows and makes the analysis transparent and communicable. The package can also be used to teach students the first steps in low-flow hydrology. The software package can be installed from CRAN (latest stable) and R-Forge: http://r-forge.r-project.org (development version). References: [1] Gustard, A.; Demuth, S. (eds.): Manual on Low-flow Estimation and Prediction. Geneva, Switzerland, World Meteorological Organization (Operational Hydrology Report No. 50, WMO-No. 1029).
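lfstat itself is an R package, so the snippet below is only a language-agnostic illustration, written in Python for consistency with the other sketches in this document, of two indices of the kind it reports: the Q95 low-flow quantile and a heavily simplified base flow index; the data and the simplified smoothing rule are assumptions, not lfstat's implementation.

```python
import numpy as np
import pandas as pd

def q95(flow):
    """Flow exceeded 95% of the time (a standard low-flow index)."""
    return float(np.quantile(flow, 0.05))

def simple_bfi(flow, block=5):
    """Very simplified base flow index: baseflow approximated by interpolated
    minima of non-overlapping 5-day blocks (the full smoothed-minima procedure
    used in low-flow manuals adds a turning-point test that is omitted here)."""
    mins = flow.resample(f"{block}D").min()
    base = mins.reindex(flow.index).interpolate(limit_direction="both")
    base = np.minimum(base, flow)          # baseflow cannot exceed total flow
    return float(base.sum() / flow.sum())

# Hypothetical daily discharge record.
idx = pd.date_range("2000-01-01", "2004-12-31", freq="D")
rng = np.random.default_rng(5)
q = pd.Series(30 + 15 * np.sin(2 * np.pi * idx.dayofyear / 365.25)
              + rng.gamma(2.0, 3.0, len(idx)), index=idx)
print(f"Q95 = {q95(q):.1f}, BFI ~ {simple_bfi(q):.2f}")
```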
A logical model of cooperating rule-based systems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.
1989-01-01
A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.
Di Iorio, C T; Carinci, F; Azzopardi, J; Baglioni, V; Beck, P; Cunningham, S; Evripidou, A; Leese, G; Loevaas, K F; Olympios, G; Federici, M Orsini; Pruna, S; Palladino, P; Skeie, S; Taverner, P; Traynor, V; Benedetti, M Massi
2009-12-01
To foster the development of a privacy-protective, sustainable cross-border information system in the framework of a European public health project. A targeted privacy impact assessment was implemented to identify the best architecture for a European information system for diabetes directly tapping into clinical registries. Four steps were used to provide input to software designers and developers: a structured literature search, analysis of data flow scenarios or options, creation of an ad hoc questionnaire and the conduct of a Delphi procedure. The literature search identified a core set of relevant papers on privacy (n = 11). Technicians envisaged three candidate system architectures, with associated data flows, to source an information flow questionnaire that was submitted to the Delphi panel for the selection of the best architecture. A detailed scheme envisaging an "aggregation by group of patients" was finally chosen, based upon the exchange of finely tuned summary tables. Public health information systems should be carefully engineered only after a clear strategy for privacy protection has been planned, to avoid breaching current regulations and future concerns and to optimise the development of statistical routines. The BIRO (Best Information Through Regional Outcomes) project delivers a specific method of privacy impact assessment that can be conveniently used in similar situations across Europe.
The aware toolbox for the detection of law infringements on web pages
NASA Astrophysics Data System (ADS)
Shahab, Asif; Kieninger, Thomas; Dengel, Andreas
2010-01-01
In the project Aware we aim to develop an automatic assistant for the detection of law infringements on web pages. The motivation for this project is that many authors of web pages are at some point infringing copyright or other laws, mostly without being aware of that fact, and are more and more often confronted with costly legal warnings. As the legal environment is constantly changing, an important requirement of Aware is that the domain knowledge can be maintained (and initially defined) by numerous legal experts working remotely without further assistance from the computer scientists. Consequently, the software platform was chosen to be a web-based generic toolbox that can be configured to suit individual analysis experts, definitions of analysis flow, information gathering and report generation. The report generated by the system summarizes all critical elements of a given web page and provides case-specific hints to the page author and thus forms a new type of service. Regarding the analysis subsystems, Aware mainly builds on existing state-of-the-art technologies. Their usability has been evaluated for each intended task. In order to control the heterogeneous analysis components and to gather the information, a lightweight scripting shell has been developed. This paper describes the analysis technologies, ranging from text-based information extraction, over optical character recognition and phonetic fuzzy string matching, to a set of image analysis and retrieval tools, as well as the scripting language to define the analysis flow.
NASA Astrophysics Data System (ADS)
Emerton, R.; Cloke, H. L.; Stephens, L.; Woolnough, S. J.; Zsoter, E.; Pappenberger, F.
2016-12-01
El Niño-Southern Oscillation (ENSO), a mode of variability which sees fluctuations between anomalously high or low sea surface temperatures in the Pacific, is known to influence river flow and flooding at the global scale. The anticipation and forecasting of floods is crucial for flood preparedness, and this link, alongside the predictive skill of ENSO up to seasons ahead, may provide an early indication of upcoming severe flood events. Information is readily available indicating the likely impacts of El Niño and La Niña on precipitation across the globe, which is often used as a proxy for flood hazard. However, the nonlinearity between precipitation and flood magnitude and frequency means that it is important to assess the impact of ENSO events not only on precipitation, but also on river flow and flooding. Historical probabilities provide key information regarding the likely impacts of ENSO events. We estimate, for the first time, the historical probability of increased flood hazard during El Niño and La Niña through a global hydrological analysis, using a new 20th century ensemble river flow reanalysis for the global river network. This dataset was produced by running the ECMWF ERA-20CM atmospheric reanalysis through a research set-up of the Global Flood Awareness System (GloFAS) using the CaMa-Flood hydrodynamic model, to produce a 110-year global reanalysis of river flow. We further evaluate the added benefit of the hydrological analysis over the use of precipitation as a proxy for flood hazard, for example by providing information on regions that are likely to experience a lagged influence on river flow compared to the influence on precipitation. Our results map, at the global scale, the probability of abnormally high river flow during any given month of an El Niño or La Niña; information such as this is key for organisations that work at the global scale, such as humanitarian aid organisations, providing a seasons-ahead indicator of potential increased flood hazard that can be used as soon as the event onset is declared, or even earlier, when El Niño or La Niña conditions are first predicted.
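The historical probabilities described above reduce to simple conditional frequencies; the sketch below counts how often monthly flow exceeds its 90th percentile in El Niño months versus all months, using invented flow values and phase labels, and illustrates the bookkeeping only, not the GloFAS/ERA-20CM analysis.

```python
import numpy as np

rng = np.random.default_rng(6)
n_months = 1320                                     # roughly 110 years of monthly values
phase = rng.choice(["el_nino", "la_nina", "neutral"], size=n_months,
                   p=[0.25, 0.25, 0.5])             # hypothetical ENSO phase labels
flow = rng.gamma(3.0, 100.0, n_months)              # hypothetical monthly river flow
flow *= np.where(phase == "el_nino", 1.2, 1.0)      # invented amplification, illustration only

threshold = np.quantile(flow, 0.90)                 # "abnormally high" flow level
p_all = np.mean(flow > threshold)
p_nino = np.mean(flow[phase == "el_nino"] > threshold)
print(f"P(high flow) = {p_all:.2f}, P(high flow | El Nino) = {p_nino:.2f}")
```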
Computer program for design analysis of radial-inflow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1976-01-01
A computer program written in FORTRAN that may be used for the design analysis of radial-inflow turbines was documented. The following information is included: loss model (estimation of losses), the analysis equations, a description of the input and output data, the FORTRAN program listing and list of variables, and sample cases. The input design requirements include the power, mass flow rate, inlet temperature and pressure, and rotational speed. The program output data includes various diameters, efficiencies, temperatures, pressures, velocities, and flow angles for the appropriate calculation stations. The design variables include the stator-exit angle, rotor radius ratios, and rotor-exit tangential velocity distribution. The losses are determined by an internal loss model.
Das, Swagatam; Biswas, Subhodip; Panigrahi, Bijaya K; Kundu, Souvik; Basu, Debabrota
2014-10-01
This paper presents a novel search metaheuristic inspired by the physical interpretation of the optic flow of information that honeybees gather about their spatial surroundings, which helps them orient themselves and navigate through the search space while foraging. The interpreted behavior, combined with minimal foraging, is simulated by the artificial bee colony algorithm to develop a robust search technique that exhibits elevated performance in multidimensional objective space. Through a detailed experimental study and rigorous analysis, we highlight the statistical superiority enjoyed by our algorithm over a wide variety of functions as compared to some highly competitive state-of-the-art methods.
[The Information Flows System as an instrument for preventing technological illness].
Saldutti, Elisa; Bindi, Luciano; Di Giacobbe, Andrea; Innocenzi, Mariano; Innocenzi, Ludovico
2014-01-01
This paper describes the project "Information Flows", its content of INAIL data on reported and recognized accidents and occupational diseases, and its usefulness for programs of preventive initiatives undertaken by INAIL and by the responsible structures in the individual Italian regions. We propose several ways of processing the data and suggest how their collection, according to criteria based on occupational medicine, industrial hygiene and epidemiology, together with a careful analysis and processing of data from multiple sources, could lead to an extension of worker protection with regard to "unrecognized" occupational diseases, diseases caused by the "old" risks and the identification of occupational diseases caused by "new" risks.
Utility accommodation and conflict tracker (UACT) installation and configuration manual.
DOT National Transportation Integrated Search
2009-02-01
Project 0-5475 performed a comprehensive analysis of utility conflict data/information flows between utility : accommodation stakeholders in the Texas Department of Transportation project development process, : developed data models to accommodate wo...
Mission Analysis Program for Solar Electric Propulsion (MAPSEP). Volume 3: Program manual
NASA Technical Reports Server (NTRS)
Huling, K. R.; Boain, R. J.; Wilson, T.; Hong, P. E.; Shults, G. L.
1974-01-01
The internal structure of MAPSEP is described. Topics discussed include: macrologic, variable definition, subroutines, and logical flow. Information is given to facilitate modifications to the models and algorithms of MAPSEP.
NASA Astrophysics Data System (ADS)
Smirni, Salvatore; MacDonald, Michael P.; Robertson, Catherine P.; McNamara, Paul M.; O'Gorman, Sean; Leahy, Martin J.; Khan, Faisel
2018-02-01
The cutaneous microcirculation represents an index of the health status of the cardiovascular system. Conventional methods to evaluate skin microvascular function are based on measuring blood flow by laser Doppler in combination with reactive tests such as post-occlusive reactive hyperaemia (PORH). Moreover, the spectral analysis of blood flow signals by continuous wavelet transform (CWT) reveals nonlinear oscillations reflecting the functionality of microvascular biological factors, e.g. endothelial cells (ECs). Correlation mapping optical coherence tomography (cmOCT) has previously been described as an efficient methodology for the morphological visualisation of cutaneous micro-vessels. Here, we show that cmOCT flow maps can also provide information on the functional components of the microcirculation. A spectral domain optical coherence tomography (SD-OCT) imaging system was used to acquire 90 sequential 3D OCT volumes from the forearm of a volunteer, while challenging the micro-vessels with a PORH test. The volumes were sampled in a temporal window of 25 minutes and were processed by cmOCT to obtain flow maps at different tissue depths. The images clearly show changes of flow in response to the applied stimulus. Furthermore, a blood flow signal was reconstructed from the cmOCT map intensities to investigate the microvascular nonlinear dynamics by CWT. The analysis revealed oscillations changing in response to PORH, associated with the activity of ECs and the sympathetic innervation. The results demonstrate that cmOCT may potentially be used as a diagnostic tool for the assessment of microvascular function, with the advantage of also providing spatial resolution and structural information compared to traditional laser Doppler techniques.
24 CFR 202.8 - Loan correspondent lenders and mortgagees.
Code of Federal Regulations, 2010 CFR
2010-04-01
... insured mortgages in its own portfolio. Sponsor. (1) With respect to Title I programs, a sponsor is a... of cash flows, an analysis of the net worth adjusted to reflect only assets acceptable to the Secretary and an analysis of escrow funds; and (ii) Such other financial information as the Secretary may...
Linear and Non-linear Information Flows In Rainfall Field
NASA Astrophysics Data System (ADS)
Molini, A.; La Barbera, P.; Lanza, L. G.
The rainfall process is the result of a complex framework of non-linear dynamical interactions between the different components of the atmosphere. It preserves the complexity and the intermittent features of the generating system in space and time as well as the strong dependence of these properties on the scale of observations. The understanding and quantification of how the non-linearity of the generating process comes to influence the single rain events constitute relevant research issues in the field of hydro-meteorology, especially in those applications where a timely and effective forecasting of heavy rain events is able to reduce the risk of failure. This work focuses on the characterization of the non-linear properties of the observed rain process and on the influence of these features on hydrological models. Among the goals of such a survey are the search for regular structures in the rainfall phenomenon and the study of the information flows within the rain field. The research focuses on three basic evolution directions for the system: in time, in space and between the different scales. In fact, the information flows that force the system to evolve represent in general a connection between the different locations in space, the different instants in time and, unless the hypothesis of scale invariance is verified "a priori", the different characteristic scales. A first phase of the analysis is carried out by means of classic statistical methods, then a survey of the information flows within the field is developed by means of techniques borrowed from Information Theory, and finally an analysis of the rain signal in the time and frequency domains is performed, with particular reference to its intermittent structure. The methods adopted in this last part of the work are both the classic techniques of statistical inference and a few procedures for the detection of non-linear and non-stationary features within the process starting from measured data.
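The "information flows in space" idea can be made concrete by scanning a time-lagged mutual information between rainfall series recorded at two gauges. The sketch below is a generic, histogram-based illustration on synthetic series (the gauges, lag structure, and bin count are assumptions), not the procedure used by the authors.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based mutual information estimate (in bits) between two series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

# Synthetic example: gauge B partially follows gauge A with a 3-step lag.
rng = np.random.default_rng(0)
rain_a = rng.gamma(shape=0.3, scale=5.0, size=5000)
rain_b = 0.6 * np.roll(rain_a, 3) + 0.4 * rng.gamma(shape=0.3, scale=5.0, size=5000)

for lag in range(6):
    mi = mutual_information(rain_a[:len(rain_a) - lag], rain_b[lag:])
    print(f"lag {lag}: MI = {mi:.3f} bits")   # peaks near the imposed lag of 3
```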
The role of PET quantification in cardiovascular imaging.
Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido
2014-08-01
Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. The myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose ( 18 FDG) and rest perfusion imaging. The myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of the coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in automated and rapid fashion have been developed for 13 N-ammonia, 15 O-water and 82 Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82 Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18 F-flurpiridaz may allow further improvements in the disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18 FDG and 18 F-sodium fluoride tracers in carotids, aorta and coronary arteries has been demonstrated.
FuGEFlow: data model and markup language for flow cytometry.
Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R
2009-06-16
Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt compliant experiment description. The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets. Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experience the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange, including with public flow cytometry repositories currently under development.
Hoganson, David M; Hinkel, Cameron J; Chen, Xiaomin; Agarwal, Ramesh K; Shenoy, Surendra
2014-01-01
Stenosis in a vascular access circuit is the predominant cause of access dysfunction. The hemodynamic significance of a stenosis identified by angiography in an access circuit is uncertain. This study utilizes computational fluid dynamics (CFD) to model flow through arteriovenous fistulas to predict the functional significance of stenosis in vascular access circuits. Three-dimensional models of fistulas were created with a range of clinically relevant stenoses using SolidWorks. Stenosis diameters ranged from 1.0 to 3.0 mm and lengths from 5 to 60 mm within a fistula diameter of 7 mm. CFD analyses were performed using a blood model over a range of blood pressures. Eight patient-specific stenoses were also modeled and analyzed with CFD and the resulting blood flow calculations were validated by comparison with brachial artery flow measured by duplex ultrasound. Predicted flow rates were derived from CFD analysis of a range of stenoses. These stenoses were modeled by CFD and correlated with the ultrasound measured flow rate through the fistula of eight patients. The calculated flow rate using CFD correlated within 20% of ultrasound measured flow for five of eight patients. The mean difference was 17.2% (range, 1.3% to 30.1%). CFD analysis-generated flow rate tables provide valuable information to assess the functional significance of stenosis detected during imaging studies. The CFD study can help in determining the clinical relevance of a stenosis in access dysfunction and guide the need for intervention.
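The abstract does not give enough detail to reproduce the CFD results, but the underlying intuition can be sketched with a much cruder series-resistance (Poiseuille) estimate of flow through a fistula with a stenotic segment. All dimensions, the Newtonian viscosity, and the driving pressure below are assumed illustration values, not the authors' model.

```python
import math

MU = 0.0035  # Pa*s, approximate dynamic viscosity of blood (assumed Newtonian)

def poiseuille_resistance(length_m, diameter_m, mu=MU):
    """Hydraulic resistance of a straight cylindrical segment, R = 128*mu*L/(pi*d^4)."""
    return 128.0 * mu * length_m / (math.pi * diameter_m ** 4)

def fistula_flow(dp_pa, stenosis_d_mm, stenosis_len_mm,
                 fistula_d_mm=7.0, fistula_len_mm=100.0):
    """Flow (mL/min) through a fistula with one stenotic segment in series."""
    healthy_len_m = (fistula_len_mm - stenosis_len_mm) / 1000.0
    r_total = (poiseuille_resistance(healthy_len_m, fistula_d_mm / 1000.0)
               + poiseuille_resistance(stenosis_len_mm / 1000.0, stenosis_d_mm / 1000.0))
    q_m3_s = dp_pa / r_total
    return q_m3_s * 1e6 * 60.0  # m^3/s -> mL/min

# Example: 40 mmHg driving pressure across a 2 mm x 20 mm stenosis (illustrative only).
dp = 40 * 133.322
print(f"{fistula_flow(dp, stenosis_d_mm=2.0, stenosis_len_mm=20.0):.0f} mL/min")
```

Such a back-of-the-envelope estimate shows why small reductions in stenosis diameter dominate the total resistance (the d⁴ dependence), which is the effect the full CFD model resolves in detail.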
Modelling information flow along the human connectome using maximum flow.
Lyoo, Youngwook; Kim, Jieun E; Yoon, Sujung
2018-01-01
The human connectome is a complex network that transmits information between interlinked brain regions. Using graph theory, previously well-known network measures of integration between brain regions have been constructed under the key assumption that information flows strictly along the shortest paths possible between two nodes. However, it is now apparent that information does flow through non-shortest paths in many real-world networks such as cellular networks, social networks, and the internet. In the current hypothesis, we present a novel framework using the maximum flow to quantify information flow along all possible paths within the brain, so as to implement an analogy to network traffic. We hypothesize that the connection strengths of brain networks represent a limit on the amount of information that can flow through the connections per unit of time. This allows us to compute the maximum amount of information flow between two brain regions along all possible paths. Using this novel framework of maximum flow, previous network topological measures are expanded to account for information flow through non-shortest paths. The most important advantage of the current approach using maximum flow is that it can integrate the weighted connectivity data in a way that better reflects the real information flow of the brain network. The current framework and its concept regarding maximum flow provides insight on how network structure shapes information flow in contrast to graph theory, and suggests future applications such as investigating structural and functional connectomes at a neuronal level. Copyright © 2017 Elsevier Ltd. All rights reserved.
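A minimal sketch of the maximum-flow idea on a toy weighted network is shown below using NetworkX; the region names, capacities, and the reading of connection strength as a per-unit-time information limit are assumptions for illustration, not the authors' data.

```python
import networkx as nx

edges = [
    ("V1", "LGN", 4.0), ("LGN", "Pulvinar", 1.5), ("V1", "V2", 3.0),
    ("V2", "Parietal", 2.0), ("Pulvinar", "Parietal", 1.0),
    ("Parietal", "PFC", 2.5), ("V2", "PFC", 0.5),
]

# Directed graph with symmetric capacities: connection strength is read as an
# upper bound on information flow per unit of time (assumed interpretation).
G = nx.DiGraph()
for a, b, w in edges:
    G.add_edge(a, b, capacity=w)
    G.add_edge(b, a, capacity=w)

flow_value, flow_dict = nx.maximum_flow(G, "V1", "PFC")
print("max flow V1 -> PFC:", flow_value)

# Contrast with the shortest-path view used by classical integration measures.
print("shortest path:", nx.shortest_path(G, "V1", "PFC"))
```

The point of the contrast with the shortest path is that maximum flow aggregates capacity over all parallel routes, which is the property the hypothesis exploits.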
Hur, Saebeom; Jae, Hwan Jun; Jang, Yeonggul; Min, Seung-Kee; Min, Sang-Il; Lee, Dong Yeon; Seo, Sang Gyo; Kim, Hyo-Cheol; Chung, Jin Wook; Kim, Kwang Gi; Park, Eun-Ah; Lee, Whal
2016-04-01
To demonstrate the feasibility of foot blood flow measurement by using dynamic volume perfusion computed tomographic (CT) technique with the upslope method in an animal experiment and a human study. The human study was approved by the institutional review board, and written informed consent was obtained from all patients. The animal study was approved by the research animal care and use committee. A perfusion CT experiment was first performed by using rabbits. A color-coded perfusion map was reconstructed by using in-house perfusion analysis software based on the upslope method, and the measured blood flow on the map was compared with the reference standard microsphere method by using correlation analysis. A total of 17 perfusion CT sessions were then performed (a) once in five human patients and (b) twice (before and after endovascular revascularization) in six human patients. Perfusion maps of blood flow were reconstructed and analyzed. The Wilcoxon signed rank test was used to prove significant differences in blood flow before and after treatment. The animal experiment demonstrated a strong correlation (R(2) = 0.965) in blood flow between perfusion CT and the microsphere method. Perfusion maps were obtained successfully in 16 human clinical sessions (94%) with the use of 32 mL of contrast medium and an effective radiation dose of 0.31 mSv (k factor for the ankle, 0.0002). The plantar dermis showed the highest blood flow among all anatomic structures of the foot, including muscle, subcutaneous tissue, tendon, and bone. After a successful revascularization procedure, the blood flow of the plantar dermis increased by 153% (P = .031). The interpretations of the color-coded perfusion map correlated well with the clinical and angiographic findings. Perfusion CT could be used to measure foot blood flow in both animals and humans. It can be a useful modality for the diagnosis of peripheral arterial disease by providing quantitative information on foot perfusion status.
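As a rough illustration of the upslope method mentioned above, tissue blood flow is commonly approximated as the maximum slope of the tissue enhancement curve divided by the peak of the arterial input function. The sketch below uses synthetic curves and assumed units and is not the in-house perfusion software described in the study.

```python
import numpy as np

def upslope_flow(t_s, tissue_hu, arterial_hu, tissue_density_g_ml=1.05):
    """Blood flow (mL/min/100 g) by the maximum-slope (upslope) method."""
    max_slope = np.max(np.gradient(tissue_hu, t_s))      # HU per second
    aif_peak = np.max(arterial_hu)                        # HU
    flow_per_ml_tissue = max_slope / aif_peak             # 1/s
    return flow_per_ml_tissue * 60.0 * 100.0 / tissue_density_g_ml

# Synthetic enhancement curves (illustrative only).
t = np.arange(0.0, 40.0, 1.0)                            # seconds
aif = 300.0 * np.exp(-0.5 * ((t - 12.0) / 3.0) ** 2)     # arterial input, HU
tissue = 20.0 / (1.0 + np.exp(-(t - 15.0) / 3.0))        # tissue enhancement, HU

print(f"estimated flow: {upslope_flow(t, tissue, aif):.1f} mL/min/100 g")
```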
Predicting System Accidents with Model Analysis During Hybrid Simulation
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land D.; Throop, David R.
2002-01-01
Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.
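The "linear circuit analysis of fluid flows" mentioned above can be pictured, in its simplest form, as nodal analysis on a flow network: branch conductances relate flows to pressure differences and node pressures are obtained from a linear solve. The sketch below is a generic illustration with assumed values, not CONFIG's actual algorithm.

```python
import numpy as np

# Nodes 0..3; node 3 is the reference (e.g., a vent at zero gauge pressure).
# Each branch is (node_i, node_j, conductance) with flow q = g * (p_i - p_j).
branches = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 0.5), (2, 3, 1.5)]
injection = np.array([1.0, 0.0, 0.0])    # source flow into node 0 (arbitrary units)

n = 3  # unknown pressures at nodes 0..2 (node 3 is grounded)
G = np.zeros((n, n))
for i, j, g in branches:
    for a, b in ((i, j), (j, i)):
        if a < n:
            G[a, a] += g
            if b < n:
                G[a, b] -= g

p = np.linalg.solve(G, injection)        # node pressures
print("pressures:", p)
for i, j, g in branches:
    pi = p[i] if i < n else 0.0
    pj = p[j] if j < n else 0.0
    print(f"flow {i}->{j}: {g * (pi - pj):+.3f}")
```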
Thermal Analysis of Magnetically-Coupled Pump for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Senocak, Inanc; Udaykumar, H. S.; Ndri, Narcisse; Francois, Marianne; Shyy, Wei
1999-01-01
A magnetically-coupled pump is under evaluation at Kennedy Space Center for possible cryogenic applications. A major concern is the impact of low temperature fluid flows on the pump performance. As a first step toward addressing this and related issues, a computational fluid dynamics and heat transfer tool has been applied to a pump geometry. The computational tool includes (i) a commercial grid generator to handle multiple grid blocks and complicated geometric definitions, and (ii) in-house computational fluid dynamics and heat transfer software developed in the Principal Investigator's group at the University of Florida. Both pure-conduction and combined convection-conduction computations have been conducted. A pure-conduction analysis gives insufficient information about the overall thermal distribution. Combined convection-conduction analysis indicates the significant influence of the coolant over the entire flow path. Since 2-D simulation is of limited help, future work on full 3-D modeling of the pump using multiple materials is needed. A comprehensive and accurate model can be developed to take into account the effect of multi-phase flow in the cooling flow loop, and the magnetic interactions.
IB-LBM simulation of the haemocyte dynamics in a stenotic capillary.
Yuan-Qing, Xu; Xiao-Ying, Tang; Fang-Bao, Tian; Yu-Hua, Peng; Yong, Xu; Yan-Jun, Zeng
2014-01-01
To study the behaviour of a haemocyte when crossing a stenotic capillary, the immersed boundary-lattice Boltzmann method was used to establish a quantitative analysis model. The haemocyte was assumed to be spherical and to have an elastic cell membrane, which can be driven by blood flow to adopt a highly deformable character. In the stenotic capillary, the spherical blood cell was stressed both by the flow and the wall dimension, and the cell shape was forced to be stretched to cross the stenosis. Our simulation investigated the haemocyte crossing process in detail. The velocity and pressure were anatomised to obtain information on how blood flows through a capillary and to estimate the degree of cell damage caused by excessive pressure. Quantitative velocity analysis results demonstrated that a large haemocyte crossing a small stenosis would have a noticeable effect on blood flow, while quantitative pressure distribution analysis results indicated that the crossing process would produce a special pressure distribution in the cell interior and to some extent a sudden change between the cell interior and the surrounding plasma.
Communication Dynamics in Finite Capacity Social Networks
NASA Astrophysics Data System (ADS)
Haerter, Jan O.; Jamtveit, Bjørn; Mathiesen, Joachim
2012-10-01
In communication networks, structure and dynamics are tightly coupled. The structure controls the flow of information and is itself shaped by the dynamical process of information exchanged between nodes. In order to reconcile structure and dynamics, a generic model, based on the local interaction between nodes, is considered for the communication in large social networks. In agreement with data from a large human organization, we show that the flow is non-Markovian and controlled by the temporal limitations of individuals. We confirm the versatility of our model by predicting simultaneously the degree-dependent node activity, the balance between information input and output of nodes, and the degree distribution. Finally, we quantify the limitations to network analysis when it is based on data sampled over a finite period of time.
An impact of environmental changes on flows in the reach scale under a range of climatic conditions
NASA Astrophysics Data System (ADS)
Karamuz, Emilia; Romanowicz, Renata J.
2016-04-01
The present paper combines detection and adequate identification of the causes of changes in the flow regime at cross-sections along the Middle River Vistula reach using different methods. Two main experimental set-ups (designs) have been applied to study the changes: a moving three-year window and a low- and high-flow event-based approach. In the first experiment, a Stochastic Transfer Function (STF) model and a quantile-based statistical analysis of flow patterns were compared. These two methods are based on the analysis of changes of the STF model parameters and standardised differences of flow quantile values. In the second experiment, in addition to the STF-based model, a 1-D distributed model, MIKE11, was also applied. The first step of the procedure used in the study is to define the river reaches that have recorded information on land use and water management changes. The second task is to perform the moving window analysis of standardised differences of flow quantiles and moving window optimisation of the STF model for flow routing. The third step consists of an optimisation of the STF and MIKE11 models for high- and low-flow events. The final step is to analyse the results and relate the standardised quantile changes and model parameter changes to historical land use changes and water management practices. Results indicate that both models give a consistent assessment of changes in the channel for medium and high flows. Acknowledgements: This research was supported by the Institute of Geophysics Polish Academy of Sciences through the Young Scientist Grant no. 3b/IGF PAN/2015.
Thermohydrodynamic analysis of cryogenic liquid turbulent flow fluid film bearings, phase 2
NASA Technical Reports Server (NTRS)
Sanandres, Luis
1994-01-01
The Phase 2 (1994) Annual Progress Report presents two major report sections describing the thermal analysis of tilting- and flexure-pad hybrid bearings, and the unsteady flow and transient response of a point mass rotor supported on fluid film bearings. A literature review on the subject of two-phase flow in fluid film bearings and part of the proposed work for 1995 are also included. The programs delivered at the end of 1994 are named hydroflext and hydrotran. Both codes are fully compatible with the hydrosealt (1993) program. The new programs retain the same calculating options of hydrosealt plus the added bearing geometries, and unsteady flow and transient forced response. Refer to the hydroflext & hydrotran User's Manual and Tutorial for basic information on the analysis and instructions to run the programs. The Examples Handbook contains the test bearing cases along with comparisons with experimental data or published analytical values. The following major tasks were completed in 1994 (Phase 2): (1) extension of the thermohydrodynamic analysis and development of computer program hydroflext to model various bearing geometries, namely, tilting-pad hydrodynamic journal bearings, flexure-pad cylindrical bearings (hydrostatic and hydrodynamic), and cylindrical pad bearings with a simple elastic matrix (ideal foil bearings); (2) improved thermal model including radial heat transfer through the bearing stator; (3) calculation of the unsteady bulk-flow field in fluid film bearings and the transient response of a point mass rotor supported on bearings; and (4) a literature review on the subject of two-phase flows and homogeneous-mixture flows in thin-film geometries.
Endo, Hidenori; Niizuma, Kuniyasu; Endo, Toshiki; Funamoto, Kenichi; Ohta, Makoto; Tominaga, Teiji
2016-01-01
This was a proof-of-concept computational fluid dynamics (CFD) study designed to identify atherosclerotic changes in intracranial aneurysms. We selected 3 patients with multiple unruptured aneurysms including at least one with atherosclerotic changes and investigated whether an image-based CFD study could provide useful information for discriminating the atherosclerotic aneurysms. Patient-specific geometries were constructed from three-dimensional data obtained using rotational angiography. Transient simulations were conducted under patient-specific inlet flow rates measured by phase-contrast magnetic resonance velocimetry. In the postanalyses, we calculated time-averaged wall shear stress (WSS), oscillatory shear index, and relative residence time (RRT). The volume of blood flow entering aneurysms through the neck and the mean velocity of blood flow inside aneurysms were examined. We applied the age-of-fluid method to quantitatively assess the residence of blood inside aneurysms. Atherosclerotic changes coincided with regions exposed to disturbed blood flow, as indicated by low WSS and long RRT. Blood entered aneurysms in phase with inlet flow rates. The mean velocities of blood inside atherosclerotic aneurysms were lower than those inside nonatherosclerotic aneurysms. Blood in atherosclerotic aneurysms was older than that in nonatherosclerotic aneurysms, especially near the wall. This proof-of-concept study demonstrated that CFD analysis provided detailed information on the exchange and residence of blood that is useful for the diagnosis of atherosclerotic changes in intracranial aneurysms. PMID:27703491
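The hemodynamic indices named in the abstract have standard definitions: the time-averaged WSS magnitude (TAWSS), the oscillatory shear index OSI = 0.5·(1 − |⟨τ⟩| / ⟨|τ|⟩), and the relative residence time RRT = 1 / ((1 − 2·OSI)·TAWSS). A minimal sketch of their computation at a single wall point follows; the WSS samples are synthetic, not the study's data.

```python
import numpy as np

def wss_indices(tau):
    """tau: (n_steps, 3) instantaneous WSS vectors [Pa] sampled uniformly over one cycle."""
    mean_vec = tau.mean(axis=0)                      # time-averaged WSS vector
    mag_of_mean = np.linalg.norm(mean_vec)           # |<tau>|
    tawss = np.linalg.norm(tau, axis=1).mean()       # <|tau|>
    osi = 0.5 * (1.0 - mag_of_mean / tawss)
    rrt = 1.0 / ((1.0 - 2.0 * osi) * tawss)          # equals 1 / |<tau>|
    return tawss, osi, rrt

# Synthetic oscillating WSS at one wall point (illustrative only).
t = np.linspace(0.0, 1.0, 200, endpoint=False)
tau = np.stack([1.5 + 1.0 * np.sin(2 * np.pi * t),
                0.8 * np.sin(4 * np.pi * t),
                np.zeros_like(t)], axis=1)

tawss, osi, rrt = wss_indices(tau)
print(f"TAWSS = {tawss:.3f} Pa, OSI = {osi:.3f}, RRT = {rrt:.3f} 1/Pa")
```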
On the Limitations of Breakthrough Curve Analysis in Fixed-Bed Adsorption
NASA Technical Reports Server (NTRS)
Knox, James C.; Ebner, Armin D.; LeVan, M. Douglas; Coker, Robert F.; Ritter, James A.
2016-01-01
This work examined in detail the a priori prediction of the axial dispersion coefficient from available correlations versus obtaining it and also mass transfer information from experimental breakthrough data and the consequences that may arise when doing so based on using a 1-D axially dispersed plug flow model and its associated Danckwerts outlet boundary condition. These consequences mainly included determining the potential for erroneous extraction of the axial dispersion coefficient and/or the LDF mass transfer coefficient from experimental data, especially when non-plug flow conditions prevailed in the bed. Two adsorbent/adsorbate cases were considered, i.e., carbon dioxide and water vapor in zeolite 5A, because they both experimentally exhibited significant non-plug flow behavior, and the water-zeolite 5A system exhibited unusual concentration front sharpening that destroyed the expected constant pattern behavior (CPB) when modeled with the 1-D axially dispersed plug flow model. Overall, this work showed that it was possible to extract accurate mass transfer and dispersion information from experimental breakthrough curves using a 1-D axial dispersed plug flow model when they were measured both inside and outside the bed. To ensure the extracted information was accurate, the inside the bed breakthrough curves and their derivatives from the model were plotted to confirm whether or not the adsorbate/adsorbent system was exhibiting CPB or any concentration front sharpening near the bed exit. Even when concentration front sharpening was occurring with the water-zeolite 5A system, it was still possible to use the experimental inside and outside the bed breakthrough curves to extract fundamental mass transfer and dispersion information from the 1-D axial dispersed plug flow model based on the systematic methodology developed in this work.
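For reference, the 1-D axially dispersed plug flow model with a linear driving force (LDF) rate expression and the Danckwerts boundary conditions discussed above is conventionally written as below (a standard textbook form; the symbols and sign conventions may differ slightly from those used in the paper).

```latex
\begin{align}
% Fluid-phase balance with axial dispersion, convection, and adsorption sink
\varepsilon \frac{\partial c}{\partial t}
  + (1-\varepsilon)\,\rho_p \frac{\partial \bar q}{\partial t}
  &= \varepsilon D_L \frac{\partial^2 c}{\partial z^2}
   - \frac{\partial (u c)}{\partial z}, \\
% Linear driving force (LDF) mass-transfer rate
\frac{\partial \bar q}{\partial t} &= k_{\mathrm{LDF}}\left(q^{*}(c) - \bar q\right), \\
% Danckwerts boundary conditions at the bed inlet and outlet
\left. u c - \varepsilon D_L \frac{\partial c}{\partial z}\right|_{z=0^{+}} &= u\, c_{\mathrm{feed}},
\qquad
\left.\frac{\partial c}{\partial z}\right|_{z=L} = 0 .
\end{align}
```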
Fluid dynamics in flexible tubes: An application to the study of the pulmonary circulation
NASA Technical Reports Server (NTRS)
Kuchar, N. R.
1971-01-01
Based on an analysis of unsteady, viscous flow through distensible tubes, a lumped-parameter model for the dynamics of blood flow through the pulmonary vascular bed was developed. The model is nonlinear, incorporating the variation of flow resistance with transmural pressure. Solved using a hybrid computer, the model yields information concerning the time-dependent behavior of blood pressures, flow rates, and volumes in each important class of vessels in each lobe of each lung in terms of the important physical and environmental parameters. Simulations of twenty abnormal or pathological situations of interest in environmental physiology and clinical medicine were performed. The model predictions agree well with physiological data.
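A minimal flavour of such a lumped-parameter model is a single resistance-compliance compartment driven by a pulsatile inflow, with the resistance allowed to vary with transmural pressure. The sketch below is a generic illustration with assumed parameter values and waveform, not the authors' multi-lobe pulmonary model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed parameters (arbitrary but physiologically-flavoured units: mmHg, mL, s).
C = 2.0            # vessel compliance, mL/mmHg
R0 = 0.08          # baseline flow resistance, mmHg*s/mL
P_OUT = 8.0        # downstream (left atrial) pressure, mmHg

def resistance(p):
    """Nonlinear resistance decreasing with transmural pressure (assumed form)."""
    return R0 / (1.0 + 0.05 * max(p, 0.0))

def q_in(t):
    """Pulsatile inflow from the right ventricle, mL/s (synthetic waveform)."""
    phase = t % 0.8
    return 350.0 * np.sin(np.pi * phase / 0.3) if phase < 0.3 else 0.0

def dpdt(t, y):
    p = y[0]
    q_out = (p - P_OUT) / resistance(p)
    return [(q_in(t) - q_out) / C]

sol = solve_ivp(dpdt, (0.0, 4.0), [12.0], max_step=1e-3)
print(f"pressure range over 4 s: {sol.y[0].min():.1f}-{sol.y[0].max():.1f} mmHg")
```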
Publications - GMC 332 | Alaska Division of Geological & Geophysical Surveys
DGGS GMC 332 Publication Details. Title: X-Ray Diffraction analysis and flow testing of Hemlock information. Quadrangle(s): Alaska Statewide. Bibliographic Reference: BJ Services Company, 2006, X-Ray
Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software
Williams, Linda; Grayson, Diana; Gosbee, John
2001-01-01
Drawing software from Lassalle Technologies (France) designed for Visual Basic is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.
NASA Technical Reports Server (NTRS)
Meyer, Marit Elisabeth
2015-01-01
A thermal precipitator (TP) was designed to collect smoke aerosol particles for microscopic analysis in fire characterization research. Information on particle morphology, size and agglomerate structure obtained from these tests supplements additional aerosol data collected. Modeling of the thermal precipitator throughout the design process was performed with the COMSOL Multiphysics finite element software package, including the Eulerian flow field and thermal gradients in the fluid. The COMSOL Particle Tracing Module was subsequently used to determine particle deposition. Modeling provided optimized design parameters such as geometry, flow rate and temperatures. The thermal precipitator was built and testing verified the performance of the first iteration of the device. The thermal precipitator was successfully operated and provided quality particle samples for microscopic analysis, which furthered the body of knowledge on smoke particulates. This information is a key element of smoke characterization and will be useful for future spacecraft fire detection research.
Simulation of a 3D Turbulent Wavy Channel based on the High-order WENO Scheme
NASA Astrophysics Data System (ADS)
Tsai, Bor-Jang; Chou, Chung-Chyi; Tsai, Yeong-Pei; Chuang, Ying Hung
2018-02-01
Passive turbulent drag reduction is of interest as an effective means to reduce air vehicle fuel consumption costs. Most turbulence problems arising in nature and in engineering applications are caused by one or more turbulent shear flows. This study considered incompressible 3-D channels with a cyclic wavy boundary to explore the physical properties of the turbulent flow. The research measures the distribution of average velocity, instantaneous flow-field shapes, turbulence, and pressure distribution, and a systematic computation and analysis of the 3-D flow field was also carried out. The aim was to clearly understand the turbulence fields formed by the wavy boundary of the tube flow. The purpose of this research is to obtain systematic structural information about the turbulent flow field, and features of the turbulence structure are discussed.
Spectral features of solar plasma flows
NASA Astrophysics Data System (ADS)
Barkhatov, N. A.; Revunov, S. E.
2014-11-01
This research is devoted to the identification of plasma flows in the solar wind from the spectral characteristics of solar plasma flows in the magnetohydrodynamic range. To do this, wavelet skeleton patterns of solar wind parameters recorded in Earth orbit by patrol spacecraft are constructed and then classified by a neural network, differentiated by bandwidth. This analysis of the spectral features of solar plasma flows was carried out for magnetic clouds (MC), corotating interaction regions (CIR), shock waves (Shocks) and high-speed streams from coronal holes (HSS). The proposed data processing and the original correlation-spectral method for handling information about solar wind flows can be used for further classification in the online monitoring of near space. This approach will allow geoeffective structures in the solar wind flow to be detected at an early stage in order to predict global geomagnetic disturbances.
Natural Resource Information System, design analysis
NASA Technical Reports Server (NTRS)
1972-01-01
The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.
A project management system for the X-29A flight test program
NASA Technical Reports Server (NTRS)
Stewart, J. F.; Bauer, C. A.
1983-01-01
The project-management system developed for NASA's participation in the X-29A aircraft development program is characterized from a theoretical perspective, as an example of a system appropriate to advanced, highly integrated technology projects. System-control theory is applied to the analysis of classical project-management techniques and structures, which are found to be of closed-loop multivariable type; and the effects of increasing project complexity and integration are evaluated. The importance of information flow, sampling frequency, information holding, and delays is stressed. The X-29A system is developed in four stages: establishment of overall objectives and requirements, determination of information processes (block diagrams) definition of personnel functional roles and relationships, and development of a detailed work-breakdown structure. The resulting system is shown to require a greater information flow to management than conventional methods. Sample block diagrams are provided.
A Geometry Based Infra-structure for Computational Analysis and Design
NASA Technical Reports Server (NTRS)
Haimes, Robert
1997-01-01
The computational steps traditionally taken for most engineering analysis (CFD, structural analysis, etc.) are: Surface Generation - usually by employing a CAD system; Grid Generation - preparing the volume for the simulation; Flow Solver - producing the results at the specified operational point; and Post-processing Visualization - interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. For CFD, these steps have worked well in the past for simple steady-state simulations at the expense of much user interaction. The data was transmitted between phases via files. Specifically, the problems with this procedure are: (1) File based. Information flows from one step to the next via data files with formats specified for that procedure. (2) 'Good' Geometry. A bottleneck in getting results from a solver is the construction of proper geometry to be fed to the grid generator. With 'good' geometry a grid can be constructed in tens of minutes (even with a complex configuration) using unstructured techniques. (3) One-Way communication. All information travels on from one phase to the next. Until this process can be automated, more complex problems such as multi-disciplinary analysis or using the above procedure for design become prohibitive.
Transitional flow in thin tubes for space station freedom radiator
NASA Technical Reports Server (NTRS)
Loney, Patrick; Ibrahim, Mounir
1995-01-01
A two dimensional finite volume method is used to predict the film coefficients in the transitional flow region (laminar or turbulent) for the radiator panel tubes. The code used to perform this analysis is CAST (Computer Aided Simulation of Turbulent Flows). The information gathered from this code is then used to augment a Sinda85 model that predicts overall performance of the radiator. A final comparison is drawn between the results generated with a Sinda85 model using the Sinda85 provided transition region heat transfer correlations and the Sinda85 model using the CAST generated data.
Flow visualization and flow field measurements of a 1/12 scale tilt rotor aircraft in hover
NASA Technical Reports Server (NTRS)
Coffen, Charles D.; George, Albert R.; Hardinge, Hal; Stevenson, Ryan
1991-01-01
The results are given of flow visualization studies and inflow velocity field measurements performed on a 1/12 scale model of the XV-15 tilt rotor aircraft in the hover mode. The complex recirculating flow due to the rotor-wake-body interactions characteristic of tilt rotors was studied visually using neutrally buoyant soap bubbles and quantitatively using hot wire anemometry. Still and video photography were used to record the flow patterns. Analysis of the photos and video provided information on the physical dimensions of the recirculating fountain flow and on details of the flow including the relative unsteadiness and turbulence characteristics of the flow. Recirculating flows were also observed along the length of the fuselage. Hot wire anemometry results indicate that the wing under the rotor acts to obstruct the inflow causing a deficit in the inflow velocities over the inboard region of the model. Hot wire anemometry also shows that the turbulence intensities in the inflow are much higher in the recirculating fountain reingestion zone.
NASA Astrophysics Data System (ADS)
Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric
2017-10-01
The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, the ability to quantitatively communicate this information is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphology, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
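To make the outline-based descriptors concrete, the sketch below computes the Zahn-Roskies tangent-angle shape function for a closed outline in plain NumPy. The outline is synthetic and the sampling choices are assumptions; the study's full workflow (elliptic Fourier analysis, normalization, and CVA) is not reproduced here.

```python
import numpy as np

def zahn_roskies(xy, n_samples=256):
    """Tangent-angle (Z-R) shape function of a closed outline, minus the circle baseline."""
    xy = np.asarray(xy, dtype=float)
    d = np.diff(np.vstack([xy, xy[:1]]), axis=0)          # edge vectors around the loop
    seg = np.hypot(d[:, 0], d[:, 1])
    s = np.concatenate([[0.0], np.cumsum(seg)])[:-1] / seg.sum()   # normalized arc length
    theta = np.unwrap(np.arctan2(d[:, 1], d[:, 0]))       # cumulative tangent angle
    phi = theta - theta[0] - 2.0 * np.pi * s              # subtract a circle's linear trend
    s_new = np.linspace(0.0, 1.0, n_samples, endpoint=False)
    return np.interp(s_new, s, phi)                       # resample for comparison/statistics

# Synthetic scalloped outline vs. a circle (illustrative only).
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
r = 1.0 + 0.15 * np.cos(5 * t)
outline = np.column_stack([r * np.cos(t), r * np.sin(t)])
circle = np.column_stack([np.cos(t), np.sin(t)])

print("Z-R amplitude (scalloped):", np.ptp(zahn_roskies(outline)).round(3))
print("Z-R amplitude (circle):   ", np.ptp(zahn_roskies(circle)).round(3))
```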
Closing Intelligence Gaps: Synchronizing the Collection Management Process
information flow. The US military divides the world into six distinct geographic areas with corresponding commanders managing risk and weighing ... analyzed information, creating a mismatch between supply and demand. The result is a burden on all facets of the intelligence process. However, if the target ... system, or problem requiring analysis is not collected, intelligence fails. Executing collection management under the traditional tasking process
ERIC Educational Resources Information Center
Zheng, Lanqin; Yang, Kaicheng; Huang, Ronghuai
2012-01-01
This study proposes a new method named the IIS-map-based method for analyzing interactions in face-to-face collaborative learning settings. This analysis method is conducted in three steps: firstly, drawing an initial IIS-map according to collaborative tasks; secondly, coding and segmenting information flows into information items of IIS; thirdly,…
NASA Astrophysics Data System (ADS)
Yang, Thomas; Shen, Yang; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh
2017-03-01
Silicon testing results are regularly collected for a particular lot of wafers to study yield loss from test result diagnostics. Product engineers will analyze the diagnostic results and perform a number of physical failure analyses to detect systematic defects which cause yield loss for these sets of wafers, in order to feed the information back to process engineers for process improvements. Most of the time, the systematic defects that are detected are major issues or just one of the causes of the overall yield loss. This paper will present a working flow for using design analysis techniques combined with diagnostic methods to systematically transform silicon testing information into physical layout information. A new set of testing results is received from a new lot of wafers for the same product. We can then correlate all the diagnostic results from different periods of time to check which blocks or nets have been highlighted or have stopped occurring on the failure reports, in order to monitor process changes which impact the yield. The design characteristic analysis flow is also implemented to find 1) the block connections on a design that have failed electrical test or 2) frequently used cells that have been highlighted multiple times.
Direct numerical simulation of reactor two-phase flows enabled by high-performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Jun; Cambareri, Joseph J.; Brown, Cameron S.
Nuclear reactor two-phase flows remain a great engineering challenge, where the high-resolution two-phase flow database which can inform practical model development is still sparse due to the extreme reactor operation conditions and measurement difficulties. Owing to the rapid growth of computing power, the direct numerical simulation (DNS) is enjoying a renewed interest in investigating the related flow problems. A combination between DNS and an interface tracking method can provide a unique opportunity to study two-phase flows based on first principles calculations. More importantly, state-of-the-art high-performance computing (HPC) facilities are helping unlock this great potential. This paper reviews the recent research progress of two-phase flow DNS related to reactor applications. The progress in large-scale bubbly flow DNS has been focused not only on the sheer size of those simulations in terms of resolved Reynolds number, but also on the associated advanced modeling and analysis techniques. Specifically, the current areas of active research include modeling of sub-cooled boiling, bubble coalescence, as well as the advanced post-processing toolkit for bubbly flow simulations in reactor geometries. A novel bubble tracking method has been developed to track the evolution of bubbles in two-phase bubbly flow. Also, spectral analysis of DNS database in different geometries has been performed to investigate the modulation of the energy spectrum slope due to bubble-induced turbulence. In addition, the single- and two-phase analysis results are presented for turbulent flows within the pressurized water reactor (PWR) core geometries. The related simulations are possible to carry out only with the world leading HPC platforms. These simulations are allowing more complex turbulence model development and validation for use in 3D multiphase computational fluid dynamics (M-CFD) codes.
Williams, John H.; Paillet, Frederick L.
2002-01-01
Flow zones in a fractured shale in and near a plume of volatile organic compounds at the Watervliet Arsenal in Albany County, N. Y. were characterized through the integrated analysis of geophysical logs and single- and cross-hole flow tests. Information on the fracture-flow network at the site was needed to design an effective groundwater monitoring system, estimate offsite contaminant migration, and evaluate potential containment and remedial actions. Four newly drilled coreholes and four older monitoring wells were logged and tested to define the distribution and orientation of fractures that intersected a combined total of 500 feet of open hole. Analysis of borehole-wall image logs obtained with acoustic and optical televiewers indicated 79 subhorizontal to steeply dipping fractures with a wide range of dip directions. Analysis of fluid resistivity, temperature, and heat-pulse and electromagnetic flowmeter logs obtained under ambient and short-term stressed conditions identified 14 flow zones, which consist of one to several fractures and whose estimated transmissivity values range from 0.1 to more than 250 feet squared per day. Cross-hole flow tests, which were used to characterize the hydraulic connection between fracture-flow zones intersected by the boreholes, entailed (1) injection into or extraction from boreholes that penetrated a single fracture-flow zone or whose zones were isolated by an inflatable packer, and (2) measurement of the transient response of water levels and flow in surrounding boreholes. Results indicate a well-connected fracture network with an estimated transmissivity of 80 to 250 feet squared per day that extends for at least 200 feet across the site. This interconnected fracture-flow network greatly affects the hydrology of the site and has important implications for contaminant monitoring and remedial actions.
NASA Astrophysics Data System (ADS)
von Larcher, Thomas; Blome, Therese; Klein, Rupert; Schneider, Reinhold; Wolf, Sebastian; Huber, Benjamin
2016-04-01
Handling high-dimensional data sets, such as those that occur e.g. in turbulent flows or in certain types of multiscale behaviour in the Geosciences, is one of the big challenges in numerical analysis and scientific computing. A suitable solution is to represent those large data sets in an appropriate compact form. In this context, tensor product decomposition methods currently emerge as an important tool. One reason is that these methods often enable one to attack high-dimensional problems successfully, another that they allow for very compact representations of large data sets. We follow the novel Tensor-Train (TT) decomposition method to support the development of improved understanding of the multiscale behavior and the development of compact storage schemes for solutions of such problems. One long-term goal of the project is the construction of a self-consistent closure for Large Eddy Simulations (LES) of turbulent flows that explicitly exploits the tensor product approach's capability of capturing self-similar structures. Secondly, we focus on a mixed deterministic-stochastic subgrid scale modelling strategy currently under development for application in Finite Volume Large Eddy Simulation (LES) codes. Advanced methods of time series analysis for the data-based construction of stochastic models with inherently non-stationary statistical properties and concepts of information theory based on a modified Akaike information criterion and on the Bayesian information criterion for the model discrimination are used to construct surrogate models for the non-resolved flux fluctuations. Vector-valued auto-regressive models with external influences form the basis for the modelling approach [1], [2], [4]. Here, we present the reconstruction capabilities of the two modeling approaches tested against 3D turbulent channel flow data computed by direct numerical simulation (DNS) for an incompressible, isothermal fluid at Reynolds number Reτ = 590 (computed by [3]). References [1] I. Horenko. On identification of nonstationary factor models and its application to atmospherical data analysis. J. Atm. Sci., 67:1559-1574, 2010. [2] P. Metzner, L. Putzig and I. Horenko. Analysis of persistent non-stationary time series and applications. CAMCoS, 7:175-229, 2012. [3] M. Uhlmann. Generation of a temporally well-resolved sequence of snapshots of the flow-field in turbulent plane channel flow. URL: http://www-turbul.ifh.unikarlsruhe.de/uhlmann/reports/produce.pdf, 2000. [4] Th. von Larcher, A. Beck, R. Klein, I. Horenko, P. Metzner, M. Waidmann, D. Igdalov, G. Gassner and C.-D. Munz. Towards a Framework for the Stochastic Modelling of Subgrid Scale Fluxes for Large Eddy Simulation. Meteorol. Z., 24:313-342, 2015.
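As a small illustration of the tensor-train idea referenced above, the following sketch performs a plain TT-SVD decomposition of a dense low-rank test tensor with NumPy. The truncation tolerance and the test tensor are assumptions; this is not the project's code.

```python
import numpy as np

def tt_svd(tensor, rel_tol=1e-8):
    """Decompose a dense tensor into TT cores via successive truncated SVDs."""
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k, n in enumerate(dims[:-1]):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        keep = max(1, int(np.sum(s > rel_tol * s[0])))     # truncate small singular values
        cores.append(u[:, :keep].reshape(rank, n, keep))
        rank = keep
        mat = (np.diag(s[:keep]) @ vt[:keep]).reshape(rank * dims[k + 1], -1)
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into a full tensor (for checking the error)."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([out.ndim - 1], [0]))
    return out.squeeze(axis=(0, out.ndim - 1))

# Small synthetic 4-way tensor with low-rank structure (illustrative only).
rng = np.random.default_rng(1)
fa, fb, fc, fd = (rng.standard_normal((8, 3)) for _ in range(4))
T = np.einsum('ir,jr,kr,lr->ijkl', fa, fb, fc, fd)

cores = tt_svd(T, rel_tol=1e-10)
err = np.linalg.norm(tt_reconstruct(cores) - T) / np.linalg.norm(T)
print("TT ranks:", [core.shape[2] for core in cores[:-1]], "relative error:", f"{err:.2e}")
```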
Kim, Jinyoung
2017-12-01
As it becomes common for Internet users to use hashtags when posting and searching information on social media, it is important to understand who builds a hashtag network and how information is circulated within the network. This article focused on unlocking the potential of the #AlphaGo hashtag network by addressing the following questions. First, the current study examined whether traditional opinion leadership (i.e., the influentials hypothesis) or grassroot participation by the public (i.e., the interpersonal hypothesis) drove dissemination of information in the hashtag network. Second, several unique patterns of information distribution by key users were identified. Finally, the association between attributes of key users who exerted great influence on information distribution (i.e., the number of followers and follows) and their central status in the network was tested. To answer the proffered research questions, a social network analysis was conducted using a large-scale hashtag network data set from Twitter (n = 21,870). The results showed that the leading actors in the network were actively receiving information from their followers rather than serving as intermediaries between the original information sources and the public. Moreover, the leading actors played several roles (i.e., conversation starters, influencers, and active engagers) in the network. Furthermore, the number of their follows and followers were significantly associated with their central status in the hashtag network. Based on the results, the current research explained how the information was exchanged in the hashtag network by proposing the reciprocal model of information flow.
Skylab S-191 spectrometer single spectral scan analysis program. [user manual
NASA Technical Reports Server (NTRS)
Downes, E. L.
1974-01-01
Documentation and user information for the S-191 single spectral scan analysis program are reported. A breakdown of the computational algorithms is supplied, followed by the program listing and examples of sample output. A copy of the flow chart which describes the driver routine in the body of the main program segment is included.
Meyer, Georg F; Spray, Amy; Fairlie, Jo E; Uomini, Natalie T
2014-01-01
Current neuroimaging techniques with high spatial resolution constrain participant motion so that many natural tasks cannot be carried out. The aim of this paper is to show how a time-locked correlation-analysis of cerebral blood flow velocity (CBFV) lateralization data, obtained with functional TransCranial Doppler (fTCD) ultrasound, can be used to infer cerebral activation patterns across tasks. In a first experiment we demonstrate that the proposed analysis method results in data that are comparable with the standard Lateralization Index (LI) for within-task comparisons of CBFV patterns, recorded during cued word generation (CWG) at two difficulty levels. In the main experiment we demonstrate that the proposed analysis method shows correlated blood-flow patterns for two different cognitive tasks that are known to draw on common brain areas, CWG, and Music Synthesis. We show that CBFV patterns for Music and CWG are correlated only for participants with prior musical training. CBFV patterns for tasks that draw on distinct brain areas, the Tower of London and CWG, are not correlated. The proposed methodology extends conventional fTCD analysis by including temporal information in the analysis of cerebral blood-flow patterns to provide a robust, non-invasive method to infer whether common brain areas are used in different cognitive tasks. It complements conventional high resolution imaging techniques.
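A simplified illustration of the analysis idea (not the authors' exact pipeline): express lateralization as a left-minus-right CBFV difference series, summarize it with a conventional LI over an activation window, and correlate the series across tasks. All signals, gains, and window choices below are synthetic assumptions.

```python
import numpy as np

def lateralization_series(left, right):
    """Left-minus-right CBFV difference, each channel normalized to its own mean."""
    return left / left.mean() - right / right.mean()

def li(series, t, window=(5.0, 15.0)):
    """Conventional LI: mean lateralization within an assumed activation window (s)."""
    mask = (t >= window[0]) & (t <= window[1])
    return series[mask].mean()

rng = np.random.default_rng(2)
t = np.arange(0.0, 30.0, 0.1)
activation = np.exp(-0.5 * ((t - 10.0) / 3.0) ** 2)       # shared activation shape

def make_task(gain):
    """Synthetic left/right CBFV for a task drawing on left-hemisphere areas."""
    left = 60.0 + gain * activation + rng.normal(0, 0.3, t.size)
    right = 58.0 + 0.3 * gain * activation + rng.normal(0, 0.3, t.size)
    return lateralization_series(left, right)

word_gen = make_task(gain=4.0)
music = make_task(gain=3.0)

print(f"LI (word generation): {li(word_gen, t):.4f}")
print(f"LI (music):           {li(music, t):.4f}")
print(f"cross-task correlation: {np.corrcoef(word_gen, music)[0, 1]:.3f}")
```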
Comparison of connectivity analyses for resting state EEG data
NASA Astrophysics Data System (ADS)
Olejarczyk, Elzbieta; Marzetti, Laura; Pizzella, Vittorio; Zappasodi, Filippo
2017-06-01
Objective. In the present work, a nonlinear measure (transfer entropy, TE) was used in a multivariate approach for the analysis of effective connectivity in high density resting state EEG data in eyes open and eyes closed. Advantages of the multivariate approach in comparison to the bivariate one were tested. Moreover, the multivariate TE was compared to an effective linear measure, i.e. directed transfer function (DTF). Finally, the existence of a relationship between the information transfer and the level of brain synchronization as measured by phase synchronization value (PLV) was investigated. Approach. The comparison between the connectivity measures, i.e. bivariate versus multivariate TE, TE versus DTF, TE versus PLV, was performed by means of statistical analysis of indexes based on graph theory. Main results. The multivariate approach is less sensitive to false indirect connections with respect to the bivariate estimates. The multivariate TE differentiated better between eyes closed and eyes open conditions compared to DTF. Moreover, the multivariate TE evidenced non-linear phenomena in information transfer, which are not evidenced by the use of DTF. We also showed that the target of information flow, in particular the frontal region, is an area of greater brain synchronization. Significance. Comparison of different connectivity analysis methods pointed to the advantages of nonlinear methods, and indicated a relationship existing between the flow of information and the level of synchronization of the brain.
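The phase synchronization measure (PLV) referred to above has a compact definition: the magnitude of the time-averaged unit phasor of the instantaneous phase difference. A minimal sketch with synthetic narrow-band signals follows; real EEG analysis would additionally require band-pass filtering and epoching, which are omitted here.

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase locking value between two narrow-band signals."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic 10 Hz signals with partially coupled phases (illustrative only).
fs, dur = 250.0, 10.0
t = np.arange(0.0, dur, 1.0 / fs)
rng = np.random.default_rng(3)
common = 2 * np.pi * 10.0 * t + 0.5 * np.cumsum(rng.normal(0, 0.05, t.size))
x = np.sin(common) + 0.3 * rng.normal(size=t.size)
y = np.sin(common + 0.8) + 0.3 * rng.normal(size=t.size)

print(f"PLV(x, y)     = {plv(x, y):.3f}")
print(f"PLV(x, noise) = {plv(x, rng.normal(size=t.size)):.3f}")
```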
Analysis of the U.S. geological survey streamgaging network
Scott, A.G.
1987-01-01
This paper summarizes the results from the first 3 years of a 5-year cost-effectiveness study of the U.S. Geological Survey streamgaging network. The objective of the study is to define and document the most cost-effective means of furnishing streamflow information. In the first step of this study, data uses were identified for 3,493 continuous-record stations currently being operated in 32 States. In the second step, evaluation of alternative methods of providing streamflow information, flow-routing models, and regression models were developed for estimating daily flows at 251 stations of the 3,493 stations analyzed. In the third step of the analysis, relationships were developed between the accuracy of the streamflow records and the operating budget. The weighted standard error for all stations, with current operating procedures, was 19.9 percent. By altering field activities, as determined by the analyses, this could be reduced to 17.8 percent. The existing streamgaging networks in four Districts were further analyzed to determine the impacts that satellite telemetry would have on the cost effectiveness. Satellite telemetry was not found to be cost effective on the basis of hydrologic data collection alone, given present cost of equipment and operation. Additional study results are discussed.
Improved Measurement of B(sub 22) of Macromolecules in a Flow Cell
NASA Technical Reports Server (NTRS)
Wilson, Wilbur; Fanguy, Joseph; Holman, Steven; Guo, Bin
2008-01-01
An improved apparatus has been invented for use in determining the osmotic second virial coefficient of macromolecules in solution. In a typical intended application, the macromolecules would be, more specifically, protein molecules, and the protein solution would be pumped through a flow cell to investigate the physical and chemical conditions that affect crystallization of the protein in question. Some background information is prerequisite to a meaningful description of the novel aspects of this apparatus. A method of determining B22 from simultaneous measurements of the static transmittance (taken as an indication of concentration) and static scattering of light from the same location in a flowing protein solution was published in 2004. The apparatus used to implement the method at that time included a dual-detector flow cell, which had two drawbacks: a) The amount of protein required for analysis of each solution condition was of the order of a milligram - far too large a quantity for a high-throughput analysis system, for which microgram or even nanogram quantities of protein per analysis are desirable. b) The design of flow cell was such that two light sources were used to probe different regions of the flowing solution. Consequently, the apparatus did not afford simultaneous measurements at the same location in the solution and, hence, did not guarantee an accurate determination of B22.
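The quantity being measured can be related to the raw optical signals through the usual dilute-solution light-scattering relation, Kc/R ≈ 1/M + 2·B22·c, so B22 follows from the slope of a Debye plot of Kc/R against concentration. The sketch below fits synthetic concentration-series data; the constants and noise level are assumptions, not the instrument's calibration.

```python
import numpy as np

# Synthetic Debye-plot data for a protein solution (illustrative values only).
M_true = 14_300.0          # g/mol (lysozyme-like molar mass, assumed)
B22_true = 2.0e-4          # mol*mL/g^2 (assumed)
c_g_ml = np.linspace(2.0, 20.0, 8) * 1e-3           # concentrations, g/mL

rng = np.random.default_rng(4)
kc_over_r = 1.0 / M_true + 2.0 * B22_true * c_g_ml   # mol/g
kc_over_r *= 1.0 + rng.normal(0, 0.01, c_g_ml.size)  # 1% measurement noise

# Linear fit of the Debye plot: intercept = 1/M, slope = 2*B22.
slope, intercept = np.polyfit(c_g_ml, kc_over_r, 1)
print(f"M   ≈ {1.0 / intercept:,.0f} g/mol")
print(f"B22 ≈ {slope / 2.0:.2e} mol*mL/g^2")
```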
Lin, Chih-Yung; Chuang, Chao-Chun; Hua, Tzu-En; Chen, Chun-Chao; Dickson, Barry J; Greenspan, Ralph J; Chiang, Ann-Shyn
2013-05-30
How the brain perceives sensory information and generates meaningful behavior depends critically on its underlying circuitry. The protocerebral bridge (PB) is a major part of the insect central complex (CX), a premotor center that may be analogous to the human basal ganglia. Here, by deconstructing hundreds of PB single neurons and reconstructing them into a common three-dimensional framework, we have constructed a comprehensive map of PB circuits with labeled polarity and predicted directions of information flow. Our analysis reveals a highly ordered information processing system that involves directed information flow among CX subunits through 194 distinct PB neuron types. Circuitry properties such as mirroring, convergence, divergence, tiling, reverberation, and parallel signal propagation were observed; their functional and evolutional significance is discussed. This layout of PB neuronal circuitry may provide guidelines for further investigations on transformation of sensory (e.g., visual) input into locomotor commands in fly brains. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Accounting Information Systems in Healthcare: A Review of the Literature.
Hammour, Hadal; Househ, Mowafa; Razzak, Hira Abdul
2017-01-01
As information technology progresses in Saudi Arabia, manual accounting systems have become gradually inadequate for decision needs. Subsequently, private and public healthcare divisions in Saudi Arabia perceive the computerized accounting information system (CAIS) as a vehicle to safeguard the efficient and effective flow of information during the analysis, processing, and recording of financial data. Efficient and effective flow of information improves the decision making of staff, thereby improving the capability of health care sectors to reduce the cost of medical services. In this paper, we define computerized accounting systems from the point of view of health informatics and discuss the challenges and benefits of supporting CAIS applications in hospitals of Saudi Arabia. With these elements, we conclude that CAIS in Saudi Arabia can serve as a valuable tool for evaluating and controlling the cost of medical services in the healthcare sector. Supplementary education for nurses, doctors, accountants, and other health care staff on the significance of having computerized accounting systems within hospitals is warranted in the future.
A general multiblock Euler code for propulsion integration. Volume 1: Theory document
NASA Technical Reports Server (NTRS)
Chen, H. C.; Su, T. Y.; Kao, T. J.
1991-01-01
A general multiblock Euler solver was developed for the analysis of flow fields over geometrically complex configurations either in free air or in a wind tunnel. In this approach, the external space around a complex configuration was divided into a number of topologically simple blocks, so that surface-fitted grids and an efficient flow solution algorithm could be easily applied in each block. The computational grid in each block is generated using a combination of algebraic and elliptic methods. A grid generation/flow solver interface program was developed to facilitate the establishment of block-to-block relations and the boundary conditions for each block. The flow solver utilizes a finite volume formulation and an explicit time stepping scheme to solve the Euler equations. A multiblock version of the multigrid method was developed to accelerate the convergence of the calculations. The generality of the method was demonstrated through the analysis of two complex configurations at various flow conditions. Results were compared to available test data. Two accompanying volumes, user manuals for the preparation of multi-block grids (vol. 2) and for the Euler flow solver (vol. 3), provide information on input data format and program execution.
Acoustics flow analysis in circular duct using sound intensity and dynamic mode decomposition
NASA Astrophysics Data System (ADS)
Weyna, S.
2014-08-01
Sound intensity generation in a hard-walled duct with acoustic flow (no mean flow) is treated experimentally and shown graphically. In the paper, numerous visualization methods illustrating the vortex flow (2D, 3D) graphically explain the diffraction and scattering phenomena occurring inside the duct and around the open end area. Sound intensity investigation in an annular duct gives a physical picture of sound waves in any duct mode. In the paper, modal energy analysis is discussed with particular reference to acoustic orthogonal decomposition (AOD). Images of the sound intensity fields below and above the "cut-off" frequency region are compared to identify the acoustic modes that might resonate in the duct. The experimental results also show the effects of axial and swirling flow. However, the acoustic field is extremely complicated, because pressures in non-propagating (cut-off) modes cooperate with the particle velocities in propagating modes, and vice versa. Measurement in a cylindrical duct also demonstrates the cut-off phenomenon and the effect of reflection from the open end. The aim of the experimental study was to obtain information on low Mach number flows in ducts in order to improve physical understanding and validate theoretical CFD and CAA models that may still be improved.
Information Transfer in the Brain: Insights from a Unified Approach
NASA Astrophysics Data System (ADS)
Marinazzo, Daniele; Wu, Guorong; Pellicoro, Mario; Stramaglia, Sebastiano
Measuring directed interactions in the brain in terms of information flow is a promising approach, mathematically treatable and amenable to encompass several methods. In this chapter we propose some approaches rooted in this framework for the analysis of neuroimaging data. First we explore how the transfer of information depends on the network structure, showing how for hierarchical networks the information flow pattern is characterized by an exponential distribution of the incoming information and a fat-tailed distribution of the outgoing information, as a signature of the law of diminishing marginal returns. This was reported to be true also for effective connectivity networks from human EEG data. Then we address the problem of partial conditioning to a limited subset of variables, chosen as the most informative ones for the driver node. We then propose a formal expansion of the transfer entropy to highlight irreducible sets of variables which provide information for the future state of each assigned target. Multiplets characterized by a large contribution to the expansion are associated with informational circuits present in the system, with an informational character (synergetic or redundant) which can be associated with the sign of the contribution. Applications are reported for EEG and fMRI data.
Solving the influence maximization problem reveals regulatory organization of the yeast cell cycle.
Gibbs, David L; Shmulevich, Ilya
2017-06-01
The Influence Maximization Problem (IMP) aims to discover the set of nodes with the greatest influence on network dynamics. The problem has previously been applied in epidemiology and social network analysis. Here, we demonstrate the application to cell cycle regulatory network analysis for Saccharomyces cerevisiae. Fundamentally, gene regulation is linked to the flow of information. Therefore, our implementation of the IMP was framed as an information theoretic problem using network diffusion. Utilizing more than 26,000 regulatory edges from YeastMine, gene expression dynamics were encoded as edge weights using time lagged transfer entropy, a method for quantifying information transfer between variables. By picking a set of source nodes, a diffusion process covers a portion of the network. The size of the network cover relates to the influence of the source nodes. The set of nodes that maximizes influence is the solution to the IMP. By solving the IMP over different numbers of source nodes, an influence ranking on genes was produced. The influence ranking was compared to other metrics of network centrality. Although the top genes from each centrality ranking contained well-known cell cycle regulators, there was little agreement and no clear winner. However, it was found that influential genes tend to directly regulate or sit upstream of genes ranked by other centrality measures. The influential nodes act as critical sources of information flow, potentially having a large impact on the state of the network. Biological events that affect influential nodes and thereby affect information flow could have a strong effect on network dynamics, potentially leading to disease. Code and data can be found at: https://github.com/gibbsdavidl/miergolf.
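The abstract above describes a greedy, diffusion-based treatment of the Influence Maximization Problem on a weighted regulatory network. As a rough illustration of that idea (and not the authors' implementation, which is available at the linked repository), the Python sketch below greedily selects source nodes that maximize the number of nodes reached by a simple weighted diffusion; the gene names, edge weights, and coverage threshold are invented for the example.

    import networkx as nx

    def diffusion_cover(graph, sources, threshold=0.05):
        # Nodes are "covered" once the product of edge weights along some path
        # from a source exceeds the threshold; weights stand in for the
        # transfer-entropy-derived edge strengths described in the abstract.
        covered = set(sources)
        best = {s: 1.0 for s in sources}
        frontier = [(s, 1.0) for s in sources]
        while frontier:
            node, strength = frontier.pop()
            for _, nbr, data in graph.out_edges(node, data=True):
                s = strength * data.get("weight", 0.0)
                if s > threshold and s > best.get(nbr, 0.0):
                    best[nbr] = s
                    covered.add(nbr)
                    frontier.append((nbr, s))
        return covered

    def greedy_influence_set(graph, k, threshold=0.05):
        # Greedily add the source that most increases the diffusion cover.
        chosen = []
        for _ in range(k):
            gains = {n: len(diffusion_cover(graph, chosen + [n], threshold))
                     for n in graph.nodes if n not in chosen}
            chosen.append(max(gains, key=gains.get))
        return chosen

    # Toy regulatory network; gene names and weights are invented.
    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("CLN3", "SWI4", 0.6), ("SWI4", "CLN1", 0.5), ("SWI4", "CLN2", 0.4),
        ("CLN2", "SIC1", 0.3), ("MBP1", "CLB5", 0.5), ("CLB5", "SIC1", 0.2),
    ])
    print(greedy_influence_set(G, k=2))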
Escribano, Luis; Garcia Montero, Andres C; Núñez, Rosa; Orfao, Alberto
2006-08-01
Human mast cells (MCs) are directly derived from human pluripotent CD34+ stem and progenitor hematopoietic cells with stem cell factor being a critical growth factor supporting human MC proliferation, differentiation, and survival. Because of the advantages that flow cytometry offers (it allows rapid, objective, and sensitive multiparameter analysis of high numbers of cells from a sample, with information being provided on the basis of a single cell), it has become the method of choice in the past decade for immunophenotypic identification, enumeration, and characterization of human MCs in bone marrow and other tissue specimens.
Analysis of information flows among individual companies in the KOSDAQ market
NASA Astrophysics Data System (ADS)
Kim, Ho-Yong; Oh, Gabjin
2016-08-01
In this paper, we employ the variance decomposition method to measure the strength and the direction of interconnections among companies in the KOSDAQ (Korean Securities Dealers Automated Quotation) stock market. We analyze the 200 companies listed on the KOSDAQ market from January 2001 to December 2015. We find that the systemic risk, measured by using the interconnections, increases substantially during periods of financial crisis such as the bankruptcy of Lehman Brothers and the European financial crisis. In particular, we find that the increases in the aggregated information flows can be used to predict the increment of the market volatility that may occur during a sub-prime financial crisis period.
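As a hedged sketch of the variance-decomposition approach described above (not the authors' code), the Python fragment below fits a vector autoregression to simulated return series with statsmodels and uses the forecast-error variance decomposition to quantify directional interconnections and an aggregate spillover index; the firm names, lag choice, and horizon are illustrative assumptions.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(0)
    n = 500
    # Simulated daily returns for three hypothetical firms; firmA leads firmB.
    a = rng.normal(0, 1, n)
    b = 0.4 * np.roll(a, 1) + rng.normal(0, 1, n)
    c = rng.normal(0, 1, n)
    returns = pd.DataFrame({"firmA": a, "firmB": b, "firmC": c}).iloc[1:]

    res = VAR(returns).fit(maxlags=5, ic="aic")
    fevd = res.fevd(10)
    # fevd.decomp[i, h, j]: share of variable i's h-step forecast-error variance
    # attributable to shocks in variable j.
    shares = fevd.decomp[:, -1, :]
    spillover = 100 * (shares.sum() - np.trace(shares)) / shares.shape[0]
    print("pairwise variance shares:\n", np.round(shares, 3))
    print("total spillover index (%):", round(spillover, 1))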
NASA Astrophysics Data System (ADS)
Moult, Eric M.; Ploner, Stefan A.; Choi, WooJhon; Lee, ByungKun; Husvogt, Lennart A.; Lu, Chen D.; Novais, Eduardo; Cole, Emily D.; Potsaid, Benjamin M.; Duker, Jay S.; Hornegger, Joachim; Meier, Andreas K.; Waheed, Nadia K.; Fujimoto, James G.
2017-02-01
OCT angiography (OCTA) has recently garnered immense interest in clinical ophthalmology, permitting ocular vasculature to be viewed in exquisite detail, in vivo, and without the injection of exogenous dyes. However, commercial OCTA systems provide little information about actual erythrocyte speeds; instead, OCTA is typically used to visualize the presence and/or absence of vasculature. This is an important limitation because in many ocular diseases, including diabetic retinopathy (DR) and age-related macular degeneration (AMD), alterations in blood flow, but not necessarily only the presence or absence of vasculature, are thought to be important in understanding pathogenesis. To address this limitation, we have developed an algorithm, variable interscan time analysis (VISTA), which is capable of resolving different erythrocyte speeds. VISTA works by acquiring >2 repeated B-scans, and then computing multiple OCTA signals corresponding to different effective interscan times. The OCTA signals corresponding to different effective interscan times contain independent information about erythrocyte speed. In this study we provide a theoretical overview of VISTA, and investigate the utility of VISTA in studying blood flow alterations in ocular disease. OCTA-VISTA images of eyes with choroidal neovascularization, geographic atrophy, and diabetic retinopathy are presented.
Current Status on Radiation Modeling for the Hayabusa Re-entry
NASA Technical Reports Server (NTRS)
Winter, Michael W.; McDaniel, Ryan D.; Chen, Yih-Kang; Liu, Yen; Saunders, David
2011-01-01
On June 13, 2010 the Japanese Hayabusa capsule performed its reentry into the Earth's atmosphere over Australia after a seven year journey to the asteroid Itokawa. The reentry was studied by numerous imaging and spectroscopic instruments onboard NASA's DC-8 Airborne Laboratory and from three sites on the ground, in order to measure surface and plasma radiation generated by the Hayabusa Sample Return Capsule (SRC). Post flight, the flow solutions were recomputed to include the whole flow field around the capsule at 11 points along the reentry trajectory using updated trajectory information. Again, material response was taken into account to obtain the most reliable surface temperature information. These data will be used to compute thermal radiation of the glowing heat shield and plasma radiation by the shock/post-shock layer system to support analysis of the experimental observation data. For this purpose, lines of sight data are being extracted from the flow field volume grids and plasma radiation will be computed using NEQAIR [4], which is a line-by-line spectroscopic code with one-dimensional transport of radiation intensity. The procedures being used were already successfully applied to the analysis of the observation of the Stardust reentry [5].
[The motive force of evolution based on the principle of organismal adjustment evolution.].
Cao, Jia-Shu
2010-08-01
From an analysis of the existing problems of the prevalent theories of evolution, this paper discusses the motive force of evolution based on the principle of organismal adjustment evolution, in order to reach a new understanding of the evolution mechanism. Guided by Schrödinger's theory that "life feeds on negative entropy", the author proposes that the "negative entropy flow" actually includes material flow, energy flow and information flow, and that the "negative entropy flow" is the motive force for living and development. By modifying the author's own principle of organismal adjustment evolution (not adaptation evolution), a new theory of a "regulation system of organismal adjustment evolution involving DNA, RNA and protein interacting with the environment" is proposed. According to the view that phylogenetic development is the "integral" of individual development, the difference of negative entropy flow between organisms and the environment is considered to be a motive force for evolution, which is a new understanding of the mechanism of evolution. Based on such understanding, evolution is regarded as "a changing process in which one subsystem passes all or part of its genetic information to the next generation in a larger system, and during the adaptation process produces some new elements, stops some old ones, and thereby lasts in the larger system". Some other controversial questions related to evolution are also discussed.
Inertial objects in complex flows
NASA Astrophysics Data System (ADS)
Syed, Rayhan; Ho, George; Cavas, Samuel; Bao, Jialun; Yecko, Philip
2017-11-01
Chaotic Advection and Finite Time Lyapunov Exponents both describe stirring and transport in complex and time-dependent flows, but FTLE analysis has been largely limited to either purely kinematic flow models or high Reynolds number flow field data. The neglect of dynamic effects in FTLE and Lagrangian Coherent Structure studies has stymied detailed information about the role of pressure, Coriolis effects and object inertia. We present results of laboratory and numerical experiments on time-dependent and multi-gyre Stokes flows. In the lab, a time-dependent effectively two-dimensional low Re flow is used to distinguish transport properties of passive tracer from those of small paramagnetic spheres. Companion results of FTLE calculations for inertial particles in a time-dependent multi-gyre flow are presented, illustrating the critical roles of density, Stokes number and Coriolis forces on their transport. Results of Direct Numerical Simulations of fully resolved inertial objects (spheroids) immersed in a three dimensional (ABC) flow show the role of shape and finite size in inertial transport at small finite Re. We acknowledge support of NSF DMS-1418956.
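For readers unfamiliar with FTLE calculations of the kind mentioned above, the following Python sketch computes the finite-time Lyapunov exponent field for the standard time-dependent double-gyre test flow by advecting a grid of tracers and differentiating the flow map. It is a generic kinematic illustration, not the laboratory or DNS analysis described in the abstract, and the parameters A, eps, and omega are the commonly used test values.

    import numpy as np

    A, eps, omega = 0.1, 0.25, 2 * np.pi / 10  # standard double-gyre parameters

    def velocity(t, x, y):
        s = eps * np.sin(omega * t)
        f = s * x**2 + (1 - 2 * s) * x
        dfdx = 2 * s * x + (1 - 2 * s)
        u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
        v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
        return u, v

    def advect(x, y, t0, T, nsteps=200):
        # RK4 advection of a grid of tracers from t0 to t0 + T.
        dt, t = T / nsteps, t0
        for _ in range(nsteps):
            k1u, k1v = velocity(t, x, y)
            k2u, k2v = velocity(t + dt / 2, x + dt / 2 * k1u, y + dt / 2 * k1v)
            k3u, k3v = velocity(t + dt / 2, x + dt / 2 * k2u, y + dt / 2 * k2v)
            k4u, k4v = velocity(t + dt, x + dt * k3u, y + dt * k3v)
            x = x + dt / 6 * (k1u + 2 * k2u + 2 * k3u + k4u)
            y = y + dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
            t += dt
        return x, y

    nx_, ny_ = 201, 101
    x0, y0 = np.meshgrid(np.linspace(0, 2, nx_), np.linspace(0, 1, ny_))
    T = 15.0
    xf, yf = advect(x0, y0, t0=0.0, T=T)

    # Flow-map gradient by central differences, then FTLE from the largest
    # eigenvalue of the Cauchy-Green tensor C = F^T F.
    dx, dy = x0[0, 1] - x0[0, 0], y0[1, 0] - y0[0, 0]
    dxf_dx, dxf_dy = np.gradient(xf, dx, dy, axis=(1, 0))
    dyf_dx, dyf_dy = np.gradient(yf, dx, dy, axis=(1, 0))
    ftle = np.zeros_like(xf)
    for i in range(ny_):
        for j in range(nx_):
            F = np.array([[dxf_dx[i, j], dxf_dy[i, j]],
                          [dyf_dx[i, j], dyf_dy[i, j]]])
            ftle[i, j] = np.log(np.sqrt(np.linalg.eigvalsh(F.T @ F).max())) / abs(T)
    print("max FTLE:", ftle.max())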
The Shock and Vibration Digest. Volume 15. Number 1
1983-01-01
The books are arranged to show the wealth of information that exists on acoustics. One concept familiar to the engineer is statistical energy analysis (SEA); it is also used for vibrating systems with nonlinear elements. However, for systems with continuous nonlinear elements, solving the resulting nonlinear algebraic equations can be difficult. Statistical energy analysis and power flow methods are used to analyze the random response of two identical subsystems coupled at an end.
Population Viability Analysis of Riverine Fishes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, P.; Chandler, J.; Jager, H.I.
Many utilities face conflicts between two goals: cost-efficient hydropower generation and protecting riverine fishes. Research to develop ecological simulation tools that can evaluate alternative mitigation strategies in terms of their benefits to fish populations is vital to informed decision-making. In this paper, we describe our approach to population viability analysis of riverine fishes in general and Snake River white sturgeon in particular. We are finding that the individual-based modeling approach used in previous in-stream flow applications is well suited to addressing questions about the viability of species of concern for several reasons. Chief among these are: (1) the ability to represent the effects of individual variation in life history characteristics on predicted population viability, (2) the flexibility needed to quantify the ecological benefits of alternative flow management options by representing spatial and temporal variation in flow and temperature, and (3) the flexibility needed to quantify the ecological benefits of non-flow related manipulations (i.e., passage, screening, and hatchery supplementation).
Wang, Chengdong; Zhang, Shenyan; Yan, Wanglin; Wang, Renqing; Liu, Jian; Wang, Yutao
2016-11-18
Renewable natural resources, such as solar radiation, rainfall, wind, and geothermal heat, together with ecosystem services, provide the elementary supports for the sustainable development of human society. To improve regional sustainability, we studied the spatial distributions and quantities of renewable natural resources and net primary productivity (NPP) in Hokkaido, which is the second largest island of Japan. With the help of Geographic Information System (GIS) software, distribution maps for each type of renewable natural resource were generated by kriging interpolation based on statistical records. A composite map of the flow of all types of renewable natural resources was also generated by map layer overlapping. Additionally, we utilized emergy analysis to convert each renewable flow with different attributes into a unified unit (i.e., solar equivalent joules [sej]). As a result, the spatial distributions of the flow of renewable natural resources of the Hokkaido region are presented in the form of thematic emergy maps. Thus, the areas with higher renewable emergy can be easily visualized and identified. The dominant renewable flow in certain areas can also be directly distinguished. The results can provide useful information for regional sustainable development, environmental conservation and ecological management.
An efficient and rapid transgenic pollen screening and detection method using flow cytometry.
Moon, Hong S; Eda, Shigetoshi; Saxton, Arnold M; Ow, David W; Stewart, C Neal
2011-01-01
Assaying for transgenic pollen, a major vector of transgene flow, provides valuable information and essential data for the study of gene flow and assessing the effectiveness of transgene containment. Most studies have employed microscopic screening methods or progeny analyses to estimate the frequency of transgenic pollen. However, these methods are time-consuming and laborious when large numbers of pollen grains must be analyzed to look for rare transgenic pollen grains. Thus, there is an urgent need for the development of a simple, rapid, and high throughput analysis method for transgenic pollen analysis. In this study, our objective was to determine the accuracy of using flow cytometry technology for transgenic pollen quantification in practical application where transgenic pollen is not frequent. A suspension of non-transgenic tobacco pollen was spiked with a known amount of verified transgenic tobacco pollen synthesizing low or high amounts of green fluorescent protein (GFP). The flow cytometric method detected approximately 75% and 100% of pollen grains synthesizing low and high amounts of GFP, respectively. The method is rapid, as it is able to count 5000 pollen grains per minute-long run. Our data indicate that this flow cytometric method is useful to study gene flow and assessment of transgene containment.
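A toy numerical illustration of the gating idea behind such a flow cytometric screen is sketched below (it is not the study's instrument software): synthetic fluorescence intensities for a rare spiked transgenic fraction are compared against a gate set from a non-transgenic control, and the recovery rate of spiked events is reported. All counts, distributions, and the gate percentile are assumptions.

    import numpy as np

    rng = np.random.default_rng(6)
    n_events = 50_000
    spiked_fraction = 0.001                      # assumed 0.1% transgenic spike

    # Log-normal autofluorescence background plus a brighter GFP-positive tail.
    background = rng.lognormal(mean=2.0, sigma=0.4, size=n_events)
    is_transgenic = rng.random(n_events) < spiked_fraction
    gfp_signal = np.where(is_transgenic,
                          rng.lognormal(mean=4.0, sigma=0.3, size=n_events),
                          background)

    # Gate set just above a non-transgenic control run (99.99th percentile).
    control = rng.lognormal(mean=2.0, sigma=0.4, size=n_events)
    gate = np.percentile(control, 99.99)

    detected = gfp_signal > gate
    recovery = detected[is_transgenic].mean()    # fraction of spiked events found
    false_rate = detected[~is_transgenic].mean()
    print(f"gate={gate:.1f}  recovery={recovery:.0%}  false positives={false_rate:.2%}")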
Utengen, Audun; Rouholiman, Dara; Gamble, Jamison G; Grajales, Francisco Jose; Pradhan, Nisha; Staley, Alicia C; Bernstein, Liza; Young, Sean D; Clauson, Kevin A; Chu, Larry F
2017-08-17
Health care conferences present a unique opportunity to network, spark innovation, and disseminate novel information to a large audience, but the dissemination of information typically stays within very specific networks. Social network analysis can be adopted to understand the flow of information between virtual social communities and the role of patients within the network. The purpose of this study is to examine the impact engaged patients bring to health care conference social media information flow and how they expand dissemination and distribution of tweets compared to other health care conference stakeholders such as physicians and researchers. From January 2014 through December 2016, 7,644,549 tweets were analyzed from 1672 health care conferences with at least 1000 tweets that had registered in Symplur's Health Care Hashtag Project from 2014 to 2016. The tweet content was analyzed to create a list of the top 100 influencers by mention from each conference, who were then subsequently categorized by stakeholder group. Multivariate linear regression models were created using stepwise function building to identify factors explaining variability as predictor variables for the model in which conference tweets were taken as the dependent variable. Inclusion of engaged patients in health care conference social media was low compared to that of physicians and has not significantly changed over the last 3 years. When engaged patient voices are included in health care conferences, they greatly increase information flow as measured by total tweet volume (beta=301.6) compared to physicians (beta=137.3, P<.001), expand propagation of information tweeted during a conference as measured by social media impressions created (beta=1,700,000) compared to physicians (beta=270,000, P<.001), and deepen engagement in the tweet conversation as measured by replies to their tweets (beta=24.4) compared to physicians (beta=5.5, P<.001). Social network analysis of hubs and authorities revealed that patients had statistically significantly higher hub scores (mean 8.26×10^-4, SD 2.96×10^-4) compared to other stakeholder groups' Twitter accounts (mean 7.19×10^-4, SD 3.81×10^-4; t(273.84)=4.302, P<.001). Although engaged patients are powerful accelerators of information flow, expanders of tweet propagation, and greatly deepen engagement in conversation of tweets on social media of health care conferences compared to physicians, they represent only 1.4% of the stakeholder mix of the top 100 influencers in the conversation. Health care conferences that fail to engage patients in their proceedings may risk limiting their engagement with the public, disseminating scientific information to a narrow community and slowing the flow of information across social media channels. ©Audun Utengen, Dara Rouholiman, Jamison G. Gamble, Francisco Jose Grajales III, Nisha Pradhan, Alicia C Staley, Liza Bernstein, Sean D Young, Kevin A Clauson, Larry F Chu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 17.08.2017.
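The hub and authority scores reported above come from a HITS-style analysis of the conference mention network. A minimal sketch of that kind of computation with networkx is shown below; the account names and edges are invented, and this is not the study's pipeline.

    import networkx as nx

    # Directed edges point from the account doing the mentioning/retweeting to
    # the account being mentioned; HITS then scores hubs (good pointers) and
    # authorities (widely referenced accounts). Handles are invented.
    edges = [
        ("patient_advocate", "oncologist_A"), ("patient_advocate", "researcher_B"),
        ("patient_advocate", "nurse_C"), ("oncologist_A", "researcher_B"),
        ("med_student", "oncologist_A"), ("med_student", "patient_advocate"),
        ("researcher_B", "patient_advocate"),
    ]
    G = nx.DiGraph(edges)

    hubs, authorities = nx.hits(G, max_iter=1000, normalized=True)
    for account in sorted(hubs, key=hubs.get, reverse=True):
        print(f"{account:18s} hub={hubs[account]:.3f} auth={authorities[account]:.3f}")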
Feature-Based Statistical Analysis of Combustion Simulation Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, J; Krishnamoorthy, V; Liu, S
2011-11-18
We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
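The merge-tree framework above is far richer than what can be shown here, but the following Python sketch conveys the basic feature-based statistics idea under simplifying assumptions: threshold a smoothed 2-D scalar field, label connected components as features, and attach per-feature statistical moments. The field, threshold, and sizes are placeholders, not the combustion data.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)
    # Smoothed random field standing in for a simulated "temperature" slice.
    temperature = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=8)

    threshold = temperature.mean() + temperature.std()
    mask = temperature > threshold
    labels, nfeatures = ndimage.label(mask)

    print(f"{nfeatures} features above threshold {threshold:.3f}")
    for fid in range(1, min(nfeatures, 10) + 1):
        values = temperature[labels == fid]
        print(f"feature {fid:3d}: size={values.size:5d} mean={values.mean():.3f} "
              f"max={values.max():.3f} std={values.std():.3f}")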
Information Flow in Teachers' Organizations in Israel During Confrontations with Employers: I
ERIC Educational Resources Information Center
Glasman, Naftaly S.
1975-01-01
First part of an article examining the content of information flow; the amount of information released; the mechanism of the flow; the factors affecting the content, amount, and mechanism; and the corollaries of information flow and the characteristics of the school system. Includes the questions put to the teachers. (Author/IRT)
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Povarov, V. P.; Shipkov, A. A.; Gromov, A. F.; Kiselev, A. N.; Shepelev, S. V.; Galanin, A. V.
2015-02-01
Specific features relating to development of the information-analytical system on the problem of flow-accelerated corrosion of pipeline elements in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh nuclear power plant are considered. The results from a statistical analysis of data on the quantity, location, and operating conditions of the elements and preinserted segments of pipelines used in the condensate-feedwater and wet steam paths are presented. The principles of preparing and using the information-analytical system for determining the lifetime to reaching inadmissible wall thinning in elements of pipelines used in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh NPP are considered.
NASA Astrophysics Data System (ADS)
Mishra, P. K.; Bernini Campos, H. E.
2016-12-01
The lower portion of the Salinas River in Monterey Bay, California, has a history of flooding. Many studies have examined water quality, since the river provides water for the surrounding crops, but a detailed study of the river's behavior and a flood analysis are still needed. The floods did significant damage, affecting valuable farmland, residences, and businesses in Monterey County. The first step of this study is to collect the river bathymetry and surrounding topography and then to analyze the discharge and how it changes with time. This thesis develops a model of the specific site, drawing real data from GIS and performing a flow simulation based on flow data provided by the USGS, to verify water surface elevation and floodplain extent. ArcMap, developed by ESRI, was used along with the HEC-GeoRAS extension because it was the most appropriate tool to work with the Digital Elevation Model, develop the floodplain, and characterize the land surface accurately at the study site. The HEC-RAS software, developed by the US Army Corps of Engineers, was used to compute one-dimensional steady flow and two-dimensional unsteady flow, providing flow velocity, water surface elevation and profiles, total surface area, head and friction losses, and other characteristics, allowing analysis of the flow. A mean discharge, a mean peak streamflow, and a peak discharge were used for the steady flow, and a hydrograph was used for the unsteady flow; both are based on the 1995 flood and the discharge history. This study provides important information about water surface elevation and water flow, allowing stakeholders and the government to analyze solutions to avoid damage to society and landowners.
Psychology of Intelligence Analysis
1999-01-01
Each neuron has octopus-like arms called axons and dendrites. Electrical impulses flow through these arms and are ferried by... information rings a bell, the bell cannot be unrung. The ambiguity of most real-world situations contributes to the operation of this perseverance...
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
Michael, Dada O; Bamidele, Awojoyogbe O; Adewale, Adesola O; Karem, Boubaker
2013-01-01
Nuclear magnetic resonance (NMR) allows for fast, accurate and noninvasive measurement of fluid flow in restricted and non-restricted media. The results of such measurements may be possible for a very small B0 field and can be enhanced through detailed examination of generating functions that may arise from polynomial solutions of NMR flow equations in terms of Legendre polynomials and Boubaker polynomials. The generating functions of these polynomials can present an array of interesting possibilities that may be useful for understanding the basic physics of extracting relevant NMR flow information from which various hemodynamic problems can be carefully studied. Specifically, these results may be used to develop effective drugs for cardiovascular-related diseases.
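As a small numerical aside (not from the paper), the Legendre generating function referred to above can be checked directly in Python; Boubaker polynomials are not available in NumPy/SciPy, so only the Legendre identity 1/sqrt(1 - 2xt + t^2) = sum_n P_n(x) t^n is verified here, for assumed values of x and t.

    import numpy as np
    from numpy.polynomial import legendre

    x, t = 0.3, 0.5                               # assumed sample values
    closed_form = 1.0 / np.sqrt(1.0 - 2.0 * x * t + t**2)
    series = sum(legendre.Legendre.basis(n)(x) * t**n for n in range(60))
    print(closed_form, series)   # the two values agree to many decimal places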
Information fusion in regularized inversion of tomographic pumping tests
Bohling, Geoffrey C.; ,
2008-01-01
In this chapter we investigate a simple approach to incorporating geophysical information into the analysis of tomographic pumping tests for characterization of the hydraulic conductivity (K) field in an aquifer. A number of authors have suggested a tomographic approach to the analysis of hydraulic tests in aquifers - essentially simultaneous analysis of multiple tests or stresses on the flow system - in order to improve the resolution of the estimated parameter fields. However, even with a large amount of hydraulic data in hand, the inverse problem is still plagued by non-uniqueness and ill-conditioning and the parameter space for the inversion needs to be constrained in some sensible fashion in order to obtain plausible estimates of aquifer properties. For seismic and radar tomography problems, the parameter space is often constrained through the application of regularization terms that impose penalties on deviations of the estimated parameters from a prior or background model, with the tradeoff between data fit and model norm explored through systematic analysis of results for different levels of weighting on the regularization terms. In this study we apply systematic regularized inversion to analysis of tomographic pumping tests in an alluvial aquifer, taking advantage of the steady-shape flow regime exhibited in these tests to expedite the inversion process. In addition, we explore the possibility of incorporating geophysical information into the inversion through a regularization term relating the estimated K distribution to ground penetrating radar velocity and attenuation distributions through a smoothing spline model. © 2008 Springer-Verlag Berlin Heidelberg.
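A generic sketch of the regularized-inversion machinery discussed above is given below; it is not the study's flow model or its radar-informed smoothing-spline term. It minimizes a data misfit plus a weighted roughness penalty relative to a prior model and scans the trade-off parameter, with the forward operator, prior, and noise level all synthetic assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    n_data, n_param = 40, 60
    G = rng.normal(size=(n_data, n_param))        # placeholder forward operator
    m_true = 0.1 * np.cumsum(rng.normal(size=n_param))
    d = G @ m_true + rng.normal(scale=0.5, size=n_data)

    # First-difference roughening matrix; penalizes departures of the estimate
    # from a smooth prior model m_prior.
    L = (np.eye(n_param) - np.eye(n_param, k=1))[:-1]
    m_prior = np.zeros(n_param)

    def tikhonov_solve(alpha):
        A = np.vstack([G, alpha * L])
        b = np.concatenate([d, alpha * (L @ m_prior)])
        m_est, *_ = np.linalg.lstsq(A, b, rcond=None)
        return m_est

    # Scan the trade-off between data misfit and model roughness.
    for alpha in [0.01, 0.1, 1.0, 10.0]:
        m_est = tikhonov_solve(alpha)
        misfit = np.linalg.norm(G @ m_est - d)
        rough = np.linalg.norm(L @ (m_est - m_prior))
        print(f"alpha={alpha:5.2f}  misfit={misfit:7.3f}  roughness={rough:7.3f}")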
FuGEFlow: data model and markup language for flow cytometry
Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R
2009-01-01
Background: Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. Methods: We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt compliant experiment description. Results: The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets. Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. Conclusion: We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experiences the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange including public flow cytometry repositories currently under development. PMID:19531228
Merrill, Jacqueline; Bakken, Suzanne; Rockoff, Maxine; Gebbie, Kristine; Carley, Kathleen
2007-01-01
In this case study we describe a method that has potential to provide systematic support for public health information management. Public health agencies depend on specialized information that travels throughout an organization via communication networks among employees. Interactions that occur within these networks are poorly understood and are generally unmanaged. We applied organizational network analysis, a method for studying communication networks, to assess the method’s utility to support decision making for public health managers, and to determine what links existed between information use and agency processes. Data on communication links among a health department’s staff was obtained via survey with a 93% response rate, and analyzed using Organizational Risk Analyzer (ORA) software. The findings described the structure of information flow in the department’s communication networks. The analysis succeeded in providing insights into organizational processes which informed public health managers’ strategies to address problems and to take advantage of network strengths. PMID:17098480
Dorfman, David M; LaPlante, Charlotte D; Pozdnyakova, Olga; Li, Betty
2015-11-01
In our high-sensitivity flow cytometric approach for systemic mastocytosis (SM), we identified mast cell event clustering as a new diagnostic criterion for the disease. To objectively characterize mast cell gated event distributions, we performed cluster analysis using FLOCK, a computational approach to identify cell subsets in multidimensional flow cytometry data in an unbiased, automated fashion. FLOCK identified discrete mast cell populations in most cases of SM (56/75 [75%]) but only a minority of non-SM cases (17/124 [14%]). FLOCK-identified mast cell populations accounted for 2.46% of total cells on average in SM cases and 0.09% of total cells on average in non-SM cases (P < .0001) and were predictive of SM, with a sensitivity of 75%, a specificity of 86%, a positive predictive value of 76%, and a negative predictive value of 85%. FLOCK analysis provides useful diagnostic information for evaluating patients with suspected SM, and may be useful for the analysis of other hematopoietic neoplasms. Copyright© by the American Society for Clinical Pathology.
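The diagnostic metrics quoted above follow from the case counts given in the abstract; the short calculation below reproduces them (the PPV comes out at about 77% rather than the reported 76%, presumably a rounding or counting difference). This is only an arithmetic check, not the authors' analysis code.

    tp, fn = 56, 75 - 56          # SM cases with / without a FLOCK mast cell population
    fp, tn = 17, 124 - 17         # non-SM cases flagged / correctly negative

    sensitivity = tp / (tp + fn)  # 56/75   = 0.747
    specificity = tn / (tn + fp)  # 107/124 = 0.863
    ppv = tp / (tp + fp)          # 56/73   = 0.767
    npv = tn / (tn + fn)          # 107/126 = 0.849

    print(f"sensitivity={sensitivity:.0%} specificity={specificity:.0%} "
          f"PPV={ppv:.0%} NPV={npv:.0%}")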
Paretti, Nicholas V.; Kennedy, Jeffrey R.; Cohn, Timothy A.
2014-01-01
Flooding is among the costliest natural disasters in terms of loss of life and property in Arizona, which is why the accurate estimation of flood frequency and magnitude is crucial for proper structural design and accurate floodplain mapping. Current guidelines for flood frequency analysis in the United States are described in Bulletin 17B (B17B), yet since B17B’s publication in 1982 (Interagency Advisory Committee on Water Data, 1982), several improvements have been proposed as updates for future guidelines. Two proposed updates are the Expected Moments Algorithm (EMA) to accommodate historical and censored data, and a generalized multiple Grubbs-Beck (MGB) low-outlier test. The current guidelines use a standard Grubbs-Beck (GB) method to identify low outliers, changing the determination of the moment estimators because B17B uses a conditional probability adjustment to handle low outliers while EMA censors the low outliers. B17B and EMA estimates are identical if no historical information or censored or low outliers are present in the peak-flow data. EMA with MGB (EMA-MGB) test was compared to the standard B17B (B17B-GB) method for flood frequency analysis at 328 streamgaging stations in Arizona. The methods were compared using the relative percent difference (RPD) between annual exceedance probabilities (AEPs), goodness-of-fit assessments, random resampling procedures, and Monte Carlo simulations. The AEPs were calculated and compared using both station skew and weighted skew. Streamgaging stations were classified by U.S. Geological Survey (USGS) National Water Information System (NWIS) qualification codes, used to denote historical and censored peak-flow data, to better understand the effect that nonstandard flood information has on the flood frequency analysis for each method. Streamgaging stations were also grouped according to geographic flood regions and analyzed separately to better understand regional differences caused by physiography and climate. The B17B-GB and EMA-MGB RPD-boxplot results showed that the median RPDs across all streamgaging stations for the 10-, 1-, and 0.2-percent AEPs, computed using station skew, were approximately zero. As the AEP flow estimates decreased (that is, from 10 to 0.2 percent AEP) the variability in the RPDs increased, indicating that the AEP flow estimate was greater for EMA-MGB when compared to B17B-GB. There was only one RPD greater than 100 percent for the 10- and 1-percent AEP estimates, whereas 19 RPDs exceeded 100 percent for the 0.2-percent AEP. At streamgaging stations with low-outlier data, historical peak-flow data, or both, RPDs ranged from −84 to 262 percent for the 0.2-percent AEP flow estimate. When streamgaging stations were separated by the presence of historical peak-flow data (that is, no low outliers or censored peaks) or by low outlier peak-flow data (no historical data), the results showed that RPD variability was greatest for the 0.2-AEP flow estimates, indicating that the treatment of historical and (or) low-outlier data was different between methods and that method differences were most influential when estimating the less probable AEP flows (1, 0.5, and 0.2 percent). When regional skew information was weighted with the station skew, B17B-GB estimates were generally higher than the EMA-MGB estimates for any given AEP. This was related to the different regional skews and mean square error used in the weighting procedure for each flood frequency analysis. 
The B17B-GB weighted skew analysis used a more positive regional skew determined in USGS Water Supply Paper 2433 (Thomas and others, 1997), while the EMA-MGB analysis used a more negative regional skew with a lower mean square error determined from a Bayesian generalized least squares analysis. Regional groupings of streamgaging stations reflected differences in physiographic and climatic characteristics. Potentially influential low flows (PILFs) were more prevalent in arid regions of the State, and generally AEP flows were larger with EMA-MGB than with B17B-GB for gaging stations with PILFs. In most cases EMA-MGB curves would fit the largest floods more accurately than B17B-GB. In areas of the State with more baseflow, such as along the Mogollon Rim and the White Mountains, streamgaging stations generally had fewer PILFs and more positive skews, causing estimated AEP flows to be larger with B17B-GB than with EMA-MGB. The effect of including regional skew was similar for all regions, and the observed pattern was increasingly greater B17B-GB flows (more negative RPDs) with each decreasing AEP quantile. A variation on a goodness-of-fit test statistic was used to describe each method’s ability to fit the largest floods. The mean absolute percent difference between the measured peak flows and the log-Pearson Type 3 (LP3)-estimated flows, for each method, was averaged over the 90th, 75th, and 50th percentiles of peak-flow data at each site. In most percentile subsets, EMA-MGB on average had smaller differences (1 to 3 percent) between the observed and fitted value, suggesting that the EMA-MGB-LP3 distribution is fitting the observed peak-flow data more precisely than B17B-GB. The smallest EMA-MGB percent differences occurred for the greatest 10 percent (90th percentile) of the peak-flow data. When stations were analyzed by USGS NWIS peak flow qualification code groups, the stations with historical peak flows and no low outliers had average percent differences as high as 11 percent greater for B17B-GB, indicating that EMA-MGB utilized the historical information to fit the largest observed floods more accurately. A resampling procedure was used in which 1,000 random subsamples were drawn, each comprising one-half of the observed data. An LP3 distribution was fit to each subsample using B17B-GB and EMA-MGB methods, and the predicted 1-percent AEP flows were compared to those generated from distributions fit to the entire dataset. With station skew, the two methods were similar in the median percent difference, but with weighted skew EMA-MGB estimates were generally better. At two gages where B17B-GB appeared to perform better, a large number of peak flows were deemed to be PILFs by the MGB test, although they did not appear to depart significantly from the trend of the data (step or dogleg appearance). At two gages where EMA-MGB performed better, the MGB identified several PILFs that were affecting the fitted distribution of the B17B-GB method. Monte Carlo simulations were run for the LP3 distribution using different skews and with different assumptions about the expected number of historical peaks. The primary benefit of running Monte Carlo simulations is that the underlying distribution statistics are known, meaning that the true 1-percent AEP is known. The results showed that EMA-MGB performed as well or better in situations where the LP3 distribution had a zero or positive skew and historical information. 
When the skew for the LP3 distribution was negative, EMA-MGB performed significantly better than B17B-GB and EMA-MGB estimates were less biased by more closely estimating the true 1-percent AEP for 1, 2, and 10 historical flood scenarios.
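To make the log-Pearson Type III quantile estimation concrete, the sketch below fits an LP3 distribution to synthetic annual peak flows by the method of moments and evaluates the 10-, 1-, and 0.2-percent AEP flows. It deliberately omits the EMA/MGB machinery, low-outlier tests, historical-peak handling, and regional-skew weighting that the study compares, and the peak-flow record is invented.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Synthetic annual peak flows (cfs); a real analysis would use a NWIS record.
    peaks = np.exp(rng.normal(loc=8.0, scale=0.7, size=60))

    logq = np.log10(peaks)
    mean, std = logq.mean(), logq.std(ddof=1)
    skew = stats.skew(logq, bias=False)          # station skew, no regional weighting

    aeps = np.array([0.10, 0.01, 0.002])         # 10-, 1-, and 0.2-percent AEPs
    quantiles = stats.pearson3.ppf(1 - aeps, skew, loc=mean, scale=std)
    flows = 10 ** quantiles
    for p, q in zip(aeps, flows):
        print(f"{p:.1%} AEP flow estimate: {q:,.0f} cfs")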
Doing more with less (data): complexities of resource flow analysis in the Gauteng City-Region
NASA Astrophysics Data System (ADS)
Culwick, Christina; Götz, Graeme; Butcher, Siân; Harber, Jesse; Maree, Gillian; Mushongera, Darlington
2017-12-01
Urban metabolism is a growing field of study into resource flows through cities, and how these could be managed more sustainably. There are two main schools of thought on urban metabolism—metabolic flow analysis (MFA) and urban political ecology (UPE). The two schools remain siloed despite common foundations. This paper reflects on recent research by the Gauteng City-Region Observatory (GCRO) into urban sustainability transitions in South Africa’s Gauteng City-Region, a large and sprawling urban formation that faces a host of sustainability challenges including water deficits, erratic electricity supply, stretched infrastructure networks and increasingly carbon-intensive settlement patterns. Three GCRO research projects are reviewed. Each project began with the assumption that data collection on the region’s metabolism could enable an MFA or MFA-like analysis to highlight where possible resource efficiency and sustainability gains might be achieved. However, in each case we confronted severe data-limitations, and ended up asking UPE-style questions on the reasons for and implications of the chronic paucity of urban metabolism data. We have been led to conclude that urban metabolism research will require much more than just assembling and modelling flows data, although these efforts should not be abandoned. A synthesis of MFA and UPE is needed, which simultaneously builds a deeper understanding of resource flows and the systems that govern these flows. We support the emerging approach in political-industrial ecology literature which values both material data on and socio-political insight into urban metabolism, and emphasises the importance of multi-disciplinary and multi-dimensional analysis to inform decision-making in urban sustainability transitions.
Guide to Flow Measurement for Electric Propulsion Systems
NASA Technical Reports Server (NTRS)
Frieman, Jason D.; Walker, Mitchell L. R.; Snyder, Steve
2013-01-01
In electric propulsion (EP) systems, accurate measurement of the propellant mass flow rate of gas or liquid to the thruster and external cathode is a key input in the calculation of thruster efficiency and specific impulse. Although such measurements are often achieved with commercial mass flow controllers and meters integrated into propellant feed systems, the variability in potential propellant options and flow requirements amongst the spectrum of EP power regimes and devices complicates meter selection, integration, and operation. At the direction of the Committee on Standards for Electric Propulsion Testing, a guide was jointly developed by members of the electric propulsion community to establish a unified document that contains the working principles, methods of implementation and analysis, and calibration techniques and recommendations on the use of mass flow meters in laboratory and spacecraft electric propulsion systems. The guide is applicable to EP devices of all types and power levels ranging from microthrusters to high-power ion engines and Hall effect thrusters. The establishment of a community standard on mass flow metering will help ensure the selection of the proper meter for each application. It will also improve the quality of system performance estimates by providing comprehensive information on the physical phenomena and systematic errors that must be accounted for during the analysis of flow measurement data. This paper will outline the standard methods and recommended practices described in the guide titled "Flow Measurement for Electric Propulsion Systems."
Advanced nozzle and engine components test facility
NASA Technical Reports Server (NTRS)
Beltran, Luis R.; Delroso, Richard L.; Delrosario, Ruben
1992-01-01
A test facility for conducting scaled advanced nozzle and engine component research is described. The CE-22 test facility, located in the Engine Research Building of the NASA Lewis Research Center, contains many systems for the economical testing of advanced scale-model nozzles and engine components. The combustion air and altitude exhaust systems are described. Combustion air can be supplied to a model up to 40 psig for primary air flow, and 40, 125, and 450 psig for secondary air flow. Altitude exhaust can be simulated up to 48,000 ft, or the exhaust can be atmospheric. Descriptions of the multiaxis thrust stand, a color schlieren flow visualization system used for qualitative flow analysis, a labyrinth flow measurement system, a data acquisition system, and auxiliary systems are discussed. Model recommended design information and temperature and pressure instrumentation recommendations are included.
Particle image and acoustic Doppler velocimetry analysis of a cross-flow turbine wake
NASA Astrophysics Data System (ADS)
Strom, Benjamin; Brunton, Steven; Polagye, Brian
2017-11-01
Cross-flow turbines have advantageous properties for converting kinetic energy in wind and water currents to rotational mechanical energy and subsequently electrical power. A thorough understanding of cross-flow turbine wakes aids understanding of rotor flow physics, assists geometric array design, and informs control strategies for individual turbines in arrays. In this work, the wake physics of a scale model cross-flow turbine are investigated experimentally. Three-component velocity measurements are taken downstream of a two-bladed turbine in a recirculating water channel. Time-resolved stereoscopic particle image and acoustic Doppler velocimetry are compared for planes normal to and distributed along the turbine rotational axis. Wake features are described using proper orthogonal decomposition, dynamic mode decomposition, and the finite-time Lyapunov exponent. Consequences for downstream turbine placement are discussed in conjunction with two-turbine array experiments.
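As a schematic of the modal decompositions named above, the following Python fragment performs a proper orthogonal decomposition of a snapshot matrix via the singular value decomposition and reports the modal energy content; the snapshot matrix here is random noise standing in for PIV velocity data, so the energy spectrum is flat by construction.

    import numpy as np

    rng = np.random.default_rng(4)
    n_points, n_snapshots = 5000, 200            # measurement points, time steps
    snapshots = rng.normal(size=(n_points, n_snapshots))

    # Subtract the temporal mean so modes describe fluctuations about the mean flow.
    fluctuations = snapshots - snapshots.mean(axis=1, keepdims=True)

    # Thin SVD: columns of U are spatial POD modes, S**2 is proportional to modal
    # energy, rows of Vt are the modes' temporal coefficients.
    U, S, Vt = np.linalg.svd(fluctuations, full_matrices=False)
    energy = S**2 / np.sum(S**2)
    print("energy in first 5 modes:", np.round(energy[:5], 3))
    print("cumulative energy of first 20 modes:", round(energy[:20].sum(), 3))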
Visual Modelling of Data Warehousing Flows with UML Profiles
NASA Astrophysics Data System (ADS)
Pardillo, Jesús; Golfarelli, Matteo; Rizzi, Stefano; Trujillo, Juan
Data warehousing involves complex processes that transform source data through several stages to deliver suitable information ready to be analysed. Though many techniques for visual modelling of data warehouses from the static point of view have been devised, only few attempts have been made to model the data flows involved in a data warehousing process. Besides, each attempt was mainly aimed at a specific application, such as ETL, OLAP, what-if analysis, data mining. Data flows are typically very complex in this domain; for this reason, we argue, designers would greatly benefit from a technique for uniformly modelling data warehousing flows for all applications. In this paper, we propose an integrated visual modelling technique for data cubes and data flows. This technique is based on UML profiling; its feasibility is evaluated by means of a prototype implementation.
Brokering the Research-Practice Gap: A typology.
Neal, Jennifer Watling; Neal, Zachary P; Kornbluh, Mariah; Mills, Kristen J; Lawlor, Jennifer A
2015-12-01
Despite widespread recognition of a research-practice gap in multiple service sectors, less is known about how pre-existing communication channels facilitate the flow of information between researchers and practitioners. In the current study, we applied an existing typology of brokerage developed by Gould and Fernandez (Sociol Methodol 19:89-126, 1989) to examine what types of brokerage facilitate information spread between researchers and educational practitioners. Specifically, we conducted semi-structured interviews with 19 school administrators and staff in two public school districts regarding their experiences searching for information about instructional, health, and social skills programs. Using deductive content analysis, we found evidence of all five types of brokerage identified by Gould and Fernandez (1989). However, only three types of brokerage-gatekeepers, representatives, and liaisons-were involved in the flow of information between school administrators and researchers. Moreover, information transfer often occurred in longer chains that involved multiple, distinct types of brokerage. We conclude with the broad implications of our findings for narrowing the research-practice gap by improving researchers' dissemination efforts and practitioners' search for information.
Investigation of culvert hydraulics related to juvenile fish passage. Final research report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, M.E.; Downs, R.C.
1996-01-01
Culverts often create barriers to the upstream migration of juvenile fish. The objective of this study was to determine hydraulic characteristics of culverts with different flow conditions. Methods of predicting flow profiles were developed by both Chiu and Mountjoy. The two equations were compared to experimental results. An area of flow corresponding to a predetermined allowable velocity can be calculated using the Mountjoy equation. This can then be used in the design of culverts as fish passage guidelines. The report contains a summary of background information, experimental methodology, the results of experimental tests, and an analysis of both the Chiu and Mountjoy equations.
NASA Technical Reports Server (NTRS)
Cook, W. J.
1973-01-01
A theoretical study of heat transfer for zero pressure gradient hypersonic laminar boundary layers for various gases, with particular application to the flows produced in an expansion tube facility, was conducted. A correlation based on results obtained from solutions to the governing equations for five gases was formulated. Particular attention was directed toward the laminar boundary layer on shock tube splitter plates in carbon dioxide flows generated by high speed shock waves. Computer analysis of the splitter plate boundary layer flow provided information that is useful in interpreting experimental data obtained in shock tube gas radiation studies.
Cerebral capillary velocimetry based on temporal OCT speckle contrast.
Choi, Woo June; Li, Yuandong; Qin, Wan; Wang, Ruikang K
2016-12-01
We propose a new optical coherence tomography (OCT) based method to measure red blood cell (RBC) velocities of single capillaries in the cortex of rodent brain. This OCT capillary velocimetry exploits quantitative laser speckle contrast analysis to estimate speckle decorrelation rate from the measured temporal OCT speckle signals, which is related to microcirculatory flow velocity. We hypothesize that OCT signal due to sub-surface capillary flow can be treated as the speckle signal in the single scattering regime and thus its time scale of speckle fluctuations can be subjected to single scattering laser speckle contrast analysis to derive characteristic decorrelation time. To validate this hypothesis, OCT measurements are conducted on a single capillary flow phantom operating at preset velocities, in which M-mode B-frames are acquired using a high-speed OCT system. Analysis is then performed on the time-varying OCT signals extracted at the capillary flow, exhibiting a typical inverse relationship between the estimated decorrelation time and absolute RBC velocity, which is then used to deduce the capillary velocities. We apply the method to in vivo measurements of mouse brain, demonstrating that the proposed approach provides additional useful information in the quantitative assessment of capillary hemodynamics, complementary to that of OCT angiography.
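To make the decorrelation idea concrete, below is a minimal, illustrative Python sketch on synthetic intensity traces; the frame interval, test signals, and 1/e criterion are assumptions, not the authors' processing pipeline. Faster flow decorrelates the temporal speckle signal faster, giving a shorter decorrelation time, as in the inverse relationship reported above.

```python
# Minimal illustrative sketch (synthetic data, assumed 10 kHz M-mode frame
# rate): estimate a speckle decorrelation time from a temporal intensity
# trace at one capillary pixel.
import numpy as np

def decorrelation_time(intensity, dt):
    """Lag at which the normalized autocovariance first drops below 1/e."""
    x = intensity - intensity.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0]
    below = np.flatnonzero(acf < 1.0 / np.e)
    return below[0] * dt if below.size else np.nan

rng = np.random.default_rng(0)
dt = 1e-4                                   # assumed frame interval (s)
t = np.arange(5000) * dt
slow = 1 + 0.5 * np.sin(2 * np.pi * 30 * t) + 0.1 * rng.standard_normal(t.size)
fast = 1 + 0.5 * np.sin(2 * np.pi * 300 * t) + 0.1 * rng.standard_normal(t.size)
print("slow-flow tau_c:", decorrelation_time(slow, dt))
print("fast-flow tau_c:", decorrelation_time(fast, dt))
```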
Vortex Analysis of Intra-Aneurismal Flow in Cerebral Aneurysms.
Sunderland, Kevin; Haferman, Christopher; Chintalapani, Gouthami; Jiang, Jingfeng
2016-01-01
This study aims to develop an alternative vortex analysis method by measuring the structure of intracranial aneurysm (IA) flow vortexes across the cardiac cycle, to quantify the temporal stability of aneurismal flow. Hemodynamics were modeled in "patient-specific" geometries using computational fluid dynamics (CFD) simulations. Modified versions of the known λ2 and Q-criterion methods identified vortex regions; these regions were then segmented out using the classical marching cube algorithm. Temporal stability was measured by the degree of vortex overlap (DVO) at each step of a cardiac cycle against a cycle-averaged vortex and by the change in the number of cores over the cycle. No statistical differences exist in DVO or number of vortex cores between 5 terminal IAs and 5 sidewall IAs. No strong correlation exists between vortex core characteristics and geometric or hemodynamic characteristics of IAs. Statistical independence suggests this proposed method may provide novel IA information. However, threshold values used to determine the vortex core regions and the resolution of the velocity data influenced analysis outcomes and have to be addressed in future studies. In conclusion, preliminary results show that the proposed methodology may give novel insight into aneurismal flow characteristics and, with further development, may help in future risk assessment.
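For readers unfamiliar with the vortex-identification step, the sketch below applies the standard Q-criterion to a synthetic 2-D velocity field; the study itself uses modified λ2 and Q-criterion methods on 3-D CFD data, so the grid, the model vortex, and the 2-D restriction here are illustrative assumptions only.

```python
# Illustrative Q-criterion on a 2-D velocity field sampled on a regular grid.
import numpy as np

def q_criterion_2d(u, v, dx, dy):
    """Q = 0.5 * (||Omega||^2 - ||S||^2); vortex cores where Q > 0."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    # Symmetric (strain-rate) and antisymmetric (rotation) parts of grad(u)
    s_norm2 = du_dx**2 + dv_dy**2 + 0.5 * (du_dy + dv_dx)**2
    omega_norm2 = 0.5 * (dv_dx - du_dy)**2
    return 0.5 * (omega_norm2 - s_norm2)

# Synthetic example: a Lamb-Oseen-like vortex should give Q > 0 near its core.
x, y = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))
r2 = x**2 + y**2 + 1e-9
u = -y / r2 * (1 - np.exp(-r2 / 0.1))
v = x / r2 * (1 - np.exp(-r2 / 0.1))
q = q_criterion_2d(u, v, dx=0.02, dy=0.02)
print("Q > 0 at the core:", q[50, 50] > 0)
```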
Methods and new approaches to the calculation of physiological parameters by videodensitometry
NASA Technical Reports Server (NTRS)
Kedem, D.; Londstrom, D. P.; Rhea, T. C., Jr.; Nelson, J. H.; Price, R. R.; Smith, C. W.; Graham, T. P., Jr.; Brill, A. B.; Kedem, D.
1976-01-01
A complex system featuring a video camera connected to a video disk, a cine (medical motion picture) camera, and a PDP-9 computer with various input/output facilities has been developed. This system enables quantitative analysis of various functions recorded in clinical studies. Several studies are described, such as heart chamber volume calculations, left ventricular ejection fraction, and blood flow through the lungs; the possibility of obtaining information about blood flow and constrictions in small-cross-section vessels is also discussed.
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
Social network analysis (SNA) is based on a conceptual network representation of social interactions and is an invaluable tool for conservation professionals to increase collaboration, improve information flow, and increase efficiency. We present two approaches to constructing in...
Diagnosing Communication Pathologies.
ERIC Educational Resources Information Center
Mueller, Carol J.
This paper addresses the concept of the communication audit, i.e., a fact-finding analysis, interpretation, and reporting process that studies the communication philosophy, structure, flow, and practice of the organization. Reasons for doing a communication audit are identified: (1) to uncover information blockages and organizational hindrances;…
STORAGE/SEDIMENTATION FACILITIES FOR CONTROL OF STORM AND COMBINED SEWER OVERFLOW: DESIGN MANUAL
This manual describes applications of storage facilities in wet-weather flow management and presents step-by-step procedures for analysis and design of storage-treatment facilities. Retention, detention, and sedimentation storage information is classified and described. Internati...
Examining Food Risk in the Large using a Complex, Networked System-of-sytems Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, John; Newkirk, Ryan; Mc Donald, Mark P
2010-12-03
The food production infrastructure is a highly complex system of systems. Characterizing the risks of intentional contamination in multi-ingredient manufactured foods is extremely challenging because the risks depend on the vulnerabilities of food processing facilities and on the intricacies of the supply-distribution networks that link them. A pure engineering approach to modeling the system is impractical because of the overall system complexity and paucity of data. A methodology is needed to assess food contamination risk 'in the large', based on current, high-level information about manufacturing facilities, commodities and markets, that will indicate which food categories are most at risk of intentional contamination and warrant deeper analysis. The approach begins by decomposing the system for producing a multi-ingredient food into instances of two subsystem archetypes: (1) the relevant manufacturing and processing facilities, and (2) the networked commodity flows that link them to each other and consumers. Ingredient manufacturing subsystems are modeled as generic system dynamics models with distributions of key parameters that span the configurations of real facilities. Networks representing the distribution systems are synthesized from general information about food commodities. This is done in a series of steps. First, probability networks representing the aggregated flows of food from manufacturers to wholesalers, retailers, other manufacturers, and direct consumers are inferred from high-level approximate information. This is followed by disaggregation of the general flows into flows connecting 'large' and 'small' categories of manufacturers, wholesalers, retailers, and consumers. Optimization methods are then used to determine the most likely network flows consistent with given data. Vulnerability can be assessed for a potential contamination point using a modified CARVER + Shock model. Once the facility and commodity flow models are instantiated, a risk consequence analysis can be performed by injecting contaminant at chosen points in the system and propagating the event through the overarching system to arrive at morbidity and mortality figures. A generic chocolate snack cake model, consisting of fluid milk, liquid eggs, and cocoa, is described as an intended proof of concept for multi-ingredient food systems. We aim for an eventual tool that can be used directly by policy makers and planners.
Design and implementation of spatial knowledge grid for integrated spatial analysis
NASA Astrophysics Data System (ADS)
Liu, Xiangnan; Guan, Li; Wang, Ping
2006-10-01
Supported by the spatial information grid (SIG), the spatial knowledge grid (SKG) for integrated spatial analysis utilizes middleware technology to construct the spatial information grid computation environment and spatial information service system, develops spatial-entity-oriented spatial data organization technology, and carries out in-depth computation of spatial structure and spatial process patterns on the basis of the Grid GIS infrastructure, the spatial data grid, and the spatial information grid (in its specialized definition). At the same time, it realizes complex spatial pattern expression and spatial function process simulation by taking the spatial intelligent agent as the core for establishing active spatial computation. Moreover, through the establishment of a virtual geographical environment with man-machine interactivity and blending, complex spatial modeling, networked cooperative work, and knowledge-driven spatial community decision making are achieved. The framework of SKG is discussed systematically in this paper. Its implementation flow and key technology are also presented, with examples of overlay analysis.
Doppler Fourier Domain Optical Coherence Tomography for Label-Free Tissue Angiography
NASA Astrophysics Data System (ADS)
Leitgeb, Rainer A.; Szkulmowski, Maciej; Blatter, Cedric; Wojtkowski, Maciej
Information about tissue perfusion and the vascular structure is certainly most important for assessment of tissue state or personal health and the diagnosis of any pathological conditions. It is therefore of key medical interest to have tools available for both quantitative blood flow assessment and qualitative vascular imaging. The strength of optical techniques is the unprecedented level of detail, even for small capillary structures or microaneurysms, and the possibility of combining different techniques for additional tissue spectroscopy giving insight into tissue metabolism. There is an immediate diagnostic and pharmacological demand for high-resolution, label-free tissue angiography and flow assessment that in addition allows for precise depth gating of flow information. The most promising candidate is Doppler optical coherence tomography (DOCT), being noncontact, label free, and free of hazardous radiation. DOCT provides fully quantitative volumetric information about blood flow together with the vascular and structural anatomy. Besides flow quantification, analysis of OCT signal fluctuations makes it possible to contrast moving scatterers in tissue, such as red blood cells, against static tissue. This allows for non-invasive optical angiography and yields high resolution even for the smallest capillaries. Because of the huge potential of DOCT and label-free optical angiography for diagnosis, recent years have seen a rapid increase of publications in this field with many different approaches. The present chapter gives an overview of existing Doppler OCT approaches and angiography techniques. It furthermore discusses limitations and noise issues, and gives examples of angiography in the eye and the skin.
Reentrant Information Flow in Electrophysiological Rat Default Mode Network.
Jing, Wei; Guo, Daqing; Zhang, Yunxiang; Guo, Fengru; Valdés-Sosa, Pedro A; Xia, Yang; Yao, Dezhong
2017-01-01
Functional MRI (fMRI) studies have demonstrated that the rodent brain shows a default mode network (DMN) activity similar to that in humans, offering a potential preclinical model both for physiological and pathophysiological studies. However, the neuronal mechanism underlying rodent DMN remains poorly understood. Here, we used electrophysiological data to analyze the power spectrum and estimate the directed phase transfer entropy (dPTE) within rat DMN across three vigilance states: wakeful rest (WR), slow-wave sleep (SWS), and rapid-eye-movement sleep (REMS). We observed decreased gamma powers during SWS compared with WR in most of the DMN regions. Increased gamma powers were found in prelimbic cortex, cingulate cortex, and hippocampus during REMS compared with WR, whereas retrosplenial cortex showed a reverse trend. These changed gamma powers are in line with the local metabolic variation of homologous brain regions in humans. In the analysis of directional interactions, we observed well-organized anterior-to-posterior patterns of information flow in the delta band, while opposite patterns of posterior-to-anterior flow were found in the theta band. These frequency-specific opposite patterns were only observed in WR and REMS. Additionally, most of the information senders in the delta band were also the receivers in the theta band, and vice versa. Our results provide electrophysiological evidence that rat DMN is similar to its human counterpart, and there is a frequency-dependent reentry loop of anterior-posterior information flow within rat DMN, which may offer a mechanism for functional integration, supporting conscious awareness.
Informed Decision Making Process for Managing Environmental Flows in Small River Basins
NASA Astrophysics Data System (ADS)
Padikkal, S.; Rema, K. P.
2013-03-01
Numerous examples exist worldwide of partial or complete alteration to the natural flow regime of river systems as a consequence of large-scale water abstraction from upstream reaches. The effects may not be conspicuous in the case of very large rivers, but the ecosystems of smaller rivers or streams may be completely destroyed over a period of time. While restoration of the natural flow regime may not be possible, at present there is increased effort to implement restoration by regulating environmental flow. This study investigates the development of an environmental flow management model at an icon site in the small river basin of Bharathapuzha, west India. To determine optimal environmental flow regimes, a historic flow model based on data assimilated since 1978 indicated that a satisfactory minimum flow depth for river ecosystem sustenance is 0.907 m (28.8 m3/s), a value also obtained from the hydraulic model; however, as three of the reservoirs were already operational at this time, a flow depth of 0.922 m is considered a more viable estimate. Analysis of daily stream flow in 1997-2006 indicated adequate flow regimes during the monsoons in June-November, but sections of the river dried out in December-May with alarming water quality conditions near the river mouth. Furthermore, the preferred minimum `dream' flow regime expressed by stakeholders of the region is a water depth of 1.548 m, which exceeds 50 % of the flood discharge in July. Water could potentially be conserved for environmental flow purposes by (1) the de-siltation of existing reservoirs or (2) reducing water spillage in the transfer between river basins. Ultimately, environmental flow management of the region requires the establishment of a co-ordinated management body and the regular assimilation of water flow information from which science-based decisions are made, to ensure both economic and environmental concerns are adequately addressed.
Flood risk changes in Northeastern part of Iberian Peninsula: from impact data to flow data
NASA Astrophysics Data System (ADS)
Llasat, Maria-Carmen; Gilabert, Joan; Llasat-Botija, Montserrat; Marcos, Raül; Quintana-Seguí, Pere; Turco, Marco
2014-05-01
The analysis of the temporal evolution of historical floods is usually based on proxy data obtained by collecting flooding information from continuous records in municipal, ecclesiastic, and private documentary sources. This kind of documentary series usually provides details of the damage caused by the flooding, with the exact date and duration, and on some occasions details of the behaviour of the rising water (duration, magnitude, indirect measurements), further details about the precipitation episode that gave rise to it, and the characteristics and dimensions of the riverbeds and the infrastructure associated with the watercourse (dams, bridges, mills, dykes). Based on this information, the first step is to estimate the flood impacts and, usually, in order to build flood data series, the event is classified following some criteria (i.e. catastrophic, extraordinary, ordinary). Exceptionally, some events are reconstructed and the maximum flow or level of the inundation is estimated. However, there are not many studies that compare flow series with flood series obtained from proxy data. The interest of doing so lies not only in checking the quality of the information and comparing the trends of both kinds of series, but also in identifying the role of other variables and their potential change in the evolution of flood risk. Besides this, a potential relationship between the flood classification criteria and the flood frequency distribution obtained from flow data could be established. This contribution draws on the INUNGAMA database, which contains 372 flood events recorded in the northeastern Iberian Peninsula from 1900 to 2010 (Barnolas and Llasat, 2007; Llasat et al, 2013); the PRESSGAMA database, which includes more than 15,000 news items related to natural hazards and climate change published between 1981 and 2010, with detailed information for each flood event (Llasat et al, 2009); and the historical flood database with data since the 14th century for the rivers Ter, Llobregat and Segre (Llasat et al, 2005). Daily flow data for the rivers Muga (1971-2013), Ter (1912-2013) and Llobregat (1912-2013) have also been obtained from the Catalan Water Agency. Daily precipitation and temperature data have been provided by Spain-02 (Herrera et al 2012) for the period 1950-2008. First of all, the quality of all the series has been checked and a consistency analysis between them has been carried out. The correlation between rainfall and flow series has been studied for some specific catchments. Then, trend analysis of the different series has been performed by applying the Mann-Kendall method and a resampling method (Turco and Llasat, 2011), in order to identify decadal changes. Finally, a flood event has been selected as a case study to illustrate the different factors that can be involved. This contribution has been supported by the DRIHM project.
Bohling, Geoffrey C.; Butler, J.J.
2001-01-01
We have developed a program for inverse analysis of two-dimensional linear or radial groundwater flow problems. The program, 1r2dinv, uses standard finite difference techniques to solve the groundwater flow equation for a horizontal or vertical plane with heterogeneous properties. In radial mode, the program simulates flow to a well in a vertical plane, transforming the radial flow equation into an equivalent problem in Cartesian coordinates. The physical parameters in the model are horizontal or x-direction hydraulic conductivity, anisotropy ratio (vertical to horizontal conductivity in a vertical model, y-direction to x-direction in a horizontal model), and specific storage. The program allows the user to specify arbitrary and independent zonations of these three parameters and also to specify which zonal parameter values are known and which are unknown. The Levenberg-Marquardt algorithm is used to estimate parameters from observed head values. Particularly powerful features of the program are the ability to perform simultaneous analysis of heads from different tests and the inclusion of the wellbore in the radial mode. These capabilities allow the program to be used for analysis of suites of well tests, such as multilevel slug tests or pumping tests in a tomographic format. The combination of information from tests stressing different vertical levels in an aquifer provides the means for accurately estimating vertical variations in conductivity, a factor profoundly influencing contaminant transport in the subsurface. © 2001 Elsevier Science Ltd. All rights reserved.
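As a rough illustration of the Levenberg-Marquardt estimation step, the sketch below fits two parameters of a toy steady-state Thiem drawdown model to noisy synthetic heads using scipy's least_squares in LM mode; the forward model, parameter names, and data are assumptions and do not reproduce the 1r2dinv finite-difference solver.

```python
# Toy Levenberg-Marquardt parameter estimation from observed drawdowns.
import numpy as np
from scipy.optimize import least_squares

def simulated_drawdown(params, r):
    """Thiem-type steady radial drawdown: s = Q / (2*pi*T) * ln(R / r)."""
    T, R = params                       # transmissivity, radius of influence
    Q = 50.0                            # pumping rate (assumed known)
    return Q / (2 * np.pi * T) * np.log(R / r)

r_obs = np.array([1.0, 3.0, 10.0, 30.0, 100.0])      # observation radii (m)
true_params = (25.0, 300.0)
s_obs = simulated_drawdown(true_params, r_obs) \
        + 0.01 * np.random.default_rng(1).standard_normal(r_obs.size)

result = least_squares(lambda p: simulated_drawdown(p, r_obs) - s_obs,
                       x0=[10.0, 500.0], method="lm")
print("estimated T, R:", result.x)
```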
Microfluidic Imaging Flow Cytometry by Asymmetric-detection Time-stretch Optical Microscopy (ATOM).
Tang, Anson H L; Lai, Queenie T K; Chung, Bob M F; Lee, Kelvin C M; Mok, Aaron T Y; Yip, G K; Shum, Anderson H C; Wong, Kenneth K Y; Tsia, Kevin K
2017-06-28
Scaling the number of measurable parameters, which allows for multidimensional data analysis and thus higher-confidence statistical results, has been the main trend in the advanced development of flow cytometry. Notably, adding high-resolution imaging capabilities allows for the complex morphological analysis of cellular/sub-cellular structures. This is not possible with standard flow cytometers. However, it is valuable for advancing our knowledge of cellular functions and can benefit life science research, clinical diagnostics, and environmental monitoring. Incorporating imaging capabilities into flow cytometry compromises the assay throughput, primarily due to the limitations on speed and sensitivity in the camera technologies. To overcome this speed or throughput challenge facing imaging flow cytometry while preserving the image quality, asymmetric-detection time-stretch optical microscopy (ATOM) has been demonstrated to enable high-contrast, single-cell imaging with sub-cellular resolution, at an imaging throughput as high as 100,000 cells/s. Based on the imaging concept of conventional time-stretch imaging, which relies on all-optical image encoding and retrieval through the use of ultrafast broadband laser pulses, ATOM further advances imaging performance by enhancing the image contrast of unlabeled/unstained cells. This is achieved by accessing the phase-gradient information of the cells, which is spectrally encoded into single-shot broadband pulses. Hence, ATOM is particularly advantageous in high-throughput measurements of single-cell morphology and texture - information indicative of cell types, states, and even functions. Ultimately, this could become a powerful imaging flow cytometry platform for the biophysical phenotyping of cells, complementing the current state-of-the-art biochemical-marker-based cellular assay. This work describes a protocol to establish the key modules of an ATOM system (from optical frontend to data processing and visualization backend), as well as the workflow of imaging flow cytometry based on ATOM, using human cells and micro-algae as the examples.
Waveform shape analysis: extraction of physiologically relevant information from Doppler recordings.
Ramsay, M M; Broughton Pipkin, F; Rubin, P C; Skidmore, R
1994-05-01
1. Doppler recordings were made from the brachial artery of healthy female subjects during a series of manoeuvres which altered the pressure-flow characteristics of the vessel. 2. Changes were induced in the peripheral circulation of the forearm by the application of heat or ice-packs. A sphygmomanometer cuff was used to create graded occlusion of the vessel above and below the point of measurement. Recordings were also made whilst the subjects performed a standardized Valsalva manoeuvre. 3. The Doppler recordings were analysed both with the standard waveform indices (systolic/diastolic ratio, pulsatility index and resistance index) and by the method of Laplace transform analysis. 4. The waveform parameters obtained by Laplace transform analysis distinguished the different changes in flow conditions; they thus had direct physiological relevance, unlike the standard waveform indices.
Šimůnek, Jirka; Nimmo, John R.
2005-01-01
A modified version of the Hydrus software package that can directly or inversely simulate water flow in a transient centrifugal field is presented. The inverse solver for parameter estimation of the soil hydraulic parameters is then applied to multirotation transient flow experiments in a centrifuge. Using time‐variable water contents measured at a sequence of several rotation speeds, soil hydraulic properties were successfully estimated by numerical inversion of transient experiments. The inverse method was then evaluated by comparing estimated soil hydraulic properties with those determined independently using an equilibrium analysis. The optimized soil hydraulic properties compared well with those determined using equilibrium analysis and steady state experiment. Multirotation experiments in a centrifuge not only offer significant time savings by accelerating time but also provide significantly more information for the parameter estimation procedure compared to multistep outflow experiments in a gravitational field.
[Doppler echocardiography of tricuspid insufficiency. Methods of quantification].
Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P
1994-01-01
Evaluation of tricuspid incompetence has benefitted considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method is able to provide an accurate evaluation, mainly through use of the Doppler mode. In addition to new criteria under evaluation (mainly the convergence zone of the regurgitant jet), some indices are recognised as good quantitative parameters: extension of the regurgitant jet into the right atrium, anterograde tricuspid flow, the laminar nature of the regurgitant flow, and analysis of the flow in the supra-hepatic veins. The evaluation remains only semi-quantitative, since the calculation of the regurgitation fraction from pulsed Doppler does not seem to be reliable; this accurate semi-quantitative evaluation is made possible by careful and consistent use of all the criteria available. The authors set out to discuss the value of the various evaluation criteria mentioned in the literature and try to define a practical approach.
Nonlinear analysis of an improved continuum model considering headway change with memory
NASA Astrophysics Data System (ADS)
Cheng, Rongjun; Wang, Jufeng; Ge, Hongxia; Li, Zhipeng
2018-01-01
Considering the effect of headway changes with memory, an improved continuum model of traffic flow is proposed in this paper. By means of linear stability theory, the linear stability of the new model under the effect of headway changes with memory is obtained. Through nonlinear analysis, the KdV-Burgers equation is derived to describe the propagating behavior of the traffic density wave near the neutral stability line. Numerical simulation is carried out to study the improved traffic flow model, exploring how headway changes with memory affect each car's velocity, density, and energy consumption. Numerical results show that when the effects of headway changes with memory are considered, traffic jams can be suppressed efficiently. Furthermore, the results demonstrate that the effect of headway changes with memory can avoid the disadvantage of historical information, which improves the stability of traffic flow and minimizes car energy consumption.
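For reference, the KdV-Burgers equation mentioned above combines a Burgers-type nonlinear and diffusive part with a KdV dispersive term. Its generic form for the density perturbation is shown below; the coefficients g1, g2, g3 depend on the specific model parameters and are not given in the abstract, so they are left symbolic.

```latex
% Generic KdV-Burgers equation for the traffic density perturbation near the
% neutral stability line; g_1, g_2, g_3 are model-dependent placeholders.
\[
  \frac{\partial \rho}{\partial t}
  + g_1\,\rho\,\frac{\partial \rho}{\partial x}
  = g_2\,\frac{\partial^{2} \rho}{\partial x^{2}}
  + g_3\,\frac{\partial^{3} \rho}{\partial x^{3}}
\]
```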
Heo, Young Jin; Lee, Donghyeon; Kang, Junsu; Lee, Keondo; Chung, Wan Kyun
2017-09-14
Imaging flow cytometry (IFC) is an emerging technology that acquires single-cell images at high throughput for analysis of a cell population. The rich information that comes from the high sensitivity and spatial resolution of a single-cell microscopic image is beneficial for single-cell analysis in various biological applications. In this paper, we present a fast image-processing pipeline (R-MOD: Real-time Moving Object Detector) based on deep learning for high-throughput microscopy-based label-free IFC in a microfluidic chip. The R-MOD pipeline acquires all single-cell images of cells in flow and identifies the acquired images in real time with minimal hardware consisting of a microscope and a high-speed camera. Experiments show that R-MOD is fast and reliably accurate (500 fps and 93.3% mAP), and it is expected to be used as a powerful tool for biomedical and clinical applications.
Hodge Decomposition of Information Flow on Small-World Networks.
Haruna, Taichi; Fujiki, Yuuya
2016-01-01
We investigate the influence of the small-world topology on the composition of information flow on networks. By appealing to the combinatorial Hodge theory, we decompose information flow generated by random threshold networks on the Watts-Strogatz model into three components: gradient, harmonic and curl flows. The harmonic and curl flows represent globally circular and locally circular components, respectively. The Watts-Strogatz model bridges the two extreme network topologies, a lattice network and a random network, by a single parameter that is the probability of random rewiring. The small-world topology is realized within a certain range between them. By numerical simulation we found that as networks become more random the ratio of harmonic flow to the total magnitude of information flow increases whereas the ratio of curl flow decreases. Furthermore, both quantities are significantly enhanced from the level when only network structure is considered for the network close to a random network and a lattice network, respectively. Finally, the sum of these two ratios takes its maximum value within the small-world region. These findings suggest that the dynamical information counterpart of global integration and that of local segregation are the harmonic flow and the curl flow, respectively, and that a part of the small-world region is dominated by internal circulation of information flow.
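The decomposition itself can be illustrated on a toy graph. The sketch below performs a combinatorial Hodge decomposition of an arbitrary edge flow on a four-node graph with one triangle, splitting it into gradient, curl, and harmonic parts via least-squares projections; the graph, the flow values, and the setup are illustrative assumptions, not the random-threshold-network simulations of the paper.

```python
# Combinatorial Hodge decomposition of an edge flow on a small graph.
import numpy as np

# Edges (tail, head) of a 4-node graph containing one triangle (0, 1, 2).
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
triangles = [(0, 1, 2)]
n_nodes, n_edges = 4, len(edges)

# Node-to-edge gradient operator B: (B s)_e = s[head] - s[tail].
B = np.zeros((n_edges, n_nodes))
for e, (i, j) in enumerate(edges):
    B[e, i], B[e, j] = -1.0, 1.0

# Triangle-to-edge curl operator C, signs following the triangle orientation.
C = np.zeros((len(triangles), n_edges))
for t, (i, j, k) in enumerate(triangles):
    for a, b in [(i, j), (j, k), (k, i)]:
        if (a, b) in edges:
            C[t, edges.index((a, b))] = 1.0
        else:
            C[t, edges.index((b, a))] = -1.0

f = np.array([2.0, 1.0, 4.0, 3.0])                   # arbitrary edge flow

s, *_ = np.linalg.lstsq(B, f, rcond=None)             # node potential
grad_part = B @ s                                      # globally acyclic part
phi, *_ = np.linalg.lstsq(C.T, f - grad_part, rcond=None)
curl_part = C.T @ phi                                  # locally circular part
harmonic_part = f - grad_part - curl_part              # globally circular part
print(grad_part, curl_part, harmonic_part)
```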
ERIC Educational Resources Information Center
Kramer, John Francis
A simulation of the Cincinnati mass media system predicts the frequency and reach of the flow of messages from known facts taken from census statistics, newspaper and radio audience studies, and a content analysis of the press, relevant to attitudes and opinions measured by an NORC survey of the effects of a public information campaign on the United Nations made…
NASA Astrophysics Data System (ADS)
Sadegh, M.; Vrugt, J. A.
2011-12-01
In the past few years, several contributions have begun to appear in the hydrologic literature that introduce and analyze the benefits of using a signature-based approach to watershed analysis. This signature-based approach abandons the standard single-criterion model-data fitting paradigm in favor of a diagnostic approach that better extracts the available information from the available data. Despite the prospects of this new viewpoint, rather ad-hoc criteria have hitherto been proposed to improve watershed modeling. Here, we aim to provide a proper mathematical foundation to signature-based analysis. We analyze the information content of different data transformations by analyzing their convergence speed with Markov Chain Monte Carlo (MCMC) simulation using the generalized likelihood function of Schoups and Vrugt (2010). We compare the information content of the original discharge data against a simple square root and Box-Cox transformation of the streamflow data. We benchmark these results against wavelet and flow duration curve transformations that temporally disaggregate the discharge data. Our results conclusively demonstrate that wavelet transformations and flow duration curves significantly reduce the information content of the streamflow data and consequently unnecessarily increase the uncertainty of the HYMOD model parameters. Hydrologic signatures thus need to be found in the original data, without temporal disaggregation.
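The data transformations being compared are straightforward to reproduce. The sketch below applies the square-root and Box-Cox transforms and builds a flow duration curve for a synthetic discharge series; the data are made up, and the HYMOD model and the MCMC convergence analysis themselves are not reproduced.

```python
# Streamflow transformations: square root, Box-Cox, and a flow duration curve.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
q = rng.lognormal(mean=1.0, sigma=0.8, size=365)      # synthetic discharge (m^3/s)

q_sqrt = np.sqrt(q)                                    # square-root transform
q_bc, lam = stats.boxcox(q)                            # Box-Cox with fitted lambda
print(f"fitted Box-Cox lambda: {lam:.3f}")

# Flow duration curve: exceedance probability vs sorted discharge.
q_sorted = np.sort(q)[::-1]
exceedance = np.arange(1, q.size + 1) / (q.size + 1)
print(list(zip(exceedance[:3].round(3), q_sorted[:3].round(2))))
```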
Learning to classify wakes from local sensory information
NASA Astrophysics Data System (ADS)
Alsalman, Mohamad; Colvert, Brendan; Kanso, Eva; Kanso Team
2017-11-01
Aquatic organisms exhibit remarkable abilities to sense local flow signals contained in their fluid environment and to surmise the origins of these flows. For example, fish can discern the information contained in various flow structures and utilize this information for obstacle avoidance and prey tracking. Flow structures created by flapping and swimming bodies are well characterized in the fluid dynamics literature; however, such characterization relies on classical methods that use an external observer to reconstruct global flow fields. The reconstructed flows, or wakes, are then classified according to the unsteady vortex patterns. Here, we propose a new approach for wake identification: we classify the wakes resulting from a flapping airfoil by applying machine learning algorithms to local flow information. In particular, we simulate the wakes of an oscillating airfoil in an incoming flow, extract the downstream vorticity information, and train a classifier to learn the different flow structures and classify new ones. This data-driven approach provides a promising framework for underwater navigation and detection in application to autonomous bio-inspired vehicles.
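A hypothetical sketch of the classification idea follows: a standard classifier is trained on local vorticity samples labeled by the wake regime that produced them. The regime labels, the synthetic "probe" signals, and the choice of an SVM are assumptions for illustration; the authors' simulations and sensing model are not reproduced.

```python
# Train a classifier on synthetic local vorticity traces labeled by wake regime.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def synthetic_vorticity_probe(regime, n=200, length=64):
    """Fake downstream vorticity time series; each regime gets its own frequency."""
    freq = {"2S": 1.0, "2P": 2.0, "P+S": 3.0}[regime]
    t = np.linspace(0, 2 * np.pi, length)
    return np.sin(freq * t) + 0.3 * rng.standard_normal((n, length))

regimes = ["2S", "2P", "P+S"]
X = np.vstack([synthetic_vorticity_probe(r) for r in regimes])
y = np.repeat(regimes, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```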
Optical coherence tomography angiography-based capillary velocimetry
NASA Astrophysics Data System (ADS)
Wang, Ruikang K.; Zhang, Qinqin; Li, Yuandong; Song, Shaozhen
2017-06-01
Challenge persists in the field of optical coherence tomography (OCT) when it is required to quantify capillary blood flow within tissue beds in vivo. We propose a useful approach to statistically estimate the mean capillary flow velocity using a model-based statistical method of eigendecomposition (ED) analysis of the complex OCT signals obtained with the OCT angiography (OCTA) scanning protocol. ED-based analysis is achieved by the covariance matrix of the ensemble complex OCT signals, upon which the eigenvalues and eigenvectors that represent the subsets of the signal makeup are calculated. From this analysis, the signals due to moving particles can be isolated by employing an adaptive regression filter to remove the eigencomponents that represent static tissue signals. The mean frequency (MF) of moving particles can be estimated by the first lag-one autocorrelation of the corresponding eigenvectors. Three important parameters are introduced, including the blood flow signal power representing the presence of blood flow (i.e., OCTA signals), the MF indicating the mean velocity of blood flow, and the frequency bandwidth describing the temporal flow heterogeneity within a scanned tissue volume. The proposed approach is tested using scattering phantoms, in which microfluidic channels are used to simulate the functional capillary vessels that are perfused with the scattering intralipid solution. The results indicate a linear relationship between the MF and mean flow velocity. In vivo animal experiments are also conducted by imaging mouse brain with distal middle cerebral artery ligation to test the capability of the method to image the changes in capillary flows in response to an ischemic insult, demonstrating the practical usefulness of the proposed method for providing important quantifiable information about capillary tissue beds in the investigations of neurological conditions in vivo.
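A schematic of the eigendecomposition idea on synthetic complex ensembles is sketched below: form the covariance matrix across repeated acquisitions, discard the dominant (static-tissue) eigencomponent as a regression-style clutter filter, and estimate a mean frequency from the lag-one autocorrelation of the retained signal. The signal model and all parameters are invented for illustration and differ in detail from the authors' implementation.

```python
# Eigendecomposition-style clutter filtering and mean-frequency estimation.
import numpy as np

rng = np.random.default_rng(7)
n_repeats, n_pixels = 64, 400
t = np.arange(n_repeats)

static = np.exp(1j * 0.01 * rng.standard_normal((n_pixels, 1)))   # static tissue
doppler = 0.3 * np.exp(1j * (2 * np.pi * 0.15 * t
                             + rng.uniform(0, 2 * np.pi, (n_pixels, 1))))
noise = 0.05 * (rng.standard_normal((n_pixels, n_repeats))
                + 1j * rng.standard_normal((n_pixels, n_repeats)))
x = static + doppler + noise                      # ensemble: pixels x repeats

cov = (x.conj().T @ x) / n_pixels                 # covariance across repeats
w, v = np.linalg.eigh(cov)                        # eigenvalues in ascending order

keep = v[:, :-1]                                  # drop the dominant (static) mode
x_flow = x @ keep @ keep.conj().T                 # project out static tissue

# Mean frequency from the lag-one autocorrelation of the filtered ensemble.
r1 = np.mean(np.sum(x_flow[:, 1:] * x_flow[:, :-1].conj(), axis=1))
print("estimated mean frequency (cycles/repeat):", np.angle(r1) / (2 * np.pi))
```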
ANALYSIS AND REDUCTION OF LANDSAT DATA FOR USE IN A HIGH PLAINS GROUND-WATER FLOW MODEL.
Thelin, Gail; Gaydas, Leonard; Donovan, Walter; Mladinich, Carol
1984-01-01
Data obtained from 59 Landsat scenes were used to estimate the areal extent of irrigated agriculture over the High Plains region of the United States for a ground-water flow model. This model provides information on current trends in the amount and distribution of water used for irrigation. The analysis and reduction process required that each Landsat scene be ratioed, interpreted, and aggregated. Data reduction by aggregation was an efficient technique for handling the volume of data analyzed. This process bypassed problems inherent in geometrically correcting and mosaicking the data at pixel resolution and combined the individual Landsat classification into one comprehensive data set.
Moisture convergence from a combined mesoscale moisture analysis and wind field for 24 April 1975
NASA Technical Reports Server (NTRS)
Negri, A. J.; Hillger, D. W.; Vonder Haar, T. H.
1977-01-01
Precipitable water values inferred from the Vertical Temperature Profile Radiometer data of the polar orbiting NOAA-4 satellite are used in conjunction with wind-field analyses obtained from Synchronous Meteorological Satellite visible-channel data to study the moisture convergence in the boundary layer immediately preceding a storm. This combination of data simulates the information that will be available from the Visible and Infrared Spin-Scan Radiometer on board the GOES-D satellite, which is scheduled to begin operation in the 1980s. Serviceable representations of boundary layer flow are developed through analysis of the satellite infrared cumulus velocities, although the flow representations are not exactly located in the vertical.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices' current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered primarily through Badge Office Subject Matter Experts (SMEs) and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to the factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
Geochemistry and the understanding of ground-water systems
Glynn, Pierre D.; Plummer, Niel
2005-01-01
Geochemistry has contributed significantly to the understanding of ground-water systems over the last 50 years. Historic advances include development of the hydrochemical facies concept, application of equilibrium theory, investigation of redox processes, and radiocarbon dating. Other hydrochemical concepts, tools, and techniques have helped elucidate mechanisms of flow and transport in ground-water systems, and have helped unlock an archive of paleoenvironmental information. Hydrochemical and isotopic information can be used to interpret the origin and mode of ground-water recharge, refine estimates of time scales of recharge and ground-water flow, decipher reactive processes, provide paleohydrological information, and calibrate ground-water flow models. Progress needs to be made in obtaining representative samples. Improvements are needed in the interpretation of the information obtained, and in the construction and interpretation of numerical models utilizing hydrochemical data. The best approach will ensure an optimized iterative process between field data collection and analysis, interpretation, and the application of forward, inverse, and statistical modeling tools. Advances are anticipated from microbiological investigations, the characterization of natural organics, isotopic fingerprinting, applications of dissolved gas measurements, and the fields of reaction kinetics and coupled processes. A thermodynamic perspective is offered that could facilitate the comparison and understanding of the multiple physical, chemical, and biological processes affecting ground-water systems.
DOT National Transportation Integrated Search
1971-07-01
The problem of displaying visibility information to both controller and pilot is discussed in the context of visibility information flow in the airport-aircraft system. The optimum amount of visibility information, as well as its rate of flow...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-08
... monitor trends on an annual basis. To continue our time-series analysis, we request data as of June 30... information and time- series data we should collect for the analysis of various MVPD performance metrics. In... revenues, cash flows, and margins. To the extent possible, we seek five-year time-series data to allow us...
NASA Astrophysics Data System (ADS)
Yang, Liang-Yi; Sun, Di-Hua; Zhao, Min; Cheng, Sen-Lin; Zhang, Geng; Liu, Hui
2018-03-01
In this paper, a new micro-cooperative driving car-following model is proposed to investigate the effect of continuous historical velocity difference information on traffic stability. The linear stability criterion of the new model is derived with linear stability theory and the results show that the unstable region in the headway-sensitivity space will be shrunk by taking the continuous historical velocity difference information into account. Through nonlinear analysis, the mKdV equation is derived to describe the traffic evolution behavior of the new model near the critical point. Via numerical simulations, the theoretical analysis results are verified and the results indicate that the continuous historical velocity difference information can enhance the stability of traffic flow in the micro-cooperative driving process.
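The abstract does not give the model equations, but as a purely illustrative reference, a full-velocity-difference-type car-following rule extended with a memory of past velocity differences can be written in the generic form below; the sensitivity a, weight κ, memory kernel w, and horizon τ are placeholders, not the authors' specification.

```latex
% Illustrative car-following rule with a memory term over past velocity
% differences; all symbols are generic placeholders.
\[
  \frac{dv_n(t)}{dt}
  = a\!\left[V\!\big(\Delta x_n(t)\big) - v_n(t)\right]
  + \kappa \int_{0}^{\tau} w(s)\,\Delta v_n(t-s)\,ds
\]
```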
Anwar, A R; Muthalib, M; Perrey, S; Galka, A; Granert, O; Wolff, S; Heute, U; Deuschl, G; Raethjen, J; Muthuraman, Muthuraman
2016-09-01
Recently, interest has been growing to understand the underlying dynamic directional relationship between simultaneously activated regions of the brain during motor task performance. Such directionality analysis (or effective connectivity analysis), based on non-invasive electrophysiological (electroencephalography-EEG) and hemodynamic (functional near infrared spectroscopy-fNIRS; and functional magnetic resonance imaging-fMRI) neuroimaging modalities can provide an estimate of the motor task-related information flow from one brain region to another. Since EEG, fNIRS and fMRI modalities achieve different spatial and temporal resolutions of motor-task related activation in the brain, the aim of this study was to determine the effective connectivity of cortico-cortical sensorimotor networks during finger movement tasks measured by each neuroimaging modality. Nine healthy subjects performed right hand finger movement tasks of different complexity (simple finger tapping-FT, simple finger sequence-SFS, and complex finger sequence-CFS). We focused our observations on three cortical regions of interest (ROIs), namely the contralateral sensorimotor cortex (SMC), the contralateral premotor cortex (PMC) and the contralateral dorsolateral prefrontal cortex (DLPFC). We estimated the effective connectivity between these ROIs using conditional Granger causality (GC) analysis determined from the time series signals measured by fMRI (blood oxygenation level-dependent-BOLD), fNIRS (oxygenated-O2Hb and deoxygenated-HHb hemoglobin), and EEG (scalp and source level analysis) neuroimaging modalities. The effective connectivity analysis showed significant bi-directional information flow between the SMC, PMC, and DLPFC as determined by the EEG (scalp and source), fMRI (BOLD) and fNIRS (O2Hb and HHb) modalities for all three motor tasks. However the source level EEG GC values were significantly greater than the other modalities. In addition, only the source level EEG showed a significantly greater forward than backward information flow between the ROIs. This simultaneous fMRI, fNIRS and EEG study has shown through independent GC analysis of the respective time series that a bi-directional effective connectivity occurs within a cortico-cortical sensorimotor network (SMC, PMC and DLPFC) during finger movement tasks.
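As a simplified stand-in for the conditional Granger causality analysis described above, the sketch below runs pairwise Granger causality tests on two synthetic channels with statsmodels; conditioning on a third region, and the EEG/fNIRS/fMRI preprocessing, are not reproduced, and the coupling coefficients are invented.

```python
# Pairwise Granger causality on two synthetic time series (x drives y).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(11)
n = 500
x = np.zeros(n)
y = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 2] + rng.standard_normal()

# Column order is (effect, cause): does x Granger-cause y?
res = grangercausalitytests(np.column_stack([y, x]), maxlag=3, verbose=False)
for lag, out in res.items():
    print(lag, "F-test p-value:", round(out[0]["ssr_ftest"][1], 4))
```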
Comparison and Validation of Hydrological E-Flow Methods through Hydrodynamic Modelling
NASA Astrophysics Data System (ADS)
Kuriqi, Alban; Rivaes, Rui; Sordo-Ward, Alvaro; Pinheiro, António N.; Garrote, Luis
2017-04-01
Flow regime determines physical habitat conditions and local biotic configuration. The development of environmental flow guidelines to support river integrity is becoming a major concern in water resources management. In this study, we analysed two sites located in the southern part of Portugal, at the Odelouca and Ocreza Rivers respectively, characterised by a Mediterranean climate. Both rivers are almost in pristine condition, not regulated by dams or other diversion structures. This study presents an analysis of the effect of implementing different hydrological e-flow methods on fish habitat suitability. To conduct this study we employed hydrological e-flow methods recommended by the European Small Hydropower Association (ESHA). River hydrology assessment was based on approximately 30 years of mean daily flow data provided by the Portuguese Water Information System (SNIRH). The biological data, bathymetry, physical and hydraulic features, and the Habitat Suitability Index for fish species were collected from extensive field work. We followed the Instream Flow Incremental Methodology (IFIM) to assess the flow-habitat relationship, taking into account the habitat suitability of different instream flow releases. Initially, we analysed fish habitat suitability under natural conditions and used it as the reference condition for the other scenarios considering the chosen hydrological e-flow methods. We accomplished the habitat modelling through hydrodynamic analysis using the River-2D model. The same methodology was applied to each scenario by considering as input the e-flows obtained from each of the hydrological methods employed in this study. This contribution shows the significance of ecohydrological studies in establishing a foundation for water resources management actions. Keywords: ecohydrology, e-flow, Mediterranean rivers, river conservation, fish habitat, River-2D, Hydropower.
NASA Astrophysics Data System (ADS)
Seong, Myeongsu; Phillips, Zephaniah; Mai, Phuong M.; Yeo, Chaebeom; Song, Cheol; Lee, Kijoon; Kim, Jae G.
2015-07-01
Appropriate oxygen supply and blood flow are important for coordinating body functions and sustaining life. To measure both oxygen supply and blood flow simultaneously, we developed a system that combines near-infrared spectroscopy (NIRS) and diffuse speckle contrast analysis (DSCA). Our system is more cost-effective and compact than combined systems such as diffuse correlation spectroscopy (DCS)-NIRS or the DCS flow oximeter, and it offers the same quantitative information. In this article, we present the configuration of DSCA-NIRS and preliminary data from an arm cuff occlusion and a repeated gripping exercise. With further investigation, we believe that DSCA-NIRS can be a useful tool for the fields of neuroscience, muscle physiology, and metabolic diseases such as diabetes.
Peak-flow characteristics of Virginia streams
Austin, Samuel H.; Krstolic, Jennifer L.; Wiegand, Ute
2011-01-01
Peak-flow annual exceedance probabilities, also called probability-percent chance flow estimates, and regional regression equations are provided describing the peak-flow characteristics of Virginia streams. Statistical methods are used to evaluate peak-flow data. Analysis of Virginia peak-flow data collected from 1895 through 2007 is summarized. Methods are provided for estimating unregulated peak flow of gaged and ungaged streams. Station peak-flow characteristics identified by fitting the logarithms of annual peak flows to a Log Pearson Type III frequency distribution yield annual exceedance probabilities of 0.5, 0.4292, 0.2, 0.1, 0.04, 0.02, 0.01, 0.005, and 0.002 for 476 streamgaging stations. Stream basin characteristics computed using spatial data and a geographic information system are used as explanatory variables in regional regression model equations for six physiographic regions to estimate regional annual exceedance probabilities at gaged and ungaged sites. Weighted peak-flow values that combine annual exceedance probabilities computed from gaging station data and from regional regression equations provide improved peak-flow estimates. Text, figures, and lists are provided summarizing selected peak-flow sites, delineated physiographic regions, peak-flow estimates, basin characteristics, regional regression model equations, error estimates, definitions, data sources, and candidate regression model equations. This study supersedes previous studies of peak flows in Virginia.
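A rough illustration of the station-level fitting step is given below: a Log-Pearson Type III distribution is fitted to the base-10 logarithms of synthetic annual peaks and evaluated at selected annual exceedance probabilities. The data are synthetic, and Bulletin 17-style refinements used in practice (low-outlier screening, weighting with regional skew) are omitted.

```python
# Fit Log-Pearson Type III to annual peaks and compute exceedance quantiles.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2011)
annual_peaks = rng.lognormal(mean=9.0, sigma=0.6, size=60)     # synthetic peaks (cfs)

log_q = np.log10(annual_peaks)
skew, loc, scale = stats.pearson3.fit(log_q)

aeps = np.array([0.5, 0.2, 0.1, 0.04, 0.02, 0.01, 0.002])      # exceedance probabilities
quantiles = 10 ** stats.pearson3.ppf(1 - aeps, skew, loc=loc, scale=scale)
for p, qv in zip(aeps, quantiles):
    print(f"AEP {p:>5}: {qv:,.0f} cfs")
```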
NASA Astrophysics Data System (ADS)
Wickersham, Andrew Joseph
There are two critical research needs for the study of hydrocarbon combustion in high speed flows: 1) combustion diagnostics with adequate temporal and spatial resolution, and 2) mathematical techniques that can extract key information from large datasets. The goal of this work is to address these needs, respectively, by the use of high speed and multi-perspective chemiluminescence and advanced mathematical algorithms. To obtain the measurements, this work explored the application of high speed chemiluminescence diagnostics and the use of fiber-based endoscopes (FBEs) for non-intrusive and multi-perspective chemiluminescence imaging up to 20 kHz. Non-intrusive and full-field imaging measurements provide a wealth of information for model validation and design optimization of propulsion systems. However, it is challenging to obtain such measurements due to various implementation difficulties such as optical access, thermal management, and equipment cost. This work therefore explores the application of FBEs for non-intrusive imaging of supersonic propulsion systems. The FBEs used in this work are demonstrated to overcome many of the aforementioned difficulties and provided datasets from multiple angular positions at up to 20 kHz in a supersonic combustor. The combustor operated on ethylene fuel at Mach 2 with an inlet stagnation temperature and pressure of approximately 640 degrees Fahrenheit and 70 psia, respectively. The imaging measurements were obtained from eight perspectives simultaneously, providing full-field datasets under such flow conditions for the first time and allowing the possibility of inferring multi-dimensional measurements. Due to its high speed and multi-perspective nature, such new diagnostic capability generates a large volume of data and calls for analysis algorithms that can process the data and extract key physics effectively. To extract the key combustion dynamics from the measurements, three mathematical methods were investigated in this work: Fourier analysis, proper orthogonal decomposition (POD), and wavelet analysis (WA). These algorithms were first demonstrated and tested on imaging measurements obtained from one perspective in a subsonic combustor (up to Mach 0.2). The results show that these algorithms are effective in extracting the key physics from large datasets, including the characteristic frequencies of flow-flame interactions, especially during transient processes such as lean blow-off and ignition. After these relatively simple tests and demonstrations, the algorithms were applied to process the measurements obtained from multiple perspectives in the supersonic combustor. Compared to past analyses (which have been limited to data obtained from one perspective only), the availability of data at multiple perspectives provides further insights into the flame and flow structures in high speed flows. In summary, this work shows that high speed chemiluminescence is a simple yet powerful combustion diagnostic. Especially when combined with FBEs and the analysis algorithms described in this work, such diagnostics provide full-field imaging at high repetition rates in challenging flows. Based on such measurements, a wealth of information can be obtained from proper analysis algorithms, including characteristic frequencies, dominant flame modes, and even multi-dimensional flame and flow structures.
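Of the three analysis methods named, proper orthogonal decomposition is the most compact to sketch: the snapshot matrix of image frames is mean-subtracted and factored with an SVD, giving spatial modes, temporal coefficients, and a modal energy ranking. The synthetic "chemiluminescence" frames below are assumptions for illustration only, not the measured data.

```python
# Minimal POD via the SVD of a snapshot matrix of synthetic image frames.
import numpy as np

rng = np.random.default_rng(5)
n_frames, ny, nx = 200, 32, 32
x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
t = np.arange(n_frames)

# Two coherent structures oscillating at different frequencies, plus noise.
frames = (np.sin(2 * np.pi * 0.05 * t)[:, None, None] * np.sin(np.pi * x)
          + 0.5 * np.sin(2 * np.pi * 0.12 * t)[:, None, None] * np.sin(2 * np.pi * y)
          + 0.05 * rng.standard_normal((n_frames, ny, nx)))

snapshots = frames.reshape(n_frames, -1)          # rows: time, columns: pixels
snapshots -= snapshots.mean(axis=0)               # subtract the temporal mean
u, s, vt = np.linalg.svd(snapshots, full_matrices=False)

energy = s**2 / np.sum(s**2)
print("energy captured by first two POD modes:", energy[:2].sum())
modes = vt[:2].reshape(2, ny, nx)                 # spatial POD modes
coeffs = u[:, :2] * s[:2]                         # temporal coefficients
```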
Dorfman, David M; LaPlante, Charlotte D; Li, Betty
2016-09-01
We analyzed plasma cell populations in bone marrow samples from 353 patients with possible bone marrow involvement by a plasma cell neoplasm, using FLOCK (FLOw Clustering without K), an unbiased, automated, computational approach to identify cell subsets in multidimensional flow cytometry data. FLOCK identified discrete plasma cell populations in the majority of bone marrow specimens found by standard histologic and immunophenotypic criteria to be involved by a plasma cell neoplasm (202/208 cases; 97%), including 34 cases that were negative by standard flow cytometric analysis that included clonality assessment. FLOCK identified discrete plasma cell populations in only a minority of cases negative for involvement by a plasma cell neoplasm by standard histologic and immunophenotypic criteria (38/145 cases; 26%). Interestingly, 55% of the cases negative by standard analysis, but containing a FLOCK-identified discrete plasma cell population, were positive for monoclonal gammopathy by serum protein electrophoresis and immunofixation. FLOCK-identified and quantitated plasma cell populations accounted for 3.05% of total cells on average in cases positive for involvement by a plasma cell neoplasm by standard histologic and immunophenotypic criteria, and 0.27% of total cells on average in cases negative for involvement by a plasma cell neoplasm by standard histologic and immunophenotypic criteria (p<0.0001; area under the curve by ROC analysis=0.96). The presence of a FLOCK-identified discrete plasma cell population was predictive of the presence of plasma cell neoplasia with a sensitivity of 97%, compared with only 81% for standard flow cytometric analysis, and had specificity of 74%, PPV of 84% and NPV of 95%. FLOCK analysis, which has been shown to provide useful diagnostic information for evaluating patients with suspected systemic mastocytosis, is able to identify neoplastic plasma cell populations analyzed by flow cytometry, and may be helpful in the diagnostic evaluation of bone marrow samples for involvement by plasma cell neoplasia. Copyright © 2016 Elsevier Ltd. All rights reserved.
Progress Toward Efficient Laminar Flow Analysis and Design
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Campbell, Matthew L.; Streit, Thomas
2011-01-01
A multi-fidelity system of computer codes for the analysis and design of vehicles having extensive areas of laminar flow is under development at the NASA Langley Research Center. The overall approach consists of the loose coupling of a flow solver, a transition prediction method and a design module using shell scripts, along with interface modules to prepare the input for each method. This approach allows the user to select the flow solver and transition prediction module, as well as run mode for each code, based on the fidelity most compatible with the problem and available resources. The design module can be any method that designs to a specified target pressure distribution. In addition to the interface modules, two new components have been developed: 1) an efficient, empirical transition prediction module (MATTC) that provides n-factor growth distributions without requiring boundary layer information; and 2) an automated target pressure generation code (ATPG) that develops a target pressure distribution that meets a variety of flow and geometry constraints. The ATPG code also includes empirical estimates of several drag components to allow the optimization of the target pressure distribution. The current system has been developed for the design of subsonic and transonic airfoils and wings, but may be extendable to other speed ranges and components. Several analysis and design examples are included to demonstrate the current capabilities of the system.
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Ho, Y. H.; Prezekwas, A. J.
2005-01-01
Higher power, high efficiency gas turbine engines require optimization of the seals and secondary flow systems as well as their impact on the powerstream. This work focuses on two aspects: 1. To apply present-day CFD tools (SCISEAL) to different real-life secondary flow applications from different original equipment manufacturers (OEMs) to provide feedback data and 2. Develop a computational methodology for coupled time-accurate simulation of the powerstream and secondary flow with emphasis on the interaction between the disk-cavity and rim seal flows with the powerstream (SCISEAL-MS-TURBO). One OEM simulation, of the Allison Engine Company T-56 turbine drum cavities including conjugate heat transfer, showed good agreement with data and provided design feedback information. Another was the GE aspirating seal, where the 3-D CFD simulations played a major role in the analysis and modification of that seal configuration. The second major objective, development of a coupled flow simulation capability, was achieved by using two codes, MS-TURBO for the powerstream and SCISEAL for the secondary flows, with an interface coupling algorithm. The coupled code was tested against data from three different configurations: 1. a bladeless rotor-stator-cavity turbine test rig, 2. the UTRC high pressure turbine test rig, and 3. the NASA Low-Speed Air Compressor rig (LSAC), with results and limitations discussed herein.
Reporting of participant flow diagrams in published reports of randomized trials.
Hopewell, Sally; Hirst, Allison; Collins, Gary S; Mallett, Sue; Yu, Ly-Mee; Altman, Douglas G
2011-12-05
Reporting of the flow of participants through each stage of a randomized trial is essential to assess the generalisability and validity of its results. We assessed the type and completeness of information reported in CONSORT (Consolidated Standards of Reporting Trials) flow diagrams published in current reports of randomized trials. A cross sectional review of all primary reports of randomized trials which included a CONSORT flow diagram indexed in PubMed core clinical journals (2009). We assessed the proportion of parallel group trial publications reporting specific items recommended by CONSORT for inclusion in a flow diagram. Of 469 primary reports of randomized trials, 263 (56%) included a CONSORT flow diagram of which 89% (237/263) were published in a CONSORT endorsing journal. Reports published in CONSORT endorsing journals were more likely to include a flow diagram (62%; 237/380 versus 29%; 26/89). Ninety percent (236/263) of reports which included a flow diagram had a parallel group design, of which 49% (116/236) evaluated drug interventions, 58% (137/236) were multicentre, and 79% (187/236) compared two study groups, with a median sample size of 213 participants. Eighty-one percent (191/236) reported the overall number of participants assessed for eligibility, 71% (168/236) the number excluded prior to randomization and 98% (231/236) the overall number randomized. Reasons for exclusion prior to randomization were more poorly reported. Ninety-four percent (223/236) reported the number of participants allocated to each arm of the trial. However, only 40% (95/236) reported the number who actually received the allocated intervention, 67% (158/236) the number lost to follow up in each arm of the trial, 61% (145/236) whether participants discontinued the intervention during the trial and 54% (128/236) the number included in the main analysis. Over half of published reports of randomized trials included a diagram showing the flow of participants through the trial. However, information was often missing from published flow diagrams, even in articles published in CONSORT endorsing journals. If important information is not reported it can be difficult and sometimes impossible to know if the conclusions of that trial are justified by the data presented.
NASA Astrophysics Data System (ADS)
Carlsohn, Matthias F.; Kemmling, André; Petersen, Arne; Wietzke, Lennart
2016-04-01
Cerebral aneurysms require endovascular treatment to eliminate potentially lethal hemorrhagic rupture by hemostasis of blood flow within the aneurysm. Devices (e.g. coils and flow diverters) promote hemostasis; however, measurement of blood flow within an aneurysm or cerebral vessel before and after device placement on a microscopic level has not been possible so far. This would allow better individualized treatment planning and improve device design and manufacture. For experimental analysis, direct measurement of real-time microscopic cerebrovascular flow in micro-structures may be an alternative to computed flow simulations. Applying microscopic aneurysm flow measurement on a regular basis, to empirically assess a high number of different anatomic shapes and the corresponding effect of different devices, would require a fast and reliable method at low cost with high-throughput assessment. Transparent three-dimensional (3D) models of brain vessels and aneurysms may be used for microscopic flow measurements by particle image velocimetry (PIV); however, up to now the size of the structures has set the limits for conventional 3D-imaging camera set-ups. On-line flow assessment requires additional computational power to cope with the processing of the large amounts of data generated by sequences of multi-view stereo images, e.g. generated by a light field camera capturing the 3D information by plenoptic imaging of complex flow processes. Recently, a fast and low-cost workflow for producing patient-specific three-dimensional models of cerebral arteries has been established by stereo-lithographic (SLA) 3D printing. These 3D arterial models are transparent and exhibit a replication precision within the submillimeter range required for accurate flow measurements under physiological conditions. We therefore test the feasibility of microscopic flow measurements by PIV analysis using a plenoptic camera system capturing light field image sequences. Averaging across a sequence of single, double, or triple shots of flashed images enables reconstruction of the real-time corpuscular flow through the vessel system before and after device placement. This approach could enable 3D insight into microscopic flow within blood vessels and aneurysms at submillimeter resolution. We present an approach that allows real-time assessment of 3D particle flow by high-speed light field image analysis, including a solution that addresses the high computational load of the image processing. The imaging set-up accomplishes fast and reliable PIV analysis in transparent 3D models of brain aneurysms at low cost. High-throughput microscopic flow assessment of different shapes of brain aneurysms, as required for patient-specific device design, may therefore become feasible.
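The core displacement-estimation step of PIV can be illustrated with a minimal sketch: cross-correlate an interrogation window from two successive frames and read the displacement off the correlation peak. This is a generic, hedged example on synthetic images; the frame data, window size and shift are made up, and the authors' plenoptic 3D pipeline is far more involved.

```python
# Estimate the displacement between two frames from the cross-correlation peak.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))                   # synthetic particle pattern
true_shift = (3, -2)                             # (rows, cols)
frame_b = np.roll(frame_a, true_shift, axis=(0, 1))

a = frame_a - frame_a.mean()
b = frame_b - frame_b.mean()
corr = correlate2d(b, a, mode="same")
peak = np.unravel_index(np.argmax(corr), corr.shape)
center = (corr.shape[0] // 2, corr.shape[1] // 2)
print("estimated displacement (pixels):",
      (peak[0] - center[0], peak[1] - center[1]))   # expect about (3, -2)
```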
Time-Resolved Rayleigh Scattering Measurements in Hot Gas Flows
NASA Technical Reports Server (NTRS)
Mielke, Amy F.; Elam, Kristie A.; Sung, Chih-Jen
2008-01-01
A molecular Rayleigh scattering technique is developed to measure time-resolved gas velocity, temperature, and density in unseeded gas flows at sampling rates up to 32 kHz. A high power continuous-wave laser beam is focused at a point in an air flow field and Rayleigh scattered light is collected and fiber-optically transmitted to the spectral analysis and detection equipment. The spectrum of the light, which contains information about the temperature and velocity of the flow, is analyzed using a Fabry-Perot interferometer. Photomultiplier tubes operated in the photon counting mode allow high frequency sampling of the circular interference pattern to provide time-resolved flow property measurements. Mean and rms velocity and temperature fluctuation measurements in both an electrically-heated jet facility with a 10-mm diameter nozzle and also in a hydrogen-combustor heated jet facility with a 50.8-mm diameter nozzle at NASA Glenn Research Center are presented.
Quantification of uncertainty for fluid flow in heterogeneous petroleum reservoirs
NASA Astrophysics Data System (ADS)
Zhang, Dongxiao
Detailed description of the heterogeneity of oil/gas reservoirs is needed to make performance predictions of oil/gas recovery. However, only limited measurements at a few locations are usually available. This combination of significant spatial heterogeneity with incomplete information about it leads to uncertainty about the values of reservoir properties and thus, to uncertainty in estimates of production potential. The theory of stochastic processes provides a natural method for evaluating these uncertainties. In this study, we present a stochastic analysis of transient, single phase flow in heterogeneous reservoirs. We derive general equations governing the statistical moments of flow quantities by perturbation expansions. These moments can be used to construct confidence intervals for the flow quantities (e.g., pressure and flow rate). The moment equations are deterministic and can be solved numerically with existing solvers. The proposed moment equation approach has certain advantages over the commonly used Monte Carlo approach.
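For contrast with the moment-equation approach described above, the following hedged sketch quantifies the same kind of uncertainty by brute-force Monte Carlo for a toy 1D steady flow problem with fixed boundary heads; the field statistics and discretization are illustrative assumptions, not the authors' model.

```python
# Monte Carlo head statistics for 1D steady flow through a random
# conductivity field with fixed heads at both ends (illustrative values).
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_real = 100, 2000
h_left, h_right = 10.0, 0.0

heads = np.empty((n_real, n_cells + 1))
for r in range(n_real):
    k = np.exp(rng.normal(0.0, 1.0, n_cells))   # lognormal conductivities
    resistance = np.cumsum(1.0 / k)             # series resistance of the cells
    frac = np.concatenate(([0.0], resistance / resistance[-1]))
    heads[r] = h_left + (h_right - h_left) * frac

mean_head = heads.mean(axis=0)
std_head = heads.std(axis=0)                    # basis for confidence intervals
print(f"mid-domain head: {mean_head[n_cells // 2]:.2f} +/- {std_head[n_cells // 2]:.2f}")
```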
Representation and display of vector field topology in fluid flow data sets
NASA Technical Reports Server (NTRS)
Helman, James; Hesselink, Lambertus
1989-01-01
The visualization of physical processes in general and of vector fields in particular is discussed. An approach to visualizing flow topology that is based on the physics and mathematics underlying the physical phenomenon is presented. It involves determining critical points in the flow where the velocity vector vanishes. The critical points, connected by principal lines or planes, determine the topology of the flow. The complexity of the data is reduced without sacrificing the quantitative nature of the data set. By reducing the original vector field to a set of critical points and their connections, a representation of the topology of a two-dimensional vector field that is much smaller than the original data set but retains with full precision the information pertinent to the flow topology is obtained. This representation can be displayed as a set of points and tangent curves or as a graph. Analysis (including algorithms), display, interaction, and implementation aspects are discussed.
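A minimal sketch of the central operation, locating critical points where the velocity vector vanishes and classifying them from the local Jacobian, is given below for a synthetic 2D field; the field, grid and thresholds are assumptions for illustration only.

```python
# Synthetic 2D vector field with known critical points (a center at x=0 and
# saddles at x=+/-1 on the y=0 axis); all parameters are illustrative.
import numpy as np

n = 201
x = np.linspace(-2, 2, n)
y = np.linspace(-2, 2, n)
X, Y = np.meshgrid(x, y, indexing="ij")
U, V = Y, -X + X**3
dx = x[1] - x[0]
speed = np.hypot(U, V)

for i in range(1, n - 1):
    for j in range(1, n - 1):
        # candidate critical point: local minimum of |velocity|, close to zero
        if speed[i, j] == speed[i-1:i+2, j-1:j+2].min() and speed[i, j] < 1e-2:
            # local Jacobian by central differences
            J = np.array([[U[i+1, j] - U[i-1, j], U[i, j+1] - U[i, j-1]],
                          [V[i+1, j] - V[i-1, j], V[i, j+1] - V[i, j-1]]]) / (2 * dx)
            eig = np.linalg.eigvals(J)
            if np.all(np.isreal(eig)):
                kind = "saddle" if eig.real.prod() < 0 else "node"
            else:
                kind = "center or focus"
            print(f"critical point near ({x[i]:+.2f}, {y[j]:+.2f}): {kind}")
```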
Pressure modulation algorithm to separate cerebral hemodynamic signals from extracerebral artifacts.
Baker, Wesley B; Parthasarathy, Ashwin B; Ko, Tiffany S; Busch, David R; Abramson, Kenneth; Tzeng, Shih-Yu; Mesquita, Rickson C; Durduran, Turgut; Greenberg, Joel H; Kung, David K; Yodh, Arjun G
2015-07-01
We introduce and validate a pressure measurement paradigm that reduces extracerebral contamination from superficial tissues in optical monitoring of cerebral blood flow with diffuse correlation spectroscopy (DCS). The scheme determines subject-specific contributions of extracerebral and cerebral tissues to the DCS signal by utilizing probe pressure modulation to induce variations in extracerebral blood flow. For analysis, the head is modeled as a two-layer medium and is probed with long and short source-detector separations. Then a combination of pressure modulation and a modified Beer-Lambert law for flow enables experimenters to linearly relate differential DCS signals to cerebral and extracerebral blood flow variation without a priori anatomical information. We demonstrate the algorithm's ability to isolate cerebral blood flow during a finger-tapping task and during graded scalp ischemia in healthy adults. Finally, we adapt the pressure modulation algorithm to ameliorate extracerebral contamination in monitoring of cerebral blood oxygenation and blood volume by near-infrared spectroscopy.
NASA Astrophysics Data System (ADS)
de Andrade, Ricardo Lopes; Rêgo, Leandro Chaves
2018-02-01
Social network analysis (SNA) studies the interactions among actors in a network formed through some relationship (friendship, cooperation, trade, among others). SNA is often approached from a binary point of view, i.e., only the presence or absence of a link between two actors is observed, regardless of the strength of that link. It is known that different information can be obtained from weighted and unweighted networks and that the information extracted from weighted networks is more accurate and detailed. Another rarely discussed approach in SNA relates to the individual attributes of the actors (nodes), because such analysis is usually focused on the topological structure of networks. Features of the nodes are not incorporated in SNA, which implies that some information is lost or misperceived in those analyses. This paper aims at exploring more precisely the complexities of a social network, first developing a method that inserts the individual attributes into the topological structure of the network and then analyzing the network in four different ways: unweighted, edge-weighted, and two methods using both edge weights and nodes' attributes. The international trade network was chosen for the application of this approach, where the nodes represent the countries, the links represent the cash flow in trade transactions, and countries' GDP was chosen as the node attribute. As a result, it is possible to observe which countries are most connected in the world economy and have the highest cash flows, to point out the countries that are central to the intermediation of the wealth flow, and those that benefit most from being included in this network. We also performed a principal component analysis to study which metrics are most influential in describing the data variability, which turn out to be mostly the weighted metrics that include the nodes' attributes.
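The contrast between unweighted, edge-weighted and attribute-aware measures can be sketched with a toy network; the four-country graph, GDP values and the attribute-scaled strength below are illustrative assumptions, not the paper's actual data or metrics.

```python
# Toy 4-country network; edge weights play the role of cash flows and the
# node attribute "gdp" plays the role of GDP (all numbers invented).
import networkx as nx

G = nx.Graph()
G.add_nodes_from([("A", {"gdp": 20.0}), ("B", {"gdp": 5.0}),
                  ("C", {"gdp": 3.0}), ("D", {"gdp": 1.0})])
G.add_weighted_edges_from([("A", "B", 8.0), ("A", "C", 2.0),
                           ("B", "C", 1.0), ("C", "D", 0.5)])

unweighted_degree = dict(G.degree())                 # binary view
strength = dict(G.degree(weight="weight"))           # edge-weighted view

# one simple way to fold in node attributes: scale each edge weight by the
# neighbour's share of total GDP
total_gdp = sum(d["gdp"] for _, d in G.nodes(data=True))
attr_strength = {n: sum(d["weight"] * G.nodes[m]["gdp"] / total_gdp
                        for m, d in G[n].items())
                 for n in G}

# betweenness treats weights as distances, so use inverse flows
for u, v, d in G.edges(data=True):
    d["dist"] = 1.0 / d["weight"]
betweenness = nx.betweenness_centrality(G, weight="dist")

print(unweighted_degree, strength, attr_strength, betweenness, sep="\n")
```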
Guidelines on CV networking information flow optimization for Texas.
DOT National Transportation Integrated Search
2017-03-01
Recognizing the fundamental role of information flow in future transportation applications, the research team investigated the quality and security of information flow in the connected vehicle (CV) environment. The research team identified key challe...
NASA Astrophysics Data System (ADS)
Florez, C.; Romero, M. A.; Ramirez, M. I.; Monsalve, G.
2013-05-01
In the elaboration of a hydrogeological conceptual model in regions of mining exploration where there is a significant presence of crystalline massif rocks, the influence of the physical and geometrical properties of rock discontinuities must be evaluated. We present the results of a structural analysis of rock discontinuities in a region of the Central Cordillera of Colombia (the upper and middle Bermellon Basin) in order to establish their hydrogeological characteristics and improve the conceptual hydrogeological model for the region. The geology of the study area consists of schists with quartz and mica and porphyritic rocks, in a region of high slopes with a nearly 10 m thick weathered layer. The main objective of this research is to infer the preferential flow directions of groundwater and to estimate the tensor of potential hydraulic conductivity by using surface information, avoiding the use of wells and packer tests. The first step of our methodology is an analysis of drainage directions to detect patterns of structural control on the run-off; after a field campaign of structural data collection, in which we compiled information on strike, dip, continuity, spacing, roughness, aperture and frequency, we built equal-area hydro-structural polar diagrams that indicate the potential directions of groundwater flow. These results are compared with records of Rock Quality Designation (RQD) that have been systematically taken from several mining exploration boreholes in the area of study. By using all this information we estimate the potential tensor of hydraulic conductivity from a cubic law, obtaining the three principal directions with conductivities of the order of 10⁻⁵ and 10⁻⁶ m/s; the most conductive joint family has a NE strike with a nearly vertical dip.
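A hedged sketch of the final estimation step: under a parallel-plate (cubic law) model, each joint set of aperture b and spacing s contributes a conductivity rho*g*b^3/(12*mu*s) in the plane of the set, and the contributions can be assembled into an equivalent conductivity tensor whose eigen-decomposition gives the principal directions. The joint-set orientations, apertures and spacings below are assumed illustration values, not the field data.

```python
# Cubic-law (parallel-plate) estimate of an equivalent hydraulic conductivity
# tensor from joint sets; orientations, apertures and spacings are invented.
# Coordinates: x = east, y = north, z = up; strike and dip in degrees.
import numpy as np

rho, g, mu = 1000.0, 9.81, 1.0e-3          # water density, gravity, viscosity (SI)

def unit_normal(strike_deg, dip_deg):
    """Unit normal of a plane given strike and dip (right-hand rule)."""
    s, d = np.radians(strike_deg), np.radians(dip_deg)
    dd = s + np.pi / 2                      # dip direction, clockwise from north
    return np.array([np.sin(dd) * np.sin(d),
                     np.cos(dd) * np.sin(d),
                     -np.cos(d)])

joint_sets = [   # (strike, dip, aperture b [m], spacing s [m]) -- assumed values
    (30.0, 80.0, 1.0e-4, 0.5),
    (120.0, 60.0, 5.0e-5, 0.3),
]

K = np.zeros((3, 3))
for strike, dip, b, spacing in joint_sets:
    n_vec = unit_normal(strike, dip)
    k_set = rho * g * b**3 / (12.0 * mu * spacing)      # cubic law, per set
    K += k_set * (np.eye(3) - np.outer(n_vec, n_vec))   # flow only in the plane

vals, vecs = np.linalg.eigh(K)
print("principal conductivities (m/s):", vals)           # order 1e-6 here
print("principal directions (columns):\n", vecs)
```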
Sibbald, Shannon L.; Wathen, C. Nadine; Kothari, Anita; Day, Adam M. B.
2013-01-01
Objective: Improving the process of evidence-based practice in primary health care requires an understanding of information exchange among colleagues. This study explored how clinically oriented research knowledge flows through multidisciplinary primary health care teams (PHCTs) and influences clinical decisions. Methods: This was an exploratory mixed-methods study with members of six PHCTs in Ontario, Canada. Quantitative data were collected using a questionnaire and analyzed with social network analysis (SNA) using UCINet. Qualitative data were collected using semi-structured interviews and analyzed with content analysis procedures using NVivo8. Results: It was found that obtaining research knowledge was perceived to be a shared responsibility among team members, whereas its application in patient care was seen as the responsibility of the team leader, usually the senior physician. PHCT members acknowledged the need for resources for information access, synthesis, interpretation, or management. Conclusion: Information sharing in interdisciplinary teams is a complex and multifaceted process. Specific interventions need to be improved such as formalizing modes of communication, better organizing knowledge-sharing activities, and improving the active use of allied health professionals. Despite movement toward team-based models, senior physicians are often gatekeepers of uptake of new evidence and changes in practice. PMID:23646028
Developing an emergency department crowding dashboard: A design science approach.
Martin, Niels; Bergs, Jochen; Eerdekens, Dorien; Depaire, Benoît; Verelst, Sandra
2017-08-30
As an emergency department (ED) is a complex adaptive system, the analysis of continuously gathered data is valuable to gain insight in the real-time patient flow. To support the analysis and management of ED operations, relevant data should be provided in an intuitive way. Within this context, this paper outlines the development of a dashboard which provides real-time information regarding ED crowding. The research project underlying this paper follows the principles of design science research, which involves the development and study of artifacts which aim to solve a generic problem. To determine the crowding indicators that are desired in the dashboard, a modified Delphi study is used. The dashboard is implemented using the open source Shinydashboard package in R. A dashboard is developed containing the desired crowding indicators, together with general patient flow characteristics. It is demonstrated using a dataset of a Flemish ED and fulfills the requirements which are defined a priori. The developed dashboard provides real-time information on ED crowding. This information enables ED staff to judge whether corrective actions are required in an effort to avoid the adverse effects of ED crowding. Copyright © 2017 Elsevier Ltd. All rights reserved.
1986-01-01
the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an ... labyrinth seal performance, the program included the development of an improved empirical design model to provide the calculation of the flow ... program. Phase I was directed to the analytical development of both an *analysis* model and an improved empirical *design* model. Supporting rig tests
[Research progress of ecosystem service flow].
Liu, Hui Min; Fan, Yu Long; Ding, Sheng Yan
2016-07-01
With the development of the social economy, human disturbance has resulted in the degradation or disappearance of a variety of ecosystem services. Ecosystem service flow plays an important role in the delivery, transformation and maintenance of ecosystem services, and has become one of the new research directions. In this paper, based on the classification of ecosystem service flows, we analyzed ecosystem service delivery carriers and investigated the mechanism of ecosystem service flow, including its information, properties, scale features, quantification and cartography. Moreover, a tentative analysis of the costs of ecosystem service flow (such as transportation costs, conversion costs, usage costs and relative costs) was made to examine the consumption and dissipation costs arising in the ecosystem service flow process. To a certain extent, the study of ecosystem service flow addresses the problem of "double counting" in ecosystem service valuation and can contribute to recognizing hotspots of ecosystem service supply and consumption. In addition, it would be conducive to maximizing ecosystem service benefits in the transmission process and to putting forward scientific and reasonable ecological compensation.
[A capillary blood flow velocity detection system based on linear array charge-coupled devices].
Zhou, Houming; Wang, Ruofeng; Dang, Qi; Yang, Li; Wang, Xiang
2017-12-01
In order to detect the flow characteristics of blood samples in a capillary, this paper introduces a blood flow velocity measurement system based on a field-programmable gate array (FPGA), a linear charge-coupled device (CCD) and personal computer (PC) software. Based on analysis of the TCD1703C and AD9826 device data sheets, the Verilog HDL hardware description language was used to design and simulate the driver. Image signal acquisition and extraction of the real-time edge information of the blood sample were carried out synchronously in the FPGA. A differential operation was then performed on the series of discrete displacements to track the displacement of the blood sample, so that the sample flow velocity could be obtained. Finally, the feasibility of the blood flow velocity detection system was verified by simulation and debugging. After drawing the flow velocity curve and analyzing the velocity characteristics, the significance of measuring blood flow velocity is discussed. The results show that the system's measurement is less time-consuming and less complex than other flow-rate monitoring schemes.
Beyond Metrics? The Role of Hydrologic Baseline Archetypes in Environmental Water Management.
Lane, Belize A; Sandoval-Solis, Samuel; Stein, Eric D; Yarnell, Sarah M; Pasternack, Gregory B; Dahlke, Helen E
2018-06-22
Balancing ecological and human water needs often requires characterizing key aspects of the natural flow regime and then predicting ecological response to flow alterations. Flow metrics are generally relied upon to characterize long-term average statistical properties of the natural flow regime (hydrologic baseline conditions). However, some key aspects of hydrologic baseline conditions may be better understood through more complete consideration of continuous patterns of daily, seasonal, and inter-annual variability than through summary metrics. Here we propose the additional use of high-resolution dimensionless archetypes of regional stream classes to improve understanding of baseline hydrologic conditions and inform regional environmental flows assessments. In an application to California, we describe the development and analysis of hydrologic baseline archetypes to characterize patterns of flow variability within and between stream classes. We then assess the utility of archetypes to provide context for common flow metrics and improve understanding of linkages between aquatic patterns and processes and their hydrologic controls. Results indicate that these archetypes may offer a distinct and complementary tool for researching mechanistic flow-ecology relationships, assessing regional patterns for streamflow management, or understanding impacts of changing climate.
Olson, Scott A.; Tasker, Gary D.; Johnston, Craig M.
2003-01-01
Estimates of the magnitude and frequency of streamflow are needed to safely and economically design bridges, culverts, and other structures in or near streams. These estimates also are used for managing floodplains, identifying flood-hazard areas, and establishing flood-insurance rates, but may be required at ungaged sites where no observed flood data are available for streamflow-frequency analysis. This report describes equations for estimating flow-frequency characteristics at ungaged, unregulated streams in Vermont. In the past, regression equations developed to estimate streamflow statistics required users to spend hours manually measuring basin characteristics for the stream site of interest. This report also describes the accompanying customized geographic information system (GIS) tool that automates the measurement of basin characteristics and calculation of corresponding flow statistics. The tool includes software that computes the accuracy of the results and adjustments for expected probability and for streamflow data of a nearby stream-gaging station that is either upstream or downstream and within 50 percent of the drainage area of the site where the flow-frequency characteristics are being estimated. The custom GIS can be linked to the National Flood Frequency program, adding the ability to plot peak-flow-frequency curves and synthetic hydrographs and to compute adjustments for urbanization.
Tracking interface and common curve dynamics for two-fluid flow in porous media
Mcclure, James E.; Miller, Cass T.; Gray, W. G.; ...
2016-04-29
Pore-scale studies of multiphase flow in porous medium systems can be used to understand transport mechanisms and quantitatively determine closure relations that better incorporate microscale physics into macroscale models. Multiphase flow simulators constructed using the lattice Boltzmann method provide a means to conduct such studies, including both the equilibrium and dynamic aspects. Moving, storing, and analyzing the large state space presents a computational challenge when highly-resolved models are applied. We present an approach to simulate multiphase flow processes in which in-situ analysis is applied to track multiphase flow dynamics at high temporal resolution. We compute a comprehensive set of measures of the phase distributions and the system dynamics, which can be used to aid fundamental understanding and inform closure relations for macroscale models. The measures computed include microscale point representations and macroscale averages of fluid saturations, the pressure and velocity of the fluid phases, interfacial areas, interfacial curvatures, interface and common curve velocities, interfacial orientation tensors, phase velocities and the contact angle between the fluid-fluid interface and the solid surface. Test cases are studied to validate the approach and illustrate how measures of system state can be obtained and used to inform macroscopic theory.
Yoo, Peter E; Hagan, Maureen A; John, Sam E; Opie, Nicholas L; Ordidge, Roger J; O'Brien, Terence J; Oxley, Thomas J; Moffat, Bradford A; Wong, Yan T
2018-06-01
Performing voluntary movements involves many regions of the brain, but it is unknown how they work together to plan and execute specific movements. We recorded high-resolution ultra-high-field blood-oxygen-level-dependent signal during a cued ankle-dorsiflexion task. The spatiotemporal dynamics and the patterns of task-relevant information flow across the dorsal motor network were investigated. We show that task-relevant information appears and decays earlier in the higher order areas of the dorsal motor network than in the primary motor cortex. Furthermore, the results show that task-relevant information is encoded in general terms initially, and selective goals are subsequently encoded in specific subregions across the network. Importantly, the patterns of recurrent information flow across the network vary across different subregions depending on the goal. Recurrent information flow was observed across all higher order areas of the dorsal motor network in the subregions encoding for the current goal. In contrast, in the subregions encoding for the opposing goal, only top-down information flow from the supplementary motor cortex to the frontoparietal regions was observed, with weakened recurrent information flow between the frontoparietal regions and weakened bottom-up information flow from the frontoparietal regions to the supplementary motor cortex. We conclude that selective motor goal encoding and execution rely on goal-dependent differences in subregional recurrent information flow patterns across the long-range dorsal motor network areas that exhibit graded functional specialization. © 2018 Wiley Periodicals, Inc.
Law, Andrew J.; Sharma, Gaurav; Schieber, Marc H.
2014-01-01
We present a methodology for detecting effective connections between simultaneously recorded neurons using an information transmission measure to identify the presence and direction of information flow from one neuron to another. Using simulated and experimentally-measured data, we evaluate the performance of our proposed method and compare it to the traditional transfer entropy approach. In simulations, our measure of information transmission outperforms transfer entropy in identifying the effective connectivity structure of a neuron ensemble. For experimentally recorded data, where ground truth is unavailable, the proposed method also yields a more plausible connectivity structure than transfer entropy. PMID:21096617
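For reference, the baseline method the authors compare against, transfer entropy, can be estimated with a plain histogram approach; the sketch below uses binary toy signals and a one-sample history, and does not reproduce the authors' information-transmission measure.

```python
# Plain histogram-based transfer entropy (one-sample histories, in bits),
# applied to binary toy signals where y is a delayed copy of x.
import numpy as np

def transfer_entropy(x, y, bins=2):
    """Estimate TE(X -> Y) = I(Y_future ; X_past | Y_past)."""
    x, y = np.asarray(x), np.asarray(y)
    xp, yp, yf = x[:-1], y[:-1], y[1:]
    joint, _ = np.histogramdd(np.column_stack([yf, yp, xp]),
                              bins=(bins, bins, bins))
    p = joint / joint.sum()
    p_ypxp = p.sum(axis=0, keepdims=True)      # p(y_past, x_past)
    p_yfyp = p.sum(axis=2, keepdims=True)      # p(y_future, y_past)
    p_yp = p.sum(axis=(0, 2), keepdims=True)   # p(y_past)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p * np.log2(p * p_yp / (p_yfyp * p_ypxp))
    return np.nansum(terms)

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)          # y is driven by x with a one-sample delay
y[0] = 0
print("TE(x -> y):", transfer_entropy(x, y))   # close to 1 bit
print("TE(y -> x):", transfer_entropy(y, x))   # close to 0
```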
Calculation of flow about posts and powerhead model
NASA Technical Reports Server (NTRS)
1988-01-01
A large number of computational fluid mechanics (CFD) problems were investigated. The primary studies include: the analysis of the turnaround duct/hot gas manifold/transfer tubes (fuel side) of the Space Shuttle Main Engine (SSME); the analysis of the LOX-T manifold (oxidizer side) of the SSME; the analysis of hydrogen accumulation in the Vandenberg flame trench; and modification of the Intel/VT241 systems to accommodate the EADS and PLOT 3D. Some of the analyses were exploratory in nature, using the CONTINUSYS code to provide preliminary information to enhance understanding of the problem, while in others the primary thrust was to acquire design information. In all cases the ability to predict information rapidly in these very complex analyses is seen to be an important demonstration of the power and utility of this mature predictive capability.
Space station data system analysis/architecture study. Task 4: System definition report. Appendix
NASA Technical Reports Server (NTRS)
1985-01-01
Appendices to the systems definition study for the space station Data System are compiled. Supplemental information on external interface specification, simulation and modeling, and function design characteristics is presented along with data flow diagrams, a data dictionary, and function allocation matrices.
Characterizing Mechanical and Flow Properties using Injection Falloff Tests, March 28, 2011
This presentation asserts that injection fall-off testing is an efficient way to derive in-situ information on most rock types, that after-closure analysis can derive rock transmissibility and pore fluid pressure, and that this information is used to assist in the HF process.
A Mobile Device for Measuring Regional Cerebral Circulation
Howard, George; Griffith, David W.; Stump, David A.; Hinschelwood, Laura
1980-01-01
Immobility and costs of currently available regional cerebral blood flow (rCBF) equipment usually require having a single fixed blood flow lab, which cannot be used to study non-ambulatory patients who are often the most interesting to study. After careful study of the information flow between the steps involved in the collection, analysis and display of data, a new rCBF machine has been developed with a mobile satellite and a host processor. The satellite is equipped with a Z-80 microprocessor which controls the data collection, screen formatting, data display and communications with the host. The host provides the processing power necessary for moderately complex curve fitting and data storage.
Analyzing lease/purchase options.
Ciolek, D; Mace, J D
1998-01-01
The authors' previous article, "Equipment Acquisition Using Various Forms of Leasing," covers information necessary for selecting among the different kinds of leases. This article explains how to carry out a proper financial analysis, preferably in two phases. Using a representative example, the article guides the reader through the first phase and introduces the elements needing review in the second phase. Key elements include pretax, aftertax, and cash flow analyses. Different organizations use different yardsticks to measure the financials of a transaction, but in general, cash is king. Therefore, the most widely used comparison is the purchase-versus-lease IRR (internal rate of return), produced by measuring the cash flow of the purchase case against the cash flow of the lease case.
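The purchase-versus-lease IRR comparison can be sketched as a root-finding problem on the net present value of the differential cash-flow stream; the cash flows below are made-up illustration numbers, not figures from the article.

```python
# Differential (purchase minus lease) cash flows for years 0..5; negative
# values are outflows. All numbers are invented for illustration.
import numpy as np
from scipy.optimize import brentq

def npv(rate, cashflows):
    t = np.arange(len(cashflows))
    return np.sum(np.asarray(cashflows) / (1.0 + rate) ** t)

purchase = [-100_000, -2_000, -2_000, -2_000, -2_000, 18_000]  # buy, maintain, salvage
lease    = [-24_000, -24_000, -24_000, -24_000, -24_000, 0]    # annual lease payments

differential = np.array(purchase) - np.array(lease)
irr = brentq(lambda r: npv(r, differential), -0.9, 10.0)        # IRR of buying vs leasing
print(f"IRR of purchasing instead of leasing: {irr:.1%}")
```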
Hypersonic Shock Interactions About a 25 deg/65 deg Sharp Double Cone
NASA Technical Reports Server (NTRS)
Moss, James N.; LeBeau, Gerald J.; Glass, Christopher E.
2002-01-01
This paper presents the results of a numerical study of shock interactions resulting from Mach 10 air flow about a sharp double cone. Computations are made with the direct simulation Monte Carlo (DSMC) method by using two different codes: the G2 code of Bird and the DAC (DSMC Analysis Code) code of LeBeau. The flow conditions are the pretest nominal free-stream conditions specified for the ONERA R5Ch low-density wind tunnel. The focus is on the sensitivity of the interactions to grid resolution while providing information concerning the flow structure and surface results for the extent of separation, heating, pressure, and skin friction.
Stirling, Christine; Lloyd, Barbara; Scott, Jenn; Abbey, Jenny; Croft, Toby; Robinson, Andrew
2012-03-29
This paper explores the meanings given by a diverse range of stakeholders to a decision aid aimed at helping carers of people in early to moderate stages of dementia (PWD) to select community based respite services. Decision aids aim to empower clients to share decision making with health professionals. However, the match between health professionals' perspectives on decision support needs and their clients' perspective is an important and often unstudied aspect of decision aid use. A secondary analysis was undertaken of qualitative data collected as part of a larger study. The data included twelve interviews with carers of people with dementia, three interviews with expert advisors, and three focus groups with health professionals. A theoretical analysis was conducted, drawing on theories of 'positioning' and professional identity. Health professionals are seen to hold varying attitudes and beliefs about carers' decision support needs, and these appeared to be grounded in the professional identity of each group. These attitudes and beliefs shaped their attitudes towards decision aids, the information they believed should be offered to dementia carers, and the timing of its offering. Some groups understood carers as needing to be protected from realistic information and consequently saw a need to filter information to carer clients. Health professionals' beliefs may cause them to restrict information flows, which can limit carers' ability to make decisions, and limit health services' ability to improve partnering and shared decision making. In an era where information is freely available to those with the resources to access it, we question whether health professionals should filter information.
Analysis of coherent dynamical processes through computer vision
NASA Astrophysics Data System (ADS)
Hack, M. J. Philipp
2016-11-01
Visualizations of turbulent boundary layers show an abundance of characteristic arc-shaped structures whose apparent similarity suggests a common origin in a coherent dynamical process. While the structures have been likened to the hairpin vortices observed in the late stages of transitional flow, a consistent description of the underlying mechanism has remained elusive. Detailed studies are complicated by the chaotic nature of turbulence which modulates each manifestation of the process and which renders the isolation of individual structures a challenging task. The present study applies methods from the field of computer vision to capture the time evolution of turbulent flow features and explore the associated physical mechanisms. The algorithm uses morphological operations to condense the structure of the turbulent flow field into a graph described by nodes and links. The low-dimensional geometric information is stored in a database and allows the identification and analysis of equivalent dynamical processes across multiple scales. The framework is not limited to turbulent boundary layers and can also be applied to different types of flows as well as problems from other fields of science.
Information retrieval from holographic interferograms: Fundamentals and problems
NASA Technical Reports Server (NTRS)
Vest, Charles M.
1987-01-01
Holographic interferograms can contain large amounts of information about flow and temperature fields. Their information content can be very high because they can be viewed from many different directions. This multidirectionality and fringe localization add to the information contained in the fringe pattern if diffuse illumination is used. Additional information and increased accuracy can be obtained through the use of dual reference wave holography to add reference fringes or to effect discrete phase shift or heterodyne interferometry. Automated analysis of fringes is possible if interferograms are of simple structure and good quality. However, in practice a large number of practical problems can arise, so that a difficult image processing task results.
Wellman, Tristan P.; Rupert, Michael G.
2016-03-03
The results of this investigation offer the foundational information needed for developing best management practices to mitigate nitrate contamination, basic concepts on water quality to aid public education, and information to guide regulatory measures if policy makers determine this is warranted. Science-based decision making will require continued monitoring and analysis of water quality in the future.
An Analysis of Air Force Management of Turbine Engine Monitoring Systems (TEMS).
1980-06-01
AIR FORCE AIR UNIVERSITY (ATC), AIR FORCE INSTITUTE OF TECHNOLOGY, Wright-Patterson Air Force Base, Ohio. ... detrimental ideas, or deleterious information are contained therein. Furthermore, the views expressed in the document are those of the author(s) and ... role problems, information flow and integration problems, and leadership and command problems. Four alternative management concepts were analyzed. Based
NASA Astrophysics Data System (ADS)
Ying, Shen; Li, Lin; Gao, Yurong
2009-10-01
Spatial visibility analysis is an important direction in the study of pedestrian behavior, because visual perception of space is the most direct way to obtain environmental information and to guide one's actions. Based on agent modeling and a top-down method, this paper develops a framework for analyzing pedestrian flow as it depends on visibility. We use viewsheds in the visibility analysis and impose the resulting parameters on agent simulation to direct agent motion in urban space. We analyze pedestrian behavior at the micro-scale and macro-scale of urban open space. An individual agent uses visual affordance to determine its direction of motion in a micro-scale urban street or district. At the macro-scale, we compare the distribution of pedestrian flow with the configuration of the urban environment, and mine the relationship between pedestrian flow and the distribution of urban facilities and urban functions. The paper first computes the visibility conditions at vantage points in urban open space, such as a street network, and quantifies the visibility parameters. The multiple agents use these visibility parameters to decide their directions of motion, and pedestrian flow finally reaches a stable state in the urban environment through the simulation of the multi-agent system. The paper compares the morphology of the visibility parameters and the pedestrian distribution with the urban function and facility layout to confirm the consistency between them, which can be used to support decision making in urban design.
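The visibility computation underlying such an agent model can be sketched as a simple isovist on an occupancy grid, obtained by casting rays from a vantage point until they are blocked; the grid layout, ray count and step size below are illustrative assumptions.

```python
# Isovist (visible area) from a vantage point on a 2D occupancy grid.
import numpy as np

grid = np.zeros((60, 60), dtype=bool)        # True = blocked (building)
grid[20:40, 25:30] = True                    # one slab-shaped building
vantage = np.array([30.0, 10.0])             # observer position (row, col)

def isovist_cells(grid, vantage, n_rays=720, max_dist=80.0, step=0.5):
    visible = set()
    for theta in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
        direction = np.array([np.sin(theta), np.cos(theta)])
        for d in np.arange(0.0, max_dist, step):
            i, j = np.rint(vantage + d * direction).astype(int)
            if not (0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]):
                break                        # ray left the study area
            if grid[i, j]:
                break                        # ray blocked by a building cell
            visible.add((i, j))
    return len(visible)

print("cells visible from the vantage point:", isovist_cells(grid, vantage))
```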
Renal blood flow dynamics in inbred rat strains provides insight into autoregulation.
A Mitrou, Nicholas G; Cupples, William A
2014-01-01
Renal autoregulation maintains stable renal blood flow in the face of constantly fluctuating blood pressure. Autoregulation is also the only mechanism that protects the delicate glomerular capillaries when blood pressure increases. In order to understand autoregulation, the renal blood flow response to changing blood pressure is studied. The steady-state response of blood flow is informative, but limits investigation of the individual mechanisms of autoregulation. The dynamics of autoregulation can be probed with transfer function analysis. The frequency-domain analysis of autoregulation allows investigators to probe the relative activity of each mechanism of autoregulation. We discuss the methodology and interpretation of transfer function analysis. Autoregulation is routinely studied in the rat, of which there are many inbred strains. There are multiple strains of rat that are either selected or inbred as models of human pathology. We discuss relevant characteristics of Brown Norway, Spontaneously hypertensive, Dahl, and Fawn-Hooded hypertensive rats and explore differences among these strains in blood pressure, dynamic autoregulation, and susceptibility to hypertensive renal injury. Finally, we show that the use of transfer function analysis in these rat strains has contributed to our understanding of the physiology and pathophysiology of autoregulation and hypertensive renal disease. Interestingly, all these strains demonstrate effective tubuloglomerular feedback, suggesting that this mechanism is not sufficient for effective autoregulation. In contrast, obligatory or conditional failure of the myogenic mechanism suggests that this component is both necessary and sufficient for autoregulation.
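A hedged sketch of the transfer function estimation step: with synthetic pressure and flow signals, the gain and phase follow from the cross- and auto-spectra as H(f) = Pxy(f)/Pxx(f). The sampling rate, toy "autoregulating" relationship and frequency band below are assumptions for illustration.

```python
# Transfer function (gain/phase) between synthetic pressure and flow signals.
import numpy as np
from scipy.signal import csd, welch

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(2)
pressure = rng.standard_normal(t.size)
# toy "autoregulating" flow: attenuated, slightly delayed pressure plus noise
flow = 0.3 * np.roll(pressure, 10) + 0.1 * rng.standard_normal(t.size)

f, pxx = welch(pressure, fs=fs, nperseg=2048)
_, pxy = csd(pressure, flow, fs=fs, nperseg=2048)
H = pxy / pxx                                # H1 transfer function estimate
gain, phase = np.abs(H), np.angle(H)

band = (f > 0.01) & (f < 0.3)                # band often examined for autoregulation
print("mean gain in 0.01-0.3 Hz band:", gain[band].mean())
```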
NASA Technical Reports Server (NTRS)
Mielke, Amy F.; Seasholtz, Richard G.; Elam, Kristie A.; Panda, Jayanta
2004-01-01
A molecular Rayleigh scattering based flow diagnostic is developed to measure time average velocity, density, temperature, and turbulence intensity in a 25.4-mm diameter nozzle free jet facility. The spectrum of the Rayleigh scattered light is analyzed using a Fabry-Perot interferometer operated in the static imaging mode. The resulting fringe pattern containing spectral information of the scattered light is recorded using a low noise CCD camera. Nonlinear least squares analysis of the fringe pattern using a kinetic theory model of the Rayleigh scattered light provides estimates of density, velocity, temperature, and turbulence intensity of the gas flow. Resulting flow parameter estimates are presented for an axial scan of subsonic flow at Mach 0.95 for comparison with previously acquired pitot tube data, and axial scans of supersonic flow in an underexpanded screeching jet. The issues related to obtaining accurate turbulence intensity measurements using this technique are discussed.
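The nonlinear least-squares step can be illustrated with a much-simplified stand-in model: below, a Gaussian spectral peak plus background replaces the kinetic-theory Rayleigh lineshape, and the fitted shift and width stand in for the velocity- and temperature-dependent quantities; all numbers are synthetic.

```python
# Fit a simplified (Gaussian + background) spectral model to synthetic
# photon-count data; shift and width stand in for velocity/temperature terms.
import numpy as np
from scipy.optimize import least_squares

def model(params, x):
    amp, shift, width, offset = params
    return amp * np.exp(-0.5 * ((x - shift) / width) ** 2) + offset

rng = np.random.default_rng(3)
x = np.linspace(-5, 5, 400)                   # frequency axis (arbitrary units)
truth = (1000.0, 0.8, 1.2, 50.0)              # amplitude, shift, width, background
data = rng.poisson(model(truth, x)).astype(float)   # shot-noise-like counts

fit = least_squares(lambda p: model(p, x) - data, x0=(500.0, 0.0, 1.0, 0.0))
amp, shift, width, offset = fit.x
print(f"fitted spectral shift {shift:.3f}, width {width:.3f}")
```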
Zochodne, Douglas W
2018-06-01
Over 3 decades ago, seminal work by Phillip Low and colleagues established exquisite physiology around the measurement of nerve blood flow (NBF). Although not widely explored recently, its connection to the clinic has awaited human methodology. While human studies have not achieved a convincing level of rigour, newer imaging technologies are offering early information. The peripheral nerve trunk has parallel blood flow compartments that include epineurial flow dominated by arteriovenous shunts and downstream endoneurial blood flow (EBF). NBF and EBF have lower values than central nervous system blood flow, lack autoregulation yet have sympathetic and peptidergic neurovascular control. Contrary to expectation, injury to nerves is often associated with rises in NBF rather than ischemia, a finding of biological interest corroborated by human studies. Despite its potential importance, quantitative human measurements of EBF and NBF are not yet available. However, with development, careful NBF analysis may present new insights into nerve disorders. Muscle Nerve 57: 884-895, 2018. © 2017 Wiley Periodicals, Inc.
Engels, M M A; Yu, M; Stam, C J; Gouw, A A; van der Flier, W M; Scheltens, Ph; van Straaten, E C W; Hillebrand, A
2017-01-01
In a recent magnetoencephalography (MEG) study, we found posterior-to-anterior information flow over the cortex in higher frequency bands in healthy subjects, with a reversed pattern in the theta band. A disruption of information flow may underlie clinical symptoms in Alzheimer's disease (AD). In AD, highly connected regions (hubs) in posterior areas are mostly disrupted. We therefore hypothesized that in AD the information flow from these hub regions would be disturbed. We used resting-state MEG recordings from 27 early-onset AD patients and 26 healthy controls. Using beamformer-based virtual electrodes, we estimated neuronal oscillatory activity for 78 cortical regions of interest (ROIs) and 12 subcortical ROIs of the AAL atlas, and calculated the directed phase transfer entropy (dPTE) as a measure of information flow between these ROIs. Group differences were evaluated using permutation tests and, for the AD group, associations between dPTE and general cognition or CSF biomarkers were determined using Spearman correlation coefficients. We confirmed the previously reported posterior-to-anterior information flow in the higher frequency bands in the healthy controls, and found it to be disturbed in the beta band in AD. Most prominently, the information flow from the precuneus and the visual cortex, towards frontal and subcortical structures, was decreased in AD. These disruptions did not correlate with cognitive impairment or CSF biomarkers. We conclude that AD pathology may affect the flow of information between brain regions, particularly from posterior hub regions, and that changes in the information flow in the beta band indicate an aspect of the pathophysiological process in AD.
Interdisciplinary barriers: An impediment to the effective application of systems engineering
NASA Technical Reports Server (NTRS)
Harrison, E., Jr.
1971-01-01
Interdisciplinary transfer of information and technology does not occur very readily, even for system planners, because of the existence of some very real barriers. These barriers to the flow of information and technology between disciplines represent one of the important difficulties associated with the application of systems analysis to many problems. The nature and characteristics of some of these barriers are enumerated and discussed in detail. A number of methodologies and techniques that have been specifically developed to aid in the transfer of technology and information across these interdisciplinary barriers are examined.
Quantitative holographic interferometry applied to combustion and compressible flow research
NASA Astrophysics Data System (ADS)
Bryanston-Cross, Peter J.; Towers, D. P.
1993-03-01
The application of holographic interferometry to phase object analysis is described. Emphasis has been given to a method of extracting quantitative information automatically from the interferometric fringe data. To achieve this, a carrier frequency has been added to the holographic data. This has made it possible, first, to form a phase map using a fast Fourier transform (FFT) algorithm, and then to 'solve', or unwrap, this image to give a contiguous density map using a minimum-weight spanning tree (MST) noise-immune algorithm known as fringe analysis (FRAN). Applications of this work to a burner flame and a compressible flow are presented. In both cases the spatial frequency of the fringes exceeds the resolvable limit of conventional digital framestores. Therefore, a flatbed scanner with a resolution of 3200 x 2400 pixels has been used to produce very high resolution digital images from photographs. This approach has allowed the processing of data despite the presence of caustics generated by strong thermal gradients at the edge of the combustion field. A similar example is presented from the analysis of a compressible transonic flow in the shock wave and trailing edge regions.
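A minimal 1D sketch of the carrier-fringe demodulation described above: take the FFT of the fringe signal, keep only the lobe around the carrier frequency, inverse-transform, remove the carrier and unwrap the phase. numpy's simple 1D unwrapper stands in for the MST-based FRAN algorithm, and the fringe signal is synthetic.

```python
# 1D carrier-fringe demodulation on a synthetic fringe record.
import numpy as np

n = 1024
x = np.arange(n)
carrier = 100.0 / n                              # carrier frequency, cycles/pixel
phi_true = 8.0 * np.sin(2 * np.pi * x / n)       # phase object (e.g. a density field)
fringes = 1.0 + np.cos(2 * np.pi * carrier * x + phi_true)

spectrum = np.fft.fft(fringes)
freqs = np.fft.fftfreq(n)
keep = (freqs > carrier / 2) & (freqs < 3 * carrier / 2)   # isolate the +carrier lobe
analytic = np.fft.ifft(np.where(keep, spectrum, 0.0))
demod = analytic * np.exp(-2j * np.pi * carrier * x)       # remove the carrier
phi_est = np.unwrap(np.angle(demod))                       # simple stand-in for MST unwrapping

print("max abs phase error (rad):", np.max(np.abs(phi_est - phi_true)))
```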
NASA Technical Reports Server (NTRS)
Lauer, H. V., Jr.; Ming, D. W.; Golden, D. C.; Lin, I.-C.; Boynton, W. V.
2000-01-01
Volatile-bearing minerals (e.g., Fe-oxyhydroxides, phyllosilicates, carbonates, and sulfates) may be important phases on the surface of Mars. In order to characterize these potential phases, the Thermal Evolved-Gas Analyzer (TEGA), which was onboard the Mars Polar Lander, was to have performed differential scanning calorimetry (DSC) and evolved-gas analysis of soil samples collected from the surface. The sample chamber in TEGA operates at about 100 mbar (approximately 76 torr) with a N2 carrier gas flow of 0.4 sccm. Essentially no information exists on the effects of reduced pressure on the thermal properties of volatile-bearing minerals. In support of TEGA, we have constructed a laboratory analog for TEGA from commercial instrumentation. We connected together a commercial differential scanning calorimeter, a quadrupole mass spectrometer, a vacuum pump, digital pressure gauge, electronic mass flow meter, gas "K" bottles, gas dryers, and high and low pressure regulators using a collection of shut-off and needle valves. Our arrangement allows us to vary and control the pressure and carrier gas flow rate inside the calorimeter oven chamber.
Selectivity to Translational Egomotion in Human Brain Motion Areas
Pitzalis, Sabrina; Sdoia, Stefano; Bultrini, Alessandro; Committeri, Giorgia; Di Russo, Francesco; Fattori, Patrizia; Galletti, Claudio; Galati, Gaspare
2013-01-01
The optic flow generated when a person moves through the environment can be locally decomposed into several basic components, including radial, circular, translational and spiral motion. Since their analysis plays an important part in the visual perception and control of locomotion and posture it is likely that some brain regions in the primate dorsal visual pathway are specialized to distinguish among them. The aim of this study is to explore the sensitivity to different types of egomotion-compatible visual stimulations in the human motion-sensitive regions of the brain. Event-related fMRI experiments, 3D motion and wide-field stimulation, functional localizers and brain mapping methods were used to study the sensitivity of six distinct motion areas (V6, MT, MST+, V3A, CSv and an Intra-Parietal Sulcus motion [IPSmot] region) to different types of optic flow stimuli. Results show that only areas V6, MST+ and IPSmot are specialized in distinguishing among the various types of flow patterns, with a high response for the translational flow which was maximum in V6 and IPSmot and less marked in MST+. Given that during egomotion the translational optic flow conveys differential information about the near and far external objects, areas V6 and IPSmot likely process visual egomotion signals to extract information about the relative distance of objects with respect to the observer. Since area V6 is also involved in distinguishing object-motion from self-motion, it could provide information about location in space of moving and static objects during self-motion, particularly in a dynamically unstable environment. PMID:23577096
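For concreteness, the four basic flow components named above can be written down directly as vector fields on a pixel grid; the sketch below is purely illustrative and unrelated to the actual stimuli or fMRI analysis.

```python
# Radial, circular, translational and spiral flow fields on a small grid.
import numpy as np

n = 9
y, x = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing="ij")

flows = {
    "radial":        (x, y),                                  # expansion from the centre
    "circular":      (-y, x),                                 # rotation about the centre
    "translational": (np.ones_like(x), np.zeros_like(y)),     # uniform rightward motion
    "spiral":        (x - y, y + x),                          # expansion plus rotation
}

for name, (u, v) in flows.items():
    print(f"{name:13s} mean speed = {np.hypot(u, v).mean():.2f}")
```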
Xu, Nan; Spreng, R Nathan; Doerschuk, Peter C
2017-01-01
Resting-state functional MRI (rs-fMRI) is widely used to noninvasively study human brain networks. Network functional connectivity is often estimated by calculating the time-series correlation between blood-oxygen-level dependent (BOLD) signal from different regions of interest (ROIs). However, standard correlation cannot characterize the direction of information flow between regions. In this paper, we introduce and test a new concept, prediction correlation, to estimate effective connectivity in functional brain networks from rs-fMRI. In this approach, the correlation between two BOLD signals is replaced by a correlation between one BOLD signal and a prediction of this signal via a causal system driven by another BOLD signal. Three validations are described: (1) Prediction correlation performed well on simulated data where the ground truth was known, and outperformed four other methods. (2) On simulated data designed to display the "common driver" problem, prediction correlation did not introduce false connections between non-interacting driven ROIs. (3) On experimental data, prediction correlation recovered the previously identified network organization of the human brain. Prediction correlation scales well to work with hundreds of ROIs, enabling it to assess whole brain interregional connectivity at the single subject level. These results provide an initial validation that prediction correlation can capture the direction of information flow and estimate the duration of extended temporal delays in information flow between ROIs based on the BOLD signal. This approach not only maintains the high sensitivity to network connectivity provided by the correlation analysis, but also performs well in the estimation of causal information flow in the brain.
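A hedged toy version of the idea: predict one signal from past samples of the other with a causal linear (FIR) model fitted by least squares, then correlate the prediction with the target. The asymmetry of this "prediction correlation" indicates the direction of information flow, unlike the plain zero-lag correlation; the model order, lag and noise level below are assumptions, and this is not the authors' estimator.

```python
# "Prediction correlation" toy: correlate y with a causal prediction of y
# built from past samples of x, and compare with the reverse direction.
import numpy as np

rng = np.random.default_rng(4)
n, lag = 2000, 3
x = rng.standard_normal(n)
y = 0.8 * np.roll(x, lag) + 0.3 * rng.standard_normal(n)   # y driven by x (3-sample delay)

def prediction_correlation(driver, target, order=8):
    X = np.array([driver[t - order:t] for t in range(order, len(driver))])
    tv = target[order:]
    coef, *_ = np.linalg.lstsq(X, tv, rcond=None)           # causal FIR fit
    return np.corrcoef(X @ coef, tv)[0, 1]

print("plain corr(x, y)      :", np.corrcoef(x, y)[0, 1])        # near zero at lag 0
print("prediction corr x -> y:", prediction_correlation(x, y))   # high
print("prediction corr y -> x:", prediction_correlation(y, x))   # near zero
```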
Transient radiative energy transfer in incompressible laminar flows
NASA Technical Reports Server (NTRS)
Tiwari, S. N.; Singh, D. J.
1987-01-01
Analysis and numerical procedures are presented to investigate the transient radiative interactions of nongray absorbing-emitting species in laminar fully-developed flows between two parallel plates. The particular species considered are OH, CO, CO2, and H2O and different mixtures of these. Transient and steady-state results are obtained for the temperature distribution and bulk temperature for different plate spacings, wall temperatures, and pressures. Results, in general, indicate that the rate of radiative heating can be quite high during earlier times. This information is useful in designing thermal protection systems for transient operations.
Analysis of 2D Phase Contrast MRI in Renal Arteries by Self Organizing Maps
NASA Astrophysics Data System (ADS)
Zöllner, Frank G.; Schad, Lothar R.
We present an approach based on self-organizing maps to segment the renal arteries from 2D PC cine MR images in order to measure blood velocity and flow. Such information is important in grading renal artery stenosis and supports decisions on surgical interventions such as percutaneous transluminal angioplasty. Results show that the renal arteries could be extracted automatically. The corresponding velocity profiles show high correlation (r=0.99) with those from manually delineated vessels. Furthermore, the method could detect possible blood flow patterns within the vessel.
Adeeb A. Rahman; Thomas J. Urbanik; Mustafa Mahamid
2006-01-01
This paper presents a model using finite element method to study the response of a typical commercial corrugated fiberboard due to an induced moisture function at one side of the fiberboard. The model predicts how the moisture diffusion will permeate through the fiberboardâs layers(medium and liners) providing information on moisture content at any given point...
Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel
2012-04-01
High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
Time-Dependent Simulations of Turbopump Flows
NASA Technical Reports Server (NTRS)
Kris, Cetin C.; Kwak, Dochan
2001-01-01
The objective of the current effort is to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine, including high-fidelity unsteady turbopump flow analysis. This capability is needed to support the design of pump sub-systems for advanced space transportation vehicles that are likely to involve liquid propulsion systems. To date, computational tools for design/analysis of turbopump flows are based on relatively lower fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available for real-world engineering applications. The present effort will provide developers with information on transient flow phenomena at start-up, the impact of non-uniform inflows, and system vibration and its effect on the structure. In this paper, progress toward the capability of complete simulation of the turbopump for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbopump is used as a test case for evaluation of the hybrid MPI/OpenMP and MLP versions of the INS3D code. The relative motion of the grid systems for the rotor-stator interaction was obtained using overset grid techniques. Time-accuracy of the scheme has been evaluated with simple test cases. Unsteady computations for the SSME turbopump, which contains 114 zones with 34.5 million grid points, are carried out on Origin 2000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability will be presented along with the performance of parallel versions of the code.
NASA Astrophysics Data System (ADS)
Patra, Bishnubrata; Peng, Chien-Chung; Liao, Wei-Hao; Lee, Chau-Hwang; Tung, Yi-Chung
2016-02-01
Three-dimensional (3D) tumor spheroids possess great potential as an in vitro model to improve predictive capacity for pre-clinical drug testing. In this paper, we combine advantages of flow cytometry and microfluidics to perform drug testing and analysis on a large number (5000) of uniformly sized tumor spheroids. The spheroids are formed, cultured, and treated with drugs inside a microfluidic device. The spheroids can then be harvested from the device without tedious operation. Due to the ample cell numbers, the spheroids can be dissociated into single cells for flow cytometry analysis. Flow cytometry provides statistical information at single-cell resolution, which makes it feasible to better investigate drug functions on cells in a more in vivo-like 3D formation. In the experiments, human hepatocellular carcinoma cells (HepG2) are exploited to form tumor spheroids within the microfluidic device, and three anti-cancer drugs: Cisplatin, Resveratrol, and Tirapazamine (TPZ), and their combinations are tested on the tumor spheroids with two different sizes. The experimental results suggest the cell culture format (2D monolayer vs. 3D spheroid) and spheroid size play critical roles in drug responses, and also demonstrate the advantages of bridging the two techniques in pharmaceutical drug screening applications.
Conductance Steamflow relationship
Whitney Trainor-Guitton
2015-04-01
These histograms represent our calibration of conductance of a volcanic geothermal field (with a clay cap) and the observed steam flow rates. See the following paper for further description: Trainor-Guitton, Hoversten, Nordquist, Intani, Value of information analysis using geothermal field data: accounting for multiple interpretations & determining new drilling locations. SEG Abstracts 2015.
Nardi, Valentina; Pulluqi, Olja; Abramson, Jeremy S; Dal Cin, Paola; Hasserjian, Robert P
2015-06-01
Bone marrow (BM) evaluation is an important part of lymphoma staging, which guides patient management. Although positive staging marrow is defined as morphologically identifiable disease, such samples often also include flow cytometric analysis and conventional karyotyping. Cytogenetic analysis is a labor-intensive and costly procedure and its utility in this setting is uncertain. We retrospectively reviewed pathological reports of 526 staging marrow specimens in which conventional karyotyping had been performed. All samples originated from a single institution from patients with previously untreated Hodgkin and non-Hodgkin lymphomas presenting in an extramedullary site. Cytogenetic analysis revealed clonal abnormalities in only eight marrow samples (1.5%), all of which were positive for lymphoma by morphologic evaluation. Flow cytometry showed a small clonal lymphoid population in three of the 443 morphologically negative marrow samples (0.7%). Conventional karyotyping is rarely positive in lymphoma staging marrow samples and, in our cohort, the BM karyotype did not contribute clinically relevant information in the vast majority of cases. Our findings suggest that karyotyping should not be performed routinely on BM samples taken to stage previously diagnosed extramedullary lymphomas unless there is pathological evidence of BM involvement by lymphoma. © 2015 Wiley Periodicals, Inc.
Nandi, Anjan K; Sumana, Annagiri; Bhattacharya, Kunal
2014-12-06
Social insects provide an excellent platform to investigate flow of information in regulatory systems since their successful social organization is essentially achieved by effective information transfer through complex connectivity patterns among the colony members. Network representation of such behavioural interactions offers a powerful tool for structural as well as dynamical analysis of the underlying regulatory systems. In this paper, we focus on the dominance interaction networks in the tropical social wasp Ropalidia marginata, a species where behavioural observations indicate that such interactions are principally responsible for the transfer of information between individuals about their colony needs, resulting in a regulation of their own activities. Our research reveals that the dominance networks of R. marginata are structurally similar to a class of naturally evolved information processing networks, a fact confirmed also by the predominance of a specific substructure, the 'feed-forward loop', a key functional component in many other information transfer networks. The dynamical analysis through Boolean modelling confirms that the networks are sufficiently stable under small fluctuations and yet capable of more efficient information transfer compared to their randomized counterparts. Our results suggest the involvement of a common structural design principle in different biological regulatory systems and a possible similarity with respect to the effect of selection on the organization levels of such systems. The findings are also consistent with the hypothesis that dominance behaviour has been shaped by natural selection to co-opt the information transfer process in such social insect species, in addition to its primal function of mediation of reproductive competition in the colony. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
NASA Astrophysics Data System (ADS)
Anglés, Marc; Folch, Albert; Oms, Oriol; Maestro, Eudald; Mas-Pla, Josep
2017-12-01
Hydrogeological models of mountain regions present the opportunity to understand the role of geological factors on groundwater resources. The effects of sedimentary facies and fracture distribution on groundwater flow and resource exploitation are studied in the ancient fan delta of Sant Llorenç de Munt (central Catalonia, Spain) by integrating geological field observations (using sequence stratigraphy methods) and hydrogeological data (pumping tests, hydrochemistry and environmental isotopes). A comprehensive analysis of data portrays the massif as a single unit, constituted by different compartments determined by specific layers and sets of fractures. Two distinct flow systems—local and regional—are identified based on pumping test analysis as well as hydrochemical and isotopic data. Drawdown curves derived from pumping tests indicate that the behavior of the saturated layers, whose main porosity is given by the fracture network, corresponds to a confined aquifer. Pumping tests also reflect a double porosity within the system and the occurrence of impervious boundaries that support a compartmentalized model for the whole aquifer system. Hydrochemical data and associated spatial evolution show the result of water-rock interaction along the flow lines. Concentration of magnesium, derived from dolomite dissolution, is a tracer of the flow-path along distinct stratigraphic units. Water stable isotopes indicate that evaporation (near a 5% loss) occurs in a thick unsaturated zone within the massif before infiltration reaches the water table. The hydrogeological analysis of this outcropping system provides a methodology for the conceptualization of groundwater flow in similar buried systems where logging and hydrogeological information are scarce.
Load flow and state estimation algorithms for three-phase unbalanced power distribution systems
NASA Astrophysics Data System (ADS)
Madvesh, Chiranjeevi
Distribution load flow and state estimation are two important functions in distribution energy management systems (DEMS) and advanced distribution automation (ADA) systems. Distribution load flow analysis is a tool which helps to analyze the status of a power distribution system under steady-state operating conditions. In this research, an effective and comprehensive load flow algorithm is developed to extensively incorporate the distribution system components. Distribution system state estimation is a mathematical procedure which aims to estimate the operating states of a power distribution system by utilizing the information collected from available measurement devices in real-time. An efficient and computationally effective state estimation algorithm adapting the weighted-least-squares (WLS) method has been developed in this research. Both the developed algorithms are tested on different IEEE test-feeders and the results obtained are justified.
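As a rough illustration of the weighted-least-squares step at the core of such estimators, the sketch below solves a single linear WLS problem; a real distribution-system estimator iterates this on a linearized three-phase measurement model, which is not reproduced here. The matrix H, measurements z, and standard deviations sigma are hypothetical.

```python
import numpy as np

def wls_state_estimate(H, z, sigma):
    """Single linear WLS solve: minimize (z - H x)^T W (z - H x), W = diag(1/sigma^2).

    Practical distribution-system state estimators apply this repeatedly to a
    linearized nonlinear measurement model h(x); this sketch keeps h linear.
    """
    W = np.diag(1.0 / np.asarray(sigma, float) ** 2)
    G = H.T @ W @ H                          # gain matrix
    return np.linalg.solve(G, H.T @ W @ z)   # weighted least-squares estimate

# Hypothetical 3-measurement, 2-state example.
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]])
z = np.array([1.02, 0.98, 0.05])
sigma = np.array([0.01, 0.01, 0.02])
print(wls_state_estimate(H, z, sigma))
```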
Progress on a Rayleigh Scattering Mass Flux Measurement Technique
NASA Technical Reports Server (NTRS)
Mielke-Fagan, Amy F.; Clem, Michelle M.; Elam, Kristie A.; Hirt, Stefanie M.
2010-01-01
A Rayleigh scattering diagnostic has been developed to provide mass flux measurements in wind tunnel flows. Spectroscopic molecular Rayleigh scattering is an established flow diagnostic tool that has the ability to provide simultaneous density and velocity measurements in gaseous flows. Rayleigh scattered light from a focused 10 Watt continuous-wave laser beam is collected and fiber-optically transmitted to a solid Fabry-Perot etalon for spectral analysis. The circular interference pattern that contains the spectral information that is needed to determine the flow properties is imaged onto a CCD detector. Baseline measurements of density and velocity in the test section of the 15 cm x 15 cm Supersonic Wind Tunnel at NASA Glenn Research Center are presented as well as velocity measurements within a supersonic combustion ramjet engine isolator model installed in the tunnel test section.
Assessment of Geometry and In-Flow Effects on Contra-Rotating Open Rotor Broadband Noise Predictions
NASA Technical Reports Server (NTRS)
Zawodny, Nikolas S.; Nark, Douglas M.; Boyd, D. Douglas, Jr.
2015-01-01
Application of previously formulated semi-analytical models for the prediction of broadband noise due to turbulent rotor wake interactions and rotor blade trailing edges is performed on the historical baseline F31/A31 contra-rotating open rotor configuration. Simplified two-dimensional blade element analysis is performed on cambered NACA 4-digit airfoil profiles, which are meant to serve as substitutes for the actual rotor blade sectional geometries. Rotor in-flow effects such as induced axial and tangential velocities are incorporated into the noise prediction models based on supporting computational fluid dynamics (CFD) results and simplified in-flow velocity models. Emphasis is placed on the development of simplified rotor in-flow models for the purpose of performing accurate noise predictions independent of CFD information. The broadband predictions are found to compare favorably with experimental acoustic results.
Instability of cooperative adaptive cruise control traffic flow: A macroscopic approach
NASA Astrophysics Data System (ADS)
Ngoduy, D.
2013-10-01
This paper proposes a macroscopic model to describe the operations of cooperative adaptive cruise control (CACC) traffic flow, which is an extension of adaptive cruise control (ACC) traffic flow. In CACC traffic flow a vehicle can exchange information with many preceding vehicles through wireless communication. Due to such communication the CACC vehicle can follow its leader at a closer distance than the ACC vehicle. The stability diagrams are constructed from the developed model based on the linear and nonlinear stability method for a certain model parameter set. It is found analytically that CACC vehicles enhance the stabilization of traffic flow with respect to both small and large perturbations compared to ACC vehicles. Numerical simulation is carried out to support our analytical findings. Based on the nonlinear stability analysis, we will show analytically and numerically that the CACC system better improves the dynamic equilibrium capacity over the ACC system. We have argued that, in parallel to microscopic models for CACC traffic flow, the newly developed macroscopic model will provide a complete insight into the dynamics of intelligent traffic flow.
Flow cytometric analysis of cell-surface and intracellular antigens in leukemia diagnosis.
Knapp, W; Strobl, H; Majdic, O
1994-12-15
New technology allows highly sensitive flow cytometric detection and quantitative analysis of intracellular antigens in normal and malignant hemopoietic cells. With this technology, the earliest stages of myeloid and lymphoid differentiation can easily and reliably be identified using antibodies directed against (pro-)myeloperoxidase/MPO, CD22 and CD3 antigens, respectively. Particularly for the analysis of undifferentiated acute myeloblastic leukemia (AML) cells, the immunological demonstration of intracellular MPO or its enzymatically inactive proforms is highly relevant, since other myeloid marker molecules such as CD33, CD13, or CDw65 are either not restricted to the granulomonocytic lineage or appear later in differentiation. By combining MPO staining with staining for lactoferrin (LF), undifferentiated cells can be distinguished from the granulomonocytic maturation compartment in bone marrow, since LF is selectively expressed from the myelocyte stage of differentiation onward. The list of informative intracellular antigens to be used in leukemia cell analysis will certainly expand in the near future. One candidate, intracellular CD68, has already been tested by us, and results are presented. Also dealt with in this article are surface marker molecules not (as yet) widely used in leukemia cell analysis but with the potential to provide important additional information. Among them are the surface structures CD15, CD15s, CDw65, CD79a (MB-1), CD79b (B29), CD87 (uPA-R), and CD117 (c-kit).
Estimating magnitude and frequency of floods using the PeakFQ 7.0 program
Veilleux, Andrea G.; Cohn, Timothy A.; Flynn, Kathleen M.; Mason, Jr., Robert R.; Hummel, Paul R.
2014-01-01
Flood-frequency analysis provides information about the magnitude and frequency of flood discharges based on records of annual maximum instantaneous peak discharges collected at streamgages. The information is essential for defining flood-hazard areas, for managing floodplains, and for designing bridges, culverts, dams, levees, and other flood-control structures. Bulletin 17B (B17B) of the Interagency Advisory Committee on Water Data (IACWD, 1982) codifies the standard methodology for conducting flood-frequency studies in the United States. B17B specifies that annual peak-flow data are to be fit to a log-Pearson Type III distribution. Specific methods are also prescribed for improving skew estimates using regional skew information, tests for high and low outliers, adjustments for low outliers and zero flows, and procedures for incorporating historical flood information. The authors of B17B identified various needs for methodological improvement and recommended additional study. In response to these needs, the Advisory Committee on Water Information (ACWI, successor to IACWD; http://acwi.gov/), through its Subcommittee on Hydrology (SOH) Hydrologic Frequency Analysis Work Group (HFAWG), has recommended modest changes to B17B. These changes include adoption of a generalized method-of-moments estimator denoted the Expected Moments Algorithm (EMA) (Cohn and others, 1997) and a generalized version of the Grubbs-Beck test for low outliers (Cohn and others, 2013). The SOH requested that the USGS implement these changes in a user-friendly, publicly accessible program.
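As a rough sketch of the underlying Bulletin 17B calculation, the following fits a log-Pearson Type III distribution to annual peaks by the simple method of moments; it omits the EMA estimator, regional skew weighting, and the generalized Grubbs-Beck low-outlier test that PeakFQ 7.0 implements. The example peak series is hypothetical.

```python
import numpy as np
from scipy.stats import pearson3, skew

def lp3_quantile(peaks, return_period):
    """T-year peak discharge from a log-Pearson Type III fit by method of moments
    (station skew only; no EMA, regional skew weighting, or low-outlier tests)."""
    logq = np.log10(np.asarray(peaks, float))
    m, s = logq.mean(), logq.std(ddof=1)
    g = skew(logq, bias=False)                        # station skew of the log peaks
    k = pearson3.ppf(1.0 - 1.0 / return_period, g)    # frequency factor K_T
    return 10 ** (m + k * s)

# Hypothetical annual peak series (cfs); estimate the 100-year flood.
peaks = [3200, 1800, 5400, 2500, 4100, 900, 7600, 3000, 2200, 6100,
         1500, 4800, 2700, 3900, 8800, 2100, 3400, 5000, 1200, 4500]
print(round(lp3_quantile(peaks, 100)))
```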
Shift in Global Tantalum Mine Production, 2000–2014
Bleiwas, Donald I.; Papp, John F.; Yager, Thomas R.
2015-12-10
One of the activities of the U.S. Geological Survey National Minerals Information Center (USGS-NMIC) is to analyze global supply chains and characterize major components of mineral and material flows from ore extraction through processing to first tier products. These analyses support the core mission of the USGS-NMIC as the Federal entity responsible for the collection, analysis, and dissemination of objective, unbiased, factual information on minerals essential to the U.S. economy and national security.
Discovering Hidden Controlling Parameters using Data Analytics and Dimensional Analysis
NASA Astrophysics Data System (ADS)
Del Rosario, Zachary; Lee, Minyong; Iaccarino, Gianluca
2017-11-01
Dimensional Analysis is a powerful tool, one which takes a priori information and produces important simplifications. However, if this a priori information - the list of relevant parameters - is missing a relevant quantity, then the conclusions from Dimensional Analysis will be incorrect. In this work, we present novel conclusions in Dimensional Analysis, which provide a means to detect this failure mode of missing or hidden parameters. These results are based on a restated form of the Buckingham Pi theorem that reveals a ridge function structure underlying all dimensionless physical laws. We leverage this structure by constructing a hypothesis test based on sufficient dimension reduction, allowing for an experimental data-driven detection of hidden parameters. Both theory and examples will be presented, using classical turbulent pipe flow as the working example. Keywords: experimental techniques, dimensional analysis, lurking variables, hidden parameters, buckingham pi, data analysis. First author supported by the NSF GRFP under Grant Number DGE-114747.
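The restated Buckingham Pi computation referenced here can be illustrated mechanically: write each parameter's exponents in the base units as a column of a dimensional matrix, and the null space of that matrix gives the exponent vectors of the dimensionless groups. The pipe-flow parameter set below is a standard textbook choice assumed for illustration, not necessarily the paper's exact list, and the hidden-parameter hypothesis test itself is not reproduced.

```python
from sympy import Matrix

# Dimensional matrix for fully developed pipe flow. Columns are the parameters
# (density rho, velocity U, diameter D, pressure gradient dp/dx, viscosity mu);
# rows are their exponents in the base units mass M, length L, time T.
#                 rho   U   D  dp/dx  mu
dim_matrix = Matrix([
    [ 1,  0,  0,  1,  1],   # M
    [-3,  1,  1, -2, -1],   # L
    [ 0, -1,  0, -2, -1],   # T
])

# Each nullspace vector lists exponents that make the product of the parameters
# dimensionless -- one Buckingham Pi group per vector. With this ordering they
# correspond to a pressure-gradient coefficient, dp/dx * D / (rho * U^2), and
# the inverse Reynolds number, mu / (rho * U * D).
for vec in dim_matrix.nullspace():
    print(list(vec))
```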
A grid-embedding transonic flow analysis computer program for wing/nacelle configurations
NASA Technical Reports Server (NTRS)
Atta, E. H.; Vadyak, J.
1983-01-01
An efficient grid-interfacing zonal algorithm was developed for computing the three-dimensional transonic flow field about wing/nacelle configurations. The algorithm uses the full-potential formulation and the AF2 approximate factorization scheme. The flow field solution is computed using a component-adaptive grid approach in which separate grids are employed for the individual components in the multi-component configuration, where each component grid is optimized for a particular geometry such as the wing or nacelle. The wing and nacelle component grids are allowed to overlap, and flow field information is transmitted from one grid to another through the overlap region using trivariate interpolation. This report presents a discussion of the computational methods used to generate both the wing and nacelle component grids, the technique used to interface the component grids, and the method used to obtain the inviscid flow solution. Computed results and correlations with experiment are presented. Also presented are discussions on the organization of the wing grid generation (GRGEN3) and nacelle grid generation (NGRIDA) computer programs, the grid interface (LK) computer program, and the wing/nacelle flow solution (TWN) computer program. Descriptions of the respective subroutines, definitions of the required input parameters, a discussion on interpretation of the output, and sample cases illustrating application of the analysis are provided for each of the four computer programs.
Dynamics of traffic flow with real-time traffic information
NASA Astrophysics Data System (ADS)
Yokoya, Yasushi
2004-01-01
We studied the dynamics of traffic flow with real-time information provided. Provision of real-time traffic information based on advancements in telecommunication technology is expected to facilitate the efficient utilization of available road capacity. Such a system is of interest not only for road-usage engineering but also for the science of complex systems. In the system, the information plays the role of a feedback connecting microscopic and macroscopic phenomena beyond the hierarchical structure of statistical physics. In this paper, we tried to clarify how the information works in a network of traffic flow from the perspective of statistical physics. The dynamical features of the traffic flow are abstracted by a contrastive study between nonequilibrium statistical physics and a computer simulation based on a cellular automaton. We found that the information disrupts the local equilibrium of traffic flow by a characteristic dissipation process due to interaction between the information and individual vehicles. The dissipative structure was observed in the time evolution of traffic flow driven far from equilibrium as a consequence of the breakdown of the local-equilibrium hypothesis.
Flows of engineered nanomaterials through the recycling process in Switzerland.
Caballero-Guzman, Alejandro; Sun, Tianyin; Nowack, Bernd
2015-02-01
The use of engineered nanomaterials (ENMs) in diverse applications has increased in recent years and this will likely continue in the near future. As the number of applications increases, more and more waste containing nanomaterials will be generated. A portion of this waste will enter the recycling system, for example, in electronic products, textiles and construction materials. The fate of these materials during and after the waste management and recycling operations is poorly understood. The aim of this work is to model the flows of nano-TiO2, nano-ZnO, nano-Ag and CNT in the recycling system in Switzerland. The basis for this study is published information on ENM flows in the Swiss system. We developed a method to assess their flow after recycling. To incorporate the uncertainties inherent to the limited information available, we applied a probabilistic material flow analysis approach. The results show that the recycling processes do not result in significant further propagation of nanomaterials into new products. Instead, the largest proportion will flow as waste that can subsequently be properly handled in incineration plants or landfills. Smaller fractions of ENMs will be eliminated or end up in materials that are sent abroad to undergo further recovery processes. Only a small amount of ENMs will flow back to the productive process of the economy in a limited number of sectors. Overall, the results suggest that risk assessment during recycling should focus on occupational exposure, release of ENMs in landfills and incineration plants, and toxicity assessment in a small number of recycled inputs. Copyright © 2014 Elsevier Ltd. All rights reserved.
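A minimal sketch of the probabilistic material flow analysis idea: the uncertain input flow and transfer coefficients are sampled many times, and the resulting distributions of output flows are summarized by percentiles. The input mass, the three fates, and the distributions below are hypothetical placeholders, not the Swiss data or the model structure of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                     # Monte Carlo samples

# Hypothetical annual mass of one ENM entering the recycling stream (t/yr),
# with its uncertainty expressed as a triangular distribution.
inflow = rng.triangular(8.0, 10.0, 12.0, n)

# Uncertain transfer coefficients to three fates; Dirichlet sampling keeps
# each sampled set of coefficients summing to one.
tc = rng.dirichlet([8.0, 1.5, 0.5], n)          # -> waste, eliminated, recycled product
waste, eliminated, recycled = (inflow[:, None] * tc).T

for name, flow in [("waste", waste), ("eliminated", eliminated), ("recycled", recycled)]:
    lo, med, hi = np.percentile(flow, [5, 50, 95])
    print(f"{name:10s} median {med:5.2f} t/yr (90% interval {lo:.2f}-{hi:.2f})")
```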
Stramaglia, Sebastiano; Angelini, Leonardo; Wu, Guorong; Cortes, Jesus M; Faes, Luca; Marinazzo, Daniele
2016-12-01
We develop a framework for the analysis of synergy and redundancy in the pattern of information flow between subsystems of a complex network. The presence of redundancy and/or synergy in multivariate time series data makes it difficult to estimate the net flow of information from each driver variable to a given target. We show that, adopting an unnormalized definition of Granger causality, one may reveal redundant multiplets of variables influencing the target by maximizing the total Granger causality to a given target over all the possible partitions of the set of driving variables. Consequently, we introduce a pairwise index of synergy which is zero when two independent sources additively influence the future state of the system, differently from previous definitions of synergy. We report the application of the proposed approach to resting state functional magnetic resonance imaging data from the Human Connectome Project, showing that redundant pairs of regions arise mainly due to spatial contiguity and interhemispheric symmetry, while synergy occurs mainly between nonhomologous pairs of regions in opposite hemispheres. Redundancy and synergy, in healthy resting brains, display characteristic patterns, revealed by the proposed approach. The pairwise synergy index, here introduced, maps the informational character of the system at hand into a weighted complex network: the same approach can be applied to other complex systems whose normal state corresponds to a balance between redundant and synergetic circuits.
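For context, a bare-bones sketch of an unnormalized Granger causality estimate, expressed as the reduction in residual variance (rather than the usual log-ratio) when the driver's past is added to the target's own past. The lag order, the least-squares fitting, and the toy signals are assumptions for illustration; the partition-maximization and the pairwise synergy index introduced in the paper are not reproduced.

```python
import numpy as np

def _lagmat(series, order):
    """Column-stack lags 1..order of each series in `series` (all same length)."""
    cols = []
    for s in series:
        s = np.asarray(s, float)
        cols += [s[order - k:len(s) - k] for k in range(1, order + 1)]
    return np.column_stack(cols)

def unnormalized_gc(x, y, order=2):
    """Unnormalized Granger causality x -> y: drop in residual variance of y
    when the past of x is added to a regression on y's own past."""
    yt = np.asarray(y, float)[order:]
    Xr = _lagmat([y], order)                 # restricted model: y's past only
    Xf = _lagmat([y, x], order)              # full model: y's and x's past
    res_r = yt - Xr @ np.linalg.lstsq(Xr, yt, rcond=None)[0]
    res_f = yt - Xf @ np.linalg.lstsq(Xf, yt, rcond=None)[0]
    return res_r.var() - res_f.var()

# Toy coupled system: y is driven by the past of x, but not vice versa.
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = 0.6 * np.concatenate(([0.0], x[:-1])) + 0.4 * rng.standard_normal(2000)
print(unnormalized_gc(x, y), unnormalized_gc(y, x))
```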
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
NASA Astrophysics Data System (ADS)
Allen, G. H.; David, C. H.; Andreadis, K. M.; Emery, C. M.; Famiglietti, J. S.
2017-12-01
Earth observing satellites provide valuable near real-time (NRT) information about flood occurrence and magnitude worldwide. This NRT information can be used in early flood warning systems and other flood management applications to save lives and mitigate flood damage. However, these NRT products are only useful to early flood warning systems if they are quickly made available, with sufficient time for flood mitigation actions to be implemented. More specifically, NRT data latency, or the time period between the satellite observation and when the user has access to the information, must be less than the time it takes a flood to travel from the flood observation location to a given downstream point of interest. Yet the paradigm that "lower latency is always better" may not necessarily hold true in river systems due to tradeoffs between data latency and data quality. Further, the existence of statistical breaks in the global distribution of flood wave travel time (i.e. a jagged statistical distribution) would represent preferable latencies for river-observation NRT remote sensing products. Here we present a global analysis of flood wave velocity (i.e. flow celerity) and travel time. We apply a simple kinematic wave model to a global hydrography dataset and calculate flow wave celerity and travel time during bankfull flow conditions. Bankfull flow corresponds to the condition of maximum celerity and thus we present the "worst-case scenario" minimum flow wave travel time. We conduct a similar analysis with respect to the time it takes flood waves to reach the next downstream city, as well as the next downstream reservoir. Finally, we conduct these same analyses, but with regards to the technical capabilities of the planned Surface Water and Ocean Topography (SWOT) satellite mission, which is anticipated to provide waterbody elevation and extent measurements at an unprecedented spatial and temporal resolution. We validate these results with discharge records from paired USGS gauge stations located along a diverse collection of river reaches. These results provide a scientific rationale for optimizing the utility of existing and future NRT river-observation products.
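A minimal sketch of the kind of kinematic-wave calculation described: bankfull velocity from Manning's equation for an idealized wide channel, the celerity c = (5/3)·v implied by Manning friction, and the resulting travel time over a reach. The channel geometry, roughness, and reach length below are assumptions for illustration, not values from the global hydrography dataset used in the study.

```python
import numpy as np

def bankfull_celerity(slope, depth, n_manning=0.035):
    """Kinematic flood-wave celerity (m/s) for a wide rectangular channel.

    Velocity from Manning's equation with hydraulic radius ~ flow depth,
    and celerity c = (5/3) * v, the kinematic-wave result for Manning friction.
    """
    v = (1.0 / n_manning) * depth ** (2.0 / 3.0) * np.sqrt(slope)
    return (5.0 / 3.0) * v

# Example: a 200 km reach with slope 0.0005 and 3 m bankfull depth.
c = bankfull_celerity(slope=5e-4, depth=3.0)
print(f"celerity ~ {c:.2f} m/s, minimum travel time ~ {200e3 / c / 3600:.0f} h")
```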
NASA Astrophysics Data System (ADS)
Jones, L. M.; Bawden, G. W.; Bowers, J.; Cannon, S.; Cox, D. A.; Fisher, R.; Keeley, J.; Perry, S. C.; Plumlee, G. S.; Wood, N. J.
2009-12-01
The “Station” fire, the largest fire in the history of Los Angeles County in southern California, began on August 26, 2009 and as of the abstract deadline had burned over 150,000 acres of the Angeles National Forest. This fire creates both a demand and an opportunity for hazards science to be used by the communities directly hit by the fire, as well as those downstream of possible postfire impacts. The Multi Hazards Demonstration Project of the USGS is deploying several types of scientific response, including 1) evaluation of potential debris-flow hazards and associated risk, 2) monitoring physical conditions in burned areas and the hydrologic response to rainstorms, 3) increased streamflow monitoring, 4) ash analysis and ground water contamination, 5) ecosystem response and endangered species rescue, 6) lidar data acquisition for evaluations of biomass loss, detailed mapping of the physical processes that lead to debris-flow generation, and other geologic investigations. The Multi Hazards Demonstration Project is working with the southern California community to use the resulting information to better manage the social consequences of the fire and its secondary hazards. In particular, we are working with Los Angeles County to determine what information they need to prioritize recovery efforts. For instance, maps of hazards specific to debris flow potential can help identify the highest priority areas for debris flow mitigation efforts. These same maps together with ecosystem studies will help land managers determine whether individuals from endangered species should be removed to zoos or other refuges during the rainy months. The ash analysis will help water managers prevent contamination to water supplies. Plans are just beginning for a public information campaign with Los Angeles County about the risk posed by potential debris flows that should be underway in December. Activities from the fire response will support the development of the Wildfire Scenario in 2011, which will examine implications of land-use decisions in the frequency of fires in southern California.
Integrated Analysis of Flow, Form, and Function for River Management and Design Testing
NASA Astrophysics Data System (ADS)
Lane, B. A. A.; Pasternack, G. B.; Sandoval Solis, S.
2017-12-01
Rivers are highly complex, dynamic systems that support numerous ecosystem functions including transporting sediment, modulating biogeochemical processes, and regulating habitat availability for native species. The extent and timing of these functions is largely controlled by the interplay of hydrologic dynamics (i.e. flow) and the shape and composition of the river corridor (i.e. form). This study applies synthetic channel design to the evaluation of river flow-form-function linkages, with the aim of evaluating these interactions across a range of flows and forms to inform process-driven management efforts with limited data and financial requirements. In an application to California's Mediterranean-montane streams, the interacting roles of channel form, water year type, and hydrologic impairment were evaluated across a suite of ecosystem functions related to hydrogeomorphic processes, aquatic habitat, and riparian habitat. Channel form acted as the dominant control on hydrogeomorphic processes considered, while water year type controlled salmonid habitat functions. Streamflow alteration for hydropower increased redd dewatering risk and altered aquatic habitat availability and riparian recruitment dynamics. Study results highlight critical tradeoffs in ecosystem function performance and emphasize the significance of spatiotemporal diversity of flow and form at multiple scales for maintaining river ecosystem integrity. The approach is broadly applicable and extensible to other systems and ecosystem functions, where findings can be used to characterize complex controls on river ecosystems, assess impacts of proposed flow and form alterations, and inform river restoration strategies.
Streamflow in the upper Santa Cruz River basin, Santa Cruz and Pima Counties, Arizona
Condes de la Torre, Alberto
1970-01-01
Streamflow records obtained in the upper Santa Cruz River basin of southern Arizona, United States, and northern Sonora, Mexico, have been analyzed to aid in the appraisal of the surface-water resources of the area. Records are available for 15 sites, and the length of record ranges from 60 years for the gaging station on the Santa Cruz River at Tucson to 6 years for Pantano Wash near Vail. The analysis provides information on flow duration, low-flow frequency magnitude, flood-volume frequency and magnitude, and storage requirements to maintain selected draft rates. Flood-peak information collected from the gaging stations has been projected on a regional basis from which estimates of flood magnitude and frequency may be made for any site in the basin. Most streams in the 3,503-square-mile basin are ephemeral. Ground water sustains low flows only at Santa Cruz River near Nogales, Sonoita Creek near Patagonia, and Pantano Wash near Vail. Elsewhere, flow occurs only in direct response to precipitation. The median number of days per year in which there is no flow ranges from 4 at Sonoita Creek near Patagonia to 335 at Rillito Creek near Tucson. The streamflow is extremely variable from year to year, and annual flows have a coefficient of variation close to or exceeding unity at most stations. Although the amount of flow in the basin is small most of the time, the area is subject to floods. Most floods result from high-intensity precipitation caused by thunderstorms during the period July to September. Occasionally, when snowfall at the lower altitudes is followed by rain, winter floods produce large volumes of flow.
Reference manual for generation and analysis of Habitat Time Series: version II
Milhous, Robert T.; Bartholow, John M.; Updike, Marlys A.; Moos, Alan R.
1990-01-01
The selection of an instream flow requirement for water resource management often requires the review of how the physical habitat changes through time. This review is referred to as 'Time Series Analysis.' The Time Series Library (TSLIB) is a group of programs to enter, transform, analyze, and display time series data for use in stream habitat assessment. A time series may be defined as a sequence of data recorded or calculated over time. Examples might be historical monthly flow, predicted monthly weighted usable area, daily electrical power generation, annual irrigation diversion, and so forth. The time series can be analyzed, both descriptively and analytically, to understand the importance of the variation in the events over time. This is especially useful in the development of instream flow needs based on habitat availability. The TSLIB group of programs assumes that you have an adequate study plan to guide you in your analysis. You need to already have knowledge about such things as time period and time step, species and life stages to consider, and appropriate comparisons or statistics to be produced and displayed or tabulated. Knowing your destination, you must first evaluate whether TSLIB can get you there. Remember, data are not answers. This publication is a reference manual to TSLIB and is intended to be a guide to the process of using the various programs in TSLIB. This manual is essentially limited to the hands-on use of the various programs. A TSLIB user interface program (called RTSM) has been developed to provide an integrated working environment in which the user has a brief on-line description of each TSLIB program along with the capability to run the TSLIB programs while in the user interface. For information on the RTSM program, refer to Appendix F. Before applying the computer models described herein, it is recommended that the user enroll in the short course "Problem Solving with the Instream Flow Incremental Methodology (IFIM)." This course is offered by the Aquatic Systems Branch of the National Ecology Research Center. For more information about the TSLIB software, refer to the Memorandum of Understanding. Chapter 1 provides a brief introduction to the Instream Flow Incremental Methodology and TSLIB. Other chapters in this manual provide information on the different aspects of using the models. The information contained in the other chapters includes (2) acquisition, entry, manipulation, and listing of streamflow data; (3) entry, manipulation, and listing of the habitat-versus-streamflow function; (4) transferring streamflow data; (5) water resources systems analysis; (6) generation and analysis of daily streamflow and habitat values; (7) generation of the time series of monthly habitats; (8) manipulation, analysis, and display of monthly time series data; and (9) generation, analysis, and display of annual time series data. Each section includes documentation for the programs therein with at least one page of information for each program, including a program description, instructions for running the program, and sample output. The Appendixes contain the following: (A) sample file formats; (B) descriptions of default filenames; (C) alphabetical summary of batch-procedure files; (D) installing and running TSLIB on a microcomputer; (E) running TSLIB on a CDC Cyber computer; (F) using the TSLIB user interface program (RTSM); and (G) running WATSTORE on the USGS Amdahl mainframe computer.
The number for this version of TSLIB--Version II-- is somewhat arbitrary, as the TSLIB programs were collected into a library some time ago; but operators tended to use and manage them as individual programs. Therefore, we will consider the group of programs from the past that were only on the CDC Cyber computer as Version 0; the programs from the past that were on both the Cyber and the IBM-compatible microcomputer as Version I; and the programs contained in this reference manual as Version II.
NASA Astrophysics Data System (ADS)
Iftekhar, Ahmed Tashfin; Ho, Jenny Che-Ting; Mellinger, Axel; Kaya, Tolga
2017-03-01
Sweat-based physiological monitoring has been intensively explored in the last decade with the hope of developing real-time hydration monitoring devices. Although the content of sweat (electrolytes, lactate, urea, etc.) provides significant information about physiology, it is also very important to know the sweat rate at the time of sweat content measurements because the sweat rate is known to alter the concentrations of sweat compounds. We developed a calorimetry-based flow rate sensor using polydimethylsiloxane (PDMS) that is suitable for sweat rate applications. Our simple approach of using temperature-based flow rate detection can easily be adapted to multiple sweat collection and analysis devices. Moreover, we have developed a 3D finite element analysis model of the device using COMSOL Multiphysics™ and verified the flow rate measurements. The experiment investigated flow rate values from 0.3 μl/min up to 2.1 ml/min, which covers the human sweat rate range (0.5 μl/min-10 μl/min). The 3D model simulations and analytical model calculations covered an even wider range in order to understand the main physical mechanisms of the device. With a verified 3D model, different environmental heat conditions could be further studied to shed light on the physiology of the sweat rate.
Accuracy and Tuning of Flow Parsing for Visual Perception of Object Motion During Self-Motion
Niehorster, Diederick C.
2017-01-01
How do we perceive object motion during self-motion using visual information alone? Previous studies have reported that the visual system can use optic flow to identify and globally subtract the retinal motion component resulting from self-motion to recover scene-relative object motion, a process called flow parsing. In this article, we developed a retinal motion nulling method to directly measure and quantify the magnitude of flow parsing (i.e., flow parsing gain) in various scenarios to examine the accuracy and tuning of flow parsing for the visual perception of object motion during self-motion. We found that flow parsing gains were below unity for all displays in all experiments; and that increasing self-motion and object motion speed did not alter flow parsing gain. We conclude that visual information alone is not sufficient for the accurate perception of scene-relative motion during self-motion. Although flow parsing performs global subtraction, its accuracy also depends on local motion information in the retinal vicinity of the moving object. Furthermore, the flow parsing gain was constant across common self-motion or object motion speeds. These results can be used to inform and validate computational models of flow parsing. PMID:28567272
Real-Time Aerodynamic Flow and Data Visualization in an Interactive Virtual Environment
NASA Technical Reports Server (NTRS)
Schwartz, Richard J.; Fleming, Gary A.
2005-01-01
Significant advances have been made in non-intrusive flow field diagnostics in the past decade. Camera-based techniques are now capable of determining physical quantities such as surface deformation, surface pressure and temperature, flow velocities, and molecular species concentration. In each case, extracting the pertinent information from the large volume of acquired data requires powerful and efficient data visualization tools. The additional requirement for real-time visualization is fueled by an increased emphasis on minimizing test time in expensive facilities. This paper will address a capability titled LiveView3D, which is the first step in the development of an in-depth, real-time data visualization and analysis tool for use in aerospace testing facilities.
Assured Information Flow Capping Architecture.
1985-05-01
Inducer analysis/pump model development
NASA Astrophysics Data System (ADS)
Cheng, Gary C.
1994-03-01
Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.
Koltun, G.F.
2001-01-01
This report provides data and methods to aid in the hydrologic design or evaluation of impounding reservoirs and side-channel reservoirs used for water supply in Ohio. Data from 117 streamflow-gaging stations throughout Ohio were analyzed by means of nonsequential-mass-curve-analysis techniques to develop relations between storage requirements, water demand, duration, and frequency. Information also is provided on minimum runoff for selected durations and frequencies. Systematic record lengths for the streamflow-gaging stations ranged from about 10 to 75 years; however, in many cases, additional streamflow record was synthesized. For impounding reservoirs, families of curves are provided to facilitate the estimation of storage requirements as a function of demand and the ratio of the 7-day, 2-year low flow to the mean annual flow. Information is provided with which to evaluate separately the effects of evaporation on storage requirements. Comparisons of storage requirements for impounding reservoirs determined by nonsequential-mass-curve-analysis techniques with storage requirements determined by annual-mass-curve techniques that employ probability routing to account for carryover-storage requirements indicate that large differences in computed required storages can result from the two methods, particularly for conditions where demand cannot be met from within-year storage. For side-channel reservoirs, tables of demand-storage-frequency information are provided for a primary pump relation consisting of one variable-speed pump with a pumping capacity that ranges from 0.1 to 20 times demand. Tables of adjustment ratios are provided to facilitate determination of storage requirements for 19 other pump sets consisting of assorted combinations of fixed-speed pumps or variable-speed pumps with aggregate pumping capacities smaller than or equal to the primary pump relation. The effects of evaporation on side-channel reservoir storage requirements are incorporated into the storage-requirement estimates. The effects of an instream-flow requirement equal to the 80-percent-duration flow are also incorporated into the storage-requirement estimates.
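For orientation, the sketch below computes a storage requirement for a constant demand with the standard sequent-peak (cumulative deficit) algorithm, a close relative of the mass-curve techniques discussed in the report; the nonsequential-mass-curve and probability-routing procedures themselves, and the evaporation adjustments, are not reproduced. The monthly flow sequence and demand are hypothetical.

```python
import numpy as np

def sequent_peak_storage(inflows, demand):
    """Required reservoir storage for a constant demand via the sequent-peak
    (cumulative deficit) algorithm applied to an inflow sequence."""
    deficit = 0.0
    max_deficit = 0.0
    for q in inflows:
        deficit = max(0.0, deficit + demand - q)   # carry over any unmet demand
        max_deficit = max(max_deficit, deficit)
    return max_deficit

# Hypothetical monthly inflow volumes and a demand of 40 per month (same units).
flows = np.array([120, 80, 60, 30, 20, 15, 10, 25, 55, 90, 130, 150])
print(sequent_peak_storage(flows, demand=40.0))   # -> 100.0
```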
Estimates of Median Flows for Streams on the 1999 Kansas Surface Water Register
Perry, Charles A.; Wolock, David M.; Artman, Joshua C.
2004-01-01
The Kansas State Legislature, by enacting Kansas Statute KSA 82a-2001 et seq., mandated the criteria for determining which Kansas stream segments would be subject to classification by the State. One criterion for selection as a classified stream segment is based on the statistic of median flow being equal to or greater than 1 cubic foot per second. As specified by KSA 82a-2001 et seq., median flows were determined from U.S. Geological Survey streamflow-gaging-station data by using the most recent 10 years of gaged data (KSA) for each streamflow-gaging station. Median flows also were determined by using gaged data from the entire period of record (all-available hydrology, AAH). Least-squares multiple regression techniques were used, along with Tobit analyses, to develop equations for estimating median flows for uncontrolled stream segments. The drainage area of the gaging stations on uncontrolled stream segments used in the regression analyses ranged from 2.06 to 12,004 square miles. A logarithmic transformation of the data was needed to develop the best linear relation for computing median flows. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. Tobit analyses of KSA data yielded a model standard error of prediction of 0.285 logarithmic units, and the best equations using Tobit analyses of AAH data had a model standard error of prediction of 0.250 logarithmic units. These regression equations and an interpolation procedure were used to compute median flows for the uncontrolled stream segments on the 1999 Kansas Surface Water Register. Measured median flows from gaging stations were incorporated into the regression-estimated median flows along the stream segments where available. The segments that were uncontrolled were interpolated using gaged data weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled segments of Kansas streams, the median flow information was interpolated between gaging stations using only gaged data weighted by drainage area. Of the 2,232 total stream segments on the Kansas Surface Water Register, 34.5 percent of the segments had an estimated median streamflow of less than 1 cubic foot per second when the KSA analysis was used. When the AAH analysis was used, 36.2 percent of the segments had an estimated median streamflow of less than 1 cubic foot per second. This report supersedes U.S. Geological Survey Water-Resources Investigations Report 02-4292.
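A simplified sketch of the regression step: median flows and basin characteristics are log-transformed and fit by ordinary least squares. The station values below are hypothetical, only two explanatory variables are used, and the Tobit treatment of censored (zero-flow) records is not reproduced.

```python
import numpy as np

# Hypothetical station data: median flow (cfs), drainage area (mi^2),
# mean annual precipitation (in). The actual study used more basin
# characteristics and Tobit regression to handle zero-flow records.
median_q = np.array([0.5, 2.0, 8.0, 30.0, 110.0])
area = np.array([10.0, 50.0, 200.0, 900.0, 4000.0])
precip = np.array([22.0, 25.0, 28.0, 30.0, 33.0])

# Log-transform and fit by ordinary least squares.
X = np.column_stack([np.ones_like(area), np.log10(area), np.log10(precip)])
coef, *_ = np.linalg.lstsq(X, np.log10(median_q), rcond=None)

# Predict the median flow for an ungaged 500 mi^2 basin with 29 in of precipitation.
pred = 10 ** (coef @ [1.0, np.log10(500.0), np.log10(29.0)])
print(coef, pred)
```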
Transborder Flow of Computerized Information: Controls and Restrictions.
ERIC Educational Resources Information Center
Shrader, Erwin
Of major concern to the United States' position and policy in the telecommunications and information areas is "transborder data flow," the transferring of computer-stored data between nations. Many European nations, including France, Austria, and West Germany, have enacted laws regulating the flow of information leaving the country where it…
49 CFR 831.13 - Flow and dissemination of accident or incident information.
Code of Federal Regulations, 2010 CFR
2010-10-01
49 CFR Part 831 (National Transportation Safety Board, Accident/Incident Investigation Procedures), § 831.13 Flow and dissemination of accident or incident information. (a) Release of information during the field investigation...
NASA Astrophysics Data System (ADS)
de la Mata, Tamara; Llano, Carlos
2013-07-01
Recent literature on the border effect has fostered research on informal barriers to trade and the role played by network dependencies. In relation to social networks, it has been shown that the intensity of trade in goods is positively correlated with migration flows between pairs of countries/regions. In this article, we investigate whether such a relation also holds for interregional trade in services. We also consider whether interregional trade flows in services linked with tourism exhibit spatial and/or social network dependence. Conventional empirical gravity models assume the magnitude of bilateral flows between regions is independent of flows to/from regions located nearby in space, or flows to/from regions related through social/cultural/ethnic network connections. With this aim, we provide estimates from a set of gravity models showing evidence of statistically significant spatial and network (demographic) dependence in the bilateral flows of the trade in services considered. The analysis has been applied to the Spanish intra- and interregional monetary flows of services from accommodation, restaurants and travel agencies for the period 2000-2009, using alternative datasets for the migration stocks and definitions of network effects.
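A toy sketch of the log-linear gravity specification underlying such models, with a migration-stock covariate standing in for the social-network term; the flows, GDPs, distances, and migrant stocks below are made-up numbers, and the spatial-dependence terms estimated in the article are not included.

```python
import numpy as np

# Hypothetical origin-destination records: trade flow in services, origin and
# destination GDP, distance, and the stock of migrants from the origin living
# in the destination (the "social network" covariate).
flow  = np.array([120., 30., 75., 10., 200., 55.])
gdp_o = np.array([500., 500., 300., 300., 800., 800.])
gdp_d = np.array([300., 100., 500., 100., 500., 300.])
dist  = np.array([150., 600., 150., 900., 300., 450.])
migr  = np.array([40., 5., 25., 2., 80., 15.])

# Log-linear gravity model:
# ln F = b0 + b1 ln GDP_o + b2 ln GDP_d + b3 ln dist + b4 ln(1 + migrants)
X = np.column_stack([np.ones_like(flow), np.log(gdp_o), np.log(gdp_d),
                     np.log(dist), np.log1p(migr)])
beta, *_ = np.linalg.lstsq(X, np.log(flow), rcond=None)
print(beta)   # a positive b4 would indicate a migration (network) effect on trade
```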
Performance analysis of axial flow pump on gap changing between impeller and guide vane
NASA Astrophysics Data System (ADS)
Wang, W. J.; Liang, Q. H.; Wang, Y.; Yang, Y.; Yin, G.; Shi, X. X.
2013-12-01
In order to study the influence of the gap between the static and dynamic components in an axial flow pump, the axial flow pump model (TJ04-ZL-06) used in the eastern route of the South-to-North Water Diversion Project was selected. The steady turbulent flow field with different gaps was simulated using the standard κ-ε turbulence model and dual-time-stepping methods. Information on the pressure distribution and velocity distribution on the impeller surfaces was obtained. The calculated results were then compared with the test results and analyzed. The results show that the performance of the pump is not sensitive to the axial gap width under the design condition and the large flow rate condition. With increasing gap width, performance improves at the low flow rate condition. The attack angle at the impeller inlet becomes small at the small flow rate condition, and the flow separation phenomenon can be observed in this condition. The axial velocity distribution at the impeller outlet is nonlinear, and increasing the axial gap effectively improves the flow pattern near the hub. The trend of the calculated results is consistent with the tests. This will play a guiding role in the operation and design of axial flow pumps in the South-to-North Water Diversion Project.
Using borehole flow data to characterize the hydraulics of flow paths in operating wellfields
Paillet, F.; Lundy, J.
2004-01-01
Understanding the flow paths in the vicinity of water well intakes is critical in the design of effective wellhead protection strategies for heterogeneous carbonate aquifers. High-resolution flow logs can be combined with geophysical logs and borehole-wall-image logs (acoustic televiewer) to identify the porous beds, solution openings, and fractures serving as conduits connecting the well bore to the aquifer. Qualitative methods of flow log analysis estimate the relative transmissivity of each water-producing zone, but do not indicate how those zones are connected to the far-field aquifer. Borehole flow modeling techniques can be used to provide quantitative estimates of both transmissivity and far-field hydraulic head in each producing zone. These data can be used to infer how the individual zones are connected with each other, and to the surrounding large-scale aquifer. Such information is useful in land-use planning and the design of well intakes to prevent entrainment of contaminants into water-supply systems. Specific examples of flow log applications in the identification of flow paths in operating wellfields are given for sites in Austin and Faribault, Minnesota. Copyright ASCE 2004.
Value flow mapping: Using networks to inform stakeholder analysis
NASA Astrophysics Data System (ADS)
Cameron, Bruce G.; Crawley, Edward F.; Loureiro, Geilson; Rebentisch, Eric S.
2008-02-01
Stakeholder theory has garnered significant interest from the corporate community, but has proved difficult to apply to large government programs. A detailed value flow exercise was conducted to identify the value delivery mechanisms among stakeholders for the current Vision for Space Exploration. We propose a method for capturing stakeholder needs that explicitly recognizes the outcomes required of the value creating organization. The captured stakeholder needs are then translated into input-output models for each stakeholder, which are then aggregated into a network model. Analysis of this network suggests that benefits are infrequently linked to the root provider of value. Furthermore, it is noted that requirements should not only be written to influence the organization's outputs, but also to influence the propagation of benefit further along the value chain. A number of future applications of this model to systems architecture and requirement analysis are discussed.
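The sketch below illustrates the kind of network representation described above: stakeholder input-output relations become edges of a directed graph, and cycles through the root provider of value indicate benefits that do propagate back. The stakeholders and value flows listed are hypothetical, not those of the cited study.

```python
# Illustrative stakeholder value-flow network (hypothetical nodes and edges,
# not the authors' dataset): test whether delivered benefits trace back to
# the root provider of value via cycles in the directed graph.
import networkx as nx

G = nx.DiGraph()
edges = [
    ("Agency", "Contractor", "funding"),
    ("Contractor", "Agency", "flight systems"),
    ("Agency", "Science community", "data"),
    ("Science community", "Public", "knowledge"),
    ("Public", "Congress", "support"),
    ("Congress", "Agency", "appropriations"),
]
for src, dst, value in edges:
    G.add_edge(src, dst, value=value)

# Benefit loops that return to the root provider ("Agency") indicate well-linked
# value delivery; stakeholders outside any such cycle receive benefit that never
# propagates back to the root provider.
for cycle in nx.simple_cycles(G):
    if "Agency" in cycle:
        print(" -> ".join(cycle + [cycle[0]]))
```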
Informational analysis for compressive sampling in radar imaging.
Zhang, Jingxiong; Yang, Ke
2015-03-24
Compressive sampling or compressed sensing (CS) works on the assumption of the sparsity or compressibility of the underlying signal, relies on the trans-informational capability of the measurement matrix employed and the resultant measurements, operates with optimization-based algorithms for signal reconstruction and is thus able to complete data compression, while acquiring data, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition, while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar and information theory by analyzing information flows in CS-radar from sparse scenes to measurements and determining sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretic orientated CS-radar system analysis and performance evaluation.
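A minimal numerical sketch of the compressive-sampling setting analyzed above: a sparse scene is observed through a random sub-Nyquist measurement matrix and recovered with orthogonal matching pursuit. The dimensions, sparsity level, and noise level are assumptions for illustration, not the paper's radar model.

```python
# Compressive-sampling sketch (illustrative, not the paper's CS-radar model):
# recover a sparse scene from sub-Nyquist random measurements via orthogonal
# matching pursuit (OMP).
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 256, 64, 5                        # scene length, measurements, sparsity
scene = np.zeros(n)
scene[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random measurement matrix
y = A @ scene + 0.01 * rng.normal(0, 1, m)  # noisy sub-Nyquist measurements

# Orthogonal matching pursuit: greedily pick the column most correlated with
# the residual, then re-fit the coefficients on the selected support.
support, residual = [], y.copy()
for _ in range(k):
    idx = int(np.argmax(np.abs(A.T @ residual)))
    support.append(idx)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

estimate = np.zeros(n)
estimate[support] = coef
print("reconstruction error:", np.linalg.norm(estimate - scene))
```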
Thul, Alexander; Lechinger, Julia; Donis, Johann; Michitsch, Gabriele; Pichler, Gerald; Kochs, Eberhard F; Jordan, Denis; Ilg, Rüdiger; Schabus, Manuel
2016-02-01
Clinical assessments that rely on behavioral responses to differentiate Disorders of Consciousness are at times inapt because of some patients' motor disabilities. To objectify patients' conditions of reduced consciousness, the present study evaluated the use of electroencephalography to measure residual brain activity. We analyzed entropy values of 18 scalp EEG channels of 15 severely brain-damaged patients with clinically diagnosed Minimally-Conscious-State (MCS) or Unresponsive-Wakefulness-Syndrome (UWS) and compared the results to a sample of 24 control subjects. Permutation entropy (PeEn) and symbolic transfer entropy (STEn), reflecting information processes in the EEG, were calculated for all subjects. Participants were tested on a modified active own-name paradigm to identify correlates of active instruction following. PeEn showed reduced local information content in the EEG in patients, which was most pronounced in UWS. STEn analysis revealed altered directed information flow in the EEG of patients, indicating impaired feed-backward connectivity. Responses to auditory stimulation yielded differences in entropy measures, indicating reduced information processing in MCS and UWS. Local EEG information content and information flow are affected in Disorders of Consciousness. This suggests local cortical information capacity and feedback information transfer as neural correlates of consciousness. The utilized EEG entropy analyses were able to distinguish patient groups with different Disorders of Consciousness. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
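For readers unfamiliar with the measure, the following is a minimal sketch of permutation entropy in the standard Bandt-Pompe formulation; the embedding dimension and delay are illustrative choices and not necessarily those used in the study.

```python
# Permutation entropy (PeEn) sketch for a single channel, Bandt-Pompe style.
# Embedding dimension m and delay tau are illustrative assumptions.
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of signal x (order m, delay tau)."""
    patterns = {p: 0 for p in permutations(range(m))}
    n = len(x) - (m - 1) * tau
    for i in range(n):
        window = x[i:i + m * tau:tau]
        patterns[tuple(np.argsort(window))] += 1   # ordinal pattern of the window
    probs = np.array([c for c in patterns.values() if c > 0]) / n
    return -np.sum(probs * np.log(probs)) / log(factorial(m))

signal = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.5 * np.random.randn(1000)
print(permutation_entropy(signal, m=3, tau=1))  # lower values indicate a more regular signal
```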
Including geological information in the inverse problem of palaeothermal reconstruction
NASA Astrophysics Data System (ADS)
Trautner, S.; Nielsen, S. B.
2003-04-01
A reliable reconstruction of sediment thermal history is of central importance to the assessment of hydrocarbon potential and the understanding of basin evolution. However, only rarely do sedimentation history and borehole data in the form of present-day temperatures and vitrinite reflectance constrain the past thermal evolution to a useful level of accuracy (Gallagher and Sambridge, 1992; Nielsen, 1998; Trautner and Nielsen, 2003). This is reflected in the inverse solutions to the problem of determining heat flow history from borehole data: the recent heat flow is constrained by data, while older values are governed by the chosen a priori heat flow. In this paper we reduce this problem by including geological information in the inverse problem. Through a careful analysis of geological and geophysical data, the timing of the tectonic processes which may influence heat flow can be inferred. The heat flow history is then parameterised to allow for the temporal variations characteristic of the different tectonic events. The inversion scheme applies a Markov chain Monte Carlo (MCMC) approach (Nielsen and Gallagher, 1999; Ferrero and Gallagher, 2002), which efficiently explores the model space and furthermore samples the posterior probability distribution of the model. The technique is demonstrated on wells in the northern North Sea with emphasis on the stretching event in the Late Jurassic. The wells are characterised by maximum sediment temperature at the present day, which is the worst case for resolution of the past thermal history because vitrinite reflectance is determined mainly by the maximum temperature. Including geological information significantly improves the thermal resolution. Ferrero, C. and Gallagher, K., 2002. Stochastic thermal history modelling. 1. Constraining heat flow histories and their uncertainty. Marine and Petroleum Geology, 19, 633-648. Gallagher, K. and Sambridge, M., 1992. The resolution of past heat flow in sedimentary basins from non-linear inversion of geochemical data: the smoothest model approach, with synthetic examples. Geophysical Journal International, 109, 78-95. Nielsen, S.B., 1998. Inversion and sensitivity analysis in basin modelling. Geoscience 98. Keele University, UK, Abstract Volume, 56. Nielsen, S.B. and Gallagher, K., 1999. Efficient sampling of 3-D basin modelling scenarios. Extended Abstracts Volume, 1999 AAPG International Conference & Exhibition, Birmingham, England, September 12-15, 1999, p. 369-372. Trautner, S. and Nielsen, S.B., 2003. 2-D inverse thermal modelling in the Norwegian shelf using Fast Approximate Forward (FAF) solutions. In R. Marzi and Duppenbecker, S. (Ed.), Multi-Dimensional Basin Modeling, AAPG, in press.
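A highly simplified sketch of the Metropolis-type MCMC idea is given below: a single background heat-flow parameter is sampled from its posterior given synthetic present-day borehole temperatures under a steady conductive forward model. The forward model, data, and prior bounds are illustrative assumptions, not the authors' inversion code.

```python
# Simplified Metropolis sketch (not the authors' inversion): sample one
# background heat-flow parameter q against synthetic borehole temperatures.
import numpy as np

rng = np.random.default_rng(2)
depths = np.linspace(500, 3000, 10)              # m
k_cond = 2.5                                     # thermal conductivity, W/m/K (assumed)
q_true = 0.065                                   # W/m^2 (synthetic "truth")
data = 10.0 + q_true / k_cond * depths + rng.normal(0, 2.0, depths.size)

def forward(q):                                  # steady conductive geotherm
    return 10.0 + q / k_cond * depths

def log_post(q):                                 # flat prior on [0.03, 0.12] W/m^2
    if not 0.03 <= q <= 0.12:
        return -np.inf
    return -0.5 * np.sum((data - forward(q)) ** 2) / 2.0 ** 2

q, samples = 0.05, []
for _ in range(20000):
    prop = q + rng.normal(0, 0.002)              # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(q):
        q = prop
    samples.append(q)
print("posterior mean heat flow:", np.mean(samples[5000:]))
```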
2012-10-01
Implications of Multi-Core Architectures on the Development of... [report cover and table-of-contents fragment; dates covered March 2010 - April 2012; listed contents include "Framework for Multicore Information Flow Analysis", "A Hypothetical Reference Architecture", and "Figure 2: Pentium II Block Diagram"]
Programmers manual for static and dynamic reusable surface insulation stresses (RESIST)
NASA Technical Reports Server (NTRS)
Ogilvie, P. L.; Levy, A.; Austin, F.; Ojalvo, I. U.
1974-01-01
Programming information for the RESIST program for the dynamic and thermal stress analysis of the space shuttle surface insulation is presented. The overall flow chart of the program, overlay chart, data set allocation, and subprogram calling sequence are given along with a brief description of the individual subprograms and typical subprogram output.
ERIC Educational Resources Information Center
Palazotto, Anthony N.; And Others
This report is the result of a pilot program to seek out ways for developing an educational institution's transportation flow. Techniques and resulting statistics are discussed. Suggestions for additional uses of the information obtained are indicated. (Author)
Data management for Computer-Aided Engineering (CAE)
NASA Technical Reports Server (NTRS)
Bryant, W. A.; Smith, M. R.
1984-01-01
Analysis of data flow through the design and manufacturing processes has established specific information management requirements and identified unique problems. The application of data management technology to the engineering/manufacturing environment addresses these problems. An overview of the IPAD prototype data base management system, representing a partial solution to these problems, is presented here.
2006-06-01
KMO) for the CFMCC staff. That officer had a daily meeting with all of the CFMCC's collateral duty knowledge managers (KM) to discuss information...analyses of process steps) and mentored by the KMO, could enhance knowledge creation and utilization while not jeopardizing work flows. Clearly in
A Conceptual Framework for Analysis of Communication in Rural Social Systems.
ERIC Educational Resources Information Center
Axinn, George H.
This paper describes a five-component system with ten major internal linkages which may be used as a model for studying information flow in any rural agricultural social system. The major components are production, supply, marketing, research, and extension education. In addition, definitions are offered of the crucial variables affecting…
The Origin of Elevated Th in the Eratosthenian Lava Flows in the Procellarum KREEP Terrane
NASA Technical Reports Server (NTRS)
Gillis, J. J.; Jolliff, B. L.; Korotev, R. L.; Lawrence, D. J.
2002-01-01
Clementine spectral reflectance and compositional data, Lunar Prospector gamma-ray and neutron spectrometer data, and sample analyses of lunar soils are used to examine the origin of high Th in Eratosthenian basalts of the Procellarum KREEP Terrane. Additional information is contained in the original extended abstract.
Government Style as a Factor in Information Flow: Television Programming in Argentina, 1979-1988.
ERIC Educational Resources Information Center
John, Jeffrey Alan
Noting that Argentina's recent history is particularly useful for analysis of the varying effects that differing government styles can have on a single mass communication system, a study compared Argentine (specifically Buenos Aires) television's 1979 programming schedule, prepared during a military dictatorship, with recent schedules prepared…
DOT National Transportation Integrated Search
2017-11-15
Microsimulation modeling is a tool used by practitioners and researchers to predict and evaluate the flow of traffic on real transportation networks. These models are used in practice to inform decisions and thus must reflect a high level of accuracy...
Risk management and measuring productivity with POAS--point of act system.
Akiyama, Masanori; Kondo, Tatsuya
2007-01-01
The concept of our system is not only to manage material flows, but also to provide an integrated management resource, a means of correcting errors in medical treatment, and applications to EBM through the data mining of medical records. Prior to the development of this system, electronic processing systems in hospitals did a poor job of accurately grasping medical practice and medical material flows. With POAS (Point of Act System), hospital managers can solve the so-called "man, money, material, and information" issues inherent in the costs of healthcare. The POAS system synchronizes with each department system, from finance and accounting to pharmacy to imaging, and allows information exchange. It thus provides complete management of man, material, money, and information. Our analysis has shown that this system has a remarkable return on investment, saving over four million dollars per year through cost savings in logistics and business process efficiencies. In addition, the quality of care has been improved dramatically while error rates have been reduced, nearly to zero in some cases.
Variable cycle control model for intersection based on multi-source information
NASA Astrophysics Data System (ADS)
Sun, Zhi-Yuan; Li, Yue; Qu, Wen-Cong; Chen, Yan-Yan
2018-05-01
In order to improve the efficiency of traffic control systems in the era of big data, a new variable cycle control model based on multi-source information is presented for intersections in this paper. Firstly, with consideration of multi-source information, a unified framework based on a cyber-physical system is proposed. Secondly, taking into account the variable length of cells, the hysteresis phenomenon of traffic flow, and the characteristics of lane groups, a lane-group-based Cell Transmission Model is established to describe the physical properties of traffic flow under different traffic signal control schemes. Thirdly, the variable cycle control problem is abstracted into a bi-level programming model. The upper-level model is put forward for cycle length optimization considering traffic capacity and delay. The lower-level model is a dynamic signal control decision model based on fairness analysis. Then, a Hybrid Intelligent Optimization Algorithm is proposed to solve the model. Finally, a case study shows the efficiency and applicability of the proposed model and algorithm.
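The sketch below illustrates the bi-level structure in a much-reduced form: the upper level searches the cycle length while the lower level allocates green time to lane groups against a Webster-style approximate delay. The flows, saturation flows, and delay formula are stand-in assumptions, not the paper's lane-group-based Cell Transmission Model.

```python
# Illustrative bi-level sketch (not the paper's model): upper level searches
# cycle length, lower level splits green time, delay is a Webster-style
# approximation. All numbers are assumptions.
import numpy as np

flows = np.array([800.0, 600.0])       # veh/h arriving at two lane groups
sat = np.array([1800.0, 1800.0])       # saturation flows, veh/h
lost_time = 8.0                        # s of lost time per cycle

def delay(cycle, greens):
    """Approximate uniform delay (Webster first term), summed over lane groups."""
    lam = greens / cycle
    x = flows / (sat * lam)                       # degree of saturation
    return np.sum(flows * 0.5 * cycle * (1 - lam) ** 2 / np.maximum(1 - lam * x, 1e-3))

best = None
for cycle in range(40, 151, 5):                   # upper level: candidate cycle lengths
    avail = cycle - lost_time
    ratios = flows / sat
    greens = avail * ratios / ratios.sum()        # lower level: proportional green split
    d = delay(cycle, greens)
    if best is None or d < best[0]:
        best = (d, cycle, greens)
print("best cycle %ds, greens %s" % (best[1], np.round(best[2], 1)))
```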
Ramoni, Marco F.
2010-01-01
The field of synthetic biology holds an inspiring vision for the future; it integrates computational analysis, biological data and the systems engineering paradigm in the design of new biological machines and systems. These biological machines are built from basic biomolecular components analogous to electrical devices, and the information flow among these components requires the augmentation of biological insight with the power of a formal approach to information management. Here we review the informatics challenges in synthetic biology along three dimensions: in silico, in vitro and in vivo. First, we describe state of the art of the in silico support of synthetic biology, from the specific data exchange formats, to the most popular software platforms and algorithms. Next, we cast in vitro synthetic biology in terms of information flow, and discuss genetic fidelity in DNA manipulation, development strategies of biological parts and the regulation of biomolecular networks. Finally, we explore how the engineering chassis can manipulate biological circuitries in vivo to give rise to future artificial organisms. PMID:19906839
Williams, P Stephen
2016-05-01
Asymmetrical flow field-flow fractionation (As-FlFFF) has become the most commonly used of the field-flow fractionation techniques. However, because of the interdependence of the channel flow and the cross flow through the accumulation wall, it is the most difficult of the techniques to optimize, particularly for programmed cross flow operation. For the analysis of polydisperse samples, the optimization should ideally be guided by the predicted fractionating power. Many experimentalists, however, neglect fractionating power and rely on light scattering detection simply to confirm apparent selectivity across the breadth of the eluted peak. The size information returned by the light scattering software is assumed to dispense with any reliance on theory to predict retention, and any departure of theoretical predictions from experimental observations is therefore considered of no importance. Separation depends on efficiency as well as selectivity, however, and efficiency can be a strong function of retention. The fractionation of a polydisperse sample by field-flow fractionation never provides a perfectly separated series of monodisperse fractions at the channel outlet. The outlet stream has some residual polydispersity, and it will be shown in this manuscript that the residual polydispersity is inversely related to the fractionating power. Due to the strong dependence of light scattering intensity and its angular distribution on the size of the scattering species, the outlet polydispersity must be minimized if reliable size data are to be obtained from the light scattering detector signal. It is shown that light scattering detection should be used with careful control of fractionating power to obtain optimized analysis of polydisperse samples. Part I is concerned with isocratic operation of As-FlFFF, and part II with programmed operation.
Voronin, Lois M.; Cauller, Stephen J.
2017-07-31
Elevated concentrations of nitrogen in groundwater that discharges to surface-water bodies can degrade surface-water quality and habitats in the New Jersey Coastal Plain. An analysis of groundwater flow in the Kirkwood-Cohansey aquifer system and deeper confined aquifers that underlie the Barnegat Bay–Little Egg Harbor (BB-LEH) watershed and estuary was conducted by using groundwater-flow simulation, in conjunction with a particle-tracking routine, to provide estimates of groundwater flow paths and travel times to streams and the BB-LEH estuary. Water-quality data from the Ambient Groundwater Quality Monitoring Network, a long-term monitoring network of wells distributed throughout New Jersey, were used to estimate the initial nitrogen concentration in recharge for five different land-use classes: agricultural cropland or pasture, agricultural orchard or vineyard, urban non-residential, urban residential, and undeveloped. Land use at the point of recharge within the watershed was determined using a geographic information system (GIS). Flow path starting locations were plotted on land-use maps for 1930, 1973, 1986, 1997, and 2002. Information on the land use at the time and location of recharge, the time of travel to the discharge location, and the point of discharge was determined for each simulated flow path. Particle-tracking analysis provided the link from the point of recharge, along the particle flow path, to the point of discharge, and the particle travel time. The travel time of each simulated particle established the recharge year. Land use during the year of recharge was used to define the nitrogen concentration associated with each flow path. The recharge-weighted average nitrogen concentration for all flow paths that discharge to the Toms River upstream from streamflow-gaging station 01408500 or to the BB-LEH estuary was calculated. Groundwater input into the Barnegat Bay–Little Egg Harbor estuary from two main sources, indirect discharge from base flow to streams that eventually flow into the bay and groundwater discharge directly into the estuary and adjoining coastal wetlands, is summarized by quantity, travel time, and estimated nitrogen concentration. Simulated average groundwater discharge to streams in the watershed that flow into the BB-LEH estuary is approximately 400 million gallons per day. Particle-tracking results indicate that the travel time of 56 percent of this discharge is less than 7 years. Fourteen percent of the groundwater discharge to the streams in the BB-LEH watershed has a travel time of less than 7 years and originates in urban land. Analysis of flow-path simulations indicates that approximately 13 percent of the total groundwater flow through the study area discharges directly to the estuary and adjoining coastal wetlands (approximately 64 million gallons per day). The travel time of 19 percent of this discharge is less than 7 years. Ten percent of this discharge (1 percent of the total groundwater flow through the study area) originates in urban areas and has a travel time of less than 7 years.
Groundwater that discharges to the streams that flow into the BB-LEH, in general, has shorter travel times, and a higher percentage of it originates in urban areas, than does direct groundwater discharge to the Barnegat Bay–Little Egg Harbor estuary. The simulated average nitrogen concentration in groundwater that discharges to the Toms River, upstream from streamflow-gaging station 01408500, was computed and compared to summary concentrations determined from analysis of multiple surface-water samples. The nitrogen concentration in groundwater that discharges directly to the estuary and adjoining coastal wetlands is a current data gap. The particle-tracking methodology used in this study provides an estimate of this concentration.
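The following sketch illustrates the recharge-weighted averaging idea in miniature: each simulated particle carries the land use at its recharge point (which sets its nitrogen concentration) and a travel time, and discharges to a receptor are averaged with recharge weights. All land-use concentrations, weights, and travel times are hypothetical, not values from the USGS simulation.

```python
# Recharge-weighted nitrogen sketch (illustrative, not the USGS code).
import numpy as np

# nitrogen concentration in recharge by land-use class (mg/L), assumed values
n_by_landuse = {"urban residential": 3.5, "agricultural": 6.0, "undeveloped": 0.5}

# each particle: (land use at recharge, recharge-rate weight, travel time in years)
particles = [
    ("urban residential", 1.2, 5.0),
    ("agricultural", 0.8, 22.0),
    ("undeveloped", 1.5, 3.0),
    ("urban residential", 0.9, 40.0),
]

weights = np.array([p[1] for p in particles])
concs = np.array([n_by_landuse[p[0]] for p in particles])
weighted_mean = np.sum(weights * concs) / np.sum(weights)
young = np.sum(weights[np.array([p[2] for p in particles]) < 7]) / np.sum(weights)
print(f"recharge-weighted N: {weighted_mean:.2f} mg/L, fraction younger than 7 yr: {young:.0%}")
```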
Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol
2009-01-01
Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal’s gateway for the information flow that causes network traffic over the Internet. Moreover, it provides real time/dynamic access to the stored information, but not access to the real time information. This inherent functionality of web portals limits their role for resource constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that the local queries could be solved locally rather than having to route them over the Internet. Moreover, our framework enables one-to-one device communication for real time information flow. To provide an in-depth analysis, firstly, we provide an analytical model for query processing at the servers for our framework-oriented web portal. At the end, we have deployed a testbed, as one of the world’s largest IP based wireless sensor networks testbed, and real time measurements are observed that prove the efficacy and workability of the proposed framework. PMID:22346693
Reporting of participant flow diagrams in published reports of randomized trials
2011-01-01
Background Reporting of the flow of participants through each stage of a randomized trial is essential to assess the generalisability and validity of its results. We assessed the type and completeness of information reported in CONSORT (Consolidated Standards of Reporting Trials) flow diagrams published in current reports of randomized trials. Methods A cross sectional review of all primary reports of randomized trials which included a CONSORT flow diagram indexed in PubMed core clinical journals (2009). We assessed the proportion of parallel group trial publications reporting specific items recommended by CONSORT for inclusion in a flow diagram. Results Of 469 primary reports of randomized trials, 263 (56%) included a CONSORT flow diagram of which 89% (237/263) were published in a CONSORT endorsing journal. Reports published in CONSORT endorsing journals were more likely to include a flow diagram (62%; 237/380 versus 29%; 26/89). Ninety percent (236/263) of reports which included a flow diagram had a parallel group design, of which 49% (116/236) evaluated drug interventions, 58% (137/236) were multicentre, and 79% (187/236) compared two study groups, with a median sample size of 213 participants. Eighty-one percent (191/236) reported the overall number of participants assessed for eligibility, 71% (168/236) the number excluded prior to randomization and 98% (231/236) the overall number randomized. Reasons for exclusion prior to randomization were more poorly reported. Ninety-four percent (223/236) reported the number of participants allocated to each arm of the trial. However, only 40% (95/236) reported the number who actually received the allocated intervention, 67% (158/236) the number lost to follow up in each arm of the trial, 61% (145/236) whether participants discontinued the intervention during the trial and 54% (128/236) the number included in the main analysis. Conclusions Over half of published reports of randomized trials included a diagram showing the flow of participants through the trial. However, information was often missing from published flow diagrams, even in articles published in CONSORT endorsing journals. If important information is not reported it can be difficult and sometimes impossible to know if the conclusions of that trial are justified by the data presented. PMID:22141446
NASA Technical Reports Server (NTRS)
Estes, J. E.; Eisgruber, L.
1981-01-01
In the second half of the 1980s NASA can expect to face difficult choices among alternative fundamental and applied research and development projects that could potentially lead to improvements in the information systems used to manage renewable resources. The working group on information utilization and evaluation believes that effective choices cannot be made without a better understanding of the current and prospective problems and opportunities involved in the application of remote sensing to improve renewable resource information systems. A renewable resources information system is defined in a broad context to include a flow of data/information from acquisition through processing, storage, integration with other data, analysis, graphic presentation, decision making, and assessment of the effects of those decisions.
Centrifugal and Axial Pump Design and Off-Design Performance Prediction
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
1995-01-01
A meanline pump-flow modeling method has been developed to provide a fast capability for modeling pumps of cryogenic rocket engines. Based on this method, a meanline pump-flow code PUMPA was written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The design-point rotor efficiency and slip factors are obtained from empirical correlations to rotor-specific speed and geometry. The pump code can model axial, inducer, mixed-flow, and centrifugal pumps and can model multistage pumps in series. The rapid input setup and computer run time for this meanline pump flow code make it an effective analysis and conceptual design tool. The map-generation capabilities of the code provide the information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of PUMPA permit the user to do parametric design space exploration of candidate pump configurations and to provide head-flow maps for engine system evaluation.
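As a rough illustration of what a meanline calculation involves, the sketch below evaluates the Euler head of a single centrifugal impeller using the Wiesner slip-factor correlation and an assumed hydraulic efficiency. The geometry, speed, and efficiency are illustrative assumptions, and the code is not the PUMPA program.

```python
# Meanline-style sketch (not PUMPA): Euler head of one centrifugal impeller
# with the Wiesner slip-factor correlation. All inputs are assumptions.
import numpy as np

g = 9.81
rpm, d2 = 15000.0, 0.20            # shaft speed (rev/min) and impeller tip diameter (m)
b2, z_blades = 0.012, 6            # exit blade width (m), blade count
beta2 = np.radians(30.0)           # blade angle from tangential at exit
q = 0.05                           # volumetric flow rate, m^3/s
eta_h = 0.85                       # assumed hydraulic efficiency

u2 = np.pi * d2 * rpm / 60.0       # impeller tip speed
cm2 = q / (np.pi * d2 * b2)        # meridional velocity at exit
sigma = 1.0 - np.sqrt(np.sin(beta2)) / z_blades ** 0.7   # Wiesner slip factor
c_theta2 = sigma * u2 - cm2 / np.tan(beta2)              # exit tangential velocity with slip
head = eta_h * u2 * c_theta2 / g                         # delivered head
print(f"predicted head at {q*1000:.0f} L/s: {head:.1f} m")
```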
Pressure modulation algorithm to separate cerebral hemodynamic signals from extracerebral artifacts
Baker, Wesley B.; Parthasarathy, Ashwin B.; Ko, Tiffany S.; Busch, David R.; Abramson, Kenneth; Tzeng, Shih-Yu; Mesquita, Rickson C.; Durduran, Turgut; Greenberg, Joel H.; Kung, David K.; Yodh, Arjun G.
2015-01-01
We introduce and validate a pressure measurement paradigm that reduces extracerebral contamination from superficial tissues in optical monitoring of cerebral blood flow with diffuse correlation spectroscopy (DCS). The scheme determines subject-specific contributions of extracerebral and cerebral tissues to the DCS signal by utilizing probe pressure modulation to induce variations in extracerebral blood flow. For analysis, the head is modeled as a two-layer medium and is probed with long and short source-detector separations. Then a combination of pressure modulation and a modified Beer-Lambert law for flow enables experimenters to linearly relate differential DCS signals to cerebral and extracerebral blood flow variation without a priori anatomical information. We demonstrate the algorithm's ability to isolate cerebral blood flow during a finger-tapping task and during graded scalp ischemia in healthy adults. Finally, we adapt the pressure modulation algorithm to ameliorate extracerebral contamination in monitoring of cerebral blood oxygenation and blood volume by near-infrared spectroscopy. PMID:26301255
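A schematic sketch of the two-layer, modified Beer-Lambert idea follows: long- and short-separation differential DCS signals are written as weighted sums of cerebral and extracerebral flow changes, and the resulting 2x2 linear system is inverted. The weight values below are assumptions; in the paper they follow from the pressure-modulation calibration.

```python
# Schematic two-layer inversion sketch (not the authors' algorithm): the
# sensitivity weights here are assumed, not derived from pressure modulation.
import numpy as np

# d(signal)/d(flow) weights: rows = [long separation, short separation],
# columns = [cerebral, extracerebral]
W = np.array([[0.6, 0.4],
              [0.05, 0.95]])

d_signal = np.array([0.12, 0.03])          # measured differential signals (a.u.)
d_flow = np.linalg.solve(W, d_signal)      # [cerebral, extracerebral] flow changes
print("relative cerebral flow change:", d_flow[0])
print("relative extracerebral flow change:", d_flow[1])
```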
Modern and Unconventional Approaches to Karst Hydrogeology
NASA Astrophysics Data System (ADS)
Sukop, M. C.
2017-12-01
Karst hydrogeology is frequently approached from a hydrograph/statistical perspective where precipitation/recharge inputs are converted to output hydrographs and the conversion process reflects the hydrology of the system. Karst catchments show hydrological response to short-term meteorological events and to long-term variation of large-scale atmospheric circulation. Modern approaches to analysis of these data include, for example, multiresolution wavelet techniques applied to understand relations between karst discharge and climate fields. Much less effort has been directed towards direct simulation of flow fields and transport phenomena in karst settings. This is primarily due to the lack of information on the detailed physical geometry of most karst systems. New mapping, sampling, and modeling techniques are beginning to enable direct simulation of flow and transport. A Conduit Flow Process (CFP) add-on to the USGS MODFLOW model became available in 2007. FEFLOW and similar models are able to represent flows in individual conduits. Lattice Boltzmann models have also been applied to flow modeling in karst systems. Regarding quantitative measurement of karst system geometry, at scales to 0.1 m, X-ray computed tomography enables good detection of detailed (sub-millimeter) pore space in karstic rocks. Three-dimensional printing allows reconstruction of fragile, high-porosity rocks, and surrogate samples generated this way can then be subjected to laboratory testing. Borehole scales can be accessed with high-resolution (0.001 m) Digital Optical Borehole Imaging technologies and can provide virtual samples more representative of the true nature of karst aquifers than can be obtained from coring. Subsequent extrapolation of such samples can generate three-dimensional models suitable for direct modeling of flow and transport. Finally, new cave mapping techniques are beginning to provide information that can be applied to direct simulation of flow. Due to flow rates and cave diameter, very high Reynolds number flows may be encountered.
Findings from an Organizational Network Analysis to Support Local Public Health Management
Caldwell, Michael; Rockoff, Maxine L.; Gebbie, Kristine; Carley, Kathleen M.; Bakken, Suzanne
2008-01-01
We assessed the feasibility of using organizational network analysis in a local public health organization. The research setting was an urban/suburban county health department with 156 employees. The goal of the research was to study communication and information flow in the department and to assess the technique for public health management. Network data were derived from survey questionnaires. Computational analysis was performed with the Organizational Risk Analyzer. Analysis revealed centralized communication, limited interdependencies, potential knowledge loss through retirement, and possible informational silos. The findings suggested opportunities for more cross-program coordination but also suggested the presence of potentially efficient communication paths and potentially beneficial social connectedness. Managers found the findings useful to support decision making. Public health organizations must be effective in an increasingly complex environment. Network analysis can help build public health capacity for complex system management. PMID:18481183
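The sketch below shows the kind of measure such an analysis produces, using betweenness centrality on a small survey-derived communication network to flag bottlenecks and possible silos. The study itself used the Organizational Risk Analyzer; the roster and ties here are hypothetical.

```python
# Organizational-network sketch (hypothetical roster and ties, not the study's
# data): betweenness centrality highlights communication bottlenecks.
import networkx as nx

edges = [("Director", "Epi lead"), ("Director", "Nursing lead"),
         ("Epi lead", "Epi analyst"), ("Nursing lead", "Clinic nurse"),
         ("Director", "IT lead"), ("IT lead", "IT tech")]
G = nx.Graph(edges)

betweenness = nx.betweenness_centrality(G)
for person, score in sorted(betweenness.items(), key=lambda kv: -kv[1]):
    print(f"{person:12s} betweenness {score:.2f}")  # high scores mark potential bottlenecks
```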
Integrated Information Technology Policy Analysis Research, CSUSB
2010-10-01
science fields in order to combine efforts to better understand multiple network systems, including technical, biological and social networks...Flowing Valued Information (FVI) project has been discussed at the Network Science Workshops linked from the Center website and the FVI reports and
Laurent, Christophe; Beaucourt, Luc
2005-01-01
A hardware and software solution has been conceived, realized, produced, and used to gather clinical information about disaster victims in the field in a way that makes the work of mass casualty incident managers and first responders more efficient, ergonomic, and safe, and the resulting data more useful for further scientific and statistical analysis.
ERIC Educational Resources Information Center
Agenbroad, James E.; And Others
Included in this volume of appendices to LI 000 979 are acquisitions flow charts; a current operations questionnaire; an algorithm for splitting the Library of Congress call number; analysis of the Machine-Readable Cataloging (MARC II) format; production problems and decisions; operating procedures for information transmittal in the New England…
A Novel Statistical Analysis and Interpretation of Flow Cytometry Data
2013-03-31
the resulting residuals appear random. In the work that follows, I* = 200. The values of B and b̂j are known from the experiment. Notice that the...conjunction with the model parameter vector in a two-stage process. Unfortunately two-stage estimation may cause some parameters of the mathematical model to...information theoretic criteria such as Akaike's Information Criterion (AIC). From (4.3), it follows that the scaled residuals rjk = λj I[n̂](tj, zk; ~q
Laser fluorescence fluctuation excesses in molecular immunology experiments
NASA Astrophysics Data System (ADS)
Galich, N. E.; Filatov, M. V.
2007-04-01
A novel approach to the statistical analysis of flow cytometry fluorescence data has been developed and applied to population analysis of blood neutrophils stained with hydroethidine during the respiratory burst reaction. The staining is based on intracellular oxidation of hydroethidine to ethidium bromide, which intercalates into cell DNA. Fluorescence of the resulting product serves as a measure of the neutrophils' ability to generate superoxide radicals after induction of the respiratory burst reaction by phorbol myristate acetate (PMA). It was demonstrated that polymorphonuclear leukocytes of persons with inflammatory diseases show a considerably changed response. The cytofluorometric histograms obtained carry unique information about the condition of the neutrophil population, which may allow determination of the type of pathological process connected with the inflammation. The novel approach to histogram analysis is based on the higher-moment dynamics of the distribution. The features of the fluctuation excesses of the distribution carry unique information about the disease under consideration.
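A minimal sketch of a higher-moment summary of a fluorescence histogram is given below: skewness and excess kurtosis of per-cell fluorescence values distinguish a unimodal population from one with a second, brighter subpopulation. The synthetic data and the specific moments computed are illustrative assumptions, not the authors' procedure.

```python
# Higher-moment histogram sketch (illustrative only, not the authors' method):
# compare skewness and excess kurtosis of two synthetic fluorescence samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
healthy = rng.gamma(shape=8.0, scale=120.0, size=10000)     # synthetic per-cell fluorescence
inflamed = np.concatenate([rng.gamma(8.0, 120.0, 8000),
                           rng.gamma(20.0, 150.0, 2000)])   # second, brighter subpopulation

for name, sample in [("healthy", healthy), ("inflamed", inflamed)]:
    print(name, "skewness %.2f" % stats.skew(sample),
          "excess kurtosis %.2f" % stats.kurtosis(sample))
```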