Sample records for queuing theory based

1. Study on combat effectiveness of air defense missile weapon system based on queuing theory

Zhao, Z. Q.; Hao, J. X.; Li, L. J.

2017-01-01

Queuing theory is a method for analyzing the combat effectiveness of air defense missile weapon systems. A service-probability model based on queuing theory was constructed and applied to analyzing the combat effectiveness of the "Sidewinder" and "Tor-M1" air defense missile weapon systems. Finally, for different target densities, the combat effectiveness of different combat units of the two systems is calculated. This method can be used to assess the usefulness of air defense missile weapon systems.
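For illustration, the service probability this record describes can be sketched as an M/M/c/c loss calculation: a target that arrives while every fire channel is busy "leaks" through undefended. This is a minimal Python sketch, not the paper's actual model; the channel count and rates are hypothetical.

```python
from math import factorial

def erlang_b(c, a):
    """Blocking probability of an M/M/c/c loss system with offered load a = lam/mu."""
    return (a**c / factorial(c)) / sum(a**k / factorial(k) for k in range(c + 1))

# Hypothetical figures: 4 fire channels, targets arriving at 6/min,
# each engagement taking 30 s on average (mu = 2/min).
lam, mu, channels = 6.0, 2.0, 4
p_leak = erlang_b(channels, lam / mu)   # target arrives while all channels busy
p_served = 1.0 - p_leak                 # the record's "service probability"
```

Sweeping `lam` then mimics the paper's comparison across target densities.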

2. Queuing Theory and Reference Transactions.

ERIC Educational Resources Information Center

Terbille, Charles

1995-01-01

Examines the implications of applying queuing theory to three different reference situations: (1) random patron arrivals; (2) random durations of transactions; and (3) use of two librarians. Tables and figures present results from spreadsheet calculations of queues for each reference situation. (JMV)
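The spreadsheet comparisons the record mentions can be reproduced with the Erlang C formula; the sketch below uses hypothetical arrival and transaction rates (not Terbille's figures) to estimate the mean patron wait at a two-librarian desk.

```python
from math import factorial

def mmc_wq(lam, mu, c):
    """Mean queue wait Wq in an M/M/c queue, via the Erlang C formula."""
    a = lam / mu                  # offered load
    rho = a / c                   # per-server utilization
    assert rho < 1, "queue is unstable"
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    p_wait = a**c / factorial(c) * p0 / (1 - rho)   # P(a patron must wait)
    return p_wait / (c * mu - lam)

# Hypothetical desk: 10 patrons/hour, each librarian handles 6 transactions/hour.
wq_two = mmc_wq(10, 6, 2)   # two librarians; one alone could not keep up
```

With these rates a single librarian would be overloaded (10 > 6), which is exactly the kind of comparison the spreadsheet tables make.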

3. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

Sun, Y.; Li, Y. P.; Huang, G. H.

2012-06-01

In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.

4. Queuing Theory Based Co-Channel Interference Analysis Approach for High-Density Wireless Local Area Networks

PubMed Central

Zhang, Jie; Han, Guangjie; Qian, Yujie

2016-01-01

Increased co-channel interference (CCI) in wireless local area networks (WLANs) is bringing serious resource constraints to today's high-density wireless environments. CCI in IEEE 802.11-based networks is inevitable due to the nature of the carrier sensing mechanism; however, it can be reduced by resource optimization approaches. CCI analysis is therefore fundamental, and crucial, for efficient resource management. In this article, we present a novel CCI analysis approach based on queuing theory that considers the randomness of end users' behavior and the irregularity and complexity of network traffic in high-density WLANs, adopting the M/M/c queuing model for the analysis. Most CCI occurs when multiple networks overlap and trigger channel contentions; therefore, we use the ratio of signal-overlapped area to signal coverage as a probabilistic factor in the queuing model to analyze CCI impacts in highly overlapped WLANs. With the queuing model, we perform simulations to see how CCI influences the quality of service (QoS) in high-density WLANs. PMID:27563896

5. Queuing Theory Based Co-Channel Interference Analysis Approach for High-Density Wireless Local Area Networks.

PubMed

Zhang, Jie; Han, Guangjie; Qian, Yujie

2016-08-23

Increased co-channel interference (CCI) in wireless local area networks (WLANs) is bringing serious resource constraints to today's high-density wireless environments. CCI in IEEE 802.11-based networks is inevitable due to the nature of the carrier sensing mechanism; however, it can be reduced by resource optimization approaches. CCI analysis is therefore fundamental, and crucial, for efficient resource management. In this article, we present a novel CCI analysis approach based on queuing theory that considers the randomness of end users' behavior and the irregularity and complexity of network traffic in high-density WLANs, adopting the M/M/c queuing model for the analysis. Most CCI occurs when multiple networks overlap and trigger channel contentions; therefore, we use the ratio of signal-overlapped area to signal coverage as a probabilistic factor in the queuing model to analyze CCI impacts in highly overlapped WLANs. With the queuing model, we perform simulations to see how CCI influences the quality of service (QoS) in high-density WLANs.
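The abstract does not give the paper's exact formulation, but the idea of feeding an overlap ratio into an M/M/c model can be sketched as follows; all rates, the channel count, and the ratio are hypothetical.

```python
from math import factorial

def p_wait_mmc(lam, mu, c):
    """Erlang C probability that an arriving frame must wait in an M/M/c queue."""
    a = lam / mu
    rho = a / c
    assert rho < 1, "offered load exceeds channel capacity"
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    return a**c / factorial(c) * p0 / (1 - rho)

# Hypothetical numbers: 40 contending frames/s, 15 frames/s service per channel,
# c = 3 non-overlapping channels; the overlap ratio scales the interfering load.
overlap_ratio = 0.35                       # overlapped area / coverage area
p_cci = p_wait_mmc(40 * overlap_ratio, 15, 3)
```

Raising the overlap ratio raises the effective arrival rate and hence the contention probability, which is the qualitative effect the paper studies.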

6. An application of queuing theory to waterfowl migration

USGS Publications Warehouse

Sojda, Richard S.; Cornely, John E.; Fredrickson, Leigh H.; Rizzoli, A.E.; Jakeman, A.J.

2002-01-01

There has always been great interest in the migration of waterfowl and other birds. We have applied queuing theory to modelling waterfowl migration, beginning with a prototype system for the Rocky Mountain Population of trumpeter swans (Cygnus buccinator) in Western North America. The queuing model can be classified as a D/BB/28 system, and we describe the input sources, service mechanism, and network configuration of queues and servers. The intrinsic nature of queuing theory is to represent the spatial and temporal characteristics of entities and how they move, are placed in queues, and are serviced. The service mechanism in our system is an algorithm representing how swans move through the flyway based on seasonal life cycle events. The system uses an observed number of swans at each of 27 areas for a breeding season as input and simulates their distribution through four seasonal steps. The result is a simulated distribution of birds for the subsequent year's breeding season. The model was built as a multiagent system, with one agent handling movement algorithms, one facilitating the user interface, and one to seven agents representing specific geographic areas for which swan management interventions can be implemented. The many parallels between queuing model servers and service mechanisms and waterfowl management areas and annual life cycle events made the transfer of the theory to practical application straightforward.

7. Queuing theory models for computer networks

NASA Technical Reports Server (NTRS)

Galant, David C.

1989-01-01

A set of simple queuing theory models that can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them, since they do not require fine detail about network traffic rates, traffic patterns, or the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
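The kind of coarse spreadsheet model the record describes reduces, in its simplest form, to the M/M/1 response-time formula; the sketch below uses hypothetical link parameters, not the Ames network's figures.

```python
def mm1_response(lam, mu):
    """Mean response time (queueing + transmission) of an M/M/1 channel."""
    assert lam < mu, "offered traffic must be below channel capacity"
    return 1.0 / (mu - lam)

# Hypothetical link: 1 Mb/s channel, 1000-bit messages -> mu = 1000 messages/s.
mu = 1000.0
for load in (0.3, 0.6, 0.9):
    t = mm1_response(load * mu, mu)
    print(f"load {load:.0%}: mean response {t * 1000:.2f} ms")
```

The sharp growth of response time as load approaches capacity is exactly what such models expose when planning channel upgrades.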

8. Queuing Theory: An Alternative Approach to Educational Research.

ERIC Educational Resources Information Center

Huyvaert, Sarah H.

Queuing theory is examined in this paper in order to determine whether the theory could be applied in educational settings. It is defined as a form of operations research that uses mathematical formulas and/or computer simulation to study waiting and congestion in a system and, through the study of these visible phenomena, to discover malfunctions within…

9. Application of queuing theory in production-inventory optimization

2015-07-01

This paper presents a mathematical model for an inventory control system in which customers' demands and suppliers' service time are considered as stochastic parameters. The proposed problem is solved through queuing theory for a single item. In this case, transitional probabilities are calculated in steady state. Afterward, the model is extended to the case of multi-item inventory systems. Then, to deal with the complexity of this problem, a new heuristic algorithm is developed. Finally, the presented bi-level inventory-queuing model is implemented as a case study in Electroestil Company.

10. APPLICATION OF QUEUING THEORY TO INFORMATION SYSTEMS DESIGN.

DTIC Science & Technology

information systems. A scheme of classifying queuing models and queuing literature was developed, and was used to classify most of the known priority...Finally, procedures for the application of the queuing model to the structuring of information systems were outlined. (Author)

11. Application of queuing theory in inventory systems with substitution flexibility

Seyedhoseini, S. M.; Rashid, Reza; Kamalpour, Iman; Zangeneh, Erfan

2015-01-01

Considering the competition in today's business environment, tactical planning of a supply chain becomes more complex than before. In many multi-product inventory systems, substitution flexibility can improve profits. This paper aims to prepare a comprehensive substitution inventory model, in which an inventory system with two substitute products with negligible lead time is considered and the effects of simultaneous ordering are examined. In this paper, the demands of customers for both products are regarded as stochastic parameters, and queuing theory is used to construct a mathematical model. The model has been coded in C++ and analyzed with a real example, where the results indicate the efficiency of the proposed model.

12. On the application of queuing theory for analysis of twin data.

PubMed

Kuravsky, L S; Malykh, S B

2000-06-01

A mathematical model based on queuing theory is used to study the dynamics of environmental influence on twin pairs. The model takes into consideration genetic factors and effects of nonshared environment. Histograms are used as the basic analysed characteristics, with the method of minimum chi-square yielding the estimates. The proposed technique was applied to the analysis of longitudinal data for MZ and DZ twins. It was shown that the same environmental impact may yield different contributions to the final variances of the IQ parameters under investigation. The magnitudes of these contributions depend on the genetic factor, represented by distributions of the analysed parameter at the point of birth.

13. An Integrated Model of Patient and Staff Satisfaction Using Queuing Theory

PubMed Central

Komashie, Alexander; Mousavi, Ali; Clarkson, P. John; Young, Terry

2015-01-01

This paper investigates the connection between patient satisfaction, waiting time, staff satisfaction, and service time. It uses a variety of models to enable improvement against experiential and operational health service goals. Patient satisfaction levels are estimated using a model based on waiting (waiting times). Staff satisfaction levels are estimated using a model based on the time spent with patients (service time). An integrated model of patient and staff satisfaction, the effective satisfaction level model, is then proposed (using queuing theory). This links patient satisfaction, waiting time, staff satisfaction, and service time, connecting two important concepts, namely, experience and efficiency in care delivery and leading to a more holistic approach in designing and managing health services. The proposed model will enable healthcare systems analysts to objectively and directly relate elements of service quality to capacity planning. Moreover, as an instrument used jointly by healthcare commissioners and providers, it affords the prospect of better resource allocation. PMID:27170899

14. An Integrated Model of Patient and Staff Satisfaction Using Queuing Theory.

PubMed

Komashie, Alexander; Mousavi, Ali; Clarkson, P John; Young, Terry

2015-01-01

This paper investigates the connection between patient satisfaction, waiting time, staff satisfaction, and service time. It uses a variety of models to enable improvement against experiential and operational health service goals. Patient satisfaction levels are estimated using a model based on waiting (waiting times). Staff satisfaction levels are estimated using a model based on the time spent with patients (service time). An integrated model of patient and staff satisfaction, the effective satisfaction level model, is then proposed (using queuing theory). This links patient satisfaction, waiting time, staff satisfaction, and service time, connecting two important concepts, namely, experience and efficiency in care delivery and leading to a more holistic approach in designing and managing health services. The proposed model will enable healthcare systems analysts to objectively and directly relate elements of service quality to capacity planning. Moreover, as an instrument used jointly by healthcare commissioners and providers, it affords the prospect of better resource allocation.
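A minimal sketch of the trade-off the paper formalizes, assuming (my assumption, not the authors' model) that patient satisfaction decays exponentially with waiting time while staff satisfaction grows with time per patient:

```python
import math

def mm1_wq(lam, mu):
    """Mean waiting time in an M/M/1 queue."""
    rho = lam / mu
    assert rho < 1, "clinic is overloaded"
    return rho / (mu - lam)

def satisfaction(excess, tolerance):
    """Illustrative only: satisfaction decays exponentially with 'excess' time."""
    return math.exp(-max(excess, 0.0) / tolerance)

# Hypothetical clinic: 4 arrivals/hour; faster service (larger mu) shortens the
# wait (patients happier) but shrinks time per patient (staff less satisfied).
for mu in (5.0, 6.0, 8.0):
    wq = mm1_wq(4.0, mu)
    patient = satisfaction(wq, 0.5)              # tolerance: 30 min of waiting
    staff = min(1.0, (1.0 / mu) / 0.2)           # satisfied at >= 12 min/patient
    print(f"mu={mu}: Wq={wq:.2f} h, patient~{patient:.2f}, staff~{staff:.2f}")
```

The opposing movement of the two scores as `mu` rises is the tension the paper's effective satisfaction level model is built to balance.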

15. Characterizing and Managing Intrusion Detection System (IDS) Alerts with Multi-Server/Multi-Priority Queuing Theory

DTIC Science & Technology

2014-12-26

Characterizing and Managing Intrusion Detection System (IDS) Alerts with Multi-Server/Multi-Priority Queuing Theory. AFIT-ENV-MS-14-D-24. Olsen, Christopher C., BS, Captain, USAF.

16. Allocation of computer ports within a terminal switching network: an application of queuing theory to gandalf port contenders

SciTech Connect

Vahle, M.O.

1982-03-01

Queuing theory is applied to the problem of assigning computer ports within a terminal switching network to maximize the likelihood of instant connect. A brief background of the network is included to focus on the statement of the problem.

17. Determination of newborn special care bed requirements by application of queuing theory to 1975-1976 morbidity experience.

PubMed

Morriss, F H; Adcock, E W; Denson, S E; Stoerner, J W; Malloy, M H; Johnson, C A; Decker, M

1978-04-01

The movement of newborn infants from the delivery room of a level III perinatal center to nursing units that provided different levels of care was prospectively documented for 1975 and 1976. These data were employed in a computer modeling experiment based on sequential queuing theory to determine the relationships between numbers of available intermediate and maximum care nursery beds, the probability that a given newborn arrival could not be accommodated, and the occupancy rates for each level of care. The nursery bed requirements for the level III center were used to estimate the number of special care beds needed by the regional Health Service Area.
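Bed sizing of this kind is classically an Erlang loss (M/M/c/c) calculation: find the fewest beds that keep the turn-away probability below a target. The census figures below are hypothetical, not the 1975-1976 morbidity data.

```python
from math import factorial

def erlang_b(beds, offered):
    """P(arriving infant finds every bed occupied) in an M/M/c/c loss model."""
    return (offered**beds / factorial(beds)
            / sum(offered**k / factorial(k) for k in range(beds + 1)))

# Hypothetical census: 3 admissions/day with a 5-day mean stay -> offered load 15.
offered = 3.0 * 5.0
beds = 1
while erlang_b(beds, offered) > 0.05:   # size the unit for <5% turn-away risk
    beds += 1
print(beds, round(erlang_b(beds, offered), 4))
```

Repeating the search per care level (intermediate vs. maximum care) mirrors the sequential structure the study describes.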

18. Queuing theory to guide the implementation of a heart failure inpatient registry program.

PubMed

Zai, Adrian H; Farr, Kit M; Grant, Richard W; Mort, Elizabeth; Ferris, Timothy G; Chueh, Henry C

2009-01-01

OBJECTIVE The authors previously implemented an electronic heart failure registry at a large academic hospital to identify heart failure patients and to connect these patients with appropriate discharge services. Despite significant improvements in patient identification and connection rates, time to connection remained high, with an average delay of 3.2 days from the time patients were admitted to the time connections were made. Our objective for this current study was to determine the most effective solution to minimize time to connection. DESIGN We used a queuing theory model to simulate 3 different potential solutions to decrease the delay from patient identification to connection with discharge services. MEASUREMENTS The measures included average rate at which patients were being connected to the post discharge heart failure services program, average number of patients in line, and average patient waiting time. RESULTS Using queuing theory model simulations, we were able to estimate for our current system the minimum rate at which patients need to be connected (262 patients/mo), the ideal patient arrival rate (174 patients/mo) and the maximal patient arrival rate that could be achieved by adding 1 extra nurse (348 patients/mo). CONCLUSIONS Our modeling approach was instrumental in helping us characterize key process parameters and estimate the impact of adding staff on the time between identifying patients with heart failure and connecting them with appropriate discharge services.
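Using the rates quoted in the abstract (roughly 262 connections/month of capacity against 174 arrivals/month), a plain M/M/1 model with Little's law gives the kind of estimates described; this is a sketch, not the authors' simulation.

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1: utilization, mean number in line, mean wait (Little's law)."""
    rho = lam / mu
    assert rho < 1, "connections cannot keep pace with admissions"
    lq = rho**2 / (1 - rho)   # average number of patients in line
    wq = lq / lam             # Little's law: Wq = Lq / lambda
    return rho, lq, wq

# Rates quoted in the abstract: ~262 connections/month capacity, 174 arrivals/month.
rho, lq, wq = mm1_metrics(174.0, 262.0)
print(f"utilization {rho:.2f}, {lq:.2f} patients in line, {wq * 30:.2f} days mean wait")
```

Re-running with a higher `mu` (an extra nurse) shows how added capacity shrinks the line, the effect the authors estimated.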

19. Modeling panel detection frequencies by queuing system theory: an application in gas chromatography olfactometry.

PubMed

Bult, Johannes H F; van Putten, Bram; Schifferstein, Hendrik N J; Roozen, Jacques P; Voragen, Alphons G J; Kroeze, Jan H A

2004-10-01

In continuous vigilance tasks, the number of coincident panel responses to stimuli provides an index of stimulus detectability. To determine whether this number is due to chance, panel noise levels have been approximated by the maximum coincidence level obtained in stimulus-free conditions. This study proposes an alternative method by which to assess noise levels, derived from queuing system theory (QST). Instead of critical coincidence levels, QST modeling estimates the duration of coinciding responses in the absence of stimuli. The proposed method has the advantage over previous approaches that it yields more reliable noise estimates and allows for statistical testing. The method was applied in an olfactory detection experiment using 16 panelists in stimulus-present and stimulus-free conditions. We propose that QST may be used as an alternative to signal detection theory for analyzing data from continuous vigilance tasks.

20. Application of queuing theory to patient satisfaction at a tertiary hospital in Nigeria

PubMed Central

Ameh, Nkeiruka; Sabo, B.; Oyefabi, M. O.

2013-01-01

Background: Queuing theory is the mathematical approach to the analysis of waiting lines in any setting where the arrival rate of subjects is faster than the system can handle. It is applicable to healthcare settings where the systems have excess capacity to accommodate random variations. Materials and Methods: A cross-sectional descriptive survey was done. Questionnaires were administered to patients who attended the general outpatient department. Observations were also made on the queuing model and the service discipline at the clinic. Questions were meant to obtain demographic characteristics and the time spent on the queue by patients before being seen by a doctor, the time spent with the doctor, their views about the time spent on the queue and useful suggestions on how to reduce the time spent on the queue. A total of 210 patients were surveyed. Results: The majority of patients (164, 78.1%) spent 2 h or less on the queue before being seen by a doctor and less than 1 h to see the doctor. The majority of patients (144, 68.5%) were satisfied with the time they spent on the queue before being seen by a doctor. Useful suggestions proffered by the patients to decrease the time spent on the queue before seeing a doctor at the clinic included: that more doctors be employed (46, 21.9%), that doctors should come to work on time (25, 11.9%), that first-come-first-served be observed strictly (32, 15.2%), and others suggested that the records staff should desist from collecting bribes from patients in order to place their cards before others. The queuing method employed at the clinic is the multiple single-channel type and the service discipline is priority service. The patients who spent less time on the queue (<1 h) before seeing the doctor were more satisfied than those who spent more time (P < 0.05). Conclusion: The study has revealed that the majority of patients were satisfied with the practice at the general outpatient department. However, there is a need to employ

1. Spreadsheet Analysis Of Queuing In A Computer Network

NASA Technical Reports Server (NTRS)

Galant, David C.

1992-01-01

Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.

2. T-cell activation: A queuing theory analysis at low agonist density.

PubMed

Wedagedera, J R; Burroughs, N J

2006-09-01

We analyze a simple linear triggering model of the T-cell receptor (TCR) within the framework of queuing theory, in which TCRs enter the queue upon full activation and exit by downregulation. We fit our model to four experimentally characterized threshold activation criteria and analyze their specificity and sensitivity: the initial calcium spike, cytotoxicity, immunological synapse formation, and cytokine secretion. Specificity characteristics improve as the time window for detection increases, saturating for time periods on the timescale of downregulation; thus, the calcium spike (30 s) has low specificity but sensitivity to single-peptide MHC ligands, while the cytokine threshold (1 h) can distinguish ligands with a 30% variation in the complex lifetime. However, a robustness analysis shows that these properties are degraded when the queue parameters are subject to variation, for example under stochasticity in the ligand number in the cell-cell interface and population variation in the cellular threshold. Integrating the queue over sufficiently long time periods (hours) is shown to control parameter noise efficiently for realistic parameter values, the discrimination characteristics being determined by the TCR signal cascade kinetics (a kinetic proofreading scheme). Therefore, through a combination of thresholds and signal integration, a T cell can be responsive to low ligand density and specific to agonist quality. We suggest that multiple threshold mechanisms are employed to establish the conditions for efficient signal integration, i.e., to coordinate the formation of a stable contact interface.

3. Effects of diversity and procrastination in priority queuing theory: the different power law regimes.

PubMed

Saichev, A; Sornette, D

2010-01-01

Empirical analyses show that after the update of a browser, or the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^α with 0 < α ≤ 1 over a time scale of years. We present a simple model for this persistence phenomenon, framed within the standard priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a "time deficit" control parameter β and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail ~1/t^(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t^α, with α ∈ (0.5, ∞), including the well-known case 1/t. We also study the effect of "procrastination," defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence.

4. Effects of diversity and procrastination in priority queuing theory: The different power law regimes

Saichev, A.; Sornette, D.

2010-01-01

Empirical analyses show that after the update of a browser, or the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^α with 0 < α ≤ 1 over a time scale of years. We present a simple model for this persistence phenomenon, framed within the standard priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a "time deficit" control parameter β and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail ~1/t^(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t^α, with α ∈ (0.5, ∞), including the well-known case 1/t. We also study the effect of "procrastination," defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence.

5. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

SciTech Connect

Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

2015-03-10

Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is the significant server setup cost and activation time. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases in load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows us to estimate a number of performance measures.
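A toy version of the hysteresis idea: two thresholds with a gap between them keep the server count from flapping under noisy load. This sketch is illustrative only; the thresholds, rates, and deterministic service step are my assumptions, and it omits the paper's noninstantaneous activation.

```python
import random

random.seed(1)

# Hypothetical hysteresis thresholds: add a server when the queue exceeds HIGH,
# remove one when it falls below LOW; the gap keeps the system from flapping.
HIGH, LOW, MAX_SERVERS = 20, 5, 8

def step(queue, servers, arrivals, rate_per_server):
    served = min(queue, servers * rate_per_server)
    queue = queue - served + arrivals
    if queue > HIGH and servers < MAX_SERVERS:
        servers += 1                       # scale up under sustained load
    elif queue < LOW and servers > 1:
        servers -= 1                       # scale down to save energy
    return queue, servers

queue, servers, switches = 0, 1, 0
for _ in range(200):
    prev = servers
    queue, servers = step(queue, servers, random.randint(0, 8), 4)
    switches += servers != prev
```

Narrowing the gap between `LOW` and `HIGH` increases `switches`, i.e. more costly server activations, which is the energy trade-off motivating hysteresis.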

6. Improving queuing service at McDonald's

Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.

2014-07-01

Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract a good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing theory in Que is based upon the concept of the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results will be presented.
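The Poisson-based analysis attributed to the Que app can be imitated with a small single-server simulation checked against the analytic M/M/1 waiting time; the arrival and service rates here are hypothetical, not the Kuala Lumpur measurements.

```python
import random

random.seed(42)

# Hypothetical lunch hour: Poisson arrivals at 100/h, exponential service at 120/h.
# Simulate a single-queue counter and measure the average wait (cf. M/M/1 Wq).
lam, mu, n = 100.0, 120.0, 20000
t, free_at, total_wait = 0.0, 0.0, 0.0
for _ in range(n):
    t += random.expovariate(lam)           # next customer's arrival time (hours)
    start = max(t, free_at)                # wait if the counter is busy
    total_wait += start - t
    free_at = start + random.expovariate(mu)

sim_wq = total_wait / n
analytic_wq = (lam / mu) / (mu - lam)      # M/M/1: rho / (mu - lam)
```

At this load the analytic mean wait is 1/24 h (2.5 min); the simulated value converges toward it as `n` grows.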

7. Queuing theory under competitive social foraging may explain a mathematical equivalence of delay and probability in impulsive decision-making.

PubMed

Takahashi, Taiki

2006-01-01

Intertemporal and probabilistic decision-making has been studied in psychiatry, ecology, and neuroeconomics. Because drug addicts and psychopaths often make risky decisions (e.g., drug misuse and aggression), investigation into the types of impulsivity in intertemporal and probabilistic choices (delay and probability discounting) is important for psychiatric treatment. Studies in behavioral ecology have proposed that delay and probability discounting are mediated by the same psychological process, because a decrease in the probability of winning corresponds to an increase in the delay until winning. According to this view, odds-against winning (=1/p-1) in probabilistic choice corresponds to delay in intertemporal choice. This hypothesis predicts that a preference for gambling (a low degree of probability discounting) may be associated with patience, rather than impulsivity or impatience, in intertemporal choice (a low degree of delay discounting). However, recent empirical evidence in psychiatric research employing pathological gamblers indicates that pathological gamblers are impulsive in intertemporal choice (high degrees of delay discounting). Nevertheless, a hyperbolic discounting function (usually adopted to explain intertemporal choice) with odds-against (instead of delay) explains experimental data in probabilistic choice dramatically well. Therefore, an alternative explanation is required for the hypothetical equivalence of odds-against to delay. We propose that queuing theory (often adopted for analyzing computer network traffic) under a competitive social foraging condition may explain the equivalence. Our hypothesis may help in understanding the impulsivity of psychiatric patients in social behavior (e.g., aggression and antisocial behavior) in addition to non-social impulsivity in reward seeking (e.g., substance misuse).

8. A soft computing-based approach to optimise queuing-inventory control problem

2015-04-01

In this paper, a multi-product continuous review inventory control problem within a batch arrival queuing approach (MQr/M/1) is developed to find the optimal quantities of maximum inventory. The objective function is to minimise the summation of ordering, holding and shortage costs under warehouse space, service level and expected lost-sales shortage cost constraints from the retailer and warehouse viewpoints. Since the proposed model is NP-hard (non-deterministic polynomial-time hard), an efficient imperialist competitive algorithm (ICA) is proposed to solve the model. To validate the proposed ICA, both a genetic algorithm and a simulated annealing algorithm are utilised. In order to determine the best values of the algorithm parameters that result in a better solution, a fine-tuning procedure is executed. Finally, the performance of the proposed ICA is analysed using some numerical illustrations.

9. Design and Implementation of High-Speed Input-Queued Switches Based on a Fair Scheduling Algorithm

Hu, Qingsheng; Zhao, Hua-An

To increase both the capacity and the processing speed of input-queued (IQ) switches, we proposed a fair scalable scheduling architecture (FSSA). By employing an FSSA comprised of several cascaded sub-schedulers, large-scale high-performance switches or routers can be realized without the capacity limitation of a monolithic device. In this paper, we present a fair scheduling algorithm named FSSA_DI based on an improved FSSA in which a distributed iteration scheme is employed; the scheduler performance can be improved and the processing time reduced as well. Simulation results show that FSSA_DI achieves better performance on average delay and throughput under heavy loads compared to other existing algorithms. Moreover, a practical 64 × 64 FSSA using the FSSA_DI algorithm was implemented on four Xilinx Virtex-4 FPGAs. Measurement results show that the data rate of our solution can reach 800 Mbps, and the tradeoff between performance and hardware complexity has been resolved effectively.

10. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

PubMed

He, Xinhua; Hu, Wenfa

2014-01-01

This paper presents a multiple-rescue model for an emergency supply chain system under uncertainty in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; that the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; and that, depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, with emergency rescue services queuing at multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.

11. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

PubMed Central

He, Xinhua

2014-01-01

This paper presents a multiple-rescue model for an emergency supply chain system under uncertainty in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; that the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; and that, depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, with emergency rescue services queuing at multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

12. Using queuing theory to analyse the government's 4-H completion time target in accident and emergency departments.

PubMed

Mayhew, L; Smith, D

2008-03-01

This paper uses a queuing model to evaluate completion times in Accident and Emergency (A&E) departments in the light of the Government target of completing and discharging 98% of patients inside 4 h. It illustrates how flows through an A&E department can be accurately represented as a queuing process, how the outputs can be used to visualise and interpret the 4-h Government target in a simple way, and how the model can be used to assess the practical achievability of A&E targets in the future. The paper finds that A&E targets have resulted in significant improvements in completion times, addressing a major source of complaint by users of the National Health Service in the U.K. It suggests that whilst some of this improvement is attributable to better management, some is also due to the way some patients in A&E are designated and therefore counted through the system. It finds, for example, that the current target would not have been achievable without some form of patient re-designation or re-labelling taking place. Further, it finds that the current target is so demanding that the integrity of reported performance is open to question. Related incentives and demand management issues resulting from the target are also briefly discussed.
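
The 4-h target can be made concrete with a standard queuing calculation. The abstract does not give the paper's actual model or rates, so the sketch below uses a plain M/M/1 approximation with made-up arrival and service rates: in M/M/1 the total time in system is exponential with rate μ − λ, which yields a closed form for the fraction of patients completed within any deadline.

```python
import math

def completion_within(t_hours, arrivals_per_hour, service_rate_per_hour):
    """P(total time in system <= t) for an M/M/1 queue.

    The M/M/1 sojourn time is exponential with rate (mu - lambda),
    so P(T <= t) = 1 - exp(-(mu - lambda) * t).  Requires lambda < mu.
    """
    lam, mu = arrivals_per_hour, service_rate_per_hour
    if lam >= mu:
        raise ValueError("queue is unstable: lambda must be < mu")
    return 1.0 - math.exp(-(mu - lam) * t_hours)

# Illustrative (hypothetical) figures: 10 patients/hour arriving,
# served at an effective rate of 12 patients/hour.
p = completion_within(4.0, 10.0, 12.0)
```

With these rates p exceeds 0.98, but pushing λ toward μ fattens the tail rapidly, which is why the 98% target becomes so demanding near full utilization.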

13. Human Factors of Queuing: A Library Circulation Model.

ERIC Educational Resources Information Center

Mansfield, Jerry W.

1981-01-01

Classical queuing theories and their accompanying service facilities totally disregard the human factors in the name of efficiency. As library managers we need to be more responsive to human needs in the design of service points and make every effort to minimize queuing and queue frustration. Five references are listed. (Author/RAA)

14. A queuing model for road traffic simulation

SciTech Connect

Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

2015-03-10

We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
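
The stationary distribution of an M/G/c/c state-dependent queue has a well-known product form (the Cheah-Smith formulation), which the sketch below implements for a single road section. The capacity, arrival rate, and the linear speed-congestion model f(n) are all illustrative assumptions, not values from the paper.

```python
import math

def mgcc_state_dependent(lam, t1, c, f):
    """Stationary distribution of an M/G/c/c state-dependent queue.

    lam : arrival rate; t1 : mean traversal time with a single occupant;
    c   : section capacity; f(n) : speed ratio V_n / V_1 with n occupants.
    Product form: P_n = P_0 * (lam*t1)^n / (n! * f(1)*...*f(n)).
    """
    weights = [1.0]
    prod_f = 1.0
    for n in range(1, c + 1):
        prod_f *= f(n)
        weights.append((lam * t1) ** n / (math.factorial(n) * prod_f))
    p0 = 1.0 / sum(weights)
    return [w * p0 for w in weights]

# Hypothetical section: capacity 20 vehicles, speed dropping linearly
# with congestion (V_n / V_1 = (c - n + 1) / c).
c = 20
f = lambda n: (c - n + 1) / c
probs = mgcc_state_dependent(lam=2.0, t1=1.0, c=c, f=f)
blocking = probs[-1]                # P_c: demand refused by the full section
throughput = 2.0 * (1 - blocking)  # accepted flow out of the section
```

Concatenating sections as in the article would couple each section's accepted throughput to the next section's arrival rate.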

15. Development of Markov Chain-Based Queuing Model and Wireless Infrastructure for EV to Smart Meter Communication in V2G

Santoshkumar; Udaykumar, R. Y.

2015-04-01

The electrical vehicles (EVs) can be connected to the grid for power transaction. The vehicle-to-grid (V2G) supports the grid requirements and helps in maintaining the load demands. The grid control center (GCC), aggregator and EV are three key entities in V2G communication. The GCC sends the information about power requirements to the aggregator. The aggregator after receiving the information from the GCC sends the information to the EVs. Based on the information, the interested EV owners participate in power transaction with the grid. The aggregator facilitates the EVs by providing the parking and charging slot. In this paper the queuing model for EVs connected to the grid and development of wireless infrastructure for the EV to Smart Meter communication is proposed. The queuing model is developed and simulated. The path loss models for WiMAX are analyzed and compared. Also, the physical layer of WiMAX protocol is modeled and simulated for the EV to Smart Meter communication in V2G.
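
The abstract does not detail the Markov chain used, but an aggregator lot with a fixed number of charging slots and limited parking is naturally a birth-death chain (an M/M/c/k queue), whose steady state follows directly from detailed balance. The rates and lot sizes below are hypothetical.

```python
def ev_slot_distribution(lam, mu, c, k):
    """Steady state of a birth-death Markov chain for an M/M/c/k EV lot.

    States n = 0..k EVs present; birth rate lam, death rate min(n, c)*mu.
    Detailed balance: pi_n = pi_0 * prod_{i=1..n} lam / (min(i, c) * mu).
    """
    weights = [1.0]
    for n in range(1, k + 1):
        weights.append(weights[-1] * lam / (min(n, c) * mu))
    z = sum(weights)
    return [w / z for w in weights]

# Hypothetical aggregator lot: 4 charging slots, parking for 10 EVs,
# 3 EVs/hour arriving, each charge taking 1 hour on average.
pi = ev_slot_distribution(lam=3.0, mu=1.0, c=4, k=10)
p_blocked = pi[-1]                               # arriving EV finds the lot full
mean_evs = sum(n * p for n, p in enumerate(pi))  # average EVs present
```

The same chain gives the aggregator a quick way to size slots against a target blocking probability before committing to the WiMAX link budget.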

16. Quality and operations of portable X-ray examination procedures in the emergency room: queuing theory at work.

PubMed

Abujudeh, Hani; Vuong, Bill; Baker, Stephen R

2005-07-01

The objective of this study was to evaluate the operation of the portable X-ray machine in relation to examinations ordered by the Emergency Department at the University of Medicine and Dentistry of New Jersey, as well as to identify any bottlenecks hindering the performance of this system. To do so, the activity of the portable X-ray machine was monitored from 8 June 2004 to 24 June 2004 and from 6 July 2004 to 12 July 2004, yielding 11 days of data and 116 individual X-ray examinations. During observation, times were noted at various checkpoints in the procedure. Using the data gathered, the average input, output and processing times and their variances were calculated. In turn, these values were used to calculate the response times for the Ordering Phase (5.502 min), traveling (2.483 min), Examination Phase (4.453 min), returning (3.855 min), Order Processing Phase (2.962 min), and the Development Phase (3.437 min). These phases were combined for a total of 22.721 min from the time the examination was ordered to the time the X-ray films were uploaded to the PACS computer network. Based on these calculations, the Ordering Phase was determined to be the single largest bottleneck in the portable X-ray system, with the Examination Phase the second largest; together they account for a combined 44% of the total response time.
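
The bottleneck claim in this abstract can be reproduced directly from the reported phase times. The snippet below recomputes each phase's share of the reported 22.721 min total and confirms the combined 44% figure.

```python
# Phase response times (minutes) as reported in the abstract.
phases = {
    "ordering": 5.502, "traveling": 2.483, "examination": 4.453,
    "returning": 3.855, "order processing": 2.962, "development": 3.437,
}
total = 22.721  # total response time reported in the study (minutes)

shares = {name: t / total for name, t in phases.items()}
bottleneck = max(shares, key=shares.get)               # the Ordering Phase
combined = shares["ordering"] + shares["examination"]  # about 0.44
```

Note that the phase times as listed sum to 22.692 min rather than 22.721 min, a small rounding discrepancy in the reported figures.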

17. Ambulance deployment with the hypercube queuing model.

PubMed

Larson, R C

1982-01-01

A computer-implemented mathematical model has been developed to assist planners in the spatial deployment and dispatching of ambulances. The model incorporates uncertainties in the arrival times, locations, and service requirements of patients, building on the branch of operations research known as queuing theory. Several system-performance measures are generated by the model, including mean neighborhood-specific response times, mean utilization of each ambulance, and statistical profiles of ambulance response patterns. This model has been implemented by the Department of Health and Hospitals of the City of Boston.

18. Capacity Utilization Study for Aviation Security Cargo Inspection Queuing System

SciTech Connect

Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E; Brumback, Daryl L

2010-01-01

In this paper, we conduct a performance evaluation study of an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and its response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as the number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.
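
The study's full discrete-event model covers many cargo types and inspection stages; the sketch below is a deliberately minimal single-stage version of the same idea: pallets arriving at random to a bank of identical inspection machines, simulated event by event. All rates and counts are hypothetical.

```python
import heapq
import random

def simulate_inspection(lam, mu, machines, n_pallets, seed=1):
    """Minimal discrete-event simulation of an M/M/c inspection station.

    Pallets arrive at rate lam; each of `machines` identical inspection
    machines serves at rate mu.  Returns the mean time in system (wait
    plus inspection) per pallet, assuming FIFO dispatch.
    """
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_pallets):
        t += rng.expovariate(lam)        # exponential interarrival times
        arrivals.append(t)
    free_at = [0.0] * machines           # when each machine next frees up
    heapq.heapify(free_at)
    total_time = 0.0
    for arr in arrivals:
        machine_free = heapq.heappop(free_at)
        start = max(arr, machine_free)   # wait if every machine is busy
        done = start + rng.expovariate(mu)
        heapq.heappush(free_at, done)
        total_time += done - arr
    return total_time / n_pallets

# Hypothetical facility: 8 machines, 5 pallets/hour, 1 inspection/hour each.
mean_sojourn = simulate_inspection(lam=5.0, mu=1.0, machines=8,
                                   n_pallets=20000)
```

Sweeping `machines` or `lam` in this loop is the simulation analogue of the scenario analysis the paper describes, with each configuration's queue statistics read off the run.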

19. Capacity utilization study for aviation security cargo inspection queuing system

Allgood, Glenn O.; Olama, Mohammed M.; Lake, Joe E.; Brumback, Daryl

2010-04-01

In this paper, we conduct a performance evaluation study of an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and its response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as the number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.

20. Queuing Time Prediction Using WiFi Positioning Data in an Indoor Scenario

PubMed Central

Shu, Hua; Song, Ci; Pei, Tao; Xu, Lianming; Ou, Yang; Zhang, Libin; Li, Tao

2016-01-01

Queuing is common in urban public places. Automatically monitoring and predicting queuing time can not only help individuals to reduce their wait time and alleviate anxiety but also help managers to allocate resources more efficiently and enhance their ability to address emergencies. This paper proposes a novel method to estimate and predict queuing time in indoor environments based on WiFi positioning data. First, we use a series of parameters to identify the trajectories that can be used as representatives of queuing time. Next, we divide the day into equal time slices and estimate individuals’ average queuing time during specific time slices. Finally, we build a nonstandard autoregressive (NAR) model trained using the previous day’s WiFi estimation results and actual queuing time to predict the queuing time in the upcoming time slice. A case study comparing two other time series analysis models shows that the NAR model has better precision. Random topological errors caused by the drift phenomenon of WiFi positioning technology (locations determined by a WiFi positioning system may drift accidentally) and systematic topological errors caused by the positioning system are the main factors that affect the estimation precision. Therefore, we optimize the deployment strategy during the positioning system deployment phase and propose a drift ratio parameter pertaining to the trajectory screening phase to alleviate the impact of topological errors and improve estimates. The WiFi positioning data from an eight-day case study conducted at the T3-C entrance of Beijing Capital International Airport show that the mean absolute estimation error is 147 s, which is approximately 26.92% of the actual queuing time. For predictions using the NAR model, the proportion is approximately 27.49%. The theoretical predictions and the empirical case study indicate that the NAR model is an effective method to estimate and predict queuing time in indoor public areas. PMID:27879663

1. Queuing Time Prediction Using WiFi Positioning Data in an Indoor Scenario.

PubMed

Shu, Hua; Song, Ci; Pei, Tao; Xu, Lianming; Ou, Yang; Zhang, Libin; Li, Tao

2016-11-22

Queuing is common in urban public places. Automatically monitoring and predicting queuing time can not only help individuals to reduce their wait time and alleviate anxiety but also help managers to allocate resources more efficiently and enhance their ability to address emergencies. This paper proposes a novel method to estimate and predict queuing time in indoor environments based on WiFi positioning data. First, we use a series of parameters to identify the trajectories that can be used as representatives of queuing time. Next, we divide the day into equal time slices and estimate individuals' average queuing time during specific time slices. Finally, we build a nonstandard autoregressive (NAR) model trained using the previous day's WiFi estimation results and actual queuing time to predict the queuing time in the upcoming time slice. A case study comparing two other time series analysis models shows that the NAR model has better precision. Random topological errors caused by the drift phenomenon of WiFi positioning technology (locations determined by a WiFi positioning system may drift accidentally) and systematic topological errors caused by the positioning system are the main factors that affect the estimation precision. Therefore, we optimize the deployment strategy during the positioning system deployment phase and propose a drift ratio parameter pertaining to the trajectory screening phase to alleviate the impact of topological errors and improve estimates. The WiFi positioning data from an eight-day case study conducted at the T3-C entrance of Beijing Capital International Airport show that the mean absolute estimation error is 147 s, which is approximately 26.92% of the actual queuing time. For predictions using the NAR model, the proportion is approximately 27.49%. The theoretical predictions and the empirical case study indicate that the NAR model is an effective method to estimate and predict queuing time in indoor public areas.

2. Modeling and simulation of M/M/c queuing pharmacy system with adjustable parameters

2016-02-01

This paper studies a discrete event simulation (DES), a computer-based modelling approach that imitates the real system of a pharmacy unit. M/M/c queuing theory is used to model and analyse the characteristics of the queuing system at the pharmacy unit of Hospital Tuanku Fauziah, Kangar in Perlis, Malaysia. The input of this model is based on statistical data collected over 20 working days in June 2014. Currently, patient waiting time at the pharmacy unit is more than 15 minutes. The actual operation of the pharmacy unit is a mixed queuing server with an M/M/2 queuing model, where the pharmacists serve as the servers. The DES approach and ProModel simulation software are used to simulate the queuing model and to propose improvements for the queuing system of this pharmacy. The waiting time for each server is analysed; Counters 3 and 4 have the highest waiting times, at 16.98 and 16.73 minutes respectively. Three scenarios, M/M/3, M/M/4 and M/M/5, are simulated, and the waiting times of the actual and experimental queuing models are compared. The simulation results show that adding a server (pharmacist) reduces patient waiting time considerably: average patient waiting time falls by almost 50% when one pharmacist is added to the counters. However, it is not necessary to fully staff all counters: even though M/M/4 and M/M/5 produce further reductions in patient waiting time, they are ineffective since Counter 5 is rarely used.
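
The effect of adding pharmacists can also be seen analytically, without simulation, from the Erlang C formula for the mean queue wait in an M/M/c system. The arrival and service rates below are illustrative assumptions, not the hospital's measured values; the pattern of steeply diminishing returns as c grows is what matters.

```python
import math

def erlang_c_wq(lam, mu, c):
    """Mean wait in queue E[Wq] for an M/M/c system (Erlang C).

    lam: arrival rate, mu: service rate per server, c: servers.
    Returned in the same time unit as 1/lam and 1/mu.
    """
    a = lam / mu                         # offered load in Erlangs
    if a >= c:
        return float("inf")              # unstable: need more servers
    head = sum(a ** k / math.factorial(k) for k in range(c))
    tail = a ** c / (math.factorial(c) * (1 - a / c))
    p_wait = tail / (head + tail)        # probability an arrival must queue
    return p_wait / (c * mu - lam)

# Hypothetical rates: 30 patients/hour, each pharmacist fills 18/hour.
waits = {c: 60 * erlang_c_wq(30.0, 18.0, c) for c in (2, 3, 4, 5)}  # minutes
```

With these rates the jump from two to three pharmacists removes most of the queue wait, while the fourth and fifth servers shave off only fractions of a minute, mirroring the paper's conclusion that fully staffing every counter is unnecessary.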

3. Application of queuing model in Dubai's busiest megaplex

Bhagchandani, Maneesha; Bajpai, Priti

2013-09-01

This paper provides a study and analysis of the extremely busy booking counters at a megaplex in Dubai using a queuing model and simulation. Dubai is an emirate in the UAE with a multicultural, majority foreign-born population, and cinema is one of its major forms of entertainment: there are more than 13 megaplexes, each with between 3 and 22 screens, showing movies in English, Arabic, Hindi and other languages. It has been observed that during weekends megaplexes attract large crowds, resulting in long queues at the booking counters. One of the busiest megaplexes was selected for the study, and the queuing model was validated against observations of the real situation. The concepts of arrival rate, service rate, utilization rate, waiting time in the system and average number of people in the queue, together with Little's Theorem, the M/M/s queuing model and simulation software, have been used to suggest an empirical solution. The aim of the paper is twofold: to assess the present situation at the megaplex and to give recommendations to optimize the use of the booking counters.
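
Two of the quantities the paper relies on, Little's Theorem and the utilization rate, reduce to one-line calculations once arrival and service rates are measured. The weekend figures below are hypothetical stand-ins for the paper's observed data.

```python
# Hypothetical weekend observations at the booking counters:
lam = 120.0           # customers per hour joining the queue
w_minutes = 6.5       # average time spent queuing and buying a ticket

# Little's law: average number in the system L = lambda * W.
L = lam * (w_minutes / 60.0)     # 13 customers on average

# Utilization of the counter bank: rho = lambda / (s * mu).
mu, s = 25.0, 6                  # per-counter service rate, counters open
rho = lam / (s * mu)             # 0.8: counters busy 80% of the time
```

Recommendations then follow from sweeping `s`: opening a seventh counter drops rho to about 0.69, which shortens queues disproportionately because waits blow up as utilization approaches 1.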

4. Queuing register uses fluid logic elements

NASA Technical Reports Server (NTRS)

1966-01-01

Queuing register (a multistage bit-shifting device) uses a series of pure fluid elements to perform the required logic operations. The register has several stages of three-state pure fluid elements combined with two-input NOR gates.

5. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital

PubMed Central

Luo, Li; Tang, Shijun; Shi, Yingkang; Guo, Huili

2016-01-01

In CT examination, emergency patients (EPs) have the highest priority in the queuing system, so general patients (GPs) have to wait a long time, which leads to low satisfaction among patients overall. The aim of this study is to improve patient satisfaction by designing new queuing strategies for CT examination. We divide the EPs into urgent and emergency types and then design two queuing strategies: in the first, urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model); in the second, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies, and we compare the performance of the different strategies on the basis of the simulation results. The dynamic priority strategy decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting times and increases their satisfaction. PMID:27547237
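
The abstract does not specify the dynamic priority rule, but a common construction is a score that combines clinical urgency with accumulated waiting time, so that long-waiting general patients eventually overtake urgent arrivals. The weights and patient tuples below are hypothetical illustrations of that idea, not the paper's calibrated model.

```python
def next_patient(waiting, now, urgency_weight=1.0, wait_weight=0.2):
    """Pick the next CT patient under a dynamic-priority rule (a sketch).

    Each patient is (arrival_time, urgency), with urgency 2 = emergency,
    1 = urgent, 0 = general.  The effective score grows with waiting
    time, so priority is dynamic rather than fixed.
    """
    def score(patient):
        arrival, urgency = patient
        return urgency_weight * urgency + wait_weight * (now - arrival)
    best = max(waiting, key=score)
    waiting.remove(best)
    return best

# A general patient who has waited 10 time units overtakes an
# urgent patient who just arrived.
queue = [(0.0, 0), (9.0, 1)]
first = next_patient(queue, now=10.0)    # -> (0.0, 0)
```

Under a fixed priority model the urgent patient would always go first; the `wait_weight` knob is what trades EP responsiveness against GP waiting time.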

6. Modeling patient flows using a queuing network with blocking.

PubMed

Koizumi, Naoru; Kuno, Eri; Smith, Tony E

2005-02-01

The downsizing and closing of state mental health institutions in Philadelphia in the 1990's led to the development of a continuum care network of residential-based services. Although the diversity of care settings increased, congestion in facilities caused many patients to unnecessarily spend extra days in intensive facilities. This study applies a queuing network system with blocking to analyze such congestion processes. "Blocking" denotes situations where patients are turned away from accommodations to which they are referred, and are thus forced to remain in their present facilities until space becomes available. Both mathematical and simulation results are presented and compared. Although queuing models have been used in numerous healthcare studies, the inclusion of blocking is still rare. We found that, in Philadelphia, the shortage of a particular type of facilities may have created "upstream blocking". Thus removal of such facility-specific bottlenecks may be the most efficient way to reduce congestion in the system as a whole.

7. Queuing Models of Tertiary Storage

NASA Technical Reports Server (NTRS)

Johnson, Theodore

1996-01-01

Large scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSL's) for near-line access. A characteristic of RSL's is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSL's can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well-defined interface. The source code, accompanying documentation, and also sample JAVA applets are available at: http://www.cis.ufl.edu/ted/

8. Predicting age-related differences in visual information processing using a two-stage queuing model.

PubMed

Ellis, R D; Goldberg, J H; Detweiler, M C

1996-05-01

Recent work on age-related differences in some types of visual information processing has qualitatively stated that younger adults are able to develop parallel processing capability, while older adults remain serial processors. A mathematical model based on queuing theory was used to quantitatively predict and parameterize age-related differences in the perceptual encoding and central decision-making aspects of a multiple-frame search task. Statistical results indicated main effects for frame duration, display load, age group, and session of practice. Comparison of the full model and a restricted model indicated an efficient contribution of the encoding speed parameter. The best-fitting parameter set indicated that (1) younger participants processed task information with a two-channel parallel system, while older participants were serial processors; and (2) perceptual encoding had a large impact on age-related differences in task performance. Results are discussed with implications for human factors design principles.

9. Guanosine tetraphosphate as a global regulator of bacterial RNA synthesis: a model involving RNA polymerase pausing and queuing.

PubMed

Bremer, H; Ehrenberg, M

1995-05-17

A recently reported comparison of stable RNA (rRNA, tRNA) and mRNA synthesis rates in ppGpp-synthesizing and ppGpp-deficient (delta relA delta spoT) bacteria has suggested that ppGpp inhibits transcription initiation from stable RNA promoters, as well as synthesis of (bulk) mRNA. Inhibition of stable RNA synthesis occurs mainly during slow growth of bacteria when cytoplasmic levels of ppGpp are high. In contrast, inhibition of mRNA occurs mainly during fast growth when ppGpp levels are low, and it is associated with a partial inactivation of RNA polymerase. To explain these observations it has been proposed that ppGpp causes transcriptional pausing and queuing during the synthesis of mRNA. Polymerase queuing requires high rates of transcription initiation in addition to polymerase pausing, and therefore high concentrations of free RNA polymerase. These conditions are found in fast growing bacteria. Furthermore, the RNA polymerase queues lead to a promoter blocking when RNA polymerase molecules stack up from the pause site back to the (mRNA) promoter. This occurs most frequently at pause sites close to the promoter. Blocking of mRNA promoters diverts RNA polymerase to stable RNA promoters. In this manner ppGpp could indirectly stimulate synthesis of stable RNA at high growth rates. In the present work a mathematical analysis, based on the theory of queuing, is presented and applied to the global control of transcription in bacteria. This model predicts the in vivo distribution of RNA polymerase over stable RNA and mRNA genes for both ppGpp-synthesizing and ppGpp-deficient bacteria in response to different environmental conditions. It also shows how small changes in basal ppGpp concentrations can produce large changes in the rate of stable RNA synthesis.

ERIC Educational Resources Information Center

Lawrence, David

2002-01-01

Discusses the Americans with Disabilities (ADA) and Uniform Federal Accessibility Standards (UFAS) regulations regarding public facilities' crowd control stanchions and queuing systems. The major elements are protruding objects and wheelchair accessibility. Describes how to maintain compliance with the regulations and offers a list of additional…

11. Priority Queuing On A Parallel Data Bus

NASA Technical Reports Server (NTRS)

Wallis, D. E.

1985-01-01

Queuing strategy for communications along a shared data bus minimizes the number of data lines while always assuring that the user with the highest priority is given access to the bus. The new system handles up to 32 user demands on 17 data lines that previously serviced only 17 demands.

12. Modeling Patient Flows Using a Queuing Network with Blocking

PubMed Central

KUNO, ERI; SMITH, TONY E.

2015-01-01

The downsizing and closing of state mental health institutions in Philadelphia in the 1990’s led to the development of a continuum care network of residential-based services. Although the diversity of care settings increased, congestion in facilities caused many patients to unnecessarily spend extra days in intensive facilities. This study applies a queuing network system with blocking to analyze such congestion processes. “Blocking” denotes situations where patients are turned away from accommodations to which they are referred, and are thus forced to remain in their present facilities until space becomes available. Both mathematical and simulation results are presented and compared. Although queuing models have been used in numerous healthcare studies, the inclusion of blocking is still rare. We found that, in Philadelphia, the shortage of a particular type of facilities may have created “upstream blocking”. Thus removal of such facility-specific bottlenecks may be the most efficient way to reduce congestion in the system as a whole. PMID:15782512

13. Improving hospital bed occupancy and resource utilization through queuing modeling and evolutionary computation.

PubMed

Belciug, Smaranda; Gorunescu, Florin

2015-02-01

Scarce healthcare resources require carefully made policies ensuring optimal bed allocation, quality healthcare service, and adequate financial support. This paper proposes a complex analysis of resource allocation in a hospital department by integrating in the same framework a queuing system, a compartmental model, and an evolutionary-based optimization. The queuing system shapes the flow of patients through the hospital, the compartmental model offers a feasible structure of the hospital department in accordance with the queuing characteristics, and the evolutionary paradigm provides the means to optimize bed-occupancy management and resource utilization using a genetic algorithm approach. The paper also presents a "what-if analysis," providing a flexible tool to explore the effects on the outcomes of the queuing system and resource utilization of systematic changes in the input parameters. The methodology is illustrated using a simulation based on real data collected from a geriatric department of a hospital in London, UK. In addition, the paper explores the possibility of adapting the methodology to different medical departments (surgery, stroke, and mental illness). Moreover, the paper also focuses on the practical use of the model from the healthcare point of view, by presenting a simulated application.

14. Theory-Based Stakeholder Evaluation

ERIC Educational Resources Information Center

Hansen, Morten Balle; Vedung, Evert

2010-01-01

This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

15. Some queuing network models of computer systems

NASA Technical Reports Server (NTRS)

Herndon, E. S.

1980-01-01

Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
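
The abstract's row-by-row processing of the G and H matrices refers to a normalization-constant (convolution-style) solution of a closed queuing network. A related classic, exact mean value analysis (MVA), computes the same throughput and queue lengths recursively without forming those matrices, and fits in a few lines. The service demands and terminal count below are hypothetical.

```python
def mva(demands, n_customers, think_time=0.0):
    """Exact Mean Value Analysis for a closed queuing network.

    demands    : service demand D_k (visits * service time) per center
    think_time : terminal think time Z (0 for a pure batch workload)
    Returns (system throughput, per-center mean queue lengths).
    """
    q = [0.0] * len(demands)             # Q_k(0) = 0
    x = 0.0
    for n in range(1, n_customers + 1):
        # Residence time at each center: R_k(n) = D_k * (1 + Q_k(n-1)).
        r = [d * (1 + qk) for d, qk in zip(demands, q)]
        x = n / (think_time + sum(r))    # throughput X(n)
        q = [x * rk for rk in r]         # queue lengths via Little's law
    return x, q

# Hypothetical system: CPU and two disks (demands in seconds per job),
# driven by 20 interactive terminals with 1 s think time.
throughput, queues = mva([0.05, 0.08, 0.04], n_customers=20, think_time=1.0)
```

The recursion also makes the bottleneck bound visible: throughput can never exceed 1/max(D_k), here 12.5 jobs/s at the 0.08 s disk, no matter how many terminals are added.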

16. Density profiles of the exclusive queuing process

2012-12-01

The exclusive queuing process (EQP) incorporates the exclusion principle into classic queuing models. It is characterized by, in addition to the entrance probability α and exit probability β, a third parameter: the hopping probability p. The EQP can be interpreted as an exclusion process of variable system length. Its phase diagram in the parameter space (α,β) is divided into a convergent phase and a divergent phase by a critical line which consists of a curved part and a straight part. Here we extend previous studies of this phase diagram. We identify subphases in the divergent phase, which can be distinguished by means of the shape of the density profile, and determine the velocity of the system length growth. This is done for EQPs with different update rules (parallel, backward sequential and continuous time). We also investigate the dynamics of the system length and the number of customers on the critical line. They are diffusive or subdiffusive with non-universal exponents that also depend on the update rules.

17. Queuing network approach for building evacuation planning

Ishak, Nurhanis; Khalid, Ruzelan; Baten, Md. Azizul; Nawawi, Mohd. Kamal Mohd.

2014-12-01

The complex behavior of pedestrians in a limited space layout can explicitly be modeled using an M/G/C/C state-dependent queuing network. This paper implements the approach to study pedestrian flows through various corridors in a topological network. The best arrival rates and their impacts on the corridors' performance, in terms of throughput, blocking probability, expected number of occupants in the system and expected travel time, were first measured using the M/G/C/C analytical model. These best arrival rates were then fed to a Network Flow Programming model to find the arrival rates to source corridors and the routes that optimize the network's total throughput. The analytical results were then validated using a simulation model. The results of this study can be used to support current Standard Operating Procedures (SOP) to evacuate people efficiently and safely in emergency cases.
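
For reference, the stationary distribution of an M/G/C/C state-dependent queue is commonly written as P_n ∝ (λ E[T1])^n / (n! f(1)f(2)...f(n)), where f(i) is the walking-speed ratio with i occupants and T1 the solo traversal time. A hedged sketch of the blocking probability follows; the parameter values are hypothetical, and with f ≡ 1 the expression reduces to the Erlang loss formula, which the test below uses as a sanity check.

```python
import math

def mgcc_blocking(lam, t1, capacity, speed_ratio):
    """State-dependent M/G/C/C sketch (Yuhaski-Smith form as commonly stated):
    P_n proportional to (lam*t1)**n / (n! * prod_{i<=n} f(i)), where
    speed_ratio(i) = V_i / V_1. Returns the blocking probability P_C."""
    weights = [1.0]
    prod_f = 1.0
    for n in range(1, capacity + 1):
        prod_f *= speed_ratio(n)
        weights.append((lam * t1) ** n / (math.factorial(n) * prod_f))
    z = sum(weights)
    return weights[-1] / z
```

With a congestion-sensitive f, the same weights also give throughput λ(1 − P_C) and, via Little's law, expected occupancy and travel time, i.e. the performance measures used in the paper.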

18. NQS - NETWORK QUEUING SYSTEM, VERSION 2.0 (UNIX VERSION)

NASA Technical Reports Server (NTRS)

Walter, H.

1994-01-01

The Network Queuing System, NQS, is a versatile batch and device queuing facility for a single Unix computer or a group of networked computers. With the Unix operating system as a common interface, the user can invoke the NQS collection of user-space programs to move batch and device jobs freely around the different computer hardware tied into the network. NQS provides facilities for remote queuing, request routing, remote status, queue status controls, batch request resource quota limits, and remote output return. This program was developed as part of an effort aimed at tying together diverse UNIX based machines into NASA's Numerical Aerodynamic Simulator Processing System Network. This revision of NQS allows for creating, deleting, adding and setting of complexes that aid in limiting the number of requests to be handled at one time. It also has improved device-oriented queues along with some revision of the displays. NQS was designed to meet the following goals: 1) Provide for the full support of both batch and device requests. 2) Support all of the resource quotas enforceable by the underlying UNIX kernel implementation that are relevant to any particular batch request and its corresponding batch queue. 3) Support remote queuing and routing of batch and device requests throughout the NQS network. 4) Support queue access restrictions through user and group access lists for all queues. 5) Enable networked output return of both output and error files to possibly remote machines. 6) Allow mapping of accounts across machine boundaries. 7) Provide friendly configuration and modification mechanisms for each installation. 8) Support status operations across the network, without requiring a user to log in on remote target machines. 9) Provide for file staging or copying of files for movement to the actual execution machine. To support batch and device requests, NQS v.2 implements three queue types--batch, device and pipe. Batch queues hold and prioritize batch requests

19. Modeling and simulation of queuing system for customer service improvement: A case study

Xian, Tan Chai; Hong, Chai Weng; Hawari, Nurul Nazihah

2016-10-01

This study aims to develop a queuing model of UniMall using a discrete-event simulation approach to analyze the service performance that affects customer satisfaction. The performance measures considered in this model include the average time in system, the total number of students served, the number of students in the waiting queue, the waiting time in queue, and the maximum buffer length. ARENA simulation software is used to develop the simulation model and to analyze its output. Based on this analysis, it is recommended that the management of UniMall consider introducing shifts and adding another payment counter in the morning.
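
The ARENA model itself is not reproduced here, but the kind of measure it reports, such as the average time in system, can be sketched for a single-server Markovian queue with the Lindley recursion. The rates below are illustrative assumptions, not the study's data.

```python
import random

def mm1_avg_time_in_system(lam, mu, n_customers, seed=1):
    """Lindley-recursion sketch of an M/M/1 queue: estimates the average
    time in system over n_customers. W_{n+1} = max(0, W_n + S_n - A_{n+1})."""
    rng = random.Random(seed)
    wait = 0.0      # waiting time (excluding service) of the current customer
    total = 0.0
    for _ in range(n_customers):
        service = rng.expovariate(mu)
        total += wait + service
        interarrival = rng.expovariate(lam)
        wait = max(0.0, wait + service - interarrival)
    return total / n_customers
```

For λ = 0.5 and μ = 1 the estimate should approach the analytic M/M/1 value 1/(μ − λ) = 2, a useful cross-check before trusting the simulation on richer configurations.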

20. A queueing theory based model for business continuity in hospitals.

PubMed

Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

2013-01-01

Clinical activities can be seen as the result of a precise and defined succession of events, where every phase is characterized by a waiting time that includes the working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough; a risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. This paper reports a case study on the application of the proposed integrated model, combining a risk analysis approach with a queuing theory model, to define the number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals.
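
The paper's integrated risk model is not reproduced here, but the queueing-theory half of such a sizing exercise can be sketched with the Erlang C (M/M/c) formula: find the smallest device count that keeps the probability of a clinical request having to wait below a target. The rates and the target below are hypothetical.

```python
import math

def erlang_c(c, a):
    """Erlang C: probability an arriving request waits in an M/M/c queue
    with offered load a = lam/mu (requires c > a for stability)."""
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / math.factorial(c) * c / (c - a)
    return top / (s + top)

def min_devices(lam, mu, max_wait_prob):
    """Smallest device count keeping the waiting probability below target.
    A hypothetical sizing rule, not the paper's integrated risk model."""
    a = lam / mu
    c = math.ceil(a)
    if c <= a:           # need strictly more servers than offered load
        c += 1
    while erlang_c(c, a) > max_wait_prob:
        c += 1
    return c
```

For example, with demand λ = 4 requests per unit time, service rate μ = 1 per device, and a 20% waiting-probability target, the rule returns 7 devices.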

1. An application of a queuing model for sea states

Loffredo, L.; Monbaliu, J.; Anderson, C.

2012-04-01

Unimodal approaches in design practice have shown inconsistencies in terms of directionality and limitations for accurate sea state description. Spectral multimodality needs to be included in the description of the wave climate: it can provide information about the coexistence of different wave systems originating from different meteorological events, such as locally generated wind waves and swell systems from distant storms. A 20-year dataset (1989-2008) for a location in the North Sea (K13, 53.2°N 3.2°E) has been retrieved from the ECMWF ERA-Interim re-analysis data archive, providing a consistent and homogeneous dataset. The work focuses on the joint and conditional probability distributions of wind sea and swell systems. For marine operations and design applications, critical combinations of wave systems may exist. We define a critical sea state on the basis of a set of thresholds, which need not be extreme; the emphasis is on dangerous combinations of different wave systems for certain operations (e.g., small-vessel navigation, dredging). The distribution of non-operability windows is described by a point process model with random and independent events, whose occurrences and lengths can be described only probabilistically. These characteristics allow the emerging patterns to be treated as part of a queuing system. According to this theory, generally adopted for several applications including traffic flows and waiting lines, the input process describes the sequence of requests for a service, and the service mechanism the length of time that these requests will occupy the facilities. For weather-driven processes at sea, an alternating renewal process appears to be a suitable model. It consists of a sequence of critical events (periods of inoperability), each of random duration, separated by calms, also of random duration. Inoperability periods and calms are assumed independent. In this model it is not possible for more than one critical
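
The alternating renewal structure can be sketched directly: with independent calm and inoperability durations (taken exponential here purely for illustration; the model only requires independence), the long-run operable fraction tends to E[calm] / (E[calm] + E[inoperability]).

```python
import random

def operability_fraction(mean_calm, mean_storm, n_cycles, seed=2):
    """Alternating renewal sketch: n_cycles i.i.d. calm periods alternate
    with n_cycles inoperability periods. Returns the fraction of total
    time the facility is operable."""
    rng = random.Random(seed)
    calm = sum(rng.expovariate(1 / mean_calm) for _ in range(n_cycles))
    storm = sum(rng.expovariate(1 / mean_storm) for _ in range(n_cycles))
    return calm / (calm + storm)
```

With hypothetical means of 30 days of calm against 10 days of inoperability, the fraction converges to 0.75, matching the renewal-reward ratio.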

2. Time-varying priority queuing models for human dynamics

Jo, Hang-Hyun; Pan, Raj Kumar; Kaski, Kimmo

2012-06-01

Queuing models provide insight into the temporal inhomogeneity of human dynamics, characterized by the broad distribution of waiting times of individuals performing tasks. We theoretically study the queuing model of an agent trying to execute a task of interest, the priority of which may vary with time due to the agent's “state of mind.” However, its execution is disrupted by other tasks of random priorities. By considering the priority of the task of interest either decreasing or increasing algebraically in time, we analytically obtain and numerically confirm the bimodal and unimodal waiting time distributions with power-law decaying tails, respectively. These results are also compared to the updating time distribution of papers in arXiv.org and the processing time distribution of papers in Physical Review journals. Our analysis helps to understand human task execution in a more realistic scenario.

3. Using multi-class queuing network to solve performance models of e-business sites.

PubMed

Zheng, Xiao-ying; Chen, De-ren

2004-01-01

Due to e-business's variety of customers, with different navigational patterns and demands, a multi-class queuing network is a natural performance model for it. Open multi-class queuing network (QN) models are based on the assumption that no service center is saturated as a result of the combined loads of all the classes. Several formulas are used to calculate performance measures, including throughput, residence time, queue length, response time and the average number of requests. The solution technique for closed multi-class QN models is an approximate mean value analysis (MVA) algorithm based on three key equations, because the exact algorithm has huge time and space requirements. Since mixed multi-class QN models include some open and some closed classes, the open classes should be eliminated to create a closed multi-class QN so that the closed-model algorithm can be applied. Corresponding examples are given to show how to apply the algorithms discussed in this article. These examples indicate that the multi-class QN is a reasonably accurate model of e-business and can be solved efficiently.
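
The three key MVA equations (residence time, throughput via Little's law applied to the whole network, and per-center queue length) can be sketched in their exact single-class form; the approximate multi-class version iterates the same structure per class. This is a textbook sketch, not the article's code.

```python
def mva(service_demands, n_customers):
    """Exact single-class Mean Value Analysis for a closed queueing network
    of queueing centers. service_demands[k] = visit ratio x mean service
    time at center k. Returns (throughput, per-center queue lengths)."""
    q = [0.0] * len(service_demands)
    x = 0.0
    for n in range(1, n_customers + 1):
        # residence time: arriving customer sees the (n-1)-customer queue
        r = [d * (1 + q[k]) for k, d in enumerate(service_demands)]
        x = n / sum(r)                 # throughput (Little's law, network)
        q = [x * rk for rk in r]       # queue length (Little's law, center)
    return x, q
```

For two balanced centers with demand 1.0 each and 2 customers, the exact throughput is 2/3 and each center holds one customer on average.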

4. Modified weighted fair queuing for packet scheduling in mobile WiMAX networks

Satrya, Gandeva B.; Brotoharsono, Tri

2013-03-01

The increase in user mobility and the need for data access anytime have also increased interest in broadband wireless access (BWA). The best available quality of experience is assured for users of IEEE 802.16e-based mobile data services. The main problem in assuring a high QoS value is how to allocate available resources among users in order to meet QoS requirements for criteria such as delay, throughput, packet loss and fairness. The IEEE standards specify no particular scheduling mechanism, leaving it open for implementer differentiation. IEEE 802.16 defines five QoS service classes: Unsolicited Grant Service (UGS), Extended Real-Time Polling Service (ertPS), Real-Time Polling Service (rtPS), Non-Real-Time Polling Service (nrtPS) and Best Effort (BE). Each class has different QoS parameter requirements for throughput and delay/jitter constraints. This paper proposes a Modified Weighted Fair Queuing (MWFQ) scheduling scenario based on Weighted Round Robin (WRR) and Weighted Fair Queuing (WFQ). The performance of MWFQ was assessed using the five QoS criteria above. The simulation shows that using the concept of total packet size calculation improves the network's performance.
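
The WFQ half of such a scheduler can be sketched via virtual finish times, F = F_prev + size/weight, under the simplifying assumption that all flows are backlogged from time zero. Flow names, sizes and weights below are hypothetical; this is not the paper's MWFQ algorithm itself.

```python
import heapq

def wfq_order(flows, weights):
    """Weighted Fair Queuing sketch: serve packets in order of virtual
    finish time F = F_prev_of_flow + size/weight (all flows backlogged).
    flows: dict flow_id -> list of packet sizes.
    Returns the service order as (flow_id, packet_index) pairs."""
    heap = []
    finish = {f: 0.0 for f in flows}
    pending = {f: list(enumerate(sizes)) for f, sizes in flows.items()}
    for f in flows:                       # seed heap with each head packet
        if pending[f]:
            i, size = pending[f].pop(0)
            finish[f] += size / weights[f]
            heapq.heappush(heap, (finish[f], f, i))
    order = []
    while heap:
        _, f, i = heapq.heappop(heap)
        order.append((f, i))
        if pending[f]:                    # admit the flow's next packet
            j, size = pending[f].pop(0)
            finish[f] += size / weights[f]
            heapq.heappush(heap, (finish[f], f, j))
    return order
```

With equal packet sizes and flow "a" weighted 2:1 over flow "b", both of a's packets finish (in virtual time) no later than b's, so a is served first.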

5. NAS Requirements Checklist for Job Queuing/Scheduling Software

NASA Technical Reports Server (NTRS)

Jones, James Patton

1996-01-01

The increasing reliability of parallel systems and clusters of computers has resulted in these systems becoming more attractive for true production workloads. Today, the primary obstacle to production use of clusters of computers is the lack of a functional and robust Job Management System for parallel applications. This document provides a checklist of NAS requirements for job queuing and scheduling in order to make most efficient use of parallel systems and clusters for parallel applications. Future requirements are also identified to assist software vendors with design planning.

6. Based on regular expression matching of evaluation of the task performance in WSN: a queue theory approach.

PubMed

Wang, Jie; Cui, Kai; Zhou, Kuanjiu; Yu, Yanshuo

2014-01-01

Due to the limited resources of wireless sensor networks, the low efficiency of real-time communication scheduling, poor safety, and other defects, a queuing performance evaluation approach based on regular expression matching is proposed. The method consists of a matching preprocessing phase, a validation phase, and a queuing-model performance evaluation phase. First, the subset of related sequences is generated in the preprocessing phase, guiding distributed matching in the validation phase. Second, in the validation phase, the feature subsets are clustered, and the compressed matching table makes distributed parallel matching more convenient. Finally, based on the queuing model, the dynamic task-scheduling performance of the sensor network is evaluated. Experiments show that the approach ensures accurate matching with a computational efficiency of more than 70%; it not only effectively detects data packets and access control, but also uses the queuing method to determine task-scheduling parameters in wireless sensor networks. The method has good applicability for medium- and large-scale distributed wireless nodes.

7. A message-queuing framework for STAR's online monitoring and metadata collection

Arkhipkin, D.; Lauret, J.; Betts, W.

2011-12-01

We report our experience on migrating STAR's Online Services (Run Control System, Data Acquisition System, Slow Control System and Subsystem Monitoring) from direct read/write database accesses to a modern non-blocking message-oriented infrastructure. Based on the Advanced Message Queuing Protocol (AMQP) standard, this novel approach does not specify the message data structure, allowing great flexibility in its use. After careful consideration, we chose Google Protocol Buffers as our primary (de)serialization format for structured data exchange. This migration allows us to reduce the overall system complexity and greatly improve the reliability of the metadata collection and the performance of our online services in general. We present this new framework through an overview of its software architecture, providing details about our staged and non-disruptive migration process as well as the implementation of pluggable components that allow future improvements without compromising the stability and availability of services.

9. Design and development of cell queuing, processing, and scheduling modules for the iPOINT input-buffered ATM testbed

Duan, Haoran

1997-12-01

This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to building high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other less-loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Using Field Programmable Gate Array (FPGA) devices, SRAM modules, and integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves the output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a

10. Vocation in theology-based nursing theories.

PubMed

Lundmark, Mikael

2007-11-01

By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.

11. Computer-based theory of strategies

SciTech Connect

Findler, N.V.

1983-01-01

Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, utility theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. The author also discusses three long-term projects which aim at automatically analyzing and synthesizing strategies. 27 references.

12. Queuing model of a traffic bottleneck with bimodal arrival rate

Woelki, Marko

2016-06-01

This paper revisits the problem of tuning the density in a traffic bottleneck by reduction of the arrival rate when the queue length exceeds a certain threshold, studied recently for variants of totally asymmetric simple exclusion process (TASEP) and Burgers equation. In the present approach, a simple finite queuing system is considered and its contrasting “phase diagram” is derived. One can observe one jammed region, one low-density region and one region where the queue length is equilibrated around the threshold. Despite the simplicity of the model the physics is in accordance with the previous approach: The density is tuned at the threshold if the exit rate lies in between the two arrival rates.
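
A minimal birth-death version of the threshold mechanism (with hypothetical rates) makes the equilibration visible: when the exit rate lies between the two arrival rates, the stationary distribution peaks at the threshold.

```python
def stationary_queue(lam_hi, lam_lo, mu, threshold, capacity):
    """Birth-death sketch of the bottleneck: the arrival rate drops from
    lam_hi to lam_lo once the queue length reaches `threshold`. Returns the
    stationary distribution over queue lengths 0..capacity (an illustrative
    parameterization, not the paper's exact model)."""
    pi = [1.0]                    # unnormalized detailed-balance weights
    for n in range(capacity):
        lam = lam_hi if n < threshold else lam_lo
        pi.append(pi[-1] * lam / mu)
    z = sum(pi)
    return [p / z for p in pi]
```

With lam_hi = 2, lam_lo = 0.5 and mu = 1, the weights grow geometrically up to the threshold and decay beyond it, so the mode sits exactly at the threshold, mirroring the equilibrated region of the phase diagram.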

13. Evaluation of Job Queuing/Scheduling Software: Phase I Report

NASA Technical Reports Server (NTRS)

Jones, James Patton

1996-01-01

The recent proliferation of high-performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, the Numerical Aerodynamic Simulation (NAS) supercomputer facility compiled a requirements checklist for job queuing/scheduling software. Next, NAS began an evaluation of the leading job management system (JMS) software packages against the checklist. This report describes the three-phase evaluation process and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still insufficient, even in the leading JMSs. However, by ranking each JMS evaluated against the requirements, we provide data that will be useful to other sites in selecting a JMS.

14. Second Evaluation of Job Queuing/Scheduling Software. Phase 1

NASA Technical Reports Server (NTRS)

Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)

1997-01-01

The recent proliferation of high-performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.

15. Spectrally queued feature selection for robotic visual odometry

Pirozzo, David M.; Frederick, Philip A.; Hunt, Shawn; Theisen, Bernard; Del Rose, Mike

2011-01-01

Over the last two decades, research in Unmanned Vehicles (UV) has rapidly progressed and become more influenced by the field of biological sciences. Researchers have been investigating mechanical aspects of varying species to improve UV air and ground intrinsic mobility, exploring the computational aspects of the brain for the development of pattern recognition and decision algorithms, and exploring the perception capabilities of numerous animals and insects. This paper describes a 3-month exploratory applied research effort performed at the US Army Research, Development and Engineering Command's (RDECOM) Tank Automotive Research, Development and Engineering Center (TARDEC) in the area of biologically inspired, spectrally augmented feature selection for robotic visual odometry. The motivation for this applied research was to develop a feasibility analysis of multi-spectrally queued feature selection, with improved temporal stability, for the purposes of visual odometry. The intended application is future semi-autonomous Unmanned Ground Vehicle (UGV) control, as the richness of data sets required to enable human-like behavior in these systems has yet to be defined.

16. Basing quantum theory on information processing

Barnum, Howard

2008-03-01

I consider information-based derivations of the quantum formalism, in a framework encompassing quantum and classical theory and a broad spectrum of theories serving as foils to them. The most ambitious hope for such a derivation is a role analogous to Einstein's development of the dynamics and kinetics of macroscopic bodies, and later of their gravitational interactions, on the basis of simple principles with clear operational meanings and experimental consequences. Short of this, it could still provide a principled understanding of the features of quantum mechanics that account for its greater-than-classical information-processing power, helping guide the search for new quantum algorithms and protocols. I summarize the convex operational framework for theories, and discuss information-processing in theories therein. Results include the fact that information that can be obtained without disturbance is inherently classical, generalized no-cloning and no-broadcasting theorems, exponentially secure bit commitment in all non-classical theories without entanglement, properties of theories that allow teleportation, and properties of theories that allow ``remote steering'' of ensembles using entanglement. Joint work with collaborators including Jonathan Barrett, Matthew Leifer, Alexander Wilce, Oscar Dahlsten, and Ben Toner.

17. Jigsaw Cooperative Learning: Acid-Base Theories

ERIC Educational Resources Information Center

Tarhan, Leman; Sesen, Burcin Acar

2012-01-01

This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…

18. A hybrid queuing strategy for network traffic on scale-free networks

Cai, Kai-Quan; Yu, Lu; Zhu, Yan-Bo

2017-02-01

In this paper, a hybrid queuing strategy (HQS) is proposed for a traffic dynamics model on scale-free networks, where the delivery priority of packets in the queue is related to their distance to the destination and the queue length of the next hop. We compare the performance of the proposed HQS with that of the traditional first-in-first-out (FIFO) queuing strategy and the shortest-remaining-path-first (SRPF) queuing strategy proposed by Du et al. It is observed that network traffic efficiency under HQS, with a suitable value of the parameter h, can be further improved in the congested state. Our work provides new insights into the understanding of networked traffic systems.

19. Modelling pedestrian travel time and the design of facilities: a queuing approach.

PubMed

Rahman, Khalidur; Ghani, Noraida Abdul; Kamil, Anton Abdulbasah; Mustafa, Adli; Kabir Chowdhury, Md Ahmed

2013-01-01

Pedestrian movements are the consequence of several complex and stochastic factors. Modelling pedestrian movements and the ability to predict travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and level of service into pedestrian movement models. In this paper, a queuing-based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service that can be provided to pedestrians. It also has the ability to provide clear suggestions on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility enables comfortable and convenient pedestrian movements. A sensitivity analysis is also performed to see the impact of some crucial parameters of the developed model on the performance of pedestrian facilities.

20. Evacuation time estimate for total pedestrian evacuation using a queuing network model and volunteered geographic information

Kunwar, Bharat; Simini, Filippo; Johansson, Anders

2016-02-01

Estimating city evacuation time is a nontrivial problem due to the interaction between thousands of individual agents, giving rise to various collective phenomena, such as bottleneck formation, intermittent flow, and stop-and-go waves. We present a mean field approach to draw relationships between road network spatial attributes, the number of evacuees, and the resultant evacuation time estimate (ETE). Using volunteered geographic information, we divide 50 United Kingdom cities into a total of 704 catchment areas (CAs) which we define as an area where all agents share the same nearest exit node. 90% of the agents are within ≈6,847 m of CA exit nodes with ≈13,778 agents/CA. We establish a characteristic flow rate from catchment area attributes (population, distance to exit node, and exit node width) and a mean flow rate in a free-flow regime by simulating total evacuations using an agent based "queuing network" model. We use these variables to determine a relationship between catchment area attributes and resultant ETEs. This relationship could enable emergency planners to make a rapid appraisal of evacuation strategies and help support decisions in the run up to a crisis.
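
In the spirit of this mean-field relationship, a catchment-area ETE can be bounded by the larger of the free-flow travel time and the exit-discharge time. The flow and walking-speed constants below are illustrative assumptions, not the paper's fitted values.

```python
def evacuation_time_estimate(population, mean_distance, exit_width,
                             flow_per_metre=1.2, walk_speed=1.4):
    """Mean-field sketch: a catchment area clears in roughly the larger of
    (a) the free-flow travel time to the exit node and (b) the time for the
    population to pass through the exit at its capacity flow.
    flow_per_metre [agents/m/s] and walk_speed [m/s] are assumed constants."""
    travel = mean_distance / walk_speed            # free-flow regime
    discharge = population / (flow_per_metre * exit_width)  # exit-bound
    return max(travel, discharge)
```

Using the paper's typical CA scale (≈13,778 agents within ≈6,847 m of a hypothetical 5 m exit), travel time dominates discharge time, i.e. the CA is in the free-flow regime under these assumed constants.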

1. Impact of Mandatory HIV Screening in the Emergency Department: A Queuing Study.

PubMed

Liu, Nan; Stone, Patricia W; Schnall, Rebecca

2016-04-01

To improve HIV screening rates, New York State in 2010 mandated that all persons 13-64 years of age receiving health care services, including care in emergency departments (EDs), be offered HIV testing. Little attention has been paid to the effect of screening on patient flow. Time-stamped ED visit data from patients eligible for HIV screening, 7,844 of whom were seen by providers and 767 of whom left before being seen, were retrieved from electronic health records in one adult ED. During day shifts, 10% of patients left without being seen; during evening shifts, 5% left without being seen. All patients seen by providers were offered testing, and 6% were tested for HIV. Queuing models were developed to evaluate the effect of HIV screening on ED length of stay, patient waiting time, and the rate of leaving without being seen. A base case analysis was conducted using actual testing rates, and sensitivity analyses were conducted to evaluate the impact of increasing the testing rate. The ED length of stay of patients who received HIV tests was 24 minutes longer on day shifts and 104 minutes longer on evening shifts than that of patients not tested for HIV. Increases in the HIV testing rate were estimated to increase waiting time for all patients, including those who left without being seen. Our simulation suggested that incorporating HIV testing into ED patient visits not only adds to practitioner workload but also increases patient waiting time significantly during busy shifts, which may increase the rate of leaving without being seen.
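
The mechanism, where a small tested fraction lengthens everyone's wait, can be sketched with the Pollaczek-Khinchine mean-wait formula for an M/G/1 queue. The arrival rate and service-time mix below are hypothetical, not the study's data.

```python
def pk_mean_wait(lam, service_mix):
    """Pollaczek-Khinchine sketch (M/G/1): mean wait in queue given a
    discrete service-time mix [(duration, probability), ...].
    W_q = lam * E[S^2] / (2 * (1 - rho)), rho = lam * E[S]."""
    es = sum(s * p for s, p in service_mix)
    es2 = sum(s * s * p for s, p in service_mix)
    rho = lam * es
    if rho >= 1:
        raise ValueError("unstable: offered load >= 1")
    return lam * es2 / (2 * (1 - rho))
```

For example, at an assumed load of lam = 0.9 with unit mean service, letting a hypothetical 6% of visits take 40% longer raises both E[S] and E[S^2], so the mean wait for all patients rises, which is the effect the queuing study quantifies.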

2. Organic magnetoresistance based on hopping theory

Yang, Fu-Jiang; Xie, Shi-Jie

2014-09-01

For the organic magnetoresistance (OMAR) effect, we suggest a spin-related hopping of carriers (polarons) based on Marcus theory. The mobility of polarons is calculated with the master equation (ME) and then the magnetoresistance (MR) is obtained. The theoretical results are consistent with the experimental observation. Especially, the sign inversion of the MR under different driving bias voltages found in the experiment is predicted. Besides, the effects of molecule disorder, hyperfine interaction (HFI), polaron localization, and temperature on the MR are investigated.

3. The Scope of Usage-Based Theory

PubMed Central

Ibbotson, Paul

2013-01-01

Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the “cognitive commitment” of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works. PMID:23658552

4. Theoretical description of metabolism using queueing theory.

PubMed

Evstigneev, Vladyslav P; Holyavka, Marina G; Khrapatiy, Sergii V; Evstigneev, Maxim P

2014-09-01

A theoretical description of the process of metabolism has been developed on the basis of the Pachinko model (see Nicholson and Wilson in Nat Rev Drug Discov 2:668-676, 2003) and queueing theory. The suggested approach relies on the probabilistic nature of metabolic events and a Poisson distribution for the incoming flow of substrate molecules. The main focus of the work is the output flow of metabolites, i.e., the effectiveness of the metabolic process. The two simplest models have been analyzed: short- and long-lived complexes of the source molecules with a metabolizing point (Hole), without queuing. It has been concluded that the approach based on queueing theory enables a very broad range of metabolic events to be described theoretically from a single probabilistic point of view.
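The short-lived-complex case without queuing behaves like a textbook M/M/1/1 loss system: a substrate molecule arriving while the Hole is busy passes through unmetabolized, so the metabolized fraction is 1/(1+ρ) with ρ = λ/μ. A minimal Monte Carlo sketch of that reading (illustrative rates, not the paper's parameters):

```python
import random

def metabolized_fraction(lam, mu, n=100_000, seed=0):
    """Monte Carlo sketch of a single metabolizing site ('Hole') with no
    queuing: Poisson substrate arrivals (rate lam), exponential processing
    (rate mu); a molecule arriving while the site is busy passes through
    unmetabolized. Returns the fraction of molecules metabolized."""
    random.seed(seed)
    t, free_at, served = 0.0, 0.0, 0
    for _ in range(n):
        t += random.expovariate(lam)   # next Poisson arrival
        if t >= free_at:               # site idle: molecule is captured
            served += 1
            free_at = t + random.expovariate(mu)
    return served / n

frac = metabolized_fraction(lam=1.0, mu=2.0)
print(f"simulated {frac:.3f} vs M/M/1/1 loss formula 1/(1+rho) = {1 / 1.5:.3f}")
```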

5. Towards a Faith-Based Program Theory: A Reconceptualization of Program Theory

ERIC Educational Resources Information Center

Harden, Mark G.

2006-01-01

A meta-program theory is proposed to overcome the limitations and improve the use of program theory as an approach to faith-based program evaluation. The essentials for understanding religious organizations, their various programs, and faith and spirituality are discussed to support a rationale for developing a faith-based program theory that…

6. MODELING AND PERFORMANCE EVALUATION FOR AVIATION SECURITY CARGO INSPECTION QUEUING SYSTEM

SciTech Connect

Allgood, Glenn O; Olama, Mohammed M; Rose, Terri A; Brumback, Daryl L

2009-01-01

Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we conduct a performance evaluation study of an aviation security cargo inspection queuing system for material flow and accountability. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, and throughput. These metrics are indicators of the system's ability to service current needs and its response capacity to additional requests. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures will reduce the overall cost and shipping delays associated with the new inspection requirements.

7. Generalized theory of diffusion based on kinetic theory

Schäfer, T.

2016-10-01

We propose to use spin hydrodynamics, a two-fluid model of spin propagation, as a generalization of the diffusion equation. We show that in the dense limit spin hydrodynamics reduces to Fick's law and the diffusion equation. In the opposite limit spin hydrodynamics is equivalent to a collisionless Boltzmann treatment of spin propagation. Spin hydrodynamics avoids unphysical effects that arise when the diffusion equation is used to describe a strongly interacting gas with a dilute corona. We apply spin hydrodynamics to the problem of spin diffusion in a trapped atomic gas. We find that the observed spin relaxation rate in the high-temperature limit [Sommer et al., Nature (London) 472, 201 (2011), 10.1038/nature09989] is consistent with the diffusion constant predicted by kinetic theory.

8. A Multiple Constraint Queuing Model for Predicting Current and Future Terminal Area Capacities

NASA Technical Reports Server (NTRS)

Meyn, Larry A.

2004-01-01

A new queuing model is being developed to evaluate the capacity benefits of several new concepts for terminal airspace operations. The major innovation is the ability to support a wide variety of multiple constraints for modeling the scheduling logic of several concepts. Among the constraints modeled are in-trail separation, separation between aircraft landing on parallel runways, in-trail separation at terminal area entry points, and permissible terminal area flight times.
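A hypothetical miniature of the scheduling logic such a model must encode: a first-come-first-served single-runway scheduler that delays each landing until the required separation behind the previously scheduled aircraft is met. This sketches only one constraint type (pairwise in-trail separation between consecutive landings); the model described above handles several constraint classes simultaneously, and all numbers here are invented:

```python
def schedule_landings(eta, sep):
    """Greedy first-come-first-served landing scheduler: each aircraft gets
    the earliest time >= its ETA that respects the required separation behind
    the previously scheduled landing (single runway; ignores non-adjacent
    pairs, purely illustrative)."""
    order = sorted(range(len(eta)), key=lambda i: eta[i])
    times, last = {}, None
    for i in order:
        t = eta[i]
        if last is not None:
            t = max(t, times[last] + sep[last][i])
        times[i] = t
        last = i
    return [times[i] for i in range(len(eta))]

# ETAs in minutes; sep[i][j] = required trailing gap behind aircraft i for j
eta = [0.0, 1.0, 1.5]
sep = [[0, 2, 2], [2, 0, 3], [2, 3, 0]]
print(schedule_landings(eta, sep))   # delays propagate down the sequence
```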

9. An Improved Call Admission Control Mechanism with Prioritized Handoff Queuing Scheme for BWA Networks

Chowdhury, Prasun; Saha Misra, Iti

2014-10-01

With the increased demand on Broadband Wireless Access (BWA) networks, a guaranteed Quality of Service (QoS) is required to manage the seamless transmission of heterogeneous handoff calls. To this end, this paper proposes an improved Call Admission Control (CAC) mechanism with a prioritized handoff queuing scheme that aims to reduce the dropping probability of handoff calls. Handoff calls are queued when no bandwidth is available, even after the allowable bandwidth degradation of ongoing calls, and are admitted into the network when an ongoing call terminates, with a higher priority than newly originated calls. An analytical Markov model for the proposed CAC mechanism is developed to analyze various performance parameters. Analytical results show that the proposed CAC with handoff queuing prioritizes handoff calls effectively and reduces the dropping probability of the system by 78.57% for real-time traffic without degrading the number of failed new call attempts. This results in increased bandwidth utilization of the network.
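The paper's full model includes bandwidth degradation and a handoff queue; the classic guard-channel birth-death chain that this kind of handoff prioritization builds on can be sketched as follows (all rates and channel counts are illustrative, not the paper's):

```python
def guard_channel_probs(channels, guard, lam_new, lam_h, mu):
    """Steady-state new-call blocking and handoff dropping probabilities for
    the classic guard-channel scheme: new calls are admitted only while fewer
    than (channels - guard) channels are busy; handoff calls may use all
    channels. The birth-death chain is solved by direct recursion."""
    threshold = channels - guard
    p = [1.0]                                    # unnormalized state weights
    for n in range(1, channels + 1):
        arrival = lam_new + lam_h if n - 1 < threshold else lam_h
        p.append(p[-1] * arrival / (n * mu))
    z = sum(p)
    p = [x / z for x in p]
    blocking = sum(p[threshold:])   # states where new calls are refused
    dropping = p[channels]          # state where even handoffs are dropped
    return blocking, dropping

b, d = guard_channel_probs(channels=20, guard=2, lam_new=8.0, lam_h=2.0, mu=0.5)
print(f"new-call blocking {b:.3f}, handoff dropping {d:.4f}")
```

Reserving even two guard channels makes the handoff dropping probability much smaller than the new-call blocking probability, which is the prioritization effect the abstract quantifies.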

10. A Cost Function Based on Learning Theory.

DTIC Science & Technology

(*COSTS, MATHEMATICAL MODELS), (*LEARNING, THEORY), PARTIAL DIFFERENTIAL EQUATIONS, INVENTORY CONTROL, DYNAMIC PROGRAMMING, CALCULUS OF VARIATIONS, LEARNING CURVES, INDUSTRIAL PRODUCTION, MANAGEMENT ENGINEERING, THESES.

11. Feature-Based Binding and Phase Theory

ERIC Educational Resources Information Center

Antonenko, Andrei

2012-01-01

Current theories of binding cannot provide a uniform account for many facts associated with the distribution of anaphors, such as long-distance binding effects and the subject-orientation of monomorphemic anaphors. Further, traditional binding theory is incompatible with minimalist assumptions. In this dissertation I propose an analysis of…

12. Theory Based Approaches to Learning. Implications for Adult Educators.

ERIC Educational Resources Information Center

Bolton, Elizabeth B.; Jones, Edward V.

This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

13. State variable theories based on Hart's formulation

SciTech Connect

Korhonen, M.A.; Hannula, S.P.; Li, C.Y.

1985-01-01

In this paper a review of the development of a state variable theory for nonelastic deformation is given. The physical and phenomenological basis of the theory and the constitutive equations describing macroplastic, microplastic, anelastic and grain boundary sliding enhanced deformation are presented. The experimental and analytical evaluation of different parameters in the constitutive equations are described in detail followed by a review of the extensive experimental work on different materials. The technological aspects of the state variable approach are highlighted by examples of the simulative and predictive capabilities of the theory. Finally, a discussion of general capabilities, limitations and future developments of the theory and particularly the possible extensions to cover an even wider range of deformation or deformation-related phenomena is presented.

14. Theory-Based University Admissions Testing for a New Millennium

ERIC Educational Resources Information Center

Sternberg, Robert J.

2004-01-01

This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

15. Continuing Bonds in Bereavement: An Attachment Theory Based Perspective

ERIC Educational Resources Information Center

Field, Nigel P.; Gao, Beryl; Paderna, Lisa

2005-01-01

An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual…

16. Maximum entropy principle based estimation of performance distribution in queueing theory.

PubMed

He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

2014-01-01

In related research on queuing systems, there is a widespread practice of assuming, in order to determine the system state, that the system is stable and that the distributions of the customer arrival ratio and service ratio are known. In this study, the queuing system is treated as a black box, with no assumptions on the distributions of the arrival and service ratios and retaining only the assumption that the queuing system is stable. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from some easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness-of-fit test, the accuracy and practical generality of the maximum entropy approach are demonstrated.
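For a finite-capacity system where only the mean number in system is known, the maximum-entropy distribution over the states {0, …, N} is a truncated geometric, p_n ∝ x^n. A minimal sketch of that idea (not the paper's derivation) fits x numerically to the given mean:

```python
def max_entropy_queue_dist(capacity, mean_n, tol=1e-10):
    """Maximum-entropy distribution of the number in a queuing system of
    finite capacity, given only the mean number in system: p_n ∝ x**n
    (a truncated geometric), with x found by bisection to match the mean."""
    def mean_for(x):
        if abs(x - 1.0) < 1e-12:
            return capacity / 2.0
        w = [x**n for n in range(capacity + 1)]
        return sum(n * wn for n, wn in enumerate(w)) / sum(w)

    lo, hi = 1e-9, 1e9                 # mean_for is increasing in x
    while hi - lo > tol * max(1.0, lo):
        mid = (lo * hi) ** 0.5         # bisect on a log scale since x > 0
        if mean_for(mid) < mean_n:
            lo = mid
        else:
            hi = mid
    x = (lo * hi) ** 0.5
    w = [x**n for n in range(capacity + 1)]
    z = sum(w)
    return [wn / z for wn in w]

# System of capacity 10 with a measured mean of 2 customers in system.
p = max_entropy_queue_dist(capacity=10, mean_n=2.0)
print([round(q, 4) for q in p])
```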

17. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

Code of Federal Regulations, 2013 CFR

2013-04-01

... 23 Highways 1 2013-04-01 2013-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

18. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

Code of Federal Regulations, 2012 CFR

2012-04-01

... 23 Highways 1 2012-04-01 2012-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

19. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

Code of Federal Regulations, 2014 CFR

2014-04-01

... 23 Highways 1 2014-04-01 2014-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

20. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

Code of Federal Regulations, 2011 CFR

2011-04-01

... 23 Highways 1 2011-04-01 2011-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

1. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

Code of Federal Regulations, 2010 CFR

2010-04-01

... 23 Highways 1 2010-04-01 2010-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

2. Recursive renormalization group theory based subgrid modeling

NASA Technical Reports Server (NTRS)

Zhou, YE

1991-01-01

Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

3. Aviation security cargo inspection queuing simulation model for material flow and accountability

SciTech Connect

Olama, Mohammed M; Allgood, Glenn O; Rose, Terri A; Brumback, Daryl L

2009-01-01

Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we develop an aviation security cargo inspection queuing simulation model for material flow and accountability that will allow cargo managers to conduct impact studies of current and proposed business practices as they relate to inspection procedures, material flow, and accountability.

4. THEORY OF REGENERATION BASED ON MASS ACTION

PubMed Central

Loeb, Jacques

1923-01-01

1. The writer's older experiment, proving that equal masses of isolated sister leaves of Bryophyllum regenerate under equal conditions and in equal time equal masses (in dry weight) of shoots and roots, is confirmed. It is shown that in the dark this regeneration is reduced to a small fraction of that observed in light. 2. The writer's former observation is confirmed, that when a piece of stem inhibits or diminishes the regeneration in a leaf, the dry weight of the stem increases by as much or more than the weight by which the regeneration in the leaf is diminished. It is shown that this is also true when the axillary bud in the stem is removed or when the regeneration occurs in the dark. 3. These facts show that the regeneration of an isolated leaf of Bryophyllum is determined by the mass of material available or formed in the leaf during the experiment and that such a growth does not occur in a leaf connected with a normal plant for the reason that in the latter case the material available or formed in the leaf flows into the stem where it is consumed for normal growth. 4. It is shown that the sap sent out by a leaf in the descending current of a stem is capable of increasing also the rate of growth of shoots in the basal parts of the leaf when the sap has an opportunity to reach the anlagen for such shoots. 5. The fact that a defoliated piece of stem forms normally no shoots in its basal part therefore demands an explanation of the polar character of regeneration which lays no or less emphasis on the chemical difference between ascending and descending sap than does Sachs' theory of specific root- or shoot-forming substances (though such substances may in reality exist), but which uses as a basis the general mass relation as expressed in the first three statements of this summary. 6. It is suggested that the polar character of the regeneration in a stem of Bryophyllum is primarily due to the fact that the descending sap reaches normally only the root

5. Graph-based linear scaling electronic structure theory

Niklasson, Anders M. N.; Mniszewski, Susan M.; Negre, Christian F. A.; Cawkwell, Marc J.; Swart, Pieter J.; Mohd-Yusof, Jamal; Germann, Timothy C.; Wall, Michael E.; Bock, Nicolas; Rubensson, Emanuel H.; Djidjev, Hristo

2016-06-01

We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.

6. Graph-based linear scaling electronic structure theory.

PubMed

Niklasson, Anders M N; Mniszewski, Susan M; Negre, Christian F A; Cawkwell, Marc J; Swart, Pieter J; Mohd-Yusof, Jamal; Germann, Timothy C; Wall, Michael E; Bock, Nicolas; Rubensson, Emanuel H; Djidjev, Hristo

2016-06-21

We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.

7. Theory of friction based on brittle fracture

USGS Publications Warehouse

Byerlee, J.D.

1967-01-01

A theory of friction is presented that may be more applicable to geologic materials than the classic Bowden and Tabor theory. In the model, surfaces touch at the peaks of asperities and sliding occurs when the asperities fail by brittle fracture. The coefficient of friction, μ, was calculated from the strength of asperities of certain ideal shapes; for cone-shaped asperities, μ is about 0.1, and for wedge-shaped asperities, μ is about 0.15. For actual situations which seem close to the ideal model, observed μ was found to be very close to 0.1, even for materials such as quartz and calcite with widely differing strengths. If surface forces are present, the theory predicts that μ should decrease with load and that it should be higher in a vacuum than in air. In the presence of a fluid film between sliding surfaces, μ should depend on the area of the surfaces in contact. Both effects are observed. The character of wear particles produced during sliding and the way in which μ depends on normal load, roughness, and environment lend further support to the model of friction presented here. © 1967 The American Institute of Physics.

8. Theory-Based Approaches to the Concept of Life

ERIC Educational Resources Information Center

El-Hani, Charbel Nino

2008-01-01

In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

9. Task-Based Language Teaching and Expansive Learning Theory

ERIC Educational Resources Information Center

Robertson, Margaret

2014-01-01

Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…

10. Evaluation Theory in Problem-Based Learning Approach.

ERIC Educational Resources Information Center

Hsu, Yu-chen

The purpose of this paper is to review evaluation theories and techniques in both the medical and educational fields and to propose an evaluation theory to explain the condition variables, the method variables, and the outcome variables of student assessment in a problem-based learning (PBL) approach. The PBL definition and process are presented,…

11. Unifying ecology and macroevolution with individual-based theory.

PubMed

Rosindell, James; Harmon, Luke J; Etienne, Rampal S

2015-05-01

A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology.

12. Unifying ecology and macroevolution with individual-based theory

PubMed Central

Rosindell, James; Harmon, Luke J; Etienne, Rampal S

2015-01-01

A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. PMID:25818618

13. Index Theory-Based Algorithm for the Gradiometer Inverse Problem

DTIC Science & Technology

2015-03-28

Robert C. Anderson and Jonathan W. Fitton. Abstract: We present an Index Theory-based gravity gradiometer inverse problem algorithm. This algorithm relates changes in the index value, computed on a closed curve containing a line field generated by the positive eigenvector of the gradiometer tensor, to the closeness of fit of the proposed inverse solution to the mass and

14. Congestion at Card and Book Catalogs--A Queuing-Theory Approach

ERIC Educational Resources Information Center

Bookstein, Abraham

1972-01-01

This paper attempts to analyze the problem of congestion, using a mathematical model shown to be of value in other similar applications. Three criteria of congestion are considered, and it is found that the conclusion one can draw is sensitive to which of these criteria is paramount. (8 references) (Author/NH)

15. An application of queuing theory to SIS and SEIS epidemic models.

PubMed

Hernandez-Suarez, Carlos M; Castillo-Chavez, Carlos; Lopez, Osval Montesinos; Hernandez-Cuevas, Karla

2010-10-01

In this work we consider every individual of a population to be a server whose state can be either busy (infected) or idle (susceptible). This server approach makes it possible to consider a general distribution for the duration of the infectious state, instead of being restricted to exponential distributions. In order to achieve this, we first derive new approximations to the quasi-stationary distribution (QSD) of the SIS (Susceptible-Infected-Susceptible) and SEIS (Susceptible-Latent-Infected-Susceptible) stochastic epidemic models. We give an expression that relates the basic reproductive number, R0, and the server utilization, p.
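A rough numerical illustration of the server view (assumed parameters, not the paper's): simulating the stochastic SIS model with infected individuals as busy servers, the time-averaged prevalence in the quasi-stationary regime should sit near the deterministic endemic level 1 − 1/R0:

```python
import random

def simulate_sis(n=200, r0=3.0, mean_inf=1.0, t_end=300.0, seed=1):
    """Event-driven stochastic SIS model in the 'every individual is a
    server' view: infected = busy, susceptible = idle. Contact rate
    beta = r0 / mean_inf, exponential infectious periods. Returns the
    time-averaged number infected after a burn-in period."""
    random.seed(seed)
    beta, gamma = r0 / mean_inf, 1.0 / mean_inf
    i, t = n // 10, 0.0
    busy_time, burn_in = 0.0, t_end / 2
    while t < t_end and i > 0:
        infect = beta * i * (n - i) / n
        rate = infect + gamma * i
        dt = random.expovariate(rate)
        if t + dt > burn_in:          # accumulate i weighted by elapsed time
            busy_time += (min(t + dt, t_end) - max(t, burn_in)) * i
        t += dt
        if random.random() < infect / rate:
            i += 1                    # new infection (server becomes busy)
        else:
            i -= 1                    # recovery (server becomes idle)
    return busy_time / (t_end - burn_in)

prev = simulate_sis() / 200           # mean fraction infected (utilization)
print(f"simulated utilization {prev:.2f} vs deterministic 1 - 1/R0 = {1 - 1 / 3:.2f}")
```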

16. Modeling the emergency cardiac in-patient flow: an application of queuing theory.

PubMed

de Bruin, Arnoud M; van Rossum, A C; Visser, M C; Koole, G M

2007-06-01

This study investigates the bottlenecks in the emergency care chain of cardiac in-patient flow. The primary goal is to determine the optimal bed allocation over the care chain given a maximum number of refused admissions. Another objective is to provide deeper insight into the relation between natural variation in arrivals and length of stay (LOS) on the one hand and occupancy rates on the other. Hospital management's strong focus on raising occupancy rates is unrealistic and counterproductive, because economies of scale cannot be neglected. An important result is that refused admissions at the First Cardiac Aid (FCA) are primarily caused by unavailability of beds downstream in the care chain. Both variability in LOS and fluctuations in arrivals result in large workload variations. Techniques from operations research were successfully used to describe the complexity and dynamics of emergency in-patient flow.
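The economies-of-scale point can be illustrated with the Erlang loss (M/M/c/c) model commonly used for bed-allocation problems of this kind: at the same target occupancy, a larger unit refuses far fewer admissions. The numbers below are illustrative, not the study's data:

```python
def erlang_b(beds, offered_load):
    """Blocking (refused-admission) probability for an M/M/c/c loss system,
    computed with the standard numerically stable recursion
    B(0) = 1, B(c) = a*B(c-1) / (c + a*B(c-1))."""
    b = 1.0
    for c in range(1, beds + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# Two units run at the same ~80% target occupancy: the larger one refuses
# far fewer admissions, so occupancy alone is a misleading management target.
for beds in (10, 50):
    a = 0.8 * beds                   # offered load in erlangs
    print(beds, round(erlang_b(beds, a), 3))
```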

17. Congestion at Card and Book Catalogs--A Queuing Theory Approach.

ERIC Educational Resources Information Center

Bookstein, Abraham

The question of whether a library's catalog should consist of cards arranged in a single alphabetical order (the "dictionary catalog") or be segregated as a separate file is discussed. The development is extended to encompass related problems involved in the creation of a book catalog. A model to study the effects of congestion at the catalog is…

18. Understanding human queuing behaviour at exits: an empirical study

PubMed Central

Wagoum, A. U. Kemloh; Liao, W.

2017-01-01

The choice of the exit through which to egress from a facility plays a fundamental role in pedestrian modelling and simulation, yet empirical evidence for backing up simulation is scarce. In this contribution, we present three new groups of experiments that we conducted in different geometries. We varied parameters such as the width of the doors and the initial location and number of pedestrians, which in turn affected their perception of the environment. We extracted and analysed relevant indicators such as distance to the exits and density levels. The results demonstrate that pedestrians use time-dependent information to optimize their exit choice and that, in congested states, a load balancing over the exits occurs. We propose a minimal modelling approach that covers those situations, especially cases where the geometry does not have a symmetrical configuration. Most models try to achieve the load balancing by simulating the system and solving optimization problems. We show statistically and by simulation that a linear model based on the distance to the exits and the density levels around the exits can be an efficient dynamical alternative. PMID:28280588

19. Address block localization based on graph theory

Gaceb, Djamel; Eglin, Véronique; Lebourgeois, Frank; Emptoz, Hubert

2008-01-01

An efficient mail sorting system relies mainly on accurate optical recognition of the addresses on envelopes. However, address block localization (ABL) must be performed before the OCR recognition process. The localization step is crucial, as it has a great impact on the global performance of the system: a good localization step leads to a better recognition rate. The limitation of current methods is mainly caused by the modular linear architectures used for ABL, whose performance depends heavily on that of each independent module. In this paper we present a new approach to ABL based on a pyramidal data organization and on hierarchical graph coloring for the classification process. This new approach has the advantage of guaranteeing good coherence between the different modules, and it reduces both the computation time and the rejection rate. The proposed method achieves a satisfactory rate of 98% correct localizations on a set of 750 envelope images.

20. Complexity measurement based on information theory and Kolmogorov complexity.

PubMed

Lui, Leong Ting; Terrazas, Germán; Zenil, Hector; Alexander, Cameron; Krasnogor, Natalio

2015-01-01

In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules.
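Although the article's specific measure is not reproduced here, the standard computable stand-in for Kolmogorov complexity is the compressed size of an object's description, and elementary cellular automata (the article's demonstration system) make the contrast visible. The rule choices below are ours, purely illustrative:

```python
import zlib

def ca_history(rule, width=256, steps=256):
    """Run an elementary cellular automaton from a single live centre cell
    (periodic boundary) and return its space-time history as bytes."""
    cells = [0] * width
    cells[width // 2] = 1
    out = bytearray()
    for _ in range(steps):
        out.extend(cells)
        left = [cells[-1]] + cells[:-1]
        right = cells[1:] + [cells[0]]
        # standard Wolfram rule lookup: bit (4*l + 2*c + r) of the rule number
        cells = [(rule >> (4 * l + 2 * c + r)) & 1
                 for l, c, r in zip(left, cells, right)]
    return bytes(out)

def complexity_upper_bound(data):
    """Crude Kolmogorov-complexity upper bound: zlib-compressed size."""
    return len(zlib.compress(data, 9))

# The chaotic rule 30 should compress far worse than the regular rule 250.
print(complexity_upper_bound(ca_history(30)),
      complexity_upper_bound(ca_history(250)))
```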

1. Kinetic energy decomposition scheme based on information theory.

PubMed

Imamura, Yutaka; Suzuki, Jun; Nakai, Hiromi

2013-12-15

We proposed a novel kinetic energy decomposition analysis based on information theory. Since the Hirshfeld partitioning of electron densities can be formulated in terms of the Kullback-Leibler information deficiency of information theory, a similar partitioning for kinetic energy densities was newly proposed. The numerical assessments confirm that the current kinetic energy decomposition scheme provides reasonable chemical pictures for ionic and covalent molecules, and can also estimate atomic energies using a correction with virial ratios.

2. Elastic theory of origami-based metamaterials.

PubMed

Brunck, V; Lechenault, F; Reid, A; Adda-Bedia, M

2016-03-01

Origami offers the possibility for new metamaterials whose overall mechanical properties can be programed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would like to determine the shape taken by the structure at rest and its mechanical response. In this article, we introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n-creases meeting at a single vertex. Computations of the equilibrium states are then carried out explicitly in two special cases: the generalized waterbomb base and the Miura-Ori. For the waterbomb, we show a generic bistability for any number of creases. For the Miura folding, however, we uncover a phase transition from monostable to bistable states that explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. Moreover, the analysis shows that geometric frustration induces residual stresses in origami structures that should be taken into account in determining their mechanical response. This formalism can be extended to a general crease network, ordered or otherwise, and so opens new perspectives for the mechanics and the physics of origami-based metamaterials.

3. Elastic theory of origami-based metamaterials

Brunck, V.; Lechenault, F.; Reid, A.; Adda-Bedia, M.

2016-03-01

Origami offers the possibility for new metamaterials whose overall mechanical properties can be programed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would like to determine the shape taken by the structure at rest and its mechanical response. In this article, we introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n -creases meeting at a single vertex. Computations of the equilibrium states are then carried out explicitly in two special cases: the generalized waterbomb base and the Miura-Ori. For the waterbomb, we show a generic bistability for any number of creases. For the Miura folding, however, we uncover a phase transition from monostable to bistable states that explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. Moreover, the analysis shows that geometric frustration induces residual stresses in origami structures that should be taken into account in determining their mechanical response. This formalism can be extended to a general crease network, ordered or otherwise, and so opens new perspectives for the mechanics and the physics of origami-based metamaterials.

4. Computer-Based Integrated Learning Systems: Research and Theory.

ERIC Educational Resources Information Center

Hativa, Nira, Ed.; Becker, Henry Jay, Ed.

1994-01-01

The eight chapters of this theme issue discuss recent research and theory concerning computer-based integrated learning systems. Following an introduction about their theoretical background and current use in schools, the effects of using computer-based integrated learning systems in the elementary school classroom are considered. (SLD)

5. Evaluating hydrological model performance using information theory-based metrics

Technology Transfer Automated Retrieval System (TEKTRAN)

Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

6. Modeling Air Traffic Management Technologies with a Queuing Network Model of the National Airspace System

NASA Technical Reports Server (NTRS)

Long, Dou; Lee, David; Johnson, Jesse; Gaier, Eric; Kostiuk, Peter

1999-01-01

This report describes an integrated model of air traffic management (ATM) tools under development in two National Aeronautics and Space Administration (NASA) programs: Terminal Area Productivity (TAP) and Advanced Air Transport Technologies (AATT). The model is made by adjusting parameters of LMINET, a queuing network model of the National Airspace System (NAS), which the Logistics Management Institute (LMI) developed for NASA. Operating LMINET with models of various combinations of TAP and AATT tools gives quantitative information about the effects of the tools on operations of the NAS. The costs of delays under different scenarios are calculated. An extension of the Air Carrier Investment Model (ACIM), which LMI developed for NASA under the Aviation System Analysis Capability (ASAC), maps the technologies' impacts on NAS operations into cross-comparable benefits estimates for technologies and sets of technologies.
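
A queuing network model such as the one described composes many individual queues, one per airport or airspace sector. The core relation in the simplest such building block, an M/M/1 queue, can be sketched in a few lines (the arrival and service rates below are illustrative, not LMINET parameters):

```python
# A minimal sketch (illustrative rates, not LMINET itself) of the kind of
# building block a queuing network model composes per airport or sector:
# a single M/M/1 queue with Poisson arrivals and exponential service.

def mm1_metrics(lam, mu):
    """Return (utilization, mean number in system, mean time in system)
    for an M/M/1 queue with arrival rate lam and service rate mu."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu               # server utilization
    L = rho / (1 - rho)          # mean number in system
    W = 1.0 / (mu - lam)         # mean time in system (Little's law: L = lam * W)
    return rho, L, W

# Example: 30 arrivals/hour at a runway servicing 40 aircraft/hour.
rho, L, W = mm1_metrics(30.0, 40.0)
print(rho, L, W * 60)  # 0.75 utilization, 3 aircraft in system, 6-minute mean delay
```

Note how steeply delay grows with utilization: the same runway at 38 arrivals/hour would average a 30-minute delay, which is why small traffic-flow improvements from ATM tools can yield large delay-cost savings.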

7. Elastic theory of origami-based metamaterials

Lechenault, Frederic; Brunck, V.; Reid, A.; Adda-Bedia, M.

Origami offers the possibility for new metamaterials whose overall mechanical properties can be programmed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would aim to determine the shape taken by the structure at rest and its mechanical response. We introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n-creases meeting at a single vertex, and is then extended to origami tessellations. The generalized waterbomb base and the Miura-Ori are treated within this formalism. For the Miura folding, we uncover a phase transition from a monostable state to two metastable states, which explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. This research was supported by the ANR Grant 14-CE07-0031 METAMAT.

8. Snow avalanche friction relation based on extended kinetic theory

Rauter, Matthias; Fischer, Jan-Thomas; Fellin, Wolfgang; Kofler, Andreas

2016-11-01

Rheological models for granular materials play an important role in the numerical simulation of dry dense snow avalanches. This article describes the application of a physically based model from the field of kinetic theory to snow avalanche simulations. The fundamental structure of the so-called extended kinetic theory is outlined and the decisive model behavior for avalanches is identified. A simplified relation, covering the basic features of the extended kinetic theory, is developed and implemented into an operational avalanche simulation software. To test the obtained friction relation, simulation results are compared to velocity and runout observations of avalanches, recorded from different field tests. As reference we utilize a classic phenomenological friction relation, which is commonly applied for hazard estimation. The quantitative comparison is based on the combination of normalized residuals of different observation variables in order to take into account the quality of the simulations in various regards. It is demonstrated that the extended kinetic theory provides a physically based explanation for the structure of phenomenological friction relations. The friction relation derived with the help of the extended kinetic theory shows advantages to the classic phenomenological friction, in particular when different events and various observation variables are investigated.

9. Measurement Theory Based on the Truth Values Violates Local Realism

Nagata, Koji

2017-02-01

We investigate the violation factor of the Bell-Mermin inequality. Until now, the results of measurement have been assumed to be ±1; in this case, the maximum violation factor is 2^((n-1)/2), and the quantum predictions of the n-partite Greenberger-Horne-Zeilinger (GHZ) state violate the Bell-Mermin inequality by an amount that grows exponentially with n. Recently, a new measurement theory based on truth values was proposed (Nagata and Nakamura, Int. J. Theor. Phys. 55:3616, 2016), in which the values of measurement outcomes are either +1 or 0. Here we apply the new measurement theory to the multipartite GHZ state. It turns out that the Bell-Mermin inequality is violated by the amount 2^((n-1)/2). The measurement theory based on truth values thus provides the maximum violation of the Bell-Mermin inequality.
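
The standard ±1-outcome case quoted above can be checked numerically for n = 3 with plain state-vector arithmetic (a sketch, not the paper's truth-value formalism): the GHZ expectation of the Mermin operator M = XXX - XYY - YXY - YYX is 4, against the local-realistic bound of 2, i.e. a violation factor of 2^((3-1)/2) = 2.

```python
# Numeric check of the standard +/-1-outcome case for n = 3 qubits:
# <GHZ| M |GHZ> = 4 for the Mermin operator M = XXX - XYY - YXY - YYX,
# versus the local-realistic bound of 2.

import math

N = 3
DIM = 2 ** N
# GHZ state (|000> + |111>) / sqrt(2) as a dense amplitude vector.
ghz = [0j] * DIM
ghz[0] = ghz[DIM - 1] = 1 / math.sqrt(2)

def apply_pauli(p, qubit, psi):
    """Apply Pauli X or Y to one qubit of an n-qubit state vector."""
    out = [0j] * DIM
    shift = N - 1 - qubit
    for i, amp in enumerate(psi):
        if amp == 0:
            continue
        bit = (i >> shift) & 1
        j = i ^ (1 << shift)   # X and Y both flip the qubit's bit
        # X has no phase; Y maps |0> -> i|1> and |1> -> -i|0>.
        phase = 1 if p == "X" else (1j if bit == 0 else -1j)
        out[j] += phase * amp
    return out

def expectation(word):
    """<GHZ| P1 x P2 x P3 |GHZ> for a 3-letter Pauli word like 'XYY'."""
    phi = ghz
    for q, p in enumerate(word):
        phi = apply_pauli(p, q, phi)
    return sum(a.conjugate() * b for a, b in zip(ghz, phi)).real

mermin = (expectation("XXX") - expectation("XYY")
          - expectation("YXY") - expectation("YYX"))
print(round(mermin, 6))  # 4.0
```

Repeating the construction for larger n reproduces the exponential growth 2^(n-1) of the quantum value against the classical bound.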

10. A Natural Teaching Method Based on Learning Theory.

ERIC Educational Resources Information Center

Smilkstein, Rita

1991-01-01

The natural teaching method is active and student-centered, based on schema and constructivist theories, and informed by research in neuroplasticity. A schema is a mental picture or understanding of something we have learned. Humans can have knowledge only to the degree to which they have constructed schemas from learning experiences and practice.…

11. A Pedagogy of Blending Theory with Community-Based Research

ERIC Educational Resources Information Center

Brown, Kathleen Taylor

2011-01-01

Blending activity theory and community-based research educational applications describes the praxis achieved through the initial design, development, implementation, and assessment of one research methods course as a pedagogy to enhance and improve the outcomes of civic and community engagement for the university, its students, and the community.…

12. Theory-Based Diagnosis and Remediation of Writing Disabilities.

ERIC Educational Resources Information Center

Berninger, Virginia W.; And Others

1991-01-01

Briefly reviews recent trends in research on writing; introduces theory-based model being developed for differential diagnosis of writing disabilities at neuropsychological, linguistic, and cognitive levels; presents cases and patterns in cases that illustrate differential diagnosis of writing disabilities at linguistic level; and suggests…

13. Theory-Based Considerations Influence the Interpretation of Generic Sentences

ERIC Educational Resources Information Center

Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

2010-01-01

Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

14. Project-Based Language Learning: An Activity Theory Analysis

ERIC Educational Resources Information Center

Gibbes, Marina; Carson, Lorna

2014-01-01

This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…

15. A Memory-Based Theory of Verbal Cognition

ERIC Educational Resources Information Center

Dennis, Simon

2005-01-01

The syntagmatic paradigmatic model is a distributed, memory-based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long-term memory and the resolution of these…

16. Qualitative model-based diagnosis using possibility theory

NASA Technical Reports Server (NTRS)

Joslyn, Cliff

1994-01-01

The potential for the use of possibility theory in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis, Qualitative Modeling (QM) methodologies, and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). The necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are then described.
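
The possibilistic calculus underlying such an approach can be stated in a few lines. A sketch with an illustrative possibility distribution over qualitative fault modes (the states and numbers are hypothetical, not from the paper):

```python
# Sketch of the two dual measures of possibility theory, applied to
# hypothetical qualitative fault modes of a spacecraft component.

def possibility(pi, event):
    """Pos(A) = max of the possibility distribution over states in A."""
    return max(pi[s] for s in event)

def necessity(pi, event):
    """Nec(A) = 1 - Pos(not A): how certain the event is."""
    complement = set(pi) - set(event)
    return 1.0 - (possibility(pi, complement) if complement else 0.0)

# Possibility distribution over fault modes (illustrative values).
pi = {"nominal": 1.0, "degraded": 0.6, "failed": 0.2}

faulty = {"degraded", "failed"}
print(possibility(pi, faulty))  # 0.6: a fault is somewhat possible
print(necessity(pi, faulty))    # 0.0: but not at all certain, since
                                # "nominal" remains fully possible
```

The gap between possibility and necessity captures the ignorance that probability theory would have to hide inside a single number, which is the appeal of possibilistic modeling for qualitative diagnosis.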

17. Regularization of identity based solution in string field theory

Zeze, Syoji

2010-10-01

We demonstrate that an Erler-Schnabl type solution in cubic string field theory can be naturally interpreted as a gauge invariant regularization of an identity based solution. We consider a solution which interpolates between an identity based solution and ordinary Erler-Schnabl one. Two gauge invariant quantities, the classical action and the closed string tadpole, are evaluated for finite value of the gauge parameter. It is explicitly checked that both of them are independent of the gauge parameter.

18. Theory-Based Bayesian Models of Inductive Inference

DTIC Science & Technology

2010-06-30

Oxford University Press. 28. Griffiths, T. L. and Tenenbaum, J. B. (2007). Two proposals for causal grammar. In A. Gopnik and L. Schulz (eds.), Causal Learning. Oxford University Press. 29. Tenenbaum, J. B., Kemp, C., Shafto, P. (2007). Theory-based Bayesian models for inductive reasoning. In A. Feeney and E. Heit (eds.), Induction. Cambridge University Press. 30. Goodman, N. D., Tenenbaum, J. B., Griffiths, T. L., & Feldman, J. (2008). Compositionality in rational analysis: Grammar-based induction for concept

19. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

Zhao, Cunyou; Shi, Dongyan; Wu, Han

The main ways of product development are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining conceptual design methods with TRIZ theory. A process model of innovative design is constructed that comprises requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding for the engineering problem, and preliminary design; this establishes the basis for the innovative redesign of existing products.

20. Theory-based Bayesian models of inductive learning and reasoning.

PubMed

Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

2006-07-01

Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.

1. [Brazilian scientific production based on Orem's nursing theory: integrative review].

PubMed

Raimondo, Maria Lúcia; Fegadoli, Débora; Méier, Marineli Joaquim; Wall, Marilene Loewen; Labronici, Liliana Maria; Raimondo-Ferraz, Maria Isabel

2012-01-01

Integrative review, held in the databases LILACS, SciELO and BDENF from January 2005 to May 2009, aimed at summarizing the Brazilian scientific production based on Orem's Nursing Theory. We obtained 23 articles, analyzed by simple descriptive statistics. All of the studies focused on adults; of these, 65.22% addressed chronic diseases. In 39.15% of the studies the theory was used in full, and in 34.80% only one of its constructs was used. 91.30% of the publications aimed at the construction and deployment of a structured and theoretically grounded practice of care. It was concluded that the theory has been used as a theoretical and philosophical basis to justify the practice of nursing in a variety of situations, emphasizing the role of the nurse in care.

2. Computer-based Training in Medicine and Learning Theories.

PubMed

Haag, Martin; Bauch, Matthias; Garde, Sebastian; Heid, Jörn; Weires, Thorsten; Leven, Franz-Josef

2005-01-01

Computer-based training (CBT) systems can efficiently support modern teaching and learning environments. In this paper, we demonstrate on the basis of the case-based CBT system CAMPUS that current learning theories and design principles (Bloom's Taxonomy and practice fields) are (i) relevant to CBT and (ii) feasible to implement using computer-based training and adequate learning environments. Not all design principles can be fulfilled by the system alone; the integration of the system into adequate teaching and learning environments is therefore essential. Adequately integrated, CBT programs become valuable means to build or support practice fields for learners that build domain knowledge and problem-solving skills. Learning theories and their design principles can support designing these systems as well as assessing their value.

3. Correlation theory-based signal processing method for CMF signals

Shen, Yan-lin; Tu, Ya-qing

2016-06-01

Signal processing precision of Coriolis mass flowmeter (CMF) signals directly affects the measurement accuracy of Coriolis mass flowmeters. To improve this accuracy, a correlation theory-based signal processing method for CMF signals is proposed, comprising a correlation theory-based frequency estimation method and a phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral-period sampling on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation. For frequency estimation it outperforms the adaptive notch filter, discrete Fourier transform and autocorrelation methods; for phase difference estimation it outperforms the data-extension-based correlation, Hilbert transform, quadrature delay estimator and discrete Fourier transform methods. Both contribute to improving the measurement accuracy of Coriolis mass flowmeters.
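
The correlation identity at the core of such phase difference estimation can be shown in a few lines. This is a bare sketch under the integral-period assumption that the paper's method is designed to relax, with made-up signal parameters: for two equal-frequency sinusoids, the normalized correlation equals the cosine of their phase difference.

```python
# Sketch of correlation-based phase difference estimation between two
# equal-frequency sinusoids sampled over an integer number of periods
# (illustrative signals, not CMF sensor data).

import math

def phase_difference(x, y):
    """|phase difference| from the identity <x,y>/(|x||y|) = cos(dphi),
    valid when both signals span an integer number of periods."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    # Clamp against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (nx * ny))))

# Two 50 Hz tones 0.6 rad apart, sampled at 1 kHz for exactly 5 periods.
fs, f, n = 1000.0, 50.0, 100   # 100 samples = 5 full periods
x = [math.sin(2 * math.pi * f * t / fs) for t in range(n)]
y = [math.sin(2 * math.pi * f * t / fs + 0.6) for t in range(n)]
print(round(phase_difference(x, y), 4))  # 0.6
```

With non-integral-period sampling the dot products pick up end effects and this naive estimate biases, which is precisely the error source the proposed method removes; the sign ambiguity of acos is resolved in practice with a quadrature component.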

4. A Rolling Element Bearing Fault Diagnosis Approach Based on Multifractal Theory and Gray Relation Theory

PubMed Central

Li, Jingchao; Cao, Yunpeng; Ying, Yulong; Li, Shuying

2016-01-01

Bearing failure is one of the dominant causes of failure and breakdowns in rotating machinery, leading to huge economic loss. Aiming at the nonstationary and nonlinear characteristics of bearing vibration signals as well as the complexity of condition-indicating information distribution in the signals, a novel rolling element bearing fault diagnosis method based on multifractal theory and gray relation theory was proposed in the paper. Firstly, a generalized multifractal dimension algorithm was developed to extract the characteristic vectors of fault features from the bearing vibration signals, which can offer more meaningful and distinguishing information reflecting different bearing health status in comparison with conventional single fractal dimension. After feature extraction by multifractal dimensions, an adaptive gray relation algorithm was applied to implement an automated bearing fault pattern recognition. The experimental results show that the proposed method can identify various bearing fault types as well as severities effectively and accurately. PMID:28036329

5. A Rolling Element Bearing Fault Diagnosis Approach Based on Multifractal Theory and Gray Relation Theory.

PubMed

Li, Jingchao; Cao, Yunpeng; Ying, Yulong; Li, Shuying

2016-01-01

Bearing failure is one of the dominant causes of failure and breakdowns in rotating machinery, leading to huge economic loss. Aiming at the nonstationary and nonlinear characteristics of bearing vibration signals as well as the complexity of condition-indicating information distribution in the signals, a novel rolling element bearing fault diagnosis method based on multifractal theory and gray relation theory was proposed in the paper. Firstly, a generalized multifractal dimension algorithm was developed to extract the characteristic vectors of fault features from the bearing vibration signals, which can offer more meaningful and distinguishing information reflecting different bearing health status in comparison with conventional single fractal dimension. After feature extraction by multifractal dimensions, an adaptive gray relation algorithm was applied to implement an automated bearing fault pattern recognition. The experimental results show that the proposed method can identify various bearing fault types as well as severities effectively and accurately.
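
The pattern-matching half of the pipeline, gray relational analysis, is compact enough to sketch. The feature vectors below are illustrative stand-ins for the multifractal dimensions, not the paper's data; the unknown signature is matched to the reference fault pattern with the highest relational grade.

```python
# Minimal gray relational analysis sketch: match an unknown feature
# vector (standing in for extracted multifractal dimensions) to the
# reference fault pattern with the highest relational grade.

def gray_relational_grades(unknown, references, rho=0.5):
    """Gray relational grade of `unknown` against each reference vector;
    rho is the conventional distinguishing coefficient."""
    deltas = {k: [abs(u - v) for u, v in zip(unknown, ref)]
              for k, ref in references.items()}
    all_d = [d for ds in deltas.values() for d in ds]
    dmin, dmax = min(all_d), max(all_d)
    grades = {}
    for k, ds in deltas.items():
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in ds]
        grades[k] = sum(coeffs) / len(coeffs)
    return grades

references = {                       # hypothetical fault signatures
    "inner_race_fault": [1.8, 1.5, 1.2, 0.9],
    "outer_race_fault": [1.6, 1.1, 0.8, 0.6],
    "healthy":          [1.1, 1.0, 0.9, 0.8],
}
unknown = [1.7, 1.4, 1.1, 0.9]       # measured signature to classify

grades = gray_relational_grades(unknown, references)
print(max(grades, key=grades.get))   # inner_race_fault
```

The adaptive variant in the paper tunes this comparison automatically, but the ranking principle is the same: the closer a candidate's feature deltas sit to the global minimum, the higher its grade.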

6. Infrared small target detection based on Danger Theory

Lan, Jinhui; Yang, Xiao

2009-11-01

To solve the problem that traditional methods cannot detect small objects whose local SNR is less than 2 in IR images, a Danger Theory-based model to detect infrared small targets is presented in this paper. First, by analogy with immunology, definitions are given for such terms as danger signal, antigen, APC and antibody, and the matching rule between antigen and antibody is improved. Prior to training the detection model and detecting the targets, the IR images are processed with an adaptive smoothing filter to decrease stochastic noise. Then, in the training process, the deletion rule, generation rule, crossover rule and mutation rule are established after a large number of experiments in order to achieve rapid convergence and obtain good antibodies. The Danger Theory-based model is built after the training process, and this model can detect targets whose local SNR is only 1.5.

7. Ensemble method: Community detection based on game theory

Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.

2014-08-01

Timely and cost-effective analytics over social networks has emerged as a key ingredient for success in many businesses and government endeavors. Community detection is an active research area for analyzing online social networks. The choice of a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network, and it can affect the outcome of experiments because different algorithms have different advantages and depend on tuning specific parameters. In this paper, we propose a community division model based on game theory, which can effectively combine the advantages of previous algorithms to obtain a better community classification result. Experiments on standard datasets verify that the proposed game-theory-based community detection model is valid and performs better.

8. Research on Capturing of Customer Requirements Based on Innovation Theory

junwu, Ding; dongtao, Yang; zhenqiang, Bao

To capture customer requirements information exactly and effectively, a new customer requirements capturing modeling method is proposed. Based on analysis of the function requirement models of previous products and application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement unit and confirming the direction of evolutionary design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.

9. [Study of gene data mining based on informatics theory].

PubMed

Ang, Qing; Wang, Weidong; Wang, Guojing; Peng, Fulai

2012-07-01

By combining informatics theory, a system model consisting of feature selection based on redundancy and correlation is presented for disease classification research with five gene data sets (NCI, Lymphoma, Lung, Leukemia, Colon). The results indicate that this modeling method can not only reduce the computational load of data management but also help confirm the number of features and further improve classification accuracy; the application of this model has a bright future in disease analysis and the establishment of individual treatment plans.

10. A Kendama Learning Robot Based on Bi-directional Theory.

PubMed

Kawato, Mitsuo; Wada, Yasuhiro; Nakano, Eri; Osu, Rieko; Koike, Yasuharu; Gomi, Hiroaki; Gandolfo, Francesca; Schaal, Stefan; Miyamoto, Hiroyuki

1996-11-01

A general theory of movement-pattern perception based on bi-directional theory for sensory-motor integration can be used for motion capture and learning by watching in robotics. We demonstrate our methods using the game of Kendama, executed by the SARCOS Dextrous Slave Arm, which has a very similar kinematic structure to the human arm. Three ingredients have to be integrated for the successful execution of this task. The ingredients are (1) to extract via-points from a human movement trajectory using a forward-inverse relaxation model, (2) to treat via-points as a control variable while reconstructing the desired trajectory from all the via-points, and (3) to modify the via-points for successful execution. In order to test the validity of the via-point representation, we utilized a numerical model of the SARCOS arm, and examined the behavior of the system under several conditions. Copyright 1996 Elsevier Science Ltd.

11. A Danger-Theory-Based Immune Network Optimization Algorithm

PubMed Central

Li, Tao; Xiao, Xin; Shi, Yuanquan

2013-01-01

Existing artificial immune optimization algorithms reflect a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated from changes of environments will guide different levels of immune responses, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through its own danger signals and then triggers immune responses of self-regulation. So the population diversity can be maintained. Experimental results show that the algorithm has more advantages in the solution quality and diversity of the population. Compared with influential optimization algorithms, CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions to meet the accuracies within the specified function evaluation times. PMID:23483853

12. Control theory based airfoil design using the Euler equations

NASA Technical Reports Server (NTRS)

Jameson, Antony; Reuther, James

1994-01-01

This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.
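
The computational economy that makes the control-theory approach attractive can be shown on a toy problem. In this sketch a 2x2 linear solve stands in for the flow equations and the matrix, forcing shape, and design variable are all made up: a single adjoint solve delivers the cost gradient (however many design variables there were), which we cross-check against a finite difference.

```python
# Adjoint-gradient sketch: state equation A w = alpha * g, cost
# J = 0.5 |w|^2. One adjoint solve A^T lam = dJ/dw gives the exact
# gradient dJ/dalpha = lam . g, checked against finite differences.

def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

A = [[4.0, 1.0], [1.0, 3.0]]   # "flow" operator (stand-in for the PDE)
g = [1.0, 2.0]                 # forcing shape scaled by design variable alpha

def cost(alpha):
    w = solve2(A, [alpha * gi for gi in g])   # state solve
    return 0.5 * sum(wi * wi for wi in w)

alpha = 1.5
w = solve2(A, [alpha * gi for gi in g])
At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
lam = solve2(At, w)                            # adjoint solve: A^T lam = w
grad_adjoint = sum(li * gi for li, gi in zip(lam, g))

h = 1e-6                                       # finite-difference check
grad_fd = (cost(alpha + h) - cost(alpha - h)) / (2 * h)
print(abs(grad_adjoint - grad_fd) < 1e-6)      # True
```

The point mirrored from the paper: the adjoint (control-law) solve costs about one extra flow solve regardless of the number of design variables, which is what makes gradient information "computationally inexpensive" for aerodynamic shape optimization.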

13. Ranking streamflow model performance based on Information theory metrics

Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

2016-04-01

Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to test whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia, applying eight models of different complexity. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow, so streamflow is less random and more complex than precipitation. The Nash-Sutcliffe efficiency increased as model complexity increased, but in many cases several models had efficiency values that were not statistically different from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
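
One of the three metrics, mean information gain, is simple enough to sketch end to end. The quantile symbolization and block entropies follow the construction described above, with illustrative synthetic signals in place of streamflow data (a two-symbol median split rather than whatever alphabet size the study used):

```python
# Sketch of mean information gain on a quantile-symbolized series:
# H(blocks of L+1) - H(blocks of L) = average new information per step.

import math
import random
from collections import Counter

def symbolize(series):
    """Two-symbol quantile coding: below/above the median."""
    cut = sorted(series)[len(series) // 2]
    return "".join("1" if v >= cut else "0" for v in series)

def block_entropy(symbols, L):
    counts = Counter(symbols[i:i + L] for i in range(len(symbols) - L + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def mean_information_gain(symbols, L=1):
    return block_entropy(symbols, L + 1) - block_entropy(symbols, L)

periodic = symbolize([i % 2 for i in range(2000)])         # patterned signal
random.seed(0)
noisy = symbolize([random.random() for _ in range(2000)])  # unpredictable signal

mig_periodic = mean_information_gain(periodic)
mig_noisy = mean_information_gain(noisy)
# Patterned signal: ~0 bits/step of new information; noisy: ~1 bit/step.
print(mig_periodic < 0.01 < mig_noisy)  # True
```

A streamflow record sits between these extremes, and (as the abstract notes) closer to the predictable end than the precipitation that drives it, which is what "the watershed acts as an information filter" quantifies.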

14. A model of resurgence based on behavioral momentum theory.

PubMed

Shahan, Timothy A; Sweeney, Mary M

2011-01-01

Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforcement during extinction of a target response acts as both an additional source of disruption during extinction and as a source of reinforcement in the context that increases the future strength of the target response. The model does a good job accounting for existing data in the resurgence literature and makes novel and testable predictions. Thus, the model appears to provide a framework for understanding resurgence and serves to integrate the phenomenon into the existing theoretical account of persistence provided by behavioral momentum theory. In addition, we discuss some potential implications of the model for further development of behavioral momentum theory.

15. Similarity theory based on the Dougherty-Ozmidov length scale

Grachev, Andrey A.; Andreas, Edgar L.; Fairall, Christopher W.; Guest, Peter S.; Persson, P. Ola G.

2015-07-01

Local similarity theory is suggested based on the Brunt-Vaisala frequency and the dissipation rate of turbulent kinetic energy instead of the turbulent fluxes used in the traditional Monin-Obukhov similarity theory. Based on dimensional analysis (the Pi theorem), it is shown that any properly scaled statistics of the small-scale turbulence are universal functions of a stability parameter defined as the ratio of a reference height z to the Dougherty-Ozmidov length scale, which in the limit of z-less stratification is linearly proportional to the Obukhov length scale. Measurements of atmospheric turbulence made at five levels on a 20-m tower over the Arctic pack ice during the Surface Heat Budget of the Arctic Ocean experiment (SHEBA) are used to examine the behaviour of different similarity functions in the stable boundary layer. It is found that in the framework of this approach the non-dimensional turbulent viscosity is equal to the gradient Richardson number, whereas the non-dimensional turbulent thermal diffusivity is equal to the flux Richardson number. These results are a consequence of the approximate local balance between production of turbulence by the mean flow shear and viscous dissipation. The turbulence framework based on the Brunt-Vaisala frequency and the dissipation rate of turbulent kinetic energy may have practical advantages for estimating turbulence when the fluxes are not directly available.
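
The scaling itself is a one-line formula: the Dougherty-Ozmidov length is L_O = (eps / N^3)^(1/2), built from the TKE dissipation rate eps and the Brunt-Vaisala frequency N. A back-of-the-envelope sketch with illustrative stable-boundary-layer values (not SHEBA measurements):

```python
# Sketch of the Dougherty-Ozmidov length scale and the resulting
# stability parameter z / L_O (illustrative values, not SHEBA data).

import math

def dougherty_ozmidov_length(eps, N):
    """L_O = sqrt(eps / N**3): the largest eddy size largely unaffected
    by stable stratification."""
    return math.sqrt(eps / N ** 3)

eps = 1.0e-4   # TKE dissipation rate, m^2 s^-3
N = 0.03       # Brunt-Vaisala frequency, s^-1
z = 5.0        # reference height, m

L_O = dougherty_ozmidov_length(eps, N)
stability = z / L_O       # the theory's stability parameter
print(round(L_O, 2), round(stability, 2))  # ~1.92 m, z/L_O ~ 2.6
```

Both eps and N can be estimated from single-point spectra and temperature profiles, which is the practical advantage the abstract notes over flux-based Monin-Obukhov scaling when fluxes are unavailable.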

16. Game Theory and Risk-Based Levee System Design

Hui, R.; Lund, J. R.; Madani, K.

2014-12-01

Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze decision making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision making modes, the cost of decision making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.

17. Transportation optimization with fuzzy trapezoidal numbers based on possibility theory.

PubMed

He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

2014-01-01

In this paper, a parametric method is introduced to solve fuzzy transportation problem. Considering that parameters of transportation problem have uncertainties, this paper develops a generalized fuzzy transportation problem with fuzzy supply, demand and cost. For simplicity, these parameters are assumed to be fuzzy trapezoidal numbers. Based on possibility theory and consistent with decision-makers' subjectiveness and practical requirements, the fuzzy transportation problem is transformed to a crisp linear transportation problem by defuzzifying fuzzy constraints and objectives with application of fractile and modality approach. Finally, a numerical example is provided to exemplify the application of fuzzy transportation programming and to verify the validity of the proposed methods.
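
The fractile idea for a single fuzzy parameter can be sketched in a few lines (illustrative numbers; the paper applies fractile and modality approaches jointly across fuzzy supply, demand and cost): a trapezoidal fuzzy quantity is replaced by a bound of its alpha-cut, turning a fuzzy constraint into a crisp one at the decision-maker's chosen confidence level.

```python
# Sketch of defuzzifying one trapezoidal fuzzy constraint via an
# alpha-cut (hypothetical supply figures, not from the paper).

def alpha_cut(trap, alpha):
    """Alpha-cut [lo, hi] of a trapezoidal fuzzy number (a, b, c, d)
    with support [a, d] and core (fully possible values) [b, c]."""
    a, b, c, d = trap
    return (a + alpha * (b - a), d - alpha * (d - c))

supply = (80.0, 90.0, 100.0, 110.0)  # fuzzy trapezoidal supply
alpha = 0.8                          # decision-maker's confidence level
demand = 85.0

lo, hi = alpha_cut(supply, alpha)
# Conservative crisp surrogate of "fuzzy supply covers demand":
print(lo, lo >= demand)  # the alpha-level lower bound, and feasibility
```

Using the lower bound of the cut makes the crisp constraint conservative: the higher alpha is, the narrower the interval and the more the surrogate trusts only near-core values of the fuzzy parameter.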

18. Experimental energy consumption of Frame Slotted ALOHA and Distributed Queuing for data collection scenarios.

PubMed

Tuset-Peiro, Pere; Vazquez-Gallego, Francisco; Alonso-Zarate, Jesus; Alonso, Luis; Vilajosana, Xavier

2014-07-24

Data collection is a key scenario for the Internet of Things because it enables gathering sensor data from distributed nodes that use low-power and long-range wireless technologies to communicate in a single-hop approach. In this kind of scenario, the network is composed of one coordinator that covers a particular area and a large number of nodes, typically hundreds or thousands, that transmit data to the coordinator upon request. Considering this scenario, in this paper we experimentally validate the energy consumption of two Medium Access Control (MAC) protocols, Frame Slotted ALOHA (FSA) and Distributed Queuing (DQ). We model both protocols as a state machine and conduct experiments to measure the average energy consumption in each state and the average number of times that a node has to be in each state in order to transmit a data packet to the coordinator. The results show that FSA is more energy efficient than DQ if the number of nodes is known a priori, because the number of slots per frame can be adjusted accordingly. However, in such scenarios the number of nodes cannot be easily anticipated, leading to additional packet collisions and higher energy consumption due to retransmissions. In contrast, DQ does not require knowing the number of nodes in advance because it is able to efficiently construct an ad hoc network schedule for each collection round. This kind of schedule ensures that there are no packet collisions during data transmission, leading to an energy consumption reduction of more than 10% compared to FSA.
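
The energy-accounting approach described above reduces to a simple sum: model each protocol as a state machine, take the average energy per visit to each state and the average number of visits needed to deliver one packet, and sum the products. The state names and all numbers below are illustrative placeholders, not the paper's measurements.

```python
def energy_per_packet(states):
    """states: {name: (avg_energy_per_visit_joules, avg_visits_per_packet)}."""
    return sum(energy * visits for energy, visits in states.values())

fsa = {
    "sleep":    (0.1e-3, 5.0),
    "rx_frame": (1.2e-3, 2.0),   # listening for feedback frames
    "tx_data":  (2.0e-3, 1.4),   # visits > 1 reflect retransmissions after collisions
}
dq = {
    "sleep":      (0.1e-3, 5.0),
    "contention": (0.3e-3, 1.2), # short access-request minislots only
    "tx_data":    (2.0e-3, 1.0), # collision-free data slot, no retransmission
}

print(energy_per_packet(fsa), energy_per_packet(dq))
```

With numbers of this shape, DQ's collision-free data slots remove the expensive retransmission visits, which is where the reported energy advantage comes from when the node count is not known in advance.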

19. Experimental Energy Consumption of Frame Slotted ALOHA and Distributed Queuing for Data Collection Scenarios

PubMed Central

Tuset-Peiro, Pere; Vazquez-Gallego, Francisco; Alonso-Zarate, Jesus; Alonso, Luis; Vilajosana, Xavier

2014-01-01

Data collection is a key scenario for the Internet of Things because it enables gathering sensor data from distributed nodes that use low-power and long-range wireless technologies to communicate in a single-hop approach. In this kind of scenario, the network is composed of one coordinator that covers a particular area and a large number of nodes, typically hundreds or thousands, that transmit data to the coordinator upon request. Considering this scenario, in this paper we experimentally validate the energy consumption of two Medium Access Control (MAC) protocols, Frame Slotted ALOHA (FSA) and Distributed Queuing (DQ). We model both protocols as a state machine and conduct experiments to measure the average energy consumption in each state and the average number of times that a node has to be in each state in order to transmit a data packet to the coordinator. The results show that FSA is more energy efficient than DQ if the number of nodes is known a priori, because the number of slots per frame can be adjusted accordingly. However, in such scenarios the number of nodes cannot be easily anticipated, leading to additional packet collisions and higher energy consumption due to retransmissions. In contrast, DQ does not require knowing the number of nodes in advance because it is able to efficiently construct an ad hoc network schedule for each collection round. This kind of schedule ensures that there are no packet collisions during data transmission, leading to an energy consumption reduction of more than 10% compared to FSA. PMID:25061839

20. Transitional clerkship: an experiential course based on workplace learning theory.

PubMed

Chittenden, Eva H; Henry, Duncan; Saxena, Varun; Loeser, Helen; O'Sullivan, Patricia S

2009-07-01

Starting clerkships is anxiety provoking for medical students. To ease the transition from preclerkship to clerkship curricula, schools offer classroom-based courses which may not be the best model for preparing learners. Drawing from workplace learning theory, the authors developed a seven-day transitional clerkship (TC) in 2007 at the University of California, San Francisco School of Medicine in which students spent half of the course in the hospital, learning routines and logistics of the wards along with their roles and responsibilities as members of ward teams. Twice, they admitted and followed a patient into the next day as part of a shadow team that had no patient-care responsibilities. Dedicated preceptors gave feedback on oral presentations and patient write-ups. Satisfaction with the TC was higher than with the previous year's classroom-based course. TC students felt clearer about their roles and more confident in their abilities as third-year students compared with previous students. TC students continued to rate the transitional course highly after their first clinical rotation. Preceptors were enthusiastic about the course and expressed willingness to commit to future TC preceptorships. The transitional course models an approach to translating workplace learning theory into practice and demonstrates improved satisfaction, better understanding of roles, and increased confidence among new third-year students.

1. Feature selection with neighborhood entropy-based cooperative game theory.

PubMed

Zeng, Kai; She, Kun; Niu, Xinzheng

2014-01-01

Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods ignore features that have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features using neighborhood entropy. The neighborhood entropy-based feature contribution is then proposed under the framework of cooperative games. The evaluative criteria of features can be formalized as the product of this contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that the neighborhood entropy-based cooperative game theory model (NECGT) yields better performance than classical ones.
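
The cooperative-game idea behind this kind of contribution measure can be sketched with Shapley values: score each feature by its average marginal gain over every order in which the features could join the coalition. The payoff table below is a toy stand-in; the paper derives its payoff from neighborhood entropy, not from this table.

```python
from itertools import permutations

def shapley_values(features, payoff):
    """Average marginal contribution of each feature over all join orders."""
    values = {f: 0.0 for f in features}
    orders = list(permutations(features))
    for order in orders:
        coalition = frozenset()
        for f in order:
            values[f] += payoff(coalition | {f}) - payoff(coalition)
            coalition = coalition | {f}
    return {f: v / len(orders) for f, v in values.items()}

# Toy payoffs: A and B are weak alone but strong as a group; C is strong alone.
SCORES = {frozenset(): 0.0,
          frozenset("A"): 0.1, frozenset("B"): 0.1, frozenset("C"): 0.5,
          frozenset("AB"): 0.9, frozenset("AC"): 0.6, frozenset("BC"): 0.6,
          frozenset("ABC"): 1.0}

vals = shapley_values(["A", "B", "C"], SCORES.get)
print(vals)  # A and B score far above their solo 0.1, reflecting their synergy
```

This is exactly the kind of group effect the abstract mentions: ranked individually, A and B look weak (0.1 each), but their Shapley values credit them for the strong joint coalition.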

2. Theory based design and optimization of materials for spintronics applications

Xu, Tianyi

The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We present a finite-element-based implementation of the Valet-Fert model which enables a practical way to calculate electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of particular interest. We propose a rational way to find half-metals based on the gap theorem. We then focus on the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems is covered, and we calculate the transport properties of certain junctions with the help of Green's functions under the Landauer-Buttiker formalism, also known as the scattering formalism. The damping constant determines the switching rate of a device; we calculate it using a method based on the Extended Huckel Tight-Binding theory (EHTB). The symmetry filtering effect is very helpful for finding materials for TMR junctions, and based upon it we identify a good candidate material, MnAl, for TMR applications.

3. Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis

ERIC Educational Resources Information Center

Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.

2012-01-01

Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundation and provided an exploratory examination of established learning theories. However, we found research seldom addressed the development of the use or failure to use learning-theory foundations and…

4. Advancing the Development and Application of Theory-Based Evaluation in the Practice of Public Health.

ERIC Educational Resources Information Center

Cole, Galen E.

1999-01-01

Provides strategies for constructing theories of theory-based evaluation and provides examples in the field of public health. Techniques are designed to systematize and bring objectivity to the process of theory construction. Also introduces a framework of program theory. (SLD)

5. A molecularly based theory for electron transfer reorganization energy.

PubMed

Zhuang, Bilin; Wang, Zhen-Gang

2015-12-14

Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule's permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory.

6. Release behaviour of clozapine matrix pellets based on percolation theory.

PubMed

Aguilar-de-Leyva, Angela; Sharkawi, Tahmer; Bataille, Bernard; Baylac, Gilles; Caraballo, Isidoro

2011-02-14

The release behaviour of clozapine matrix pellets was studied in order to investigate whether it can be explained by applying the concepts of percolation theory, previously used in understanding the release process of inert and hydrophilic matrix tablets. Thirteen batches of pellets with different proportions of clozapine/microcrystalline cellulose (MCC)/hydroxypropylmethyl cellulose (HPMC) and different clozapine particle size fractions were prepared by extrusion-spheronisation, and the release profiles were studied. It has been observed that the distance to the excipient (HPMC) percolation threshold is important for controlling the release rate. Furthermore, the drug percolation threshold has a strong influence on these systems. Batches very close to the drug percolation threshold show a clear effect of the drug particle size on the release rate. However, this effect is much less evident at a greater distance from the drug percolation threshold, so the release behaviour of clozapine matrix pellets can be explained based on percolation theory.

7. A molecularly based theory for electron transfer reorganization energy

SciTech Connect

Zhuang, Bilin; Wang, Zhen-Gang

2015-12-14

Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule’s permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory.

8. Quantum Hall transitions: An exact theory based on conformal restriction

Bettelheim, E.; Gruzberg, I. A.; Ludwig, A. W. W.

2012-10-01

We revisit the problem of the plateau transition in the integer quantum Hall effect. Here we develop an analytical approach for this transition, and for other two-dimensional disordered systems, based on the theory of “conformal restriction.” This is a mathematical theory that was recently developed within the context of the Schramm-Loewner evolution which describes the “stochastic geometry” of fractal curves and other stochastic geometrical fractal objects in two-dimensional space. Observables elucidating the connection with the plateau transition include the so-called point-contact conductances (PCCs) between points on the boundary of the sample, described within the language of the Chalker-Coddington network model for the transition. We show that the disorder-averaged PCCs are characterized by a classical probability distribution for certain geometric objects in the plane (which we call pictures), occurring with positive statistical weights, that satisfy the crucial so-called restriction property with respect to changes in the shape of the sample with absorbing boundaries; physically, these are boundaries connected to ideal leads. At the transition point, these geometrical objects (pictures) become fractals. Upon combining this restriction property with the expected conformal invariance at the transition point, we employ the mathematical theory of “conformal restriction measures” to relate the disorder-averaged PCCs to correlation functions of (Virasoro) primary operators in a conformal field theory (of central charge c=0). We show how this can be used to calculate these functions in a number of geometries with various boundary conditions. Since our results employ only the conformal restriction property, they are equally applicable to a number of other critical disordered electronic systems in two spatial dimensions, including for example the spin quantum Hall effect, the thermal metal phase in symmetry class D, and classical diffusion in two

9. Quantum theory of a spaser-based nanolaser.

PubMed

Parfenyev, Vladimir M; Vergeles, Sergey S

2014-06-02

We present a quantum theory of a spaser-based nanolaser, under the bad-cavity approximation. We find first- and second-order correlation functions g(1)(τ) and g(2)(τ) below and above the generation threshold, and obtain the average number of plasmons in the cavity. The latter is shown to be of the order of unity near the generation threshold, where the spectral line narrows considerably. In this case the coherence is preserved in the state of the active atoms, in contrast to good-cavity lasers, where the coherence is preserved in the state of the photons. The damped oscillations in g(2)(τ) above the generation threshold indicate the unusual character of amplitude fluctuations of polarization and population, which become interconnected in this case. The obtained results help clarify the fundamental principles of operation of nanolasers.

10. Invulnerability of power grids based on maximum flow theory

Fan, Wenli; Huang, Shaowei; Mei, Shengwei

2016-11-01

The invulnerability analysis against cascades is of great significance in evaluating the reliability of power systems. In this paper, we propose a novel cascading failure model based on maximum flow theory to analyze the invulnerability of power grids. In the model, node initial loads are built on the feasible flows of nodes, with a tunable parameter γ used to control the initial node load distribution. The simulation results show that both the invulnerability against cascades and the tolerance parameter threshold αT are greatly affected by the node load distribution. As γ grows, the invulnerability exhibits distinct patterns of change under different attack strategies and different tolerance parameters α. These results are useful in power grid planning and cascading failure prevention.
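
The building block of such a model is a maximum-flow computation, which can be sketched with a compact Edmonds-Karp routine. The 5-edge toy network and the value of γ below are illustrative, not the paper's test system.

```python
from collections import deque

def max_flow(capacities, s, t):
    """capacities: {(u, v): capacity}. Returns the maximum s -> t flow value."""
    cap = dict(capacities)               # work on a copy
    for (u, v) in list(cap):
        cap.setdefault((v, u), 0)        # ensure residual (reverse) edges exist
    adj = {}
    for (u, v) in cap:
        adj.setdefault(u, set()).add(v)
    flow = {e: 0 for e in cap}
    total = 0
    while True:
        # breadth-first search for an augmenting path in the residual graph
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj.get(u, ()):
                if v not in parent and cap[(u, v)] - flow[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return total
        path, v = [], t
        while parent[v] is not None:     # walk back to recover the path
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[e] - flow[e] for e in path)
        for (u, v) in path:
            flow[(u, v)] += push
            flow[(v, u)] -= push
        total += push

caps = {("s", "a"): 3, ("s", "b"): 2, ("a", "b"): 1, ("a", "t"): 2, ("b", "t"): 3}
feasible = max_flow(caps, "s", "t")
gamma = 1.2                              # tunable load-distribution parameter
initial_load = feasible ** gamma         # an initial load built on the feasible flow
print(feasible)  # 5
```

Raising γ skews the initial load toward high-flow nodes, which is how a single parameter can reshape the load distribution that the cascade simulations then stress-test.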

11. Fiber tracking of brain white matter based on graph theory.

PubMed

Lu, Meng

2015-01-01

Brain white matter tractography is reconstructed from diffusion-weighted magnetic resonance images. Due to the complex structure of brain white matter fiber bundles, fiber crossing and fiber branching are abundant in the human brain, and regular methods based on diffusion tensor imaging (DTI) cannot accurately handle them; this is one of the biggest problems in brain tractography. Therefore, this paper presents a novel brain white matter tractography method based on graph theory, in which the fiber tracking between two voxels is transformed into locating the shortest path in a graph. Besides, the presented method uses Q-ball imaging (QBI) as the source data instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings in one voxel using the orientation distribution function (ODF). Experiments showed that the presented method can accurately handle the problem of brain white matter fiber crossing and branching, and reconstruct brain tractography in both phantom data and real brain data.
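
The graph-theoretic reformulation can be sketched with a plain Dijkstra search: voxels are nodes, edge weights penalize directions the local ODF says are unlikely, and a fiber between two voxels is the shortest path. The 2D grid and the weights below are toy stand-ins for QBI-derived connectivity.

```python
import heapq

def dijkstra(weights, start, goal):
    """weights: {node: {neighbor: cost}}. Returns (total_cost, path)."""
    heap = [(0.0, start, [start])]
    visited = set()
    while heap:
        total, node, path = heapq.heappop(heap)
        if node == goal:
            return total, path
        if node in visited:
            continue
        visited.add(node)
        for nbr, w in weights.get(node, {}).items():
            if nbr not in visited:
                heapq.heappush(heap, (total + w, nbr, path + [nbr]))
    return float("inf"), []

# Low weight = step well supported by the local ODF; high weight = unlikely step.
grid = {
    (0, 0): {(0, 1): 0.2, (1, 0): 1.5},
    (0, 1): {(0, 2): 0.2, (1, 1): 1.5},
    (1, 0): {(1, 1): 0.3},
    (0, 2): {(1, 2): 0.4},
    (1, 1): {(1, 2): 0.3},
    (1, 2): {},
}
total, fiber = dijkstra(grid, (0, 0), (1, 2))
print(fiber)  # follows the ODF-supported route through (0, 1) and (0, 2)
```

Because the path cost accumulates ODF penalties rather than following a single tensor direction, a crossing voxel can carry low-weight edges for two distinct directions at once, which is what DTI-based streamline tracking struggles to represent.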

12. Predicting neutron star properties based on chiral effective field theory

2016-09-01

The energy per nucleon as a function of density, known as the nuclear equation of state, is the crucial input in the structure equations of neutron stars and thus establishes the connection between nuclear physics and compact astrophysical objects. More precisely, the pressure which supports the star against gravitational collapse is mostly determined by the nature of the equation of state of highly neutron-rich matter. In this contribution, we will report on our work in progress to calculate neutron star masses and radii. The equation of state is obtained microscopically from Brueckner-Hartree-Fock calculations based on state-of-the-art nuclear forces which have been developed within the framework of chiral effective field theory. The latter has become popular in recent years as a fundamental and systematic approach firmly connected to low-energy quantum chromodynamics. Supported by the Hill Undergraduate Fellowship and the U.S. Department of Energy.

13. Intelligent control based on fuzzy logic and neural net theory

NASA Technical Reports Server (NTRS)

Lee, Chuen-Chien

1991-01-01

In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.

14. Explanation-Based Theory Revision: An Approach to the Problems of Incomplete and Incorrect Theories

DTIC Science & Technology

1988-12-01

Knowledge-intensive Artificial Intelligence systems rely on a model of the...domain, called a domain theory, to fulfill their tasks. A domain theory consists of an encoding of the knowledge required by the system to draw...inferences about situations of interest. Systems that rely on a domain theory face two difficult problems. 1) Their performance is directly related to the

15. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

ERIC Educational Resources Information Center

2012-01-01

Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey, providing evidence that an evaluation survey based on a theory that…

16. Manpower Planning and Personnel Management Models Based on Utility Theory,

DTIC Science & Technology

1980-08-01

and Morgenstern [1947]. 2.3 Assessment of Utility Functions. For decision problems with multiple objectives, multiattribute utility theory provides... multiattribute utility theory and applications. In Multiple Criteria Decision Making, M.K. Starr and M. Zelany (eds.), North Holland, Amsterdam. Fishburn... Princeton University Press, Princeton, NJ. Fishburn, P.C. (1977). Multiattribute utilities in expected utility theory. In Conflicting Objectives in

17. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

ERIC Educational Resources Information Center

Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

2012-01-01

Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

18. Evolutionary game theory using agent-based methods.

PubMed

Adami, Christoph; Schossau, Jory; Hintze, Arend

2016-12-01

Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions require agent-based methods, where each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare those to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example, in the weak-selection strong-mutation limit), but that mathematics is crucial to validate the computational simulations.
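
A minimal agent-based simulation of the kind described above can be sketched with the classic Hawk-Dove game: each agent carries a heritable strategy, plays pairwise games, and reproduces in proportion to payoff, with a non-vanishing mutation rate. The payoffs and rates below are illustrative choices, not the paper's.

```python
import random

random.seed(1)
V, C = 2.0, 3.0          # resource value, fight cost (C > V favours a mixed ESS)
MUT = 0.01               # non-vanishing mutation rate
POP = 200                # finite population size

def payoff(a, b):
    if a == "H" and b == "H":
        return (V - C) / 2          # hawks split the value but pay the fight cost
    if a == "H":
        return V                    # hawk takes everything from a dove
    if b == "H":
        return 0.0                  # dove yields to a hawk
    return V / 2                    # two doves share peacefully

pop = [random.choice("HD") for _ in range(POP)]
for _ in range(300):
    random.shuffle(pop)
    agents, fits = [], []
    for i in range(0, POP, 2):      # pair agents and play one game each
        a, b = pop[i], pop[i + 1]
        agents += [a, b]
        fits += [1.0 + payoff(a, b), 1.0 + payoff(b, a)]  # baseline keeps fitness > 0
    # fitness-proportional reproduction, then mutation
    pop = random.choices(agents, weights=fits, k=POP)
    pop = [("D" if s == "H" else "H") if random.random() < MUT else s for s in pop]

hawk_frac = pop.count("H") / POP
print(hawk_frac)  # expected to hover near the mixed ESS at V/C ~ 0.67
```

The mathematical prediction (a mixed equilibrium with hawk frequency V/C) is recovered only on average; the run-to-run fluctuations around it are exactly the finite-population, stochastic effects that the abstract argues require agent-based methods.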

19. Optimisation of a honeybee-colony's energetics via social learning based on queuing delays

Thenius, Ronald; Schmickl, Thomas; Crailsheim, Karl

2008-06-01

Natural selection shaped the foraging-related processes of honeybees in such a way that a colony can react to changing environmental conditions optimally. To investigate this complex dynamic social system, we developed a multi-agent model of the nectar flow inside and outside of a honeybee colony. In a honeybee colony, a temporal caste collects nectar in the environment. These foragers bring their harvest into the colony, where they unload their nectar loads to one or more storer bees. Our model predicts that a cohort of foragers, collecting nectar from a single nectar source, is able to detect changes in quality in other food sources they have never visited, via the nectar processing system of the colony. We identified two novel pathways of forager-to-forager communication. Foragers can gain information about changes in the nectar flow in the environment via changes in their mean waiting time for unloadings and the number of experienced multiple unloadings. This way two distinct groups of foragers that forage on different nectar sources and that never communicate directly can share information via a third cohort of worker bees. We show that this noisy and loosely knotted social network allows a colony to perform collective information processing, so that a single forager has all necessary information available to be able to 'tune' its social behaviour, like dancing or dance-following. This way the net nectar gain of the colony is increased.

20. Stochastic extension of cellular manufacturing systems: a queuing-based analysis

Fardis, Fatemeh; Zandi, Afagh; Ghezavati, Vahidreza

2013-07-01

Clustering parts and machines into part families and machine cells is a major decision in the design of cellular manufacturing systems, known as cell formation. This paper presents a non-linear mixed integer programming model to design cellular manufacturing systems which assumes that the arrival rate of parts into cells and the machine service rate are stochastic parameters described by an exponential distribution. Uncertain situations may create a queue behind each machine; therefore, we consider the average waiting time of parts behind each machine in order to have an efficient system. The objective function minimizes the sum of the idleness cost of machines, the sub-contracting cost for exceptional parts, the non-utilized machine cost, and the holding cost of parts in the cells. Finally, the linearized model is solved by the Cplex solver of GAMS, and sensitivity analysis is performed to illustrate the effectiveness of the parameters.
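
The queuing ingredient can be made concrete: with Poisson part arrivals at rate λ and exponential service at rate μ, each machine behaves as an M/M/1 queue, and the average wait of a part behind the machine has a closed form. The rates below are illustrative.

```python
def mm1_wait_in_queue(lam, mu):
    """Average time a part waits before service starts (M/M/1, requires lam < mu)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                 # machine utilization
    return rho / (mu - lam)        # Wq = rho / (mu - lam)

# Example: 8 parts/hour arriving at a machine that serves 10 parts/hour.
wq = mm1_wait_in_queue(8.0, 10.0)
print(wq)  # 0.4 hours of average waiting per part
```

Because Wq blows up as λ approaches μ, the holding-cost term in the objective effectively penalizes cell assignments that load any machine close to saturation, which is the mechanism tying the queuing analysis to the cell formation decision.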

1. Queuing for Union Jobs and the Social Return to Schooling. Institute for Research on Poverty Discussion Papers. Report 360-76.

ERIC Educational Resources Information Center

Bishop, John

An analysis of the argument that a market imperfection (wage differentials and queuing caused by unions) raises the marginal social product (MSP) of college education above the average before-tax private wage premium (APP) for college (this discrepancy is called a union-Q-nality) focuses on verifying five hypotheses: (1) Workers with identical…

2. The Energetic Assessment of Frictional Instability Based on Rowe's Theory

Hirata, M.; Muto, J.; Nagahama, H.

2015-12-01

Frictional instability, which controls the occurrence of unstable slips, has been related to (1) the rate- and state-dependent friction law (Dieterich, 1979; Ruina, 1983) and (2) shear localization in a gouge layer (e.g., Byerlee et al., 1978; Logan et al., 1979). Ikari et al. (2011) indicated that the transitions of frictional parameters obtained from the rate- and state-dependent friction law involve shear localization. However, the underlying theoretical background for this link has been unknown. Therefore, in this study, we investigate their relation theoretically and experimentally based on Rowe's theory of constant minimum energy ratio (Rowe, 1962), which describes particle deformations quantitatively by energetic analysis. In a theoretical analysis using analytical dynamics and irreversible thermodynamics, an energetic criterion for frictional instability is obtained: unstable slip occurs at energy ratios below 1. In friction experiments using a gas-medium apparatus, simulated fault gouge deforms obeying Rowe's theory. Additionally, the energy ratios change gradually with shear and fall below 1 before the occurrence of unstable slip. Moreover, energy ratios are derived from volume changes. The transition of energy ratios from increase to decrease, which has been confirmed at the end of compaction, indicates the onset of volume increase toward the occurrence of unstable slip. The volume increases likely correspond to the formation of R1-shears with open-mode character, which occur prior to the unstable slip. Shear localization leads to a change in the internal friction angle, which is a statistical parameter that constitutes an energy ratio. In short, changes in the internal friction angle play an important role in the evolution from frictionally stable to unstable behavior. From these results, the physical and energetic background for the link between the frictional parameters and shear localization becomes clear.

3. Switching theory-based steganographic system for JPEG images

Cherukuri, Ravindranath C.; Agaian, Sos S.

2007-04-01

Cellular communications constitute a significant portion of the global telecommunications market; therefore, the need for secured communication over a mobile platform has increased exponentially. Steganography, the art of hiding critical data in an innocuous signal, provides an answer to these needs. JPEG is one of the most commonly used formats for storing and transmitting images on the web. In addition, pictures captured using mobile cameras are mostly in JPEG format. In this article, we introduce a switching theory-based steganographic system for JPEG images which is applicable to mobile and computer platforms. The proposed algorithm uses the fact that the energy distribution among the quantized AC coefficients varies from block to block and coefficient to coefficient. Existing approaches are effective with a part of these coefficients, but when employed over all the coefficients they show their ineffectiveness. Therefore, we propose an approach that treats each set of AC coefficients within a different framework, thus enhancing the performance of the approach. The proposed system offers high capacity and embedding efficiency simultaneously while withstanding simple statistical attacks. In addition, the embedded information can be retrieved without prior knowledge of the cover image. Based on simulation results, the proposed method demonstrates an improved embedding capacity over existing algorithms while maintaining a high embedding efficiency and preserving the statistics of the JPEG image after hiding information.

4. Density functional theory based generalized effective fragment potential method

SciTech Connect

Nguyen, Kiet A. E-mail: ruth.pachter@wpafb.af.mil; Pachter, Ruth E-mail: ruth.pachter@wpafb.af.mil; Day, Paul N.

2014-06-28

We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

5. IMMAN: free software for information theory-based chemometric analysis.

PubMed

Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

2015-05-01

The features and theoretical background of a new, free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing-value processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction and feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA

6. Physical bases for a triad of biological similarity theories.

PubMed

1986-01-01

The dimensional analysis of physics, based on the MLT system (M = mass, L = length, T = time), can be applied to the living world, from mycoplasmas (10^-13 g) to blue whales (10^8 g). Body mass (M) or body weight (W) is utilized as a convenient reference system, since it represents the integrated masses of all elementary particles, at the atomic level, which constitute an organism. A triad of biological similarities (mechanical, biological, transport) has been previously described. Each similarity was based on two postulates, of which the first was common to all three, i.e., the constancy of body density, whereas the second postulates were specific to each of the three theories. In this study a physical foundation for these second postulates, based on three universal constants of nature, is presented. These are: 1) the acceleration of gravity (g = LT^-2); 2) the velocity of light (c = LT^-1); and 3) the mass-specific quantum (h/m = L^2T^-1). The realm of each of these biological similarities is the following: 1) the gravitational or mechanical similarity (where g = constant) deals mainly with the relationship between a whole organism and its environment, particularly with locomotion. The acceleration of gravity (g) is also one of the determining factors of the "potential" energy (E = m·g·H), where m is the mass and H is the height above the reference level; 2) the electrodynamic similarity (formerly biological similarity) (c = constant) is able to quantitatively define the internal organization of an organism from both a morphological and a physiological point of view. (ABSTRACT TRUNCATED AT 250 WORDS)

7. An Approach to Theory-Based Youth Programming

ERIC Educational Resources Information Center

Duerden, Mat D.; Gillard, Ann

2011-01-01

A key but often overlooked aspect of intentional, out-of-school-time programming is the integration of a guiding theoretical framework. The incorporation of theory in programming can provide practitioners valuable insights into essential processes and principles of successful programs. While numerous theories exist that relate to youth development…

8. Fowler Nordheim theory of carbon nanotube based field emitters

Parveen, Shama; Kumar, Avshish; Husain, Samina; Husain, Mushahid

2017-01-01

Field emission (FE) phenomena are generally explained in the framework of Fowler-Nordheim (FN) theory, which was originally derived for flat metal surfaces. In this work, an effort has been made to present the field emission mechanism in carbon nanotubes (CNTs), which have tip-type geometry at the nanoscale. The high aspect ratio of CNTs leads to a large field enhancement factor and lower operating voltages, because the electric field strength in the vicinity of the nanotube tip can be enhanced a thousandfold. The work function of the nanostructure has been calculated from the FN plot by reverse engineering. With the help of the modified FN equation, an important formula for the effective emitting area (the active area for emission of electrons) has been derived and employed to calculate the active emitting area for CNT field emitters. Therefore, it is of great interest to present a state-of-the-art study on the complete solution of the FN equation for CNT-based field emitter displays. This manuscript will also provide a better understanding of the calculation of different FE parameters of CNT field emitters using the FN equation.
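The FN-plot analysis referred to above can be sketched as follows. In an FN plot, ln(I/V^2) versus 1/V is linear, and its slope S = -B·φ^(3/2)·d/β ties together the work function φ (eV), the electrode gap d (m), and the field enhancement factor β, with B the second Fowler-Nordheim constant. The numerical values below (φ = 5 eV, d = 500 μm, slope) are assumed for illustration, not taken from the paper:

```python
# Extracting the field enhancement factor beta from an FN-plot slope.
B = 6.83e9  # second Fowler-Nordheim constant, V/m per eV^(3/2)

def enhancement_factor(slope, phi_eV, gap_m):
    # solve S = -B * phi^(3/2) * d / beta for beta (slope is negative)
    return -B * phi_eV**1.5 * gap_m / slope

# assumed example: phi = 5 eV, 500 um gap, measured FN slope of -9.54e3 V
beta = enhancement_factor(-9.54e3, 5.0, 500e-6)
print(round(beta))
```

Enhancement factors of a few thousand, as in this illustration, are consistent with the abstract's statement that CNT tips enhance the local field by roughly a thousandfold or more.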

9. Scheduling for indoor visible light communication based on graph theory.

PubMed

Tao, Yuyang; Liang, Xiao; Wang, Jiaheng; Zhao, Chunming

2015-02-09

Visible light communication (VLC) has drawn much attention in the field of high-rate indoor wireless communication. While most existing works focused on point-to-point VLC technologies, few studies have considered multiuser VLC, where multiple optical access points (APs) transmit data to multiple user receivers. In such scenarios, inter-user interference constitutes the major factor limiting system performance. Therefore, a proper scheduling scheme has to be proposed to coordinate the interference and optimize the whole system performance. In this work, we aim to maximize the sum rate of the system while taking into account user fairness by appropriately assigning LED lamps to multiple users. The formulated scheduling problem turns out to be a maximum weighted independent set problem. We then propose a novel and efficient resource allocation method based on graph theory to achieve high sum rates. Moreover, we also introduce proportional fairness into our scheduling scheme to ensure user fairness. Our proposed scheduling scheme can, with low complexity, achieve higher multiplexing gain, higher sum rates, and better fairness than existing schemes.
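The maximum weighted independent set (MWIS) formulation mentioned above can be illustrated with a simple greedy heuristic (not the paper's algorithm): vertices are hypothetical (AP, user) assignments weighted by rate, and edges mark interfering pairs that cannot be scheduled together.

```python
# Greedy heuristic for maximum weighted independent set: repeatedly pick the
# heaviest remaining vertex and exclude its neighbors (interfering assignments).
def greedy_mwis(weights, edges):
    adj = {v: set() for v in weights}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    chosen, excluded = [], set()
    for v in sorted(weights, key=weights.get, reverse=True):
        if v not in excluded:
            chosen.append(v)
            excluded |= adj[v]
    return chosen

# hypothetical assignments: AP1-u2 interferes with both of the others
weights = {"AP1-u1": 5.0, "AP2-u2": 4.0, "AP1-u2": 3.0}
edges = [("AP1-u1", "AP1-u2"), ("AP2-u2", "AP1-u2")]
print(greedy_mwis(weights, edges))  # ['AP1-u1', 'AP2-u2']
```

A greedy pass is only an approximation; exact MWIS is NP-hard in general, which is why low-complexity schemes such as the one proposed in the paper are of interest.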

10. A model of dissociation based on attachment theory and research.

PubMed

Liotti, Giovanni

2006-01-01

The article offers an historical review of studies on the role played by attachment processes in dissociative psychopathology. The treatise proceeds from Bowlby's first insights, through Main and her collaborators' empirical studies on attachment disorganization, to the first formulation of the hypothesis linking disorganized early attachment to pathological dissociation. Recent research supporting the hypothesis is then reviewed. It is concluded that infant attachment disorganization is in itself a dissociative process, and predisposes the individual to respond with pathological dissociation to later traumas and life stressors. Four implications of this theory are interspersed in the review and are discussed in the final section: (1) pathological dissociation should be viewed as a primarily intersubjective reality hindering the integrative processes of consciousness, rather than as an intrapsychic defense against mental pain; (2) early defenses against attachment-related dissociation are based on interpersonal controlling strategies that inhibit the attachment system; (3) dissociative symptoms emerge as a consequence of the collapse of these defensive strategies in the face of events that powerfully activate the attachment system; (4) psychotherapy of pathological dissociation should be a phase-oriented process focused primarily on achieving attachment security, and only secondarily on trauma work. Research studies on the psychotherapy process could test some predictions of this model.

11. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

PubMed Central

Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

2016-01-01

Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when pieces of evidence highly conflict, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods. PMID:26797611
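The combination step that D-S theory provides, and that the paper modifies via weighted averaging, is Dempster's rule. A minimal sketch for two mass functions over assumed fault hypotheses F1 and F2 follows; it shows the normalization by the conflict mass that produces counterintuitive results when conflict is high:

```python
# Dempster's rule of combination for two basic probability assignments whose
# focal elements are frozensets of hypotheses from the same frame.
def combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b  # intersection of the two focal elements
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to the empty set
    k = 1.0 - conflict  # normalization constant
    return {a: v / k for a, v in combined.items()}

# assumed sensor reports over fault hypotheses F1, F2
m1 = {frozenset({"F1"}): 0.6, frozenset({"F1", "F2"}): 0.4}
m2 = {frozenset({"F1"}): 0.5, frozenset({"F2"}): 0.5}
print(combine(m1, m2))
```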

12. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

PubMed

Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

2016-01-18

Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when pieces of evidence highly conflict, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods.

13. The Development of an Attribution-Based Theory of Motivation: A History of Ideas

ERIC Educational Resources Information Center

Weiner, Bernard

2010-01-01

The history of ideas guiding the development of an attribution-based theory of motivation is presented. These influences include the search for a "grand" theory of motivation (from drive and expectancy/value theory), an attempt to represent how the past may influence the present and the future (as Thorndike accomplished), and the…

14. The Application of Carl Rogers' Person-Centered Learning Theory to Web-Based Instruction.

ERIC Educational Resources Information Center

Miller, Christopher T.

This paper provides a review of literature that relates research on Carl Rogers' person-centered learning theory to Web-based learning. Based on the review of the literature, a set of criteria is described that can be used to determine how closely a Web-based course matches the different components of Rogers' person-centered learning theory. Using…

15. Prior individual training and self-organized queuing during group emergency escape of mice from water pool.

PubMed

Saloma, Caesar; Perez, Gay Jane; Gavile, Catherine Ann; Ick-Joson, Jacqueline Judith; Palmes-Saloma, Cynthia

2015-01-01

We study the impact of prior individual training during group emergency evacuation using mice that escape from an enclosed water pool to a dry platform via any of two possible exits. Experimenting with mice avoids serious ethical and legal issues that arise when dealing with unwitting human participants while minimizing concerns regarding the reliability of results obtained from simulated experiments using 'actors'. First, mice were trained separately and their individual escape times measured over several trials. Mice learned quickly to swim towards an exit: they achieved their fastest escape times within the first four trials. The trained mice were then placed together in the pool and allowed to escape. No two mice were permitted in the pool beforehand and only one could pass through an exit opening at any given time. At first trial, groups of trained mice escaped seven and five times faster than their corresponding control groups of untrained mice at pool occupancy rates ρ of 11.9% and 4%, respectively. Faster evacuation happened because trained mice: (a) had better recognition of the available pool space and took shorter escape routes to an exit, (b) were less likely to form arches that blocked an exit opening, and (c) utilized the two exits efficiently without preference. Trained groups achieved continuous egress without an apparent leader-coordinator (self-organized queuing), a collective behavior not experienced during individual training. Queuing was unobserved in untrained groups, where mice were prone to wall seeking, aimless swimming, and/or blind copying that produced circuitous escape routes, biased exit use, and clogging. The experiments also reveal that faster and less costly group training at ρ = 4% yielded an average individual escape time comparable with that from individualized training. However, group training in a more crowded pool (ρ = 11.9%) produced a longer average individual escape time.

16. Prior Individual Training and Self-Organized Queuing during Group Emergency Escape of Mice from Water Pool

PubMed Central

Saloma, Caesar; Perez, Gay Jane; Gavile, Catherine Ann; Ick-Joson, Jacqueline Judith; Palmes-Saloma, Cynthia

2015-01-01

We study the impact of prior individual training during group emergency evacuation using mice that escape from an enclosed water pool to a dry platform via any of two possible exits. Experimenting with mice avoids serious ethical and legal issues that arise when dealing with unwitting human participants while minimizing concerns regarding the reliability of results obtained from simulated experiments using ‘actors’. First, mice were trained separately and their individual escape times measured over several trials. Mice learned quickly to swim towards an exit: they achieved their fastest escape times within the first four trials. The trained mice were then placed together in the pool and allowed to escape. No two mice were permitted in the pool beforehand and only one could pass through an exit opening at any given time. At first trial, groups of trained mice escaped seven and five times faster than their corresponding control groups of untrained mice at pool occupancy rates ρ of 11.9% and 4%, respectively. Faster evacuation happened because trained mice: (a) had better recognition of the available pool space and took shorter escape routes to an exit, (b) were less likely to form arches that blocked an exit opening, and (c) utilized the two exits efficiently without preference. Trained groups achieved continuous egress without an apparent leader-coordinator (self-organized queuing), a collective behavior not experienced during individual training. Queuing was unobserved in untrained groups, where mice were prone to wall seeking, aimless swimming, and/or blind copying that produced circuitous escape routes, biased exit use, and clogging. The experiments also reveal that faster and less costly group training at ρ = 4% yielded an average individual escape time comparable with that from individualized training. However, group training in a more crowded pool (ρ = 11.9%) produced a longer average individual escape time. PMID:25693170

17. An approach to theory-based youth programming.

PubMed

Duerden, Mat D; Gillard, Ann

2011-01-01

A key but often overlooked aspect of intentional, out-of-school-time programming is the integration of a guiding theoretical framework. The incorporation of theory in programming can provide practitioners valuable insights into essential processes and principles of successful programs. While numerous theories exist that relate to youth development practice, they often remain inaccessible to practitioners. Therefore, the goal of this article is to synthesize two theoretical perspectives, the social development model and self-determination theory, into a practitioner-friendly programming framework. The resulting social development programming model outlines specific components, processes, and outcomes of effective and intentional youth development programs.

18. Capacity and Delay Estimation for Roundabouts Using Conflict Theory

PubMed Central

Qu, Zhaowei; Duan, Yuzhou; Hu, Hongyu; Song, Xianmin

2014-01-01

To estimate the capacity of roundabouts more accurately, the priority rank of each stream is determined through the classification technique given in the Highway Capacity Manual 2010 (HCM2010), which is based on a macroscopic analysis of the relationship between entry flow and circulating flow. Then a conflict matrix is established using the additive conflict flow method, considering the impacts of traffic characteristics and of limited priority under high volume. Correspondingly, the conflict relationships of the streams are modeled using probability theory. Furthermore, the entry capacity model of roundabouts is built, and sensitivity analysis is conducted on the model parameters. Finally, the entrance delay model is derived using queuing theory, and the proposed capacity model is compared with the model proposed by Wu and that in the HCM2010. The results show that the capacity calculated by the proposed model is lower than the others for an A-type roundabout, while it is basically consistent with the estimated values from HCM2010 for a B-type roundabout. PMID:24982982
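The queuing-theoretic delay step can be illustrated with the textbook M/M/1 result (this is a generic sketch, not the paper's specific entrance-delay model): for arrival rate λ and service rate μ with λ < μ, the average time in the system is W = 1/(μ - λ).

```python
# Average delay in an M/M/1 queue: W = 1 / (mu - lambda), rates in veh/s.
def mm1_delay(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# assumed example: an entry serving 1200 veh/h (capacity) with 900 veh/h arriving
w = mm1_delay(900 / 3600, 1200 / 3600)
print(round(w, 1))  # average delay per vehicle, seconds -> 12.0
```

Note how delay grows without bound as the arrival rate approaches capacity, which is why small capacity differences between the compared models matter near saturation.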

19. Capacity and delay estimation for roundabouts using conflict theory.

PubMed

Qu, Zhaowei; Duan, Yuzhou; Hu, Hongyu; Song, Xianmin

2014-01-01

To estimate the capacity of roundabouts more accurately, the priority rank of each stream is determined through the classification technique given in the Highway Capacity Manual 2010 (HCM2010), which is based on a macroscopic analysis of the relationship between entry flow and circulating flow. Then a conflict matrix is established using the additive conflict flow method, considering the impacts of traffic characteristics and of limited priority under high volume. Correspondingly, the conflict relationships of the streams are modeled using probability theory. Furthermore, the entry capacity model of roundabouts is built, and sensitivity analysis is conducted on the model parameters. Finally, the entrance delay model is derived using queuing theory, and the proposed capacity model is compared with the model proposed by Wu and that in the HCM2010. The results show that the capacity calculated by the proposed model is lower than the others for an A-type roundabout, while it is basically consistent with the estimated values from HCM2010 for a B-type roundabout.

20. Trends in information theory-based chemical structure codification.

PubMed

Barigye, Stephen J; Marrero-Ponce, Yovani; Pérez-Giménez, Facundo; Bonchev, Danail

2014-08-01

This report offers a chronological review of the most relevant applications of information theory to the codification of chemical structure information, through the so-called information indices. Basically, these are derived from the analysis of the statistical patterns of molecular structure representations, which include primitive global chemical formulae, chemical graphs, or matrix representations. Finally, new approaches that attempt to go "back to the roots" of information theory in order to integrate other information-theoretic measures into chemical structure coding are discussed.

1. Toward A Brain-Based Theory of Beauty

PubMed Central

Ishizu, Tomohiro; Zeki, Semir

2011-01-01

We wanted to learn whether activity in the same area(s) of the brain correlates with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1–9, with 9 being the most beautiful. This allowed us to select three sets of stimuli (beautiful, indifferent, and ugly) which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources (musical and visual), and probably by other sources as well. This has led us to formulate a brain-based theory of beauty. PMID:21755004

2. Predicting the Number of Public Computer Terminals Needed for an On-Line Catalog: A Queuing Theory Approach.

ERIC Educational Resources Information Center

Knox, A. Whitney; Miller, Bruce A.

1980-01-01

Describes a method for estimating the number of cathode ray tube terminals needed for public use of an online library catalog. The authors claim the method could also be used to estimate the number of microform readers needed for a computer output microform (COM) catalog. Formulae are included. (Author/JD)
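The abstract's formulae are not reproduced in the record, but sizing a bank of shared terminals is a classic multi-server queuing problem. As a hedged illustration (an Erlang-C style M/M/c calculation, not necessarily the authors' method), one can find the smallest number of terminals that keeps the probability of a patron having to wait below a target:

```python
# Erlang-C: probability that an arriving user must wait in an M/M/c queue,
# given c servers and offered load a = lambda/mu (in Erlangs).
import math

def erlang_c(servers, offered_load):
    a, s = offered_load, servers
    if a >= s:
        return 1.0  # unstable: everyone eventually waits
    top = a**s / math.factorial(s) * s / (s - a)
    bottom = sum(a**k / math.factorial(k) for k in range(s)) + top
    return top / bottom

def terminals_needed(offered_load, max_wait_prob=0.1):
    s = max(1, math.ceil(offered_load))
    while erlang_c(s, offered_load) > max_wait_prob:
        s += 1
    return s

# assumed example: 30 users/hour with 6-minute sessions -> 3 Erlangs of load
print(terminals_needed(3.0))  # 6
```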

3. Enhancing Student Learning in Knowledge-Based Courses: Integrating Team-Based Learning in Mass Communication Theory Classes

ERIC Educational Resources Information Center

Han, Gang; Newell, Jay

2014-01-01

This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…

4. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

ERIC Educational Resources Information Center

Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

2008-01-01

This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

5. Models for Theory-Based M.A. and Ph.D. Programs.

ERIC Educational Resources Information Center

Botan, Carl; Vasquez, Gabriel

1999-01-01

Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

6. Cluster density functional theory for lattice models based on the theory of Möbius functions

Lafuente, Luis; Cuesta, José A.

2005-08-01

Rosenfeld's fundamental-measure theory for lattice models is given a rigorous formulation in terms of the theory of Möbius functions of partially ordered sets. The free-energy density functional is expressed as an expansion in a finite set of lattice clusters. This set is endowed with a partial order, so that the coefficients of the cluster expansion are connected to its Möbius function. Because of this, it is rigorously proven that a unique such expansion exists for any lattice model. The low-density analysis of the free-energy functional motivates a redefinition of the basic clusters (zero-dimensional cavities) which guarantees a correct zero-density limit of the pair and triplet direct correlation functions. This new definition extends Rosenfeld's theory to lattice models with any kind of short-range interaction (repulsive or attractive, hard or soft, one or multicomponent ...). Finally, a proof is given that these functionals have a consistent dimensional reduction, i.e. the functional for dimension d' can be obtained from that for dimension d (d' < d) if the latter is evaluated at a density profile confined to a d'-dimensional subset.
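The Möbius function of a partially ordered set, which the cluster-expansion coefficients above are connected to, has a simple recursive definition: μ(x, x) = 1 and μ(x, y) = -Σ μ(x, z) over all z with x ≤ z < y. A small self-contained sketch (using the divisibility poset as a stand-in for a lattice-cluster poset) is:

```python
# Mobius function of a finite poset given its elements and order relation.
from functools import lru_cache

def mobius(elements, leq):
    @lru_cache(maxsize=None)
    def mu(x, y):
        if x == y:
            return 1
        if not leq(x, y):
            return 0
        # mu(x, y) = -sum of mu(x, z) over x <= z < y
        return -sum(mu(x, z) for z in elements
                    if leq(x, z) and z != y and leq(z, y))
    return mu

# divisibility poset on {1, 2, 3, 6}; mu(1, 6) recovers the number-theoretic
# Mobius value mu(6) = +1
mu = mobius((1, 2, 3, 6), lambda a, b: b % a == 0)
print(mu(1, 6))  # 1
```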

7. Development of StopAdvisor: A theory-based interactive internet-based smoking cessation intervention.

PubMed

Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert

2012-09-01

Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change.

8. Percolation theory on interdependent networks based on epidemic spreading

Son, Seung-Woo; Bizhani, Golnoosh; Christensen, Claire; Grassberger, Peter; Paczuski, Maya

2012-01-01

We consider percolation on interdependent locally treelike networks, recently introduced by Buldyrev S. V. et al., Nature, 464 (2010) 1025, and demonstrate that the problem can be simplified conceptually by deleting all references to cascades of failures. Such cascades do exist, but their explicit treatment just complicates the theory, which is a straightforward extension of the usual epidemic spreading theory on a single network. Our method has the added benefits that it is directly formulated in terms of an order parameter and that its modular structure can be easily extended to other problems, e.g. to any number of interdependent networks, or to networks with dependency links.
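The single-network theory this extends can be sketched in a few lines: for an Erdős-Rényi network with mean degree c, the order parameter (giant component fraction S) solves the standard self-consistency equation S = 1 - exp(-cS), which can be found by fixed-point iteration. This is the textbook single-network case, shown for orientation only; the interdependent case couples two such equations.

```python
# Fixed-point iteration for the giant component fraction S of an
# Erdos-Renyi network with mean degree c: S = 1 - exp(-c * S).
import math

def giant_component(c, iters=200):
    s = 0.5  # any positive initial guess below 1 works
    for _ in range(iters):
        s = 1.0 - math.exp(-c * s)
    return s

print(round(giant_component(2.0), 3))  # 0.797
```

The transition at c = 1, where S first becomes nonzero, is the percolation threshold in this formulation.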

9. Effective Contraceptive Use: An Exploration of Theory-Based Influences

ERIC Educational Resources Information Center

Peyman, N.; Oakley, D.

2009-01-01

The purpose of this study was to explore factors that influence oral contraceptive (OC) use among women in Iran using the Theory of Planned Behavior (TPB) and concept of self-efficacy (SE). The study sample consisted of 360 married OC users, aged 18-49 years recruited at public health centers of Mashhad, 900 km east of Tehran. SE had the strongest…

10. Automatic Trading Agent. RMT Based Portfolio Theory and Portfolio Selection

Snarska, M.; Krzych, J.

2006-11-01

Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio, which arises from lack of information, uncertainty, and incomplete knowledge of reality, which forbid a perfect prediction of future price changes. Despite its many advantages, this tool is neither well known nor widely used among investors on the Warsaw Stock Exchange. The main reason for abandoning this method is its high level of complexity and the immense calculations involved. The aim of this paper is to introduce an automatic decision-making system, which allows a single investor to use complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so the future risk and return are better estimated and controlled. These concepts are applied to the Warsaw Stock Exchange Simulator ( http://gra.onet.pl ). The result of the simulation is an 18% gain, compared with a respective 10% loss of the Warsaw Stock Exchange main index WIG.
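The standard RMT filtering step alluded to above compares the eigenvalues of the empirical correlation matrix against the Marchenko-Pastur upper bound λ_max = (1 + √(1/q))², with q = T/N the ratio of observations to assets; eigenvalues below λ_max are treated as noise. A minimal sketch with assumed dimensions:

```python
# Marchenko-Pastur upper edge for the eigenvalue spectrum of a random
# correlation matrix estimated from q = T/N observations per asset.
import math

def marchenko_pastur_max(n_assets, n_obs):
    q = n_obs / n_assets
    return (1 + math.sqrt(1 / q)) ** 2

# assumed example: 100 assets observed over 400 trading days (q = 4)
print(round(marchenko_pastur_max(100, 400), 2))  # 2.25
```

In a filtering pipeline, eigenvalues of the empirical correlation matrix exceeding this bound are kept as signal (market and sector modes) and the rest are replaced or shrunk before risk is estimated.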

11. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

ERIC Educational Resources Information Center

Waycott, Jenny; Jones, Ann; Scanlon, Eileen

2005-01-01

This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part-time learners and mobile workers adapted and appropriated mobile devices for their activities, and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

12. Course Management and Students' Expectations: Theory-Based Considerations

ERIC Educational Resources Information Center

Buckley, M. Ronald; Novicevic, Milorad M.; Halbesleben, Jonathon R. B.; Harvey, Michael

2004-01-01

This paper proposes a theoretical, yet practical, framework for managing the formation process of students' unrealistic expectations in a college course. Using relational contracting theory, alternative teacher interventions, aimed at effective management of students' expectations about the course, are described. Also, the formation of the student…

13. Videogames, Tools for Change: A Study Based on Activity Theory

ERIC Educational Resources Information Center

Méndez, Laura; Lacasa, Pilar

2015-01-01

Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

14. Ground Movement Analysis Based on Stochastic Medium Theory

PubMed Central

Fei, Meng; Li-chun, Wu; Jia-sheng, Zhang; Guo-dong, Deng; Zhi-hui, Ni

2014-01-01

In order to calculate the ground movement induced by displacement piles driven into horizontal layered strata, an axisymmetric model was built and the vertical and horizontal ground movement functions were then deduced using stochastic medium theory. Results show that the vertical ground movement obeys a normal distribution function, while the horizontal ground movement follows an exponential function. Using field-measured data, the parameters of these functions can be obtained by back analysis, and an example was employed to verify this model. The results show that stochastic medium theory is suitable for calculating the ground movement in pile driving, and that there is no need to consider the constitutive model of the soil or the contact between pile and soil. This method is applicable in practice. PMID:24701184

15. Monopoles in a grand unified theory based on SU(5)

Scott, D. M.

1980-10-01

A new potential is presented which leads to the symmetry-breaking scheme of Georgi and Glashow. Using this potential the classical masses of the particles of the theory are calculated. The model contains monopoles. The vector boson masses and the masses of the monopoles satisfy m^2 ⩾ v^2 (y^2/(4 cos^2 θ_W) + g^2 cos^2 θ_W), where v is the length of the Higgs field transforming according to the adjoint representation when the field is in the vacuum, y is the hypercharge of the particle, and g its magnetic charge; θ_W is the weak mixing angle. The 'J + T' symmetric monopoles of the theory are explicitly shown to obey a generalised Dirac quantisation condition.

16. Literary pedagogy in nursing: a theory-based perspective.

PubMed

Sakalys, Jurate A

2002-09-01

Using fictional and autobiographical literature in nursing education is a primary way of understanding patients' lived experiences and fostering development of essential relational and reflective thinking skills. Application of literary theory to this pedagogic practice can expand conceptualization of teaching goals, inform specific teaching strategies, and potentially contribute to socially consequential educational outcomes. This article describes a theoretical schema that focuses on pedagogical goals in terms of the three related skills (i.e., reading, interpretation, criticism) of textual competence.

17. Collective learning modeling based on the kinetic theory of active particles.

PubMed

Burini, D; De Lillo, S; Gibelli, L

2016-03-01

This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.

18. Circuit theory and model-based inference for landscape connectivity

USGS Publications Warehouse

Hanks, Ephraim M.; Hooten, Mevin B.

2013-01-01

Circuit theory has seen extensive recent use in the field of ecology, where it is often applied to study functional connectivity. The landscape is typically represented by a network of nodes and resistors, with the resistance between nodes a function of landscape characteristics. The effective distance between two locations on a landscape is represented by the resistance distance between the nodes in the network. Circuit theory has been applied to many other scientific fields for exploratory analyses, but parametric models for circuits are not common in the scientific literature. To model circuits explicitly, we demonstrate a link between Gaussian Markov random fields and contemporary circuit theory using a covariance structure that induces the necessary resistance distance. This provides a parametric model for second-order observations from such a system. In the landscape ecology setting, the proposed model provides a simple framework where inference can be obtained for effects that landscape features have on functional connectivity. We illustrate the approach through a landscape genetics study linking gene flow in alpine chamois (Rupicapra rupicapra) to the underlying landscape.
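
The resistance distance underlying this link is computable directly from the graph Laplacian, via R_ij = L+_ii + L+_jj - 2 L+_ij with L+ the Laplacian pseudoinverse. A minimal sketch (the three-node network is a made-up example, not data from the study):

```python
import numpy as np

def resistance_distance(conductance):
    """Effective resistance between all node pairs of a resistor network.

    conductance: (n, n) symmetric matrix; entry (i, j) is the conductance
    (1/resistance) of the edge between nodes i and j, 0 if absent.
    Uses R_ij = L+_ii + L+_jj - 2 L+_ij, L+ = pseudoinverse of the Laplacian.
    """
    L = np.diag(conductance.sum(axis=1)) - conductance
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

# Three nodes in a chain with unit resistors: 0 -- 1 -- 2.
C = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
R = resistance_distance(C)
# Unit resistors in series: effective resistance between the ends is 2.
```

In the landscape setting, the conductance entries would be a parametric function of landscape covariates, which is what lets the Gaussian Markov random field formulation support inference on those parameters.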

19. Development and Validation of a Theory Based Screening Process for Suicide Risk

DTIC Science & Technology

2014-09-01

Award Number: W81XWH-11-1-0588. Title: Development and Validation of a Theory Based Screening Process for Suicide Risk. Distribution Unlimited. Abstract: The ultimate objective of this study is to assist in increasing the capacity of military-based

20. Diffraction-based optical filtering: Theory and implementation with MEMS

Belikov, Ruslan

An important functionality in many optical systems is to manipulate the spectral content of light. Diffractive optics has been used widely for this purpose. Typically, in such systems a diffractive element essentially acts as an optical filter on the incident beam of light. However, no comprehensive theory of this type of filtering existed. Furthermore, recent advances in MEMS technology have enabled reconfigurable diffractive optical elements, which make it possible to create programmable spectral filters. Such devices can lead to significant advances in many applications and enable new classes of optical instruments and systems. Hence, a need arose to develop an understanding of the capabilities and limitations of such devices. The theory presented in this work answers three main questions: (1) how does one synthesize a diffractive optical element (DOE) for a desired filter; (2) what are the capabilities and limitations of such filters; and (3) what is the best device to use? We present two analytical algorithms to compute the DOE for any complex-valued linear filter, and thus answer question 1. The theory also leads to an understanding that there are fundamental trade-offs between filter complexity, power, error, and spectral range, which answers question 2. We then show that a fully arbitrary DOE is very redundant as a filter, and that we can maintain full functionality with a much simpler device, answering question 3. We then apply the theory to existing devices, which leads to an understanding of their capabilities and limitations. Furthermore, the theory led to the discovery that some well-known MEMS devices, such as the Texas Instruments DMD array, can be used as arbitrary spectral filters. Using the DMD, we demonstrate three applications that can benefit from this technology: correlation spectroscopy, femtosecond pulse shaping, and tunable lasers. In all three applications, we enable functionality never achieved before. The most significant achievement is our

1. A theory-based approach to teaching young children about health: A recipe for understanding

PubMed Central

Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

2011-01-01

The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five-month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

2. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

ERIC Educational Resources Information Center

Bresciani, Marilee J.

2011-01-01

The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

3. Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact

ERIC Educational Resources Information Center

McQuillin, Samuel D.; Lyons, Michael D.

2016-01-01

This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…

4. Social Learning Theory Parenting Intervention Promotes Attachment-Based Caregiving in Young Children: Randomized Clinical Trial

ERIC Educational Resources Information Center

O'Connor, Thomas G.; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

2013-01-01

Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social…

5. Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory

ERIC Educational Resources Information Center

Johnson, David W.; Johnson, Roger T.; Smith, Karl A.

2014-01-01

Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…

6. Curriculum Design for Junior Life Sciences Based Upon the Theories of Piaget and Skinner. Final Report.

ERIC Educational Resources Information Center

Pearce, Ella Elizabeth

Four seventh grade life science classes, given curriculum materials based upon Piagetian theories of intellectual development and Skinner's theories of secondary reinforcement, were compared with four control classes from the same school districts. Nine students from each class, who (at the pretest) were at the concrete operations stage of…

7. Involving Stakeholders in Programme Theory Specification: Discussion of a Systematic, Consensus-Based Approach

ERIC Educational Resources Information Center

van Urk, Felix; Grant, Sean; Bonell, Chris

2016-01-01

The use of explicit programme theory to guide evaluation is widely recommended. However, practitioners and other partnering stakeholders often initiate programmes based on implicit theories, leaving researchers to explicate them before commencing evaluation. The current study aimed to apply a systematic method to undertake this process. We…

8. The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation

ERIC Educational Resources Information Center

Moorer, Cleamon, Jr.

2014-01-01

This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…

9. Interactive Image Segmentation Framework Based On Control Theory.

PubMed

Zhu, Liangjia; Kolesov, Ivan; Karasev, Peter; Tannenbaum, Allen

2015-02-21

Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design and analyze an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated.

10. Interactive image segmentation framework based on control theory

Zhu, Liangjia; Kolesov, Ivan; Ratner, Vadim; Karasev, Peter; Tannenbaum, Allen

2015-03-01

Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated.

11. Branes in Extended Spacetime: Brane Worldvolume Theory Based on Duality Symmetry.

PubMed

Sakatani, Yuho; Uehara, Shozo

2016-11-04

We propose a novel approach to the brane worldvolume theory based on the geometry of extended field theories: double field theory and exceptional field theory. We demonstrate the effectiveness of this approach by showing that one can reproduce the conventional bosonic string and membrane actions, and the M5-brane action in the weak-field approximation. At first glance, the proposed 5-brane action without approximation looks different from the known M5-brane actions, but it is consistent with the known nonlinear self-duality relation, and it may provide a new formulation of a single M5-brane action. Actions for exotic branes are also discussed.

12. Tumour chemotherapy strategy based on impulse control theory.

PubMed

Ren, Hai-Peng; Yang, Yan; Baptista, Murilo S; Grebogi, Celso

2017-03-06

Chemotherapy is a widely accepted method for tumour treatment. A medical doctor usually treats patients periodically with an amount of drug according to empirical medicine guides. From the point of view of cybernetics, this procedure is an impulse control system, where the amount and frequency of drug used can be determined analytically using impulse control theory. In this paper, the stability of a chemotherapy treatment of a tumour is analysed by applying impulse control theory. The globally stable condition for prescription of a periodic oscillatory chemotherapeutic agent is derived. The permanence of the solution of the treatment process is verified using the Lyapunov function and the comparison theorem. Finally, we provide the values for the strength and the time interval with which the chemotherapeutic agent needs to be applied such that the proposed impulse chemotherapy can eliminate the tumour cells and preserve the immune cells. The results given in the paper provide an analytical formula to guide medical doctors in choosing the theoretical minimum amount of drug that treats the cancer without harming the patients through over-treatment. This article is part of the themed issue 'Horizons of cybernetical physics'.
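
The flavour of such an impulsive dosing condition can be illustrated with a deliberately simplified model: exponential tumour regrowth between doses and a fixed kill fraction per dose, so elimination requires the per-cycle multiplier exp(r·T)·(1 − k) to be below 1. This toy model is an assumption for illustration; the paper's tumour-immune dynamics and Lyapunov analysis are considerably richer.

```python
import math

def tumour_after_doses(n0, growth_rate, period, kill_fraction, doses):
    """Tumour burden after repeated impulsive doses.

    Between doses the tumour grows exponentially; each dose instantly
    removes a fixed fraction of cells.  Elimination requires
    exp(growth_rate * period) * (1 - kill_fraction) < 1.
    """
    n = n0
    for _ in range(doses):
        n *= math.exp(growth_rate * period)  # free growth over one period
        n *= (1.0 - kill_fraction)           # impulsive dose
    return n

# Growth factor per week exp(0.1 * 7) ~ 2.01; a weekly dose killing 60%
# gives a per-cycle multiplier ~0.81 < 1, so the burden shrinks; a 30%
# kill gives ~1.41 > 1, and the tumour escapes the treatment.
```

The paper's contribution is precisely the analytical version of this threshold for a realistic model, so that the minimum drug strength and dosing interval can be chosen rather than guessed.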

13. Assembly models for Papovaviridae based on tiling theory

Keef, T.; Taormina, A.; Twarock, R.

2005-09-01

A vital constituent of a virus is its protein shell, called the viral capsid, that encapsulates and hence provides protection for the viral genome. Assembly models are developed for viral capsids built from protein building blocks that can assume different local bonding structures in the capsid. This situation occurs, for example, for viruses in the family of Papovaviridae, which are linked to cancer and are hence of particular interest for the health sector. More specifically, the viral capsids of the (pseudo-) T = 7 particles in this family consist of pentamers that exhibit two different types of bonding structures. While this scenario cannot be described mathematically in terms of Caspar-Klug theory (Caspar D L D and Klug A 1962 Cold Spring Harbor Symp. Quant. Biol. 27 1), it can be modelled via tiling theory (Twarock R 2004 J. Theor. Biol. 226 477). The latter is used to encode the local bonding environment of the building blocks in a combinatorial structure, called the assembly tree, which is a basic ingredient in the derivation of assembly models for Papovaviridae along the lines of the equilibrium approach of Zlotnick (Zlotnick A 1994 J. Mol. Biol. 241 59). A phase space formalism is introduced to characterize the changes in the assembly pathways and intermediates triggered by the variations in the association energies characterizing the bonds between the building blocks in the capsid. Furthermore, the assembly pathways and concentrations of the statistically dominant assembly intermediates are determined. The example of Simian virus 40 is discussed in detail.

14. Improving the Impact and Implementation of Disaster Education Programs for Children Through Theory-Based Evaluation.

PubMed

Johnson, Victoria A; Ronan, Kevin R; Johnston, David M; Peace, Robin

2016-11-01

A main weakness in the evaluation of disaster education programs for children is evaluators' propensity to judge program effectiveness based on changes in children's knowledge. Few studies have articulated an explicit program theory of how children's education would achieve desired outcomes and impacts related to disaster risk reduction in households and communities. This article describes the advantages of constructing program theory models for the purpose of evaluating disaster education programs for children. Following a review of some potential frameworks for program theory development, including the logic model, the program theory matrix, and the stage step model, the article provides working examples of these frameworks. The first example is the development of a program theory matrix used in an evaluation of ShakeOut, an earthquake drill practiced in two Washington State school districts. The model illustrates a theory of action; specifically, the effectiveness of school earthquake drills in preventing injuries and deaths during disasters. The second example is the development of a stage step model used for a process evaluation of What's the Plan Stan?, a voluntary teaching resource distributed to all New Zealand primary schools for curricular integration of disaster education. The model illustrates a theory of use; specifically, expanding the reach of disaster education for children through increased promotion of the resource. The process of developing the program theory models for the purpose of evaluation planning is discussed, as well as the advantages and shortcomings of the theory-based approaches.

15. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

PubMed

Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

2010-01-01

The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site group and a conventional Web sites group. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise, and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group-by-time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. Discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions.

16. An anti-attack model based on complex network theory in P2P networks

Peng, Hao; Lu, Songnian; Zhao, Dandan; Zhang, Aixin; Li, Jianhua

2012-04-01

Complex network theory is a useful way to study many real systems. In this paper, an anti-attack model based on complex network theory is introduced. The mechanism of this model is based on a dynamic compensation process and a reverse percolation process in P2P networks. The main contributions of the paper are: (i) a dynamic compensation process can turn an attacked P2P network into a power-law (PL) network with an exponential cutoff; (ii) a local healing process can restore the maximum degree of peers in an attacked P2P network to a normal level; (iii) a restoring process based on reverse percolation theory connects the fragmentary peers of an attacked P2P network together into a giant connected component. In this way, the model based on complex network theory can be effectively utilized for anti-attack and protection purposes in P2P networks.

17. Determination of the sediment carrying capacity based on perturbed theory.

PubMed

Ni, Zhi-hui; Zeng, Qiang; Li-chun, Wu

2014-01-01

Building on previous studies of sediment carrying capacity, a new method for determining sediment carrying capacity based on perturbed theory is proposed. By taking into account the average water depth, average flow velocity, settling velocity, and other influencing factors, and introducing the median grain size as one main influencing factor, we established a new sediment carrying capacity formula. The coefficients were determined by dimensional analysis, the multiple linear regression method, and the least squares method. The new formula was then verified against measured data from natural rivers and flume tests, and the results were compared with those calculated by the Cao, Zhang, Li, Engelund-Hansen, Ackers-White, and Yang formulas. The comparison shows that the new method is of high accuracy. It could be a useful reference for the determination of sediment carrying capacity.
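
The coefficient-determination step (dimensional analysis followed by multiple linear regression and least squares) typically reduces to ordinary least squares on a log-linearized power-law form. The dimensionless groups and exponents below are synthetic stand-ins, not the paper's actual formula:

```python
import numpy as np

# Assume a dimensionless power-law form  S = k * (U/w)**a * (h/d50)**b,
# log-linearized to  ln S = ln k + a ln(U/w) + b ln(h/d50).
rng = np.random.default_rng(1)
n = 200
x1 = rng.uniform(0.5, 5.0, n)     # velocity / settling-velocity ratio U/w
x2 = rng.uniform(10.0, 500.0, n)  # depth / median grain size h/d50
true_k, true_a, true_b = 0.05, 1.5, -0.3
S = true_k * x1**true_a * x2**true_b * np.exp(rng.normal(0, 0.05, n))

# Multiple linear regression in log space recovers the coefficients.
X = np.column_stack([np.ones(n), np.log(x1), np.log(x2)])
coef, *_ = np.linalg.lstsq(X, np.log(S), rcond=None)
k_hat, a_hat, b_hat = np.exp(coef[0]), coef[1], coef[2]
```

With real river or flume data the same fit is run on the measured dimensionless groups, and the quality of the resulting formula is then judged against independent data sets, as the abstract describes.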

18. Topological analysis of metabolic networks based on petri net theory.

PubMed

Zevedei-Oancea, Ionela; Schuster, Stefan

2011-01-01

Petri net concepts provide additional tools for the modelling of metabolic networks. Here, the similarities between the counterparts in traditional biochemical modelling and Petri net theory are discussed. For example, the stoichiometry matrix of a metabolic network corresponds to the incidence matrix of the Petri net. The flux modes and conservation relations have T-invariants and P-invariants, respectively, as counterparts. We reveal the biological meaning of some notions specific to the Petri net framework (traps, siphons, deadlocks, liveness). We focus on the topological analysis rather than on the analysis of the dynamic behaviour. The treatment of external metabolites is discussed. Some simple theoretical examples are presented for illustration. Petri nets corresponding to some biochemical networks are also built to support our results. For example, the role of triose phosphate isomerase (TPI) in Trypanosoma brucei metabolism is evaluated by detecting siphons and traps. All Petri net properties treated in this contribution are exemplified on a system extracted from nucleotide metabolism.
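
The correspondence between flux modes and T-invariants comes down to the right null space of the incidence (stoichiometry) matrix. A minimal sketch on a toy cyclic network (the network is invented for illustration; true Petri-net T-invariants additionally require nonnegative integer entries):

```python
import numpy as np

def nullspace(N, tol=1e-10):
    """Right null space of the incidence/stoichiometry matrix N.

    Columns of the returned matrix span {v : N v = 0}, the space of
    T-invariant candidates (steady-state flux vectors)."""
    _, s, vt = np.linalg.svd(N)
    rank = int((s > tol).sum())
    return vt[rank:].T

# Toy network: A -r1-> B -r2-> C -r3-> A (a closed cycle).
# Rows: places/metabolites A, B, C; columns: transitions/reactions r1..r3.
N = np.array([[-1.0,  0.0,  1.0],
              [ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0]])
V = nullspace(N)
# The one-dimensional null space is spanned by (1, 1, 1): firing every
# transition once returns the net to its initial marking (a T-invariant).
```

The same computation on the transpose of N yields P-invariant candidates, the Petri-net counterpart of conservation relations.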

19. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

DTIC Science & Technology

2002-08-01

the measurement noise, as well as the physical model of the forward-scattered electric field. The Bayesian algorithms for the Uncertain Permittivity…received at multiple sensors. In this research project a tissue-model-based signal detection theory approach for the detection of mammary tumors in the…oriented information processors.

20. Cyclic Voltammetric Analysis of Ferrocene Alkanethiol Monolayer Electrode Kinetics Based on Marcus Theory

DTIC Science & Technology

1994-07-01

Analysis of Ferrocene Alkanethiol Monolayer Electrode Kinetics Based on Marcus Theory. N00014-90-J-1230. L. Tender, M. T. Carter, and R. W. Murray. …immobilized monolayers in cyclic voltammetry is developed based on the Marcus free-energy-rate relation. Numerical calculations show that when the applied over-potential exceeds ca. 30% of the reorganizational energy of the electrode reaction, voltammetry predicted from Marcus theory differs from…

1. Interlaminar Stresses by Refined Beam Theories and the Sinc Method Based on Interpolation of Highest Derivative

NASA Technical Reports Server (NTRS)

Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander

2010-01-01

Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.

2. Wheeling rates based on marginal-cost theory

SciTech Connect

Merrill, H. M.; Erickson, B. W.

1989-11-01

Knowledge of what rates for wheeling electric power would be, if based on marginal costs, is vital in the debate on how wheeling should be priced. This paper presents the first extensive computations of marginal costs of wheeling, and of rates based on these marginal costs. Sensitivities to losses, constraints, load levels, amount of power wheeled, revenue reconciliation, etc., are examined in the context of two case studies.

3. Treatment motivation in drug users: a theory-based analysis.

PubMed

Longshore, Douglas; Teruya, Cheryl

2006-02-01

Motivation for drug use treatment is widely regarded as crucial to a client's engagement in treatment and success in quitting drug use. Motivation is typically measured with items reflecting high treatment readiness (e.g., perceived need for treatment and commitment to participate) and low treatment resistance (e.g., skepticism regarding benefits of treatment). Building upon reactance theory and the psychotherapeutic construct of resistance, we conceptualized these two aspects of treatment motivation - readiness and resistance - as distinct constructs and examined their predictive power in a sample of 1295 drug-using offenders referred to treatment while on probation. The sample was 60.7% African Americans, 33.5% non-Hispanic Whites, and 21.2% women; their ages ranged from 16 to 63 years old. Interviews occurred at treatment entry and 6 months later. Readiness (but not resistance) predicted treatment retention during the 6-month period. Resistance (but not readiness) predicted drug use, especially among offenders for whom the treatment referral was coercive. These findings suggest that readiness and resistance should both be assessed among clients entering treatment, especially when the referral is coercive. Intake and counseling protocols should address readiness and resistance separately.

4. Resolving defence mechanisms: A perspective based on dissipative structure theory.

PubMed

Zhang, Wei; Guo, Ben-Yu

2017-04-01

Theories and classifications of defence mechanisms are not unified. This study addresses the psychological system as a dissipative structure which exchanges information with the external and internal world. When using defence mechanisms, the cognitive-affective schema of an individual can remain stable and ordered by excluding psychological entropy, obtaining psychological negentropy or dissipating the energy of self-presentation. From this perspective, defences can be classified into three basic types: isolation, compensation and self-dissipation. However, not every kind of defence mechanism actually helps the individual. Non-adaptive defences function as an effective strategy in the short run but can be harmful in the long run, while adaptive defences can help the individual as a long-term mechanism. We therefore suggest that it is more useful for the individual to use adaptive defence mechanisms and to seek out social or interpersonal support when undergoing psychological difficulties. As this model of defences is theoretical at present, we aim to support and enrich this viewpoint with empirical evidence.

5. Improved routing strategy based on gravitational field theory

Song, Hai-Quan; Guo, Jin

2015-10-01

Routing and path selection are crucial for many communication and logistic applications. We study the interaction between nodes and packets and establish a simple model for describing the attraction of the node to the packet in transmission process by using the gravitational field theory, considering the real and potential congestion of the nodes. On the basis of this model, we propose a gravitational field routing strategy that considers the attractions of all of the nodes on the travel path to the packet. In order to illustrate the efficiency of proposed routing algorithm, we introduce the order parameter to measure the throughput of the network by the critical value of phase transition from a free flow phase to a congested phase, and study the distribution of betweenness centrality and traffic jam. Simulations show that, compared with the shortest path routing strategy, the gravitational field routing strategy considerably enhances the throughput of the network and balances the traffic load, and nearly all of the nodes are used efficiently. Project supported by the Technology and Development Research Project of China Railway Corporation (Grant No. 2012X007-D) and the Key Program of Technology and Development Research Foundation of China Railway Corporation (Grant No. 2012X003-A).
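
One plausible reading of the node-packet attraction is a mass-over-distance-squared rule in which congestion reduces a node's effective mass, so congested or distant next hops attract packets weakly. The functional form and names below are illustrative assumptions, not the paper's exact gravitational potential:

```python
def next_hop(neighbors, dist_to_dest, queue_len, capacity):
    """Pick the neighbour exerting the strongest 'gravitational' pull.

    Illustrative form: attraction = free capacity / (distance + 1)**2,
    so nodes with long queues (small free capacity) and nodes far from
    the destination attract packets weakly.
    """
    def attraction(v):
        free = max(capacity[v] - queue_len[v], 0)
        return free / (dist_to_dest[v] + 1) ** 2
    return max(neighbors, key=attraction)

# Two candidate next hops: node 'a' is closer to the destination but
# congested; node 'b' is slightly farther but idle.
dist = {'a': 1, 'b': 2}
queue = {'a': 9, 'b': 0}
cap = {'a': 10, 'b': 10}
choice = next_hop(['a', 'b'], dist, queue, cap)
```

Under this rule the congested shortcut is passed over in favour of the idle detour, which is the load-balancing behaviour the abstract reports.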

6. Cost performance satellite design using queueing theory [via digital simulation]

NASA Technical Reports Server (NTRS)

Hein, G. F.

1975-01-01

A modified Poisson-arrival, infinite-server queuing model is used to determine the effects of limiting the number of broadcast channels (C) of a direct broadcast satellite used for public service purposes (remote health care, education, etc.). The model is based on the reproductive property of the Poisson distribution. A difference equation is developed to describe the change in the Poisson parameter. When all initially delayed arrivals re-enter the system, a polynomial of order (C + 1) must be solved to determine the effective value of the Poisson parameter. When less than 100% of the arrivals re-enter the system, the effective value must be determined by solving a transcendental equation. The model was used to determine the minimum number of channels required for a disaster warning satellite without degradation in performance. Results predicted by the queuing model were compared with the results of digital simulation.
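For intuition: in the steady state of a Poisson-arrival, infinite-server system, the number of busy channels is Poisson-distributed with mean equal to the offered load a (arrival rate times mean holding time). The sketch below is a simplified stand-in for the model described, not the paper's difference-equation formulation: it picks the smallest channel count C whose overflow probability stays below a target.

```python
import math

def poisson_pmf(k, a):
    """P(N = k) for N ~ Poisson(a)."""
    return math.exp(-a) * a ** k / math.factorial(k)

def min_channels(a, eps=0.01):
    """Smallest C with P(N > C) < eps when N ~ Poisson(a).

    a is the offered load; eps is the tolerated probability that
    demand exceeds the C available channels.
    """
    c, tail = 0, 1.0 - poisson_pmf(0, a)
    while tail >= eps:
        c += 1
        tail -= poisson_pmf(c, a)
    return c
```

For example, an offered load of 5 simultaneous requests needs 11 channels to keep the overflow probability under 1%; tightening the target to 0.1% raises that to 13.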

7. Moderators of Theory-Based Interventions to Promote Physical Activity in 77 Randomized Controlled Trials.

PubMed

Bernard, Paquito; Carayol, Marion; Gourlan, Mathieu; Boiché, Julie; Romain, Ahmed Jérôme; Bortolon, Catherine; Lareyre, Olivier; Ninot, Gregory

2017-04-01

A meta-analysis of randomized controlled trials (RCTs) has recently shown that theory-based interventions designed to promote physical activity (PA) significantly increase PA behavior. The objective of the present study was to investigate the moderators of the efficacy of these theory-based interventions. Seventy-seven RCTs evaluating theory-based interventions were systematically identified. Sample, intervention, methodology, and theory implementation characteristics were extracted, coded by three duos of independent investigators, and tested as moderators of the intervention effect in a multiple meta-regression model. Three moderators were negatively associated with the efficacy of theory-based interventions on PA behavior: intervention length (≥14 weeks; β = -.22, p = .004), number of experimental patients (β = -.10, p = .002), and global methodological quality score (β = -.08, p = .04). Our findings suggest that the efficacy of theory-based interventions to promote PA may be overestimated owing to methodological weaknesses of RCTs, and that interventions shorter than 14 weeks could maximize the increase of PA behavior.

8. Boundary based on exchange symmetry theory for multilevel simulations. I. Basic theory.

PubMed

Shiga, Motoyuki; Masia, Marco

2013-07-28

In this paper, we lay the foundations for a new method that allows multilevel simulations of a diffusive system, i.e., a system where a flux of particles through the boundaries might disrupt the primary region. The method is based on the use of flexible restraints that maintain the separation between inner and outer particles. It is shown that, by introducing a bias potential that accounts for the exchange symmetry of the system, the correct statistical distribution is preserved. Using a toy model consisting of non-interacting particles in an asymmetric potential well, we prove that the method is formally exact and that it can be simplified by considering only up to a couple of particle exchanges without loss of accuracy. A real-world test is then made by considering a hybrid MM(∗)/MM calculation of a cesium ion in water. In this case, the single-exchange approximation is sound enough that the results superimpose on the exact solutions. Potential applications of this method to many different hybrid QM/MM systems are discussed, as well as its limitations and strengths in comparison to existing approaches.

9. An open-shell restricted Hartree-Fock perturbation theory based on symmetric spin orbitals

NASA Technical Reports Server (NTRS)

Lee, Timothy J.; Jayatilaka, Dylan

1993-01-01

A new open-shell perturbation theory is formulated in terms of symmetric spin orbitals. Only one set of spatial orbitals is required, thereby reducing the number of independent coefficients in the perturbed wavefunctions. For second order, the computational cost is shown to be similar to that of a closed-shell calculation. This formalism is therefore more efficient than the recently developed RMP, ROMP or RMP-MBPT theories. The perturbation theory described herein was designed to have a close correspondence with our recently proposed coupled-cluster theory based on symmetric spin orbitals. The first-order wavefunction contains contributions from only doubly excited determinants. Equilibrium structures and vibrational frequencies determined from second-order perturbation theory are presented for OH, NH, CH, O2, NH2 and CH2.

10. Competency-based medical education: theory to practice.

PubMed

Frank, Jason R; Snell, Linda S; Cate, Olle Ten; Holmboe, Eric S; Carraccio, Carol; Swing, Susan R; Harris, Peter; Glasgow, Nicholas J; Campbell, Craig; Dath, Deepak; Harden, Ronald M; Iobst, William; Long, Donlin M; Mungroo, Rani; Richardson, Denyse L; Sherbino, Jonathan; Silver, Ivan; Taber, Sarah; Talbot, Martin; Harris, Kenneth A

2010-01-01

Although competency-based medical education (CBME) has attracted renewed interest in recent years among educators and policy-makers in the health care professions, there is little agreement on many aspects of this paradigm. We convened a unique partnership - the International CBME Collaborators - to examine conceptual issues and current debates in CBME. We engaged in a multi-stage group process and held a consensus conference with the aim of reviewing the scholarly literature of competency-based medical education, identifying controversies in need of clarification, proposing definitions and concepts that could be useful to educators across many jurisdictions, and exploring future directions for this approach to preparing health professionals. In this paper, we describe the evolution of CBME from the outcomes movement in the 20th century to a renewed approach that, focused on accountability and curricular outcomes and organized around competencies, promotes greater learner-centredness and de-emphasizes time-based curricular design. In this paradigm, competence and related terms are redefined to emphasize their multi-dimensional, dynamic, developmental, and contextual nature. CBME therefore has significant implications for the planning of medical curricula and will have an important impact in reshaping the enterprise of medical education. We elaborate on this emerging CBME approach and its related concepts, and invite medical educators everywhere to enter into further dialogue about the promise and the potential perils of competency-based medical curricula for the 21st century.

11. A Conceptual Framework Based on Activity Theory for Mobile CSCL

ERIC Educational Resources Information Center

Zurita, Gustavo; Nussbaum, Miguel

2007-01-01

There is a need for collaborative group activities that promote student social interaction in the classroom. Handheld computers interconnected by a wireless network allow people who work on a common task to interact face to face while maintaining the mediation afforded by a technology-based system. Wirelessly interconnected handhelds open up new…

12. Content Based Image Retrieval and Information Theory: A General Approach.

ERIC Educational Resources Information Center

Zachary, John; Iyengar, S. S.; Barhen, Jacob

2001-01-01

Proposes an alternative real-valued representation of color based on the information-theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
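The entropy representation sketched in this record can be illustrated by computing the Shannon entropy of a quantized color distribution. The 8-levels-per-channel quantization below is an arbitrary illustrative choice, not the authors' parameterization.

```python
import math
from collections import Counter

def image_entropy(pixels, bins=8):
    """Shannon entropy (in bits) of a quantized color distribution.

    pixels: iterable of (r, g, b) tuples with 0-255 channel values.
    Each channel is quantized to `bins` levels before counting.
    """
    step = 256 // bins
    hist = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    n = sum(hist.values())
    return -sum((c / n) * math.log2(c / n) for c in hist.values())
```

A uniform-color image has entropy 0; entropy grows with color diversity, up to log2(bins³) bits, giving a single real number that can stand in for a full color histogram in retrieval.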

13. Scale-invariant entropy-based theory for dynamic ordering

SciTech Connect

Mahulikar, Shripad P. E-mail: spm@aero.iitb.ac.in; Kumari, Priti

2014-09-01

Dynamically ordered, self-organized dissipative structures exist in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., a dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, a scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence, based on its structured mass and/or energy interactions with the surroundings, is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during the sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations.

14. Modeling powder encapsulation in dosator-based machines: I. Theory.

PubMed

Khawam, Ammar

2011-12-15

Automatic encapsulation machines have two dosing principles: dosing disc and dosator. Dosator-based machines compress the powder into plugs that are transferred into capsules. The encapsulation process in dosator-based capsule machines was modeled in this work. A model was proposed to predict the weight and length of the produced plugs. According to the model, plug weight is a function of piston dimensions, powder-bed height, bulk powder density and precompression densification inside the dosator, while plug length is a function of piston height, set piston displacement, spring stiffness and powder compressibility. Powder densification within the dosator can be achieved by precompression, compression or both. Precompression densification depends on the powder-to-piston height ratio, while compression densification depends on piston displacement against the powder. This article provides the theoretical basis of the encapsulation model, including applications and limitations. The model will be applied to experimental data separately.
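A toy version of the stated dependencies might look as follows. The formula, names, and unit choices are hypothetical, intended only to show plug weight scaling with piston cross-section, piston height, bulk density, and a densification factor; they are not the paper's actual model.

```python
import math

def plug_weight(piston_diameter_mm, piston_height_mm,
                bulk_density_g_ml, densification=1.0):
    """Illustrative plug-weight estimate (hypothetical, not the paper's model).

    Assumes the dosator captures a powder column with the piston's
    cross-section and height, densified by a precompression factor
    (which the paper relates to the powder-to-piston height ratio).
    """
    area_mm2 = math.pi * (piston_diameter_mm / 2) ** 2
    volume_ml = area_mm2 * piston_height_mm / 1000.0  # mm^3 -> mL
    return volume_ml * bulk_density_g_ml * densification
```

The point of the sketch is the proportionality structure: doubling the bulk density or the densification factor doubles the predicted plug weight, while the piston diameter enters quadratically through the cross-sectional area.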

15. A Proxy Signature Scheme Based on Coding Theory

Jannati, Hoda; Falahati, Abolfazl

A proxy signature allows a proxy signer to sign messages on behalf of the original signer. This signature is used when the original signer is not available to sign a specific document. In this paper, we introduce a new proxy signature scheme based on Stern's identification scheme, whose security depends on the syndrome decoding problem. The proposed scheme is the first code-based proxy signature and can be used in the quantum computing era. In this scheme, the operations to perform are linear and very simple, so the signature is computed quickly and can be implemented using a smart card in a quite efficient way. The proposed scheme also satisfies the unforgeability, undeniability, non-transferability and distinguishability properties, which are the security requirements for a proxy signature.

16. Specification-based Error Recovery: Theory, Algorithms, and Usability

DTIC Science & Technology

2013-02-01

The basis of the methodology is a view of the specification as a non-deterministic implementation, which may permit a high degree of non-determinism. The methodology was developed, optimized and rigorously evaluated in this project, and it leveraged the Alloy specification language and its SAT-based tool-set as an enabling technology. The key insight is to use likely correct actions by an otherwise erroneous execution to prune the non-determinism.

17. Research on e-learning services based on ontology theory

Liu, Rui

2013-07-01

E-learning services can realize network learning resource sharing and interoperability, but they cannot realize the automatic discovery, implementation and integration of services. This paper proposes a framework for e-learning services based on ontology; ontology technology is applied to the publication and discovery processes of e-learning services in order to realize accurate and efficient retrieval and utilization of e-learning services.

18. Sensor-Based Collision Avoidance: Theory and Experiments

NASA Technical Reports Server (NTRS)

Seraji, Homayoun; Steele, Robert; Ivlev, Robert

1996-01-01

A new on-line control strategy for sensor-based collision avoidance of manipulators and supporting experimental results are presented in this article. This control strategy is based on nullification of virtual forces applied to the end-effector by a hypothetical spring-plus-damper attached to the object's surface. In the proposed approach, the real-time arm control software continuously monitors the object distance measured by the arm-mounted proximity sensors. When this distance is less than a preset threshold, the collision avoidance control action is initiated to inhibit motion toward the object and thus prevent collision. This is accomplished by employing an outer feedback loop to perturb the end-effector nominal motion trajectory in real-time based on the sensory data. The perturbation is generated by a proportional-plus-integral (PI) collision avoidance controller acting on the difference between the sensed distance and the preset threshold. This approach is computationally very fast, requires minimal modification to the existing manipulator positioning system, and provides the manipulator with an on-line collision avoidance capability to react autonomously and intelligently. A dexterous RRC robotic arm is instrumented with infrared proximity sensors and is operated under the proposed collision avoidance strategy. Experimental results are presented to demonstrate end-effector collision avoidance both with an approaching object and while reaching inside a constricted opening.
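The outer-loop perturbation described above can be caricatured as a discrete-time PI law acting on the difference between the sensed distance and the preset threshold. The gains and time step below are hypothetical, not the values used on the RRC arm; the sketch only shows the structure of the controller.

```python
def pi_avoidance(distances, threshold, kp=0.5, ki=0.1, dt=0.1):
    """Illustrative PI collision-avoidance perturbation (gains hypothetical).

    For each sensed distance, the error is (threshold - distance) when
    the object is inside the threshold, else 0. The returned values are
    perturbation magnitudes that push the end-effector away from the
    object, as in the outer feedback loop described in the abstract.
    """
    integral, out = 0.0, []
    for d in distances:
        error = max(0.0, threshold - d)   # active only inside the threshold
        integral += error * dt            # accumulated (integral) term
        out.append(kp * error + ki * integral)
    return out
```

While the object stays farther away than the threshold the perturbation is identically zero, so the nominal trajectory is untouched; once the threshold is crossed, both the proportional and the accumulating integral term drive the avoidance action.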

19. Microstructure-dependent piezoelectric beam based on modified strain gradient theory

Li, Y. S.; Feng, W. J.

2014-09-01

A microstructure-dependent piezoelectric beam model was developed using a variational formulation, which is based on the modified strain gradient theory and the Timoshenko beam theory. The new model contains three material length scale parameters and can capture the size effect, unlike the classical beam theory. To illustrate the new piezoelectric beam model, the static bending and the free vibration problems of a simply supported beam are numerically solved. These results may be useful in the analysis and design of smart structures that are constructed from piezoelectric materials.

20. Graph Theory-Based Pinning Synchronization of Stochastic Complex Dynamical Networks.

PubMed

Li, Xiao-Jian; Yang, Guang-Hong

2017-02-01

This paper is concerned with the adaptive pinning synchronization problem of stochastic complex dynamical networks (CDNs). Based on algebraic graph theory and Lyapunov theory, pinning controller design conditions are derived, and the rigorous convergence analysis of synchronization errors in the probability sense is also conducted. Compared with the existing results, the topology structures of stochastic CDN are allowed to be unknown due to the use of graph theory. In particular, it is shown that the selection of nodes for pinning depends on the unknown lower bounds of coupling strengths. Finally, an example on a Chua's circuit network is given to validate the effectiveness of the theoretical results.

1. Following Human Footsteps: Proposal of a Decision Theory Based on Human Behavior

NASA Technical Reports Server (NTRS)

Mahmud, Faisal

2011-01-01

Human behavior is complex in nature, depending on circumstances, with decisions varying from time to time as well as from place to place. The way a decision is made is either directly or indirectly related to the availability of options. These options, though they appear random in nature, have a solid directional role in decision making. In this paper, a decision theory is proposed which is based on human behavior. The theory is structured with model sets that show all possible combinations for making a decision. A virtual, simulated environment is considered to show the results of the proposed decision theory.

2. Energy-efficiency analysis of a distributed queuing medium access control protocol for biomedical wireless sensor networks in saturation conditions.

PubMed

Otal, Begonya; Alonso, Luis; Verikoukis, Christos

2011-01-01

The aging population and the high quality-of-life expectations in our society lead to the need for more efficient and affordable healthcare solutions. For this reason, this paper aims at the optimization of Medium Access Control (MAC) protocols for biomedical wireless sensor networks, or wireless Body Sensor Networks (BSNs). The schemes presented here always keep in mind the efficient management of channel resources and the overall minimization of sensors' energy consumption in order to prolong sensors' battery life. The fact that the IEEE 802.15.4 MAC does not fully satisfy BSN requirements highlights the need for the design of new scalable MAC solutions, which guarantee low-power consumption for the maximum number of body sensors in high-density areas (i.e., in saturation conditions). In order to emphasize IEEE 802.15.4 MAC limitations, this article presents a detailed overview of this de facto standard for Wireless Sensor Networks (WSNs), which serves as a link to the introduction and initial description of our proposed Distributed Queuing (DQ) MAC protocol for BSN scenarios. Within this framework, an extensive DQ MAC energy-consumption analysis in saturation conditions is presented in order to evaluate its performance relative to IEEE 802.15.4 MAC in highly dense BSNs. The obtained results show that the proposed scheme outperforms IEEE 802.15.4 MAC in average energy consumption per information bit, thus providing a better overall performance that scales appropriately to BSNs under high-traffic conditions. These benefits are obtained by eliminating back-off periods and collisions in data packet transmissions, while minimizing the control overhead.

3. Energy-Efficiency Analysis of a Distributed Queuing Medium Access Control Protocol for Biomedical Wireless Sensor Networks in Saturation Conditions

PubMed Central

Otal, Begonya; Alonso, Luis; Verikoukis, Christos

2011-01-01

The aging population and the high quality-of-life expectations in our society lead to the need for more efficient and affordable healthcare solutions. For this reason, this paper aims at the optimization of Medium Access Control (MAC) protocols for biomedical wireless sensor networks, or wireless Body Sensor Networks (BSNs). The schemes presented here always keep in mind the efficient management of channel resources and the overall minimization of sensors' energy consumption in order to prolong sensors' battery life. The fact that the IEEE 802.15.4 MAC does not fully satisfy BSN requirements highlights the need for the design of new scalable MAC solutions, which guarantee low-power consumption for the maximum number of body sensors in high-density areas (i.e., in saturation conditions). In order to emphasize IEEE 802.15.4 MAC limitations, this article presents a detailed overview of this de facto standard for Wireless Sensor Networks (WSNs), which serves as a link to the introduction and initial description of our proposed Distributed Queuing (DQ) MAC protocol for BSN scenarios. Within this framework, an extensive DQ MAC energy-consumption analysis in saturation conditions is presented in order to evaluate its performance relative to IEEE 802.15.4 MAC in highly dense BSNs. The obtained results show that the proposed scheme outperforms IEEE 802.15.4 MAC in average energy consumption per information bit, thus providing a better overall performance that scales appropriately to BSNs under high-traffic conditions. These benefits are obtained by eliminating back-off periods and collisions in data packet transmissions, while minimizing the control overhead. PMID:22319351

4. Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory

Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui

The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for contradiction solution, but these are not systematized. Combined with the technique-system conception, this paper summarizes an integrated solution method for contradictions based on the TRIZ contradiction theory. According to the method, a flowchart of the integrated solution method for contradictions is given. As a case study, the method of fusion jointing PE pipes is analysed.

5. The Effect Of The Materials Based On Multiple Intelligence Theory Upon The Intelligence Groups' Learning Process

Oral, I.; Dogan, O.

2007-04-01

The aim of this study is to find out the effect of course materials based on Multiple Intelligence Theory upon the intelligence groups' learning process. In conclusion, the results proved that the materials prepared according to Multiple Intelligence Theory have a considerable effect on the students' learning process. This effect was particularly seen in the student groups of the musical-rhythmic, verbal-linguistic, interpersonal-social and naturalist intelligences.

6. The Cultures of Contemporary Instructional Design Scholarship, Part Two: Developments Based on Constructivist and Critical Theory Foundations

ERIC Educational Resources Information Center

Willis, Jerry

2011-01-01

This article is the second in a series (see Willis, 2011) that looks at the current status of instructional design scholarship and theory. In this concluding article, the focus is on two cultures of ID work, one based on constructivist and interpretivist theory and the other based on critical theory and critical pedagogy. There are distinct…

7. Theory of zwitterionic molecular-based organic magnets

Shelton, William A.; Aprà, Edoardo; Sumpter, Bobby G.; Saraiva-Souza, Aldilene; Souza Filho, Antonio G.; Nero, Jordan Del; Meunier, Vincent

2011-08-01

We describe a class of organic molecular magnets based on zwitterionic molecules (betaine derivatives) possessing donor, π bridge, and acceptor groups. Using extensive electronic structure calculations, we show that the electronic ground state in these systems is magnetic. In addition, we show that the large energy differences computed for the various magnetic states indicate a high Néel temperature. The quantum mechanical nature of the magnetic properties originates from the conjugated π bridge (only p electrons) in cooperation with the molecular donor-acceptor character. The exchange interactions between electron spins are strong, local, and independent of the length of the π bridge.

8. Ductile damage modeling based on void coalescence and percolation theories

SciTech Connect

Tonks, D.L.; Zurek, A.K.; Thissell, W.R.

1995-09-01

A general model for ductile damage in metals is presented. It includes damage induced by shear stress as well as damage caused by volumetric tension. Spallation is included as a special case. Strain induced damage is also treated. Void nucleation and growth are included, and give rise to strain rate effects. Strain rate effects also arise in the model through elastic release wave propagation between damage centers. The underlying physics of the model is the nucleation, growth, and coalescence of voids in a plastically flowing solid. The model is intended for hydrocode based computer simulation. An experimental program is underway to validate the model.

9. Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment

ERIC Educational Resources Information Center

Chen, Jing

2012-01-01

Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…

10. Using Game Theory and Competition-Based Learning to Stimulate Student Motivation and Performance

ERIC Educational Resources Information Center

Burguillo, Juan C.

2010-01-01

This paper introduces a framework for using Game Theory tournaments as a base to implement Competition-based Learning (CnBL), together with other classical learning techniques, to motivate the students and increase their learning performance. The paper also presents a description of the learning activities performed along the past ten years of a…

11. Controlling Retrieval during Practice: Implications for Memory-Based Theories of Automaticity

ERIC Educational Resources Information Center

Wilkins, Nicolas J.; Rawson, Katherine A.

2011-01-01

Memory-based processing theories of automaticity assume that shifts from algorithmic to retrieval-based processing underlie practice effects on response times. The current work examined the extent to which individuals can exert control over the involvement of retrieval during skill acquisition and the factors that may influence control. In two…

12. Constraint-Based Modeling: From Cognitive Theory to Computer Tutoring--and Back Again

ERIC Educational Resources Information Center

Ohlsson, Stellan

2016-01-01

The ideas behind the constraint-based modeling (CBM) approach to the design of intelligent tutoring systems (ITSs) grew out of attempts in the 1980's to clarify how declarative and procedural knowledge interact during skill acquisition. The learning theory that underpins CBM was based on two conceptual innovations. The first innovation was to…

13. Evaluating the Evidence Base for Relational Frame Theory: A Citation Analysis

ERIC Educational Resources Information Center

Dymond, Simon; May, Richard J.; Munnelly, Anita; Hoon, Alice E.

2010-01-01

Relational frame theory (RFT) is a contemporary behavior-analytic account of language and cognition. Since it was first outlined in 1985, RFT has generated considerable controversy and debate, and several claims have been made concerning its evidence base. The present study sought to evaluate the evidence base for RFT by undertaking a citation…

14. Score Reliability of a Test Composed of Passage-Based Testlets: A Generalizability Theory Perspective.

ERIC Educational Resources Information Center

Lee, Yong-Won

The purpose of this study was to investigate the impact of local item dependence (LID) in passage-based testlets on the test score reliability of an English as a Foreign Language (EFL) reading comprehension test from the perspective of generalizability (G) theory. Definitions and causes of LID in passage-based testlets are reviewed within the…

15. Game Theory Based Trust Model for Cloud Environment

PubMed Central

Gokulnath, K.; Uthariaraj, Rhymend

2015-01-01

The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts the service providers and the users from violating the service level agreement (SLA). Significantly, the cold start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics like execution time, accuracy, error identification, and undecidability of the resources were considered. PMID:26380365

16. [The Chinese urban metabolisms based on the emergy theory].

PubMed

Song, Tao; Cai, Jian-Ming; Ni, Pan; Yang, Zhen-Shan

2014-04-01

By using emergy indices of urban metabolisms, this paper analyzed the systematic structures and characteristics of 31 Chinese cities' urban metabolisms in 2000 and 2010. The results showed that Chinese urban metabolisms were characterized by resource consumption and coastal external dependency. Non-renewable resource emergy accounted for a higher proportion of the total emergy in the inland cities' urban metabolisms. The emergy of imports and exports accounted for the vast majority of urban metabolic systems in metropolises and coastal cities such as Beijing and Shanghai, showing a significantly externally-oriented metabolic characteristic. Based on that, the related policies were put forward: to develop the renewable resource and energy industry; to improve non-renewable resource and energy utilization efficiencies; to optimize the import and export structure of services, cargo and fuel; and to establish a flexible management mechanism for urban metabolisms.

17. Two-cell theory to measure membrane resistance based on proton flow: Theory development and experimental validation

Das, Susanta K.; Berry, K. J.

A two-cell theory is developed to measure proton exchange membrane (PEM) resistance to proton flow during conduction through a PEM fuel cell. The theoretical framework developed herein is based upon fundamental thermodynamic principles and engineering laws. We made appropriate corrections to develop the theoretical model previously proposed by Babu and Nair (B.V. Babu, N. Nair, J. Energy Edu. Sci. Technol. 13 (2004) 13-20) for measuring membrane resistance to the flow of protons, which is the only ion that travels from one electrode to the other through the membrane. A simple experimental set-up and procedure are also developed to validate the theoretical model predictions. A widely used commercial membrane (Nafion ®) and several in-house membranes are examined to compare relative resistance among membranes. According to the theory, resistance of the proton exchange membrane is directly proportional to the time taken for a specific amount of protons to pass through the membrane. A second order differential equation describes the entire process. The results show that theoretical predictions are in excellent agreement with experimental observations. It is our speculation that the investigation results will open up a route to develop a simple device to measure resistance during membrane manufacturing since electrolyte resistance is one of the key performance drivers for the advancement of fuel cell technology.

18. THEORY OF REGENERATION BASED ON MASS ACTION. II

PubMed Central

Loeb, Jacques

1923-01-01

1. Quantitative proof is furnished that all the material available for shoot and root formation in an isolated leaf of Bryophyllum calycinum flows to those notches where through the influence of gravity or by a more abundant supply of water growth is accelerated. As soon as the acceleration of growth in these notches commences, the growth of shoots and roots in the other notches which may already have started ceases. 2. It had been shown in a preceding paper that the regeneration of an isolated piece of stem may be and frequently is in the beginning not markedly polar, but that after some time the growth of all the roots except those at the base and of all the shoots except those at the apex is suppressed. This analogy with the behavior of regeneration in a leaf in which the growth in one set of notches is accelerated, suggests that in an isolated stem a more rapid growth is favored at the extreme ends (probably by a block of the sap flow at the extreme ends) and that when this happens the total flow of ascending sap goes to the most apical buds and the total flow of the descending sap goes to the most basal roots. As soon as this occurs, the growth of the other roots and shoots is suppressed. PMID:19872063

19. THEORY OF REGENERATION BASED ON MASS ACTION. II.

PubMed

Loeb, J

1923-11-20

1. Quantitative proof is furnished that all the material available for shoot and root formation in an isolated leaf of Bryophyllum calycinum flows to those notches where through the influence of gravity or by a more abundant supply of water growth is accelerated. As soon as the acceleration of growth in these notches commences, the growth of shoots and roots in the other notches which may already have started ceases. 2. It had been shown in a preceding paper that the regeneration of an isolated piece of stem may be and frequently is in the beginning not markedly polar, but that after some time the growth of all the roots except those at the base and of all the shoots except those at the apex is suppressed. This analogy with the behavior of regeneration in a leaf in which the growth in one set of notches is accelerated, suggests that in an isolated stem a more rapid growth is favored at the extreme ends (probably by a block of the sap flow at the extreme ends) and that when this happens the total flow of ascending sap goes to the most apical buds and the total flow of the descending sap goes to the most basal roots. As soon as this occurs, the growth of the other roots and shoots is suppressed.

20. Theory of Carbon Nanotube (CNT)-Based Electron Field Emitters

PubMed Central

Bocharov, Grigory S.; Eletskii, Alexander V.

2013-01-01

Theoretical problems arising in connection with the development and operation of electron field emitters based on carbon nanotubes are reviewed. The physical aspects of electron field emission that underlie the unique emission properties of carbon nanotubes (CNTs) are considered. Physical effects and phenomena affecting the emission characteristics of CNT cathodes are analyzed. Effects given particular attention include: electric field amplification near a CNT tip, taking into account the shape of the tip; the deviation of nanotubes from vertical orientation and their electric-field-induced alignment; electric field screening by neighboring nanotubes; the statistical spread of the parameters of the individual CNTs comprising the cathode; and the thermal effects resulting in degradation of nanotubes during emission. Simultaneous consideration of these effects permitted the development of an optimization procedure for a CNT array in terms of the maximum reachable emission current density. In accordance with this procedure, the optimum inter-tube distance in the array depends on the range of the external voltage applied. The phenomenon of self-misalignment of nanotubes in an array has been predicted and analyzed in terms of recent experiments. A degradation mechanism of CNT-based electron field emitters has been analyzed, consisting of bombardment of the emitters by ions formed through electron-impact ionization of residual gas molecules.

1. [Modeling continuous scaling of NDVI based on fractal theory].

PubMed

Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng

2013-07-01

Scale effect is one of the most important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe how retrievals change across an entire series of scales; meanwhile, they face serious parameter-correction issues (e.g., geometrical correction and spectral correction) because imaging parameters vary between sensors. Using a single-sensor image, a fractal methodology was applied to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (a) for NDVI, a scale effect exists and can be described by a fractal model of continuous scaling; and (b) the fractal method is suitable for the validation of NDVI. These results show that fractal analysis is an effective methodology for studying scaling in quantitative remote sensing.
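A minimal illustration of this scale effect, using synthetic bands rather than the record's ETM+ data (band value ranges, image size, and aggregation factors are all assumptions): NDVI is a nonlinear band ratio, so computing it at fine resolution and then aggregating does not equal computing it from aggregated bands, and that gap is the scale effect a continuous-scaling model would describe.

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index from two reflectance bands
    return (nir - red) / (nir + red)

def block_mean(arr, f):
    # Aggregate a 2-D field to a coarser scale by f-by-f block averaging
    h, w = arr.shape
    return arr[:h - h % f, :w - w % f].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

# Synthetic fine-resolution bands (stand-ins for real imagery)
rng = np.random.default_rng(42)
nir = rng.uniform(0.3, 0.6, size=(256, 256))
red = rng.uniform(0.05, 0.25, size=(256, 256))

for f in (2, 4, 8, 16):
    fine_then_aggregate = block_mean(ndvi(nir, red), f).mean()
    aggregate_then_ndvi = ndvi(block_mean(nir, f), block_mean(red, f)).mean()
    # Nonzero because NDVI is nonlinear in the bands: the scale effect
    scale_effect = aggregate_then_ndvi - fine_then_aggregate
```

Plotting the discrepancy (or another retrieval statistic) against scale on log-log axes is the kind of diagnostic a fractal model of continuous scaling would be fitted to.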

2. A Christian faith-based recovery theory: understanding God as sponsor.

PubMed

Timmons, Shirley M

2012-12-01

This article reports the development of a substantive theory to explain an evangelical Christian-based process of recovery from addiction. Faith-based, 12-step, mutual aid programs can improve drug abstinence by offering: (a) an intervention option alone and/or in conjunction with secular programs and (b) an opportunity for religious involvement. Although literature on religion, spirituality, and addiction is voluminous, traditional 12-step programs fail to explain the mechanism that underpins the process of Christian-based recovery (CR). This pilot study used grounded theory to explore and describe the essence of recovery of 10 former crack cocaine-addicted persons voluntarily enrolled in a CR program. Data were collected from in-depth interviews during 4 months of 2008. Audiotapes were transcribed verbatim, and the constant comparative method was used to analyze data resulting in the basic social process theory, understanding God as sponsor. The theory was determined through writing theoretical memos that generated key elements that allow persons to recover: acknowledging God-centered crises, communicating with God, and planning for the future. Findings from this preliminary study identify important factors that can help persons in recovery to sustain sobriety and help program administrators benefit from theory that guides the development of evidence-based addiction interventions.

3. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

Matsumura, Koki; Kawamoto, Masaru

This paper proposed a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computations. Moreover, this paper proposed a method using prospect theory, from behavioral finance, to set a psychological bias for profit and loss, and attempted to select the appropriate strike price of the option for higher investment efficiency. As a result, the technique produced good results, and the optimized dealing strategy confirmed the effectiveness of this trading model.

4. Phenomenological theory of a renormalized simplified model based on time-convolutionless mode-coupling theory near the glass transition

Tokuyama, Michio

2017-01-01

The renormalized simplified model is proposed to investigate indirectly how the static structure factor plays an important role in renormalizing a quadratic nonlinear term in the ideal mode-coupling memory function near the glass transition. The renormalized simplified recursion equation is then derived based on the time-convolutionless mode-coupling theory (TMCT) proposed recently by the present author. This phenomenological approach is successfully applied to check from a unified point of view how strong liquids are different from fragile liquids. The simulation results for those two types of liquids are analyzed consistently by the numerical solutions of the recursion equation. Then, the control parameter dependence of the renormalized nonlinear exponent in both types of liquids is fully investigated. Thus, it is shown that there exists a novel difference between the universal behavior in strong liquids and that in fragile liquids not only for their transport coefficients but also for their dynamics.

5. The Scientific Value of Cognitive Load Theory: A Research Agenda Based on the Structuralist View of Theories

ERIC Educational Resources Information Center

Gerjets, Peter; Scheiter, Katharina; Cierniak, Gabriele

2009-01-01

In this paper, two methodological perspectives are used to elaborate on the value of cognitive load theory (CLT) as a scientific theory. According to the more traditional critical rationalism of Karl Popper, CLT cannot be considered a scientific theory because some of its fundamental assumptions cannot be tested empirically and are thus not…

6. A design method based on photonic crystal theory for Bragg concave diffraction grating

Du, Bingzheng; Zhu, Jingping; Mao, Yuzheng; Li, Bao; Zhang, Yunyao; Hou, Xun

2017-02-01

A design method based on one-dimensional photonic crystal theory (1-D PC theory) is presented for designing a Bragg concave diffraction grating (Bragg-CDG) for demultiplexers. With this design method, the reflection condition calculated by the 1-D PC theory can be matched perfectly with the diffraction condition. As a result, the shift of the central wavelength of the diffraction spectra can be reduced while keeping high diffraction efficiency. The performance of the Bragg-CDG for TE and TM modes is investigated, and the simulation results are consistent with the 1-D PC theory. This design method is expected to improve the accuracy and efficiency of Bragg-CDGs after further research.

7. Perturbation theory for multicomponent fluids based on structural properties of hard-sphere chain mixtures

SciTech Connect

Hlushak, Stepan

2015-09-28

An analytical expression for the Laplace transform of the radial distribution function of a mixture of hard-sphere chains of arbitrary segment size and chain length is used to rigorously formulate the first-order Barker-Henderson perturbation theory for the contribution of the segment-segment dispersive interactions into thermodynamics of the Lennard-Jones chain mixtures. Based on this approximation, a simple variant of the statistical associating fluid theory is proposed and used to predict properties of several mixtures of chains of different lengths and segment sizes. The theory treats the dispersive interactions more rigorously than the conventional theories and provides means for more accurate description of dispersive interactions in the mixtures of highly asymmetric components.

8. Research on Prediction Model of Time Series Based on Fuzzy Theory and Genetic Algorithm

Xiao-qin, Wu

Fuzzy theory is one of the newly adopted self-adaptive strategies applied to dynamically adjust the parameters of genetic algorithms in order to enhance their performance. In this paper, financial time series analysis and forecasting is taken as the main case study within a soft computing framework that focuses on the integration of fuzzy theory and genetic algorithms (FGA). A financial time series forecasting model based on fuzzy theory and genetic algorithms was built, with the ShangZheng index as an example. The experimental results show that FGA performs much better than a BP neural network, not only in precision but also in search speed. The hybrid algorithm has strong feasibility and superiority.

9. Theory of normal and superconducting properties of fullerene-based solids

SciTech Connect

Cohen, M.L.

1992-10-01

Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The latter experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a "standard model" of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes.

10. Theory of normal and superconducting properties of fullerene-based solids

SciTech Connect

Cohen, M.L.

1992-10-01

Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The latter experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a "standard model" of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes.

11. Paying for express checkout: competition and price discrimination in multi-server queuing systems.

PubMed

Deck, Cary; Kimbrough, Erik O; Mongrain, Steeve

2014-01-01

We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting-cost-based price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single-queue seller focuses on the patient shoppers, thereby driving down prices and profits while increasing consumer surplus.
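The waiting-time side of this trade-off can be sketched with textbook M/M/1 formulas (a hedged illustration, not the authors' game-theoretic model; the arrival and service rates are arbitrary assumptions, and the two-queue seller is idealized as an even split of arrivals across two independent servers):

```python
def mm1_mean_wait(lam, mu):
    # Mean time spent waiting in queue (excluding service) for an M/M/1
    # queue with Poisson arrivals at rate lam and exponential service at
    # rate mu; the standard formula W_q = rho / (mu - lam) requires lam < mu.
    if lam >= mu:
        raise ValueError("queue is unstable: need lam < mu")
    rho = lam / mu
    return rho / (mu - lam)

lam, mu = 1.0, 2.0                      # assumed arrival and service rates
single = mm1_mean_wait(lam, mu)         # single-queue seller: one server
split = mm1_mean_wait(lam / 2.0, mu)    # two-queue seller: arrivals split evenly
# Splitting arrivals across two servers shortens the expected wait; that
# service-quality margin is what an express-checkout premium could price.
```

With these assumed rates, the split queues cut the mean wait by more than half, which is the kind of waiting-cost asymmetry the competing sellers trade off against price.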

12. Paying for Express Checkout: Competition and Price Discrimination in Multi-Server Queuing Systems

PubMed Central

Deck, Cary; Kimbrough, Erik O.; Mongrain, Steeve

2014-01-01

We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting-cost-based price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single-queue seller focuses on the patient shoppers, thereby driving down prices and profits while increasing consumer surplus. PMID:24667809

13. An anisotropic constitutive equation for the stress tensor of blood based on mixture theory

SciTech Connect

2008-09-12

Based on ideas proposed by Massoudi and Rajagopal (M-R), we develop a model for blood using the theory of interacting continua, that is, the mixture theory. We first provide a brief review of mixture theory, and then discuss certain issues in constitutive modeling of a two-component mixture. In the present formulation, we ignore the biochemistry of blood and assume that blood is composed of red blood cells (RBCs) suspended in plasma, where the plasma behaves as a linearly viscous fluid and the RBCs are modeled as an anisotropic nonlinear density-gradient-type fluid. We obtain a constitutive relation for blood, based on the simplified constitutive relations derived for plasma and RBCs. A simple shear flow is discussed, and an exact solution is obtained for a very special case; for more general cases, it is necessary to solve the nonlinear coupled equations numerically.

14. An anisotropic constitutive equation for the stress tensor of blood based on mixture theory

SciTech Connect

Massoudi, M.; Antaki, J.

2008-01-01

Based on ideas proposed by Massoudi and Rajagopal (M-R), we develop a model for blood using the theory of interacting continua, that is, the mixture theory. We first provide a brief review of mixture theory, and then discuss certain issues in constitutive modeling of a two-component mixture. In the present formulation, we ignore the biochemistry of blood and assume that blood is composed of red blood cells (RBCs) suspended in plasma, where the plasma behaves as a linearly viscous fluid and the RBCs are modeled as an anisotropic nonlinear density-gradient-type fluid. We obtain a constitutive relation for blood, based on the simplified constitutive relations derived for plasma and RBCs. A simple shear flow is discussed, and an exact solution is obtained for a very special case; for more general cases, it is necessary to solve the nonlinear coupled equations numerically.

15. Mixture theory-based poroelasticity as a model of interstitial tissue growth.

PubMed

Cowin, Stephen C; Cardoso, Luis

2012-01-01

This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues.

16. Performance evaluation of an optical hybrid switch with circuit queued reservations and circuit priority preemption.

PubMed

Wong, Eric W M; Zukerman, Moshe

2006-11-13

We provide here a new loss model for an optical hybrid switch that can function as an optical burst switch and/or an optical circuit switch. Our model is general, as it considers an implementation whereby some of the circuits have preemptive priority over bursts and others are allowed to queue their reservations. We first present an analysis based on a three-dimensional state-space Markov chain that provides exact results for the blocking probabilities of bursts and circuits, the proportion of circuits that are delayed, and the mean delay of the circuits that are delayed. Because it is difficult to compute the blocking probability exactly in realistic scenarios with a large number of wavelengths, we derive computationally scalable and accurate approximations based on reducing the three-dimensional state space to a single dimension. These scalable approximations, which can produce performance results in a fraction of a second, readily enable switch dimensioning. Extensive numerical results are presented to demonstrate the accuracy and the use of the new approximations.
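The paper's exact three-dimensional Markov chain is beyond a short sketch, but the classical Erlang B recursion (a standard circuit-blocking building block, not the authors' reduced-dimension approximation) shows how such blocking probabilities can be computed stably even for a large number of wavelengths:

```python
def erlang_b(offered_load, servers):
    # Blocking probability of an M/M/c/c loss system offered
    # `offered_load` Erlangs on `servers` circuits (e.g., wavelengths),
    # via the numerically stable recursion:
    #   B(0) = 1,  B(n) = a*B(n-1) / (n + a*B(n-1))
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# At a fixed offered load, blocking falls steeply as wavelengths are added
probs = [erlang_b(8.0, c) for c in (8, 12, 16, 32)]
```

The recursion avoids the factorials and large powers of the closed-form Erlang B expression, which is why it scales to the wavelength counts where direct state-space enumeration becomes impractical.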

17. Development of a Test Battery to Assess Mental Flexibility Based on Sternberg’s Theory of Successful Intelligence

DTIC Science & Technology

2008-01-01

Technical Report 1222 (Cynthia T…; Grant DASW01-03-K-0001). A test battery to assess mental flexibility was developed based on Sternberg's theory of successful intelligence (1985). New mental flexibility assessment instruments were developed and…

18. Preparing Students for Education, Work, and Community: Activity Theory in Task-Based Curriculum Design

ERIC Educational Resources Information Center

Campbell, Chris; MacPherson, Seonaigh; Sawkins, Tanis

2014-01-01

This case study describes how sociocultural and activity theory were applied in the design of a publicly funded, Canadian Language Benchmark (CLB)-based English as a Second Language (ESL) credential program and curriculum for immigrant and international students in postsecondary institutions in British Columbia, Canada. The ESL Pathways Project…

19. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

ERIC Educational Resources Information Center

Fein, Lance; Jones, Don

2015-01-01

This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

20. A Theory-Based Approach to Reading Assessment in the Army. Technical Report 625.

ERIC Educational Resources Information Center

Oxford-Carpenter, Rebecca L.; Schultz-Shiner, Linda J.

Noting that the United States Army Research Institute for the Behavioral and Social Sciences (ARI) has been involved in research on reading assessment in the Army from both practical and theoretical perspectives, this paper addresses practical Army problems in reading assessment from a theory base that reflects the most recent and most sound…

1. Application of Online Multimedia Courseware in College English Teaching Based on Constructivism Theory

ERIC Educational Resources Information Center

Li, Zhenying

2012-01-01

Based on Constructivism Theory, this paper aims to investigate the application of online multimedia courseware to college English teaching. By making experiments and students' feedback, some experience has been accumulated, and some problems are discovered and certain revelations are acquired as well in English teaching practice, which pave the…

2. Imitation dynamics of vaccine decision-making behaviours based on the game theory.

PubMed

Yang, Junyuan; Martcheva, Maia; Chen, Yuming

2016-01-01

Based on game theory, we propose an age-structured model to investigate the imitation dynamics of vaccine uptake. We first obtain the existence and local stability of equilibria. We show that Hopf bifurcation can occur. We also establish the global stability of the boundary equilibria and persistence of the disease. The theoretical results are supported by numerical simulations.
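A minimal sketch of imitation dynamics in this spirit, assuming a replicator-style switching rule without age structure; the payoff parameters (`infection_risk`, `vaccine_risk`) are hypothetical illustrations, not values from the paper:

```python
def step(x, infection_risk, vaccine_risk, k=1.0, dt=0.01):
    # x: current fraction of vaccinators in the population.
    # Individuals imitate the better-off strategy at a rate proportional
    # to the perceived payoff gain from vaccinating (assumed form:
    # infection risk shrinks as coverage x rises; vaccination carries a
    # fixed perceived risk).
    payoff_gain = infection_risk * (1.0 - x) - vaccine_risk
    return x + dt * k * x * (1.0 - x) * payoff_gain

# Forward-Euler iteration: with high perceived infection risk, uptake
# climbs from 50% toward the interior equilibrium at x = 0.9
x = 0.5
for _ in range(5000):
    x = step(x, infection_risk=1.0, vaccine_risk=0.1)
```

The interior equilibrium sits where the payoff gain vanishes; stability questions of exactly this kind (plus age structure and coupling to disease dynamics) are what the record's analysis addresses.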

3. Investigating Learner Attitudes toward E-Books as Learning Tools: Based on the Activity Theory Approach

ERIC Educational Resources Information Center

Liaw, Shu-Sheng; Huang, Hsiu-Mei

2016-01-01

This paper investigates the use of e-books as learning tools in terms of learner satisfaction, usefulness, behavioral intention, and learning effectiveness. Based on the activity theory approach, this research develops a research model to understand learner attitudes toward e-books in two physical sizes: 10″ and 7″. Results suggest that screen…

4. A Practice-Based Theory of Professional Education: Teach For America's Professional Development Model

ERIC Educational Resources Information Center

Gabriel, Rachael

2011-01-01

In 1999, Ball and Cohen proposed a practice-based theory of professional education, which would end inadequate professional development efforts with a more comprehensive approach. Their work has been referenced over the past decade, yet there have been limited attempts to actualize their ideals and research their implications. In this article, I…

5. Predicting Study Abroad Intentions Based on the Theory of Planned Behavior

ERIC Educational Resources Information Center

Schnusenberg, Oliver; de Jong, Pieter; Goel, Lakshmi

2012-01-01

The emphasis on study abroad programs is growing in the academic context as U.S. based universities seek to incorporate a global perspective in education. Using a model that has underpinnings in the theory of planned behavior (TPB), we predict students' intention to participate in short-term study abroad program. We use TPB to identify behavioral,…

6. Theory-based Health Education Activities for Third to Sixth Grade Children.

ERIC Educational Resources Information Center

Jaycox, Sharon; And Others

1983-01-01

Eight educational activities based on social learning and social support theory were used as part of a comprehensive cardiovascular risk reduction program for families with children in the elementary grades. Activities focused on changing behavior with regard to diet and exercise. (Author/PP)

7. Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders

ERIC Educational Resources Information Center

Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel

2012-01-01

This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…

8. Critically Evaluating Competing Theories: An Exercise Based on the Kitty Genovese Murder

ERIC Educational Resources Information Center

Sagarin, Brad J.; Lawler-Sagarin, Kimberly A.

2005-01-01

We describe an exercise based on the 1964 murder of Catherine Genovese--a murder observed by 38 witnesses, none of whom called the police. Students read a summary of the murder and worked in small groups to design an experiment to test the competing theories for the inaction of the witnesses (Americans' selfishness and insensitivity vs. diffusion…

9. Effects of Guided Writing Strategies on Students' Writing Attitudes Based on Media Richness Theory

ERIC Educational Resources Information Center

Lan, Yu-Feng; Hung, Chun-Ling; Hsu, Hung-Ju

2011-01-01

The purpose of this paper is to develop different guided writing strategies based on media richness theory and further evaluate the effects of these writing strategies on younger students' writing attitudes in terms of motivation, enjoyment and anxiety. A total of 66 sixth-grade elementary students with an average age of twelve were invited to…

10. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

ERIC Educational Resources Information Center

Wang, Greg G.; Swanson, Richard A.

2008-01-01

Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

11. Optimal guidance law for cooperative attack of multiple missiles based on optimal control theory

Sun, Xiao; Xia, Yuanqing

2012-08-01

This article considers the problem of optimal guidance laws for cooperative attack of multiple missiles based on the optimal control theory. New guidance laws are presented such that multiple missiles attack a single target simultaneously. Simulation results show the effectiveness of the proposed algorithms.

ERIC Educational Resources Information Center

Ajideh, Parviz

2003-01-01

Describes a study in which an English-as-a-Second-Language reading instructor worked with a group of intermediate students that focused on schema theory-based pre-reading activities. Highlights the students' impressions on the strategies covered during the term. (Author/VWL)

13. Ninter-Networked Interaction: Theory-based Cases in Teaching and Learning.

ERIC Educational Resources Information Center

Saarenkunnas, Maarit; Jarvela, Sanna; Hakkinen, Paivi; Kuure, Leena; Taalas, Peppi; Kunelius, Esa

2000-01-01

Describes the pedagogical framework of an interdisciplinary, international project entitled NINTER (Networked Interaction: Theory-Based Cases in Teaching and Learning). Discusses a pedagogical model for teacher and staff development programs in a networked environment; distributed cognition; cognitive apprenticeship; challenges for educational…

14. Portuguese Public University Student Satisfaction: A Stakeholder Theory-Based Approach

ERIC Educational Resources Information Center

Mainardes, Emerson; Alves, Helena; Raposo, Mario

2013-01-01

In accordance with the importance of the student stakeholder to universities, the objective of this research project was to evaluate student satisfaction at Portuguese public universities as regards their self-expressed core expectations. The research was based both on stakeholder theory itself and on previous studies of university stakeholders.…

15. Effects of a Theory-Based, Peer-Focused Drug Education Course.

ERIC Educational Resources Information Center

Gonzalez, Gerardo M.

1990-01-01

Describes innovative, theory-based, peer-focused college drug education academic course and its effect on perceived levels of risk associated with the use of alcohol, marijuana, and cocaine. Evaluation of the effects of the course indicated the significant effect on perceived risk of cocaine, but not alcohol or marijuana. (Author/ABL)

16. Poverty Lines Based on Fuzzy Sets Theory and Its Application to Malaysian Data

ERIC Educational Resources Information Center

Abdullah, Lazim

2011-01-01

The definition of the poverty line has been acknowledged as highly variable by the majority of the published literature. Despite long discussion and some successes, the poverty line suffers a number of problems due to its arbitrary nature. This paper proposes three measurements of poverty lines using membership functions based on fuzzy set theory. The three…

17. Predicting Teachers' Intentions to Implement School-Based Assessment Using the Theory of Planned Behaviour

ERIC Educational Resources Information Center

Yan, Zi

2014-01-01

The theory of planned behaviour (TPB) was used to explore the Hong Kong teachers' intentions to implement school-based assessment (SBA) and the predictors of those intentions. A total of 280 teachers from Hong Kong secondary schools who had been involved in SBA were surveyed. Rasch-calibrated teacher measures were calculated for each of the 6…

18. Item Response Theory with Estimation of the Latent Population Distribution Using Spline-Based Densities

ERIC Educational Resources Information Center

Woods, Carol M.; Thissen, David

2006-01-01

The purpose of this paper is to introduce a new method for fitting item response theory models with the latent population distribution estimated from the data using splines. A spline-based density estimation system provides a flexible alternative to existing procedures that use a normal distribution, or a different functional form, for the…

19. Examining Instruction in MIDI-Based Composition through a Critical Theory Lens

ERIC Educational Resources Information Center

Louth, Paul

2013-01-01

This paper considers the issue of computer-assisted composition in formal music education settings from the perspective of critical theory. The author examines the case of MIDI-based software applications and suggests that the greatest danger from the standpoint of ideology critique is not the potential for circumventing a traditional…

20. Examining Instruction in MIDI-based Composition through a Critical Theory Lens

ERIC Educational Resources Information Center

Louth, Paul

2013-01-01

This paper considers the issue of computer-assisted composition in formal music education settings from the perspective of critical theory. The author examines the case of MIDI-based software applications and suggests that the greatest danger from the standpoint of ideology critique is not the potential for circumventing a traditional…

1. An Instructional Design Theory for Interactions in Web-Based Learning Environments.

ERIC Educational Resources Information Center

Lee, Miyoung; Paulus, Trena

This study developed and formatively evaluated an instructional design theory to guide designers in selecting when and how to utilize interactions as instructional methods in a Web-based distance learning higher education environment. Research questions asked: What are the types and outcomes of interactions between participants in a Web-based…

2. Investigating Acceptance toward Mobile Learning to Assist Individual Knowledge Management: Based on Activity Theory Approach

ERIC Educational Resources Information Center

Liaw, Shu-Sheng; Hatala, Marek; Huang, Hsiu-Mei

2010-01-01

Mobile devices could facilitate human interaction and access to knowledge resources anytime and anywhere. With respect to wide application possibilities of mobile learning, investigating learners' acceptance towards it is an essential issue. Based on activity theory approach, this research explores positive factors for the acceptance of m-learning…

3. Implications and Applications of Modern Test Theory in the Context of Outcomes Based Education.

ERIC Educational Resources Information Center

Andrich, David

2002-01-01

Uses a framework previously developed to relate outcomes based education and B. Bloom's "Taxonomy of Educational Objectives" to consider ways in which modern test theory can be used to connect aspects of assessment to the curriculum framework and to consider insights this connection might provide. (SLD)

4. Web-Support for Activating Use of Theory in Group-Based Learning.

ERIC Educational Resources Information Center

van der Veen, Jan; van Riemsdijk, Maarten; Laagland, Eelco; Gommer, Lisa; Jones, Val

This paper describes a series of experiments conducted within the context of a course on organizational theory that is taught at the Department of Management Sciences at the University of Twente (Netherlands). In 1997, a group-based learning approach was adopted, but after the first year it was apparent that acquisition and application of theory…

5. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

ERIC Educational Resources Information Center

de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

2011-01-01

Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

6. Generalizability Theory Reliability of Written Expression Curriculum-Based Measurement in Universal Screening

ERIC Educational Resources Information Center

Keller-Margulis, Milena A.; Mercer, Sterett H.; Thomas, Erin L.

2016-01-01

The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African…

7. Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course

ERIC Educational Resources Information Center

McGowan, Ian S.

2016-01-01

Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…

8. Effect of Cognitive-Behavioral-Theory-Based Skill Training on Academic Procrastination Behaviors of University Students

ERIC Educational Resources Information Center

Toker, Betül; Avci, Rasit

2015-01-01

This study examined the effectiveness of a cognitive-behavioral theory (CBT) psycho-educational group program on the academic procrastination behaviors of university students and the persistence of any training effect. This was a quasi-experimental research based on an experimental and control group pretest, posttest, and followup test model.…

9. Applying Unidimensional and Multidimensional Item Response Theory Models in Testlet-Based Reading Assessment

ERIC Educational Resources Information Center

Min, Shangchao; He, Lianzhen

2014-01-01

This study examined the relative effectiveness of the multidimensional bi-factor model and multidimensional testlet response theory (TRT) model in accommodating local dependence in testlet-based reading assessment with both dichotomously and polytomously scored items. The data used were 14,089 test-takers' item-level responses to the testlet-based…

10. Revisiting Transactional Distance Theory in a Context of Web-Based High-School Distance Education

ERIC Educational Resources Information Center

Murphy, Elizabeth Anne; Rodriguez-Manzanares, Maria Angeles

2008-01-01

The purpose of this paper is to report on a study that provided an opportunity to consider Transactional Distance Theory (TDT) in a current technology context of web-based learning in distance education (DE), high-school classrooms. Data collection relied on semi-structured interviews conducted with 22 e-teachers and managers in Newfoundland and…

11. English Textbooks Based on Research and Theory--A Possible Dream.

ERIC Educational Resources Information Center

Suhor, Charles

1984-01-01

Research-based text materials will probably never dominate the textbook market. To begin with, translating theory and research into practice is a chancy business. There are also creative problems, such as the inherent oversimplification involved in textbook writing. Every textbook writer who has been a classroom teacher will acknowledge that such…

12. Operationalizing Levels of Academic Mastery Based on Vygotsky's Theory: The Study of Mathematical Knowledge

ERIC Educational Resources Information Center

Nezhnov, Peter; Kardanova, Elena; Vasilyeva, Marina; Ludlow, Larry

2015-01-01

The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky's theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was…

13. Theory and Utility-Key Themes in Evidence-Based Assessment: Comment on the Special Section

ERIC Educational Resources Information Center

McFall, Richard M.

2005-01-01

This article focuses on two key themes in the four featured reviews on evidence-based assessment. The first theme is the essential role of theory in psychological assessment. An overview of this complex, multilayered role is presented. The second theme is the need for a common metric with which to gauge the utility of specific psychological tests…

14. Glacier mapping based on rough set theory in the Manas River watershed

Yan, Lili; Wang, Jian; Hao, Xiaohua; Tang, Zhiguang

2014-04-01

Precise glacier information is important for assessing climate change in remote mountain areas. To obtain more accurate glacier mapping, rough set theory, which can handle vague and uncertain information, was introduced to derive optimal knowledge rules for glacier mapping. Optical images, thermal infrared band data, texture information and morphometric parameters were combined to build a decision table for the proposed rough-set method. After discretizing the real-valued attributes, decision rules were generated with a rule-generation algorithm, and a classifier based on these rules divided the multispectral image into glacier and non-glacier areas. The result was compared with that of maximum likelihood classification (MLC), using a confusion matrix and visual interpretation to evaluate overall accuracy. The rough-set method and MLC yielded overall accuracies of 94.15% and 93.88%, respectively, and the glacier area mapped by the rough-set method differed less from the visually interpreted area than that mapped by MLC. The higher mapping accuracy and smaller area difference demonstrate that the rough-set method is effective and promising for glacier mapping.
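The rule-based classification step can be pictured with a toy sketch. The attributes, thresholds, and rules below are invented for illustration only; the study derives its rules from a discretized decision table of spectral, thermal, texture and morphometric attributes.

```python
# Toy decision-rule classifier in the spirit of the rough-set workflow.
# Attribute names and thresholds are hypothetical, not from the study.
RULES = [
    # (attribute, threshold, label assigned when value >= threshold)
    ("ndsi", 0.4, "glacier"),           # e.g. a high snow/ice index
    ("thermal", 275.0, "non-glacier"),  # e.g. warm surfaces are not glacier
]

def classify(pixel):
    """Apply the decision rules in order; fall back to non-glacier."""
    for attr, threshold, label in RULES:
        if pixel[attr] >= threshold:
            return label
    return "non-glacier"

print(classify({"ndsi": 0.6, "thermal": 260.0}))  # a cold, bright pixel
```

A real rough-set classifier would induce many such rules automatically from the discretized decision table rather than hand-coding two.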

15. Coaching Process Based on Transformative Learning Theory for Changing the Instructional Mindset of Elementary School Teachers

ERIC Educational Resources Information Center

Kawinkamolroj, Milintra; Triwaranyu, Charinee; Thongthew, Sumlee

2015-01-01

This research aimed to develop a coaching process based on transformative learning theory for changing elementary school teachers' mindset about instruction. Tools used in this process include mindset tests and questionnaires designed to assess the instructional mindset of teachers and to allow the teachers to reflect on how they perceive…

16. Enhancing Student Motivation: A Longitudinal Intervention Study Based on Future Time Perspective Theory

ERIC Educational Resources Information Center

Schuitema, Jaap; Peetsma, Thea; van der Veen, Ineke

2014-01-01

The authors investigated the effects of an intervention developed to enhance student motivation in the first years of secondary education. The intervention, based on future time perspective (FTP) theory, has been found to be effective in prevocational secondary education (T. T. D. Peetsma & I. Van der Veen, 2008, 2009). The authors extend the…

17. Lessons Learnt from Employing van Hiele Theory Based Instruction in Senior Secondary School Geometry Classrooms

ERIC Educational Resources Information Center

Alex, Jogymol Kalariparambil; Mammen, Kuttickattu John

2016-01-01

This paper reports on a part of a study which was conducted to determine the effect of van Hiele theory based instruction in the teaching of geometry to Grade 10 learners. The sample consisted of 359 participants from five conveniently selected schools from Mthatha District in the Eastern Cape Province in South Africa. There were 195 learners in…

18. Assessing Instructional Reform in San Diego: A Theory-Based Approach

ERIC Educational Resources Information Center

O'Day, Jennifer; Quick, Heather E.

2009-01-01

This article provides an overview of the approach, methodology, and key findings from a theory-based evaluation of the district-led instructional reform effort in San Diego City Schools, under the leadership of Alan Bersin and Anthony Alvarado, that began in 1998. Beginning with an analysis of the achievement trends in San Diego relative to other…

19. From Theory to Practice: Concept-Based Inquiry in a High School Art Classroom

ERIC Educational Resources Information Center

Walker, Margaret A.

2014-01-01

This study examines what an emerging educational theory looks like when put into practice in an art classroom. It explores the teaching methodology of a high school art teacher who has utilized concept-based inquiry in the classroom to engage his students in artmaking and analyzes the influence this methodology has had on his adolescent students.…

20. Basing assessment and treatment of problem behavior on behavioral momentum theory: Analyses of behavioral persistence.

PubMed

Schieltz, Kelly M; Wacker, David P; Ringdahl, Joel E; Berg, Wendy K

2017-02-17

The connection, or bridge, between applied and basic behavior analysis has been long-established (Hake, 1982; Mace & Critchfield, 2010). In this article, we describe how clinical decisions can be based more directly on behavioral processes and how basing clinical procedures on behavioral processes can lead to improved clinical outcomes. As a case in point, we describe how applied behavior analyses of maintenance, and specifically the long-term maintenance of treatment effects related to problem behavior, can be adjusted and potentially enhanced by basing treatment on Behavioral Momentum Theory. We provide a brief review of the literature including descriptions of two translational studies that proposed changes in how differential reinforcement of alternative behavior treatments are conducted based on Behavioral Momentum Theory. We then describe current clinical examples of how these translations are continuing to impact the definitions, designs, analyses, and treatment procedures used in our clinical practice.

1. A queuing model for designing multi-modality buried target detection systems: preliminary results

Malof, Jordan M.; Morton, Kenneth D.; Collins, Leslie M.; Torrione, Peter A.

2015-05-01

Many remote sensing modalities have been developed for buried target detection, each one offering its own relative advantages over the others. As a result there has been interest in combining several modalities into a single detection platform that benefits from the advantages of each constituent sensor, without suffering from their weaknesses. Traditionally this involves collecting data continuously on all sensors and then performing data, feature, or decision level fusion. While this is effective for lowering false alarm rates, this strategy neglects the potential benefits of a more general system-level fusion architecture. Such an architecture can involve dynamically changing which modalities are in operation. For example, a large standoff modality such as a forward-looking infrared (FLIR) camera can be employed until an alarm is encountered, at which point a high performance (but short standoff) sensor, such as ground penetrating radar (GPR), is employed. Because the system is dynamically changing its rate of advance and sensors, it becomes difficult to evaluate the expected false alarm rate and advance rate. In this work, a probabilistic model is proposed that can be used to estimate these quantities based on a provided operating policy. In this model the system consists of a set of states (e.g., sensors employed) and conditions encountered (e.g., alarm locations). The predictive accuracy of the model is evaluated using a collection of collocated FLIR and GPR data and the results indicate that the model is effective at predicting the desired system metrics.
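A minimal illustration of how such a model links an operating policy to the expected advance rate: consider a hypothetical two-state system that advances with the FLIR until an alarm occurs, then pauses for a GPR interrogation. All parameter values below are invented, not taken from the study.

```python
# Sketch of a two-state sensing model in the spirit of the queuing approach
# described above (hypothetical parameters; the paper's model is richer,
# covering multiple states, conditions, and operating policies).

def effective_advance_rate(v_flir, alarms_per_meter, t_gpr):
    """Expected advance rate (m/s) when every FLIR alarm triggers a GPR stop.

    Per metre travelled the system spends 1/v_flir seconds moving plus
    alarms_per_meter * t_gpr seconds interrogating alarms with the GPR.
    """
    time_per_meter = 1.0 / v_flir + alarms_per_meter * t_gpr
    return 1.0 / time_per_meter

# Illustrative numbers: 2 m/s FLIR advance, 0.05 alarms/m, 8 s per GPR stop.
rate = effective_advance_rate(2.0, 0.05, 8.0)
print(f"effective advance rate: {rate:.3f} m/s")
```

Varying the alarm density and interrogation time in such a model shows the trade-off the paper studies: a lower detection threshold raises the alarm rate and so lowers the advance rate, while skipping GPR stops raises the false alarm rate.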

2. Vervet monkeys solve a multiplayer "forbidden circle game" by queuing to learn restraint.

PubMed

Fruteau, Cécile; van Damme, Eric; Noë, Ronald

2013-04-22

In social dilemmas, the ability of individuals to coordinate their actions is crucial to reach group optima. Unless exacted by power or force, coordination in humans relies on a common understanding of the problem, which is greatly facilitated by communication. The lack of means of consultation about the nature of the problem and how to solve it may explain why multiagent coordination in nonhuman vertebrates has commonly been observed only when multiple individuals react instantaneously to a single stimulus, either natural or experimentally simulated, for example a predator, a prey, or a neighboring group. Here we report how vervet monkeys solved an experimentally induced coordination problem. In each of three groups, we trained a low-ranking female, the "provider," to open a container holding a large amount of food, which the providers only opened when all individuals dominant to them ("dominants") stayed outside an imaginary "forbidden circle" around it. Without any human guidance, the dominants learned restraint one by one, in hierarchical order from high to low. Once all dominants showed restraint immediately at the start of the trial, the providers opened the container almost instantly, saving all individuals opportunity costs due to lost foraging time. Solving this game required trial-and-error learning based on individual feedback from the provider to each dominant, and all dominants being patient enough to wait outside the circle while others learned restraint. Communication, social learning, and policing by high-ranking animals played no perceptible role.

3. Towards a Theory-Based Multi-Dimensional Framework for Assessment in Mathematics: The "SEA" Framework

Anku, Sitsofe E.

1997-09-01

Using the reform documents of the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989, 1991, 1995), a theory-based multi-dimensional assessment framework (the "SEA" framework) which should help expand the scope of assessment in mathematics is proposed. This framework uses a context based on mathematical reasoning and has components that comprise mathematical concepts, mathematical procedures, mathematical communication, mathematical problem solving, and mathematical disposition.

4. [The model of reward choice based on the theory of reinforcement learning].

PubMed

Smirnitskaia, I A; Frolov, A A; Merzhanova, G Kh

2007-01-01

We developed a model of an alimentary instrumental conditioned bar-pressing reflex for cats choosing between an immediate small reinforcement ("impulsive behavior") and a delayed, more valuable reinforcement ("self-control behavior"). Our model is based on reinforcement learning theory. We emulated the contribution of dopamine by the theory's discount coefficient (a subjective decrease in the value of a delayed reinforcement). Computer simulation showed that "cats" with a large discount coefficient demonstrated "self-control behavior", while a small discount coefficient was associated with "impulsive behavior". These data agree with experimental findings indicating that impulsive behavior is due to a decreased amount of dopamine in the striatum.
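The choice rule described above can be sketched in a few lines. Reward magnitudes and the delay are illustrative, not fitted to the cat experiments: the subjective value of a reward delayed by d steps is discounted by gamma**d, and the agent picks the option with the larger discounted value.

```python
# Exponentially discounted choice between an immediate small reward and a
# delayed larger reward (illustrative values; gamma plays the role the
# abstract assigns to the dopamine-related discount coefficient).

def choose(gamma, small_reward=1.0, large_reward=4.0, delay=5):
    """Return the behavior predicted for a given discount coefficient."""
    immediate = small_reward                     # available right away
    delayed = (gamma ** delay) * large_reward    # subjectively discounted
    return "self-control" if delayed > immediate else "impulsive"

print(choose(gamma=0.9))  # large discount coefficient
print(choose(gamma=0.5))  # small discount coefficient
```

With gamma = 0.9 the delayed option keeps most of its value (0.9**5 * 4 ≈ 2.36 > 1), so the "cat" waits; with gamma = 0.5 it does not (0.5**5 * 4 ≈ 0.13 < 1), so the "cat" takes the immediate reward.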

5. Theory-based scaling of the SOL width in circular limited tokamak plasmas

Halpern, F. D.; Ricci, P.; Labit, B.; Furno, I.; Jolliet, S.; Loizu, J.; Mosetto, A.; Arnoux, G.; Gunn, J. P.; Horacek, J.; Kočan, M.; LaBombard, B.; Silva, C.; Contributors, JET-EFDA

2013-12-01

A theory-based scaling for the characteristic length of a circular, limited tokamak scrape-off layer (SOL) is obtained by considering the balance between parallel losses and non-linearly saturated resistive ballooning mode turbulence driving anomalous perpendicular transport. The SOL size increases with plasma size, resistivity, and safety factor q. The scaling is verified against flux-driven non-linear turbulence simulations, which reveal good agreement within a wide range of dimensionless parameters, including parameters closely matching the TCV tokamak. An initial comparison of the theory against experimental data from several tokamaks also yields good agreement.

6. A Monte Carlo Comparison of Item and Person Statistics Based on Item Response Theory versus Classical Test Theory.

ERIC Educational Resources Information Center

MacDonald, Paul; Paunonen, Sampo V.

2002-01-01

Examined the behavior of item and person statistics from item response theory and classical test theory frameworks through Monte Carlo methods with simulated test data. Findings suggest that item difficulty and person ability estimates are highly comparable for both approaches. (SLD)
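The comparability of classical and IRT item statistics that the abstract reports can be illustrated with a tiny Monte Carlo sketch. Sample sizes and difficulty values here are invented, and the study's design was far more extensive; the point is only that classical item difficulty (proportion correct) tracks IRT difficulty b.

```python
import math
import random

random.seed(0)

def rasch_prob(theta, b):
    """Probability of a correct response under the one-parameter (Rasch) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Simulate 2,000 examinees and five items of known IRT difficulty b.
abilities = [random.gauss(0.0, 1.0) for _ in range(2000)]
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]

# Classical-test-theory item difficulty: proportion of correct responses.
p_values = []
for b in difficulties:
    n_correct = sum(1 for theta in abilities
                    if random.random() < rasch_prob(theta, b))
    p_values.append(n_correct / len(abilities))

# Classical p-values fall monotonically as IRT difficulty b rises.
print([round(p, 3) for p in p_values])
```

The monotone relationship between the two difficulty scales is what makes the item statistics "highly comparable" in simulations like this one.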

7. Robust stabilization control based on guardian maps theory for a longitudinal model of hypersonic vehicle.

PubMed

Liu, Yanbin; Liu, Mengying; Sun, Peihua

2014-01-01

A typical model of a hypersonic vehicle has complicated dynamics such as unstable states, nonminimum phases, and strong input-output coupling. As a result, designing a robust stabilization controller is essential to implement the anticipated tasks. This paper presents a robust stabilization controller based on guardian maps theory for a hypersonic vehicle. First, guardian maps theories are provided to explain the constraint relations between the open subsets of the complex plane and the eigenvalues of the state matrix of the closed-loop control system. Then, a general control structure in relation to the guardian maps theories is proposed to achieve the expected design demands. Furthermore, the robust stabilization control law depending on the given general control structure is designed for the longitudinal model of the hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods.

8. Classification of PolSAR image based on quotient space theory

An, Zhihui; Yu, Jie; Liu, Xiaomeng; Liu, Limin; Jiao, Shuai; Zhu, Teng; Wang, Shaohua

2015-12-01

In order to improve classification accuracy, quotient space theory was applied to the classification of polarimetric SAR (PolSAR) images. First, the Yamaguchi decomposition method is adopted to obtain the polarimetric characteristics of the image, while the gray-level co-occurrence matrix (GLCM) and Gabor wavelets are used to extract texture features. Second, combining the texture features with the polarimetric characteristics, a Support Vector Machine (SVM) classifier performs an initial classification to establish spaces of different granularity. Finally, according to quotient space granularity synthesis theory, the different quotient spaces are merged to obtain the comprehensive classification result. The method proposed in this paper is tested with L-band AIRSAR data of San Francisco Bay. The results show that the comprehensive classification result based on quotient space theory is superior to the classification result of any single granularity space.

9. Vibration analysis of single-walled carbon peapods based on nonlocal Timoshenko beam theory

Ghadiri, Majid; Hajbarati, Hamid; Safi, Mohsen

2017-04-01

In this article, vibration behavior of single-walled carbon nanotube encapsulating C60 molecules is studied using the Eringen's nonlocal elasticity theory within the frame work of Timoshenko beam theory. The governing equation and boundary conditions are derived using Hamilton's principle. It is considered that the nanopeapod is embedded in an elastic medium and the C60 molecules are modeled as lumped masses attached to the nanobeam. The Galerkin's method is applied to determine the natural frequency of the nanobeam with clamped-clamped boundary conditions. Effects of nonlocality, foundation stiffness, and ratio of the fullerenes' mass to the nanotube's mass on the natural frequencies are investigated. In addition, by vanishing effects of shear deformation and rotary inertia, the results based on Euler-Bernoulli beam theory are presented.

10. Robust Stabilization Control Based on Guardian Maps Theory for a Longitudinal Model of Hypersonic Vehicle

PubMed Central

Liu, Mengying; Sun, Peihua

2014-01-01

A typical model of a hypersonic vehicle has complicated dynamics such as unstable states, nonminimum phases, and strong input-output coupling. As a result, designing a robust stabilization controller is essential to implement the anticipated tasks. This paper presents a robust stabilization controller based on guardian maps theory for a hypersonic vehicle. First, guardian maps theories are provided to explain the constraint relations between the open subsets of the complex plane and the eigenvalues of the state matrix of the closed-loop control system. Then, a general control structure in relation to the guardian maps theories is proposed to achieve the expected design demands. Furthermore, the robust stabilization control law depending on the given general control structure is designed for the longitudinal model of the hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods. PMID:24795535

11. An automated integration-free path-integral method based on Kleinert's variational perturbation theory

Wong, Kin-Yiu; Gao, Jiali

2007-12-01

Based on Kleinert's variational perturbation (KP) theory [Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets, 3rd ed. (World Scientific, Singapore, 2004)], we present an analytic path-integral approach for computing the effective centroid potential. The approach enables the KP theory to be applied to any realistic systems beyond the first-order perturbation (i.e., the original Feynman-Kleinert [Phys. Rev. A 34, 5080 (1986)] variational method). Accurate values are obtained for several systems in which exact quantum results are known. Furthermore, the computed kinetic isotope effects for a series of proton transfer reactions, in which the potential energy surfaces are evaluated by density-functional theory, are in good accordance with experiments. We hope that our method could be used by non-path-integral experts or experimentalists as a "black box" for any given system.

12. A comparison of design variables for control theory based airfoil optimization

NASA Technical Reports Server (NTRS)

Reuther, James; Jameson, Antony

1995-01-01

This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work in the area it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using either the potential flow or the Euler equations with either a conformal mapping or a general coordinate system. We have also explored three-dimensional extensions of these formulations recently. The goal of our present work is to demonstrate the versatility of the control theory approach by designing airfoils using both Hicks-Henne functions and B-spline control points as design variables. The research also demonstrates that the parameterization of the design space is an open question in aerodynamic design.

13. Performance of Four-Leg VSC based DSTATCOM using Single Phase P-Q Theory

Jampana, Bangarraju; Veramalla, Rajagopal; Askani, Jayalaxmi

2017-02-01

This paper presents single-phase P-Q theory for a four-leg VSC-based distributed static compensator (DSTATCOM) in the distribution system. The proposed DSTATCOM maintains unity power factor at the source, provides zero voltage regulation, eliminates current harmonics, and performs load balancing and neutral-current compensation. The advantage of using a four-leg VSC-based DSTATCOM is that it eliminates the isolated/non-isolated transformer connection at the point of common coupling (PCC) for neutral-current compensation, which reduces the cost of the DSTATCOM. The single-phase P-Q theory control algorithm extracts the fundamental components of the active and reactive currents to generate the reference source currents, based on an indirect current control method. The proposed DSTATCOM is modelled, and the results are validated with various consumer loads under unity power factor and zero voltage regulation modes in the MATLAB R2013a environment using the SimPowerSystems toolbox.

14. Theory and experiments in model-based space system anomaly management

This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first-principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms was developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite, which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

15. Recognizing through feeling. A physical and computer simulation based on educational theory.

PubMed

Lyons, J; Milton, J

1999-01-01

This article focuses on the educational theory underpinning computer-based simulation in professional education. An innovative computer-based physical simulation to facilitate student learning of assessment and palpation skills in midwifery has been developed to prototype stage and preliminary evaluations conducted. The learning experience explicitly builds on the learning and teaching theory--a "conversational framework"--developed by Laurillard. A template incorporating all the dimensions of the Laurillard framework in the learning experience is presented and discussed. It is argued that this template could have wider application especially in clinically based health science courses. The case-based learning environment allows the students to solve problems and make valid clinical judgments. Throughout the learning experiences, students effectively examine a pregnant woman while interrelating their experiences with the academic knowledge of the teacher (a world of "descriptions"). The structure for learning relies on a mechanism for identifying and addressing the misunderstandings students initially hold. The creation of situations of cognitive conflict in the student's world of action (Laurillard's concept of intrinsic feedback) is seen as central to the learning experience. Finally, the article will canvass the issues faced by a project team designing and developing a technology-based educational package around an educational theory.

16. Efficacy of theory-based interventions to promote physical activity. A meta-analysis of randomised controlled trials.

PubMed

Gourlan, M; Bernard, P; Bortolon, C; Romain, A J; Lareyre, O; Carayol, M; Ninot, G; Boiché, J

2016-01-01

Implementing theory-based interventions is an effective way to influence physical activity (PA) behaviour in the population. This meta-analysis aimed to (1) determine the global effect of theory-based randomised controlled trials dedicated to the promotion of PA among adults, (2) measure the actual efficacy of interventions against their theoretical objectives and (3) compare the efficacy of single- versus combined-theory interventions. A systematic search through databases and review articles was carried out. Our results show that theory-based interventions (k = 82) significantly impact the PA behaviour of participants (d = 0.31, 95% CI [0.24, 0.37]). While moderation analyses revealed no efficacy difference between theories, interventions based on a single theory (d = 0.35; 95% CI [0.26, 0.43]) reported a higher impact on PA behaviour than those based on a combination of theories (d = 0.21; 95% CI [0.11, 0.32]). In spite of the global positive effect of theory-based interventions on PA behaviour, further research is required to better identify the specificities, overlaps or complementarities of the components of interventions based on relevant theories.

17. Personality and Psychopathology: a Theory-Based Revision of Eysenck’s PEN Model

PubMed Central

van Kampen, Dirk

2009-01-01

The principal aim of this paper is to investigate whether it is possible to create a personality taxonomy of clinical relevance out of Eysenck’s original PEN model by repairing the various shortcomings that can be noted in Eysenck’s personality theory, particularly in relation to P or Psychoticism. Addressing three approaches that have been followed to answer the question ‘which personality factors are basic?’, arguments are listed to show that particularly the theory-informed approach, originally defended by Eysenck, may lead to scientific progress. However, also noting the many deficiencies in the nomological network surrounding P, the peculiar situation arises that we adhere to Eysenck’s theory-informed methodology, but criticize his theory. These arguments and criticisms led to the replacement of P by three orthogonal and theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), that together with the dimensions E or Extraversion and N or Neuroticism, that were retained from Eysenck’s PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions. PMID:20498694

18. An interface energy density-based theory considering the coherent interface effect in nanomaterials

Yao, Yin; Chen, Shaohua; Fang, Daining

2017-02-01

To characterize the coherent interface effect in nanomaterials conveniently and feasibly, a continuum theory is proposed based on the concept of the interface free energy density, a dominant factor affecting the mechanical properties of coherent interfaces in materials at all scales. The theory accounts for the effects on the interface free energy density of both the residual strain caused by self-relaxation and lattice misfit in nanomaterials and the interface deformation induced by an external load. In contrast to existing theories, the stress discontinuity at the interface is characterized by the interface free energy density through an interface-induced traction. As a result, the interface elastic constant introduced in previous theories, which is not easy to determine precisely, is avoided in the present theory. Only the surface energy density of the bulk materials forming the interface, the relaxation parameter induced by surface relaxation, and the mismatch parameter for forming a coherent interface between the two surfaces are involved. All of these parameters are far easier to determine than interface elastic constants. The effective bulk and shear moduli of a nanoparticle-reinforced nanocomposite are predicted using the proposed theory. Closed-form solutions are achieved, demonstrating the feasibility and convenience of the proposed model for predicting the interface effect in nanomaterials.

19. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission.

PubMed

Kiani, Mehdi; Ghovanloo, Maysam

2012-09-01

Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory.
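For a two-coil link, the steady-state equivalence of RLT and CMT noted above means both yield the same closed-form maximum efficiency, governed by the figure of merit u = k·sqrt(Q1·Q2). A sketch using this widely cited inductive-link expression (a standard literature result under optimal load matching, not the paper's full multi-coil derivation):

```python
import math

def max_pte(k, q1, q2):
    """Maximum power transfer efficiency of a two-coil inductive link.

    Standard closed form from the inductive-link literature, expressed
    through the figure of merit u = k * sqrt(Q1 * Q2); the optimum
    assumes an optimally matched load.
    """
    u = k * math.sqrt(q1 * q2)
    return u * u / (1.0 + math.sqrt(1.0 + u * u)) ** 2

# Example: loosely coupled high-Q coils (k = 0.05, Q1 = Q2 = 100, so u = 5).
eta = max_pte(k=0.05, q1=100.0, q2=100.0)
```

The formula makes the abstract's point concrete: efficiency depends on coupling and quality factors only through their product, so CMT's high-Q, large-distance regime and RLT's circuit view coincide.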

20. Membrane-Based Characterization of a Gas Component — A Transient Sensor Theory

PubMed Central

Lazik, Detlef

2014-01-01

Based on a multi-gas solution-diffusion problem for a dense symmetrical membrane, this paper presents a transient theory of a planar, membrane-based sensor cell for measuring gas from either initial condition: dynamic or thermodynamic equilibrium. Using this theory, the ranges of validity of previously developed, simpler approaches are discussed; these approaches are of vital interest for membrane-based gas sensor applications. Finally, a new theoretical approach is introduced to identify varying gas components by arranging sensor cell pairs, resulting in a concentration-independent, gas-specific critical time. Literature data for the N2, O2, Ar, CH4, CO2, H2 and C4H10 diffusion coefficients and solubilities for a polydimethylsiloxane membrane were used to simulate gas-specific sensor responses. The results demonstrate the influence of (i) the operational mode; (ii) sensor geometry and (iii) gas matrices (air, Ar) on that critical time. Based on the developed theory, case-specific suitable membrane materials can be determined, and both operation and design options for these sensors can be optimized for individual applications. The results of mixing experiments for different gases (O2, CO2) in a gas matrix of air confirmed the theoretical predictions. PMID:24608004

1. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.

2016-06-01

The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
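The "maximum information content, minimum redundancy" criterion can be sketched with discrete entropy and mutual information over discretized water-level series. Below is a toy greedy selection; the station names and series are hypothetical, and the paper's exact objective function, discretization, and bridge-handling rules may differ:

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) of a discretized series."""
    m = len(xs)
    return -sum(c / m * math.log2(c / m) for c in Counter(xs).values())

def mutual_info(xs, ys):
    """Redundancy between two series: I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def greedy_select(series, k):
    """Greedily pick k cross sections: high marginal information content,
    low redundancy (mutual information) with those already chosen."""
    chosen, remaining = [], list(series)
    while len(chosen) < k and remaining:
        best = max(remaining,
                   key=lambda s: entropy(series[s])
                   - sum(mutual_info(series[s], series[c]) for c in chosen))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Hypothetical discretized water levels at four candidate sections:
# xs02 duplicates xs01, xs04 carries no information, xs03 is independent.
levels = {"xs01": [0, 0, 1, 1], "xs02": [0, 0, 1, 1],
          "xs03": [0, 1, 0, 1], "xs04": [0, 0, 0, 0]}
picked = greedy_select(levels, 2)   # -> ['xs01', 'xs03']
```

The greedy step skips the duplicate and the constant section, which is the intuition behind thinning redundant cross sections while keeping informative ones near structures such as bridges.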

2. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.

PubMed

Salim, Shelly; Moh, Sangman

2016-06-30

A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead.

3. Surface effects on the vibration behavior of flexoelectric nanobeams based on nonlocal elasticity theory

2017-01-01

In this research, the vibration characteristics of a flexoelectric nanobeam in contact with a Winkler-Pasternak foundation are investigated based on nonlocal elasticity theory, considering surface effects. This nonclassical nanobeam model includes the flexoelectric effect to capture the coupling of strain gradients and electrical polarizations. Moreover, nonlocal elasticity theory is employed to study the nonlocal and long-range interactions between the particles. The present model degenerates into the classical model if the nonlocal parameter and the flexoelectric and surface effects are omitted. Hamilton's principle is employed to derive the governing equations and the related boundary conditions, which are solved using a Galerkin-based solution. Natural frequencies are verified against those of previous papers on nanobeams. It is illustrated that flexoelectricity, nonlocality, surface stresses, the elastic foundation and the boundary conditions considerably affect the vibration frequencies of piezoelectric nanobeams.

4. Theory-based metrological traceability in education: A reading measurement network.

PubMed

Fisher, William P; Stenner, A Jackson

2016-10-01

Huge resources are invested in metrology and standards in the natural sciences, engineering, and across a wide range of commercial technologies. Significant positive returns of human, social, environmental, and economic value on these investments have been sustained for decades. Proven methods for calibrating test and survey instruments in linear units are readily available, as are data- and theory-based methods for equating those instruments to a shared unit. Using these methods, metrological traceability is obtained in a variety of commercially available elementary and secondary English and Spanish language reading education programs in the U.S., Canada, Mexico, and Australia. Given established historical patterns, widespread routine reproduction of predicted text-based and instructional effects expressed in a common language and shared frame of reference may lead to significant developments in theory and practice. Opportunities for systematic implementations of teacher-driven lean thinking and continuous quality improvement methods may be of particular interest and value.

5. Design of Flexure-based Precision Transmission Mechanisms using Screw Theory

SciTech Connect

Hopkins, J B; Panas, R M

2011-02-07

This paper enables the synthesis of flexure-based transmission mechanisms that possess multiple decoupled inputs and outputs of any type (e.g. rotations, translations, and/or screw motions), which are linked by designer-specified transmission ratios. A comprehensive library of geometric shapes is utilized from which every feasible concept that possesses the desired transmission characteristics may be rapidly conceptualized and compared before an optimal concept is selected. These geometric shapes represent the rigorous mathematics of screw theory and uniquely link a body's desired motions to the flexible constraints that enable those motions. This paper's impact is most significant to the design of nano-positioners, microscopy stages, optical mounts, and sensors. A flexure-based microscopy stage was designed, fabricated, and tested to demonstrate the utility of the theory.

6. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks

PubMed Central

Salim, Shelly; Moh, Sangman

2016-01-01

A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

7. Eight myths on motivating social services workers: theory-based perspectives.

PubMed

Latting, J K

1991-01-01

A combination of factors has made formal motivational and reward systems rare in human service organizations generally and virtually non-existent in social service agencies. The author reviews eight of these myths by reference to eight motivational theories which refute them: need theory, expectancy theory, feedback theory, equity theory, reinforcement theory, cognitive evaluation theory, goal setting theory, and social influence theory. Although most of these theories have been developed and applied in the private sector, relevant research has also been conducted in social service agencies. The author concludes with a summary of guidelines suggested by the eight theories for motivating human service workers.

8. A simplified orthotropic formulation of the viscoplasticity theory based on overstress

NASA Technical Reports Server (NTRS)

Sutcu, M.; Krempl, E.

1988-01-01

An orthotropic, small strain viscoplasticity theory based on overstress is presented. In each preferred direction the stress is composed of time (rate) independent (or plastic) and viscous (or rate dependent) contributions. Tension-compression asymmetry can depend on direction and is included in the model. Upon a proper choice of a material constant one preferred direction can exhibit linear elastic response while the other two deform in a viscoplastic manner.

9. A reexamination of information theory-based methods for DNA-binding site identification

PubMed Central

Erill, Ivan; O'Neill, Michael C

2009-01-01

Background Searching for transcription factor binding sites in genome sequences is still an open problem in bioinformatics. Despite substantial progress, search methods based on information theory remain a standard in the field, even though the full validity of their underlying assumptions has only been tested in artificial settings. Here we use newly available data on transcription factors from different bacterial genomes to make a more thorough assessment of information theory-based search methods. Results Our results reveal that conventional benchmarking against artificial sequence data frequently leads to overestimation of search efficiency. In addition, we find that sequence information by itself is often inadequate and therefore must be complemented by other cues, such as curvature, in real genomes. Furthermore, results on skewed genomes show that methods integrating skew information, such as Relative Entropy, are not effective because their assumptions may not hold in real genomes. The evidence suggests that binding sites tend to evolve towards genomic skew, rather than against it, and to maintain their information content through increased conservation. Based on these results, we identify several misconceptions about information theory as applied to binding sites, such as negative entropy, and we propose a revised paradigm to explain the observed results. Conclusion We conclude that, among information theory-based methods, the most unassuming search methods perform, on average, better than the alternatives, since heuristic corrections to these methods are prone to fail when working on real data. A reexamination of information content in binding sites reveals that information content is a compound measure of search and binding affinity requirements, a fact that has important repercussions for our understanding of binding site evolution. PMID:19210776
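The "information content" at issue is typically the Schneider-style Rseq of an aligned set of sites: the divergence of per-position base frequencies from the genomic background, summed over positions. A minimal sketch (uniform background, no pseudocounts or small-sample correction, hypothetical sites):

```python
import math
from collections import Counter

def information_content(sites, background=0.25):
    """Rseq-style information content (bits) of aligned binding sites:
    sum over positions of f_b * log2(f_b / p_b), here with a uniform
    background p_b (no pseudocounts or sample-size correction)."""
    n = len(sites)
    total = 0.0
    for column in zip(*sites):           # one column per alignment position
        freqs = Counter(column)
        total += sum((c / n) * math.log2((c / n) / background)
                     for c in freqs.values())
    return total

# Four hypothetical sites: two fully conserved positions (2 bits each)
# and one uninformative position (0 bits).
sites = ["ATA", "ATC", "ATG", "ATT"]
ic = information_content(sites)   # 2 + 2 + 0 = 4 bits
```

Replacing the uniform `background` with skewed genomic base frequencies turns this into the Relative Entropy variant discussed above, which is exactly where the abstract argues the assumptions can break down.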

10. Microvibration Attenuation based on H∞/LPV Theory for High Stability Space Missions

Preda, Valentin; Cieslak, Jerome; Henry, David; Bennani, Samir; Falcoz, Alexandre

2015-11-01

This paper presents an LPV (Linear Parameter Varying) solution for a mixed passive-active architecture used to mitigate the microvibrations generated by reaction wheels in satellites. In particular, H∞/LPV theory is used to mitigate low-frequency disturbances, while the current baseline for high-frequency microvibration mitigation relies on elastomer materials. The issue of multiple harmonic microvibrations is also investigated. Simulation results from a test benchmark provided by Airbus Defence and Space demonstrate the potential of the proposed method.

11. Information-theory-based solution of the inverse problem in classical statistical mechanics.

PubMed

D'Alessandro, Marco; Cilloco, Francesco

2010-08-01

We present a procedure for the determination of the interaction potential from the knowledge of the radial pair distribution function. The method, realized inside an inverse Monte Carlo simulation scheme, is based on the application of the maximum entropy principle of information theory and the interaction potential emerges as the asymptotic expression of the transition probability. Results obtained for high density monoatomic fluids are very satisfactory and provide an accurate extraction of the potential, despite a modest computational effort.

12. A Quantitative Quasispecies Theory-Based Model of Virus Escape Mutation Under Immune Selection

DTIC Science & Technology

2012-01-01

A quantitative quasispecies theory-based model of virus escape mutation under immune selection. Hyung-June Woo and Jaques Reifman. Biotechnology High... Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although... response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral...

13. Applying Educational Theory to Simulation-Based Training and Assessment in Surgery.

PubMed

Chauvin, Sheila W

2015-08-01

Considerable progress has been made regarding the range of simulator technologies and simulation formats. Similarly, results from research in human learning and behavior have facilitated the development of best practices in simulation-based training (SBT) and surgical education. Today, SBT is a common curriculum component in surgical education that can significantly complement clinical learning, performance, and patient care experiences. Beginning with important considerations for selecting appropriate forms of simulation, several relevant educational theories of learning are described.

14. A general theory to analyse and design wireless power transfer based on impedance matching

Liu, Shuo; Chen, Linhui; Zhou, Yongchun; Cui, Tie Jun

2014-10-01

We propose a general theory to analyse and design wireless power transfer (WPT) systems based on impedance matching. We take two commonly used structures as examples, the transformer-coupling-based WPT and the series/parallel capacitor-based WPT, to show how to design the impedance matching network (IMN) to obtain the maximum transfer efficiency and the maximum output power. Using the impedance matching theory (IMT), we derive a simple expression for the overall transfer efficiency in terms of the coils' quality factors and the coupling coefficient, which shows excellent accuracy compared to full-circuit simulations. Full-wave electromagnetic software, CST Microwave Studio, has been used to extract the parameters of the coils, providing us with a comprehensive way to simulate WPT systems directly from the coils' physical model. We have also discussed the relationship between the output power and the transfer efficiency, and found that the maximum output power and the maximum transfer efficiency may occur at different frequencies. Hence, both power and efficiency should be considered in real WPT applications. To validate the proposed theory, two types of WPT experiments have been conducted using 30 cm-diameter coils, lighting a 20 W light bulb with 60% efficiency over a distance of 50 cm. The experimental results show very good agreement with the theoretical predictions.

15. Practice of Improving Roll Deformation Theory in Strip Rolling Process Based on Boundary Integral Equation Method

Yuan, Zhengwen; Xiao, Hong; Xie, Hongbiao

2014-02-01

Precise strip-shape control theory is essential for improving rolled strip quality, and roll flattening theory is a primary part of strip-shape theory. To improve the accuracy of roll flattening calculations based on the semi-infinite body model, a new and more accurate roll flattening model is proposed in this paper, derived using the boundary integral equation method. The displacement fields of the finite-length semi-infinite body on the left and right sides are simulated using the finite element method (FEM), and displacement decay functions on the left and right sides are established. Based on the new roll flattening model, a new 4Hi mill deformation model is established and verified by FEM. The new model is compared with the Foppl formula and the semi-infinite body model for different strip widths, roll shifting values and bending forces. The results show that the pressure and flattening between rolls calculated by the new model are more precise than those of the other two models, especially near the two roll barrel edges.

16. A new probability distribution model of turbulent irradiance based on Born perturbation theory

Wang, Hongxing; Liu, Min; Hu, Hao; Wang, Qian; Liu, Xiguo

2010-10-01

The subject of the PDF (Probability Density Function) of irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak-turbulence regime, but theoretical descriptions in the strong- and whole-turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (the Rice-Nakagami, exponential-Bessel and negative-exponential distributions) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which refutes the viewpoint that the Rice-Nakagami model is applicable only in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. A common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero; it is therefore considered to reflect the Born perturbation theory exactly. Simulated results confirm the accuracy of this new model.

17. Adapting SAFT-γ perturbation theory to site-based molecular dynamics simulation. I. Homogeneous fluids

SciTech Connect

2013-12-21

In this work, we aim to develop a version of the Statistical Associating Fluid Theory (SAFT)-γ equation of state (EOS) that is compatible with united-atom force fields, rather than experimental data. We rely on the accuracy of the force fields to provide the relation to experimental data. Although our objective is a transferable theory of interfacial properties for soft and fused heteronuclear chains, we first clarify the details of the SAFT-γ approach in terms of site-based simulations for homogeneous fluids. We show that a direct comparison of the Helmholtz free energy to molecular simulation, in the framework of a third-order Weeks-Chandler-Andersen perturbation theory, leads to an EOS that takes force field parameters as input and reproduces simulation results for Vapor-Liquid Equilibria (VLE) calculations. For example, the saturated liquid density and vapor pressure of n-alkanes ranging from methane to dodecane deviate from those of the Transferable Potential for Phase Equilibria (TraPPE) force field by about 0.8% and 4%, respectively. Similar agreement between simulation and theory is obtained for critical properties and the second virial coefficient. The EOS also reproduces simulation data of mixtures with about 5% deviation in bubble point pressure. Extension to inhomogeneous systems and united-atom site types beyond those used in the description of n-alkanes will be addressed in succeeding papers.

18. A theory-based approach to understanding suicide risk in shelter-seeking women.

PubMed

Wolford-Clevenger, Caitlin; Smith, Phillip N

2015-04-01

Women seeking shelter from intimate partner violence are at an increased risk for suicide ideation and attempts compared to women in the general population. Control-based violence, which is common among shelter-seeking women, may play a pivotal role in the development of suicide ideation and attempts. Current risk assessment and management practices for shelter-seeking women are limited by the lack of an empirically grounded understanding of increased risk in this population. We argue that in order to more effectively promote risk assessment and management, an empirically supported theory that is sensitive to the experiences of shelter-seeking women is needed. Such a theory-driven approach has the benefits of identifying and prioritizing targetable areas for intervention. Here, we review the evidence for the link between coercive control and suicide ideation and attempts from the perspective of Baumeister's escape theory of suicide. This theory has the potential to explain the role of coercive control in the development of suicide ideation and eventual attempts in shelter-seeking women. Implications for suicide risk assessment and prevention in domestic violence shelters are discussed.

19. Theory of electrical conductivity and dielectric permittivity of highly aligned graphene-based nanocomposites.

PubMed

Xia, Xiaodong; Hao, Jia; Wang, Yang; Zhong, Zheng; Weng, George J

2017-03-24

Highly aligned graphene-based nanocomposites are of great interest due to their excellent electrical properties along the aligned direction. Graphene fillers in these composites are not necessarily perfectly aligned, but their orientations are highly confined to a certain angle, with 90-degree giving rise to the randomly oriented state and 0-degree to the perfectly aligned one. Recent experiments have shown that electrical conductivity and dielectric permittivity of highly aligned graphene-polymer nanocomposites are strongly dependent on this distribution angle, but at present no theory seems to exist to address this issue. In this work we present a new effective-medium theory that is derived from the underlying physical process including the effects of graphene orientation, filler loading, aspect ratio, percolation threshold, interfacial tunneling, and Maxwell-Wagner-Sillars polarization, to determine these two properties. The theory is formulated in the context of preferred orientational average. We highlight this new theory with an application to rGO/epoxy nanocomposites, and demonstrate that the calculated in-plane and out-of-plane conductivity and permittivity are in agreement with the experimental data as the range of graphene orientations changes from the randomly oriented to the highly aligned state. We also show that the percolation thresholds of highly aligned graphene nanocomposites are in general different along the planar and the normal directions, but they converge into a single one when the statistical distribution of graphene fillers is spherically symmetric.

20. Theory of plasma contactors in ground-based experiments and low Earth orbit

NASA Technical Reports Server (NTRS)

Gerver, M. J.; Hastings, Daniel E.; Oberhardt, M. R.

1990-01-01

Previous theoretical work on plasma contactors as current collectors has fallen into two categories: collisionless double layer theory (describing space charge limited contactor clouds) and collisional quasineutral theory. Ground based experiments at low current are well explained by double layer theory, but this theory does not scale well to power generation by electrodynamic tethers in space, since very high anode potentials are needed to draw a substantial ambient electron current across the magnetic field in the absence of collisions (or effective collisions due to turbulence). Isotropic quasineutral models of contactor clouds, extending over a region where the effective collision frequency ν_e exceeds the electron cyclotron frequency ω_ce, have low anode potentials, but would collect very little ambient electron current, much less than the emitted ion current. A new model is presented, for an anisotropic contactor cloud oriented along the magnetic field, with ν_e less than ω_ce. The electron motion along the magnetic field is nearly collisionless, forming double layers in that direction, while across the magnetic field the electrons diffuse collisionally and the potential profile is determined by quasineutrality. Using a simplified expression for ν_e due to ion acoustic turbulence, an analytic solution has been found for this model, which should be applicable to current collection in space. The anode potential is low and the collected ambient electron current can be several times the emitted ion current.

1. Current algebra formulation of M-theory based on E11 Kac-Moody algebra

Sugawara, Hirotaka

2017-02-01

Quantum M-theory is formulated using the current algebra technique. The current algebra is based on a Kac-Moody algebra rather than a usual finite-dimensional Lie algebra. Specifically, I study the E11 Kac-Moody algebra that was recently shown [1-5] to contain all the ingredients of M-theory. Both the internal symmetry and the external Lorentz symmetry can be realized inside E11, so that, by constructing the current algebra of E11, I obtain both internal gauge theory and gravity theory. The energy-momentum tensor is constructed as the bilinear form of the currents, yielding a system of quantum equations of motion of the currents/fields. Supersymmetry is incorporated in a natural way. The so-called “field-current identity” is built in and, for example, the gravitino field is itself a conserved supercurrent. One unanticipated outcome is that the quantum gravity equation is not identical to the one obtained from the Einstein-Hilbert action.

2. Three new branched chain equations of state based on Wertheim's perturbation theory.

PubMed

Marshall, Bennett D; Chapman, Walter G

2013-05-07

In this work, we present three new branched chain equations of state (EOS) based on Wertheim's perturbation theory. The first represents a slightly approximate general branched chain solution of Wertheim's second order perturbation theory (TPT2) for athermal hard chains, and the second represents the extension of first order perturbation theory with a dimer reference fluid (TPT1-D) to branched athermal hard chain molecules. Each athermal branched chain EOS was shown to give improved results over their linear counterparts when compared to simulation data for branched chain molecules with the branched TPT1-D EOS being the most accurate. Further, it is shown that the branched TPT1-D EOS can be extended to a Lennard-Jones dimer reference system to obtain an equation of state for branched Lennard-Jones chains. The theory is shown to accurately predict the change in phase diagram and vapor pressure which results from branching as compared to experimental data for n-octane and corresponding branched isomers.

3. Characterizations of MV-algebras based on the theory of falling shadows.

PubMed

Yang, Yongwei; Xin, Xiaolong; He, Pengfei

2014-01-01

Based on the falling shadow theory, the concept of falling fuzzy (implicative) ideals is proposed in MV-algebras as a generalization of T∧-fuzzy (implicative) ideals. The relationships between falling fuzzy (implicative) ideals and T-fuzzy (implicative) ideals are discussed, and conditions for a falling fuzzy (implicative) ideal to be a T∧-fuzzy (implicative) ideal are provided. Some characterizations of falling fuzzy (implicative) ideals are presented by studying their properties. The product ⊛ and the up product ⊚ operations on falling shadows and the upset of a falling shadow are established, by which T-fuzzy ideals are investigated based on probability spaces.

4. [A method for the medical image registration based on the statistics samples averaging distribution theory].

PubMed

Xu, Peng; Yao, Dezhong; Luo, Fen

2005-08-01

The registration method based on mutual information is currently a popular technique for medical image registration, but the computation of mutual information is complex and registration is slow. In engineering practice, a subsampling technique is used to accelerate registration at the cost of registration accuracy. In this paper, a new method based on statistical sampling theory is developed that offers both higher speed and higher accuracy than the normal subsampling method, and simulation results confirm the validity of the new method.
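The mutual-information similarity measure underlying both methods, and the speed/accuracy trade-off of subsampling, can be sketched as follows. The toy 1-D intensity lists below stand in for images, and the MI definition is the standard joint-histogram plug-in estimate; the paper's own statistical-sampling scheme is not reproduced here:

```python
import math
import random
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) from the empirical histogram of xs."""
    m = len(xs)
    return -sum(c / m * math.log2(c / m) for c in Counter(xs).values())

def mutual_information(a, b):
    """MI between two equally sized intensity lists via a joint histogram."""
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))

random.seed(0)
# Toy "images": img2 is img1 corrupted by one bit of noise (mod 4),
# so the true MI is about 1 bit (H = 2 + 2 - 3).
img1 = [random.randint(0, 3) for _ in range(10000)]
img2 = [(v + random.randint(0, 1)) % 4 for v in img1]

mi_full = mutual_information(img1, img2)
idx = random.sample(range(10000), 1000)          # 10% subsample
mi_sub = mutual_information([img1[i] for i in idx],
                            [img2[i] for i in idx])
```

The subsampled estimate is roughly 10x cheaper but noisier, which is precisely the accuracy loss a statistically grounded sampling scheme aims to reduce.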

5. Testing a theory-based mobility monitoring protocol using in-home sensors: a feasibility study.

PubMed

Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J

2013-10-01

Mobility is a key factor in the performance of many everyday tasks required for independent living as a person ages. The purpose of this mixed-methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assess the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial, and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3-month, and 6-month visits. Semi-structured interviews to characterize acceptability of the technology were conducted at the 3-month and 6-month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation.

6. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

Cifter, Atilla

2011-06-01

This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that the wavelet-based extreme value theory increases predictive performance of financial forecasting according to number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
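
The paper's wavelet-based threshold selection is not reproduced here; as a point of comparison, a plain peaks-over-threshold VaR estimate with a moment-fitted generalized Pareto tail can be sketched as follows (the threshold quantile and moment estimators are generic assumptions, not the authors' choices):

```python
import numpy as np

def pot_var(losses, q=0.99, u_quantile=0.90):
    # Peaks-over-threshold VaR: fit a generalized Pareto distribution (GPD)
    # to excesses over a high threshold u, then invert the tail formula.
    losses = np.asarray(losses, dtype=float)
    u = np.quantile(losses, u_quantile)
    excess = losses[losses > u] - u
    m, s2 = excess.mean(), excess.var()
    # Method-of-moments GPD estimates (valid for shape xi < 1/2)
    xi = 0.5 * (1.0 - m * m / s2)
    sigma = 0.5 * m * (1.0 + m * m / s2)
    tail_frac = excess.size / losses.size      # fraction of data in the tail
    if abs(xi) < 1e-8:                         # exponential-tail limit
        return u + sigma * np.log(tail_frac / (1.0 - q))
    return u + sigma / xi * ((tail_frac / (1.0 - q)) ** xi - 1.0)
```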

7. Promoting fruit and vegetable consumption. Testing an intervention based on the theory of planned behaviour.

PubMed

Kothe, E J; Mullan, B A; Butow, P

2012-06-01

This study evaluated the efficacy of a theory of planned behaviour (TPB) based intervention to increase fruit and vegetable consumption. The extent to which fruit and vegetable consumption and change in intake could be explained by the TPB was also examined. Participants were randomly assigned to two levels of intervention frequency matched for intervention content (low frequency n=92, high frequency n=102). Participants received TPB-based email messages designed to increase fruit and vegetable consumption; the messages targeted attitude, subjective norm and perceived behavioural control (PBC). Baseline and post-intervention measures of TPB variables and behaviour were collected. Across the entire study cohort, fruit and vegetable consumption increased by 0.83 servings/day between baseline and follow-up. Intention, attitude, subjective norm and PBC also increased (p<.05). The TPB successfully modelled fruit and vegetable consumption at both time points but not behaviour change. The increase in fruit and vegetable consumption is a promising preliminary finding for those primarily interested in increasing fruit and vegetable consumption. However, those interested in theory development may have concerns about the use of this model to explain behaviour change in this context. More high-quality experimental tests of the theory are needed to confirm this result.

8. Developing Theory to Guide Building Practitioners’ Capacity to Implement Evidence-Based Interventions

PubMed Central

Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C.; Escoffery, Cam T.; Herrmann, Alison K.; Thatcher, Esther; Hartman, Marieke A.; Fernandez, Maria

2017-01-01

Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners’ capacity to adopt and implement a variety of EBIs across diverse practice contexts. PMID:26500080

9. Developing Theory to Guide Building Practitioners' Capacity to Implement Evidence-Based Interventions.

PubMed

Leeman, Jennifer; Calancie, Larissa; Kegler, Michelle C; Escoffery, Cam T; Herrmann, Alison K; Thatcher, Esther; Hartman, Marieke A; Fernandez, Maria E

2017-02-01

Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners' capacity to adopt and implement a variety of EBIs across diverse practice contexts.

10. Compressed sensing theory-based channel estimation for optical orthogonal frequency division multiplexing communication system

Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping

2014-09-01

Due to the sparse multipath property of the channel, a channel estimation method based on a partial superimposed training sequence and compressed sensing theory is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of the orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed from the training symbols at the receiver. Finally, channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partial superimposed training sequence not only improves spectral efficiency but also reduces the interference to information symbols. In addition, compared with classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory based channel estimation method improves both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
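
The abstract does not name the reconstruction algorithm; orthogonal matching pursuit (OMP) is a common choice for recovering a sparse channel vector h from compressed observations y = Ah, and can be sketched as follows (the matrix sizes and sparsity level below are illustrative assumptions):

```python
import numpy as np

def omp(A, y, sparsity):
    # Orthogonal matching pursuit: greedily pick the column most correlated
    # with the residual, then re-fit by least squares on the chosen support.
    residual = y.astype(float).copy()
    support = []
    coef = np.zeros(0)
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    h = np.zeros(A.shape[1])
    h[support] = coef
    return h
```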

11. An optimization program based on the method of feasible directions: Theory and users guide

NASA Technical Reports Server (NTRS)

Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

1994-01-01

The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.

12. Blockage fault diagnosis method of combine harvester based on BPNN and DS evidence theory

Chen, Jin; Xu, Kai; Wang, Yifan; Wang, Kun; Wang, Shuqing

2017-01-01

Given the complexity of combine harvester blockage faults and the lack of intelligent analysis methods for them, this paper puts forward a blockage fault diagnosis method based on the combination of a BP neural network (BPNN) and DS evidence theory. Taking the cutting table auger, conveyer trough, threshing cylinder and grain conveying auger as the objects of study, the paper divides the condition of the combine harvester into four categories, namely normal, slight blockage, blockage and severe blockage, which serve as the identification framework for DS evidence theory. The BP neural network is used to analyse speed information from the monitoring points and to assign a basic probability to each proposition in the identification framework. The Dempster combination rule fuses information gathered at different times to obtain diagnostic results. Test results show that this method can judge the working state of the combine harvester in a timely and accurate manner: the blockage fault warning time is increased to 2 seconds and the success rate of blockage fault warning reaches more than 90%.
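
Dempster's combination rule, which the paper uses to fuse BPNN outputs from different times, multiplies the masses of intersecting hypotheses and renormalizes away the conflicting mass. A minimal sketch over the paper's four-state frame (the numeric masses in the usage test are made up for illustration):

```python
def dempster_combine(m1, m2):
    # m1, m2: dicts mapping frozensets of states to basic probability masses.
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb        # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}
```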

13. Generalized poroviscoelastic model based on effective Biot theory and its application to borehole guided wave analysis

Liu, Xu; Greenhalgh, Stewart; Zhou, Bing; Heinson, Graham

2016-12-01

A method using a modified attenuation factor function is suggested to determine the parameters of the generalized Zener model that approximates the attenuation factor function. This method is applied to construct a poroviscoelastic model based on effective Biot theory, which accounts for the attenuative solid frame of the reservoir. In the poroviscoelastic model, the frequency-dependent bulk and shear moduli of the solid frame are represented by generalized Zener models. As an application, the borehole logging dispersion equations from Biot theory are extended to include the effects of intrinsic body attenuation in the formation media over the full frequency range. The velocity dispersion of borehole guided waves is calculated to investigate the influence of an attenuative bore fluid, an attenuative solid frame of the formation and an impermeable bore wall.

14. Change detection of bitemporal multispectral images based on FCM and D-S theory

Shi, Aiye; Gao, Guirong; Shen, Shaohong

2016-12-01

In this paper, we propose a change detection method for bitemporal multispectral images based on D-S theory and the fuzzy c-means (FCM) algorithm. First, the uncertainty and certainty regions are determined by a thresholding method applied to the magnitudes of the difference image (MDI) and the spectral angle information (SAI) of the bitemporal images. Second, the FCM algorithm is applied to the MDI and the SAI in the uncertainty region separately. The basic probability assignment (BPA) functions of the changed and unchanged classes are then obtained from the fuzzy membership values produced by the FCM algorithm. In addition, the optimal value of the FCM fuzzy exponent is determined adaptively from the degree of conflict between the MDI and SAI in the uncertainty region. Finally, D-S theory is applied to obtain a new fuzzy partition matrix for the uncertainty region, from which the change map is derived. Experiments on bitemporal Landsat TM images and bitemporal SPOT images validate that the proposed method is effective.
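
The FCM membership update used to derive the BPA functions follows the standard formula u_ik = 1 / Σ_j (d_ik/d_jk)^(2/(m-1)). A generic sketch for fixed cluster centers (not the authors' adaptive-exponent variant):

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    # Fuzzy c-means membership update for fixed centers.
    # X: (n_points, n_features); centers: (n_clusters, n_features)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    d = np.fmax(d, 1e-12)                    # guard against zero distance
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)
```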

15. Social judgment theory based model on opinion formation, polarization and evolution

Chau, H. F.; Wong, C. Y.; Chow, F. K.; Fung, Chi-Hang Fred

2014-12-01

The dynamical origin of opinion polarization in the real world is an interesting topic that physical scientists may help to understand. To properly model the dynamics, the theory must be fully compatible with findings by social psychologists on microscopic opinion change. Here we introduce a generic model of opinion formation with homogeneous agents based on the well-known social judgment theory in social psychology by extending a similar model proposed by Jager and Amblard. The agents’ opinions will eventually cluster around extreme and/or moderate opinions forming three phases in a two-dimensional parameter space that describes the microscopic opinion response of the agents. The dynamics of this model can be qualitatively understood by mean-field analysis. More importantly, first-order phase transition in opinion distribution is observed by evolving the system under a slow change in the system parameters, showing that punctuated equilibria in public opinion can occur even in a fully connected social network.
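
The Jager-Amblard-style microscopic rule underlying such models can be sketched as follows: an agent assimilates toward opinions inside its latitude of acceptance and shifts away from opinions inside its latitude of rejection (the thresholds u, t and step size mu below are illustrative, not the paper's parameters):

```python
def sjt_update(x_i, x_j, u=0.2, t=0.8, mu=0.1):
    # Social-judgment-style opinion update on the interval [-1, 1]:
    # assimilate when opinions are close, contrast (move away) when far,
    # and ignore opinions in the non-commitment zone in between.
    d = x_j - x_i
    if abs(d) < u:        # latitude of acceptance
        x_i += mu * d
    elif abs(d) > t:      # latitude of rejection
        x_i -= mu * d
    return min(1.0, max(-1.0, x_i))
```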

16. Dynamically Incremental K-means++ Clustering Algorithm Based on Fuzzy Rough Set Theory

Li, Wei; Wang, Rujing; Jia, Xiufang; Jiang, Qing

Because the classic K-means++ clustering algorithm applies only to static data, a dynamically incremental K-means++ clustering algorithm (DK-Means++) based on fuzzy rough set theory is presented in this paper. Firstly, in the DK-Means++ algorithm, the similarity formula is improved with weights computed from the importance degree of attributes, which are reduced on the basis of rough fuzzy set theory. Secondly, new data points only need to be matched to a granule already clustered by the K-means++ algorithm; only rarely is new data re-clustered by the classic K-means++ algorithm over the global data set. In this way, re-clustering all data each time the data set grows is avoided, and the efficiency of clustering is improved. Our experiments show that the DK-Means++ algorithm can objectively and efficiently deal with the clustering of dynamically incremental data.
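
The incremental step can be sketched as a nearest-granule match: a new point joins an existing cluster when it falls within that granule's radius, and only otherwise triggers a full K-means++ re-clustering (the radius test is an illustrative stand-in for the paper's fuzzy-rough matching):

```python
import numpy as np

def assign_incremental(x, centroids, radii):
    # Match a new point to the nearest existing granule if it lies within
    # that granule's radius; return -1 to signal a full re-clustering.
    d = np.linalg.norm(centroids - x, axis=1)
    j = int(np.argmin(d))
    if d[j] <= radii[j]:
        return j          # absorbed by an existing granule
    return -1             # fall back to classic K-means++ on all data
```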

17. Control theory based airfoil design for potential flow and a finite volume discretization

NASA Technical Reports Server (NTRS)

Reuther, J.; Jameson, A.

1994-01-01

This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.

18. A preliminary study on atrial epicardial mapping signals based on Graph Theory.

PubMed

Sun, Liqian; Yang, Cuiwei; Zhang, Lin; Chen, Ying; Wu, Zhong; Shao, Jun

2014-07-01

In order to get a better understanding of atrial fibrillation, we introduced a method based on Graph Theory to interpret the relations between different parts of the atria. Atrial electrograms under sinus rhythm and atrial fibrillation were collected from eight living mongrel dogs with a cholinergic AF model. These epicardial signals were acquired from 95 unipolar electrodes attached to the surface of the atria and the four pulmonary veins. We then analyzed the electrode correlations using Graph Theory, studying the topology, the connectivity and the parameters of the graphs during different rhythms. Our results showed that the connectivity of the graphs varied from sinus rhythm to atrial fibrillation and that there were parameter gradients across various parts of the atria. The results provide spatial insight into the interaction between different parts of the atria, and the method may have potential for studying atrial fibrillation.
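
A common way to build such an electrode graph is to threshold pairwise signal correlations and then read off node-level measures such as degree. The abstract does not specify the authors' edge criterion, so the threshold below is an illustrative assumption:

```python
import numpy as np

def correlation_graph(signals, threshold=0.5):
    # signals: (n_electrodes, n_samples). Edge where |correlation| >= threshold.
    corr = np.corrcoef(signals)
    adj = np.abs(corr) >= threshold
    np.fill_diagonal(adj, False)   # no self-loops
    return adj

def degrees(adj):
    # Node degree: number of electrodes each electrode is linked to.
    return adj.sum(axis=1)
```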

19. Characterization of degeneration process in combustion instability based on dynamical systems theory

Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru

2015-11-01

We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011), 10.1063/1.3563577; Chaos 22, 043128 (2012), 10.1063/1.4766589]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics.

20. Characterization of degeneration process in combustion instability based on dynamical systems theory.

PubMed

Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru

2015-11-01

We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011); Chaos 22, 043128 (2012)]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics.

1. Collaborative filtering algorithm based on Forgetting Curve and Long Tail theory

Qi, Shen; Li, Shiwei; Zhou, Hao

2017-03-01

The traditional collaborative filtering algorithm only pays attention to users' ratings. In reality, however, user and item information changes constantly over time, so recommendation systems need to take time-varying behaviour into consideration. A collaborative filtering algorithm based on the Forgetting Curve and Long Tail theory (FCLT) is introduced to address these problems. Two points are developed: first, the user-item rating matrix is updated in real time using the forgetting curve; second, a refined similarity calculation method is derived from Long Tail theory and item popularity. The experimental results demonstrate that the proposed algorithm can effectively improve recommendation accuracy and alleviate the Long Tail effect.
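
An Ebbinghaus-style forgetting curve is commonly implemented as an exponential decay of rating weight with rating age; a sketch (the half-life is an assumed parameter, not one taken from the paper):

```python
import numpy as np

def decayed_rating(rating, age_days, half_life=30.0):
    # Ebbinghaus-style exponential forgetting: a rating's weight halves
    # every `half_life` days, so old preferences fade from the matrix.
    w = 0.5 ** (np.asarray(age_days, dtype=float) / half_life)
    return np.asarray(rating, dtype=float) * w
```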

2. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

NASA Technical Reports Server (NTRS)

Waszak, Martin R.

1992-01-01

The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.

3. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach

PubMed Central

Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

2016-01-01

The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing. PMID:27462287

4. Accuracies of the empirical theories of the escape probability based on Eigen model and Braun model compared with the exact extension of Onsager theory.

PubMed

Wojcik, Mariusz; Tachiya, M

2009-03-14

This paper deals with the exact extension of the original Onsager theory of the escape probability to the case of finite recombination rate at nonzero reaction radius. The empirical theories based on the Eigen model and the Braun model, which are applicable in the absence and presence of an external electric field, respectively, are based on a wrong assumption that both recombination and separation processes in geminate recombination follow exponential kinetics. The accuracies of the empirical theories are examined against the exact extension of the Onsager theory. The Eigen model gives the escape probability in the absence of an electric field, which is different by a factor of 3 from the exact one. We have shown that this difference can be removed by operationally redefining the volume occupied by the dissociating partner before dissociation, which appears in the Eigen model as a parameter. The Braun model gives the escape probability in the presence of an electric field, which is significantly different from the exact one over the whole range of electric fields. Appropriate modification of the original Braun model removes the discrepancy at zero or low electric fields, but it does not affect the discrepancy at high electric fields. In all the above theories it is assumed that recombination takes place only at the reaction radius. The escape probability in the case when recombination takes place over a range of distances is also calculated and compared with that in the case of recombination only at the reaction radius.
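
At zero field the classical Onsager result gives the escape probability of an isolated geminate pair at initial separation r0 as exp(-rc/r0), where rc is the Onsager radius at which the Coulomb attraction equals kBT. A sketch of this zero-field baseline (not of the paper's finite-recombination-rate extension):

```python
import math

def onsager_radius(eps_r, T=298.0):
    # r_c = e^2 / (4*pi*eps0*eps_r*kB*T): the separation at which the
    # Coulomb energy of the ion pair equals the thermal energy kB*T.
    e = 1.602176634e-19        # elementary charge, C
    eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
    kB = 1.380649e-23          # Boltzmann constant, J/K
    return e ** 2 / (4 * math.pi * eps0 * eps_r * kB * T)

def escape_probability(r0, eps_r, T=298.0):
    # Onsager zero-field escape probability for initial separation r0 (m).
    return math.exp(-onsager_radius(eps_r, T) / r0)
```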

5. A boundary-continuous-displacement based Fourier analysis of laminated doubly-curved panels using classical shallow shell theories

Chaudhuri, Reaz A.; Kabir, Humayun R.

1992-11-01

A new methodology based on classical shallow shell theories is presented for solution to the static response and eigenvalue problems, involving a system of one fourth-order and two third-order highly coupled linear partial differential equations with the SS2-type simply supported boundary conditions. A comparison with solutions based on the first-order shear deformation theory made it possible to establish the upper limit of validity of the present classical lamination theory (CLT) based natural frequencies for angle-ply panels. Data obtained confirmed that introduction of transverse shear stress resultants into the two surface-parallel force equilibrium equations without concomitant changes in the kinematic relations constitutes little improvement over Donnell's or Sanders' shell theories, and all four classical shallow shell theories furnish virtually indistinguishable numerical results.

6. An Implicational View of Self-Healing and Personality Change Based on Gendlin's Theory of Experiencing.

ERIC Educational Resources Information Center

Bohart, Arthur C.

There is relatively little theory on how psychotherapy clients self-heal, since most theories of therapy stress the magic of the therapist's interventions. Of the theories that exist, this paper briefly discusses Carl Rogers' theory of self-actualization and the dialectical theories of Greenberg and his colleagues, Jenkins, and Rychlak. Gendlin's…

7. Web-Based Learning Environment: A Theory-Based Design Process for Development and Evaluation

ERIC Educational Resources Information Center

Nam, Chang S.; Smith-Jackson, Tonya L.

2007-01-01

Web-based courses and programs have increasingly been developed by many academic institutions, organizations, and companies worldwide due to their benefits for both learners and educators. However, many of the developmental approaches lack two important considerations needed for implementing Web-based learning applications: (1) integration of the…

8. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

2012-01-01

Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. Objectives of this work include (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures at two thoroughly-monitored agricultural watersheds located in different hydroclimatic zones with similar land use, and (b) to elucidate and quantify temporal and spatial scale effects on those measures. We selected two USDA experimental watersheds to serve as case study examples, including the Little River experimental watershed (LREW) in Tifton, Georgia and the Sleepers River experimental watershed (SREW) in North Danville, Vermont. Both watersheds possess several nested sub-watersheds and more than 30 years of continuous data records of precipitation and streamflow. Information content measures (metric entropy and mean information gain) and complexity measures (effective measure complexity and fluctuation complexity) were computed based on the binary encoding of 5-year streamflow and precipitation time series data. We quantified patterns of streamflow using probabilities of joint or sequential appearances of the binary symbol sequences. Results of our analysis illustrate that information content measures of streamflow time series are much smaller than those for precipitation data, and the streamflow data also exhibit higher complexity, suggesting that the watersheds effectively act as filters of the precipitation information that leads to the observed additional complexity in streamflow measures. Correlation coefficients between the information-theory-based measures and time intervals are close to 0.9, demonstrating the significance of temporal scale effects on streamflow patterns. Moderate spatial scale effects on streamflow patterns are observed with absolute values of correlation coefficients between the measures and sub-watershed area
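
The metric entropy of a binary-encoded series is the Shannon entropy of length-L words normalized by L. A minimal sketch of that measure (the mean-information-gain and complexity measures are omitted, and binarizing about the series mean is an assumed encoding):

```python
import numpy as np
from collections import Counter

def binary_encode(x):
    # 1 where the value exceeds the series mean, else 0.
    x = np.asarray(x, dtype=float)
    return (x > x.mean()).astype(int)

def metric_entropy(symbols, L=3):
    # Shannon entropy of length-L words, normalized by L (bits per symbol).
    words = [tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1)]
    counts = Counter(words)
    n = len(words)
    p = np.array([c / n for c in counts.values()])
    return float(-(p * np.log2(p)).sum() / L)
```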

9. Optimal control of ICU patient discharge: from theory to implementation.

PubMed

Mallor, Fermín; Azcárate, Cristina; Barado, Julio

2015-09-01

This paper deals with the management of scarce health care resources. We consider a control problem in which the objective is to minimize the rate of patient rejection due to service saturation. The scope of decisions is limited, in terms both of the amount of resources to be used, which are supposed to be fixed, and of the patient arrival pattern, which is assumed to be uncontrollable. This means that the only potential areas of control are speed or completeness of service. By means of queuing theory and optimization techniques, we provide a theoretical solution expressed in terms of service rates. In order to make this theoretical analysis useful for the effective control of the healthcare system, however, further steps in the analysis of the solution are required: physicians need flexible and medically-meaningful operative rules for shortening patient length of service to the degree needed to give the service rates dictated by the theoretical analysis. The main contribution of this paper is to discuss how the theoretical solutions can be transformed into effective management rules to guide doctors' decisions. The study examines three types of rules based on intuitive interpretations of the theoretical solution. Rules are evaluated through implementation in a simulation model. We compare the service rates provided by the different policies with those dictated by the theoretical solution. Probabilistic analysis is also included to support rule validity. An Intensive Care Unit is used to illustrate this control problem. The study focuses on the Markovian case before moving on to consider more realistic LoS distributions (Weibull, Lognormal and Phase-type distributions).
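
An ICU of this kind behaves like a loss system: a patient arriving when all beds are occupied is rejected. For the Markovian case, the rejection probability of an M/M/c/c system is given by the Erlang-B recursion, sketched here (bed counts and offered loads are illustrative):

```python
def erlang_b(servers, offered_load):
    # Probability that an arriving patient is rejected in an M/M/c/c
    # loss system, via the numerically stable Erlang-B recursion:
    # B(0) = 1;  B(k) = a*B(k-1) / (k + a*B(k-1)),  a = offered load.
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b
```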

10. Simulating Single Word Processing in the Classic Aphasia Syndromes Based on the Wernicke-Lichtheim-Geschwind Theory

ERIC Educational Resources Information Center

Weems, Scott A.; Reggia, James A.

2006-01-01

The Wernicke-Lichtheim-Geschwind (WLG) theory of the neurobiological basis of language is of great historical importance, and it continues to exert a substantial influence on most contemporary theories of language in spite of its widely recognized limitations. Here, we suggest that neurobiologically grounded computational models based on the WLG…

11. The Conceptual Mechanism for Viable Organizational Learning Based on Complex System Theory and the Viable System Model

ERIC Educational Resources Information Center

Sung, Dia; You, Yeongmahn; Song, Ji Hoon

2008-01-01

The purpose of this research is to explore the possibility of viable learning organizations based on identifying viable organizational learning mechanisms. Two theoretical foundations, complex system theory and viable system theory, have been integrated to provide the rationale for building the sustainable organizational learning mechanism. The…

12. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

ERIC Educational Resources Information Center

Fukuhara, Hirotaka; Kamata, Akihito

2011-01-01

A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

13. An Alienation-Based Framework for Student Experience in Higher Education: New Interpretations of Past Observations in Student Learning Theory

ERIC Educational Resources Information Center

2014-01-01

This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…

14. Evaluating Art Studio Courses at Sultan Qaboos University in Light of the Discipline Based Art Education Theory

ERIC Educational Resources Information Center

Al-Amri, Mohammed

2010-01-01

Discipline-Based Art Education (DBAE), a theory developed in the USA, has been influential and widely used in Art Education institutions world-wide. One of its stated goals was to improve the quality of art education teaching. Today, it is used as a theory for identifying and assessing good practices in the field of Art Education. The purpose of…

15. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

SciTech Connect

Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

2006-10-01

Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
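A minimal sketch of the general sampling-based propagation idea (not the specific strategy of the report): each focal element of the input, an interval carrying a basic probability assignment, is pushed through the model by sampling within it, and belief/plausibility bounds on an output set are then read off the image intervals. The model and focal elements below are hypothetical.

```python
import random

def propagate_focal_elements(model, focal, n_samples=500, seed=0):
    """For each focal element ((lo, hi), mass), estimate the image
    interval [min f, max f] by simple random sampling within it."""
    rng = random.Random(seed)
    images = []
    for (lo, hi), m in focal:
        ys = [model(rng.uniform(lo, hi)) for _ in range(n_samples)]
        images.append(((min(ys), max(ys)), m))
    return images

def belief_plausibility(images, y_set):
    """Belief: total mass of image intervals fully inside y_set = [a, b];
    plausibility: total mass of image intervals intersecting it."""
    a, b = y_set
    bel = sum(m for (lo, hi), m in images if a <= lo and hi <= b)
    pl = sum(m for (lo, hi), m in images if hi >= a and lo <= b)
    return bel, pl
```

The interval [bel, pl] is exactly the looser-than-probability uncertainty statement the abstract refers to; the computational burden comes from running the (here trivial, in practice expensive) model many times per focal element.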

16. Motivational Measure of the Instruction Compared: Instruction Based on the ARCS Motivation Theory vs Traditional Instruction in Blended Courses

ERIC Educational Resources Information Center

Colakoglu, Ozgur M.; Akdemir, Omur

2012-01-01

The ARCS Motivation Theory was proposed to guide instructional designers and teachers who develop their own instruction to integrate motivational design strategies into the instruction. There is a lack of literature supporting the idea that instruction for blended courses, if designed based on the ARCS Motivation Theory, provides different…

17. A simple laminate theory using the orthotropic viscoplasticity theory based on overstress. I - In-plane stress-strain relationships for metal matrix composites

NASA Technical Reports Server (NTRS)

Krempl, Erhard; Hong, Bor Zen

1989-01-01

A macromechanics analysis is presented for the in-plane, anisotropic time-dependent behavior of metal matrix laminates. The small deformation, orthotropic viscoplasticity theory based on overstress represents lamina behavior in a modified simple laminate theory. Material functions and constants can be identified in principle from experiments with laminae. Orthotropic invariants can be repositories for tension-compression asymmetry and for linear elasticity in one direction while the other directions behave in a viscoplastic manner. Computer programs are generated and tested for either unidirectional or symmetric laminates under in-plane loading. Correlations with the experimental results on metal matrix composites are presented.

18. Toward a limited realism for psychiatric nosology based on the coherence theory of truth.

PubMed

Kendler, K S

2015-04-01

A fundamental debate in the philosophy of science is whether our central concepts are true or only useful instruments to help predict and manipulate the world. The first position is termed 'realism' and the second 'instrumentalism'. Strong support for the instrumentalist position comes from the 'pessimistic induction' (PI) argument. Given that many key scientific concepts once considered true (e.g., humors, ether, epicycles, phlogiston) are now considered false, how, the argument goes, can we assert that our current concepts are true? The PI argument applies strongly to psychiatric diagnoses. Given our long history of abandoned diagnoses, arguments that we have finally 'gotten it right' and developed definitive psychiatric categories that correspond to observer-independent reality are difficult to defend. For our current diagnostic categories, we should settle for a less ambitious vision of truth. For this, the coherence theory, which postulates that something is true when it fits well with the other things we confidently know about the world, can serve us well. Using the coherence theory, a diagnosis is real to the extent that it is well integrated into our accumulating scientific data base. Furthermore, the coherence theory establishes a framework for us to evaluate our diagnostic categories and can provide a set of criteria, closely related to our concept of validators, for deciding when they are getting better. Finally, we need be much less skeptical about the truth status of the aggregate concept of psychiatric illness than we are regarding the specific categories in our current nosology.

19. Factors influencing variation in physician adenoma detection rates: a theory-based approach

PubMed Central

Atkins, Louise; Hunkeler, Enid M.; Jensen, Christopher D.; Michie, Susan; Lee, Jeffrey K.; Doubeni, Chyke A.; Zauber, Ann G.; Levin, Theodore R.; Quinn, Virginia P.; Corley, Douglas A.

2015-01-01

Background & Aims Interventions to improve physician adenoma detection rates for colonoscopy have generally not been successful, and there are few data on the factors contributing to variation that may be appropriate targets for intervention. We sought to identify factors that may influence variation in detection rates using theory-based tools for understanding behavior. Methods We separately studied gastroenterologists and endoscopy nurses at three Kaiser Permanente Northern California medical centers to identify potentially modifiable factors relevant to physician adenoma detection rate variability using structured group interviews (focus groups) and theory-based tools for understanding behavior and eliciting behavior change: the Capability, Opportunity, and Motivation behavior model; the Theoretical Domains Framework; and the Behavior Change Wheel. Results Nine factors potentially associated with detection rate variability were identified, including six related to capability (uncertainty about which types of polyps to remove; style of endoscopy team leadership; compromised ability to focus during an examination due to distractions; examination technique during withdrawal; difficulty detecting certain types of adenomas; and examiner fatigue and pain), two related to opportunity (perceived pressure due to the number of examinations expected per shift and social pressure to finish examinations before scheduled breaks or the end of a shift), and one related to motivation (valuing a meticulous examination as the top priority). Examples of potential intervention strategies are provided. Conclusions Using theory-based tools, this study identified several novel and potentially modifiable factors relating to capability, opportunity, and motivation that may contribute to adenoma detection rate variability and be appropriate targets for future intervention trials. PMID:26366787

20. A theory-based intervention to improve breast cancer awareness and screening in Jamaica.

PubMed

Anakwenze, Chidinma P; Coronado-Interis, Evelyn; Aung, Maung; Jolly, Pauline E

2015-05-01

Despite declines in breast cancer mortality rates in developed countries, mortality rates remain high in Jamaica due to low levels of screening and lack of early detection. We hypothesized that a theory-based health educational intervention would increase awareness of breast cancer and intention to screen among women in Western Jamaica. Two hundred and forty six women attending hospitals or clinics were enrolled in an educational intervention consisting of a pretest, breast cancer presentation, and posttest if they had never been screened or had not been screened in 5 years or more. The questionnaires assessed attitudes and knowledge of risk factors and symptoms related to breast cancer. Participants were followed approximately 6 months after the intervention to determine whether they accessed breast cancer screening. There were statistically significant increases (p < 0.0001) in the percentage of correct knowledge responses and in participants' intention to screen from pretest to posttest. The greatest posttest improvements were among items measuring knowledge of breast cancer screening tests and risk factors. Of the 134 women who were reached by phone for post-intervention follow-up, 30 women (22.4 %) were screened for breast cancer and 104 women (77.6 %) had not been screened. The use of a theory-based educational intervention positively influenced knowledge of breast cancer risk factors, symptoms, and types of screening and increased screening rates in screening-naïve women. This theory-based educational intervention may be replicated to promote awareness of breast cancer and further increase screening rates in other areas of Jamaica and other developing countries.

1. A feature extraction method based on information theory for fault diagnosis of reciprocating machinery.

PubMed

Wang, Huaqing; Chen, Peng

2009-01-01

This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to.

2. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

PubMed Central

Wang, Huaqing; Chen, Peng

2009-01-01

This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

3. Genetic-program-based data mining for hybrid decision-theoretic algorithms and theories

Smith, James F., III

2005-03-01

A genetic program (GP) based data mining (DM) procedure has been developed that automatically creates decision theoretic algorithms. A GP is an algorithm that uses the theory of evolution to automatically evolve other computer programs or mathematical expressions. The output of the GP is a computer program or mathematical expression that is optimal in the sense that it maximizes a fitness function. The decision theoretic algorithms created by the DM algorithm are typically designed for making real-time decisions about the behavior of systems. The database that is mined by the DM typically consists of many scenarios characterized by sensor output and labeled by experts as to the status of the scenario. The DM procedure will call a GP as a data mining function. The GP incorporates the database and experts' rules into its fitness function to evolve an optimal decision theoretic algorithm. A decision theoretic algorithm created through this process will be discussed as well as validation efforts showing the utility of the decision theoretic algorithm created by the DM process. GP based data mining to determine equations related to scientific theories and automatic simplification methods based on computer algebra will also be discussed.

4. Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.

PubMed

Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong

2016-05-01

This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not impose any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure on how to calculate the DI for two finite-time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both simulated data and experimental fMRI data, and compare the results with those of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships, but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC in quantifying the overall causal relationship.
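The two major steps named in the abstract, digitization and probability estimation, can be illustrated with a plug-in estimator under a first-order Markov approximation of the DI rate. This is a simplified sketch, not the paper's full procedure (which includes optimal bin-size selection):

```python
import numpy as np

def cond_mutual_info(a, b, c, bins):
    """Plug-in estimate of I(A; B | C) in bits from three aligned
    discrete sequences taking values in {0, ..., bins-1}."""
    n = len(a)
    p_abc = np.zeros((bins, bins, bins))
    for ai, bi, ci in zip(a, b, c):
        p_abc[ai, bi, ci] += 1.0
    p_abc /= n
    p_ac = p_abc.sum(axis=1)   # marginal over B
    p_bc = p_abc.sum(axis=0)   # marginal over A
    p_c = p_abc.sum(axis=(0, 1))
    mi = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                if p_abc[i, j, k] > 0:
                    mi += p_abc[i, j, k] * np.log2(
                        p_abc[i, j, k] * p_c[k] / (p_ac[i, k] * p_bc[j, k]))
    return mi

def directed_info_rate(x, y, bins=2):
    """First-order Markov approximation of the directed information
    rate from x to y: I(X_{t-1}; Y_t | Y_{t-1})."""
    return cond_mutual_info(x[:-1], y[1:], y[:-1], bins)
```

With `y` a one-step-lagged copy of a random binary `x`, this estimator reports roughly 1 bit from x to y and near zero in the reverse direction, matching the directional asymmetry DI is meant to capture.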

5. Removing barriers to rehabilitation: Theory-based family intervention in community settings after brain injury.

PubMed

Stejskal, Taryn M

2012-01-01

Rehabilitation professionals have become increasingly aware that family members play a critical role in the recovery process of individuals after brain injury. In addition, researchers have begun to identify a relationship between family member caregivers' well-being and survivors' outcomes. The idea of a continuum of care, or following survivors from inpatient care to community reintegration, has become an important model of treatment across many hospital and community-based settings. In concert with the continuum of care, present research literature indicates that family intervention may be a key component to successful rehabilitation after brain injury. Yet clinicians interacting with family members and survivors often feel confounded about how exactly to intervene with the broader family system beyond the individual survivor. Drawing on the systemic nature of the field of marriage and family therapy (MFT), this article provides information to assist clinicians in effectively intervening with families using theory-based interventions in community settings. First, a rationale for the utilization of systems-based, as opposed to individual-based, therapies is presented. Second, historically relevant publications focusing on family psychotherapy and intervention after brain injury are reviewed and their implications discussed. Recommendations for the utilization of systemic theory-based principles and strategies, specifically cognitive behavioral therapy (CBT), narrative therapy (NT), and solution-focused therapy (SFT), are examined. Descriptions of common challenges families and couples face are presented along with case examples to illustrate how these theoretical frameworks might be applied to these special concerns postinjury. Finally, the article concludes with an overview of the ideas presented in this manuscript to assist practitioners and systems of care in community-based settings to more effectively intervene with the family system as a whole.

6. Electron-deuteron scattering based on the Chiral Effective Field Theory

Rozpȩdzik, Dagmara

2014-06-01

Based on the Chiral Effective Field Theory (ChEFT) dynamical picture, the two-pion exchange (TPE) contributions to the nuclear current operator, which appear at higher orders of the chiral expansion, were considered. Their role in electron-deuteron scattering reactions was studied, and the chiral predictions were compared with those obtained in the conventional framework. Results for the cross section and various polarization observables are presented. The bound and scattering states were calculated with five different chiral nucleon-nucleon (NN) potentials, yielding so-called theoretical uncertainty bands for the predicted results.

7. An information theory based search for homogeneity on the largest accessible scale

Sarkar, Suman; Pandey, Biswajit

2016-11-01

We analyse the Sloan Digital Sky Survey Data Release 12 quasar catalogue to test the large-scale smoothness of the quasar distribution. We quantify the degree of inhomogeneity in the quasar distribution using information theory based measures and find that the degree of inhomogeneity diminishes with increasing length scales, finally reaching a plateau at ~250 h^-1 Mpc. The residual inhomogeneity at the plateau is consistent with that expected for a Poisson point process. Our results indicate that the quasar distribution is homogeneous beyond length scales of 250 h^-1 Mpc.
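One generic information-theory-based inhomogeneity measure of this kind (an illustration, not necessarily the specific measure used in the paper) is the entropy deficit of counts-in-cells at a given smoothing scale: 0 for a perfectly even distribution, 1 when all points fall in a single cell.

```python
import numpy as np

def inhomogeneity(points, box, cells):
    """Entropy-deficit inhomogeneity of a 3D point set at the scale
    box/cells: 1 - H/H_max, where H is the Shannon entropy of the
    counts-in-cells distribution."""
    counts, _ = np.histogramdd(points, bins=cells, range=[(0, box)] * 3)
    f = counts.ravel() / counts.sum()
    h = -np.sum(f[f > 0] * np.log(f[f > 0]))
    h_max = np.log(cells ** 3)
    return 1.0 - h / h_max
```

Sweeping `cells` from fine to coarse traces the scale dependence described in the abstract: the measure falls toward the Poisson floor as the cell size grows past the homogeneity scale.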

8. Modelling and analysis of magnetic memory testing method based on the density functional theory

Liu, Bin; Fu, Ying; Jian, Ren

2015-01-01

The metal magnetic memory (MMM) method is a novel, passive magnetic method for inspecting mechanical degradation of ferromagnetic components. To promote a further understanding of the MMM testing mechanism, the relationship between stress concentration and the self-magnetic leakage field measured by the MMM effect was quantitatively interpreted using density functional theory with the generalised gradient approximation. The influence of doping on the MMM signal was also calculated. Interestingly, the theoretical approach is in very good agreement with the experimental observations. A new research programme for quantitative interpretation of the MMM effect was thus initiated.

9. The Study of Relationship and Strategy Between New Energy and Economic Development Based on Decoupling Theory

Liu, Jun; Xu, Hui; Liu, Yaping; Xu, Yang

With increasing pressure for energy conservation and emissions reduction, a new energy revolution in China is imminent. The implementation of electric energy substitution and cleaner alternatives is an important way to resolve the contradiction among economic growth, energy saving and emission reduction. Based on decoupling theory, this article demonstrates that China is in the second stage, in which energy consumption and GDP grow together while energy consumption intensity declines. At the same time, the new energy revolution needs to be realized through increases in carbon productivity and in the proportion of new energy.

10. Suppressing Chaos of Warship Power System Based on the Quantum Mechanics Theory

Cong, Xinrong; Li, Longsuo

2014-08-01

Chaos control of a marine power system is investigated by adding Gaussian white noise to the system. The top Lyapunov exponent is computed to detect whether the classical system is chaotic, and phase portraits are plotted to further verify the obtained results. The classical control of chaos and its quantum counterpart in the marine power system are investigated. The Hamiltonian of the controlled system is given to analyze the quantum counterpart of the classical system, based on quantum mechanics theory.

11. System for absolute measurement of electrolytic conductivity in aqueous solutions based on van der Pauw's theory

Zhang, Bing; Lin, Zhen; Zhang, Xiao; Yu, Xiang; Wei, Jiali; Wang, Xiaoping

2014-05-01

Based on an innovative application of van der Pauw's theory, a system was developed for the absolute measurement of electrolytic conductivity in aqueous solutions. An electrolytic conductivity meter was designed that uses a four-electrode system with an axial-radial two-dimensional adjustment structure coupled to an ac voltage excitation source and signal collecting circuit. The measurement accuracy, resolution and repeatability of the measurement system were examined through a series of experiments. Moreover, the measurement system and a high-precision electrolytic conductivity meter were compared using some actual water samples.
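The classical van der Pauw relation, exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1, links two four-terminal resistance readings to the sheet resistance R_s; it has no closed-form solution in general and is solved numerically. A sketch of that step (generic; the meter's actual signal chain is not described in the abstract):

```python
import math

def sheet_resistance(r_a, r_b):
    """Solve the van der Pauw relation
        exp(-pi*r_a/r_s) + exp(-pi*r_b/r_s) = 1
    for r_s by bisection; the left-hand side increases monotonically
    with r_s, so the root is unique."""
    f = lambda r_s: (math.exp(-math.pi * r_a / r_s)
                     + math.exp(-math.pi * r_b / r_s) - 1.0)
    lo, hi = 1e-9, 1e9
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a layer of known thickness d the conductivity then follows as sigma = 1/(R_s * d); in the symmetric case R_A = R_B = R the relation reduces to R_s = pi*R/ln 2.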

12. Transport-theory based multispectral imaging with PDE-constrained optimization

Kim, Hyun K.; Flexman, Molly; Yamashiro, Darrell J.; Kandel, Jessica J.; Hielscher, Andreas H.

2011-02-01

We introduce here a transport-theory-based PDE-constrained multispectral imaging algorithm for direct reconstruction of the spatial distribution of chromophores in tissue. The method solves the forward and inverse problems simultaneously in the framework of a reduced Hessian sequential quadratic programming method. The performance of the new algorithm is evaluated using numerical and experimental studies involving tumor bearing mice. The results show that the PDE-constrained multispectral method leads to 15-fold acceleration in the image reconstruction of tissue chromophores when compared to the unconstrained multispectral approach and also gives more accurate results when compared to the traditional two-step method.

13. Evaluation of a preschool nutrition education program based on the theory of multiple intelligences.

PubMed

Cason, K L

2001-01-01

This report describes the evaluation of a preschool nutrition education program based on the theory of multiple intelligences. Forty-six nutrition educators provided a series of 12 lessons to 6102 preschool-age children. The program was evaluated using a pretest/post-test design to assess differences in fruit and vegetable identification, healthy snack choices, willingness to taste foods, and eating behaviors. Subjects showed significant improvement in food identification and recognition, healthy snack identification, willingness to taste foods, and frequency of fruit, vegetable, meat, and dairy consumption. The evaluation indicates that the program was an effective approach for educating preschool children about nutrition.

14. Design optical antenna and fiber coupling system based on the vector theory of reflection and refraction.

PubMed

Jiang, Ping; Yang, Huajun; Mao, Shengqian

2015-10-05

A Cassegrain antenna system and an optical fiber coupling system, consisting of a plano-concave lens and a plano-convex lens, are designed based on the vector theory of reflection and refraction, so as to improve the transmission performance of the optical antenna and fiber coupling system. Three-dimensional ray tracing simulations are performed, and results of the optical aberration calculations and the experimental tests show that the aberrations caused by on-axial defocusing, off-axial defocusing and deflection of the receiving antenna can be well corrected by the optical fiber coupling system.

15. Constraints on Neutron Star Radii Based on Chiral Effective Field Theory Interactions

SciTech Connect

Hebeler, K.; Lattimer, J. M.; Pethick, C. J.; Schwenk, A.

2010-10-15

We show that microscopic calculations based on chiral effective field theory interactions constrain the properties of neutron-rich matter below nuclear densities to a much higher degree than is reflected in commonly used equations of state. Combined with observed neutron star masses, our results lead to a radius R = 9.7-13.9 km for a 1.4 M_sun star, where the theoretical range is due, in about equal amounts, to uncertainties in many-body forces and to the extrapolation to high densities.

16. An ISAR imaging algorithm for the space satellite based on empirical mode decomposition theory

Zhao, Tao; Dong, Chun-zhu

2014-11-01

Currently, high-resolution imaging of space satellites is a popular topic in the field of radar technology. In contrast with regular targets, a satellite often moves along its trajectory while its solar panel substrate changes direction toward the sun to obtain energy. Aiming at this imaging problem, a signal separation and imaging approach based on empirical mode decomposition (EMD) theory is proposed; the approach can separate the signals of the two parts of the satellite target, the main body and the solar panel substrate, and image the target. Simulation experiments demonstrate the validity of the proposed method.

17. Optical thin-film reflection filters based on the theory of photonic crystals.

PubMed

Sun, Xuezheng; Shen, Weidong; Gai, Xin; Gu, Peifu; Liu, Xu; Zhang, Yueguang

2008-05-01

Based on the theory of photonic crystals and the framework of a single-channel reflection filter that we presented before, structures of reflection filters with multiple channels are proposed. These structures can overcome some drawbacks of conventional multichannel transmission filters and are much easier to fabricate. We have practically fabricated the reflection filters with two and three channels, and the tested results show approximate agreement with theoretical simulation. Moreover, the superprism effect is also simulated in the single-channel reflection filter, the superiorities to transmission filters are discussed, and these analyses may shed some light on new applications of reflection filters in optical communication and other systems.

18. The Experimental Research on E-Learning Instructional Design Model Based on Cognitive Flexibility Theory

Cao, Xianzhong; Wang, Feng; Zheng, Zhongmei

The paper reports an educational experiment on an e-Learning instructional design model based on Cognitive Flexibility Theory, conducted to explore the feasibility and effectiveness of the model in promoting learning quality in ill-structured domains. The study performed the experiment on two groups of students: one group learned through a system designed by the model and the other learned by the traditional method. The results of the experiment indicate that e-Learning designed through the model helps promote students' intrinsic motivation, learning quality in ill-structured domains, ability to resolve ill-structured problems and creative thinking ability.

19. Unique laminar-flow stability limit based on shallow-water theory

USGS Publications Warehouse

Chen, Cheng-lung

1993-01-01

Two approaches are generally taken in deriving the stability limit of the Froude number (Fs) for laminar sheet flow. The first approach uses the Orr-Sommerfeld equation, while the second uses the cross-section-averaged equations of continuity and motion. Because both approaches are based on shallow-water theory, the values of Fs obtained from both approaches should be identical, yet in the literature they are not. This suggests that a defect exists in at least one of the two approaches. After examining the governing equations used in both approaches, one finds that the existing cross-section-averaged equation of motion is dependent on the frame of reference.

20. A description of the mechanical behavior of composite solid propellants based on molecular theory

NASA Technical Reports Server (NTRS)

Landel, R. F.

1976-01-01

Both the investigation and the representation of the stress-strain response (including rupture) of gum and filled elastomers can be based on a simple functional statement. Internally consistent experiments are used to sort out the effects of time, temperature, strain and crosslink density on gum rubbers. All effects are readily correlated and shown to be essentially independent of the elastomer when considered in terms of non-dimensionalized stress, strain and time. A semiquantitative molecular theory is developed to explain this result. The introduction of fillers modifies the response, but, guided by the framework thus provided, their effects can be readily accounted for.

1. Wavefront sensing based on phase contrast theory and coherent optical processing

Lei, Huang; Qi, Bian; Chenlu, Zhou; Tenghao, Li; Mali, Gong

2016-07-01

A novel wavefront sensing method based on phase contrast theory and coherent optical processing is proposed. The wavefront gradient field in the object plane is modulated into intensity distributions in a group of patterns, making high-density detection available. Applying the method, we have also designed a wavefront sensor. It consists of a classical coherent optical processing system, a CCD detector array, two orthogonal composite sinusoidal gratings, and a mechanical structure that can perform real-time linear positioning. The simulation results demonstrate the validity of the method and the sensor in high-precision measurement of the wavefront gradient field.

2. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

PubMed Central

Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

2014-01-01

A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159
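The entropy weight method mentioned above assigns objective indicator weights from dispersion alone: an indicator whose normalized column has high entropy (nearly uniform across samples) discriminates little and receives a small weight. A generic sketch (the bogie inspection indicators themselves are not given in the abstract):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an indicator matrix X
    (m samples x n indicators, nonnegative entries): returns objective
    weights favoring indicators with greater dispersion across samples."""
    X = np.asarray(X, dtype=float)
    m, _ = X.shape
    P = X / X.sum(axis=0)                      # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)  # treat 0*log(0) as 0
    e = -(P * logs).sum(axis=0) / np.log(m)     # normalized entropy per indicator
    d = 1.0 - e                                 # degree of divergence
    return d / d.sum()
```

A constant indicator column has normalized entropy 1 and thus weight 0, while a strongly varying column dominates; the weights sum to 1 and can then multiply the extension-theory scores per indicator.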

3. A Monte Carlo exploration of threefold base geometries for 4d F-theory vacua

DOE PAGES

Taylor, Washington; Wang, Yi-Nan

2016-01-22

Here, we use Monte Carlo methods to explore the set of toric threefold bases that support elliptic Calabi-Yau fourfolds for F-theory compactifications to four dimensions, and study the distribution of geometrically non-Higgsable gauge groups, matter, and quiver structure. We estimate the number of distinct threefold bases in the connected set studied to be ~10^48. Moreover, the distribution of bases peaks around h^{1,1} ~ 82. All bases encountered after "thermalization" have some geometric non-Higgsable structure. We also find that the number of non-Higgsable gauge group factors grows roughly linearly in h^{1,1} of the threefold base. Typical bases have ~6 isolated gauge factors as well as several larger connected clusters of gauge factors with jointly charged matter. Approximately 76% of the bases sampled contain connected two-factor gauge group products of the form SU(3) x SU(2), which may act as the non-Abelian part of the standard model gauge group. SU(3) x SU(2) is the third most common connected two-factor product group, following SU(2) x SU(2) and G2 x SU(2), which arise more frequently.

4. A Monte Carlo exploration of threefold base geometries for 4d F-theory vacua

SciTech Connect

Taylor, Washington; Wang, Yi-Nan

2016-01-22

Here, we use Monte Carlo methods to explore the set of toric threefold bases that support elliptic Calabi-Yau fourfolds for F-theory compactifications to four dimensions, and study the distribution of geometrically non-Higgsable gauge groups, matter, and quiver structure. We estimate the number of distinct threefold bases in the connected set studied to be ~10^48. Moreover, the distribution of bases peaks around h^{1,1} ~ 82. All bases encountered after "thermalization" have some geometric non-Higgsable structure. We also find that the number of non-Higgsable gauge group factors grows roughly linearly in h^{1,1} of the threefold base. Typical bases have ~6 isolated gauge factors as well as several larger connected clusters of gauge factors with jointly charged matter. Approximately 76% of the bases sampled contain connected two-factor gauge group products of the form SU(3) x SU(2), which may act as the non-Abelian part of the standard model gauge group. SU(3) x SU(2) is the third most common connected two-factor product group, following SU(2) x SU(2) and G2 x SU(2), which arise more frequently.

5. Are the Performance Based Logistics Prophets Using Science or Alchemy to Create Life-Cycle Affordability? Using Theory to Predict the Efficacy of Performance Based Logistics

DTIC Science & Technology

2013-10-01

Are the Performance Based Logistics Prophets Using Science or Alchemy to Create Life-Cycle Affordability? Using Theory to Predict the Efficacy of Performance Based Logistics. Defense ARJ, October 2013, Vol. 20 No. 3: 325-348.

6. Investigating the Perceptions of Care Coordinators on Using Behavior Theory-Based Mobile Health Technology With Medicaid Populations: A Grounded Theory Study

PubMed Central

2017-01-01

Background Medicaid populations are less engaged in their health care than the rest of the population, translating to worse health outcomes and increased health care costs. Since theory-based mobile health (mHealth) interventions have been shown to increase patient engagement, mobile phones may be an optimal strategy to reach this population. With increased development of theory-based mHealth technology, these interventions must now be evaluated with these medically underserved populations in a real-world setting. Objective The aim of our study was to investigate care coordinators’ perceived value of using a health behavior theory-based mHealth platform with Medicaid clients. In particular, attention was paid to the perceived impact on patient engagement. This research was conducted using the patient-provider text messaging (short message service, SMS) platform, Sense Health (now Wellpass), which integrates the transtheoretical model (TTM), also called the stages of change model; social cognitive theory (SCT); supportive accountability; and motivational interviewing (MI). Methods Interviews based in grounded theory methodology were conducted with 10 care managers to understand perceptions of the relationship between mHealth and patient engagement. Results The interviews with care managers yielded a foundation for a grounded theory model, presenting themes that suggested 4 intertwined correlative relationships revolving around patient engagement: (1) A text messaging (short message service, SMS) platform supplements the client-care manager dynamic, which is grounded in high quality, reciprocal-communication to increase patient engagement; (2) Texting enhances the relationship between literacy and access to care for Medicaid patients, increasing low-literacy patients’ agency to access services; (3) Texting enhances communication, providing care managers with a new means to support their clients; and (4) Reminders augment client accountability, leading to both

7. Mapping edge-based traffic measurements onto the internal links in MPLS network

Zhao, Guofeng; Tang, Hong; Zhang, Yi

2004-09-01

Applying multi-protocol label switching (MPLS) techniques to an IP-based backbone has proven advantageous for traffic engineering. Obtaining the volume of load on each internal link of the network is crucial for applying traffic engineering. Although measurements can be collected on each link, for example with the traditional SNMP scheme, that approach imposes a heavy processing load and sharply degrades the throughput of the core routers. Monitoring only at the edge of the network and mapping the measurements onto the core therefore provides a good alternative. In this paper, we explore a scheme for traffic mapping with edge-based measurements in an MPLS network. The idea is that the volume of traffic on each internal link of the domain is inferred from measurements available only at ingress nodes. We apply path-based measurements at ingress nodes without enabling measurements in the core of the network. We propose a method that can infer a path from the ingress to the egress node using the label distribution protocol without collecting routing data from core routers. Based on flow theory and queuing theory, we prove that our approach is effective and present the algorithm for traffic mapping. We also show performance simulation results that indicate the potential of our approach.
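
The core mapping step described in the abstract, accumulating per-LSP traffic measured at ingress nodes onto every internal link each path traverses, can be sketched as follows. This is an illustrative reconstruction, not the paper's algorithm; the LSP names, node labels, and traffic volumes are invented.

```python
# Hypothetical sketch: mapping ingress path measurements onto internal links.
# Path routes would come from LDP-based path inference in the paper's scheme;
# here they are hard-coded toy data.
from collections import defaultdict

def map_paths_to_links(path_routes, path_volumes):
    """Sum per-path (LSP) traffic volumes onto every link the path traverses."""
    link_load = defaultdict(float)
    for path, volume in path_volumes.items():
        # consecutive node pairs along the route are the traversed links
        for link in zip(path_routes[path], path_routes[path][1:]):
            link_load[link] += volume
    return dict(link_load)

# Two LSPs measured at ingress nodes A and B, sharing the core link (C, D).
routes = {"lsp1": ["A", "C", "D", "E"], "lsp2": ["B", "C", "D", "F"]}
volumes = {"lsp1": 10.0, "lsp2": 4.0}  # e.g. Mbit/s measured at the ingress
print(map_paths_to_links(routes, volumes)[("C", "D")])  # shared link carries 14.0
```

Only edge measurements appear as inputs; the core link load is derived, which is the point of the edge-based approach.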

8. Towards a space-time theory for estimating base flow in river networks

Furey, Peter Rankin

2001-10-01

The author's thesis is that the statistics and physics of base flow must be connected to develop a theory for estimating base flow in river networks. Developing a framework for base flow estimation which addresses this connection is a first step towards developing a space-time theory for estimating base flow in river networks. A mass balance equation for base flow from a hill serves as the starting point for developing this framework. The equation is simplified under a short-time approximation, not a steady-state solution, and from it two basin-scale equations are developed by treating a river basin as a collection of hills. One equation is a regression model or regression equation which assumes that saturated hydraulic conductivity and hydraulic head vary randomly in space among hills in a basin. The other equation is a filter that can be used to estimate base flow from stream flow data. The regression equation provides a physical explanation for both the temporal and spatial variability of base flow in a river network. The equation is tested using low stream flow data which is dominated by and possibly equal to base flow. Estimates of base flow are needed to test the regression equation at other times of the year. The filter is used to estimate base flow, at all times of year, given stream flow and precipitation data. However, the filter often performs poorly over short time scales, mainly during or right after a rainfall event. Weaknesses in filter performance are shown to be caused by both parameter error and model error. It is demonstrated that parameter error can be improved with rainfall data that are more accurate in space. A simple 2-dimensional dynamical model of stream flow generation from a hill is used to study the performance of a hillscale version of the filter. With this model, the effects of saturated hydraulic conductivity and precipitation magnitude on filter performance are investigated as well as the impact of model error on the filter. It is
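
The thesis's filter is derived from a hill-scale mass balance and is its own construction; as a general illustration of the recursive filtering approach to separating base flow from stream flow, here is the widely used one-parameter Lyne-Hollick digital filter. The filter parameter and input series are assumptions for the sketch, not values from the thesis.

```python
def lyne_hollick_baseflow(streamflow, alpha=0.925):
    """One-parameter Lyne-Hollick recursive digital filter (forward pass only).
    Splits total streamflow into quickflow and baseflow; alpha in the range
    0.9-0.95 is typical in the literature. Input is a list of flow values."""
    quick = 0.0                      # filtered quickflow component
    base = [streamflow[0]]           # first value taken as all baseflow
    for prev, q in zip(streamflow, streamflow[1:]):
        quick = max(alpha * quick + 0.5 * (1 + alpha) * (q - prev), 0.0)
        base.append(min(max(q - quick, 0.0), q))  # baseflow bounded by streamflow
    return base

print(lyne_hollick_baseflow([5.0, 5.0, 5.0]))  # constant flow is all baseflow
```

As the abstract notes for its own filter, recursive filters of this kind perform worst during and right after rainfall events, when the quickflow term dominates.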

9. Quorum-Sensing Synchronization of Synthetic Toggle Switches: A Design Based on Monotone Dynamical Systems Theory.

PubMed

Nikolaev, Evgeni V; Sontag, Eduardo D

2016-04-01

Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of "on-off" switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular ("intrinsic") or environmental ("extrinsic") noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a "majority-vote" correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such design that has mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of "monotone" dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable ("chaotic") behaviors and even sustained oscillations in the limit of convergence. These results are valid regardless of the values of the parameters, and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation analysis, performed for a

10. A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study

PubMed Central

Voth, Elizabeth C; Oelke, Nelly D

2016-01-01

Background Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η²=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. Conclusions The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps

11. Quorum-Sensing Synchronization of Synthetic Toggle Switches: A Design Based on Monotone Dynamical Systems Theory

PubMed Central

Nikolaev, Evgeni V.

2016-01-01

Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of “on-off” switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular (“intrinsic”) or environmental (“extrinsic”) noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a “majority-vote” correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such design that has mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of “monotone” dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable (“chaotic”) behaviors and even sustained oscillations in the limit of convergence. These results are valid regardless of the values of the parameters, and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation

12. Theory of chemical kinetics and charge transfer based on nonequilibrium thermodynamics.

PubMed

Bazant, Martin Z

2013-05-21

Advances in the fields of catalysis and electrochemical energy conversion often involve nanoparticles, which can have kinetics surprisingly different from the bulk material. Classical theories of chemical kinetics assume independent reactions in dilute solutions, whose rates are determined by mean concentrations. In condensed matter, strong interactions alter chemical activities and create variations that can dramatically affect the reaction rate. The extreme case is that of a reaction coupled to a phase transformation, whose kinetics must depend not only on the order parameter but also on its gradients at phase boundaries. Reaction-driven phase transformations are common in electrochemistry, when charge transfer is accompanied by ion intercalation or deposition in a solid phase. Examples abound in Li-ion, metal-air, and lead-acid batteries, as well as metal electrodeposition-dissolution. Despite complex thermodynamics, however, the standard kinetic model is the Butler-Volmer equation, based on a dilute solution approximation. The Marcus theory of charge transfer likewise considers isolated reactants and neglects elastic stress, configurational entropy, and other nonidealities in condensed phases. The limitations of existing theories recently became apparent for the Li-ion battery material LixFePO4 (LFP). It has a strong tendency to separate into Li-rich and Li-poor solid phases, which scientists believe limits its performance. Chemists first modeled phase separation in LFP as an isotropic "shrinking core" within each particle, but experiments later revealed striped phase boundaries on the active crystal facet. This raised the question: What is the reaction rate at a surface undergoing a phase transformation? Meanwhile, dramatic rate enhancement was attained with LFP nanoparticles, and classical battery models could not predict the roles of phase separation and surface modification. In this Account, I present a general theory of chemical kinetics, developed over
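
The Butler-Volmer equation that the Account identifies as the standard kinetic model has a well-known closed form, i = i0 [exp(α_a f η) − exp(−α_c f η)] with f = F/RT. The sketch below evaluates it; the exchange current density and transfer coefficients are illustrative assumptions, not values from the Account.

```python
import math

F = 96485.33  # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)

def butler_volmer(eta, i0=1.0, alpha_a=0.5, alpha_c=0.5, T=298.15):
    """Butler-Volmer current density as a function of overpotential eta (V).
    i0 is the exchange current density; alpha_a, alpha_c are the anodic and
    cathodic transfer coefficients (illustrative values)."""
    f = F / (R * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))

print(butler_volmer(0.0))  # zero net current at zero overpotential
```

The dilute-solution character criticized in the Account is visible in the form itself: the rate depends only on the overpotential and fixed coefficients, with no dependence on concentration gradients or phase boundaries.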

13. A general theory of evolution based on energy efficiency: its implications for diseases.

PubMed

Yun, Anthony J; Lee, Patrick Y; Doux, John D; Conley, Buford R

2006-01-01

We propose a general theory of evolution based on energy efficiency. Life represents an emergent property of energy. The earth receives energy from cosmic sources such as the sun. Biologic life can be characterized by the conversion of available energy into complex systems. Direct energy converters such as photosynthetic microorganisms and plants transform light energy into high-energy phosphate bonds that fuel biochemical work. Indirect converters such as herbivores and carnivores predominantly feed off the food chain supplied by these direct converters. Improving energy efficiency confers competitive advantage in the contest among organisms for energy. We introduce a term, return on energy (ROE), as a measure of energy efficiency. We define ROE as a ratio of the amount of energy acquired by a system to the amount of energy consumed to generate that gain. Life-death cycling represents a tactic to sample the environment for innovations that allow increases in ROE to develop over generations rather than an individual lifespan. However, the variation-selection stratagem of Darwinian evolution may define a particular tactic rather than an overarching biological paradigm. A theory of evolution based on competition for energy and driven by improvements in ROE both encompasses prior notions of evolution and portends post-Darwinian mechanisms. Such processes may involve the exchange of non-genetic traits that improve ROE, as exemplified by cognitive adaptations or memes. Under these circumstances, indefinite persistence may become favored over life-death cycling, as increases in ROE may then occur more efficiently within a single lifespan rather than over multiple generations. The key to this transition may involve novel methods to address the promotion of health and cognitive plasticity. We describe the implications of this theory for human diseases.

14. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign

PubMed Central

Taguri, Masataka; Ishikawa, Yoshiki

2016-01-01

Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated a population attributable fraction of 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief, “I could quit smoking if my husband or significant other recommended it” suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
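
The study estimates the population attributable fraction (PAF) from a marginal structural model; as a simpler illustration of what a PAF expresses, the classic Levin formula computes it from an exposure's prevalence and risk ratio. The prevalence and risk ratio below are made-up numbers, not survey estimates from the paper.

```python
def paf(prevalence, risk_ratio):
    """Levin's formula: fraction of the outcome in the population attributable
    to the exposure, given exposure prevalence and the outcome risk ratio.
    (A simplified stand-in for the marginal-structural-model PAF in the study.)"""
    excess = prevalence * (risk_ratio - 1.0)
    return excess / (1.0 + excess)

# If 30% of the population holds a belief that doubles the probability of
# intending to quit, targeting that belief could account for about 23% of intentions.
print(round(paf(0.3, 2.0), 3))  # -> 0.231
```

Ranking candidate beliefs by such a fraction is the prioritization idea the abstract describes: a belief with PAF 0.12 could improve intention rates by up to 12% in that population.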

15. Einstein Critical-Slowing-Down is Siegel CyberWar Denial-of-Access Queuing/Pinning/ Jamming/Aikido Via Siegel DIGIT-Physics BEC ``Intersection''-BECOME-UNION Barabasi Network/GRAPH-Physics BEC: Strutt/Rayleigh-Siegel Percolation GLOBALITY-to-LOCALITY Phase-Transition Critical-Phenomenon

Buick, Otto; Falcon, Pat; Alexander, G.; Siegel, Edward Carl-Ludwig

2013-03-01

Einstein[Dover(03)] critical-slowing-down(CSD)[Pais, Subtle in The Lord; Life & Sci. of Albert Einstein(81)] is Siegel CyberWar denial-of-access(DOA) operations-research queuing theory/pinning/jamming/.../Read [Aikido, Aikibojitsu & Natural-Law(90)]/Aikido(!!!) phase-transition critical-phenomenon via Siegel DIGIT-Physics (Newcomb[Am.J.Math. 4,39(1881)]-{Planck[(1901)]-Einstein[(1905)])-Poincare[Calcul Probabilités(12)-p.313]-Weyl [Goett.Nachr.(14); Math.Ann.77,313 (16)]-{Bose[(24)-Einstein[(25)]-Fermi[(27)]-Dirac[(1927)]}-``Benford''[Proc.Am.Phil.Soc. 78,4,551 (38)]-Kac[Maths.Stat.-Reasoning(55)]-Raimi[Sci.Am. 221,109 (69)...]-Jech[preprint, PSU(95)]-Hill[Proc.AMS 123,3,887(95)]-Browne[NYT(8/98)]-Antonoff-Smith-Siegel[AMS Joint-Mtg.,S.-D.(02)] algebraic-inversion to yield ONLY BOSE-EINSTEIN QUANTUM-statistics (BEQS) with ZERO-digit Bose-Einstein CONDENSATION(BEC) ``INTERSECTION''-BECOME-UNION to Barabasi[PRL 876,5632(01); Rev.Mod.Phys.74,47(02)...] Network /Net/GRAPH(!!!)-physics BEC: Strutt/Rayleigh(1881)-Polya(21)-``Anderson''(58)-Siegel[J.Non-crystalline-Sol.40,453(80)

16. A theory of gravity based on the Weyl-Eddington action

Zee, A.

1982-02-01

We show that the conformal Weyl-Eddington theory can yield an effective long-ranged theory of gravity in accord with observations. In the framework of induced gravity, an R² term is induced with a finite calculable coefficient with just the right sign for the theory to be free from tachyons. Our observation is not true order by order in perturbation theory.

17. A Theory-Based Approach to Teaching Young Children about Health: A Recipe for Understanding

ERIC Educational Resources Information Center

Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

2011-01-01

The theory-theory account of conceptual development posits that children's concepts are integrated into theories. Concept-learning studies have documented the central role that theories play in children's learning of experimenter-defined categories but have yet to extensively examine complex, real-world concepts, such as health. The present study…

18. Risk assessment and hierarchical risk management of enterprises in chemical industrial parks based on catastrophe theory.

PubMed

Chen, Yu; Song, Guobao; Yang, Fenglin; Zhang, Shushen; Zhang, Yun; Liu, Zhenyu

2012-12-03

According to risk systems theory and the characteristics of the chemical industry, an index system was established for risk assessment of enterprises in chemical industrial parks (CIPs) based on the inherent risk of the source, effectiveness of the prevention and control mechanism, and vulnerability of the receptor. A comprehensive risk assessment method based on catastrophe theory was then proposed and used to analyze the risk levels of ten major chemical enterprises in the Songmu Island CIP, China. According to the principle of equal distribution function, the chemical enterprise risk level was divided into the following five levels: 1.0 (very safe), 0.8 (safe), 0.6 (generally recognized as safe, GRAS), 0.4 (unsafe), 0.2 (very unsafe). The results revealed five enterprises (50%) with an unsafe risk level, and another five enterprises (50%) at the generally recognized as safe risk level. This method solves the multi-objective evaluation and decision-making problem. Additionally, this method involves simple calculations and provides an effective technique for risk assessment and hierarchical risk management of enterprises in CIPs.

19. Practical application of game theory based production flow planning method in virtual manufacturing networks

Olender, M.; Krenczyk, D.

2016-08-01

Modern enterprises have to react quickly to dynamic changes in the market caused by changing customer requirements and expectations. One of the key areas of production management that must continuously evolve, by searching for new methods and tools for increasing the efficiency of manufacturing systems, is production flow planning and control. These aspects are closely connected with the ability to implement the concepts of the Virtual Enterprise (VE) and the Virtual Manufacturing Network (VMN), in which an integrated infrastructure of flexible resources is created. In the proposed approach, the role of the players is performed by objects associated with the objective functions, allowing multiobjective production flow planning problems to be solved using game theory, which builds on the theory of strategic situations. For the defined production system and production order models, ways of solving the production route planning problem in a VMN are presented on computational examples for different variants of production flow. Possible decision strategies, together with an analysis of the calculation results, are shown.

20. Coding theory based models for protein translation initiation in prokaryotic organisms.

PubMed

May, Elebeoba E; Vouk, Mladen A; Bitzer, Donald L; Rosnick, David I

2004-01-01

Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli including: Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis, and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the more taxonomically related to E. coli, resembled that of the non-translated sequence group.
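
The decoding step the model describes, scoring each position of a sequence by its minimum Hamming distance to the nearest valid codeword, can be sketched generically. The two-codeword codebook below is an invented toy stand-in, not the 16S rRNA-derived codebook of the paper.

```python
# Sketch of minimum-Hamming-distance decoding over a sliding window.
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def min_distance_profile(sequence, codebook):
    """For each window of codeword length, distance to the nearest codeword.
    Low values flag regions that 'look like' valid codewords."""
    k = len(next(iter(codebook)))
    return [min(hamming(sequence[i:i + k], c) for c in codebook)
            for i in range(len(sequence) - k + 1)]

codebook = {"AGGAGG", "GGAGGT"}  # toy codewords, loosely Shine-Dalgarno-like
profile = min_distance_profile("TTAGGAGGTT", codebook)
print(min(profile))  # a window matching AGGAGG exactly scores 0
```

In the paper's setting, regions where translated subsequences score markedly lower than non-translated ones correspond to the Shine-Dalgarno domain.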

1. Adapting evidence-based interventions using a common theory, practices, and principles.

PubMed

Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D

2014-01-01

Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed.

2. Credibility theory based dynamic control bound optimization for reservoir flood limited water level

Jiang, Zhiqiang; Sun, Ping; Ji, Changming; Zhou, Jianzhong

2015-10-01

The dynamic control operation of the reservoir flood limited water level (FLWL) can resolve the contradiction between reservoir flood control and beneficial operation, and it is an important measure for ensuring flood-control security and realizing flood utilization. The dynamic control bound of the FLWL is a fundamental element for implementing reservoir dynamic control operation. In order to optimize the dynamic control bound of the FLWL while considering flood forecasting error, this paper took the forecasting error as a fuzzy variable and described it with credibility theory, which has emerged in recent years. By combining this with a flood forecasting error quantification model, a credibility-based fuzzy chance-constrained model for optimizing the dynamic control bound was proposed, and fuzzy simulation technology was used to solve the model. The FENGTAN reservoir in China was selected as a case study, and the results show that, compared with the original operation water level, the initial operation water level (IOWL) of the FENGTAN reservoir can be raised by 4 m, 2 m and 5.5 m respectively in the three division stages of the flood season, without increasing flood control risk. In addition, the rationality and feasibility of the proposed forecasting error quantification model and the credibility-based dynamic control bound optimization model are verified by the calculation results of extreme risk theory.

3. Coding theory based models for protein translation initiation in prokaryotic organisms.

SciTech Connect

May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.

2003-03-01

Our research explores the feasibility of using communication theory, specifically error control (EC) coding theory, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored) encoded signal and the ribosome as a minimum-Hamming-distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomic relation to E. coli, including Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the organism most closely related taxonomically to E. coli, resembled that of the non-translated sequence group.
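The decoding step described above can be sketched in a few lines: each mRNA sub-sequence is scored by its minimum Hamming distance to any codeword in a codebook. The codebook and leader sequence below are hypothetical stand-ins, not the 16S rRNA-derived codebook the authors actually use.

```python
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def min_distance(window, codebook):
    """Distance from a received sub-sequence to its nearest valid codeword."""
    return min(hamming(window, c) for c in codebook)

# Hypothetical 5-base codebook and a sliding window over a toy mRNA leader;
# a low minimum distance flags a window that "decodes" to a valid codeword.
codebook = ["AGGAG", "GGAGG", "GAGGT"]                 # illustrative only
mrna = "UUAGGAGGUAACAU".replace("U", "T")              # work in the DNA alphabet
profile = [min_distance(mrna[i:i + 5], codebook) for i in range(len(mrna) - 4)]
```

A real Shine-Dalgarno analysis would derive the codebook from the 16S rRNA template and slide over genuine 5' leaders, as the abstract describes.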

4. Looking to the future of new media in health marketing: deriving propositions based on traditional theories.

PubMed

Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice

2008-01-01

Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future.

5. Optimization of a photovoltaic pumping system based on the optimal control theory

SciTech Connect

Betka, A.; Attali, A.

2010-07-15

This paper shows how optimal operation of a photovoltaic pumping system based on an induction motor driving a centrifugal pump can be realized. The optimization problem consists in maximizing the daily pumped water quantity by optimizing motor efficiency at every operating point. The proposed structure simultaneously achieves minimization of the machine losses, field-oriented control, and maximum power tracking of the photovoltaic array. This is attained using multi-input, multi-output optimal regulator theory. The effectiveness of the proposed algorithm is demonstrated by simulation, and the results are compared to those of a system operating with a constant air-gap flux. (author)

6. Study on the salary system for IT enterprise based on double factor motivation theory

Zhuang, Chen; Qian, Wu

2005-12-01

To address the inability of IT enterprises' salary and compensation systems to motivate staff efficiently, a salary system based on Herzberg's double-factor motivation theory and enterprise characteristics is presented. The salary system includes a salary model, an assessment model and a performance model. The system is tied to a cash incentive based on staff performance and emphasizes that salary alone is not a motivating factor: health care, for example, may also play a positive motivational role. According to this system, a scientific and reasonable salary and compensation management system was established and applied in an IT enterprise. It was found to promote the enterprise's overall performance and competitive power.

7. Predictive models based on sensitivity theory and their application to practical shielding problems

SciTech Connect

Bhuiyan, S.I.; Roussin, R.W.; Lucius, J.L.; Bartine, D.E.

1983-01-01

Two new calculational models based on the use of cross-section sensitivity coefficients have been devised for calculating radiation transport in relatively simple shields. The two models, one an exponential model and the other a power model, have been applied, together with the traditional linear model, to 1- and 2-m-thick concrete-slab problems in which the water content, reinforcing-steel content, or composition of the concrete was varied. Comparing the results obtained with the three models with those obtained from exact one-dimensional discrete-ordinates transport calculations indicates that the exponential model, named the BEST model (for basic exponential shielding trend), is a particularly promising predictive tool for shielding problems dominated by exponential attenuation. When applied to a deep-penetration sodium problem, the BEST model also yields better results than do calculations based on second-order sensitivity theory.
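The three predictive forms can be contrasted with a hedged sketch. The symbols here (R0 for the reference response, S for a sensitivity coefficient, δ for the relative parameter change) are our own notation, and the paper's exact model definitions may differ.

```python
import math

def linear_model(r0, s, delta):
    """Traditional first-order sensitivity prediction: R ~ R0 * (1 + S*delta)."""
    return r0 * (1 + s * delta)

def exponential_model(r0, s, delta):
    """BEST-style prediction: R ~ R0 * exp(S*delta), suited to exponential attenuation."""
    return r0 * math.exp(s * delta)

# For small perturbations the two models agree to first order; for large changes
# only the exponential form keeps the predicted response positive, which is why
# it suits shielding problems dominated by exponential attenuation.
small = (linear_model(1.0, -2.0, 0.01), exponential_model(1.0, -2.0, 0.01))
large = (linear_model(1.0, -2.0, 1.0), exponential_model(1.0, -2.0, 1.0))
```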

8. Discovering Pair-Wise Genetic Interactions: An Information Theory-Based Approach

PubMed Central

Ignac, Tomasz M.; Skupin, Alexander; Sakhanenko, Nikita A.; Galas, David J.

2014-01-01

Phenotypic variation, including that which underlies health and disease in humans, results in part from multiple interactions among both genetic variation and environmental factors. While diseases or phenotypes caused by single gene variants can be identified by established association methods and family-based approaches, complex phenotypic traits resulting from multi-gene interactions remain very difficult to characterize. Here we describe a new method based on information theory, and demonstrate how it improves on previous approaches to identifying genetic interactions, including both synthetic and modifier kinds of interactions. We apply our measure, called interaction distance, to previously analyzed data sets of yeast sporulation efficiency, lipid related mouse data and several human disease models to characterize the method. We show how the interaction distance can reveal novel gene interaction candidates in experimental and simulated data sets, and outperforms other measures in several circumstances. The method also allows us to optimize case/control sample composition for clinical studies. PMID:24670935
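The paper's interaction-distance measure builds on information-theoretic quantities. As a minimal related sketch (not the authors' measure itself), the mutual information between two discretized genotype vectors can be computed from entropies; the genotype data below are toy values.

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of an empirical discrete distribution."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete samples."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

# Toy genotype vectors (0/1/2 allele counts), hypothetical data.
g1 = [0, 1, 2, 0, 1, 2, 0, 1]
g2 = [0, 1, 2, 0, 1, 2, 0, 1]     # identical to g1: maximal dependence
g3 = [0, 0, 0, 0, 0, 0, 0, 0]     # constant: carries no information
```

High mutual information between a genotype pair and a phenotype, beyond the single-locus contributions, is the kind of signal interaction measures of this family are designed to detect.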

9. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

PubMed

Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

2015-01-01

Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

10. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

NASA Technical Reports Server (NTRS)

Xu, Kun

1998-01-01

A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and its accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on gas-kinetic theory is presented. The flux construction strategy may shed some light on possible modifications of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as the development of new schemes for a non-strictly hyperbolic system.

11. Re-Examining of Moffitt's Theory of Delinquency through Agent Based Modeling.

PubMed

Leaw, Jia Ning; Ang, Rebecca P; Huan, Vivien S; Chan, Wei Teng; Cheong, Siew Ann

2015-01-01

Moffitt's theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and that social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, and indeed found the two groups emerging in our simulations. Moreover, through an intervention simulation in which we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life-course outcome.
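A stripped-down version of such an imitation dynamic can be sketched as follows. The payoff values and update rule are invented for illustration and are not Moffitt's or the authors' calibration.

```python
import random

def step(agents, reward=2.0, gap_cost=1.0, prosocial_payoff=0.5):
    """One round: score each agent, then let each imitate a more successful peer."""
    for a in agents:
        # Antisocial behavior pays the reward minus the maturity-gap cost.
        a["payoff"] = reward - gap_cost if a["antisocial"] else prosocial_payoff
    for a in agents:
        peer = random.choice(agents)
        if peer["payoff"] > a["payoff"]:
            a["antisocial"] = peer["antisocial"]

random.seed(1)
agents = [{"antisocial": i < 2, "payoff": 0.0} for i in range(10)]
for _ in range(50):
    step(agents)
count = sum(a["antisocial"] for a in agents)
```

With these payoffs antisocial behavior spreads by imitation and never reverts; an intervention can be simulated by reassigning an agent's peer group, much as the paper moves agents between social networks.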

12. Calculation of thermal expansion coefficient of glasses based on topological constraint theory

Zeng, Huidan; Ye, Feng; Li, Xiang; Wang, Ling; Yang, Bin; Chen, Jianding; Zhang, Xianghua; Sun, Luyi

2016-10-01

In this work, the thermal expansion behavior and the structural configuration evolution of glasses were studied. The degrees of freedom given by topological constraint theory are correlated with configuration evolution; considering the chemical composition and the configuration change, an analytical equation for calculating the thermal expansion coefficient of glasses from the degrees of freedom was derived. The thermal expansion of typical silicate and chalcogenide glasses was examined by calculating their thermal expansion coefficients (TEC) using this approach. The results showed that the approach is energetically favorable for glass materials and revealed the underlying essence from the viewpoint of configuration entropy. This work establishes a configuration-based methodology to calculate the thermal expansion coefficient of glasses that lack periodic order.
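The constraint counting that underlies this approach can be sketched with the standard Phillips-Thorpe rules (r/2 bond-stretching plus 2r - 3 bond-bending constraints for an atom of coordination r). This is only the degree-of-freedom bookkeeping, not the authors' TEC equation itself.

```python
def constraints_per_atom(r):
    """Phillips-Thorpe count: r/2 bond-stretching + (2r - 3) bond-bending."""
    return r / 2 + (2 * r - 3)

def degrees_of_freedom(composition):
    """composition maps coordination number -> molar fraction; f = 3 - <n_c>."""
    n_c = sum(frac * constraints_per_atom(r) for r, frac in composition.items())
    return 3 - n_c

# An SiO2-like network (1/3 four-coordinated, 2/3 two-coordinated atoms) is
# over-constrained (f < 0); the rigidity threshold sits at mean coordination 2.4.
f_silica = degrees_of_freedom({4: 1 / 3, 2: 2 / 3})
f_threshold = degrees_of_freedom({4: 0.2, 2: 0.8})
```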

13. Unit Template Synchronous Reference Frame Theory Based Control Algorithm for DSTATCOM

Bangarraju, J.; Rajagopal, V.; Jayalaxmi, A.

2014-04-01

This article proposes new, simplified unit templates in place of the standard phase-locked loop (PLL) for the synchronous reference frame theory (SRFT) control algorithm. Extracting the synchronizing components (sinθ and cosθ) for Park's and inverse Park's transformations with a standard PLL takes more execution time, which delays generation of the reference source currents. The standard PLL not only takes more execution time but also increases the reactive power burden on the distribution static compensator (DSTATCOM). This work proposes a unit-template-based SRFT control algorithm for a four-leg insulated-gate-bipolar-transistor-based voltage source converter for DSTATCOM in distribution systems, which reduces both the execution time and the reactive power burden on the DSTATCOM. The proposed DSTATCOM suppresses harmonics and regulates the terminal voltage, along with neutral current compensation. The DSTATCOM with the proposed control algorithm is modeled and simulated in MATLAB using the SIMULINK and SimPowerSystems toolboxes.
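A common way to form such unit templates, and a plausible reading of the approach (with our own variable names), is to normalize the instantaneous phase voltages by the terminal voltage amplitude, which yields the in-phase components without any PLL:

```python
import math

def unit_templates(va, vb, vc):
    """In-phase unit templates from instantaneous phase voltages (PLL-free)."""
    vt = math.sqrt(2.0 / 3.0 * (va * va + vb * vb + vc * vc))  # terminal amplitude
    return va / vt, vb / vt, vc / vt

# For a balanced three-phase set of amplitude Vm sampled at angle theta, the
# templates reduce to sin(theta), sin(theta - 120 deg), sin(theta + 120 deg)
# regardless of Vm.
Vm, theta = 325.0, 0.7
va = Vm * math.sin(theta)
vb = Vm * math.sin(theta - 2 * math.pi / 3)
vc = Vm * math.sin(theta + 2 * math.pi / 3)
ua, ub, uc = unit_templates(va, vb, vc)
```

The quadrature (cosθ) templates are then obtained from the in-phase set, avoiding the PLL's execution-time cost noted in the abstract.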

14. Application of the information fusion based on evidence theory in urban development

Aiqun, Chen; Zequn, Guan

2005-10-01

Traditional remote sensing image classification methods are mature, especially the maximum likelihood technique based on statistical analysis. However, such methods cannot handle multi-source remotely sensed data; in order to make optimized decisions, better use must be made of all available information acquired from different sources. Evidential reasoning has been proposed as one of the most promising approaches for integrating multi-source information. We expound the Dempster-Shafer theory of evidence and present a method of multi-source information fusion based on it. We also apply this method to urban development. The experimental results show that the method presented in this paper is effective and can greatly improve image classification capability.
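Dempster's rule of combination, the core of the fusion method described above, can be sketched directly. The two mass functions below (sensor beliefs over an 'urban' vs 'rural' decision for a pixel) are invented for illustration.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over frozenset focal elements (Dempster's rule)."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Normalize out the conflicting mass.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

U, R = frozenset({"urban"}), frozenset({"rural"})
m1 = {U: 0.6, R: 0.4}   # evidence from source 1 (illustrative masses)
m2 = {U: 0.7, R: 0.3}   # evidence from source 2
fused = dempster_combine(m1, m2)
```

Because both sources lean 'urban', the fused mass on 'urban' exceeds either input, which is the behavior that makes evidential fusion attractive for multi-source classification.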

15. Buckling Analysis for Stiffened Anisotropic Circular Cylinders Based on Sanders Nonlinear Shell Theory

NASA Technical Reports Server (NTRS)

Nemeth, Michael P.

2014-01-01

Nonlinear and bifurcation buckling equations for elastic, stiffened, geometrically perfect, right-circular cylindrical, anisotropic shells subjected to combined loads are presented that are based on Sanders' shell theory. Based on these equations, a three-parameter approximate Rayleigh-Ritz solution and a classical solution to the buckling problem are presented for cylinders with simply supported edges. Extensive comparisons of results obtained from these solutions with published results are also presented for a wide range of cylinder constructions. These comparisons include laminated-composite cylinders with a wide variety of shell-wall orthotropies and anisotropies. Numerous results are also given that show the discrepancies between the results obtained by using Donnell's equations and variants of Sanders' equations. For some cases, nondimensional parameters are identified and "master" curves are presented that facilitate the concise representation of results.

16. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

PubMed

Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

2012-05-03

The feasibility of using the cucurbituril host molecule as a candidate actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed, and based on binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structures, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in the hexavalent and pentavalent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of actinyls, whereas fluorination decreases the binding affinities compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation states of the three actinyls.

17. Moving object detection using a background modeling based on entropy theory and quad-tree decomposition

Elharrouss, Omar; Moujahid, Driss; Elkah, Samah; Tairi, Hamid

2016-11-01

A particular algorithm for moving object detection using a background subtraction approach is proposed. We generate the background model by combining quad-tree decomposition with entropy theory. In general, many background subtraction approaches are sensitive to sudden illumination changes in the scene and cannot update the background image accordingly. The proposed background modeling approach addresses this illumination change problem. After performing background subtraction based on the proposed background model, the moving targets can be accurately detected at each frame of the image sequence. In order to achieve high accuracy of motion detection, the binary motion mask is computed by the proposed threshold function. Experimental analysis based on statistical measurements proves the efficiency of our proposed method in terms of quality and quantity, and it even substantially outperforms existing methods in perceptual evaluation.
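The entropy side of the background model can be sketched as follows: the Shannon entropy of a block's gray-level histogram is low for flat background and high for textured or changing content, which is the kind of signal that drives a quad-tree split decision. The split threshold and block data are illustrative assumptions, not the paper's values.

```python
import math
from collections import Counter

def block_entropy(pixels):
    """Shannon entropy (bits) of the gray-level histogram of an image block."""
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(pixels).values())

def should_split(pixels, threshold=0.5):
    """Quad-tree rule (sketch): split a block whose entropy exceeds a threshold."""
    return block_entropy(pixels) > threshold

flat = [128] * 16          # uniform block: entropy 0, kept whole
mixed = [0, 255] * 8       # two equally likely gray levels: entropy 1 bit, split
```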

18. Re-Examining of Moffitt’s Theory of Delinquency through Agent Based Modeling

PubMed Central

Leaw, Jia Ning; Ang, Rebecca P.; Huan, Vivien S.; Chan, Wei Teng; Cheong, Siew Ann

2015-01-01

Moffitt’s theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and that social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, and indeed found the two groups emerging in our simulations. Moreover, through an intervention simulation in which we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life-course outcome. PMID:26062022

19. Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain

PubMed Central

Brette, Romain

2015-01-01

Does the brain use a firing rate code or a spike timing code? Considering this controversial question from an epistemological perspective, I argue that progress has been hampered by its problematic phrasing. It takes the perspective of an external observer looking at whether those two observables vary with stimuli, and thereby misses the relevant question: which one has a causal role in neural activity? When rephrased in a more meaningful way, the rate-based view appears as an ad hoc methodological postulate, one that is practical but with virtually no empirical or theoretical support. PMID:26617496

20. Optimal design of hydrometric monitoring networks with dynamic components based on Information Theory

Alfonso, Leonardo; Chacon, Juan; Solomatine, Dimitri

2016-04-01

The EC-FP7 WeSenseIt project proposes the development of a Citizen Observatory of Water, aiming at enhancing environmental monitoring and forecasting with the help of citizens equipped with low-cost sensors and personal devices such as smartphones and smart umbrellas. In this regard, Citizen Observatories may complement the limited data availability in terms of spatial and temporal density, which is of interest, among other areas, to improve hydraulic and hydrological models. At this point, the following question arises: how can citizens, who are part of a citizen observatory, be optimally guided so that the data they collect and send is useful to improve modelling and water management? This research proposes a new methodology to identify the optimal location and timing of potential observations coming from moving sensors of hydrological variables. The methodology is based on Information Theory, which has been widely used in hydrometric monitoring design [1-4]. In particular, it uses the concept of Joint Entropy as a measure of the amount of information contained in a set of random variables, which, in our case, are the time series of hydrological variables captured at given locations in a catchment. The methodology is a step forward in the state of the art because it simultaneously solves the multiobjective optimisation problem of finding the minimum number of informative, non-redundant sensors needed at a given time, so that the best configuration of monitoring sites is found at every particular moment. To this end, the existing algorithms have been improved to make them efficient. The method is applied to cases in The Netherlands, UK and Italy and shows great potential to complement the existing in-situ monitoring networks. [1] Alfonso, L., A. Lobbrecht, and R. Price (2010a), Information theory-based approach for location of monitoring water level gauges in polders, Water Resour. Res., 46(3), W03528 [2] Alfonso, L., A
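The core selection step can be sketched as a greedy maximization of joint entropy over discretized series. The three candidate "sites" below are toy data, and the real method adds non-redundancy objectives and the temporal dynamics described above.

```python
import math
from collections import Counter

def joint_entropy(series):
    """Entropy (bits) of the joint distribution of several discretized series."""
    samples = list(zip(*series))
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def greedy_select(candidates, k):
    """Greedily pick k series whose joint entropy is maximal."""
    chosen = []
    remaining = list(candidates)
    for _ in range(k):
        def gain(name):
            return joint_entropy([candidates[c] for c in chosen + [name]])
        best = max(remaining, key=gain)
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy discretized water levels at three hypothetical sites; B duplicates A,
# so the informative pair is A plus C.
candidates = {
    "A": [0, 0, 1, 1, 0, 1],
    "B": [0, 0, 1, 1, 0, 1],
    "C": [0, 1, 0, 1, 1, 0],
}
picked = greedy_select(candidates, 2)
```

The redundant site B adds no joint entropy once A is chosen, which is exactly why an entropy-based objective discards it.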

1. Active optical alignment of off-axis telescopes based on nodal aberration theory.

PubMed

Zhang, Xiaobin; Zhang, Dong; Xu, Shuyan; Ma, Hongcai

2016-11-14

Our paper separates the specific aberration contributions of third-order astigmatism and third-order coma from the total aberration fields, within the framework of the modified nodal aberration theory (NAT), for a perturbed off-axis telescope. Based on the derived aberration functions, two alignment models for the same off-axis two-mirror telescope are established and compared: one based on third-order NAT, the other on fifth-order NAT. By comparison, the perturbations calculated with fifth-order NAT are found to be more accurate, illustrating that the third-order astigmatism and third-order coma contributed by fifth-order aberrations cannot be neglected in the alignment process. The fifth-order NAT is then used for the alignment of off-axis three-mirror telescopes; simulation shows that a perturbed off-axis three-mirror telescope can be well aligned as well. To further demonstrate the alignment method based on fifth-order NAT (the NAT method), Monte-Carlo simulations for both an off-axis two-mirror telescope and an off-axis three-mirror telescope are conducted. Meanwhile, a comparison between the NAT method and the sensitivity table method is also conducted; the computational accuracy of the NAT method is proven to be much higher, especially in poor conditions.

2. A modular hierarchy-based theory of the chemical origins of life based on molecular complementarity.

PubMed

Root-Bernstein, Robert

2012-12-18

Albert Szent-Gyorgyi once defined discovery as seeing what everyone else sees and thinking what no one else thinks. I often find that phenomena that are obvious to other people are not obvious to me. Molecular complementarity is one of these phenomena: while rare among any random set of compounds, it is ubiquitous in living systems. Because every molecule in a living system binds more or less specifically to several others, we now speak of "interactomes". What explains the ubiquity of molecular complementarity in living systems? What might such an explanation reveal about the chemical origins of life and the principles that have governed its evolution? Beyond this, what might complementarity tell us about the optimization of integrated systems in general? My research combines theoretical and experimental approaches to molecular complementarity relating to evolution from prebiotic chemical systems to superorganismal interactions. Experimentally, I have characterized complementarity involving specific binding between small molecules and explored how these small-molecule modules have been incorporated into macromolecular systems such as receptors and transporters. Several general principles have emerged from this research. Molecules that bind to each other almost always alter each other's physiological effects; and conversely, molecules that have antagonistic or synergistic physiological effects almost always bind to each other. This principle suggests a chemical link between biological structure and function. Secondly, modern biological systems contain an embedded molecular paleontology based on complementarity that can reveal their chemical origins. This molecular paleontology is often manifested through modules involving small, molecularly complementary subunits that are built into modern macromolecular structures such as receptors and transporters. A third principle is that complementary modules are conserved and repurposed at every stage of evolution. Molecular

3. Theory Of An Electro-Optic Modulator Based On Quantum Wells In A Semiconductor etalon

Guy, D. R. P.; Apsley, N.; Taylor, L. L.; Bass, S. J.; Klipstein, P. C.

1987-08-01

We present the design of an electro-optic modulator based on the quantum-confined Stark effect (QCSE) in GaAs quantum wells contained within the central layer of a Fabry-Perot etalon. The etalon mirrors are quarter-wave stacks of Ga0.7Al0.3As and AlAs, eliminating the need for anti-reflection coatings. p-i-n doping is employed, with the undoped QW stack sandwiched between doped mirrors, enabling electric fields of the order of 10^5 V cm^-1 to be readily developed across the quantum wells. Placing a multiple quantum well structure within an etalon resonant cavity gives flexibility of design in terms of operating wavelength and mode: light incident perpendicular to the QW stack is modulated through the operation of the QCSE on the QW excitons, either electro-refractively, by a change in the real part of the QW refractive index producing a wavelength modulation of the narrow-band Fabry-Perot transmission resonance, or in electro-absorptive mode. The semi-empirical theory uses conventional multilayer optical matrix methods together with a recent theory of the QCSE which has been tested against the results of electro-reflectance experiments. In electro-absorption mode we find an on:off ratio of 19:1 at 857.5 nm for 8 quantum wells. In electro-refractive mode, using 32 wells, we predict modulation from 10% to 76% reflection at 883 nm. These figures exclude substrate effects. Extension of the theory to other materials systems is readily accomplished. We present the reflectivity spectrum of a high quality etalon in In0.53Ga0.47As/InP and find good agreement with the predictions of the optical matrix model.

4. Coverage theories for metagenomic DNA sequencing based on a generalization of Stevens' theorem.

PubMed

Wendl, Michael C; Kota, Karthik; Weinstock, George M; Mitreva, Makedonka

2013-11-01

Metagenomic project design has relied variously upon speculation, semi-empirical and ad hoc heuristic models, and elementary extensions of single-sample Lander-Waterman expectation theory, all of which are demonstrably inadequate. Here, we propose an approach based upon a generalization of Stevens' Theorem for randomly covering a domain. We extend this result to account for the presence of multiple species, from which are derived useful probabilities for fully recovering a particular target microbe of interest and for average contig length. These show improved specificities compared to older measures and recommend deeper data generation than the levels chosen by some early studies, supporting the view that poor assemblies were due at least somewhat to insufficient data. We assess predictions empirically by generating roughly 4.5 Gb of sequence from a twelve member bacterial community, comparing coverage for two particular members, Selenomonas artemidis and Enterococcus faecium, which are the least (∼3 %) and most (∼12 %) abundant species, respectively. Agreement is reasonable, with differences likely attributable to coverage biases. We show that, in some cases, bias is simple in the sense that a small reduction in read length to simulate less efficient covering brings data and theory into essentially complete accord. Finally, we describe two applications of the theory. One plots coverage probability over the relevant parameter space, constructing essentially a "metagenomic design map" to enable straightforward analysis and design of future projects. The other gives an overview of the data requirements for various types of sequencing milestones, including a desired number of contact reads and contig length, for detection of a rare viral species.
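The single-species building block, Stevens' classical result for the probability that n random arcs of fractional length a cover a circle, can be written down directly; the multi-species generalization in the paper builds on this form. The sketch below is the textbook single-domain case, not the authors' extended model.

```python
from math import comb

def stevens_coverage(n, a):
    """P{n random arcs of fractional length a cover the circle} (Stevens, 1939).

    P = sum_{j>=0, j*a<1} (-1)^j * C(n, j) * (1 - j*a)^(n-1)
    """
    total = 0.0
    j = 0
    while j <= n and j * a < 1.0:
        total += (-1) ** j * comb(n, j) * (1.0 - j * a) ** (n - 1)
        j += 1
    return total

# Sanity checks: one arc shorter than the circle can never cover it, and two
# arcs of length 0.6 cover with probability 2*0.6 - 1 = 0.2.
p_one = stevens_coverage(1, 0.5)
p_two = stevens_coverage(2, 0.6)
```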

5. Contextualization and standardization of the supportive leadership behavior questionnaire based on socio- cognitive theory in Iran

PubMed Central

Shirazi, Mandana; Emami, Amir Hosein; Mirmoosavi, Seyed Jamal; Alavinia, Seyed Mohammad; Zamanian, Hadi; Fathollahbeigi, Faezeh; Masiello, Italo

2014-01-01

6. A new theory-based social classification in Japan and its validation using historically collected information.

PubMed

Hiyoshi, Ayako; Fukuda, Yoshiharu; Shipley, Martin J; Bartley, Mel; Brunner, Eric J

2013-06-01

Studies of health inequalities in Japan have increased since the millennium. However, there remains a lack of an accepted theory-based classification to measure occupation-related social position for Japan. This study attempts to derive such a classification based on the National Statistics Socio-economic Classification in the UK. Using routinely collected data from the nationally representative Comprehensive Survey of the Living Conditions of People on Health and Welfare, the Japanese Socioeconomic Classification was derived using two variables - occupational group and employment status. Validation analyses were conducted using household income, home ownership, self-rated good or poor health, and Kessler 6 psychological distress (n ≈ 36,000). After adjustment for age, marital status, and area (prefecture), one step lower social class was associated with mean 16% (p < 0.001) lower income, and a risk ratio of 0.93 (p < 0.001) for home ownership. The probability of good health showed a trend in men and women (risk ratio 0.94 and 0.93, respectively, for one step lower social class, p < 0.001). The trend for poor health was significant in women (odds ratio 1.12, p < 0.001) but not in men. Kessler 6 psychological distress showed significant trends in men (risk ratio 1.03, p = 0.044) and in women (1.05, p = 0.004). We propose the Japanese Socioeconomic Classification, derived from basic occupational and employment status information, as a meaningful, theory-based and standard classification system suitable for monitoring occupation-related health inequalities in Japan.

7. Where are family theories in family-based obesity treatment?: conceptualizing the study of families in pediatric weight management.

PubMed

Skelton, J A; Buehler, C; Irby, M B; Grzywacz, J G

2012-07-01

Family-based approaches to pediatric obesity treatment are considered the 'gold-standard,' and are recommended for facilitating behavior change to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach for addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of theories in intervention studies. Overlapping concepts across theories include: families are a system, with interdependence of units; the idea that families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of or more effectively treat obesity. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment.

8. Where are family theories in family-based obesity treatment?: conceptualizing the study of families in pediatric weight management

PubMed Central

Skelton, JA; Buehler, C; Irby, MB; Grzywacz, JG

2014-01-01

Family-based approaches to pediatric obesity treatment are considered the ‘gold standard’ and are recommended for facilitating the behavior change needed to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach to addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of such theories in intervention studies. Overlapping concepts across the theories include: families are systems with interdependent units; families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of obesity or treat it more effectively. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment. PMID:22531090

9. Optical surface measurement using phase retrieval hybrid algorithm based on diffraction angular spectrum theory

Feng, Liang; Zeng, Zhi-ge; Wu, Yong-qian

2013-08-01

To measure high-dynamic-range surface errors beyond one wavelength after the rough polishing process, we design a phase retrieval hybrid algorithm based on diffraction angular spectrum theory. Phase retrieval is a wavefront sensing method that uses the intensity distribution to reconstruct the phase distribution of the optical field. It is built on a model of diffractive propagation and iteratively approaches the measured intensity distribution. In this paper, we introduce the basic principle and challenges of optical surface measurement using phase retrieval, then discuss the two major parts of phase retrieval: diffractive propagation and the hybrid algorithm. Angular spectrum theory describes diffractive propagation in the frequency domain instead of the spatial domain, which greatly simplifies the computation. Theoretical analysis shows that the discrete angular spectrum is most effective when the high-frequency content is small and the diffraction distance is short. The phase retrieval hybrid algorithm derives from the modified Gerchberg-Saxton (GS) algorithm and the conjugate gradient method, aiming to solve the phase-wrapping problem caused by high-dynamic-range errors. In the algorithm, the phase distribution is described by Zernike polynomials, whose coefficients are optimized by the hybrid algorithm. Simulation results show that the retrieved and real phase distributions agree closely for high-dynamic-range errors beyond λ.
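
To illustrate why the frequency-domain formulation is convenient, the sketch below propagates a complex field with the angular spectrum method using two FFTs and a transfer function. The grid size, wavelength, and sampling interval are arbitrary values chosen for the example, not values from the paper.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field over distance z with the
    angular spectrum method: FFT to the frequency domain, multiply by
    the free-space transfer function exp(i*kz*z), and FFT back."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)              # spatial frequencies (cycles/m)
    fxx, fyy = np.meshgrid(fx, fx)
    k = 2 * np.pi / wavelength
    arg = k**2 - (2 * np.pi * fxx)**2 - (2 * np.pi * fyy)**2
    kz = np.sqrt(np.maximum(arg, 0.0))        # propagating components only
    transfer = np.exp(1j * kz * z)
    transfer[arg < 0] = 0.0                   # discard evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# A uniform plane wave should acquire a pure phase shift of k*z.
u0 = np.ones((64, 64), dtype=complex)
u1 = angular_spectrum_propagate(u0, wavelength=633e-9, dx=10e-6, z=1e-3)
```

For a plane wave the output amplitude stays 1 and the phase advances by k·z, which gives a quick sanity check on the transfer function.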

10. Skin cancer preventive behaviors among rural farmers: An intervention based on protection motivation theory

PubMed Central

2016-01-01

Background: Skin cancer is a serious public health problem worldwide, and its prevalence in many countries has increased in recent years. This study aimed to assess the effects of a theory-based educational intervention to promote skin cancer preventive behaviors (SCPBs) among rural farmers in Chalderan County, Iran. Methods: This was a quasi-randomized controlled field trial conducted with 238 rural farmers. The data were collected by a questionnaire containing the constructs of the Protection Motivation Theory (PMT) as well as items on SCPBs. Differences between the groups before and 3 months after the intervention were determined by independent t-test, paired t-test, and chi-square test using SPSS. Results: Before the intervention, no significant difference was found in the scores of the PMT constructs between the two groups (p>0.05). After the intervention, however, significant differences were found between the two groups in the scores of all the variables, as well as in SCPBs (p<0.05). Conclusion: The PMT was found to be an appropriate framework for designing educational interventions to promote SCPBs among rural farmers. An educational program focused on increasing perceived susceptibility raised the level of SCPB performance among the rural farmers. PMID:28210609

11. Sub-watershed prioritization based on sediment yield using game theory

2016-10-01

Soil and water conservation measures cannot be properly placed without an appropriate technical prioritization of the different areas of a watershed. Quantifying soil erosion hazard and spatially prioritizing sub-watersheds would therefore aid better watershed management planning. Although many approaches have been applied to prioritize sub-watersheds, efficient techniques such as game theory have not yet been applied in practice. The present study therefore used game theory to prioritize sub-watersheds in the Gorganroud and Qareh Sou watersheds in Golestan Province, northern Iran. Toward this goal, 38 independent factors were classified into seven components using Principal Component Analysis (PCA), with one representative variable in each component. The Condorcet method, used to prioritize the effective variables, indicated that the percentage of forest land (52 scores) and the discharge with a 10-year return period (32 scores) were, respectively, the most and the least effective variables on sediment yield. The Fallback bargaining and Borda scoring algorithms were then used to prioritize the study sub-watersheds based on the weighted grade of the total score for each variable. These algorithms classified the sub-watersheds into three categories. Comparison of the results identified Galikesh, Qazaqli, Gonbad, Siyah Ab and Tamar as the first-ranked sub-watersheds, in the worst condition; Tangrah and Naharkhoran as the second priority; and Pole Ordougah as the sub-watershed with the lowest priority.
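
As a sketch of how the Borda scoring step works, the snippet below computes Borda counts from per-criterion rankings. The sub-watershed names echo the abstract, but the rankings themselves are invented for illustration.

```python
def borda_scores(rankings):
    """Borda count: each criterion ranks all candidates; a candidate in
    position p of an n-candidate ranking earns n - p - 1 points, and
    points are summed over all criteria."""
    n = len(rankings[0])
    scores = {cand: 0 for cand in rankings[0]}
    for ranking in rankings:
        for pos, cand in enumerate(ranking):
            scores[cand] += n - pos - 1
    return scores

# Invented rankings from three erosion-related criteria over four of the
# sub-watersheds named in the abstract (worst condition first).
rankings = [
    ["Galikesh", "Gonbad", "Tangrah", "Pole Ordougah"],
    ["Gonbad", "Galikesh", "Tangrah", "Pole Ordougah"],
    ["Galikesh", "Tangrah", "Gonbad", "Pole Ordougah"],
]
scores = borda_scores(rankings)
priority = sorted(scores, key=scores.get, reverse=True)
```

The aggregate ranking puts Galikesh first and Pole Ordougah last, mirroring how the study orders sub-watersheds by total score.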

12. New Approach to Optimize the Apfs Placement Based on Instantaneous Reactive Power Theory by Genetic Algorithm

2014-01-01

In electrical distribution systems, a great amount of power is wasted across the lines, and the power factors, voltage profiles and total harmonic distortions (THDs) of most loads are not as desired. These parameters therefore play a highly important role in wasting money and energy, and both consumers and sources suffer from a high rate of distortion and even instability. Active power filters (APFs) are an innovative solution to this problem, and recent designs have used instantaneous reactive power theory. In this paper, a novel method is proposed to optimize the allocation of APFs. The introduced method is based on the instantaneous reactive power theory in vectorial representation, which makes it possible to assess different compensation strategies. Proper placement of APFs in the system also plays a crucial role in both reducing loss costs and improving power quality. To optimize APF placement, a new objective function is defined on the basis of five terms: total losses, power factor, voltage profile, THD and cost. A genetic algorithm is used to solve the optimization problem. The results of applying this method to a distribution network illustrate its advantages.
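
A minimal sketch of the kind of weighted-sum objective a genetic algorithm would minimize over the five terms named above, assuming each term has been pre-normalised to [0, 1]; the weights and candidate values are illustrative, not the paper's.

```python
def apf_objective(candidate, weights):
    """Weighted-sum fitness over the five terms named in the abstract:
    total losses, power factor, voltage deviation, THD, and cost.
    Power factor is a benefit, so it enters as (1 - pf); all other
    terms are costs. Lower fitness is better."""
    losses, pf, voltage_dev, thd, cost = candidate
    terms = (losses, 1.0 - pf, voltage_dev, thd, cost)
    return sum(w * t for w, t in zip(weights, terms))

weights = (0.3, 0.2, 0.2, 0.2, 0.1)            # illustrative weights
good = apf_objective((0.2, 0.95, 0.05, 0.04, 0.3), weights)
bad  = apf_objective((0.6, 0.80, 0.20, 0.12, 0.5), weights)
```

A GA would evaluate this fitness for each candidate placement and keep the placements with the lowest values.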

13. An integrated finite element simulation of cardiomyocyte function based on triphasic theory

PubMed Central

2015-01-01

In numerical simulations of cardiac excitation-contraction coupling, the intracellular potential distribution and mobility of cytosol and ions have been mostly ignored. Although the intracellular potential gradient is small, during depolarization it can be a significant driving force for ion movement, and is comparable to diffusion in terms of net flux. Furthermore, fluid in the t-tubules is thought to advect ions to facilitate their exchange with the extracellular space. We extend our previous finite element model that was based on triphasic theory to examine the significance of these factors in cardiac physiology. Triphasic theory allows us to study the behavior of solids (proteins), fluids (cytosol) and ions governed by mechanics and electrochemistry in detailed subcellular structures, including myofibrils, mitochondria, the sarcoplasmic reticulum, membranes, and t-tubules. Our simulation results predicted an electrical potential gradient inside the t-tubules at the onset of depolarization, which corresponded to the Na+ channel distribution therein. Ejection and suction of fluid between the t-tubules and the extracellular compartment during isometric contraction were observed. We also examined the influence of t-tubule morphology and mitochondrial location on the electrophysiology and mechanics of the cardiomyocyte. Our results confirm that the t-tubule structure is important for synchrony of Ca2+ release, and suggest that mitochondria in the sub-sarcolemmal region might serve to cancel Ca2+ inflow through surface sarcolemma, thereby maintaining the intracellular Ca2+ environment in equilibrium. PMID:26539124

14. A system model for ultrasonic NDT based on the Physical Theory of Diffraction (PTD).

PubMed

Darmon, M; Dorval, V; Kamta Djakou, A; Fradkin, L; Chatillon, S

2016-01-01

Simulation of ultrasonic Non Destructive Testing (NDT) is helpful for evaluating the performance of inspection techniques and requires modelling of the waves scattered by defects. Two classical flaw scattering models have traditionally been employed and evaluated for the inspection of planar defects: the Kirchhoff approximation (KA) for simulating reflection and the Geometrical Theory of Diffraction (GTD) for simulating diffraction. The Physical Theory of Diffraction (PTD), initially developed in electromagnetism, combines the two so as to retain the advantages of both and has recently been extended to elastodynamics. In this paper a PTD-based system model is proposed for simulating the ultrasonic response of crack-like defects. It is also extended to give a good description of the regions surrounding critical rays where the shear diffracted waves and head waves interfere. Both numerical and experimental validation of the PTD model is carried out in various practical NDT configurations, such as pulse echo and Time of Flight Diffraction (TOFD), involving both crack-tip and corner echoes. Numerical validation involves comparison of this model with KA and GTD as well as the Finite-Element Method (FEM).

15. Effective meson masses in nuclear matter based on a cutoff field theory

SciTech Connect

Nakano, M.; Noda, N.; Mitsumori, T.; Koide, K.; Kouno, H.; Hasegawa, A.

1997-02-01

Effective masses of σ, ω, π, and ρ mesons in nuclear matter are calculated based on a cutoff field theory. Instead of the traditional density-Feynman representation, we adopt the particle-hole-antiparticle representation for nuclear propagators so that unphysical components are not included in the meson self-energies. For an estimation of the contribution from the divergent particle-antiparticle excitations, i.e., vacuum polarization in nuclear matter, the idea of the renormalization group method is adopted. In this cutoff field theory, all the counterterms are finite and calculated numerically. It is shown that the predicted meson masses converge even if the cutoff Λ is changed, as long as Λ is sufficiently large, and that the prescription also works well for so-called nonrenormalized mesons such as π and ρ. According to this method, it is concluded that meson masses in nuclear matter have a weak dependence on the baryon density. © 1997 The American Physical Society.

16. The use of theory based semistructured elicitation questionnaires: formative research for CDC's Prevention Marketing Initiative.

PubMed Central

Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M

1996-01-01

Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153

17. Speed of synchronization in complex networks of neural oscillators: analytic results based on Random Matrix Theory.

PubMed

Timme, Marc; Geisel, Theo; Wolf, Fred

2006-03-01

We analyze the dynamics of networks of spiking neural oscillators. First, we present an exact linear stability theory of the synchronous state for networks of arbitrary connectivity. For general neuron rise functions, stability is determined by multiple operators, for which standard analysis is not suitable. We describe a general nonstandard solution to the multioperator problem. Subsequently, we derive a class of neuronal rise functions for which all stability operators become degenerate and standard eigenvalue analysis becomes a suitable tool. Interestingly, this class is found to consist of networks of leaky integrate-and-fire neurons. For random networks of inhibitory integrate-and-fire neurons, we then develop an analytical approach, based on the theory of random matrices, to precisely determine the eigenvalue distributions of the stability operators. This yields the asymptotic relaxation time for perturbations to the synchronous state which provides the characteristic time scale on which neurons can coordinate their activity in such networks. For networks with finite in-degree, i.e., finite number of presynaptic inputs per neuron, we find a speed limit to coordinating spiking activity. Even with arbitrarily strong interaction strengths neurons cannot synchronize faster than at a certain maximal speed determined by the typical in-degree.
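
The sketch below builds a toy sparse random coupling matrix of the kind such an analysis considers, with each neuron receiving a finite in-degree k of inhibitory inputs, and computes its eigenvalue spectrum numerically. It illustrates the object being studied, not the paper's exact stability operator; the sizes and strengths are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def inhibitory_coupling_matrix(n, k, eps):
    """Toy sparse random matrix: each of n neurons receives exactly k
    inhibitory inputs of strength -eps/k from randomly chosen
    presynaptic partners (no self-coupling)."""
    A = np.zeros((n, n))
    for i in range(n):
        others = np.array([j for j in range(n) if j != i])
        A[i, rng.choice(others, size=k, replace=False)] = -eps / k
    return A

A = inhibitory_coupling_matrix(n=200, k=10, eps=0.2)
eigenvalues = np.linalg.eigvals(A)
spectral_radius = np.abs(eigenvalues).max()
```

By Gershgorin's theorem every eigenvalue lies in a disk of radius eps about the origin, so the spectral radius, which controls the relaxation time of perturbations, is bounded by the total coupling strength.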

18. Research on the lesion segmentation of breast tumor MR images based on FCM-DS theory

Zhang, Liangbin; Ma, Wenjun; Shen, Xing; Li, Yuehua; Zhu, Yuemin; Chen, Li; Zhang, Su

2017-03-01

Magnetic resonance imaging (MRI) plays an important role in the treatment of breast tumors by high intensity focused ultrasound (HIFU). Clinicians evaluate the size, distribution, and benign or malignant status of a breast tumor by analyzing multiple MRI modalities, such as T2, DWI and DCE images, in order to make an accurate preoperative treatment plan and evaluate the effect of the operation. This paper presents a method for lesion segmentation of breast tumors based on FCM-DS theory. The fuzzy c-means (FCM) clustering algorithm combined with Dempster-Shafer (DS) theory is used to handle the uncertainty in the information, segmenting the lesion areas on the DWI and DCE modalities of MRI and reducing the extent of the uncertain parts. Experimental results show that FCM-DS can fuse the DWI and DCE images to achieve accurate segmentation and can indicate the benign or malignant status of the lesion area by the Time-Intensity Curve (TIC), which could be beneficial in making a preoperative treatment plan and evaluating the effect of the therapy.
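
As a sketch of the clustering half of the method, the snippet below runs plain fuzzy c-means on toy one-dimensional intensity data; the Dempster-Shafer fusion of the DWI and DCE memberships, which is the paper's contribution, is omitted, and the data are invented.

```python
import numpy as np

def fuzzy_cmeans(x, c, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means on 1-D data: alternate between updating the
    cluster centers (membership-weighted means) and the memberships
    (inverse-distance weights with fuzzifier m)."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                        # memberships sum to 1 per point
    for _ in range(iters):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)
        dist = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = dist ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)
    return centers, u

# Toy "pixel intensities" forming two well-separated clusters.
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
centers, u = fuzzy_cmeans(x, c=2)
```

On image data, `x` would be the pixel intensities of one modality, and the resulting soft memberships are what a DS combination rule would fuse across modalities.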

19. [Prediction of regional soil quality based on mutual information theory integrated with decision tree algorithm].

PubMed

Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu

2012-02-01

In this paper, the main factors that affect soil quality, such as soil type, land use pattern, lithology, topography, roads, and industry type, were used to precisely characterize the spatial distribution of regional soil quality: mutual information theory was adopted to select the main environmental factors, and the See5.0 decision tree algorithm was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was clearly higher than that of the model with all variables; for the former model, the prediction accuracy of both the decision tree and the decision rules exceeded 80%. Based on continuous and categorical data, mutual information theory integrated with a decision tree could not only reduce the number of input parameters for the decision tree algorithm but also predict and assess regional soil quality effectively.
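
A minimal sketch of the mutual-information screening step, computed from empirical counts on invented categorical records; the factor names echo the abstract, but the values are made up for illustration.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Empirical I(X;Y) in bits: sum over observed (x, y) pairs of
    p(x,y) * log2(p(x,y) / (p(x) * p(y)))."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Invented toy records: soil-quality grade against two candidate factors.
grade    = ["high", "high", "low", "low", "high", "low"]
land_use = ["forest", "forest", "arable", "arable", "forest", "arable"]
distance = ["near", "far", "near", "far", "near", "far"]

mi_land_use = mutual_information(land_use, grade)   # perfectly aligned with grade
mi_distance = mutual_information(distance, grade)   # only weakly informative
```

Factors scoring above a threshold would then be passed on as inputs to the decision tree; here `land_use` would be kept and `distance` dropped.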

20. Dynamics of controlled release systems based on water-in-water emulsions: a general theory.

PubMed

Sagis, Leonard M C

2008-10-06

Phase-separated biopolymer solutions, and aqueous dispersions of hydrogel beads, liposomes, polymersomes, aqueous polymer microcapsules, and colloidosomes, are all examples of water-in-water emulsions. These systems can be used for encapsulation and controlled release purposes in, for example, food or pharmaceutical applications. The stress-deformation behavior of the droplets in these systems is very complex and is affected by mass transfer across the interface. The relaxation time of a deformation of a droplet may depend on interfacial properties such as surface tension, bending rigidity, spontaneous curvature, permeability, and interfacial viscoelasticity. It also depends on bulk viscoelasticity and composition. A non-equilibrium thermodynamic model is developed for the dynamic behavior of these systems, which incorporates all these parameters and is based on the interfacial transport phenomena (ITP) formalism. The ITP formalism allows us to describe all water-in-water emulsions with one general theory: phase-separated biopolymer solutions, and dispersions of hydrogel beads, liposomes, polymersomes, polymer microcapsules, and colloidosomes, are simply limiting cases of this general theory with respect to bulk and interfacial rheological behavior.