Queuing Theory and Reference Transactions.
ERIC Educational Resources Information Center
Terbille, Charles
1995-01-01
Examines the implications of applying queuing theory to three different reference situations: (1) random patron arrivals; (2) random durations of transactions; and (3) use of two librarians. Tables and figures present results from spreadsheet calculations of queues for each reference situation. (JMV)
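The one- versus two-librarian comparison in the third situation can be sketched with the standard Erlang C formula for M/M/c queues. The arrival and service rates below are hypothetical, chosen only to illustrate how much a second server shrinks the expected wait:

```python
def erlang_c(c: int, a: float) -> float:
    """Probability an arrival must wait in an M/M/c queue (Erlang C).
    a = lam/mu is the offered load; requires a < c for stability."""
    # Erlang B via the standard stable recurrence, then convert to Erlang C.
    b = 1.0
    for k in range(1, c + 1):
        b = a * b / (k + a * b)
    rho = a / c
    return b / (1.0 - rho * (1.0 - b))

def mean_wait(lam: float, mu: float, c: int) -> float:
    """Mean time in queue Wq for an M/M/c system (same time unit as rates)."""
    a = lam / mu
    return erlang_c(c, a) / (c * mu - lam)

# Illustrative numbers only: 10 patrons/hour, 5-minute average transaction.
lam, mu = 10.0, 12.0
w1 = mean_wait(lam, mu, 1) * 60  # minutes, one librarian
w2 = mean_wait(lam, mu, 2) * 60  # minutes, two librarians
print(f"Wq with 1 librarian: {w1:.1f} min; with 2 librarians: {w2:.1f} min")
```

With these rates a single librarian at 83% utilization produces a 25-minute average wait, while a second librarian cuts it to about a minute, the kind of nonlinear effect the spreadsheet tables make visible.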
NASA Astrophysics Data System (ADS)
Sun, Y.; Li, Y. P.; Huang, G. H.
2012-06-01
In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed by introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, waiting costs and arrival rates have been obtained. They can be used for generating decision alternatives and thus help managers identify desired MSW management policies under various economic objectives and system reliability constraints.
Zhang, Jie; Han, Guangjie; Qian, Yujie
2016-01-01
Increased co-channel interference (CCI) in wireless local area networks (WLANs) imposes serious resource constraints in today's high-density wireless environments. CCI in IEEE 802.11-based networks is inevitable because of the carrier-sensing mechanism, but it can be reduced through resource optimization approaches; CCI analysis is therefore fundamental to efficient resource management. In this article, we present a novel CCI analysis approach based on queuing theory that accounts for the randomness of end users' behavior and the irregularity and complexity of network traffic in high-density WLANs, adopting the M/M/c queuing model for CCI analysis. Most CCI occurs when multiple networks overlap and trigger channel contention; therefore, we use the ratio of signal-overlapped area to signal coverage as a probabilistic factor in the queuing model to analyze CCI impacts in highly overlapped WLANs. With the queuing model, we perform simulations to see how CCI influences the quality of service (QoS) in high-density WLANs. PMID:27563896
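The overlap-ratio idea can be paraphrased as scaling the offered frame rate by the fraction of coverage that overlaps, then feeding the result into an M/M/c delay calculation. This is a loose sketch of the approach, not the paper's model; all rates and the overlap ratio below are invented:

```python
def erlang_c(c, a):
    """P(wait) in an M/M/c queue with offered load a = lam/mu (a < c)."""
    b = 1.0
    for k in range(1, c + 1):  # Erlang B recurrence, then convert to Erlang C
        b = a * b / (k + a * b)
    rho = a / c
    return b / (1.0 - rho * (1.0 - b))

# Hypothetical numbers: 60 frames/s offered, 40% of coverage overlapped,
# 3 non-overlapping channels each serving 10 frames/s.
lam_total, overlap_ratio, mu, c = 60.0, 0.4, 10.0, 3
lam_cci = lam_total * overlap_ratio      # contention-causing load only
p_contend = erlang_c(c, lam_cci / mu)    # probability a frame must wait
print(f"P(channel contention) ≈ {p_contend:.3f}")
```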
An application of queuing theory to waterfowl migration
Sojda, Richard S.; Cornely, John E.; Fredrickson, Leigh H.
2002-01-01
There has always been great interest in the migration of waterfowl and other birds. We have applied queuing theory to modelling waterfowl migration, beginning with a prototype system for the Rocky Mountain Population of trumpeter swans (Cygnus buccinator) in western North America. The queuing model can be classified as a D/BB/28 system, and we describe the input sources, service mechanism, and network configuration of queues and servers. The intrinsic nature of queuing theory is to represent the spatial and temporal characteristics of entities: how they move, are placed in queues, and are serviced. The service mechanism in our system is an algorithm representing how swans move through the flyway based on seasonal life cycle events. The system uses the observed number of swans at each of 27 areas in a breeding season as input and simulates their distribution through four seasonal steps. The result is a simulated distribution of birds for the subsequent year's breeding season. The model was built as a multiagent system, with one agent handling movement algorithms, one facilitating the user interface, and one to seven agents representing specific geographic areas for which swan management interventions can be implemented. The many parallels between queuing model servers and service mechanisms on the one hand and waterfowl management areas and annual life cycle events on the other made the transfer of the theory to practical application straightforward.
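The paper's seasonal service mechanism belongs to its multiagent system, but the basic shape of one seasonal step, redistributing counts among areas while conserving the total, can be sketched. Area names and movement fractions here are entirely hypothetical:

```python
def step_season(counts, moves):
    """Move birds between areas; `moves` maps (src, dst) -> fraction of src
    that migrates during this seasonal step. The total count is conserved."""
    out = dict(counts)
    for (src, dst), frac in moves.items():
        n = int(counts[src] * frac)
        out[src] -= n
        out[dst] += n
    return out

# Hypothetical breeding areas and fall movement toward a staging area "C".
breeding = {"A": 120, "B": 80, "C": 0}
fall_moves = {("A", "C"): 0.9, ("B", "C"): 0.8}
wintering = step_season(breeding, fall_moves)
assert sum(wintering.values()) == sum(breeding.values())  # birds conserved
print(wintering)
```

Chaining four such steps (breeding, fall staging, wintering, spring staging) yields the next year's simulated breeding distribution, which is the loop the abstract describes.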
Queuing theory models for computer networks
NASA Technical Reports Server (NTRS)
Galant, David C.
1989-01-01
A set of simple queuing theory models that can predict the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. Because the models omit fine detail of the network traffic rates, traffic patterns, and the hardware used to implement the networks, they can be used to quickly assess the impact of variations in traffic patterns and intensities, channel capacities, and message protocols. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
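A spreadsheet-style average-response estimate of this kind can be approximated by treating each link a message crosses as an independent M/M/1 queue (a common Jackson-network-style simplification; the report's actual models may differ) and summing the per-link delays. The rates below are invented:

```python
def network_response_time(flows):
    """Mean end-to-end delay through independent M/M/1 links:
    sum of 1/(mu_i - lam_i) over the links a message crosses."""
    for lam, mu in flows:
        assert lam < mu, "each link must be stable (lam < mu)"
    return sum(1.0 / (mu - lam) for lam, mu in flows)

# Invented rates (messages/s): LAN segment, gateway, backbone channel.
path = [(50.0, 120.0), (50.0, 80.0), (50.0, 200.0)]
w = network_response_time(path)
print(f"mean response ≈ {w * 1000:.1f} ms")
```

The same arithmetic is easy to lay out in spreadsheet cells, one row per link, which is presumably why this modeling level suits the tool.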
Queuing Theory: An Alternative Approach to Educational Research.
ERIC Educational Resources Information Center
Huyvaert, Sarah H.
Queuing theory is examined in this paper in order to determine whether the theory could be applied in educational settings. It is defined as a form of operations research that uses mathematical formulas and/or computer simulation to study waiting and congestion in a system and, through the study of these visible phenomena, to discover malfunctions within…
Application of queuing theory in production-inventory optimization
NASA Astrophysics Data System (ADS)
Rashid, Reza; Hoseini, Seyed Farzad; Gholamian, M. R.; Feizabadi, Mohammad
2015-07-01
This paper presents a mathematical model for an inventory control system in which customers' demands and suppliers' service time are considered as stochastic parameters. The proposed problem is solved through queuing theory for a single item. In this case, transitional probabilities are calculated in steady state. Afterward, the model is extended to the case of multi-item inventory systems. Then, to deal with the complexity of this problem, a new heuristic algorithm is developed. Finally, the presented bi-level inventory-queuing model is implemented as a case study in Electroestil Company.
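The steady-state probabilities for a single item can be sketched as a birth-death chain (demand arrivals and replenishment services) solved by detailed balance. This is a generic illustration, not the paper's exact formulation; the rates and capacity are hypothetical:

```python
def birth_death_steady_state(lam, mu, K):
    """Steady-state distribution of a birth-death chain on states 0..K
    with constant birth rate lam and death rate mu (detailed balance:
    p[n+1] = (lam/mu) * p[n], then normalize)."""
    weights = [1.0]
    for _ in range(K):
        weights.append(weights[-1] * lam / mu)
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical rates: demand 4/day, replenishment 5/day, capacity 10.
p = birth_death_steady_state(lam=4.0, mu=5.0, K=10)
mean_level = sum(n * pn for n, pn in enumerate(p))
print(f"P(state 0) = {p[0]:.4f}, mean inventory position = {mean_level:.2f}")
```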
Application of queuing theory in inventory systems with substitution flexibility
NASA Astrophysics Data System (ADS)
Seyedhoseini, S. M.; Rashid, Reza; Kamalpour, Iman; Zangeneh, Erfan
2015-01-01
Considering the competition in today's business environment, tactical planning of a supply chain has become more complex than before. In many multi-product inventory systems, substitution flexibility can improve profits. This paper develops a comprehensive substitution inventory model for a system with two substitute products with negligible lead time, and examines the effects of simultaneous ordering. Customer demands for both products are treated as stochastic parameters, and queuing theory is used to construct a mathematical model. The model has been coded in C++ and analyzed on a real example, where the results indicate the efficiency of the proposed model.
Using Queuing Theory and Simulation Model to Optimize Hospital Pharmacy Performance
Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan
2014-01-01
Background: The hospital pharmacy is responsible for controlling and monitoring the medication use process and ensuring timely access to safe, effective, and economical drugs and medicines for patients and hospital staff. Objectives: This study aimed to optimize the management of the studied outpatient pharmacy by developing a suitable queuing theory and simulation technique. Patients and Methods: A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the data needed to determine the arrival rate, service rate, and other inputs for calculating patient flow and the queuing network performance variables. After an initial analysis of the collected data using SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data, the queuing system of the current situation for both shifts was modeled and simulated using ARENA 12, and 4 scenarios were explored. Results: The queue characteristics of the studied pharmacy were very undesirable in both morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 and 14.66 in the morning and evening, respectively. The average times spent in the system were 39 minutes in the morning and 35 minutes in the evening. System utilization in the morning and evening was 25% and 21%, respectively. The simulation results showed that reducing the staff at the prescription-receiving stage in the morning from 2 to 1 did not change the queue performance indicators. Adding one staff member at the prescription-filling stage decreased the average queue length by 10 persons and the average waiting time by 18 minutes and 14 seconds. On the other hand, simulation
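The reported morning-shift averages can be cross-checked with Little's law, L = λW. The two figures are taken from the abstract; the implied arrival rate is only a consistency estimate, not a value the study reports:

```python
# Figures reported in the abstract (morning shift): average number in
# the pharmacy L = 19.21 patients, average time in system W = 39 minutes.
L, W = 19.21, 39.0
lam = L / W  # Little's law: L = lam * W, so lam = L / W
print(f"implied arrival rate ≈ {lam:.3f} patients/min ≈ {lam * 60:.1f} per hour")
```

Any proposed scenario that changes W should move L in proportion if the arrival rate is unchanged, which makes this identity a quick sanity check on simulation output.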
An Integrated Model of Patient and Staff Satisfaction Using Queuing Theory.
Komashie, Alexander; Mousavi, Ali; Clarkson, P John; Young, Terry
2015-01-01
This paper investigates the connection between patient satisfaction, waiting time, staff satisfaction, and service time. It uses a variety of models to enable improvement against experiential and operational health service goals. Patient satisfaction levels are estimated using a model based on waiting (waiting times). Staff satisfaction levels are estimated using a model based on the time spent with patients (service time). An integrated model of patient and staff satisfaction, the effective satisfaction level model, is then proposed (using queuing theory). This links patient satisfaction, waiting time, staff satisfaction, and service time, connecting two important concepts, namely, experience and efficiency in care delivery and leading to a more holistic approach in designing and managing health services. The proposed model will enable healthcare systems analysts to objectively and directly relate elements of service quality to capacity planning. Moreover, as an instrument used jointly by healthcare commissioners and providers, it affords the prospect of better resource allocation. PMID:27170899
Vahle, M.O.
1982-03-01
Queuing theory is applied to the problem of assigning computer ports within a terminal switching network to maximize the likelihood of instant connect. A brief background of the network is included to focus on the statement of the problem.
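If a session that finds all ports busy is turned away, the "likelihood of instant connect" is one minus the Erlang B blocking probability of an M/M/c/c loss system. That loss-system reading is an assumption; the report's actual model may differ, and the offered load below is invented:

```python
def erlang_b(c, a):
    """Blocking probability of an M/M/c/c loss system (Erlang B):
    the chance a new session finds all c ports busy, via the
    standard numerically stable recurrence."""
    b = 1.0
    for k in range(1, c + 1):
        b = a * b / (k + a * b)
    return b

# Hypothetical load: 40 erlangs of terminal sessions offered to the switch.
for ports in (42, 48, 56):
    print(f"{ports} ports: P(blocked) = {erlang_b(ports, 40.0):.4f}")
```

The loop shows the sizing question directly: each increment in port count buys a measurable increase in the probability of instant connect.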
Queuing theory to guide the implementation of a heart failure inpatient registry program.
Zai, Adrian H; Farr, Kit M; Grant, Richard W; Mort, Elizabeth; Ferris, Timothy G; Chueh, Henry C
2009-01-01
OBJECTIVE The authors previously implemented an electronic heart failure registry at a large academic hospital to identify heart failure patients and to connect these patients with appropriate discharge services. Despite significant improvements in patient identification and connection rates, time to connection remained high, with an average delay of 3.2 days from the time patients were admitted to the time connections were made. Our objective for this current study was to determine the most effective solution to minimize time to connection. DESIGN We used a queuing theory model to simulate 3 different potential solutions to decrease the delay from patient identification to connection with discharge services. MEASUREMENTS The measures included average rate at which patients were being connected to the post discharge heart failure services program, average number of patients in line, and average patient waiting time. RESULTS Using queuing theory model simulations, we were able to estimate for our current system the minimum rate at which patients need to be connected (262 patients/mo), the ideal patient arrival rate (174 patients/mo) and the maximal patient arrival rate that could be achieved by adding 1 extra nurse (348 patients/mo). CONCLUSIONS Our modeling approach was instrumental in helping us characterize key process parameters and estimate the impact of adding staff on the time between identifying patients with heart failure and connecting them with appropriate discharge services. PMID:19390108
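With the abstract's rates (patients identified at 174/month, the minimum required connection rate of 262/month), a single-server M/M/1 approximation gives quick back-of-envelope estimates. The M/M/1 reading and the 30-day month are assumptions; the authors' queuing simulation may be richer:

```python
# Rates taken from the abstract: arrivals lam = 174/month,
# minimum required connection (service) rate mu = 262/month.
lam, mu = 174.0, 262.0
rho = lam / mu                    # utilization of the connecting staff
L = rho / (1.0 - rho)             # mean number of patients in the system
W_days = 1.0 / (mu - lam) * 30.0  # mean time in system, assuming ~30-day month
print(f"utilization {rho:.2f}, in-system {L:.2f} patients, delay {W_days:.2f} days")
```

Note how nonlinearly delay depends on the gap mu - lam: this is why the authors' estimate of a maximal sustainable arrival rate with one extra nurse (348/month) is a more useful planning number than utilization alone.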
Application of queuing theory to patient satisfaction at a tertiary hospital in Nigeria
Ameh, Nkeiruka; Sabo, B.; Oyefabi, M. O.
2013-01-01
Background: Queuing theory is the mathematical approach to the analysis of waiting lines in any setting where the arrival rate of subjects is faster than the system can handle. It is applicable to healthcare settings where the systems have excess capacity to accommodate random variations. Materials and Methods: A cross-sectional descriptive survey was done. Questionnaires were administered to patients who attended the general outpatient department. Observations were also made on the queuing model and the service discipline at the clinic. Questions were meant to obtain demographic characteristics, the time spent on the queue by patients before being seen by a doctor, the time spent with the doctor, patients' views about the time spent on the queue, and useful suggestions on how to reduce that time. A total of 210 patients were surveyed. Results: The majority of the patients (164, 78.1%) spent 2 h or less on the queue before being seen by a doctor and less than 1 h with the doctor. Most patients (144, 68.5%) were satisfied with the time they spent on the queue before being seen by a doctor. Useful suggestions proffered by the patients to decrease the time spent on the queue included: that more doctors be employed (46, 21.9%), that doctors should come to work on time (25, 11.9%), that first-come-first-served be observed strictly (32, 15.2%), and that the records staff should desist from collecting bribes from patients in order to place their cards before others. The queuing method employed at the clinic is the multiple single-channel type, and the service discipline is priority service. The patients who spent less time on the queue (<1 h) before seeing the doctor were more satisfied than those who spent more time (P < 0.05). Conclusion: The study has revealed that the majority of patients were satisfied with the practice at the general outpatient department. However, there is a need to employ
Spreadsheet Analysis Of Queuing In A Computer Network
NASA Technical Reports Server (NTRS)
Galant, David C.
1992-01-01
Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.
T-cell activation: A queuing theory analysis at low agonist density.
Wedagedera, J R; Burroughs, N J
2006-09-01
We analyze a simple linear triggering model of the T-cell receptor (TCR) within the framework of queuing theory, in which TCRs enter the queue upon full activation and exit by downregulation. We fit our model to four experimentally characterized threshold activation criteria and analyze their specificity and sensitivity: the initial calcium spike, cytotoxicity, immunological synapse formation, and cytokine secretion. Specificity characteristics improve as the time window for detection increases, saturating for time periods on the timescale of downregulation; thus, the calcium spike (30 s) has low specificity but a sensitivity to single-peptide MHC ligands, while the cytokine threshold (1 h) can distinguish ligands with a 30% variation in the complex lifetime. However, a robustness analysis shows that these properties are degraded when the queue parameters are subject to variation, for example, under stochasticity in the ligand number in the cell-cell interface and population variation in the cellular threshold. A time integration of the queue is shown to be able to control parameter noise efficiently for realistic parameter values when integrated over sufficiently long time periods (hours), the discrimination characteristics being determined by the TCR signal cascade kinetics (a kinetic proofreading scheme). Therefore, through a combination of thresholds and signal integration, a T cell can be responsive to low ligand density and specific to agonist quality. We suggest that multiple threshold mechanisms are employed to establish the conditions for efficient signal integration, i.e., coordinate the formation of a stable contact interface. PMID:16766611
Effects of diversity and procrastination in priority queuing theory: The different power law regimes
NASA Astrophysics Data System (ADS)
Saichev, A.; Sornette, D.
2010-01-01
Empirical analyses show that after the update of a browser, the publication of a software vulnerability, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^α with 0 < α ≤ 1 over a time scale of years. We present a simple model for this persistence phenomenon, framed within standard priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow onto the computer of an individual. We identify a "time deficit" control parameter β and a bifurcation to a regime where there is a nonzero probability that the target task is never completed. The distribution of waiting time T until the completion of the target task has the power-law tail ~1/t^(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time-deficit parameters in a population of individuals, the power-law tail is changed into 1/t^α, with α ∈ (0.5, ∞), including the well-known case 1/t. We also study the effect of "procrastination," defined as the situation in which the target task may be postponed or delayed even after the individual has completed all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence.
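The ~1/t^(1/2) survival tail from the first-passage argument can be checked empirically with a symmetric ±1 random walk, the discrete analogue of the Wiener process invoked in the paper. Run counts, the starting point, and the step cap below are arbitrary choices for the sketch:

```python
import random

def first_passage_times(n_runs=2000, max_steps=100_000, seed=7):
    """First-passage times of a symmetric +/-1 random walk from 1 to 0.
    The survival function P(T > t) decays like t**-0.5, so the sample
    is very heavy-tailed (a few runs even hit the step cap)."""
    rng = random.Random(seed)
    times = []
    for _ in range(n_runs):
        x, t = 1, 0
        while x > 0 and t < max_steps:
            x += 1 if rng.random() < 0.5 else -1
            t += 1
        times.append(t)
    return times

times = sorted(first_passage_times())
n = len(times)
print(f"median T = {times[n // 2]}, 95th percentile = {times[int(0.95 * n)]}")
```

The huge gap between the median (a handful of steps) and the upper percentiles (hundreds of steps) is the signature of the heavy tail behind the years-long persistence in the empirical data.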
Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.
2015-03-10
Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases in load. That is the main motivation for using queuing systems with hysteresis to model cloud computing systems. In this paper, we model a cloud computing system as a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows us to estimate a number of performance measures.
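The hysteresis idea, add a server only when the queue climbs above an upper threshold and release one only when it falls below a lower one, can be sketched deterministically. The thresholds, the one-server-per-step ramp, and the instantaneous activation are invented simplifications; the paper's model is stochastic with noninstantaneous activation:

```python
def hysteresis_servers(queue_trace, up=10, down=3, s_min=1, s_max=5):
    """Track the active-server count along a queue-length trace:
    add a server when the queue exceeds `up`, remove one when it
    drops below `down`. The gap between the two thresholds keeps
    the system from reacting to brief spikes or dips."""
    servers, history = s_min, []
    for q in queue_trace:
        if q > up and servers < s_max:
            servers += 1
        elif q < down and servers > s_min:
            servers -= 1
        history.append(servers)
    return history

trace = [0, 5, 12, 15, 11, 8, 6, 2, 1, 12]
print(hysteresis_servers(trace))
```

Note that at q = 5 and q = 8 the server count holds steady: inside the dead band between `down` and `up`, no setup cost is paid, which is exactly the energy-saving point of hysteresis.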
Improving queuing service at McDonald's
NASA Astrophysics Data System (ADS)
Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.
2014-07-01
Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient service. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals that are perceived to be inexpensive. These promotional lunch meals attract a good response, resulting in occasional long queues and inconvenient waiting times. A study was conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey was conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que was developed to perform onsite queuing analysis and report key performance indices. The queuing analysis in Que is based on the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results are presented.
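A Poisson-arrival queue estimate of the kind Que reports can be reproduced with a seeded M/M/1 simulation via Lindley's recursion and checked against the analytic mean wait. The arrival and service rates below are hypothetical, not the restaurant's measured values:

```python
import random

def simulate_mm1_wait(lam, mu, n=200_000, seed=42):
    """Estimate the mean queue wait Wq of an M/M/1 queue with Poisson
    arrivals via Lindley's recursion: W(k+1) = max(0, W(k) + S(k) - A(k+1))."""
    rng = random.Random(seed)
    wq, total = 0.0, 0.0
    for _ in range(n):
        total += wq
        service = rng.expovariate(mu)        # exponential service time
        interarrival = rng.expovariate(lam)  # Poisson process gap
        wq = max(0.0, wq + service - interarrival)
    return total / n

lam, mu = 1.0, 1.25                  # customers/min; utilization rho = 0.8
analytic = lam / (mu * (mu - lam))   # M/M/1: Wq = rho / (mu - lam) = 3.2 min
print(f"simulated Wq ≈ {simulate_mm1_wait(lam, mu):.2f} min, analytic Wq = {analytic:.2f} min")
```

Agreement between the simulated and closed-form values is the usual validation step before trusting an onsite tool's Poisson assumptions against observed lunch-hour data.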
Dynamic neural-based buffer management for Queuing systems with self-similar characteristics.
Yousefi'zadeh, Homayoun; Jonckheere, Edmond A
2005-09-01
Buffer management in queuing systems plays an important role in addressing the tradeoff between efficiency measured in terms of overall packet loss and fairness measured in terms of individual source packet loss. Complete partitioning (CP) of a buffer with the best fairness characteristic and complete sharing (CS) of a buffer with the best efficiency characteristic are at the opposite ends of the spectrum of buffer management techniques. Dynamic partitioning buffer management techniques aim at addressing the tradeoff between efficiency and fairness. Ease of implementation is the key issue when determining the practicality of a dynamic buffer management technique. In this paper, two novel dynamic buffer management techniques for queuing systems accommodating self-similar traffic patterns are introduced. The techniques take advantage of the adaptive learning power of perceptron neural networks when applied to arriving traffic patterns of queuing systems. Relying on the water-filling approach, our proposed techniques are capable of coping with the tradeoff between packet loss and fairness issues. Computer simulations reveal that both of the proposed techniques enjoy great efficiency and fairness characteristics as well as ease of implementation. PMID:16252824
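The CP and CS endpoints of the buffer-management spectrum can be sketched as admission rules. This is a toy illustration of the two baselines only; the paper's neural dynamic partitioning sits between them:

```python
def admit(buffer_occupancy, source, total_capacity, policy, n_sources=4):
    """Decide whether to admit a packet from `source`.
    CP (complete partitioning): each source owns an equal fixed share
    of the buffer -- perfectly fair, but idle shares are wasted.
    CS (complete sharing): admit while the whole buffer has room --
    efficient, but one heavy source can starve the others."""
    used = sum(buffer_occupancy.values())
    if policy == "CS":
        return used < total_capacity
    if policy == "CP":
        return buffer_occupancy[source] < total_capacity // n_sources
    raise ValueError(f"unknown policy: {policy}")

# Source 0 has filled its 25-slot share of a 100-slot buffer; others are idle.
occ = {0: 25, 1: 0, 2: 0, 3: 0}
print(admit(occ, 0, 100, "CS"))  # buffer has room overall
print(admit(occ, 0, 100, "CP"))  # source 0's partition is already full
```

The efficiency/fairness tension is visible in the two answers for the same state, which is precisely the tradeoff a dynamic scheme must navigate.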
Haghighinejad, Hourvash Akbari; Kharazmi, Erfan; Hatam, Nahid; Yousefi, Sedigheh; Hesami, Seyed Ali; Danaei, Mina; Askarian, Mehrdad
2016-01-01
Background: Hospital emergency departments play an essential role in health care systems. In the last decade, developed countries have paid great attention to the overcrowding crisis in emergency departments. Simulation analysis of complex models whose conditions change over time is much more effective than analytical solutions, and the emergency department (ED) is one of the most complex such models. This study aimed to determine the number of waiting patients and the waiting time for emergency department services in an Iranian hospital ED, and to propose scenarios to reduce its queue and waiting time. Methods: This is a cross-sectional study in which simulation software (Arena, version 14) was used. The input information was extracted from the hospital database as well as through sampling. The objective was to evaluate the response variables of waiting time, number waiting, and utilization of each server, and to test three scenarios for improving them. Results: Running the models for 30 days revealed that a total of 4088 patients left the ED after being served and 1238 patients were waiting in the queue for admission to the ED bed area at the end of the run (these patients actually received services beyond the defined capacity). In the first scenario, the number of beds had to be increased from 81 to 179 for the number waiting at the "bed area" server to become almost zero. The second scenario, which limited hospitalization time in the ED bed area to the third quartile of the serving-time distribution, decreased the number waiting to 586 patients. Conclusion: Doubling the bed capacity in the emergency department, with a corresponding increase in other resources and capacity, can solve the problem. This includes bed capacity for both critically ill and less critically ill patients. Classifying the ED's internal sections by severity of illness instead of medical specialty is another solution. PMID:26793727
A soft computing-based approach to optimise queuing-inventory control problem
NASA Astrophysics Data System (ADS)
Alaghebandha, Mohammad; Hajipour, Vahid
2015-04-01
In this paper, a multi-product continuous review inventory control problem within a batch arrival queuing approach (MQr/M/1) is developed to find the optimal maximum-inventory quantities. The objective function minimises the sum of ordering, holding and shortage costs under warehouse space, service level and expected lost-sales shortage cost constraints, from both retailer and warehouse viewpoints. Since the proposed model is NP-hard, an efficient imperialist competitive algorithm (ICA) is proposed to solve it. To validate the proposed ICA, both a genetic algorithm and a simulated annealing algorithm are utilised. To determine the algorithm parameter values that yield a better solution, a fine-tuning procedure is executed. Finally, the performance of the proposed ICA is analysed using numerical illustrations.
He, Xinhua; Hu, Wenfa
2014-01-01
This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367
Design and Implementation of High-Speed Input-Queued Switches Based on a Fair Scheduling Algorithm
NASA Astrophysics Data System (ADS)
Hu, Qingsheng; Zhao, Hua-An
To increase both the capacity and the processing speed of input-queued (IQ) switches, we proposed a fair scalable scheduling architecture (FSSA). By employing an FSSA composed of several cascaded sub-schedulers, large-scale high-performance switches or routers can be realized without the capacity limitation of a monolithic device. In this paper, we present a fair scheduling algorithm named FSSA_DI based on an improved FSSA in which a distributed iteration scheme is employed; the scheduler performance can be improved and the processing time reduced as well. Simulation results show that FSSA_DI achieves better performance in average delay and throughput under heavy loads compared to other existing algorithms. Moreover, a practical 64 × 64 FSSA using the FSSA_DI algorithm is implemented on four Xilinx Virtex-4 FPGAs. Measurement results show that the data rate of our solution can reach 800 Mbps, and the tradeoff between performance and hardware complexity has been resolved effectively.
Miller, Nicholas; Zavadil, Robert; Ellis, Abraham; Muljadi, Eduard; Camm, Ernst; Kirby, Brendan J
2007-01-01
The knowledge base of the electric power system engineering community continues to grow with installed capacity of wind generation in North America. While this process has certainly occurred at other times in the industry with other technologies, the relatively explosive growth, the compressed time frames from project conception to commissioning, and the unconventional characteristics of wind generation make this period in the industry somewhat unique. Large wind generation facilities are necessarily evolving to look more and more like conventional generating plants in terms of their ability to interact with the transmission network in a way that does not compromise performance or system reliability. Such an evolution has only been possible through the cumulative contributions of an ever-growing number of power system engineers who have delved into the unique technologies and technical challenges presented by wind generation. The industry is still only part of the way up the learning curve, however. Numerous technical challenges remain, and as has been found, each new wind generation facility has the potential to generate some new questions. With the IEEE PES expanding its presence and activities in this increasingly significant commercial arena, the prospects for staying "ahead of the curve" are brightened.
Human Factors of Queuing: A Library Circulation Model.
ERIC Educational Resources Information Center
Mansfield, Jerry W.
1981-01-01
Classical queuing theories and their accompanying service facilities totally disregard the human factors in the name of efficiency. As library managers we need to be more responsive to human needs in the design of service points and make every effort to minimize queuing and queue frustration. Five references are listed. (Author/RAA)
A queuing model for road traffic simulation
Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.
2015-03-10
We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
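The M/G/c/c state-dependent queue admits a closed-form stationary distribution, from which blocking probability and throughput follow directly. Below is a minimal sketch of that computation (in the Cheah-Smith form); the linear service-rate decay, section capacity, and rates are purely illustrative assumptions, not the paper's density-flow diagrams:

```python
import math

def mgcc_state_dependent(lam, t1, c, f):
    """Stationary distribution of an M/G/c/c state-dependent queue:
    P(n) is proportional to (lam*t1)**n / (n! * f(1)*...*f(n)),
    where t1 is the mean service time of a lone customer and f(n)
    is the normalized service rate with n customers in the section."""
    weights = [1.0]
    prod_f = 1.0
    for n in range(1, c + 1):
        prod_f *= f(n)
        weights.append((lam * t1) ** n / (math.factorial(n) * prod_f))
    z = sum(weights)
    p = [w / z for w in weights]
    blocking = p[c]                    # arriving vehicle finds the section full
    throughput = lam * (1 - blocking)  # effective flow through the section
    return p, blocking, throughput

# Hypothetical road section: holds c = 40 vehicles; speed decays linearly
# with occupancy (an illustrative f, not the paper's fundamental diagram).
c = 40
f = lambda n: (c - n + 1) / c
p, blocking, throughput = mgcc_state_dependent(lam=5.0, t1=2.0, c=c, f=f)
```

Concatenating sections, as the paper does, amounts to feeding each section's throughput forward as the next section's demand, capped by that section's supply.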
NASA Astrophysics Data System (ADS)
Santoshkumar; Udaykumar, R. Y.
2015-04-01
Electric vehicles (EVs) can be connected to the grid for power transactions. Vehicle-to-grid (V2G) supports grid requirements and helps in meeting load demands. The grid control center (GCC), the aggregator, and the EV are the three key entities in V2G communication. The GCC sends information about power requirements to the aggregator, which in turn forwards the information to the EVs. Based on this information, interested EV owners participate in power transactions with the grid. The aggregator facilitates the EVs by providing parking and charging slots. In this paper, a queuing model for EVs connected to the grid and the development of a wireless infrastructure for EV-to-Smart-Meter communication are proposed. The queuing model is developed and simulated. The path loss models for WiMAX are analyzed and compared. In addition, the physical layer of the WiMAX protocol is modeled and simulated for EV-to-Smart-Meter communication in V2G.
Emissions from queuing aircraft
Segal, H.
1980-01-01
The FAA (U.S. Federal Aviation Administration) Simplex mathematical model employs a simple point-source algorithm with provisions for selecting a particular plume height and initial box size for each aircraft being analyzed. Its ability to predict air quality by modeling emissions released from queuing aircraft was verified against measurements of carbon monoxide emissions from such aircraft during a five-day period at Los Angeles International Airport. The model predicted carbon monoxide concentrations of 4 ppm (the National Ambient Air Quality Standard limit value is 35 ppm) at expected populated locations during the highest-activity hour monitored. This study should also apply to other engine exhaust gases such as NOx.
Capacity Utilization Study for Aviation Security Cargo Inspection Queuing System
Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E; Brumback, Daryl L
2010-01-01
In this paper, we conduct performance evaluation study for an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.
Network Queuing System, Version 2.0
NASA Technical Reports Server (NTRS)
Walter, Howard; Bridges, Mike; Carver, Terrie; Kingsbury, Brent
1993-01-01
Network Queuing System (NQS) computer program is versatile batch- and device-queuing facility for single UNIX computer or group of computers in network. User invokes NQS collection of user-space programs to move batch and device jobs freely among different computers in network. Provides facilities for remote queuing, request routing, remote status, queue-status controls, batch-request resource quota limits, and remote output return. Revision of NQS provides for creation, deletion, addition, and setting of complexes aiding in limiting number of requests handled at one time. Also has improved device-oriented queues along with some revision of displays. Written in C language.
Modeling and simulation of M/M/c queuing pharmacy system with adjustable parameters
NASA Astrophysics Data System (ADS)
Rashida, A. R.; Fadzli, Mohammad; Ibrahim, Safwati; Goh, Siti Rohana
2016-02-01
This paper uses discrete event simulation (DES), a computer-based modelling approach that imitates a real system, to study a pharmacy unit. M/M/c queuing theory is used to model and analyse the characteristics of the queuing system at the pharmacy unit of Hospital Tuanku Fauziah, Kangar, in Perlis, Malaysia. The input of this model is based on statistical data collected over 20 working days in June 2014. Currently, patient waiting time at the pharmacy unit is more than 15 minutes. The actual operation of the pharmacy unit is a mixed queuing server with an M/M/2 queuing model, where the pharmacists serve as the servers. The DES approach and the ProModel simulation software are used to simulate the queuing model and to propose improvements to the queuing system of this pharmacy. The waiting time for each server is analysed: Counters 3 and 4 have the highest waiting times, 16.98 and 16.73 minutes respectively. Three scenarios, M/M/3, M/M/4 and M/M/5, are simulated, and the waiting times of the actual and experimental queuing models are compared. The simulation results show that adding a server (pharmacist) reduces patient waiting time considerably: average patient waiting time falls by almost 50% when one pharmacist is added to the counters. However, it is not necessary to fully utilize all counters: even though M/M/4 and M/M/5 produce further reductions in patient waiting time, they are ineffective since Counter 5 is rarely used.
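The M/M/c quantities behind such a comparison can also be obtained in closed form via the Erlang C formula, without simulation. A small sketch follows; the arrival and service rates here are illustrative assumptions, not the hospital's measured data:

```python
import math

def erlang_c(lam, mu, c):
    """Erlang C: probability an arriving patient must wait in an M/M/c queue."""
    a = lam / mu                         # offered load in Erlangs
    rho = a / c                          # per-server utilization; needs rho < 1
    head = sum(a**k / math.factorial(k) for k in range(c))
    tail = a**c / (math.factorial(c) * (1 - rho))
    return tail / (head + tail)

def mean_wait(lam, mu, c):
    """Mean queue wait: Wq = C(c, a) / (c*mu - lam)."""
    return erlang_c(lam, mu, c) / (c * mu - lam)

# Illustrative rates: 30 patients/hour, 16 prescriptions/hour per pharmacist.
lam, mu = 30.0, 16.0
for c in (2, 3, 4):
    print(c, round(mean_wait(lam, mu, c) * 60, 2))   # Wq in minutes
```

With these made-up rates, adding a third server collapses the mean wait, echoing the paper's finding that one extra pharmacist roughly halves waiting time; the exact figures depend on the real arrival and service rates.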
Application of queuing model in Dubai's busiest megaplex
NASA Astrophysics Data System (ADS)
Bhagchandani, Maneesha; Bajpai, Priti
2013-09-01
This paper provides a study and analysis of the extremely busy booking counters at a Megaplex in Dubai using a queuing model and simulation. Dubai is an emirate in the UAE with a multicultural population, the majority of which is foreign born. Cinema is one of the major forms of entertainment: there are more than 13 megaplexes, each with a number of screens ranging from 3 to 22, screening movies in English, Arabic, Hindi and other languages. It has been observed that during the weekends megaplexes attract large crowds, resulting in long queues at the booking counters. One of the busiest megaplexes was selected for the study, and the queuing model fits well when tested against the real situation. The concepts of arrival rate, service rate, utilization rate, waiting time in the system, and average number of people in the queue, using Little's Theorem and the M/M/s queuing model along with simulation software, have been used to suggest an empirical solution. The aim of the paper is twofold: to assess the present situation at the Megaplex and to give recommendations to optimize the use of booking counters.
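Little's Theorem (L = λW) ties together the measures such a study uses. A minimal single-counter (M/M/1) sketch with made-up rates shows the relationships; the multi-counter M/M/s case adds an Erlang C term but obeys the same law:

```python
def mm1_metrics(lam, mu):
    """Closed-form M/M/1 measures; stable only when lam < mu."""
    rho = lam / mu                  # counter utilization
    L = rho / (1 - rho)             # mean number of customers in the system
    W = 1 / (mu - lam)              # mean time in the system
    Lq = rho ** 2 / (1 - rho)       # mean queue length
    Wq = rho / (mu - lam)           # mean waiting time before service
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Made-up rates: 48 customers/hour arriving, one counter serving 60/hour.
m = mm1_metrics(lam=48.0, mu=60.0)
assert abs(m["L"] - 48.0 * m["W"]) < 1e-12     # Little's Theorem: L = lam * W
assert abs(m["Lq"] - 48.0 * m["Wq"]) < 1e-12   # ...and for the queue alone
```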
Discrete Event Simulation Models for CT Examination Queuing in West China Hospital
Luo, Li; Liu, Hangjiang; Liao, Huchang; Tang, Shijun; Shi, Yingkang; Guo, Huili
2016-01-01
In CT examination, emergency patients (EPs) have the highest priority in the queuing system, so general patients (GPs) have to wait a long time, which lowers overall patient satisfaction. The aim of this study is to improve patient satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in the first, urgent patients (UPs) wedge into the GPs' queue at fixed intervals (fixed priority model); in the second, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies and compare the performance of the strategies on the basis of the simulation results. The dynamic priority strategy decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting time and increases their satisfaction. PMID:27547237
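The dynamic-priority idea can be caricatured as scoring each waiting patient by base urgency plus an aging term, so that general patients cannot wait indefinitely. This is a toy sketch only; the scoring form and the weights are invented for illustration and are not the paper's model:

```python
def next_patient(waiting, now, urgency_weight=10.0, aging_rate=0.1):
    """Pick and remove the patient with the highest dynamic score.
    waiting: list of (arrival_time, urgency) tuples; urgency 0 = general.
    Score = urgency * urgency_weight + aging_rate * waiting_time, so a
    long-waiting general patient eventually outranks a fresh urgent one."""
    best = max(waiting, key=lambda p: p[1] * urgency_weight
                                      + aging_rate * (now - p[0]))
    waiting.remove(best)
    return best

queue = [(0.0, 0), (5.0, 2), (8.0, 0)]   # (arrival time, urgency level)
first = next_patient(queue, now=10.0)    # the urgent patient goes first
second = next_patient(queue, now=12.0)   # then the longest-waiting general one
```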
Queuing register uses fluid logic elements
NASA Technical Reports Server (NTRS)
1966-01-01
Queuing register /a multistage bit-shifting device/ uses a series of pure fluid elements to perform the required logic operations. The register has several stages of three-state pure fluid elements combined with two-input NOR gates.
Queuing Models of Tertiary Storage
NASA Technical Reports Server (NTRS)
Johnson, Theodore
1996-01-01
Large scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSL's) for near-line access. A characteristic of RSL's is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSL's can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well-defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/
Principles of Queued Service Observing at CFHT
NASA Astrophysics Data System (ADS)
Manset, Nadine; Burdullis, T.; Devost, D.
2011-03-01
CFHT started to use Queued Service Observing in 2001, and is now operating in that mode over 95% of the time. Ten years later, the observations are now carried out by Remote Observers who are not present at the telescope (see the companion presentation "Remote Queued Service Observing at CFHT"). The next phase at CFHT will likely involve assisted or autonomous service observing (see the presentation "Artificial Intelligence in Autonomous Telescopes"), which would not be possible without first having a Queued observations system already in place. The advantages and disadvantages of QSO at CFHT will be reviewed. The principles of QSO at CFHT, which allow CFHT to complete 90-100% of the top 30-40% programs and often up to 80% of other accepted programs, will be presented, along with the strategic use of overfill programs, the method of agency balance, and the suite of planning, scheduling, analysis and data quality assessment tools available to Queue Coordinators and Remote Observers.
Bremer, H; Ehrenberg, M
1995-05-17
A recently reported comparison of stable RNA (rRNA, tRNA) and mRNA synthesis rates in ppGpp-synthesizing and ppGpp-deficient (delta relA delta spoT) bacteria has suggested that ppGpp inhibits transcription initiation from stable RNA promoters, as well as synthesis of (bulk) mRNA. Inhibition of stable RNA synthesis occurs mainly during slow growth of bacteria when cytoplasmic levels of ppGpp are high. In contrast, inhibition of mRNA occurs mainly during fast growth when ppGpp levels are low, and it is associated with a partial inactivation of RNA polymerase. To explain these observations it has been proposed that ppGpp causes transcriptional pausing and queuing during the synthesis of mRNA. Polymerase queuing requires high rates of transcription initiation in addition to polymerase pausing, and therefore high concentrations of free RNA polymerase. These conditions are found in fast growing bacteria. Furthermore, the RNA polymerase queues lead to a promoter blocking when RNA polymerase molecules stack up from the pause site back to the (mRNA) promoter. This occurs most frequently at pause sites close to the promoter. Blocking of mRNA promoters diverts RNA polymerase to stable RNA promoters. In this manner ppGpp could indirectly stimulate synthesis of stable RNA at high growth rates. In the present work a mathematical analysis, based on the theory of queuing, is presented and applied to the global control of transcription in bacteria. This model predicts the in vivo distribution of RNA polymerase over stable RNA and mRNA genes for both ppGpp-synthesizing and ppGpp-deficient bacteria in response to different environmental conditions. It also shows how small changes in basal ppGpp concentrations can produce large changes in the rate of stable RNA synthesis. PMID:7539631
Priority Queuing On A Parallel Data Bus
NASA Technical Reports Server (NTRS)
Wallis, D. E.
1985-01-01
Queuing strategy for communications along shared data bus minimizes number of data lines while always assuring user of highest priority given access to bus. New system handles up to 32 user demands on 17 data lines that previously serviced only 17 demands.
Is Your Queuing System ADA-Compliant?
ERIC Educational Resources Information Center
Lawrence, David
2002-01-01
Discusses the Americans with Disabilities (ADA) and Uniform Federal Accessibility Standards (UFAS) regulations regarding public facilities' crowd control stanchions and queuing systems. The major elements are protruding objects and wheelchair accessibility. Describes how to maintain compliance with the regulations and offers a list of additional…
Theory-Based Stakeholder Evaluation
ERIC Educational Resources Information Center
Hansen, Morten Balle; Vedung, Evert
2010-01-01
This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…
Belciug, Smaranda; Gorunescu, Florin
2015-02-01
Scarce healthcare resources require carefully made policies ensuring optimal bed allocation, quality healthcare service, and adequate financial support. This paper proposes a complex analysis of the resource allocation in a hospital department by integrating in the same framework a queuing system, a compartmental model, and an evolutionary-based optimization. The queuing system shapes the flow of patients through the hospital, the compartmental model offers a feasible structure of the hospital department in accordance with the queuing characteristics, and the evolutionary paradigm provides the means to optimize the bed-occupancy management and the resource utilization using a genetic algorithm approach. The paper also focuses on a "what-if analysis" providing a flexible tool to explore the effects on the outcomes of the queuing system and resource utilization through systematic changes in the input parameters. The methodology was illustrated using a simulation based on real data collected from the geriatric department of a hospital in London, UK. In addition, the paper explores the possibility of adapting the methodology to different medical departments (surgery, stroke, and mental illness). Moreover, the paper also focuses on the practical use of the model from the healthcare point of view, by presenting a simulated application. PMID:25433363
ERIC Educational Resources Information Center
McEneaney, John E.
2006-01-01
The purpose of this theoretical essay is to explore the limits of traditional conceptualizations of reader and text and to propose a more general theory based on the concept of a literacy agent. The proposed theoretical perspective subsumes concepts from traditional theory and aims to account for literacy online. The agent-based literacy theory…
Some queuing network models of computer systems
NASA Technical Reports Server (NTRS)
Herndon, E. S.
1980-01-01
Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
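The classic exact Mean Value Analysis recursion for a closed, single-class network fits in a few lines; this is the general idea behind such calculator programs, though not necessarily the SR-52 algorithm itself, and the demands below are illustrative:

```python
def mva(demands, n_jobs):
    """Exact MVA for a closed, single-class network of queueing centers.
    demands[k]: total service demand at center k; n_jobs >= 1 customers."""
    queue = [0.0] * len(demands)              # mean queue length per center
    for n in range(1, n_jobs + 1):
        # residence time at k: demand inflated by the customers already there
        resid = [d * (1 + q) for d, q in zip(demands, queue)]
        x = n / sum(resid)                    # system throughput
        queue = [x * r for r in resid]        # Little's law at each center
    return x, resid, queue

# Illustrative system: a CPU (0.05 s demand) and two disks, 10 jobs.
x, resid, queue = mva([0.05, 0.04, 0.03], n_jobs=10)
```

Throughput is capped by the bottleneck center, 1/max(demands); the recursion converges to that bound as the population grows.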
Queuing network approach for building evacuation planning
NASA Astrophysics Data System (ADS)
Ishak, Nurhanis; Khalid, Ruzelan; Baten, Md. Azizul; Nawawi, Mohd. Kamal Mohd.
2014-12-01
The complex behavior of pedestrians in a limited space layout can explicitly be modeled using an M/G/C/C state-dependent queuing network. This paper implements the approach to study pedestrian flows through various corridors in a topological network. The best arrival rates and their impacts on the corridors' performance in terms of the throughput, blocking probability, expected number of occupants in the system and expected travel time were first measured using the M/G/C/C analytical model. These best arrival rates were then fed into its Network Flow Programming model to find the best arrival rates to source corridors and routes optimizing the network's total throughput. The analytical results were then validated using a simulation model. The various results of this study can be used to support the current Standard Operating Procedures (SOP) to efficiently and safely evacuate people in emergency cases.
A queueing theory based model for business continuity in hospitals.
Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R
2013-01-01
Clinical activities can be seen as the result of a precise and defined succession of events, where every phase is characterized by a waiting time that includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough; a risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate possible interventions and to protect the whole system from technology failures. This paper reports a case study applying the proposed integrated model, which combines a risk analysis approach and a queuing theory model, to define the number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals. PMID:24109839
NQS - NETWORK QUEUING SYSTEM, VERSION 2.0 (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Walter, H.
1994-01-01
The Network Queuing System, NQS, is a versatile batch and device queuing facility for a single Unix computer or a group of networked computers. With the Unix operating system as a common interface, the user can invoke the NQS collection of user-space programs to move batch and device jobs freely around the different computer hardware tied into the network. NQS provides facilities for remote queuing, request routing, remote status, queue status controls, batch request resource quota limits, and remote output return. This program was developed as part of an effort aimed at tying together diverse UNIX based machines into NASA's Numerical Aerodynamic Simulator Processing System Network. This revision of NQS allows for creating, deleting, adding and setting of complexes that aid in limiting the number of requests to be handled at one time. It also has improved device-oriented queues along with some revision of the displays. NQS was designed to meet the following goals: 1) Provide for the full support of both batch and device requests. 2) Support all of the resource quotas enforceable by the underlying UNIX kernel implementation that are relevant to any particular batch request and its corresponding batch queue. 3) Support remote queuing and routing of batch and device requests throughout the NQS network. 4) Support queue access restrictions through user and group access lists for all queues. 5) Enable networked output return of both output and error files to possibly remote machines. 6) Allow mapping of accounts across machine boundaries. 7) Provide friendly configuration and modification mechanisms for each installation. 8) Support status operations across the network, without requiring a user to log in on remote target machines. 9) Provide for file staging or copying of files for movement to the actual execution machine. To support batch and device requests, NQS v.2 implements three queue types--batch, device and pipe. Batch queues hold and prioritize batch requests
A User Study of Public Catalogs: A Queuing Approach.
ERIC Educational Resources Information Center
Sage, Charles; And Others
As a means of studying the present public catalogs and possible catalog format alternatives at the Iowa State University library, a 6-week queuing study was conducted. Objectives of the study were (1) to determine the correlation between other library statistics (e.g., door counts and circulation records) and use of the public catalogs; (2) to…
An application of a queuing model for sea states
NASA Astrophysics Data System (ADS)
Loffredo, L.; Monbaliu, J.; Anderson, C.
2012-04-01
Unimodal approaches in design practice have shown inconsistencies in terms of directionality and limitations for accurate sea-state description. Spectral multimodality needs to be included in the description of the wave climate: it can provide information about the coexistence of different wave systems originating from different meteorological events, such as locally generated wind waves and swell systems from distant storms. A 20-year dataset (1989-2008) for a location in the North Sea (K13, 53.2°N 3.2°E) was retrieved from the ECMWF ERA-Interim re-analysis data archive, providing a consistent and homogeneous dataset. The work focuses on the joint and conditional probability distributions of wind-sea and swell systems. For marine operations and design applications, critical combinations of wave systems may exist. We define a critical sea state on the basis of a set of thresholds, which are not necessarily extreme; the emphasis is on combinations of different wave systems that are dangerous for certain operations (i.e. small-vessel navigation, dredging). The distribution of non-operability windows is described by a point-process model with random and independent events, whose occurrences and lengths can be described only probabilistically. These characteristics allow the emerging patterns to be treated as part of a queuing system. According to this theory, generally adopted for several applications including traffic flows and waiting lines, the input process describes the sequence of requests for a service and the service mechanism the length of time that these requests will occupy the facilities. For weather-driven processes at sea, an alternating renewal process appears to be a suitable model: a sequence of critical events (periods of inoperability), each of random duration, separated by calms, also of random duration. Inoperability periods and calms are assumed independent. In this model it is not possible more than one critical
Modified weighted fair queuing for packet scheduling in mobile WiMAX networks
NASA Astrophysics Data System (ADS)
Satrya, Gandeva B.; Brotoharsono, Tri
2013-03-01
The increase in user mobility and the need for data access anytime also increase interest in broadband wireless access (BWA). IEEE 802.16e-based systems aim to assure the best available quality of experience for mobile data service users. The main problem in assuring a high QoS level is how to allocate available resources among users in order to meet QoS requirements for criteria such as delay, throughput, packet loss and fairness. The IEEE standards specify no particular scheduling mechanism, leaving it open for implementer differentiation. IEEE 802.16 defines five QoS service classes: Unsolicited Grant Service (UGS), Extended Real-Time Polling Service (ertPS), Real-Time Polling Service (rtPS), Non-Real-Time Polling Service (nrtPS) and Best Effort (BE). Each class has different QoS parameter requirements for throughput and delay/jitter constraints. This paper proposes a Modified Weighted Fair Queuing (MWFQ) scheduling scheme based on Weighted Round Robin (WRR) and Weighted Fair Queuing (WFQ). The performance of MWFQ was assessed using the QoS criteria above. The simulation shows that using the concept of total-packet-size calculation improves the network's performance.
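The core of WFQ, on which such weighted variants build, is the virtual finish time: each packet is stamped with F = max(V, F_prev(flow)) + size/weight, and the packet with the smallest stamp is sent first. A simplified sketch follows; the flow names and weights are illustrative, the virtual-clock update is deliberately naive, and this is not the paper's MWFQ algorithm:

```python
import heapq

class WeightedFairQueue:
    """Minimal WFQ sketch: per-flow virtual finish times, smallest-first."""
    def __init__(self):
        self.virtual_time = 0.0
        self.last_finish = {}          # flow id -> last finish time
        self.heap = []                 # (finish_time, seq, flow, size)
        self.seq = 0                   # tiebreaker for equal finish times

    def enqueue(self, flow, size, weight):
        start = max(self.virtual_time, self.last_finish.get(flow, 0.0))
        finish = start + size / weight          # heavier weight finishes sooner
        self.last_finish[flow] = finish
        heapq.heappush(self.heap, (finish, self.seq, flow, size))
        self.seq += 1

    def dequeue(self):
        finish, _, flow, size = heapq.heappop(self.heap)
        self.virtual_time = finish              # simplified clock advance
        return flow, size

wfq = WeightedFairQueue()
wfq.enqueue("rtPS", 500, weight=4)   # 802.16 class names used as flow labels
wfq.enqueue("BE", 500, weight=1)
first_flow, _ = wfq.dequeue()        # rtPS wins: 500/4 < 500/1
```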
Using multi-class queuing network to solve performance models of e-business sites.
Zheng, Xiao-ying; Chen, De-ren
2004-01-01
Because e-business serves a variety of customers with different navigational patterns and demands, the multi-class queuing network is a natural performance model for it. Open multi-class queuing network (QN) models are based on the assumption that no service center is saturated as a result of the combined loads of all the classes. Several formulas are used to calculate performance measures, including throughput, residence time, queue length, response time and the average number of requests. The solution technique for closed multi-class QN models is an approximate mean value analysis (MVA) algorithm based on three key equations, because the exact algorithm has prohibitive time and space requirements. Since mixed multi-class QN models include both open and closed classes, the open classes should be eliminated to create a closed multi-class QN so that the closed-model algorithm can be applied. Corresponding examples are given to show how to apply the algorithms mentioned in this article. These examples indicate that the multi-class QN is a reasonably accurate model of e-business and can be solved efficiently. PMID:14663849
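For the open case, the product-form formulas are direct to evaluate once the no-saturation assumption is checked. A sketch with hypothetical per-class demands (two customer classes, two service centers; the rates and demands are invented for illustration):

```python
def open_multiclass_qn(lams, demands):
    """Open multi-class product-form queuing network.
    lams[r]      : arrival rate of class r
    demands[i][r]: service demand of class r at queueing center i."""
    util = [sum(l * d for l, d in zip(lams, row)) for row in demands]
    assert all(u < 1 for u in util), "model invalid: a center is saturated"
    resid = [[d / (1 - u) for d in row] for row, u in zip(demands, util)]
    qlen = [[l * r for l, r in zip(lams, row)] for row in resid]   # Little's law
    resp = [sum(row[r] for row in resid) for r in range(len(lams))]
    return util, resid, qlen, resp

# Hypothetical e-business site: class 0 = browsers, class 1 = buyers;
# center 0 = web server, center 1 = database server.
util, resid, qlen, resp = open_multiclass_qn(
    lams=[5.0, 1.0],
    demands=[[0.08, 0.12], [0.05, 0.30]])
```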
Queuing of concurrent movement plans by basal ganglia.
Bhutani, Neha; Sureshbabu, Ramakrishnan; Farooqui, Ausaf A; Behari, Madhuri; Goyal, Vinay; Murthy, Aditya
2013-06-12
How the brain converts parallel representations of movement goals into sequential movements is not known. We tested the role of basal ganglia (BG) in the temporal control of movement sequences by a convergent approach involving inactivation of the BG by muscimol injections into the caudate nucleus of monkeys and assessing behavior of Parkinson's disease patients, performing a modified double-step saccade task. We tested a critical prediction of a class of competitive queuing models that explains serial behavior as the outcome of a selection of concurrently activated goals. In congruence with these models, we found that inactivation or impairment of the BG unmasked the parallel nature of goal representations such that a significantly greater extent of averaged saccades, curved saccades, and saccade sequence errors were observed. These results suggest that the BG perform a form of competitive queuing, holding the second movement plan in abeyance while the first movement is being executed, allowing the proper temporal control of movement sequences. PMID:23761894
Time-varying priority queuing models for human dynamics.
Jo, Hang-Hyun; Pan, Raj Kumar; Kaski, Kimmo
2012-06-01
Queuing models provide insight into the temporal inhomogeneity of human dynamics, characterized by the broad distribution of waiting times of individuals performing tasks. We theoretically study the queuing model of an agent trying to execute a task of interest, the priority of which may vary with time due to the agent's "state of mind." However, its execution is disrupted by other tasks of random priorities. By considering the priority of the task of interest either decreasing or increasing algebraically in time, we analytically obtain and numerically confirm the bimodal and unimodal waiting time distributions with power-law decaying tails, respectively. These results are also compared to the updating time distribution of papers in arXiv.org and the processing time distribution of papers in Physical Review journals. Our analysis helps to understand human task execution in a more realistic scenario. PMID:23005156
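The model above is solved analytically in the paper; the following Monte Carlo sketch only illustrates the setup, with an assumed priority form x(t) = 0.5 t^(-alpha) and a single uniform-random-priority competing task per time step (both choices are illustrative, not the authors'):

```python
import random

def waiting_times(alpha, trials=1000, t_max=5000, seed=42):
    """Monte Carlo sketch of one task whose execution priority decays
    algebraically, x(t) = 0.5 * t**(-alpha). At each step the task is
    executed only if its priority beats a competing task of uniform
    random priority. Returns one waiting time per trial, censored at
    t_max if the task is never executed within the window."""
    rng = random.Random(seed)
    waits = []
    for _ in range(trials):
        for t in range(1, t_max + 1):
            if 0.5 * t ** (-alpha) > rng.random():
                break               # task finally executed at time t
        waits.append(t)
    return waits

w_fast_decay = waiting_times(alpha=2.0)   # priority fades quickly
w_slow_decay = waiting_times(alpha=0.5)   # priority fades slowly
```

A quickly decaying priority leaves a heavy tail of very long (here censored) waits, while a slowly decaying priority lets the task execute early, loosely echoing the contrast between the distribution shapes described above.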
Using a segregation measure for the workload characterization of multi-class queuing networks
Dowdy, L.W.; Krantz, A.T.; Leuze, M.R.
1989-01-01
When a queuing network model of a computer system is constructed, the workload is characterized by several parameters known as the device demands. The demand that every customer places upon every device must be specified. A complete workload characterization of a K device network with N different customers contains N * K parameters. Substantial savings in complexity result if the number of workload parameters is decreased. If, for example, only the average demands on the K devices are used in the workload characterization, the overhead of parameter collection is reduced, and the solution of the queuing network model is simplified. With this approach, however, the multi-class system is represented by a single-class model. A loss of accuracy results. It has been recently demonstrated that the performance of a multi-class network is bounded below by its single-class counterpart model and is bounded above by a simple function based upon the single-class model. In this paper, a new workload characterization technique is proposed which requires: the K average device demands for the single-class counterpart model and a segregation measure, a value which indicates the degree to which different customers tend to utilize different parts of the network. The segregation measure specifies the point between the two bounds where the multi-class model's performance lies. This measure is quite intuitive and is simple to calculate. The technique provides an accurate estimate of the performance of a multi-class network. 6 refs., 5 figs., 3 tabs.
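As a sketch of the proposed characterization (the paper's exact bounding function and segregation formula are not reproduced here), the multi-class workload is collapsed to population-weighted average demands, and the performance estimate is placed between the single-class bounds; the linear interpolation below is an illustrative assumption:

```python
def averaged_demands(class_demands, class_populations):
    """Collapse a multi-class workload into single-class average demands,
    weighting each class's per-device demand by its population share."""
    total = sum(class_populations)
    n_devices = len(class_demands[0])
    return [
        sum(pop * d[j] for pop, d in zip(class_populations, class_demands)) / total
        for j in range(n_devices)
    ]

def interpolate_performance(lower, upper, segregation):
    """Place the multi-class estimate between the single-class bounds:
    segregation = 0 -> classes fully share devices (lower bound),
    segregation = 1 -> classes use disjoint devices (upper bound).
    Linear interpolation is an illustrative assumption."""
    assert 0.0 <= segregation <= 1.0
    return lower + segregation * (upper - lower)

# Two classes, two devices; populations 10 and 30 (invented numbers).
d_avg = averaged_demands([[0.3, 0.1], [0.1, 0.4]], [10, 30])
est = interpolate_performance(lower=2.0, upper=3.5, segregation=0.4)
```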
Wang, Jie; Cui, Kai; Zhou, Kuanjiu; Yu, Yanshuo
2014-01-01
Due to the limited resources of wireless sensor networks, the low efficiency of real-time communication scheduling, poor safety defects, and so forth, a queuing performance evaluation approach based on regular expression matching is proposed. The method consists of a matching preprocessing phase, a validation phase, and a queuing-model performance evaluation phase. First, the subset of related sequences is generated in the preprocessing phase, guiding distributed matching in the validation phase. Second, in the validation phase, the feature subsets are clustered, and a compressed matching table makes distributed parallel matching more convenient. Finally, based on the queuing model, the dynamic task-scheduling performance of the sensor network is evaluated. Experiments show that the approach ensures accurate matching and a computational efficiency of more than 70%; it not only effectively detects data packets and access control, but also uses the queuing method to determine the task-scheduling parameters of wireless sensor networks. The method has good applicability for medium-scale or large-scale distributed wireless nodes. PMID:25401151
ERIC Educational Resources Information Center
Rogers, Patricia J.; Weiss, Carol H.
2007-01-01
This chapter begins with a brief introduction by Rogers, in which she highlights the continued salience of Carol Weiss's decade-old questions about theory-based evaluation. Theory-based evaluation has developed significantly since Carol Weiss's chapter was first published ten years ago. In 1997 Weiss pointed to theory-based evaluation being mostly…
NAS Requirements Checklist for Job Queuing/Scheduling Software
NASA Technical Reports Server (NTRS)
Jones, James Patton
1996-01-01
The increasing reliability of parallel systems and clusters of computers has resulted in these systems becoming more attractive for true production workloads. Today, the primary obstacle to production use of clusters of computers is the lack of a functional and robust Job Management System for parallel applications. This document provides a checklist of NAS requirements for job queuing and scheduling in order to make most efficient use of parallel systems and clusters for parallel applications. Future requirements are also identified to assist software vendors with design planning.
A message-queuing framework for STAR's online monitoring and metadata collection
Arkhipkin, D.; Lauret, J.; Betts, W.
2011-12-23
We report our experience migrating STAR's Online Services (Run Control System, Data Acquisition System, Slow Control System and Subsystem Monitoring) from direct read/write database accesses to a modern non-blocking message-oriented infrastructure. Based on the Advanced Message Queuing Protocol (AMQP) and related standards, this novel approach does not specify the message data structure, allowing great flexibility in its use. After careful consideration, we chose Google Protocol Buffers as our primary (de)serialization format for structured data exchange. This migration allows us to reduce overall system complexity and to greatly improve the reliability of the metadata collection and the performance of our online services in general. We present this new framework through an overview of its software architecture, providing details of our staged, non-disruptive migration process as well as the implementation of pluggable components that allow future improvements without compromising the stability and availability of services.
Final Report for "Queuing Network Models of Performance of High End Computing Systems"
Buckwalter, J
2005-09-28
The primary objective of this project is to perform general research into queuing network models of performance of high end computing systems. A related objective is to investigate and predict how an increase in the number of nodes of a supercomputer will decrease the running time of a user's software package, which is often referred to as the strong scaling problem. We investigate the large, MPI-based Linux cluster MCR at LLNL, running the well-known NAS Parallel Benchmark (NPB) applications. Data is collected directly from NPB and also from the low-overhead LLNL profiling tool mpiP. For a run, we break the wall clock execution time of the benchmark into four components: switch delay, MPI contention time, MPI service time, and non-MPI computation time. Switch delay is estimated from message statistics. MPI service time and non-MPI computation time are calculated directly from measurement data. MPI contention is estimated by means of a queuing network model (QNM), based in part on MPI service time. This model of execution time validates reasonably well against the measured execution time, usually within 10%. Since the number of nodes used to run the application is a major input to the model, we can use the model to predict application execution times for various numbers of nodes. We also investigate how the four components of execution time scale individually as the number of nodes increases. Switch delay and MPI service time scale regularly. MPI contention is estimated by the QNM submodel and also has a fairly regular pattern. However, non-MPI compute time has a somewhat irregular pattern, possibly due to caching effects in the memory hierarchy. In contrast to some other performance modeling methods, this method is relatively fast to set up, fast to calculate, simple for data collection, and yet accurate enough to be quite useful.
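The four-component decomposition can be written as a simple additive model. The sketch below assumes ideally divisible compute work and per-node message counts, and takes the contention term as a given input rather than deriving it from a queuing network model; all numbers are invented:

```python
def execution_time(n_nodes, total_compute, msgs_per_node,
                   switch_delay, mpi_service, contention):
    """Hedged sketch of the report's decomposition:
    wall clock = switch delay + MPI contention + MPI service
               + non-MPI compute.
    Assumes compute work divides evenly across nodes (ideal strong
    scaling) and that the contention estimate is supplied externally."""
    compute = total_compute / n_nodes
    comm = msgs_per_node * (switch_delay + mpi_service) + contention
    return compute + comm

# Invented workload: 800 s of serial compute, 100 messages per node.
t8 = execution_time(8, total_compute=800.0, msgs_per_node=100,
                    switch_delay=0.02, mpi_service=0.05, contention=3.0)
t16 = execution_time(16, 800.0, 100, 0.02, 0.05, 3.0)
```

Because only the compute term shrinks with node count here, the model reproduces the diminishing returns characteristic of strong scaling.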
NASA Astrophysics Data System (ADS)
Duan, Haoran
1997-12-01
This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to building high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other less-loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Using Field Programmable Gate Array (FPGA) devices, SRAM modules, and integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves the output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a
Basing quantum theory on information processing
NASA Astrophysics Data System (ADS)
Barnum, Howard
2008-03-01
I consider information-based derivations of the quantum formalism, in a framework encompassing quantum and classical theory and a broad spectrum of theories serving as foils to them. The most ambitious hope for such a derivation is a role analogous to Einstein's development of the dynamics and kinetics of macroscopic bodies, and later of their gravitational interactions, on the basis of simple principles with clear operational meanings and experimental consequences. Short of this, it could still provide a principled understanding of the features of quantum mechanics that account for its greater-than-classical information-processing power, helping guide the search for new quantum algorithms and protocols. I summarize the convex operational framework for theories, and discuss information-processing in theories therein. Results include the fact that information that can be obtained without disturbance is inherently classical, generalized no-cloning and no-broadcasting theorems, exponentially secure bit commitment in all non-classical theories without entanglement, properties of theories that allow teleportation, and properties of theories that allow "remote steering" of ensembles using entanglement. Joint work with collaborators including Jonathan Barrett, Matthew Leifer, Alexander Wilce, Oscar Dahlsten, and Ben Toner.
Jigsaw Cooperative Learning: Acid-Base Theories
ERIC Educational Resources Information Center
Tarhan, Leman; Sesen, Burcin Acar
2012-01-01
This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…
Evaluation of Job Queuing/Scheduling Software: Phase I Report
NASA Technical Reports Server (NTRS)
Jones, James Patton
1996-01-01
The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, the Numerical Aerodynamic Simulation (NAS) supercomputer facility compiled a requirements checklist for job queuing/scheduling software. Next, NAS began an evaluation of the leading job management system (JMS) software packages against the checklist. This report describes the three-phase evaluation process and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still insufficient, even in the leading JMSs. However, by ranking each JMS evaluated against the requirements, we provide data that will be useful to other sites in selecting a JMS.
Second Evaluation of Job Queuing/Scheduling Software. Phase 1
NASA Technical Reports Server (NTRS)
Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)
1997-01-01
The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.
Queuing model of a traffic bottleneck with bimodal arrival rate
NASA Astrophysics Data System (ADS)
Woelki, Marko
2016-06-01
This paper revisits the problem of tuning the density in a traffic bottleneck by reduction of the arrival rate when the queue length exceeds a certain threshold, studied recently for variants of totally asymmetric simple exclusion process (TASEP) and Burgers equation. In the present approach, a simple finite queuing system is considered and its contrasting “phase diagram” is derived. One can observe one jammed region, one low-density region and one region where the queue length is equilibrated around the threshold. Despite the simplicity of the model the physics is in accordance with the previous approach: The density is tuned at the threshold if the exit rate lies in between the two arrival rates.
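The threshold mechanism can be reproduced as a birth-death chain whose arrival rate switches once the queue reaches the threshold. A minimal sketch with invented rates (arrival rate above the exit rate below threshold, below it above), showing the queue equilibrating near the threshold:

```python
def stationary_queue(lam_high, lam_low, mu, threshold, capacity):
    """Stationary distribution of a finite birth-death queue whose
    arrival rate drops from lam_high to lam_low once the queue length
    reaches `threshold`; the exit (service) rate is mu throughout.
    Uses the detailed-balance product form p[n+1] = p[n] * lam(n)/mu."""
    p = [1.0]
    for n in range(capacity):
        lam = lam_high if n < threshold else lam_low
        p.append(p[-1] * lam / mu)
    z = sum(p)
    return [x / z for x in p]

# Invented rates: arrivals exceed service below the threshold, not above.
p = stationary_queue(lam_high=1.2, lam_low=0.4, mu=1.0,
                     threshold=10, capacity=40)
mean_q = sum(n * pn for n, pn in enumerate(p))
```

The distribution rises geometrically up to the threshold and falls sharply beyond it, so the mode sits at the threshold and the mean just below it, consistent with the equilibrated region described above.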
Computer Simulation of a Queuing System in a Mathematical Modeling Course.
ERIC Educational Resources Information Center
Eyob, Ephrem
1990-01-01
The results of a simulation model of a queuing system are reported. Use in an introductory quantitative analysis course to enhance students' computer and quantitative modeling knowledge is described. (CW)
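A queuing simulation suitable for such a course can be as short as Lindley's recursion for the M/M/1 waiting time, checked against the closed-form mean; the rates below are illustrative:

```python
import random

def simulate_mm1(lam, mu, n_customers, seed=0):
    """Sketch of an M/M/1 queue via Lindley's recursion:
    W_{n+1} = max(0, W_n + S_n - A_{n+1}), with exponential
    interarrival times (rate lam) and service times (rate mu).
    Returns the waiting time in queue of each customer."""
    rng = random.Random(seed)
    wait, waits = 0.0, []
    for _ in range(n_customers):
        waits.append(wait)
        service = rng.expovariate(mu)
        interarrival = rng.expovariate(lam)
        wait = max(0.0, wait + service - interarrival)
    return waits

waits = simulate_mm1(lam=0.8, mu=1.0, n_customers=50_000)
mean_wait = sum(waits) / len(waits)
# Theory: mean wait in queue Wq = rho / (mu - lam) = 0.8 / 0.2 = 4.0
```

With 50,000 simulated customers the empirical mean lands near the theoretical value of 4.0, which makes a nice classroom comparison of simulation against analysis.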
Theory-based telehealth and patient empowerment.
Suter, Paula; Suter, W Newton; Johnston, Donna
2011-04-01
Health care technology holds great potential to improve the quality of health care delivery. One effective technology is remote patient monitoring, whereby patient data, such as vital signs or symptom reports, are captured from home monitoring devices and transmitted to health care professionals for review. The use of remote patient monitoring, often referred to as telehealth, has been widely adopted by health care providers, particularly home care agencies. Most agencies have invested in telehealth to facilitate the early identification of disease exacerbation, particularly for patients with chronic diseases such as heart failure and diabetes. This technology has been successfully harnessed by agencies to reduce rehospitalization rates through remote data interpretation and the provision of timely interventions. We propose that the use of telehealth by home care agencies and other health care providers be expanded to empower patients and promote disease self-management with resultant improved health care outcomes. This article describes how remote monitoring, in combination with the application of salient adult learning and cognitive behavioral theories and applied to telehealth care delivery and practice, can promote improved patient self-efficacy with disease management. We present theories applicable for improving health-related behaviors and illustrate how theory-based practices can be implemented in the field of home care. Home care teams that deliver theory-based telehealth function as valuable partners to physicians and hospitals in an integrated health care delivery system. PMID:21241182
The scope of usage-based theory.
Ibbotson, Paul
2013-01-01
Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the "cognitive commitment" of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works. PMID:23658552
Spectrally queued feature selection for robotic visual odometry
NASA Astrophysics Data System (ADS)
Pirozzo, David M.; Frederick, Philip A.; Hunt, Shawn; Theisen, Bernard; Del Rose, Mike
2011-01-01
Over the last two decades, research in Unmanned Vehicles (UV) has rapidly progressed and become more influenced by the field of biological sciences. Researchers have been investigating mechanical aspects of varying species to improve UV air and ground intrinsic mobility, exploring the computational aspects of the brain for the development of pattern recognition and decision algorithms, and exploring the perception capabilities of numerous animals and insects. This paper describes a 3-month exploratory applied research effort performed at the US Army Research, Development and Engineering Command's (RDECOM) Tank Automotive Research, Development and Engineering Center (TARDEC) in the area of biologically inspired spectrally augmented feature selection for robotic visual odometry. The motivation for this applied research was to develop a feasibility analysis of multi-spectrally queued feature selection, with improved temporal stability, for the purposes of visual odometry. The intended application is future semi-autonomous Unmanned Ground Vehicle (UGV) control, as the richness of data sets required to enable human-like behavior in these systems has yet to be defined.
Theory-based categorization under speeded conditions
Luhmann, Christian C.; Ahn, Woo-Kyoung; Palmeri, Thomas J.
2009-01-01
It is widely accepted that similarity influences rapid categorization, whereas theories can influence only more leisurely category judgments. In contrast, we argue that it is not the type of knowledge used that determines categorization speed, but rather the complexity of the categorization processes. In two experiments, participants learned four categories of items, each consisting of three causally related features. Participants gave more weight to cause features than to effect features, even under speeded response conditions. Furthermore, the time required to make judgments was equivalent, regardless of whether participants were using causal knowledge or base-rate information. We argue that both causal knowledge and base-rate information, once precompiled during learning, can be used at roughly the same speeds during categorization, thus demonstrating an important parallel between these two types of knowledge. PMID:17128608
NASA Astrophysics Data System (ADS)
Kunwar, Bharat; Simini, Filippo; Johansson, Anders
2016-02-01
Estimating city evacuation time is a nontrivial problem due to the interaction between thousands of individual agents, giving rise to various collective phenomena, such as bottleneck formation, intermittent flow, and stop-and-go waves. We present a mean field approach to draw relationships between road network spatial attributes, the number of evacuees, and the resultant evacuation time estimate (ETE). Using volunteered geographic information, we divide 50 United Kingdom cities into a total of 704 catchment areas (CAs) which we define as an area where all agents share the same nearest exit node. 90% of the agents are within ≈6,847 m of CA exit nodes with ≈13,778 agents/CA. We establish a characteristic flow rate from catchment area attributes (population, distance to exit node, and exit node width) and a mean flow rate in a free-flow regime by simulating total evacuations using an agent based "queuing network" model. We use these variables to determine a relationship between catchment area attributes and resultant ETEs. This relationship could enable emergency planners to make a rapid appraisal of evacuation strategies and help support decisions in the run up to a crisis.
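The ETE relationship itself is fitted in the paper; the sketch below only illustrates the ingredients (population, distance to exit, exit width) with a naive walk-then-discharge model. The walking speed and specific flow constants are generic pedestrian-dynamics defaults, not the paper's fitted values:

```python
def evacuation_time_estimate(population, mean_distance, exit_width,
                             walking_speed=1.34, flow_per_metre=1.3):
    """Rough ETE for one catchment area: agents walk to the exit node,
    then file through it at a width-limited rate.

    walking_speed in m/s and flow_per_metre in persons/m/s are
    illustrative defaults; exit_width in metres."""
    travel = mean_distance / walking_speed               # reach the exit
    discharge = population / (flow_per_metre * exit_width)  # queue at exit
    return travel + discharge

# Invented CA roughly matching the reported averages above.
ete = evacuation_time_estimate(population=13_778,
                               mean_distance=6_847,
                               exit_width=7.0)
```

Even this crude split shows why both catchment geometry (the travel term) and exit capacity (the discharge term) must enter any ETE relationship.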
Modelling pedestrian travel time and the design of facilities: a queuing approach.
Rahman, Khalidur; Ghani, Noraida Abdul; Kamil, Anton Abdulbasah; Mustafa, Adli; Kabir Chowdhury, Md Ahmed
2013-01-01
Pedestrian movements are the consequence of several complex and stochastic facts. The modelling of pedestrian movements and the ability to predict the travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and the level of service into models of pedestrian movement. In this paper, a queuing-based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service that it is possible to provide the pedestrians. It also has the ability to provide clear suggestions on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility enables comfortable and convenient pedestrian movement. A sensitivity analysis is also performed to examine the impact of some crucial parameters of the developed model on the performance of pedestrian facilities. PMID:23691055
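The paper's model is more detailed, but the core idea of adding a width-dependent queuing delay to free-walking time can be sketched with an M/M/1 server whose service rate scales with facility width; all constants below are invented:

```python
def facility_travel_time(length_m, width_m, free_speed,
                         arrival_rate, capacity_per_m):
    """M/M/1-style sketch of pedestrian travel time on a facility:
    free-walking time plus the mean sojourn time at a bottleneck whose
    service rate grows with width. Rates are persons/second; the
    capacity constant is illustrative, not the paper's calibration."""
    mu = capacity_per_m * width_m            # facility service rate
    assert arrival_rate < mu, "demand exceeds facility capacity"
    delay = 1.0 / (mu - arrival_rate)        # mean M/M/1 sojourn time
    return length_m / free_speed + delay

# Same 50 m walkway at two widths, same invented demand.
t_narrow = facility_travel_time(50.0, 2.0, 1.2,
                                arrival_rate=2.0, capacity_per_m=1.3)
t_wide = facility_travel_time(50.0, 4.0, 1.2,
                              arrival_rate=2.0, capacity_per_m=1.3)
```

Widening the facility shrinks only the queuing term, which is the kind of design-and-sizing trade-off the model above is built to expose.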
Theoretical description of metabolism using queueing theory.
Evstigneev, Vladyslav P; Holyavka, Marina G; Khrapatiy, Sergii V; Evstigneev, Maxim P
2014-09-01
A theoretical description of the process of metabolism has been developed on the basis of the Pachinko model (see Nicholson and Wilson in Nat Rev Drug Discov 2:668-676, 2003) and the queueing theory. The suggested approach relies on the probabilistic nature of the metabolic events and the Poisson distribution of the incoming flow of substrate molecules. The main focus of the work is an output flow of metabolites or the effectiveness of metabolism process. Two simplest models have been analyzed: short- and long-living complexes of the source molecules with a metabolizing point (Hole) without queuing. It has been concluded that the approach based on queueing theory enables a very broad range of metabolic events to be described theoretically from a single probabilistic point of view. PMID:25142745
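For the short-lived-complex case without queuing, the metabolizing point behaves like a single-server loss system: substrate arriving while the site is occupied is lost, and the output flow is the arrival rate times the probability of finding the site free. A sketch under that M/M/1/1 (Erlang-loss) assumption, with invented rates:

```python
def metabolite_output_rate(arrival_rate, processing_rate):
    """Output flow of a single metabolizing site with no queuing,
    modelled as an M/M/1/1 loss system: substrate molecules arriving
    while the site is occupied are lost, so
    throughput = arrival_rate * (1 - blocking probability)."""
    a = arrival_rate / processing_rate       # offered load
    blocking = a / (1 + a)                   # Erlang B with one server
    return arrival_rate * (1 - blocking)

# Invented Poisson substrate inflow of 5/s against a 2/s site.
out = metabolite_output_rate(arrival_rate=5.0, processing_rate=2.0)
```

The closed form reduces to lam*mu/(lam+mu), so the output flow saturates at the processing rate as the substrate inflow grows, which is the qualitative behaviour an "effectiveness of metabolism" measure should capture.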
Feature-Based Binding and Phase Theory
ERIC Educational Resources Information Center
Antonenko, Andrei
2012-01-01
Current theories of binding cannot provide a uniform account for many facts associated with the distribution of anaphors, such as long-distance binding effects and the subject-orientation of monomorphemic anaphors. Further, traditional binding theory is incompatible with minimalist assumptions. In this dissertation I propose an analysis of…
Theory Based Approaches to Learning. Implications for Adult Educators.
ERIC Educational Resources Information Center
Bolton, Elizabeth B.; Jones, Edward V.
This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…
Teaching a Theory-Based Sociology of Gender Course.
ERIC Educational Resources Information Center
Blee, Kathleen M.
1986-01-01
Presents the advantages of structuring a course on gender around the classical and contemporary sociological theories from which feminist theories are derived. Contrasts this approach against the more prevalent research-based approach. Provides suggestions on selecting and using theory and presents practical considerations in teaching such a…
Impact of Mandatory HIV Screening in the Emergency Department: A Queuing Study.
Liu, Nan; Stone, Patricia W; Schnall, Rebecca
2016-04-01
To improve HIV screening rates, New York State in 2010 mandated that all persons 13-64 years receiving health care services, including care in emergency departments (EDs), be offered HIV testing. Little attention has been paid to the effect of screening on patient flow. Time-stamped ED visit data from patients eligible for HIV screening, 7,844 of whom were seen by providers and 767 of whom left before being seen, were retrieved from electronic health records in one adult ED. During day shifts, 10% of patients left without being seen, and during evening shifts, 5% left without being seen. All patients seen by providers were offered testing, and 6% were tested for HIV. Queuing models were developed to evaluate the effect of HIV screening on ED length of stay, patient waiting time, and rate of leaving without being seen. Base case analysis was conducted using actual testing rates, and sensitivity analyses were conducted to evaluate the impact of increasing the testing rate. Length of ED stay of patients who received HIV tests was 24 minutes longer on day shifts and 104 minutes longer on evening shifts than for patients not tested for HIV. Increases in HIV testing rate were estimated to increase waiting time for all patients, including those who left without being seen. Our simulation suggested that incorporating HIV testing into ED patient visits not only adds to practitioner workload but also increases patient waiting time significantly during busy shifts, which may increase the rate of leaving without being seen. PMID:26829415
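A rough illustration of why a small increase in per-patient service time inflates waits disproportionately on busy shifts: the study fits queuing models to real ED data, but the single-server M/M/1 sketch below, with entirely hypothetical arrival rates and visit lengths, shows the same nonlinearity:

```python
def mm1_wait(lam, mean_service):
    """Mean queueing delay Wq (excluding service) in an M/M/1 queue:
    Wq = rho / (mu - lam), with mu = 1/mean_service.
    Illustrative only -- the study builds richer models from
    time-stamped ED visit data."""
    mu = 1.0 / mean_service
    assert lam < mu, "queue must be stable"
    rho = lam / mu
    return rho / (mu - lam)

# Hypothetical numbers: 4 arrivals/hour, 12-minute visits,
# plus 2 extra minutes per visit for HIV testing.
base = mm1_wait(4.0, 12 / 60)       # no HIV test: 0.8 h mean wait
with_test = mm1_wait(4.0, 14 / 60)  # every visit 2 minutes longer
```

With utilization already near capacity, two extra minutes of service per patient multiplies the mean wait severalfold, consistent with the paper's finding that testing hurts most during busy shifts.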
Theory-Based University Admissions Testing for a New Millennium
ERIC Educational Resources Information Center
Sternberg, Robert J.
2004-01-01
This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…
Recursive renormalization group theory based subgrid modeling
NASA Technical Reports Server (NTRS)
Zhou, YE
1991-01-01
Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.
Theory of fracture mechanics based upon plasticity
NASA Technical Reports Server (NTRS)
Lee, J. D.
1976-01-01
A theory of fracture mechanics is formulated on the foundation of continuum mechanics. Fracture surface is introduced as an unknown quantity and is incorporated into boundary and initial conditions. Surface energy is included in the global form of energy conservation law and the dissipative mechanism is formulated into constitutive equations which indicate the thermodynamic irreversibility and the irreversibility of fracture process as well.
Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen
2014-01-01
Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms. PMID:24283669
Theory of friction based on brittle fracture
Byerlee, J.D.
1967-01-01
A theory of friction is presented that may be more applicable to geologic materials than the classic Bowden and Tabor theory. In the model, surfaces touch at the peaks of asperities and sliding occurs when the asperities fail by brittle fracture. The coefficient of friction, μ, was calculated from the strength of asperities of certain ideal shapes; for cone-shaped asperities, μ is about 0.1 and for wedge-shaped asperities, μ is about 0.15. For actual situations which seem close to the ideal model, observed μ was found to be very close to 0.1, even for materials such as quartz and calcite with widely differing strengths. If surface forces are present, the theory predicts that μ should decrease with load and that it should be higher in a vacuum than in air. In the presence of a fluid film between sliding surfaces, μ should depend on the area of the surfaces in contact. Both effects are observed. The character of wear particles produced during sliding and the way in which μ depends on normal load, roughness, and environment lend further support to the model of friction presented here. © 1967 The American Institute of Physics.
Maximum entropy principle based estimation of performance distribution in queueing theory.
He, Dayi; Li, Ran; Huang, Qi; Lei, Ping
2014-01-01
In related research on queuing systems, in order to determine the system state, it is widespread practice to assume that the system is stable and that the distributions of the customer arrival ratio and service ratio are known. In this study, the queuing system is treated as a black box, without any assumptions on the distributions of the arrival and service ratios, keeping only the assumption that the queuing system is stable. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from some easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness-of-fit test, the accuracy and practical generality of the maximum entropy approach are demonstrated. PMID:25207992
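A minimal sketch of the idea, assuming only two of the indexes the abstract names (system capacity N and mean number in system L): maximizing entropy subject to normalization and a fixed mean yields a truncated geometric p_n ∝ x^n, and x can be found by bisection on the implied mean. The function name and parameters are illustrative, not the paper's:

```python
def maxent_queue_dist(capacity, mean_customers, tol=1e-12):
    """Maximum-entropy distribution of the number in system for a
    finite-capacity queue, given only capacity N and mean number L.
    With constraints sum(p)=1 and sum(n*p_n)=L, the entropy maximiser
    is a truncated geometric p_n ∝ x**n; x is found by bisection,
    since the implied mean is increasing in x."""
    N, L = capacity, mean_customers
    assert 0 < L < N, "mean must lie strictly between 0 and capacity"

    def mean_of(x):
        w = [x**n for n in range(N + 1)]
        return sum(n * wn for n, wn in enumerate(w)) / sum(w)

    lo, hi = 1e-9, 1e9          # x=1 gives the uniform case, mean N/2
    while hi - lo > tol * max(1.0, lo):
        mid = (lo + hi) / 2
        if mean_of(mid) < L:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    w = [x**n for n in range(N + 1)]
    s = sum(w)
    return [wn / s for wn in w]

p = maxent_queue_dist(10, 2.0)
```

Adding further constraints (e.g. mean server utilization) would introduce more Lagrange multipliers, but the same bisection-style fitting carries over.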
Graph-based linear scaling electronic structure theory
NASA Astrophysics Data System (ADS)
Niklasson, Anders M. N.; Mniszewski, Susan M.; Negre, Christian F. A.; Cawkwell, Marc J.; Swart, Pieter J.; Mohd-Yusof, Jamal; Germann, Timothy C.; Wall, Michael E.; Bock, Nicolas; Rubensson, Emanuel H.; Djidjev, Hristo
2016-06-01
We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.
MODELING AND PERFORMANCE EVALUATION FOR AVIATION SECURITY CARGO INSPECTION QUEUING SYSTEM
Allgood, Glenn O; Olama, Mohammed M; Rose, Terri A; Brumback, Daryl L
2009-01-01
Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we conduct a performance evaluation study of an aviation security cargo inspection queuing system for material flow and accountability. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, and throughput. These metrics are performance indicators of the system's ability to service current needs and its response capacity to additional requests. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures will reduce the overall cost and shipping delays associated with the new inspection requirements.
Discrete-time Queuing Analysis of Opportunistic Spectrum Access: Single User Case
NASA Astrophysics Data System (ADS)
Wang, Jin-long; Xu, Yu-hua; Gao, Zhan; Wu, Qi-hui
2011-11-01
This article studies the discrete-time queuing dynamics of opportunistic spectrum access (OSA) systems, in which the secondary user seeks spectrum vacancies between bursty transmissions of the primary user to communicate. Since spectrum sensing and data transmission cannot be performed simultaneously, the secondary user employs a sensing-then-transmission strategy to detect the presence of the primary user before accessing the licensed channel. Consequently, the transmission of the secondary user is periodically suspended for spectrum sensing. To capture the discontinuous transmission nature of the secondary user, we introduce a discrete-time queuing model subject to bursty preemption to describe the behavior of the secondary user. Specifically, we derive some important metrics of the secondary user, including secondary spectrum utilization ratio, buffer length, packet delay and packet dropping ratio. Finally, simulation results validate the proposed theoretical model and reveal that the theoretical results fit the simulated results well.
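The sensing-then-transmission behaviour can be mimicked with a toy slotted simulation. Everything below (frame structure, rates, parameter names) is an illustrative assumption, not the paper's analytical model, which derives these metrics in closed form:

```python
import random

def simulate_osa(frames=100_000, sense_slots=1, data_slots=9,
                 p_arrival=0.2, p_primary_busy=0.3, seed=1):
    """Toy discrete-time simulation of a sensing-then-transmission
    secondary user.  Each frame opens with sensing slots, during
    which the secondary user stays silent; in the data slots it
    transmits one queued packet per slot, but only when the primary
    user is idle for that frame (bursty, frame-level preemption).
    Packets arrive Bernoulli(p_arrival) per slot."""
    rng = random.Random(seed)
    queue = sent = slots = 0
    q_area = 0
    for _ in range(frames):
        primary_busy = rng.random() < p_primary_busy
        for slot in range(sense_slots + data_slots):
            slots += 1
            if rng.random() < p_arrival:
                queue += 1
            # Transmission is suspended during sensing and preempted
            # whenever the primary user occupies the channel.
            if slot >= sense_slots and not primary_busy and queue > 0:
                queue -= 1
                sent += 1
            q_area += queue
    return {"throughput": sent / slots, "mean_buffer": q_area / slots}

metrics = simulate_osa()
```

With these parameters the usable capacity is (9/10)·0.7 = 0.63 packet/slot, well above the 0.2 arrival rate, so the queue is stable and long-run throughput tracks the arrival rate; shrinking the data-slot share or raising primary activity pushes the buffer length and delay up, which is exactly the trade-off the queuing analysis quantifies.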
A Multiple Constraint Queuing Model for Predicting Current and Future Terminal Area Capacities
NASA Technical Reports Server (NTRS)
Meyn, Larry A.
2004-01-01
A new queuing model is being developed to evaluate the capacity benefits of several new concepts for terminal airspace operations. The major innovation is the ability to support a wide variety of multiple constraints for modeling the scheduling logic of several concepts. Among the constraints modeled are in-trail separation, separation between aircraft landing on parallel runways, in-trail separation at terminal area entry points, and permissible terminal area flight times.
Theory-Based Approaches to the Concept of Life
ERIC Educational Resources Information Center
El-Hani, Charbel Nino
2008-01-01
In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…
Evaluation Theory in Problem-Based Learning Approach.
ERIC Educational Resources Information Center
Hsu, Yu-chen
The purpose of this paper is to review evaluation theories and techniques in both the medical and educational fields and to propose an evaluation theory to explain the condition variables, the method variables, and the outcome variables of student assessment in a problem-based learning (PBL) approach. The PBL definition and process are presented,…
Task-Based Language Teaching and Expansive Learning Theory
ERIC Educational Resources Information Center
Robertson, Margaret
2014-01-01
Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…
Toward a Theory-Based Approach to Instructional Development.
ERIC Educational Resources Information Center
Merrill, M. David
Instructional development should be based on theory rather than raw empiricism. The dimensions and possible form of an instructional theory are outlined in three premises. It was presumed that a limited set of behavior categories exist and that all behaviors can be classed into one or more of these categories. It was also presumed that for each…
NASA Astrophysics Data System (ADS)
Chowdhury, Prasun; Saha Misra, Iti
2014-10-01
Nowadays, owing to the increased demand on Broadband Wireless Access (BWA) networks, a promised Quality of Service (QoS) is required to manage the seamless transmission of heterogeneous handoff calls. To this end, this paper proposes an improved Call Admission Control (CAC) mechanism with a prioritized handoff queuing scheme that aims to reduce the dropping probability of handoff calls. Handoff calls are queued when no bandwidth is available even after the allowable bandwidth degradation of the ongoing calls, and are admitted into the network when an ongoing call terminates, with a higher priority than newly originated calls. An analytical Markov model for the proposed CAC mechanism is developed to analyze various performance parameters. Analytical results show that our proposed CAC with handoff queuing scheme prioritizes the handoff calls effectively and reduces the dropping probability of the system by 78.57% for real-time traffic without increasing the number of failed new call attempts. This results in increased bandwidth utilization of the network.
Assessing the Queuing Process Using Data Envelopment Analysis: an Application in Health Centres.
Safdar, Komal A; Emrouznejad, Ali; Dey, Prasanta K
2016-01-01
Queuing is one of the very important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely-used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most of the studies regarding patient flow systems had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor, and pharmacy. The DEA results indicated that the waiting times and other related queuing variables need considerable minimisation at both stages. PMID:26558394
Measurement Theory in Deutsch's Algorithm Based on the Truth Values
NASA Astrophysics Data System (ADS)
Nagata, Koji; Nakamura, Tadao
2016-08-01
We propose a new measurement theory for qubit handling, based on the truth values, i.e., the truth T (1) for true and the falsity F (0) for false. The results of measurement are either 0 or 1. To implement Deutsch's algorithm, we need both observability and controllability of a quantum state; the new measurement theory satisfies both. In particular, we systematically support our assertion with a mathematical analysis using raw data from a thought experiment.
Elastic theory of origami-based metamaterials
NASA Astrophysics Data System (ADS)
Brunck, V.; Lechenault, F.; Reid, A.; Adda-Bedia, M.
2016-03-01
Origami offers the possibility for new metamaterials whose overall mechanical properties can be programed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would like to determine the shape taken by the structure at rest and its mechanical response. In this article, we introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n -creases meeting at a single vertex. Computations of the equilibrium states are then carried out explicitly in two special cases: the generalized waterbomb base and the Miura-Ori. For the waterbomb, we show a generic bistability for any number of creases. For the Miura folding, however, we uncover a phase transition from monostable to bistable states that explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. Moreover, the analysis shows that geometric frustration induces residual stresses in origami structures that should be taken into account in determining their mechanical response. This formalism can be extended to a general crease network, ordered or otherwise, and so opens new perspectives for the mechanics and the physics of origami-based metamaterials.
Elastic theory of origami-based metamaterials
NASA Astrophysics Data System (ADS)
Lechenault, Frederic; Brunck, V.; Reid, A.; Adda-Bedia, M.
Origami offers the possibility for new metamaterials whose overall mechanical properties can be programmed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would aim to determine the shape taken by the structure at rest and its mechanical response. We introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n creases meeting at a single vertex, and is then extended to origami tessellations. The generalized waterbomb base and the Miura-Ori are treated within this formalism. For the Miura folding, we uncover a phase transition from monostable to two metastable states, which explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. This research was supported by the ANR Grant 14-CE07-0031 METAMAT.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 23 Highways 1 2010-04-01 2010-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 23 Highways 1 2011-04-01 2011-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 23 Highways 1 2013-04-01 2013-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 23 Highways 1 2014-04-01 2014-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 23 Highways 1 2012-04-01 2012-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...
[Carcinogenesis theory based on estrogen deficiency].
Suba, Zsuzsanna
2009-06-21
Earlier, estrogens were considered simply the most important hormones involved in female physiology and reproduction. Nowadays it is well known that they have pivotal roles in the gene regulation of cell differentiation and proliferation. There are many contradictions concerning the associations of female sexual steroids and cancer. Cancers of the highly estrogen dependent organs are in the forefront of tumors as they are regarded as hormone associated ones. However, re-evaluation of earlier results supporting the carcinogenic capacity of estrogen exhibited many shortcomings and controversies. Recently, the clinical studies on hormone replacement therapy in postmenopausal women justified beneficial anticancer effects in several organs, even in the female breast. The newly revealed association between estrogen deficiency and oral cancer risk also contradicts the traditional concept of estrogen-induced cancer. Distinction between cancers of moderately and highly estrogen dependent tumors can be based on their different epidemiological features. The vast majority of the so-called smoking associated malignancies of the moderately estrogen dependent organs occur typically in the late postmenopausal life of women, when the ovarian estrogen production is fairly decreased. However, cancers of the highly estrogen dependent organs such as breast, endometrium and ovary exhibit both premenopausal and postmenopausal occurrence. In spite of the different epidemiological data of these two groups of cancers, the mechanism of gene regulation disorder in the background of tumor initiation cannot act through quite opposite pathways. This suggests that in moderately estrogen sensitive organs a serious, and in the highly estrogen dependent sites even a mild, estrogen deficiency is enough to provoke gene regulation disorders. The new findings both on smoking associated and hormone related cancers might lead to the same conversion; not estrogen but rather its deficiency may provoke cancer.
On the Complexity of Constraint-Based Theory Extraction
NASA Astrophysics Data System (ADS)
Boley, Mario; Gärtner, Thomas
In this paper we rule out output polynomial listing algorithms for the general problem of discovering theories for a conjunction of monotone and anti-monotone constraints as well as for the particular subproblem in which all constraints are frequency-based. For the general problem we prove a concrete exponential lower time bound that holds for any correct algorithm and even in cases in which the size of the theory as well as the only previous bound are constant. For the case of frequency-based constraints our result holds unless P = NP. These findings motivate further research to identify tractable subproblems and justify approaches with exponential worst case complexity.
Aviation security cargo inspection queuing simulation model for material flow and accountability
Olama, Mohammed M; Allgood, Glenn O; Rose, Terri A; Brumback, Daryl L
2009-01-01
Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we develop an aviation security cargo inspection queuing simulation model for material flow and accountability that will allow cargo managers to conduct impact studies of current and proposed business practices as they relate to inspection procedures, material flow, and accountability.
Field-Based Concerns about Fourth-Generation Evaluation Theory.
ERIC Educational Resources Information Center
Lai, Morris K.
Some aspects of fourth generation evaluation procedures that have been advocated by E. G. Guba and Y. S. Lincoln were examined empirically, with emphasis on areas where there have been discrepancies between theory and field-based experience. In fourth generation evaluation, the product of an evaluation is not a set of conclusions, recommendations,…
A Memory-Based Theory of Verbal Cognition
ERIC Educational Resources Information Center
Dennis, Simon
2005-01-01
The syntagmatic paradigmatic model is a distributed, memory-based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long-term memory and the resolution of these…
A Model of Statistics Performance Based on Achievement Goal Theory.
ERIC Educational Resources Information Center
Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.
2003-01-01
Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…
Theory-Based Considerations Influence the Interpretation of Generic Sentences
ERIC Educational Resources Information Center
Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.
2010-01-01
Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…
A Natural Teaching Method Based on Learning Theory.
ERIC Educational Resources Information Center
Smilkstein, Rita
1991-01-01
The natural teaching method is active and student-centered, based on schema and constructivist theories, and informed by research in neuroplasticity. A schema is a mental picture or understanding of something we have learned. Humans can have knowledge only to the degree to which they have constructed schemas from learning experiences and practice.…
Toward an Instructionally Oriented Theory of Example-Based Learning
ERIC Educational Resources Information Center
Renkl, Alexander
2014-01-01
Learning from examples is a very effective means of initial cognitive skill acquisition. There is an enormous body of research on the specifics of this learning method. This article presents an instructionally oriented theory of example-based learning that integrates theoretical assumptions and findings from three research areas: learning from…
Reasserting Theory in Professionally Based Initial Teacher Education
ERIC Educational Resources Information Center
Hodson, Elaine; Smith, Kim; Brown, Tony
2012-01-01
Conceptions of theory within initial teacher education in England are adjusting to new conditions where most learning how to teach is school-based. Student teachers on a programme situated primarily in an employing school were monitored within a practitioner enquiry by their university programme tutors according to how they progressively…
Project-Based Language Learning: An Activity Theory Analysis
ERIC Educational Resources Information Center
Gibbes, Marina; Carson, Lorna
2014-01-01
This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…
Innovating Method of Existing Mechanical Product Based on TRIZ Theory
NASA Astrophysics Data System (ADS)
Zhao, Cunyou; Shi, Dongyan; Wu, Han
The main ways of product development are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding for the engineering problem, and preliminary design; this establishes the basis for the innovative redesign of existing products.
Qualitative model-based diagnosis using possibility theory
NASA Technical Reports Server (NTRS)
Joslyn, Cliff
1994-01-01
The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.
Congestion at Card and Book Catalogs--A Queuing Theory Approach.
ERIC Educational Resources Information Center
Bookstein, Abraham
The question of whether a library's catalog should consist of cards arranged in a single alphabetical order (the "dictionary catalog") or be segregated into separate files is discussed. The development is extended to encompass related problems involved in the creation of a book catalog. A model to study the effects of congestion at the catalog is…
Congestion at Card and Book Catalogs--A Queuing-Theory Approach
ERIC Educational Resources Information Center
Bookstein, Abraham
1972-01-01
This paper attempts to analyze the problem of congestion, using a mathematical model shown to be of value in other similar applications. Three criteria of congestion are considered, and it is found that the conclusion one can draw is sensitive to which of these criteria is paramount. (8 references) (Author/NH)
Research on the Optimization Method of Maintenance Support Unit Configuration with Queuing Theory
NASA Astrophysics Data System (ADS)
Zhang, Bo; Xu, Ying; Dong, Yue; Hou, Na; Yu, Yongli
Beginning with the concept of the maintenance support unit, the maintenance support flow is analyzed to establish the relation between the mean time to repair damaged equipment and the number of maintenance support units. On that basis, a configuration optimization model for maintenance support units is formulated, aiming at the minimum cost of maintenance support resources, and its solution is given. Finally, the process is illustrated with an example.
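The trade-off this abstract describes, between the number of maintenance support units and the time damaged equipment spends waiting, can be illustrated with a standard M/M/c queuing calculation. This is a sketch with hypothetical failure and repair rates, not the paper's actual model:

```python
from math import factorial

def erlang_c(c, lam, mu):
    """Erlang C: probability that an arriving job must wait in an M/M/c queue."""
    a = lam / mu              # offered load
    rho = a / c               # server utilization; must be < 1 for stability
    assert rho < 1, "unstable queue"
    idle = sum(a**k / factorial(k) for k in range(c))
    busy = a**c / (factorial(c) * (1 - rho))
    return busy / (idle + busy)

def mean_wait(c, lam, mu):
    """Mean time a damaged item waits for a free maintenance unit."""
    return erlang_c(c, lam, mu) / (c * mu - lam)

# Hypothetical rates: 4 failures/day, each unit completes 2 repairs/day.
for c in (3, 4, 5):
    print(c, round(mean_wait(c, lam=4.0, mu=2.0), 4))
```

Plotting `mean_wait` against the cost of fielding `c` units gives the cost-versus-delay curve that a configuration optimization of this kind minimizes.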
Theory-Based Programme Development and Evaluation in Physiotherapy
Kay, Theresa; Klinck, Beth
2008-01-01
Purpose: Programme evaluation has been defined as “the systematic process of collecting credible information for timely decision making about a particular program.” Where possible, findings are used to develop, revise, and improve programmes. Theory-based programme development and evaluation provides a comprehensive approach to programme evaluation. Summary of key points: In order to obtain meaningful information from evaluation activities, relevant programme components need to be understood. Theory-based programme development and evaluation starts with a comprehensive description of the programme. A useful tool to describe a programme is the Sidani and Braden Model of Program Theory, consisting of six programme components: problem definition, critical inputs, mediating factors, expected outcomes, extraneous factors, and implementation issues. Articulation of these key components may guide physiotherapy programme implementation and delivery and assist in the development of key evaluation questions and methodologies. Using this approach leads to a better understanding of client needs, programme processes, and programme outcomes and can help to identify barriers to and enablers of successful implementation. Two specific examples, representing public and private sectors, will illustrate the application of this approach to clinical practice. Conclusions: Theory-based programme development helps clinicians, administrators, and researchers develop an understanding of who benefits the most from which types of programmes and facilitates the implementation of processes to improve programmes. PMID:20145741
Correlation theory-based signal processing method for CMF signals
NASA Astrophysics Data System (ADS)
Shen, Yan-lin; Tu, Ya-qing
2016-06-01
Signal processing precision of Coriolis mass flowmeter (CMF) signals affects measurement accuracy of Coriolis mass flowmeters directly. To improve the measurement accuracy of CMFs, a correlation theory-based signal processing method for CMF signals is proposed, which is comprised of the correlation theory-based frequency estimation method and phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral period sampling signals on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation and has better estimation performance than the adaptive notch filter, discrete Fourier transform and autocorrelation methods in terms of frequency estimation and the data extension-based correlation, Hilbert transform, quadrature delay estimator and discrete Fourier transform methods in terms of phase difference estimation, which contributes to improving the measurement accuracy of Coriolis mass flowmeters.
Ensemble method: Community detection based on game theory
NASA Astrophysics Data System (ADS)
Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.
2014-08-01
Timely and cost-effective analytics over social networks has emerged as a key ingredient for success in many business and government endeavors. Community detection is an active research area of relevance to the analysis of online social networks. The choice of a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network: different algorithms have different advantages and depend on tuning specific parameters, so the chosen methodology can affect the outcome of the experiments. In this paper, we propose a community division model based on the notion of game theory, which can effectively combine the advantages of previous algorithms to obtain a better community classification result. Experiments on standard datasets verify that our game-theory-based community detection model is valid and performs better.
Infrared small target detection based on Danger Theory
NASA Astrophysics Data System (ADS)
Lan, Jinhui; Yang, Xiao
2009-11-01
To solve the problem that traditional methods cannot detect small objects whose local SNR is less than 2 in IR images, a Danger Theory-based model for detecting infrared small targets is presented in this paper. First, by analogy with immunology, definitions are given for terms such as danger signal, antigen, APC, and antibody, and the matching rule between antigen and antibody is improved. Prior to training the detection model and detecting targets, the IR images are processed with an adaptive smoothing filter to decrease stochastic noise. Then, in the training process, deletion, generation, crossover, and mutation rules are established after a large number of experiments, in order to achieve rapid convergence and obtain good antibodies. The Danger Theory-based model is built after the training process, and this model can detect targets whose local SNR is only 1.5.
Research on Capturing of Customer Requirements Based on Innovation Theory
NASA Astrophysics Data System (ADS)
Ding, Junwu; Yang, Dongtao; Bao, Zhenqiang
To capture customer requirements information exactly and effectively, a new method for modeling the capture of customer requirements is proposed. Based on analysis of the function requirement models of previous products and application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement units and confirming the direction of evolutionary design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.
Infrared Image Simulation Based On Statistical Learning Theory
NASA Astrophysics Data System (ADS)
Huang, Chaochao; Wu, Xiaodi; Tong, Wuqin
2007-12-01
A real-time simulation algorithm for infrared images based on statistical learning theory is presented. The method comprises three parts: acquiring the training samples, forecasting the scene temperature field values with a statistical learning machine, and processing and analyzing the temperature field data. The simulation results show that this algorithm, based on ν-support vector regression, has better maneuverability and generalization than the other methods, and that the simulation precision and real-time performance are satisfactory.
Queuing model analysis of the Fujitsu VP2000 with dual scalar architecture
Ishiguro, M.
1991-01-01
This paper reports on a queuing model analysis for the performance evaluation of a dual scalar processor (DSP), which is composed of two scalar units and one vector unit. The performance evaluation compares three different processor models with equivalent hardware capacity: homogeneous DSP (DSP1), heterogeneous DSP (DSP2), and multiprocessor (MP), with particular attention given to the case of short vector lengths. It is found that the performance of DSP1 is preferable to that of MP for most actual workloads, and in particular for the workload of the Japan Atomic Energy Research Institute. The effect of the dual scalar capacity, that is, of attaching a secondary scalar unit, is also investigated. It is shown that the throughput of DSP1 is about 1.8 times that of the uniprocessor, for a vectorization ratio of 90%.
Improved virtual queuing and dynamic EPD techniques for TCP over ATM
Wu, Y.; Siu, K.Y.; Ren, W.
1998-11-01
It is known that TCP throughput can degrade significantly over UBR service in a congested ATM network, and the early packet discard (EPD) technique has been proposed to improve the performance. However, recent studies show that EPD cannot ensure fairness among competing VCs in a congested network, but the degree of fairness can be improved using various forms of fair buffer allocation techniques. The authors propose an improved scheme that utilizes only a single shared FIFO queue for all VCs and admits simple implementation for high speed ATM networks. The scheme achieves nearly perfect fairness and throughput among multiple TCP connections, comparable to the expensive per-VC queuing technique. Analytical and simulation results are presented to show the validity of this new scheme and significant improvement in performance as compared with existing fair buffer allocation techniques for TCP over ATM.
NASA Technical Reports Server (NTRS)
Long, Dou; Lee, David; Johnson, Jesse; Gaier, Eric; Kostiuk, Peter
1999-01-01
This report describes an integrated model of air traffic management (ATM) tools under development in two National Aeronautics and Space Administration (NASA) programs: Terminal Area Productivity (TAP) and Advanced Air Transport Technologies (AATT). The model is made by adjusting parameters of LMINET, a queuing network model of the National Airspace System (NAS), which the Logistics Management Institute (LMI) developed for NASA. Operating LMINET with models of various combinations of TAP and AATT tools will give quantitative information about the effects of the tools on operations of the NAS. The costs of delays under different scenarios are calculated. An extension of the Air Carrier Investment Model (ACIM), developed by LMI for NASA under ASAC, maps the technologies' impacts on NAS operations into cross-comparable benefits estimates for technologies and sets of technologies.
A Danger-Theory-Based Immune Network Optimization Algorithm
Li, Tao; Xiao, Xin; Shi, Yuanquan
2013-01-01
Existing artificial immune optimization algorithms reflect a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated from changes of environments will guide different levels of immune responses, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through its own danger signals and then triggers immune responses of self-regulation. So the population diversity can be maintained. Experimental results show that the algorithm has more advantages in the solution quality and diversity of the population. Compared with influential optimization algorithms, CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions to meet the accuracies within the specified function evaluation times. PMID:23483853
Control theory based airfoil design using the Euler equations
NASA Technical Reports Server (NTRS)
Jameson, Antony; Reuther, James
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.
An intelligent diagnosis model based on rough set theory
NASA Astrophysics Data System (ADS)
Li, Ze; Huang, Hong-Xing; Zheng, Ye-Lu; Wang, Zhou-Yuan
2013-03-01
With the popularity of computers and the rapid development of information technology, increasing the accuracy of agricultural diagnosis has become a difficult problem in popularizing agricultural expert systems. Analyzing existing research and building on the knowledge acquisition technology of rough set theory for large sample data, we put forward an intelligent diagnosis model. A rough set decision table is extracted from the sample properties, the decision table is used to categorize the inference relations, and property rules related to inference diagnosis are acquired; intelligent diagnosis is then realized by means of a rough set knowledge reasoning algorithm. Finally, we validate this diagnosis model with experiments. Introducing rough set theory provides an effective diagnosis model for agricultural expert systems with large sample data.
Ranking streamflow model performance based on Information theory metrics
NASA Astrophysics Data System (ADS)
Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas
2016-04-01
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to test whether information theory-based metrics can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased with model complexity, but in many cases several models had efficiency values not statistically different from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
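The first of the three metrics can be made concrete. A minimal sketch, using a two-symbol quantile alphabet and synthetic data rather than the study's streamflow records, of symbolizing a series and computing the mean information gain H(next symbol | current symbol):

```python
import random
from collections import Counter
from math import log2

def symbolize(series, n_sym=2):
    """Map each value to a quantile-based symbol from an n_sym-letter alphabet."""
    srt = sorted(series)
    cuts = [srt[len(srt) * k // n_sym] for k in range(1, n_sym)]
    return tuple(sum(v >= c for c in cuts) for v in series)

def entropy(blocks):
    """Shannon entropy (bits) of the empirical block distribution."""
    n = len(blocks)
    return -sum(c / n * log2(c / n) for c in Counter(blocks).values())

def mean_information_gain(series, n_sym=2):
    """H(pairs) - H(singles): expected new information in the next symbol."""
    s = symbolize(series, n_sym)
    return entropy(list(zip(s, s[1:]))) - entropy(s[:-1])

random.seed(0)
noise = [random.random() for _ in range(2000)]  # unpredictable: gain near 1 bit
cycle = [i % 2 for i in range(2000)]            # fully predictable: gain near 0
```

A random series yields close to the full 1 bit per step, while a perfectly periodic one yields essentially zero, matching the interpretation of the metric as signal randomness.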
Forewarning model for water pollution risk based on Bayes theory.
Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis
2014-02-01
In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals. The forewarning level is judged by the maximum probability rule and then combined with local conditions to propose management strategies that reduce heavy warnings to a lesser degree. This study takes the Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. The model is shown to be rigorous in theory and flexible in method, with reasonable results and a simple structure, and it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk. PMID:24194413
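The Bayes update and the maximum-probability rule used for judging the warning level can be sketched as follows; the warning levels and all probabilities here are hypothetical, for illustration only:

```python
def posterior(prior, likelihood):
    """Bayes update: P(level | data) ∝ P(data | level) * P(level)."""
    joint = {lvl: prior[lvl] * likelihood[lvl] for lvl in prior}
    z = sum(joint.values())  # normalizing constant P(data)
    return {lvl: p / z for lvl, p in joint.items()}

# Hypothetical warning levels, prior beliefs, and likelihoods of the
# observed risk-index values under each level.
prior = {"light": 0.5, "moderate": 0.3, "severe": 0.2}
like  = {"light": 0.1, "moderate": 0.3, "severe": 0.9}

post = posterior(prior, like)
level = max(post, key=post.get)  # maximum-probability rule
```

Even though "severe" has the lowest prior here, the strong likelihood of the observed data under that level pushes its posterior above the others.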
Game Theory and Risk-Based Levee System Design
NASA Astrophysics Data System (ADS)
Hui, R.; Lund, J. R.; Madani, K.
2014-12-01
Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, the landowners on each river bank would tend to optimize their levees independently with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the landowners on each river bank develop their design strategies using risk-based economic optimization. For each landowner, the annual expected total cost includes the expected annual damage cost and the annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
Transportation Optimization with Fuzzy Trapezoidal Numbers Based on Possibility Theory
He, Dayi; Li, Ran; Huang, Qi; Lei, Ping
2014-01-01
In this paper, a parametric method is introduced to solve fuzzy transportation problem. Considering that parameters of transportation problem have uncertainties, this paper develops a generalized fuzzy transportation problem with fuzzy supply, demand and cost. For simplicity, these parameters are assumed to be fuzzy trapezoidal numbers. Based on possibility theory and consistent with decision-makers' subjectiveness and practical requirements, the fuzzy transportation problem is transformed to a crisp linear transportation problem by defuzzifying fuzzy constraints and objectives with application of fractile and modality approach. Finally, a numerical example is provided to exemplify the application of fuzzy transportation programming and to verify the validity of the proposed methods. PMID:25137239
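One common way to carry out the defuzzification step for trapezoidal numbers is the graded-mean value; the resulting crisp problem can then be handed to any transportation solver. The sketch below uses made-up data and a simple least-cost greedy heuristic in place of the paper's fractile and modality approach:

```python
def defuzz(tr):
    """Graded-mean defuzzification of a trapezoidal number (a, b, c, d)."""
    a, b, c, d = tr
    return (a + 2 * b + 2 * c + d) / 6

# Hypothetical fuzzy data: 2 sources, 2 destinations (balanced after defuzzifying).
supply = [defuzz((18, 20, 22, 24)), defuzz((28, 30, 32, 34))]
demand = [defuzz((23, 25, 27, 29)), defuzz((23, 25, 27, 29))]
cost = [[defuzz((1, 2, 2, 3)), defuzz((3, 4, 4, 5))],
        [defuzz((2, 3, 3, 4)), defuzz((1, 2, 2, 3))]]

# Least-cost greedy allocation on the crisp problem: fill the cheapest
# routes first, up to remaining supply/demand.
cells = sorted((cost[i][j], i, j) for i in range(2) for j in range(2))
alloc = [[0.0] * 2 for _ in range(2)]
s, d = supply[:], demand[:]
for _, i, j in cells:
    q = min(s[i], d[j])
    alloc[i][j] = q
    s[i] -= q
    d[j] -= q
```

An LP solver would replace the greedy loop in real use; the point here is only the fuzzy-to-crisp transformation.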
Identifying influential nodes in weighted networks based on evidence theory
NASA Astrophysics Data System (ADS)
Wei, Daijun; Deng, Xinyang; Zhang, Xiaoge; Deng, Yong; Mahadevan, Sankaran
2013-05-01
The design of an effective ranking method to identify influential nodes is an important problem in the study of complex networks. In this paper, a new centrality measure is proposed based on the Dempster-Shafer evidence theory. The proposed measure trades off between the degree and strength of every node in a weighted network. The influences of both the degree and the strength of each node are represented by basic probability assignment (BPA). The proposed centrality measure is determined by the combination of these BPAs. Numerical examples are used to illustrate the effectiveness of the proposed method.
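The combination step at the heart of this measure is Dempster's rule. A small sketch, with invented BPAs over a two-element frame standing in for the degree-based and strength-based evidence of one node:

```python
def dempster(m1, m2):
    """Dempster's rule of combination for BPAs with frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + pa * pb
            else:
                conflict += pa * pb  # mass assigned to the empty set
    return {k: v / (1 - conflict) for k, v in combined.items()}

H, L = frozenset("h"), frozenset("l")  # "high" / "low" influence
# Hypothetical BPAs for one node: evidence from degree and from strength;
# mass on H | L expresses ignorance.
m_degree   = {H: 0.6, L: 0.3, H | L: 0.1}
m_strength = {H: 0.5, L: 0.2, H | L: 0.3}

m = dempster(m_degree, m_strength)
```

Because both sources lean toward "high", the combined mass on `H` exceeds either input, which is the reinforcing behavior the centrality measure relies on.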
Theory of metascreen-based acoustic passive phased array
NASA Astrophysics Data System (ADS)
Li, Yong; Qi, Shuibao; Badreddine Assouar, M.
2016-04-01
The metascreen-based acoustic passive phased array provides a new degree of freedom for manipulating acoustic waves due to their fascinating properties, such as a fully shifting phase, keeping impedance matching, and holding subwavelength spatial resolution. We develop acoustic theories to analyze the transmission/reflection spectra and the refracted pressure fields of a metascreen composed of elements with four Helmholtz resonators (HRs) in series and a straight pipe. We find that these properties are also valid under oblique incidence with large angles, with the underlying physics stemming from the hybrid resonances between the HRs and the straight pipe. By imposing the desired phase profiles, the refracted fields can be tailored in an anomalous yet controllable manner. In particular, two types of negative refraction are exhibited, based on two distinct mechanisms: one is formed from classical diffraction theory and the other is dominated by the periodicity of the metascreen. Positive (normal) and negative refractions can be converted by simply changing the incident angle, with the coexistence of two types of refraction in a certain range of incident angles.
Feature Selection with Neighborhood Entropy-Based Cooperative Game Theory
Zeng, Kai; She, Kun; Niu, Xinzheng
2014-01-01
Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods will ignore some features which have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features by using neighborhood entropy. Then the neighborhood entropy-based feature contribution is proposed under the framework of cooperative game. The evaluative criteria of features can be formalized as the product of contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that neighborhood entropy-based cooperative game theory model (NECGT) yield better performance than classical ones. PMID:25276120
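The group-versus-individual effect described above is exactly what the Shapley value from cooperative game theory quantifies. A toy sketch with invented coalition values (real use would plug in the neighborhood-entropy-based contribution as the value function):

```python
from itertools import permutations

def shapley(players, value):
    """Exact Shapley values: average marginal contribution over all orderings."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            phi[p] += value(frozenset(coalition)) - before
    return {p: v / len(perms) for p, v in phi.items()}

# Toy 'classification power' of feature coalitions (hypothetical numbers):
# features a and b are weak individually but strong together.
v = {frozenset(): 0.0, frozenset("a"): 0.1, frozenset("b"): 0.1,
     frozenset("ab"): 0.9}
phi = shapley(["a", "b"], lambda s: v[s])
```

Each feature's Shapley value here (0.45) far exceeds its individual power (0.1), so a coalition-aware criterion keeps a pair that individual rankings would discard.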
TDDFT-based local control theory for chemical reactions
NASA Astrophysics Data System (ADS)
Tavernelli, Ivano; Curchod, Basile F. E.; Penfold, Thomas J.
In this talk I will describe the implementation of local control theory for laser pulse shaping within the framework of TDDFT-based nonadiabatic dynamics. The method is based on a set of modified Tully's surface hopping equations and provides an efficient way to control the population of a selected reactive state of interest through the coupling with an external time-dependent electric field generated on-the-fly during the dynamics. This approach is applied to the investigation of the photoinduced intramolecular proton transfer reaction in 4-hydroxyacridine in gas phase and in solution. The generated pulses reveal important information about the underlying excited-state nuclear dynamics highlighting the involvement of collective vibrational modes that would be neglected in studies performed on model systems. Finally, this approach can help to shed new light on the photophysics and photochemistry of complex molecular systems and guide the design of novel reaction paths.
Master equation based steady-state cluster perturbation theory
NASA Astrophysics Data System (ADS)
Nuss, Martin; Dorn, Gerhard; Dorda, Antonius; von der Linden, Wolfgang; Arrigoni, Enrico
2015-09-01
A simple and efficient approximation scheme to study electronic transport characteristics of strongly correlated nanodevices, molecular junctions, or heterostructures out of equilibrium is provided by steady-state cluster perturbation theory. In this work, we improve the starting point of this perturbative, nonequilibrium Green's function based method. Specifically, we employ an improved unperturbed (so-called reference) state ρ̂S, constructed as the steady state of a quantum master equation within the Born-Markov approximation. This resulting hybrid method inherits beneficial aspects of both the quantum master equation as well as the nonequilibrium Green's function technique. We benchmark this scheme on two experimentally relevant systems in the single-electron transistor regime: an electron-electron interaction based quantum diode and a triple quantum dot ring junction, which both feature negative differential conductance. The results of this method improve significantly with respect to the plain quantum master equation treatment at modest additional computational cost.
Theory based design and optimization of materials for spintronics applications
NASA Astrophysics Data System (ADS)
Xu, Tianyi
The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, and this requires a good understanding of the physics behind specific phenomena. In this dissertation, we focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We present a finite-element-based implementation of the Valet-Fert model which enables a practical way to calculate electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of interest. We propose a rational way to find half-metals based on the gap theorem. We then turn to the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems is covered, and the transport properties of certain junctions are calculated with the help of Green's functions under the Landauer-Buttiker formalism, also known as the scattering formalism. The damping constant determines the switching rate of a device; we calculate it using a method based on the Extended Huckel Tight-Binding theory (EHTB). The symmetry filtering effect is very helpful for finding materials for TMR junctions, and based upon it we find a good candidate material, MnAl, for TMR applications.
Plato: A localised orbital based density functional theory code
NASA Astrophysics Data System (ADS)
Kenny, S. D.; Horsfield, A. P.
2009-12-01
The Plato package allows both orthogonal and non-orthogonal tight-binding as well as density functional theory (DFT) calculations to be performed within a single framework. The package also provides extensive tools for analysing the results of simulations as well as a number of tools for creating input files. The code is based upon the ideas first discussed in Sankey and Niklewski (1989) [1] with extensions to allow high-quality DFT calculations to be performed. DFT calculations can utilise either the local density approximation or the generalised gradient approximation. Basis sets from minimal basis through to ones containing multiple radial functions per angular momenta and polarisation functions can be used. Illustrations of how the package has been employed are given along with instructions for its utilisation.
Program summary
Program title: Plato
Catalogue identifier: AEFC_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 219 974
No. of bytes in distributed program, including test data, etc.: 1 821 493
Distribution format: tar.gz
Programming language: C/MPI and PERL
Computer: Apple Macintosh, PC, Unix machines
Operating system: Unix, Linux and Mac OS X
Has the code been vectorised or parallelised?: Yes, up to 256 processors tested
RAM: Up to 2 Gbytes per processor
Classification: 7.3
External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW
Nature of problem: Density functional theory study of electronic structure and total energies of molecules, crystals and surfaces.
Solution method: Localised orbital based density functional theory.
Restrictions: Tight-binding and density functional theory only, no exact exchange.
Unusual features: Both atom centred and uniform meshes available
Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis
ERIC Educational Resources Information Center
Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.
2012-01-01
Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundation and provided an exploratory examination of established learning theories. However, we found research seldom addressed the development of the use or failure to use learning-theory foundations and…
Treatment of adolescent sexual offenders: theory-based practice.
Sermabeikian, P; Martinez, D
1994-11-01
The treatment of adolescent sexual offenders (ASO) has its theoretical underpinnings in social learning theory. Although social learning theory has been frequently cited in literature, a comprehensive application of this theory, as applied to practice, has not been mapped out. The social learning and social cognitive theories of Bandura appear to be particularly relevant to the group treatment of this population. The application of these theories to practice, as demonstrated in a program model, is discussed as a means of demonstrating how theory-driven practice methods can be developed. PMID:7850605
A molecularly based theory for electron transfer reorganization energy
NASA Astrophysics Data System (ADS)
Zhuang, Bilin; Wang, Zhen-Gang
2015-12-01
Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule's permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory.
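The double-parabola behavior found for self-exchange reactions is the classical Marcus construction, which the paper uses as its point of comparison: two free-energy parabolas of equal curvature whose crossing gives the activation barrier in terms of the reorganization energy λ and the driving force ΔG⁰:

```latex
% Marcus relation: crossing point of two parabolas of equal curvature
\Delta G^{\ddagger} = \frac{\left(\lambda + \Delta G^{0}\right)^{2}}{4\lambda},
\qquad
\Delta G^{\ddagger} = \frac{\lambda}{4}
\quad \text{for self-exchange } (\Delta G^{0} = 0).
```

The abstract's finding that the curvature depends on the charges of the electron-transferring species is precisely a deviation from this equal-curvature picture.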
Quantum Hall transitions: An exact theory based on conformal restriction
NASA Astrophysics Data System (ADS)
Bettelheim, E.; Gruzberg, I. A.; Ludwig, A. W. W.
2012-10-01
We revisit the problem of the plateau transition in the integer quantum Hall effect. Here we develop an analytical approach for this transition, and for other two-dimensional disordered systems, based on the theory of “conformal restriction.” This is a mathematical theory that was recently developed within the context of the Schramm-Loewner evolution which describes the “stochastic geometry” of fractal curves and other stochastic geometrical fractal objects in two-dimensional space. Observables elucidating the connection with the plateau transition include the so-called point-contact conductances (PCCs) between points on the boundary of the sample, described within the language of the Chalker-Coddington network model for the transition. We show that the disorder-averaged PCCs are characterized by a classical probability distribution for certain geometric objects in the plane (which we call pictures), occurring with positive statistical weights, that satisfy the crucial so-called restriction property with respect to changes in the shape of the sample with absorbing boundaries; physically, these are boundaries connected to ideal leads. At the transition point, these geometrical objects (pictures) become fractals. Upon combining this restriction property with the expected conformal invariance at the transition point, we employ the mathematical theory of “conformal restriction measures” to relate the disorder-averaged PCCs to correlation functions of (Virasoro) primary operators in a conformal field theory (of central charge c=0). We show how this can be used to calculate these functions in a number of geometries with various boundary conditions. Since our results employ only the conformal restriction property, they are equally applicable to a number of other critical disordered electronic systems in two spatial dimensions, including for example the spin quantum Hall effect, the thermal metal phase in symmetry class D, and classical diffusion in two dimensions.
Theory for a gas composition sensor based on acoustic properties
NASA Technical Reports Server (NTRS)
Phillips, Scott; Dain, Yefim; Lueptow, Richard M.
2003-01-01
Sound travelling through a gas propagates at different speeds and its intensity attenuates to different degrees depending upon the composition of the gas. Theoretically, a real-time gaseous composition sensor could be based on measuring the sound speed and the acoustic attenuation. To this end, the speed of sound was modelled using standard relations, and the acoustic attenuation was modelled using the theory for vibrational relaxation of gas molecules. The concept for a gas composition sensor is demonstrated theoretically for nitrogen-methane-water and hydrogen-oxygen-water mixtures. For a three-component gas mixture, the measured sound speed and acoustic attenuation each define separate lines in the composition plane of two of the gases. The intersection of the two lines defines the gas composition. It should also be possible to use the concept for mixtures of more than three components, if the nature of the gas composition is known to some extent. PMID:14552356
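The sound-speed half of the measurement above follows from standard ideal-gas relations. A minimal Python sketch of that idea, estimating the speed of sound in a nitrogen-methane mixture from mole-fraction-averaged heat capacities; this idealization is not the authors' model, which additionally treats vibrational-relaxation attenuation, and the particular composition and temperature below are illustrative only:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def sound_speed(fractions, molar_masses, gammas, T):
    """Ideal-gas speed of sound for a mixture given mole fractions,
    molar masses (kg/mol), and per-species heat-capacity ratios.
    Mixture Cp and Cv are mole-fraction averages of the species values."""
    M = sum(x * m for x, m in zip(fractions, molar_masses))
    # Recover per-species Cv from gamma: Cv = R/(gamma-1), Cp = gamma*Cv
    cv = sum(x * R / (g - 1.0) for x, g in zip(fractions, gammas))
    cp = sum(x * g * R / (g - 1.0) for x, g in zip(fractions, gammas))
    gamma_mix = cp / cv
    return math.sqrt(gamma_mix * R * T / M)

# 90% N2 / 10% CH4 at 300 K
c = sound_speed([0.9, 0.1], [0.028, 0.016], [1.40, 1.31], 300.0)
```

Because the computed speed depends on composition, inverting this relation (together with an attenuation measurement) is what lets the sensor concept pin down the mixture.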
Frequency-domain direct waveform inversion based on perturbation theory
NASA Astrophysics Data System (ADS)
Kwak, Sangmin; Kim, Youngseo; Shin, Changsoo
2014-05-01
A direct waveform inversion based on perturbation theory is proposed to delineate a subsurface velocity structure from seismic data. This technique can directly compute the difference between the actual subsurface velocity and an initial guess of the velocity, whereas full waveform inversion updates the velocity model in directions that reduce the data residual. Unlike full waveform inversion using the steepest descent method, the direct waveform inversion does not require a proper step length to iteratively update the velocity model. We present an algorithm for the waveform inversion method in the frequency domain and numerical examples demonstrating how the inversion method can reconstruct subsurface velocity structures using surface seismic data. The time-domain seismograms synthesized in the inversion procedure match the corresponding shot-gather seismograms of field data.
Correlated digital back propagation based on perturbation theory.
Liang, Xiaojun; Kumar, Shiva
2015-06-01
We studied a simplified digital back propagation (DBP) scheme by including the correlation between neighboring signal samples. An analytical expression for calculating the correlation coefficients is derived based on perturbation theory. In each propagation step, nonlinear distortion due to phase-dependent terms in the perturbative expansion is ignored, which enhances computational efficiency. The performance of the correlated DBP is evaluated by simulating a single-channel single-polarization fiber-optic system operating at 28 Gbaud, 32-quadrature amplitude modulation (32-QAM), and a 40 × 80 km transmission distance. Compared to standard DBP, correlated DBP reduces the total number of propagation steps by a factor of 10 without performance penalty. Correlated DBP with only 2 steps per link provides about a 1 dB improvement in Q-factor over linear compensation. PMID:26072825
Xinjiang resources efficiency based on superior technical theory
NASA Astrophysics Data System (ADS)
Amut, Aniwaer; Li, Zeyuan
2005-09-01
This study discusses a new concept of resource efficiency in Xinjiang from a policy-making perspective, based on advanced-technology theory. The analysis focuses on the role of resource advantages in development, resource pressure, resource efficiency, and technical approaches to achieving resource efficiency. An industrialized-development strategy centered on resource efficiency is outlined, together with its controlling factors and the basic technical framework for realizing it. The framework includes: techniques for recycling materials; water-saving techniques for efficient use of water resources; biotechnology for higher-yield, better-quality farm crops; integrated techniques for further processing of farm produce and agricultural industrialization; information technology for information support and an information-oriented society; techniques for converting resources such as oil, natural gas, mineral products and wind power; and techniques for desertification control and biological security.
Intelligent control based on fuzzy logic and neural net theory
NASA Technical Reports Server (NTRS)
Lee, Chuen-Chien
1991-01-01
In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.
A Lie based 4-dimensional higher Chern-Simons theory
NASA Astrophysics Data System (ADS)
Zucchini, Roberto
2016-05-01
We present and study a model of 4-dimensional higher Chern-Simons theory, special Chern-Simons (SCS) theory, instances of which have appeared in the string literature, whose symmetry is encoded in a skeletal semistrict Lie 2-algebra constructed from a compact Lie group with non-discrete center. The field content of SCS theory consists of a Lie-valued 2-connection coupled to a background closed 3-form. SCS theory enjoys a large gauge and gauge-for-gauge symmetry organized in an infinite-dimensional strict Lie 2-group. The partition function of SCS theory is simply related to that of a topological gauge theory localizing on flat connections with degree 3 second characteristic class determined by the background 3-form. Finally, SCS theory is related to a 3-dimensional special gauge theory whose 2-connection space has a natural symplectic structure with respect to which the 1-gauge transformation action is Hamiltonian, the 2-curvature map acting as moment map.
Evaluating Theory-Based Evaluation: Information, Norms, and Adherence
ERIC Educational Resources Information Center
Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose
2012-01-01
Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…
The elliptic wing based on the potential theory
NASA Technical Reports Server (NTRS)
Krienes, Klaus
1941-01-01
This article is intended as a contribution to the theory of the lifting surface. The aerodynamics of the elliptic wing in straight and oblique flow are explored on the basis of potential theory. The foundation of the calculation is the linearized theory of the acceleration potential in which all small quantities of higher order are disregarded.
Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education
ERIC Educational Resources Information Center
Amrein-Beardsley, A.; Haladyna, T.
2012-01-01
Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They present evidence that an evaluation survey based on a theory that…
Improved network convergence and quality of service by strict priority queuing of routing traffic
NASA Astrophysics Data System (ADS)
Balandin, Sergey; Heiner, Andreas P.
2002-07-01
During the transient period after a link failure the network cannot guarantee the agreed service levels to user data. This is due to the fact that forwarding tables in the network are inconsistent. Moreover, link states can inadvertently be advertised incorrectly due to protocol timeouts, which may result in persistent route flaps. Reducing the probability of wrongly advertised link states, and the time during which the forwarding tables are inconsistent, is therefore of eminent importance to provide consistent and high-level QoS to user data. By queuing routing traffic in a queue with strict priority over all other (data) queues, i.e. assigning the highest priority in a Differentiated Services model, we were able to reduce the probability of routing data loss to almost zero, and reduce flooding times almost to their theoretical limit. The quality of service provided to user traffic was considerably higher than without the proposed modification. The scheme is independent of the routing protocol, and can be used with most differentiated service models. It is compatible with the current OSPF standard, and can be used in conjunction with other improvements in the protocol with similar objectives.
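The queuing arrangement described above can be illustrated with a toy two-class model: the routing-protocol queue is always served before any data queue at dequeue time. This is a minimal sketch of strict-priority scheduling only; the packet labels are hypothetical and real routers implement this in hardware schedulers:

```python
from collections import deque

class StrictPriorityScheduler:
    """Two-class strict priority: routing-protocol packets (e.g. OSPF
    LSAs) always preempt data packets at dequeue time."""

    def __init__(self):
        self.routing = deque()  # highest-priority class
        self.data = deque()     # best-effort user data

    def enqueue(self, pkt, is_routing=False):
        (self.routing if is_routing else self.data).append(pkt)

    def dequeue(self):
        # Serve the routing queue exhaustively before touching data
        if self.routing:
            return self.routing.popleft()
        if self.data:
            return self.data.popleft()
        return None

s = StrictPriorityScheduler()
s.enqueue("data-1")
s.enqueue("lsa-1", is_routing=True)
s.enqueue("data-2")
order = [s.dequeue() for _ in range(3)]
```

Even though the LSA arrived second, it is dequeued first, which is exactly why routing updates propagate near their theoretical flooding limit under this discipline.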
SARA: A Text-Based and Reader-Based Theory of Signaling
ERIC Educational Resources Information Center
Lemarie, Julie; Lorch, Robert F., Jr.; Eyrolle, Helene; Virbel, Jacques
2008-01-01
We propose a two-component theory of text signaling devices. The first component is a text-based analysis that characterizes any signaling device along four dimensions: (a) the type of information it makes available, (b) its scope, (c) how it is realized in the text, and (d) its location with respect to the content it cues. The second component is…
The Energetic Assessment of Frictional Instability Based on Rowe's Theory
NASA Astrophysics Data System (ADS)
Hirata, M.; Muto, J.; Nagahama, H.
2015-12-01
Frictional instability, which controls the occurrence of unstable slip, has been related to (1) the rate- and state-dependent friction law (Dieterich, 1979; Ruina, 1983) and (2) shear localization in a gouge layer (e.g., Byerlee et al., 1978; Logan et al., 1979). Ikari et al. (2011) indicated that the transitions of frictional parameters obtained from the rate- and state-dependent friction law involve shear localization, but the underlying theoretical background for this link has remained unknown. Therefore, in this study we investigate the relation theoretically and experimentally based on Rowe's theory of constant minimum energy ratio (Rowe, 1962), which describes particle deformation quantitatively through energetic analysis. The theoretical analysis, using analytical dynamics and irreversible thermodynamics, yields an energetic criterion for frictional instability: unstable slip occurs at energy ratios below 1. In friction experiments using a gas-medium apparatus, simulated fault gouge deforms in accordance with Rowe's theory. The energy ratio changes gradually with shear and falls below 1 before the occurrence of unstable slip. Moreover, energy ratios are derived from volume changes. A transition of the energy ratio from increase to decrease, observed at the end of compaction, indicates the onset of volume increase toward unstable slip. The volume increase likely corresponds to the formation of R1 shears with open-mode character, which occurs prior to unstable slip. Shear localization changes the internal friction angle, a statistical parameter that enters the energy ratio; changes in the internal friction angle thus play an important role in the evolution from frictionally stable to unstable behavior. These results clarify the physical and energetic background of the link between frictional parameters and shear localization.
Simulation-based learning: From theory to practice.
DeCaporale-Ryan, Lauren N; Dadiz, Rita; Peyre, Sarah E
2016-06-01
Comments on the article, "Stimulating Reflective Practice Using Collaborative Reflective Training in Breaking Bad News Simulations," by Kim, Hernandez, Lavery, and Denmark (see record 2016-18380-001). Kim et al. are applauded for engaging and supporting the development of simulation-based education and for their efforts to create an interprofessional learning environment. However, we hope that further work on alternate methods of debriefing leverages the already inherent activation of learners that builds on previous experience, fosters reflection, and builds skills. What is needed is the transference of learning theories into educational research efforts that measure the effectiveness, validity, and reliability of behavior-based performance change. The majority of breaking bad news (BBN) curricula limit program evaluations to reports of learner satisfaction, confidence, and self-efficacy, rather than determining the successful translation of effective and humanistic interpersonal skills into long-term clinical practice (Rosenbaum et al., 2004). Research is needed to investigate how educational programs affect provider-patient-family interaction, and ultimately patient and family understanding, to better inform the teaching of BBN skills. (PsycINFO Database Record) PMID:27270248
Bowel anastomoses: The theory, the practice and the evidence base
Goulder, Frances
2012-01-01
Since the introduction of stapling instruments in the 1970s various studies have compared the results of sutured and stapled bowel anastomoses. A literature search was performed from 1960 to 2010 and articles relating to small bowel, colonic and colorectal anastomotic techniques were reviewed. References from these articles were also reviewed, and relevant articles obtained. Either a stapled or sutured gastrointestinal tract anastomosis is acceptable in most situations. The available evidence suggests that in the following situations, however, particular anastomotic techniques may result in fewer complications: A stapled side-to-side ileocolic anastomosis is preferable following a right hemicolectomy for cancer. A stapled side-to-side anastomosis is likely also preferable after an ileocolic resection for Crohn’s disease. Colorectal anastomoses can be sutured or stapled with similar results, although the incidence of strictures is higher following stapled anastomoses. Following reversal of loop ileostomy there is some evidence to suggest that a stapled side-to-side anastomosis or sutured enterotomy closure (rather than spout resection and sutured anastomosis) results in fewer complications. Non-randomised data has indicated that small bowel anastomoses are best sutured in the trauma patient. This article reviews the theory, practice and evidence base behind the various gastrointestinal anastomoses to help the practising general surgeon make evidence based operative decisions. PMID:23293735
Theory-based approaches for improving biomedical communications.
Boutwell, W B
1994-01-01
Using communication theory to improve biomedical communication may seem overly academic and beyond the practicality of day-to-day project deadlines. However, using communication theory allows us to organize the "big picture" while providing insight concerning the elements of successful communication. This article reviews and illustrates how the organizational power of communication theory can help biomedical communicators approach difficult communication tasks with greater opportunity for success. Overviews of selected theoretical frameworks and models are discussed along with practical implications. PMID:8014170
Density functional theory based generalized effective fragment potential method
Nguyen, Kiet A.; Pachter, Ruth; Day, Paul N.
2014-06-28
We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes. PMID:24985612
IMMAN: free software for information theory-based chemometric analysis.
Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo
2015-05-01
The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated into the IMMAN software (http://mobiosd-hub.com/imman-soft/), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
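As an illustration of the kind of information-theoretic parameter the abstract mentions, here is a minimal sketch of information gain over an already-discretized feature. This is the generic textbook formulation, not IMMAN's implementation, and the toy feature/label values are hypothetical:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) in bits of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - H(Y|X) for a discretized feature X."""
    n = len(labels)
    h_conditional = 0.0
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        h_conditional += len(subset) / n * entropy(subset)
    return entropy(labels) - h_conditional

# A feature that perfectly predicts the class yields IG = H(Y) = 1 bit
ig = information_gain(["a", "a", "b", "b"], [0, 0, 1, 1])
```

Ranking features by such scores is the rank-based feature selection the software automates; gain ratio and symmetrical uncertainty are normalized variants of the same quantity.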
Feature extraction algorithm for space targets based on fractal theory
NASA Astrophysics Data System (ADS)
Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin
2007-11-01
To extend satellite lifetimes and reduce launch and operating costs, on-orbit satellite servicing, including repair, upgrading and refueling of spacecraft, is expected to become much more frequent. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking-reliability requirements of image tracking in a space surveillance system. Machine vision has been applied to estimating the relative pose of spacecraft, and feature extraction is the basis of relative-pose estimation. This paper presents a fractal-geometry-based edge extraction algorithm that can be used to determine and track the relative pose of an observed satellite during proximity operations in a machine vision system. The method first computes a fractal-dimension distribution of the gray-level image using the Differential Box-Counting (DBC) approach of fractal theory to suppress noise, and then detects continuous edges using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target but also preserves inner details. Meanwhile, edge extraction is performed only in the moving area, which greatly reduces computation. Simulation results compare edge detection using the presented method with other detection methods and indicate that the proposed algorithm is a valid way to solve relative-pose problems for spacecraft.
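The DBC computation can be sketched as follows. This is a generic formulation of differential box counting on a square gray-level image, not the authors' full pipeline (which adds morphological edge detection and motion masking); the flat test image is illustrative only:

```python
import math

def dbc_fractal_dimension(img, sizes=(2, 4, 8)):
    """Differential Box-Counting (DBC) estimate of the fractal dimension
    of a square gray-level image (list of rows, values in 0..255)."""
    M = len(img)
    G = 256  # number of gray levels
    log_inv_r, log_N = [], []
    for s in sizes:
        h = s * G / M  # box height in gray levels for an s x s grid cell
        N_r = 0
        for i in range(0, M - M % s, s):
            for j in range(0, M - M % s, s):
                block = [img[i + di][j + dj]
                         for di in range(s) for dj in range(s)]
                # number of boxes spanned by the intensity surface here
                lo = math.ceil((min(block) + 1) / h)
                hi = math.ceil((max(block) + 1) / h)
                N_r += hi - lo + 1
        log_inv_r.append(math.log(M / s))
        log_N.append(math.log(N_r))
    # fractal dimension = least-squares slope of log N_r vs log(1/r)
    n = len(sizes)
    mx, my = sum(log_inv_r) / n, sum(log_N) / n
    return (sum((x - mx) * (y - my) for x, y in zip(log_inv_r, log_N))
            / sum((x - mx) ** 2 for x in log_inv_r))

# A perfectly flat image behaves as a smooth surface: dimension near 2
flat = [[128] * 16 for _ in range(16)]
d = dbc_fractal_dimension(flat)
```

Noisy regions raise the local dimension above 2, which is what lets a fractal-dimension map separate structure from noise before edge detection.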
Scheduling for indoor visible light communication based on graph theory.
Tao, Yuyang; Liang, Xiao; Wang, Jiaheng; Zhao, Chunming
2015-02-01
Visible light communication (VLC) has drawn much attention in the field of high-rate indoor wireless communication. While most existing works have focused on point-to-point VLC technologies, few studies have considered multiuser VLC, where multiple optical access points (APs) transmit data to multiple user receivers. In such scenarios, inter-user interference is the major factor limiting system performance. Therefore, a proper scheduling scheme must be proposed to coordinate the interference and optimize overall system performance. In this work, we aim to maximize the sum rate of the system while taking user fairness into account by appropriately assigning LED lamps to multiple users. The formulated scheduling problem turns out to be a maximum weighted independent set problem. We then propose a novel and efficient resource allocation method based on graph theory to achieve high sum rates. Moreover, we also introduce proportional fairness into our scheduling scheme to ensure user fairness. The proposed scheduling scheme can, with low complexity, achieve greater multiplexing gain, a higher sum rate, and better fairness than existing schemes. PMID:25836136
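The abstract casts scheduling as a maximum weighted independent set (MWIS) problem, which is NP-hard in general. A simple greedy heuristic conveys the idea; the AP-user labels, weights, and conflict edges below are hypothetical, and the paper's own algorithm may differ:

```python
def greedy_mwis(weights, edges):
    """Greedy heuristic for maximum-weight independent set: repeatedly
    pick the heaviest remaining vertex and discard its neighbors."""
    adj = {v: set() for v in weights}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    chosen, remaining = [], set(weights)
    while remaining:
        v = max(remaining, key=lambda x: weights[x])
        chosen.append(v)
        remaining -= adj[v] | {v}  # v is scheduled; its conflicts are not
    return chosen

# Vertices are (AP, user) assignments weighted by achievable rate;
# edges join assignments that would interfere with each other.
rates = {"A1-u1": 5.0, "A1-u2": 3.0, "A2-u2": 4.0}
conflicts = [("A1-u1", "A1-u2"), ("A1-u2", "A2-u2")]
sched = greedy_mwis(rates, conflicts)
```

The chosen set is conflict-free by construction; proportional fairness can be folded in by replacing the raw rates with fairness-adjusted weights before running the same selection.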
Towards a theory-based positive youth development programme.
Brink, Andrea Jw; Wissing, Marié P
2013-01-01
The aim of this study was to develop and describe an intervention programme for young adolescents, guided by the Positive Youth Development Intervention (PYDI) model, which provides a perspective on facilitating development along a more positive trajectory. The key concepts and processes suggested by the PYDI model were further analysed and broadened using existing literature for operationalisation and application purposes. Self-regulation is the central process effecting developmental change, within the contexts of: a) the navigation of stressors; and b) the formulation and effective pursuit of relevant personal goals. Self-regulation, together with a developmental perspective, provided guidelines regarding the relevant skills and knowledge. These facilitate: a) identity development; b) formulation of goals congruent with the latter; c) decision-making skills; d) coping skills; e) regulation of affect and cognition; and f) socialisation skills. The relevant content areas and the manner of their facilitation are indicated. The theory-based programme can be implemented and its effect empirically evaluated. Levels of hope, problem-solving efficacy and social efficacy may serve, inter alia, as indicators of developmental change. PMID:25860303
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-01
Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, in situations where the evidence is highly conflicting, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion illustrates the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods. PMID:26797611
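The combination step underlying such sensor fusion is Dempster's rule. A minimal sketch follows; the fault labels and mass values are hypothetical, and the paper's method additionally weights each body of evidence by sensor reliability before combining:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q  # mass assigned to the empty set
    norm = 1.0 - conflict  # renormalize by the non-conflicting mass
    return {k: v / norm for k, v in combined.items()}

F1, F2 = frozenset({"fault-1"}), frozenset({"fault-2"})
m1 = {F1: 0.8, F2: 0.2}  # report from sensor 1
m2 = {F1: 0.6, F2: 0.4}  # report from sensor 2
m = dempster_combine(m1, m2)
```

When the two reports highly conflict, the normalization factor 1 - K becomes small and amplifies residual agreement, which is exactly the counterintuitive behavior the reliability-weighted averaging in the abstract is designed to mitigate.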
Modeling for Convective Heat Transport Based on Mixing Length Theory
NASA Astrophysics Data System (ADS)
Yamagishi, Y.; Yanagisawa, T.
2002-12-01
Convection is the most important mechanism for the Earth's internal dynamics, and plays a substantial role in its evolution. When investigating the thermal history of the Earth, convective heat transport should be taken into account. However, it is difficult to treat full convective flow throughout the Earth's entire history. Therefore, parameterized convection has been developed and widely used. Convection occurring in the Earth's interior has some complicated aspects: large variations of viscosity, internal heating, phase boundaries, etc. In particular, the viscosity contrast has a significant effect on the efficiency of convective heat transport. Parameterized convection treats viscosity variation artificially, so it has many limitations. We developed an alternative method based on the concept of "mixing length theory", which relates the local thermal gradient to the local convective velocity of a fluid parcel. Convective heat transport is identified with an effective thermal diffusivity, and we can calculate the horizontally averaged temperature profile and heat flux by solving a thermal conduction problem. In estimating the parcel's velocity, we can include effects such as variable viscosity. In this study, we confirm that the temperature profile can be calculated correctly by this method by comparing with experimental and 2D calculation results. We further show the effect of the viscosity contrast on the thermal structure of the convective fluid, and calculate the relationship between the Nusselt number and the modified Rayleigh number.
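The core mixing-length step can be sketched in a few lines: a parcel displaced over a mixing length l gains a buoyant velocity of order sqrt(g*alpha*dT*l), and convective transport is folded into an effective diffusivity kappa_eff ~ v*l. All parameter values below are illustrative assumptions, not the authors' model setup.

```python
import numpy as np

# Illustrative 1D mixing-length sketch (not the authors' code).
g, alpha, kappa = 9.8, 3e-5, 1e-6     # gravity, expansivity, molecular diffusivity
l = 1e3                               # mixing length [m] (assumed)

def parcel_velocity(dT_super):
    """Buoyant parcel velocity for a superadiabatic temperature excess (>= 0)."""
    return np.sqrt(g * alpha * max(dT_super, 0.0) * l)

def effective_diffusivity(dT_super):
    """Convective transport folded into an effective diffusivity: kappa + v*l.
    A conduction solver using this value yields the horizontally averaged
    temperature profile and heat flux, as described in the abstract."""
    return kappa + parcel_velocity(dT_super) * l

# A steeper superadiabatic gradient transports heat far more efficiently:
k_weak, k_strong = effective_diffusivity(0.1), effective_diffusivity(10.0)
```

Viscosity variation would enter through the parcel-velocity estimate (a more viscous parcel rises slower), which is what makes this approach more flexible than classical parameterized convection.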
LSST Telescope Alignment Plan Based on Nodal Aberration Theory
NASA Astrophysics Data System (ADS)
Sebag, J.; Gressler, W.; Schmid, T.; Rolland, J. P.; Thompson, K. P.
2012-04-01
The optical alignment of the Large Synoptic Survey Telescope (LSST) is potentially challenging, due to its fast three-mirror optical design and its large 3.5° field of view (FOV). It is highly advantageous to align the three-mirror optical system prior to the integration of the complex science camera on the telescope, which corrects the FOV via three refractive elements and includes the operational wavefront sensors. A telescope alignment method based on nodal aberration theory (NAT) is presented here to address this challenge. Without the science camera installed on the telescope, the on-axis imaging performance of the telescope is diffraction-limited, but the field of view is not corrected. The nodal properties of the three-mirror telescope design have been analyzed and an alignment approach has been developed using the intrinsically linear nodal behavior, which is linked via sensitivities to the misalignment parameters. Since mirror figure errors will exist in any real application, a methodology to introduce primary-mirror figure errors into the analysis has been developed and is also presented.
A Theory-Based Approach to Restructuring Middle Level Schools.
ERIC Educational Resources Information Center
Midgley, Carol; Maehr, Martin L.
This paper describes the implementation of a reform program in a middle school located in a relatively large school district in southeastern Michigan. First, an integrative theory is presented as a promising framework for reforming middle-grade schools. The theory was developed within a social-cognitive framework that emphasizes the importance of…
An Approach to Theory-Based Youth Programming
ERIC Educational Resources Information Center
Duerden, Mat D.; Gillard, Ann
2011-01-01
A key but often overlooked aspect of intentional, out-of-school-time programming is the integration of a guiding theoretical framework. The incorporation of theory in programming can provide practitioners valuable insights into essential processes and principles of successful programs. While numerous theories exist that relate to youth development…
The Development of an Attribution-Based Theory of Motivation: A History of Ideas
ERIC Educational Resources Information Center
Weiner, Bernard
2010-01-01
The history of ideas guiding the development of an attribution-based theory of motivation is presented. These influences include the search for a "grand" theory of motivation (from drive and expectancy/value theory), an attempt to represent how the past may influence the present and the future (as Thorndike accomplished), and the incorporation of…
Tuset-Peiro, Pere; Vazquez-Gallego, Francisco; Alonso-Zarate, Jesus; Alonso, Luis; Vilajosana, Xavier
2014-01-01
Data collection is a key scenario for the Internet of Things because it enables gathering sensor data from distributed nodes that use low-power and long-range wireless technologies to communicate in a single-hop approach. In this kind of scenario, the network is composed of one coordinator that covers a particular area and a large number of nodes, typically hundreds or thousands, that transmit data to the coordinator upon request. Considering this scenario, in this paper we experimentally validate the energy consumption of two Medium Access Control (MAC) protocols, Frame Slotted ALOHA (FSA) and Distributed Queuing (DQ). We model both protocols as a state machine and conduct experiments to measure the average energy consumption in each state and the average number of times that a node has to be in each state in order to transmit a data packet to the coordinator. The results show that FSA is more energy efficient than DQ if the number of nodes is known a priori because the number of slots per frame can be adjusted accordingly. However, in such scenarios the number of nodes cannot be easily anticipated, leading to additional packet collisions and a higher energy consumption due to retransmissions. Contrarily, DQ does not require to know the number of nodes in advance because it is able to efficiently construct an ad hoc network schedule for each collection round. This kind of a schedule ensures that there are no packet collisions during data transmission, thus leading to an energy consumption reduction above 10% compared to FSA. PMID:25061839
Roybal, H; Baxendale, S J; Gupta, M
1999-01-01
Activity-based costing and the theory of constraints have been applied successfully in many manufacturing organizations. Recently, those concepts have been applied in service organizations. This article describes the application of activity-based costing and the theory of constraints in a managed care mental health and substance abuse organization. One of the unique aspects of this particular application was the integration of activity-based costing and the theory of constraints to guide process improvement efforts. This article describes the activity-based costing model and the application of the theory of constraint's focusing steps with an emphasis on unused capacities of activities in the organization. PMID:10350791
Toward A Brain-Based Theory of Beauty
Ishizu, Tomohiro; Zeki, Semir
2011-01-01
We wanted to learn whether activity in the same area(s) of the brain correlate with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1-9, with 9 being the most beautiful. This allowed us to select three sets of stimuli (beautiful, indifferent and ugly) which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the strength of the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources (musical and visual) and probably by other sources as well. This has led us to formulate a brain-based theory of beauty. PMID:21755004
Design of traveling wave tubes based on field theory
Vanderplaats, N.R.; Kodis, M.A. (Vacuum Electronics Branch); Freund, H.P.
1994-07-01
A method is described for the design of helix traveling wave tubes (TWT) which is based on the linear field analysis of the coupled beam-wave system. The dispersion relations are obtained by matching of radial admittances at boundaries instead of the individual field components. This approach provides flexibility in modeling various beam and circuit configurations with relative ease by choosing the appropriate admittance functions for each case. The method is illustrated for the case of a solid beam inside a sheath helix which is loaded externally by lossy dielectric material, a conducting cylinder, and axial vanes. Extension of the analysis to include a thin tape helix model is anticipated in the near future. The TWT model may be divided into axial regions to include velocity tapers, lossy materials and severs, with the helix geometry in each region varied arbitrarily. The relations between the ac velocities, current densities, and axial electric fields are used to derive a general expression for the new amplitudes of the three forward waves at each axial boundary. The sum of the fields for the three forward waves (two waves in a drift region) is followed to the circuit output. Numerical results of the field analysis are compared with the coupled-mode Pierce theory. A method is suggested for applying the field analysis to accurate design of practical TWT's that have a more complex circuit geometry, which starts with a simple measurement of the dispersion of the helix circuit. The field analysis may then be used to generate a circuit having properties very nearly equivalent to those of the actual circuit.
Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert
2012-09-01
Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change. PMID:24073123
ERIC Educational Resources Information Center
Han, Gang; Newell, Jay
2014-01-01
This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…
How Is a Science Lesson Developed and Implemented Based on Multiple Intelligences Theory?
ERIC Educational Resources Information Center
Kaya, Osman Nafiz
2008-01-01
The purpose of this study is to present the whole process step-by-step of how a science lesson can be planned and implemented based on Multiple Intelligences (MI) theory. First, it provides the potential of the MI theory for science teaching and learning. Then an MI science lesson that was developed based on a modified model in the literature and…
Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors
ERIC Educational Resources Information Center
Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.
2008-01-01
This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…
A Theory-Driven Integrative Process/Outcome Evaluation of a Concept-Based Nursing Curriculum
ERIC Educational Resources Information Center
Fromer, Rosemary F.
2013-01-01
The current trend in curriculum revision in nursing education is concept-based learning, but little research has been done on concept-based curricula in nursing education. The study used a theory-driven integrative process/outcome evaluation. Embedded in this theory-driven integrative process/outcome evaluation was a causal comparative…
Models for Theory-Based M.A. and Ph.D. Programs.
ERIC Educational Resources Information Center
Botan, Carl; Vasquez, Gabriel
1999-01-01
Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…
Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli
2013-01-01
M/G/C/C state-dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern Discrete Event Simulation (DES) software. We designed an approach to address this limitation and used it to construct an M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time and the expected number of entities in a complex network topology. Results indicated that for each network there is a range of arrival rates where the simulation results fluctuate drastically across replications, causing discrepancies between the simulation results and the analytical results. Detailed results showing how closely the simulation results tally with the analytical results, in both tabular and graphical forms, together with scientific justifications, are documented and discussed. PMID:23560037
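The analytical side of this comparison has a closed product form. A minimal sketch, assuming the common linear velocity-density service model from the corridor-modelling literature (the paper's networks and parameters are not reproduced here):

```python
from math import prod

def mgcc_state_dependent(lam, T1, C, f):
    """Stationary distribution of an M/G/C/C state-dependent queue (product
    form): f(n) is the normalised service rate when n entities are present
    (f(1) == 1), T1 the mean service time of a lone entity, lam the arrival
    rate, C the capacity. Returns (probabilities, blocking prob, throughput)."""
    weights = [(lam * T1) ** n
               / (prod(range(1, n + 1)) * prod(f(i) for i in range(1, n + 1)))
               for n in range(C + 1)]
    Z = sum(weights)                      # normalising constant
    p = [w / Z for w in weights]
    p_block = p[C]                        # arrival sees a full system
    throughput = lam * (1.0 - p_block)
    return p, p_block, throughput

# Linear velocity-density model for a walkway of capacity C (an assumption):
C = 20
f_linear = lambda n: (C - n + 1) / C
p, p_block, thr = mgcc_state_dependent(lam=2.0, T1=1.0, C=C, f=f_linear)
```

A simulation run of the same network should converge to these probabilities; the abstract's point is that for some arrival-rate ranges the across-replication variance makes that convergence hard to observe.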
Optimisation of a honeybee-colony's energetics via social learning based on queuing delays
NASA Astrophysics Data System (ADS)
Thenius, Ronald; Schmickl, Thomas; Crailsheim, Karl
2008-06-01
Natural selection shaped the foraging-related processes of honeybees in such a way that a colony can react to changing environmental conditions optimally. To investigate this complex dynamic social system, we developed a multi-agent model of the nectar flow inside and outside of a honeybee colony. In a honeybee colony, a temporal caste collects nectar in the environment. These foragers bring their harvest into the colony, where they unload their nectar loads to one or more storer bees. Our model predicts that a cohort of foragers, collecting nectar from a single nectar source, is able to detect changes in quality in other food sources they have never visited, via the nectar processing system of the colony. We identified two novel pathways of forager-to-forager communication. Foragers can gain information about changes in the nectar flow in the environment via changes in their mean waiting time for unloadings and the number of experienced multiple unloadings. This way two distinct groups of foragers that forage on different nectar sources and that never communicate directly can share information via a third cohort of worker bees. We show that this noisy and loosely knotted social network allows a colony to perform collective information processing, so that a single forager has all necessary information available to be able to 'tune' its social behaviour, like dancing or dance-following. This way the net nectar gain of the colony is increased.
Stochastic extension of cellular manufacturing systems: a queuing-based analysis
NASA Astrophysics Data System (ADS)
Fardis, Fatemeh; Zandi, Afagh; Ghezavati, Vahidreza
2013-07-01
Clustering parts and machines into part families and machine cells is a major decision in the design of cellular manufacturing systems which is defined as cell formation. This paper presents a non-linear mixed integer programming model to design cellular manufacturing systems which assumes that the arrival rate of parts into cells and machine service rate are stochastic parameters and described by exponential distribution. Uncertain situations may create a queue behind each machine; therefore, we will consider the average waiting time of parts behind each machine in order to have an efficient system. The objective function will minimize summation of idleness cost of machines, sub-contracting cost for exceptional parts, non-utilizing machine cost, and holding cost of parts in the cells. Finally, the linearized model will be solved by the Cplex solver of GAMS, and sensitivity analysis will be performed to illustrate the effectiveness of the parameters.
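The waiting-time term in such a cell-formation objective typically comes from elementary queuing formulas. A hedged sketch treating each machine as an M/M/1 queue (cost weights are illustrative, not taken from the paper):

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 metrics for one machine with Poisson arrivals at
    rate lam and exponential service at rate mu (requires lam < mu):
    utilisation rho, expected queue length Lq, expected waiting time Wq."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu
    Lq = rho ** 2 / (1 - rho)
    Wq = Lq / lam          # Little's law
    return rho, Lq, Wq

def idleness_cost(lam, mu, cost_per_idle_unit=10.0):
    """Machine-idleness cost term of the kind the objective penalises
    (hypothetical weight): idle fraction is 1 - rho."""
    rho, _, _ = mm1_metrics(lam, mu)
    return cost_per_idle_unit * (1 - rho)
```

With stochastic arrival and service rates, the trade-off the model balances is visible directly: raising utilisation shrinks the idleness cost but inflates Wq, the parts' holding time behind the machine.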
Effective Contraceptive Use: An Exploration of Theory-Based Influences
ERIC Educational Resources Information Center
Peyman, N.; Oakley, D.
2009-01-01
The purpose of this study was to explore factors that influence oral contraceptive (OC) use among women in Iran using the Theory of Planned Behavior (TPB) and concept of self-efficacy (SE). The study sample consisted of 360 married OC users, aged 18-49 years recruited at public health centers of Mashhad, 900 km east of Tehran. SE had the strongest…
Course Management and Students' Expectations: Theory-Based Considerations
ERIC Educational Resources Information Center
Buckley, M. Ronald; Novicevic, Milorad M.; Halbesleben, Jonathon R. B.; Harvey, Michael
2004-01-01
This paper proposes a theoretical, yet practical, framework for managing the formation process of students' unrealistic expectations in a college course. Using relational contracting theory, alternative teacher interventions, aimed at effective management of students' expectations about the course, are described. Also, the formation of the student…
Videogames, Tools for Change: A Study Based on Activity Theory
ERIC Educational Resources Information Center
Méndez, Laura; Lacasa, Pilar
2015-01-01
Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…
Logical Thinking in Children; Research Based on Piaget's Theory.
ERIC Educational Resources Information Center
Sigel, Irving E., Ed.; Hooper, Frank H., Ed.
Theoretical and empirical research derived from Piagetian theory is collected on the intellectual development of the elementary school child and his acquisition and utilization of conservation concepts. The articles present diversity of method and motive in the results of replication (validation studies of the description of cognitive growth) and…
PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis
ERIC Educational Resources Information Center
Waycott, Jenny; Jones, Ann; Scanlon, Eileen
2005-01-01
This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…
Impact of an Evidence-Based Medicine Curriculum Based on Adult Learning Theory
Green, Michael L; Ellis, Peter J
1997-01-01
OBJECTIVE To develop and implement an evidence-based medicine (EBM) curriculum and determine its effectiveness in improving residents' EBM behaviors and skills. DESIGN Description of the curriculum and a multifaceted evaluation, including a pretest-posttest controlled trial. SETTING University-based primary care internal medicine residency program. PARTICIPANTS Second- and third-year internal medicine residents (N =34). INTERVENTIONS A 7-week EBM curriculum in which residents work through the steps of evidence-based decisions for their own patients. Based on adult learning theory, the educational strategy included a resident-directed tutorial format, use of real clinical encounters, and specific EBM facilitating techniques for faculty. MEASUREMENTS AND MAIN RESULTS Behaviors and self-assessed competencies in EBM were measured with questionnaires. Evidence-based medicine skills were assessed with a 17-point test, which required free text responses to questions based on a clinical vignette and a test article. After the intervention, residents participating in the curriculum (case subjects) increased their use of original studies to answer clinical questions, their examination of methods and results sections of articles, and their self-assessed EBM competence in three of five domains of EBM, while the control subjects did not. The case subjects significantly improved their scores on the EBM skills test (8.5 to 11.0, p =.001), while the control subjects did not (8.5 to 7.1, p =.09). The difference in the posttest scores of the two groups was 3.9 points (p =.001, 95% confidence interval 1.9, 5.9). CONCLUSIONS An EBM curriculum based on adult learning theory improves residents' EBM skills and certain EBM behaviors. The description and multifaceted evaluation can guide medical educators involved in EBM training. PMID:9436893
ERIC Educational Resources Information Center
Bishop, John
An analysis of the argument that a market imperfection (wage differentials and queuing caused by unions) raises the marginal social product (MSP) of college education above the average before-tax private wage premium (APP) for college (this discrepancy is called a union-Q-nality) focuses on verifying five hypotheses: (1) Workers with identical…
Local control theory in trajectory-based nonadiabatic dynamics
Curchod, Basile F. E.; Penfold, Thomas J.; Rothlisberger, Ursula; Tavernelli, Ivano
2011-10-15
In this paper, we extend the implementation of nonadiabatic molecular dynamics within the framework of time-dependent density-functional theory in an external field described in Tavernelli et al.[Phys. Rev. A 81, 052508 (2010)] by calculating on-the-fly pulses to control the population transfer between electronic states using local control theory. Using Tully's fewest switches trajectory surface hopping method, we perform MD to control the photoexcitation of LiF and compare the results to quantum dynamics (QD) calculations performed within the Heidelberg multiconfiguration time-dependent Hartree package. We show that this approach is able to calculate a field that controls the population transfer between electronic states. The calculated field is in good agreement with that obtained from QD, and the differences that arise are discussed in detail.
Circuit theory and model-based inference for landscape connectivity
Hanks, Ephraim M.; Hooten, Mevin B.
2013-01-01
Circuit theory has seen extensive recent use in the field of ecology, where it is often applied to study functional connectivity. The landscape is typically represented by a network of nodes and resistors, with the resistance between nodes a function of landscape characteristics. The effective distance between two locations on a landscape is represented by the resistance distance between the nodes in the network. Circuit theory has been applied to many other scientific fields for exploratory analyses, but parametric models for circuits are not common in the scientific literature. To model circuits explicitly, we demonstrate a link between Gaussian Markov random fields and contemporary circuit theory using a covariance structure that induces the necessary resistance distance. This provides a parametric model for second-order observations from such a system. In the landscape ecology setting, the proposed model provides a simple framework where inference can be obtained for effects that landscape features have on functional connectivity. We illustrate the approach through a landscape genetics study linking gene flow in alpine chamois (Rupicapra rupicapra) to the underlying landscape.
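The resistance distance at the heart of this approach is computable directly from the graph Laplacian's pseudoinverse. A minimal sketch on a toy three-node "landscape" (the Gaussian Markov random field layer of the paper is not reproduced):

```python
import numpy as np

def resistance_distance(conductance):
    """Pairwise effective-resistance matrix for a graph given its symmetric
    conductance (1/resistance) matrix, via the Laplacian pseudoinverse:
    R[i,j] = Lp[i,i] + Lp[j,j] - Lp[i,j] - Lp[j,i]."""
    L = np.diag(conductance.sum(axis=1)) - conductance
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - Lp - Lp.T

# Tiny landscape: 3 nodes in a line, unit conductance between neighbours.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
R = resistance_distance(W)   # two unit resistors in series between nodes 0 and 2
```

In the landscape-genetics setting, the conductance entries would be functions of landscape covariates, and the induced R plays the role of effective distance between sampling locations.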
NASA Technical Reports Server (NTRS)
Krempl, E.; Lu, H.; Yao, D.
1988-01-01
Short term strain rate change, creep and relaxation tests were performed in an MTS computer controlled servohydraulic testing machine. Aging and recovery were found to be insignificant for test times not exceeding 30 hrs. The material functions and constants of the theory were identified from results of strain rate change tests. Numerical integration of the theory for relaxation and creep tests showed good predictive capabilities of the viscoplasticity theory based on overstress.
Capacity and delay estimation for roundabouts using conflict theory.
Qu, Zhaowei; Duan, Yuzhou; Hu, Hongyu; Song, Xianmin
2014-01-01
To estimate the capacity of roundabouts more accurately, the priority rank of each stream is determined through the classification technique given in the Highway Capacity Manual 2010 (HCM2010), which is based on macroscopic analysis of the relationship between entry flow and circulating flow. Then a conflict matrix is established using the additive conflict flow method and by considering the impacts of traffic characteristics and limited priority with high volume. Correspondingly, the conflict relationships of streams are built using probability theory. Furthermore, the entry capacity model of roundabouts is built, and sensitivity analysis is conducted on the model parameters. Finally, the entrance delay model is derived using queuing theory, and the proposed capacity model is compared with the model proposed by Wu and that in the HCM2010. The results show that the capacity calculated by the proposed model is lower than the others for an A-type roundabout, while it is basically consistent with the estimated values from HCM2010 for a B-type roundabout. PMID:24982982
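The comparison baseline and the queuing-delay step can be illustrated with a small sketch: the HCM2010 single-lane entry capacity curve as commonly quoted, together with an M/M/1 delay approximation for an undersaturated entry. This is a simplification for illustration; the paper's derived delay model is more detailed.

```python
import math

def hcm2010_entry_capacity(v_c):
    """Single-lane roundabout entry capacity (pc/h) as a function of the
    conflicting circulating flow v_c (pc/h), as commonly quoted from HCM2010:
    c = 1130 * exp(-1.0e-3 * v_c)."""
    return 1130.0 * math.exp(-1.0e-3 * v_c)

def entry_delay(q, c):
    """Average entry delay (s/veh) from a steady-state M/M/1 approximation:
    d = 3600 / (c - q), with demand q and capacity c in veh/h.
    Valid only for undersaturated entries (q < c)."""
    if q >= c:
        raise ValueError("oversaturated entry: q must be below capacity c")
    return 3600.0 / (c - q)
```

Delay grows hyperbolically as demand approaches capacity, which is why small capacity differences between competing models matter most near saturation.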
Collective learning modeling based on the kinetic theory of active particles
NASA Astrophysics Data System (ADS)
Burini, D.; De Lillo, S.; Gibelli, L.
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
A theory-based approach to teaching young children about health: A recipe for understanding
Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley
2011-01-01
The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237
Game theory based band selection for hyperspectral images
NASA Astrophysics Data System (ADS)
Shi, Aiye; He, Zhenyu; Huang, Fengchen
2015-12-01
This paper proposes a new evaluation criterion for band selection in hyperspectral imagery. The combination of information content and class separability is used as the evaluation criterion, while the correlation between bands is imposed as a constraint condition. In addition, game theory is introduced into band selection to coordinate the potential conflict between the two evaluation criteria, information and class separability, when searching for the optimal band combination. The experimental results show that the proposed method is effective on AVIRIS hyperspectral data.
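The interplay the abstract describes, an information criterion constrained by inter-band correlation, can be sketched without the game-theoretic coordination step. The following is a minimal Python illustration in which band entropy stands in for "information" and a correlation ceiling acts as the constraint; `select_bands` and its parameters are hypothetical, and the class-separability criterion is omitted.

```python
import numpy as np

def band_entropy(band, bins=32):
    """Shannon entropy of a band's histogram: the 'information' part of
    the criterion (class separability is omitted in this sketch)."""
    hist, _ = np.histogram(band, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def select_bands(cube, k, max_corr=0.9):
    """Greedy proxy: rank bands by entropy, skip any band that is too
    correlated with an already-selected one (the constraint condition)."""
    order = np.argsort([-band_entropy(b) for b in cube])
    chosen = []
    for i in order:
        if all(abs(np.corrcoef(cube[i], cube[j])[0, 1]) < max_corr
               for j in chosen):
            chosen.append(int(i))
        if len(chosen) == k:
            break
    return chosen

rng = np.random.default_rng(1)
b0 = rng.normal(size=500)
cube = np.stack([b0, b0, rng.uniform(size=500)])  # band 1 duplicates band 0
selected = select_bands(cube, k=2)  # never keeps both duplicate bands
```

A full implementation would add a separability score (e.g. a class-distance measure) and resolve the conflict between the two criteria via the game-theoretic coordination the paper proposes.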
Wireless network traffic modeling based on extreme value theory
NASA Astrophysics Data System (ADS)
Liu, Chunfeng; Shu, Yantai; Yang, Oliver W. W.; Liu, Jiakun; Dong, Linfang
2006-10-01
In this paper, Extreme Value Theory (EVT) is applied to analyze wireless network traffic. The role of EVT is to allow the development of procedures that are scientifically and statistically rational for estimating the extreme behavior of random processes. There are two primary methods for studying extremes: the Block Maximum (BM) method and the Points Over Threshold (POT) method. By taking only the traffic data that exceed the threshold value, our experiments and analysis show that the wireless network traffic model obtained with EVT fits the empirical distribution of the traffic well, illustrating that EVT holds good promise for the analysis of wireless network traffic.
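The two extraction methods named above can be sketched in a few lines. This is a minimal Python illustration of BM and POT on a synthetic heavy-tailed series, not the paper's traffic model; fitting an extreme-value distribution to the resulting maxima or excesses would be the next step.

```python
import numpy as np

def block_maxima(traffic, block_size):
    """Block Maximum (BM): split the series into fixed-size blocks and
    keep the largest observation in each block."""
    n_blocks = len(traffic) // block_size
    trimmed = np.asarray(traffic[:n_blocks * block_size])
    return trimmed.reshape(n_blocks, block_size).max(axis=1)

def points_over_threshold(traffic, threshold):
    """Points Over Threshold (POT): keep only observations above a high
    threshold; their excesses are what an extreme-value model (e.g. a
    generalized Pareto distribution) would be fitted to."""
    traffic = np.asarray(traffic)
    exceedances = traffic[traffic > threshold]
    return exceedances - threshold  # excesses over the threshold

rng = np.random.default_rng(0)
series = rng.pareto(3.0, size=10_000)  # heavy-tailed stand-in for traffic
maxima = block_maxima(series, block_size=100)
excesses = points_over_threshold(series, threshold=2.0)
```

The threshold choice is the delicate step in POT: too low and the asymptotic extreme-value approximation fails, too high and few exceedances remain.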
Patient and nurse experiences of theory-based care.
Flanagan, Jane
2009-04-01
The pre-surgery nursing practice model derived from Newman's theory was developed to change the delivery of nursing care in a pre-surgical clinic. Guided by the theoretical knowledge of health as expanding consciousness, transpersonal caring, and reflective practice, key practice changes included a) incorporating Newman's praxis process, b) changing the physical space, and c) providing opportunities to reflect on practice. The purpose of this study was to utilize a phenomenological approach to evaluate a new model of care among 31 patients and 4 nurses. PMID:19342715
Assembly models for Papovaviridae based on tiling theory
NASA Astrophysics Data System (ADS)
Keef, T.; Taormina, A.; Twarock, R.
2005-09-01
A vital constituent of a virus is its protein shell, called the viral capsid, that encapsulates and hence provides protection for the viral genome. Assembly models are developed for viral capsids built from protein building blocks that can assume different local bonding structures in the capsid. This situation occurs, for example, for viruses in the family of Papovaviridae, which are linked to cancer and are hence of particular interest for the health sector. More specifically, the viral capsids of the (pseudo-) T = 7 particles in this family consist of pentamers that exhibit two different types of bonding structures. While this scenario cannot be described mathematically in terms of Caspar-Klug theory (Caspar D L D and Klug A 1962 Cold Spring Harbor Symp. Quant. Biol. 27 1), it can be modelled via tiling theory (Twarock R 2004 J. Theor. Biol. 226 477). The latter is used to encode the local bonding environment of the building blocks in a combinatorial structure, called the assembly tree, which is a basic ingredient in the derivation of assembly models for Papovaviridae along the lines of the equilibrium approach of Zlotnick (Zlotnick A 1994 J. Mol. Biol. 241 59). A phase space formalism is introduced to characterize the changes in the assembly pathways and intermediates triggered by the variations in the association energies characterizing the bonds between the building blocks in the capsid. Furthermore, the assembly pathways and concentrations of the statistically dominant assembly intermediates are determined. The example of Simian virus 40 is discussed in detail. PMID:16224123
Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory
ERIC Educational Resources Information Center
Johnson, David W.; Johnson, Roger T.; Smith, Karl A.
2014-01-01
Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…
ERIC Educational Resources Information Center
Bresciani, Marilee J.
2011-01-01
The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…
ERIC Educational Resources Information Center
O'Connor, Thomas G.; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen
2013-01-01
Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social…
The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation
ERIC Educational Resources Information Center
Moorer, Cleamon, Jr.
2014-01-01
This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…
Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact
ERIC Educational Resources Information Center
McQuillin, Samuel D.; Lyons, Michael D.
2016-01-01
This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…
Determination of the Sediment Carrying Capacity Based on Perturbed Theory
Ni, Zhi-hui; Zeng, Qiang; Li-chun, Wu
2014-01-01
Based on previous studies of sediment carrying capacity, a new method for determining sediment carrying capacity using perturbation theory is proposed. By taking into account the average water depth, average flow velocity, settling velocity, and other influencing factors, and by introducing the median grain size as one of the main influencing factors, we derived a new sediment carrying capacity formula. The coefficients were determined by dimensional analysis, the multiple linear regression method, and the least squares method. The new formula was then verified against measured data from natural rivers and flume tests, and the results were compared with those calculated by the Cao, Zhang, Li, Engelund-Hansen, Ackers-White, and Yang formulas. The comparison shows that the new method is highly accurate, and it could serve as a useful reference for determining sediment carrying capacity. PMID:25136652
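The coefficient-fitting step the abstract describes (dimensional analysis followed by linear regression and least squares) can be illustrated on a single composite variable. The sketch below assumes a hypothetical power-law form S = K * x**m, linearized in log space; the paper's actual multi-factor formula is not reproduced.

```python
import numpy as np

def fit_power_law(x, s):
    """Fit S = K * x**m by linear least squares in log space:
    log S = log K + m * log x."""
    A = np.column_stack([np.ones_like(x), np.log(x)])
    coef, *_ = np.linalg.lstsq(A, np.log(s), rcond=None)
    return float(np.exp(coef[0])), float(coef[1])

# synthetic check: recover known coefficients from noiseless data
x = np.linspace(0.5, 5.0, 40)   # composite hydraulic variable (hypothetical)
s = 0.05 * x ** 1.55            # sediment carrying capacity samples
K, m = fit_power_law(x, s)      # K ~= 0.05, m ~= 1.55
```

With several influencing factors (depth, velocity, settling velocity, median grain size), the same linearization yields a multiple linear regression in the logarithms of each factor.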
Dynamic Simulation of Backward Diffusion Based on Random Walk Theory
NASA Astrophysics Data System (ADS)
Dung, Vu Ba; Nguyen, Bui Huu
2016-06-01
Results of diffusion studies in silicon showed that the diffusion of self-interstitials and vacancies can be backward diffusion, and that their diffusivity can be negative [1]. The backward diffusion process and negative diffusivity are contrary to the fundamental laws of diffusion, such as Fick's law: the diffusive flux of backward diffusion goes from regions of low concentration to regions of high concentration. The backward diffusion process has been explained [2]. In this paper, the backward diffusion process is simulated. The results correspond to theory and show that when the thermal velocity in the low-concentration area is greater than the thermal velocity in the high-concentration area, backward diffusion can occur.
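A toy version of this mechanism can be simulated directly. The sketch below is an illustration, not the authors' model: random walkers get a larger step ("thermal velocity") in the low-concentration half of a 1D domain, and the net flux then runs from low to high concentration until the populations balance against the velocity ratio.

```python
import random

random.seed(42)

L = 1.0                          # domain [0, 2L), interface at x = L
v_left, v_right = 0.05, 0.005    # step size ("thermal velocity") per region

# low concentration on the fast (left) side, high on the slow (right) side
walkers = [random.uniform(0, L) for _ in range(200)] + \
          [random.uniform(L, 2 * L) for _ in range(800)]

def step(x):
    """One random-walk step whose size depends on the current region."""
    v = v_left if x < L else v_right
    x += random.choice((-v, v))
    return min(max(x, 0.0), 2 * L - 1e-9)   # reflecting walls

for _ in range(2000):
    walkers = [step(x) for x in walkers]

n_left = sum(x < L for x in walkers)
n_right = len(walkers) - n_left
# net migration ran from the low-concentration (fast) region into the
# high-concentration (slow) region: "backward" relative to Fick's law
```

At the interface the crossing fluxes balance when density times velocity is equal on both sides, so the slow, already-crowded side keeps gaining walkers until its density is roughly v_left/v_right times the fast side's.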
Effects of a social cognitive theory-based hip fracture prevention web site for older adults.
Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley
2010-01-01
The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions. PMID:20978408
An anti-attack model based on complex network theory in P2P networks
NASA Astrophysics Data System (ADS)
Peng, Hao; Lu, Songnian; Zhao, Dandan; Zhang, Aixin; Li, Jianhua
2012-04-01
Complex network theory is a useful way to study many real systems. In this paper, an anti-attack model based on complex network theory is introduced. The mechanism of this model is based on a dynamic compensation process and a reverse percolation process in P2P networks. The main purpose of the paper is: (i) a dynamic compensation process can turn an attacked P2P network into a power-law (PL) network with exponential cutoff; (ii) a local healing process can restore the maximum degree of peers in an attacked P2P network to a normal level; (iii) a restoring process based on reverse percolation theory connects the fragmentary peers of an attacked P2P network together into a giant connected component. In this way, the model based on complex network theory can be effectively utilized for anti-attack and protection purposes in P2P networks.
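Point (iii), stitching fragmentary peers back into a giant connected component, can be illustrated with a union-find sketch. The topology and the `giant_component_size` helper below are hypothetical; the model's dynamic compensation and reverse percolation processes are not reproduced.

```python
from collections import Counter

class DisjointSet:
    """Union-find over n nodes with path compression."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, a):
        while self.parent[a] != a:
            self.parent[a] = self.parent[self.parent[a]]
            a = self.parent[a]
        return a
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def giant_component_size(n, edges):
    """Size of the largest connected component of an undirected graph."""
    ds = DisjointSet(n)
    for a, b in edges:
        ds.union(a, b)
    return max(Counter(ds.find(i) for i in range(n)).values())

# attacked network: three fragments of 4, 3, and 3 peers
fragments = [(0, 1), (1, 2), (2, 3), (4, 5), (5, 6), (7, 8), (8, 9)]
before = giant_component_size(10, fragments)                     # 4
# restoring edges stitch the fragments into one giant component
after = giant_component_size(10, fragments + [(3, 4), (6, 7)])   # 10
```

In a restoration process one would track how the giant component grows as candidate edges are added, the reverse of watching it fragment under percolation-style attack.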
Saloma, Caesar; Perez, Gay Jane; Gavile, Catherine Ann; Ick-Joson, Jacqueline Judith; Palmes-Saloma, Cynthia
2015-01-01
We study the impact of prior individual training during group emergency evacuation using mice that escape from an enclosed water pool to a dry platform via any of two possible exits. Experimenting with mice avoids serious ethical and legal issues that arise when dealing with unwitting human participants while minimizing concerns regarding the reliability of results obtained from simulated experiments using 'actors'. First, mice were trained separately and their individual escape times measured over several trials. Mice learned quickly to swim towards an exit-they achieved their fastest escape times within the first four trials. The trained mice were then placed together in the pool and allowed to escape. No two mice were permitted in the pool beforehand and only one could pass through an exit opening at any given time. At first trial, groups of trained mice escaped seven and five times faster than their corresponding control groups of untrained mice at pool occupancy rate ρ of 11.9% and 4%, respectively. Faster evacuation happened because trained mice: (a) had better recognition of the available pool space and took shorter escape routes to an exit, (b) were less likely to form arches that blocked an exit opening, and (c) utilized the two exits efficiently without preference. Trained groups achieved continuous egress without an apparent leader-coordinator (self-organized queuing)-a collective behavior not experienced during individual training. Queuing was unobserved in untrained groups where mice were prone to wall seeking, aimless swimming and/or blind copying that produced circuitous escape routes, biased exit use and clogging. The experiments also reveal that faster and less costly group training at ρ = 4%, yielded an average individual escape time that is comparable with individualized training. However, group training in a more crowded pool (ρ = 11.9%) produced a longer average individual escape time. PMID:25693170
ERIC Educational Resources Information Center
Knox, A. Whitney; Miller, Bruce A.
1980-01-01
Describes a method for estimating the number of cathode ray tube terminals needed for public use of an online library catalog. Authors claim method could also be used to estimate needed numbers of microform readers for a computer output microform (COM) catalog. Formulae are included. (Author/JD)
NASA Technical Reports Server (NTRS)
Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander
2010-01-01
Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analyses by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single-layer theories often results in inaccuracies near the boundaries and when the laminae have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field, which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as with the refined zigzag theory.
Improved routing strategy based on gravitational field theory
NASA Astrophysics Data System (ADS)
Song, Hai-Quan; Guo, Jin
2015-10-01
Routing and path selection are crucial for many communication and logistics applications. We study the interaction between nodes and packets and establish a simple model, based on gravitational field theory, that describes the attraction of a node to a packet during transmission, considering both the real and the potential congestion of the nodes. On the basis of this model, we propose a gravitational field routing strategy that considers the attraction of every node on the travel path to the packet. To illustrate the efficiency of the proposed routing algorithm, we introduce an order parameter that measures the throughput of the network by the critical value of the phase transition from a free-flow phase to a congested phase, and study the distribution of betweenness centrality and traffic jams. Simulations show that, compared with the shortest-path routing strategy, the gravitational field routing strategy considerably enhances the throughput of the network and balances the traffic load, with nearly all of the nodes used efficiently. Project supported by the Technology and Development Research Project of China Railway Corporation (Grant No. 2012X007-D) and the Key Program of Technology and Development Research Foundation of China Railway Corporation (Grant No. 2012X003-A).
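The idea of biasing path selection by node attraction can be sketched as a Dijkstra variant. The cost function below (a hop cost inflated by the next node's queue occupancy) is a hypothetical stand-in for the paper's gravitational attraction, chosen only to show congestion-aware detouring.

```python
import heapq

def gravitational_route(adj, queue_len, capacity, src, dst):
    """Dijkstra on a congestion-penalized cost: traversing an edge into
    node v costs 1 + queue_len[v]/capacity[v], a stand-in for the
    (inverse) attraction of a lightly loaded node."""
    dist = {src: 0.0}
    heap = [(0.0, src, [src])]
    seen = set()
    while heap:
        d, u, path = heapq.heappop(heap)
        if u == dst:
            return path
        if u in seen:
            continue
        seen.add(u)
        for v in adj[u]:
            nd = d + 1.0 + queue_len[v] / capacity[v]
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v, path + [v]))
    return None

adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
capacity = {i: 10 for i in adj}
# node 1 lies on a shortest hop-count path but is heavily congested
path = gravitational_route(adj, {0: 0, 1: 9, 2: 1, 3: 0}, capacity, 0, 3)
# routing detours through the lightly loaded node 2: path == [0, 2, 3]
```

A shortest-path strategy would treat routes 0-1-3 and 0-2-3 as equal; the congestion-aware cost breaks the tie toward the uncongested route, which is the load-balancing effect the abstract reports.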
Treatment motivation in drug users: a theory-based analysis.
Longshore, Douglas; Teruya, Cheryl
2006-02-01
Motivation for drug use treatment is widely regarded as crucial to a client's engagement in treatment and success in quitting drug use. Motivation is typically measured with items reflecting high treatment readiness (e.g., perceived need for treatment and commitment to participate) and low treatment resistance (e.g., skepticism regarding benefits of treatment). Building upon reactance theory and the psychotherapeutic construct of resistance, we conceptualized these two aspects of treatment motivation - readiness and resistance - as distinct constructs and examined their predictive power in a sample of 1295 drug-using offenders referred to treatment while on probation. The sample was 60.7% African Americans, 33.5% non-Hispanic Whites, and 21.2% women; their ages ranged from 16 to 63 years old. Interviews occurred at treatment entry and 6 months later. Readiness (but not resistance) predicted treatment retention during the 6-month period. Resistance (but not readiness) predicted drug use, especially among offenders for whom the treatment referral was coercive. These findings suggest that readiness and resistance should both be assessed among clients entering treatment, especially when the referral is coercive. Intake and counseling protocols should address readiness and resistance separately. PMID:16051447
Solar Activity Predictions Based on Solar Dynamo Theories
NASA Astrophysics Data System (ADS)
Schatten, Kenneth H.
2009-05-01
We review solar activity prediction methods: statistical, precursor, and recently the Dikpati and Choudhury groups' numerical flux-dynamo methods. In outlining the various methods, we compare precursor techniques with weather forecasting. Precursors involve events prior to a solar cycle. First used by the Russian geomagnetician Ohl, and later by Brown and Williams, the Earth's field variations near solar minimum were used to predict the next solar cycle, with a correlation of 0.95. From the standpoint of causality, as well as energetically, these relationships were somewhat bizarre. One index used was the "number of anomalous quiet days," an antiquated, subjective index. Scientific progress cannot be made without some suspension of disbelief; otherwise old paradigms become tautologies. So, with youthful naïveté, Svalgaard, Scherrer, Wilcox and I viewed the results through rose-colored glasses and pressed ahead searching for understanding. We eventually fumbled our way to explaining how the Sun could broadcast the state of its internal dynamo to Earth. We noted one key aspect of the Babcock-Leighton flux-dynamo theory: the polar field at the end of a cycle serves as a seed for the next cycle's growth. Near solar minimum this field usually bathes the Earth, and thereby affects geomagnetic indices then. We found support by examining eight previous solar cycles. Using our solar precursor technique we successfully predicted cycles 21, 22 and 23 using WSO and MWSO data. Pesnell and I improved the method using a SODA (SOlar Dynamo Amplitude) index. In 2005, nearing cycle 23's minimum, Svalgaard and I noted an unusually weak polar field and forecasted a small cycle 24. We discuss future advances: the flux-dynamo methods. As for future solar activity, I shall let the Sun decide; it will do so anyhow.
A Composition Curriculum Based on James Britton's Theories.
ERIC Educational Resources Information Center
Monahan, Brian D.; Zelner, Jane
In 1979, the Yonkers Public School district (New York) launched a project to design and implement secondary school language arts curriculum guides with an emphasis on written composition. A theoretical framework was developed, based on the work of James Britton and the philosophy of the Bay Area Writing Project (BAWP). Britton's work provided the…
Integrated Models of School-Based Prevention: Logic and Theory
ERIC Educational Resources Information Center
Domitrovich, Celene E.; Bradshaw, Catherine P.; Greenberg, Mark T.; Embry, Dennis; Poduska, Jeanne M.; Ialongo, Nicholas S.
2010-01-01
School-based prevention programs can positively impact a range of social, emotional, and behavioral outcomes. Yet the current climate of accountability pressures schools to restrict activities that are not perceived as part of the core curriculum. Building on models from public health and prevention science, we describe an integrated approach to…
Designing Site-Based Systems, Deriving a Theory of Practice.
ERIC Educational Resources Information Center
Bauer, Scott C.
1998-01-01
Reviews five dimensions (focus, scope, structure, process, and capacity) of an organizational design used by 20 New York districts planning for site-based management (SBM) implementation. The confusion surrounding devolution of decision making hinders districts' efforts to effect changes in intermediate variables (job satisfaction and staff…
Content Based Image Retrieval and Information Theory: A General Approach.
ERIC Educational Resources Information Center
Zachary, John; Iyengar, S. S.; Barhen, Jacob
2001-01-01
Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
A Conceptual Framework Based on Activity Theory for Mobile CSCL
ERIC Educational Resources Information Center
Zurita, Gustavo; Nussbaum, Miguel
2007-01-01
There is a need for collaborative group activities that promote student social interaction in the classroom. Handheld computers interconnected by a wireless network allow people who work on a common task to interact face to face while maintaining the mediation afforded by a technology-based system. Wirelessly interconnected handhelds open up new…
Learning Trajectory Based Instruction: Toward a Theory of Teaching
ERIC Educational Resources Information Center
Sztajn, Paola; Confrey, Jere; Wilson, P. Holt; Edgington, Cynthia
2012-01-01
In this article, we propose a theoretical connection between research on learning and research on teaching through recent research on students' learning trajectories (LTs). We define learning trajectory based instruction (LTBI) as teaching that uses students' LTs as the basis for instructional decisions. We use mathematics as the context for our…
An Efficacious Theory-Based Intervention for Stepfamilies
ERIC Educational Resources Information Center
Forgatch, Marion S.; DeGarmo, David S.; Beldavs, Zintars G.
2005-01-01
This article evaluates the efficacy of the Oregon model of Parent Management Training (PMTO) in the stepfamily context. Sixty-seven of 110 participants in the Marriage and Parenting in Stepfamilies (MAPS) program received a PMTO-based intervention. Participants in the randomly assigned experimental group displayed a large effect in benefits to…
Boundary based on exchange symmetry theory for multilevel simulations. I. Basic theory.
Shiga, Motoyuki; Masia, Marco
2013-07-28
In this paper, we lay the foundations for a new method that allows multilevel simulations of a diffusive system, i.e., a system where a flux of particles through the boundaries might disrupt the primary region. The method is based on the use of flexible restraints that maintain the separation between inner and outer particles. It is shown that, by introducing a bias potential that accounts for the exchange symmetry of the system, the correct statistical distribution is preserved. Using a toy model consisting of non-interacting particles in an asymmetric potential well, we prove that the method is formally exact, and that it could be simplified by considering only up to a couple of particle exchanges without a loss of accuracy. A real-world test is then made by considering a hybrid MM(∗)/MM calculation of cesium ion in water. In this case, the single exchange approximation is sound enough that the results superimpose to the exact solutions. Potential applications of this method to many different hybrid QM/MM systems are discussed, as well as its limitations and strengths in comparison to existing approaches. PMID:23901973
Non-fragile H∞ synchronization of memristor-based neural networks using passivity theory.
Mathiyalagan, K; Anbuvithya, R; Sakthivel, R; Park, Ju H; Prakash, P
2016-02-01
In this paper, we formulate and investigate the mixed H∞ and passivity based synchronization criteria for memristor-based recurrent neural networks with time-varying delays. Some sufficient conditions are obtained to guarantee the synchronization of the considered neural network based on the master-slave concept, differential inclusions theory and Lyapunov-Krasovskii stability theory. Also, the memristive neural network is considered with two different types of memductance functions and two types of gain variations. The results for non-fragile observer-based synchronization are derived in terms of linear matrix inequalities (LMIs). Finally, the effectiveness of the proposed criterion is demonstrated through numerical examples. PMID:26655373
Scale-invariant entropy-based theory for dynamic ordering
Mahulikar, Shripad P. E-mail: spm@aero.iitb.ac.in; Kumari, Priti
2014-09-01
Dynamically Ordered self-organized dissipative structure exists in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence based on its structured mass and/or energy interactions with the surroundings is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations.
Research on e-learning services based on ontology theory
NASA Astrophysics Data System (ADS)
Liu, Rui
2013-07-01
E-learning services enable network learning resources to be shared and made interoperable, but they cannot by themselves achieve automatic discovery, invocation, and integration of services. This paper proposes a framework for e-learning services based on ontology; ontology technology is applied to the publication and discovery processes of e-learning services in order to achieve accurate and efficient retrieval and utilization of those services.
Ab initio theory of iron-based superconductors
NASA Astrophysics Data System (ADS)
Essenberger, F.; Sanna, A.; Buczek, P.; Ernst, A.; Sandratskii, L.; Gross, E. K. U.
2016-07-01
We report a first-principles study of the superconducting critical temperature and other properties of Fe-based superconductors taking into account, on equal footing, phonon, charge, and spin-fluctuation mediated Cooper pairing. We show that in FeSe this leads to a modulated s ± gap symmetry and that the antiferromagnetic paramagnons are the leading mechanism for superconductivity in FeSe, overcoming the strong repulsive effect of both phonons and charge pairing.
An open-shell restricted Hartree-Fock perturbation theory based on symmetric spin orbitals
NASA Technical Reports Server (NTRS)
Lee, Timothy J.; Jayatilaka, Dylan
1993-01-01
A new open-shell perturbation theory is formulated in terms of symmetric spin orbitals. Only one set of spatial orbitals is required, thereby reducing the number of independent coefficients in the perturbed wavefunctions. For second order, the computational cost is shown to be similar to a closed-shell calculation. This formalism is therefore more efficient than the recently developed RMP, ROMP or RMP-MBPT theories. The perturbation theory described herein was designed to have a close correspondence with our recently proposed coupled-cluster theory based on symmetric spin orbitals. The first-order wavefunction contains contributions from only doubly excited determinants. Equilibrium structures and vibrational frequencies determined from second-order perturbation theory are presented for OH, NH, CH, O2, NH2 and CH2.
Ductile damage modeling based on void coalescence and percolation theories
Tonks, D.L.; Zurek, A.K.; Thissell, W.R.
1995-09-01
A general model for ductile damage in metals is presented. It includes damage induced by shear stress as well as damage caused by volumetric tension. Spallation is included as a special case. Strain induced damage is also treated. Void nucleation and growth are included, and give rise to strain rate effects. Strain rate effects also arise in the model through elastic release wave propagation between damage centers. The underlying physics of the model is the nucleation, growth, and coalescence of voids in a plastically flowing solid. The model is intended for hydrocode based computer simulation. An experimental program is underway to validate the model.
Venture Capital Investment Based on Grey Relational Theory
NASA Astrophysics Data System (ADS)
Zhang, Xubo
This paper builds a venture capital investment project selection evaluation model based on risk-weighted investment return, using grey relational analysis. The risk and return in the venture capital project selection process are analyzed; they are mainly concentrated in five aspects: management ability, operation ability, market ability, exit, and investment cost. Eighteen sub-indicators are the impact factors contributing to these five evaluation aspects. Grey relational analysis is used to evaluate venture capital investment selection and to obtain the optimal solution of the risk-weighted double-objective investment selection evaluation model. An example is used to demonstrate the model.
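The grey relational evaluation step described in this abstract can be sketched as follows; the distinguishing coefficient ζ = 0.5 and the toy data in the usage note are illustrative assumptions, not the paper's 18-indicator data set.

```python
# Grey relational analysis (GRA) sketch: score candidate projects against a
# reference (ideal) indicator series. Data and zeta are illustrative.

def grey_relational_grades(reference, candidates, zeta=0.5):
    """Mean grey relational coefficient of each candidate vs. the reference."""
    # Absolute deviations of each candidate's indicators from the reference.
    deltas = [[abs(r - c) for r, c in zip(reference, cand)] for cand in candidates]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)
    grades = []
    for row in deltas:
        # Grey relational coefficient per indicator, then average to a grade.
        coeffs = [(d_min + zeta * d_max) / (d + zeta * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades
```

Candidates are then ranked by grade; the candidate closest to the reference series scores highest (a grade of 1.0 means it matches the reference exactly).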
Mindfulness-based cognitive therapy: theory and practice.
Sipe, Walter E B; Eisendrath, Stuart J
2012-02-01
Mindfulness-based cognitive therapy (MBCT) incorporates elements of cognitive-behavioural therapy with mindfulness-based stress reduction into an 8-session group program. Initially conceived as an intervention for relapse prevention in people with recurrent depression, it has since been applied to various psychiatric conditions. Our paper aims to briefly describe MBCT and its putative mechanisms of action, and to review the current findings about the use of MBCT in people with mood and anxiety disorders. The therapeutic stance of MBCT focuses on encouraging patients to adopt a new way of being and relating to their thoughts and feelings, while placing little emphasis on altering or challenging specific cognitions. Preliminary functional neuroimaging studies are consistent with an account of mindfulness improving emotional regulation by enhancing cortical regulation of limbic circuits and attentional control. Research findings from several randomized controlled trials suggest that MBCT is a useful intervention for relapse prevention in patients with recurrent depression, with efficacy that may be similar to maintenance antidepressants. Preliminary studies indicate MBCT also shows promise in the treatment of active depression, including treatment-resistant depression. Pilot studies have also evaluated MBCT in bipolar disorder and anxiety disorders. Patient and clinician resources for further information on mindfulness and MBCT are provided. PMID:22340145
Following Human Footsteps: Proposal of a Decision Theory Based on Human Behavior
NASA Technical Reports Server (NTRS)
Mahmud, Faisal
2011-01-01
Human behavior is complex in nature, depending on circumstances and on decisions that vary from time to time and from place to place. The way a decision is made is either directly or indirectly related to the availability of options. These options, though they appear random, follow a solid directional path toward a decision. In this paper, a decision theory based on human behavior is proposed. The theory is structured with model sets that show all possible combinations for making a decision. A virtual, simulated environment is used to show the results of the proposed decision theory.
Game Theory Based Trust Model for Cloud Environment
Gokulnath, K.; Uthariaraj, Rhymend
2015-01-01
The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theory-based approach for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the cold-start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of cloud users' applications to cloud service providers for segregating trust levels is achieved as part of the mapping; thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method, considering several metrics such as execution time, accuracy, error identification, and undecidability of the resources. PMID:26380365
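As a minimal sketch of the Nash-equilibrium reasoning such a trust model relies on, the following finds pure-strategy equilibria of a two-player game by best-response checks. The payoff matrices in the usage note are a standard prisoner's-dilemma example, not the paper's SLA game.

```python
# Pure-strategy Nash equilibrium finder for a two-player bimatrix game.
# A cell (i, j) is an equilibrium when neither player gains by deviating alone.

def pure_nash_equilibria(payoff_a, payoff_b):
    """Return all (row, col) cells that are pure-strategy Nash equilibria."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    eqs = []
    for i in range(rows):
        for j in range(cols):
            # Row player cannot improve by switching rows at fixed column j.
            best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            # Column player cannot improve by switching columns at fixed row i.
            best_col = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
            if best_row and best_col:
                eqs.append((i, j))
    return eqs
```

For the prisoner's dilemma with payoffs A = [[-1, -3], [0, -2]] and B = [[-1, 0], [-3, -2]], the only pure equilibrium is mutual defection, cell (1, 1).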
NASA Astrophysics Data System (ADS)
Oral, I.; Dogan, O.
2007-04-01
The aim of this study is to find out the effect of course materials based on Multiple Intelligence Theory on the learning process of different intelligence groups. The results showed that materials prepared according to Multiple Intelligence Theory have a considerable effect on students' learning. This effect was particularly seen in the student groups with musical-rhythmic, verbal-linguistic, interpersonal-social and naturalist intelligences.
Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory
NASA Astrophysics Data System (ADS)
Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui
The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for solving contradictions, but they are not systematized. Combined with the concept of a technique system, this paper summarizes an integrated solution method for contradictions based on TRIZ contradiction theory, and a flowchart of the integrated method is given. As a case study, a method for fusion jointing of PE pipes is analyzed.
Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment
ERIC Educational Resources Information Center
Chen, Jing
2012-01-01
Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…
Using Game Theory and Competition-Based Learning to Stimulate Student Motivation and Performance
ERIC Educational Resources Information Center
Burguillo, Juan C.
2010-01-01
This paper introduces a framework for using Game Theory tournaments as a base to implement Competition-based Learning (CnBL), together with other classical learning techniques, to motivate the students and increase their learning performance. The paper also presents a description of the learning activities performed along the past ten years of a…
ERIC Educational Resources Information Center
Bridgstock, Ruth
2007-01-01
This paper documents the initial development and validation of a brief quantitative measure of career development influences based on the Systems Theory Framework (STF) of career development (McMahon & Patton, 1995; Patton & McMahon, 1997, 1999, 2006). Initial exploratory factor analyses of pilot study data revealed a six-factor structure based on…
Controlling Retrieval during Practice: Implications for Memory-Based Theories of Automaticity
ERIC Educational Resources Information Center
Wilkins, Nicolas J.; Rawson, Katherine A.
2011-01-01
Memory-based processing theories of automaticity assume that shifts from algorithmic to retrieval-based processing underlie practice effects on response times. The current work examined the extent to which individuals can exert control over the involvement of retrieval during skill acquisition and the factors that may influence control. In two…
Strategies for Integrating Computer-Based Training in College Music Theory Courses.
ERIC Educational Resources Information Center
Hess, George J., Jr.
During the fall semester of 1993, a curriculum-based computer-based training (CBT) program was used to replace all in-class drills in intervals and chord identification for one section of freshman music theory at the University of Northern Colorado. This study was conducted to determine whether aural skills can be taught as effectively through the…
Constraint-Based Modeling: From Cognitive Theory to Computer Tutoring--and Back Again
ERIC Educational Resources Information Center
Ohlsson, Stellan
2016-01-01
The ideas behind the constraint-based modeling (CBM) approach to the design of intelligent tutoring systems (ITSs) grew out of attempts in the 1980's to clarify how declarative and procedural knowledge interact during skill acquisition. The learning theory that underpins CBM was based on two conceptual innovations. The first innovation was to…
Time dependent mechanical modeling for polymers based on network theory
NASA Astrophysics Data System (ADS)
Billon, Noëlle
2016-05-01
Despite many attempts in recent years, the complex mechanical behaviour of polymers remains incompletely modelled, making industrial design of structures under complex, cyclic and severe loadings not fully reliable. The nonlinear, dissipative viscoelastic-viscoplastic behaviour of these materials requires taking into account the nonlinear, combined effects of mechanical and thermal phenomena. In this view, a visco-hyperelastic, viscoplastic model based on a network description of the material has recently been developed within a complete thermodynamic framework in order to account for the main thermo-mechanical couplings. A way to account for the coupled effects of strain rate and temperature was also suggested. First experimental validations, conducted in the 1D limit on an amorphous rubbery-like PMMA under isothermal conditions, gave fairly good results. In this paper a more complete formalism is presented and validated for semi-crystalline polymers: a PA66 and a PET (either amorphous or semi-crystalline) are used. The protocol for identifying the constitutive parameters is described. It is concluded that this new approach should be the route to accurately modelling the thermo-mechanical behaviour of polymers using a reduced number of parameters with some physical meaning.
Microfluidic, Bead-Based Assay: Theory and Experiments
Thompson, Jason A.; Bau, Haim H.
2009-01-01
Microbeads are frequently used as a solid support for biomolecules such as proteins and nucleic acids in heterogeneous microfluidic assays. However, relatively few studies investigate the binding kinetics on modified bead surfaces in a microfluidics context. In this study, a customized hot embossing technique is used to stamp microwells in a thin plastic substrate where streptavidin-coated agarose beads are selectively placed and subsequently immobilized within a conduit. Biotinylated quantum dots are used as a label to monitor target analyte binding to the bead's surface. Three-dimensional finite element simulations are carried out to model the binding kinetics on the bead's surface. The model accounts for surface exclusion effects resulting from a single quantum dot occluding multiple receptor sites. The theoretical predictions are compared and favorably agree with experimental observations. The theoretical simulations provide a useful tool to predict how varying parameters affect microbead reaction kinetics and sensor performance. This study enhances our understanding of bead-based microfluidic assays and provides a design tool for developers of point-of-care, lab-on-chip devices for medical diagnosis, food and water quality inspection, and environmental monitoring. PMID:19766545
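A zero-dimensional surrogate for the binding kinetics modeled in this abstract is the well-mixed Langmuir rate equation db/dt = k_on·c·(b_max − b) − k_off·b. The sketch below Euler-integrates it with illustrative rate constants; the paper's three-dimensional finite-element model and its surface-exclusion effect are not reproduced here.

```python
# Well-mixed Langmuir binding sketch: bound-site density b(t) approaching the
# equilibrium b_max * c / (c + k_off/k_on). All parameter values are assumptions.

def langmuir_binding(c, b_max, k_on, k_off, dt, steps):
    """Euler-integrate db/dt = k_on*c*(b_max - b) - k_off*b from b(0) = 0."""
    b = 0.0
    trace = [b]
    for _ in range(steps):
        b += dt * (k_on * c * (b_max - b) - k_off * b)  # forward Euler step
        trace.append(b)
    return trace
```

With c = b_max = k_on = k_off = 1, the analytic solution is b(t) = 0.5·(1 − e^(−2t)), so the trace rises monotonically toward 0.5; richer models add transport limitation on top of this kinetic baseline.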
Stanton, John F
2016-07-21
Semiclassical transition-state theory based on fourth-order vibrational perturbation theory (VPT4-SCTST) is applied to compute the barrier transmission coefficient for the symmetric Eckart potential. For a barrier parametrized to mimic the H2 + H exchange reaction, the results obtained are in excellent agreement with exact quantum calculations over a range of energy that extends down to roughly 1% of the barrier height, V0, where tunneling is negligible. The VPT2-SCTST treatment, which is commonly used in chemical kinetics studies, also performs quite well but already shows an error of a few percent at ca. 0.8 V0, where tunneling is still important. This suggests that VPT4-SCTST could offer an improvement over VPT2-SCTST in application studies. However, the computational effort for VPT4-SCTST treatments of molecules is excessive, and any improvement gained is unlikely to warrant the increased effort. Nevertheless, the treatment of the symmetric Eckart barrier problem here suggests a simple modification of the usual VPT2-SCTST protocol that warrants further investigation. PMID:27358083
A Christian faith-based recovery theory: understanding God as sponsor.
Timmons, Shirley M
2012-12-01
This article reports the development of a substantive theory to explain an evangelical Christian-based process of recovery from addiction. Faith-based, 12-step, mutual aid programs can improve drug abstinence by offering: (a) an intervention option alone and/or in conjunction with secular programs and (b) an opportunity for religious involvement. Although literature on religion, spirituality, and addiction is voluminous, traditional 12-step programs fail to explain the mechanism that underpins the process of Christian-based recovery (CR). This pilot study used grounded theory to explore and describe the essence of recovery of 10 former crack cocaine-addicted persons voluntarily enrolled in a CR program. Data were collected from in-depth interviews during 4 months of 2008. Audiotapes were transcribed verbatim, and the constant comparative method was used to analyze data resulting in the basic social process theory, understanding God as sponsor. The theory was determined through writing theoretical memos that generated key elements that allow persons to recover: acknowledging God-centered crises, communicating with God, and planning for the future. Findings from this preliminary study identifies important factors that can help persons in recovery to sustain sobriety and program administrators to benefit from theory that guides the development of evidence-based addiction interventions. PMID:21046250
Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Kawamoto, Masaru
This paper proposes a new technique that constructs strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, this paper proposes a method using prospect theory from behavioral finance to set a psychological bias for profit and loss, and attempts to select the appropriate strike price of the option for higher investment efficiency. As a result, the technique produced good results and demonstrated the effectiveness of the trading model through the optimized dealing strategy.
Critical temperature of trapped interacting bosons from large-N-based theories
NASA Astrophysics Data System (ADS)
Kim, Tom; Chien, Chih-Chun
2016-03-01
Ultracold atoms provide clues to an important many-body problem regarding the dependence of the Bose-Einstein condensation transition temperature Tc on interactions. However, cold atoms are trapped in harmonic potentials and theoretical evaluations of the Tc shift of trapped interacting Bose gases are challenging. While previous predictions of the leading-order shift have been confirmed, more recent experiments exhibit higher-order corrections beyond available mean-field theories. By implementing two large-N-based theories with the local density approximation (LDA), we extract next-order corrections of the Tc shift. The leading-order large-N theory produces results quantitatively different from the latest experimental data. The leading-order auxiliary-field (LOAF) theory, containing both normal and anomalous density fields, captures the Tc shift accurately in the weak-interaction regime. However, the LOAF theory shows incompatible behavior with the LDA, and forcing the LDA leads to density discontinuities in the trap profiles. We present a phenomenological model based on the LOAF theory, which repairs the incompatibility and provides a prediction of the Tc shift in the stronger-interaction regime.
ERIC Educational Resources Information Center
Grigorenko, Elena L.; Sternberg, Robert J.; Ehrman, Madeline E.
2000-01-01
Presents a rationale, description, and partial construct validation of a new theory of foreign language aptitude: CANAL-F--Cognitive Ability for Novelty in Acquisition of Language (foreign). The theory was applied and implemented in a test of foreign language aptitude (CANAL-FT). Outlines the CANAL-F theory and details of its instrumentation…
ERIC Educational Resources Information Center
Gerjets, Peter; Scheiter, Katharina; Cierniak, Gabriele
2009-01-01
In this paper, two methodological perspectives are used to elaborate on the value of cognitive load theory (CLT) as a scientific theory. According to the more traditional critical rationalism of Karl Popper, CLT cannot be considered a scientific theory because some of its fundamental assumptions cannot be tested empirically and are thus not…
Hlushak, Stepan
2015-09-28
An analytical expression for the Laplace transform of the radial distribution function of a mixture of hard-sphere chains of arbitrary segment size and chain length is used to rigorously formulate the first-order Barker-Henderson perturbation theory for the contribution of the segment-segment dispersive interactions into thermodynamics of the Lennard-Jones chain mixtures. Based on this approximation, a simple variant of the statistical associating fluid theory is proposed and used to predict properties of several mixtures of chains of different lengths and segment sizes. The theory treats the dispersive interactions more rigorously than the conventional theories and provides means for more accurate description of dispersive interactions in the mixtures of highly asymmetric components.
Non-equilibrium quantum theory for nanodevices based on the Feynman-Vernon influence functional
NASA Astrophysics Data System (ADS)
Jin, Jinshuang; Wei-Yuan Tu, Matisse; Zhang, Wei-Min; Yan, YiJing
2010-08-01
In this paper, we present a non-equilibrium quantum theory for transient electron dynamics in nanodevices based on the Feynman-Vernon influence functional. Applying the exact master equation for nanodevices we recently developed to the more general case in which all the constituents of a device vary in time in response to time-dependent external voltages, we obtained non-perturbatively the transient quantum transport theory in terms of the reduced density matrix. The theory enables us to study transient quantum transport in nanostructures with back-reaction effects from the contacts, with non-Markovian dissipation and decoherence being fully taken into account. For a simple illustration, we apply the theory to a single-electron transistor subjected to ac bias voltages. The non-Markovian memory structure and the nonlinear response functions describing transient electron transport are obtained.
NASA Astrophysics Data System (ADS)
Boon, Greet; De Proft, Frank; Langenaeker, Wilfried; Geerlings, Paul
1998-10-01
Molecular similarity is studied via density functional theory-based similarity indices using a numerical integration method. Complementary to the existing similarity indices, we introduce a reactivity-related similarity index based on the local softness. After a study of some test systems, a series of peptide isosteres is studied in view of their importance in pharmacology. The whole of the present work illustrates the importance of the study of molecular similarity based on both shape and reactivity.
NASA Astrophysics Data System (ADS)
Zhang, Lei; Chen, Lingen; Sun, Fengrui
2016-03-01
The finite-time thermodynamic method based on probability analysis can more accurately describe various performance parameters of thermodynamic systems. Based on the relation between optimal efficiency and power output of a generalized Carnot heat engine with a finite high-temperature heat reservoir (heat source), an infinite low-temperature heat reservoir (heat sink), and heat transfer as the only irreversibility, this paper studies power optimization of a chemically driven heat engine using first- and second-order reaction kinetics. It puts forward a model of the coupled heat engine that can be run periodically and obtains the effects of the finite-time thermodynamic characteristics of the coupling between chemical reaction and heat engine on power optimization. The results show that the first-order reaction kinetics model uses fuel more effectively and provides the heat engine with a higher-temperature heat source, increasing its power output. Moreover, the power fluctuation bounds of the chemically driven heat engine are obtained using the probability analysis method. The results may provide guidelines for the character analysis and power optimization of chemically driven heat engines.
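The simplest finite-time-thermodynamics baseline behind such analyses is the endoreversible (Novikov-type) engine with a single hot-side heat-transfer irreversibility, whose power P(η) = K·η·(T_H − T_L/(1 − η)) peaks at the Curzon-Ahlborn efficiency η* = 1 − √(T_L/T_H). The sketch below checks this numerically; it is a textbook baseline under assumed conductance K = 1, not the paper's chemically coupled model.

```python
import math

def endoreversible_power(eta, t_hot, t_cold, k=1.0):
    """Power of a Novikov engine with hot-side conductance k at efficiency eta."""
    # Hot-side heat flux q = k*(t_hot - t1) with working temperature t1 = t_cold/(1-eta).
    return k * eta * (t_hot - t_cold / (1.0 - eta))

def efficiency_at_max_power(t_hot, t_cold, steps=100000):
    """Grid-search the efficiency maximizing power; the analytic answer is
    the Curzon-Ahlborn value 1 - sqrt(t_cold/t_hot)."""
    best_eta, best_p = 0.0, 0.0
    carnot = 1.0 - t_cold / t_hot  # power vanishes at the Carnot limit
    for i in range(1, steps):
        eta = carnot * i / steps
        p = endoreversible_power(eta, t_hot, t_cold)
        if p > best_p:
            best_eta, best_p = eta, p
    return best_eta
```

For T_H = 1200 K and T_L = 300 K the maximum-power efficiency is 1 − √(1/4) = 0.5, well below the Carnot limit of 0.75, illustrating the efficiency/power trade-off the abstract refers to.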
Cost performance satellite design using queueing theory. [via digital simulation
NASA Technical Reports Server (NTRS)
Hein, G. F.
1975-01-01
A modified Poisson arrival, infinite-server queuing model is used to determine the effects of limiting the number of broadcast channels (C) of a direct broadcast satellite used for public service purposes (remote health care, education, etc.). The model is based on the reproductive property of the Poisson distribution. A difference equation has been developed to describe the change in the Poisson parameter. When all initially delayed arrivals reenter the system, a polynomial of order (C + 1) must be solved to determine the effective value of the Poisson parameter. When less than 100% of the arrivals reenter the system, the effective value must be determined by solving a transcendental equation. The model was used to determine the minimum number of channels required for a disaster warning satellite without degradation in performance. Results predicted by the queuing model were compared with the results of digital simulation.
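For the unmodified infinite-server (M/M/∞) baseline, the stationary number of busy servers is Poisson-distributed with mean a = λ/μ, so the minimum channel count for a target overflow probability reduces to a Poisson-tail computation. The sketch below shows only this baseline; the paper's reentry correction to the effective Poisson parameter is not reproduced.

```python
import math

def min_channels(a, eps):
    """Smallest C with P(N > C) < eps, where N ~ Poisson(a) is the
    stationary occupancy of an (unmodified) infinite-server queue."""
    c = 0
    term = math.exp(-a)   # P(N = 0)
    cdf = term            # running P(N <= c)
    while 1.0 - cdf >= eps:
        c += 1
        term *= a / c     # P(N = c) via the Poisson recurrence
        cdf += term
    return c
```

For example, with offered load a = 1 and a 50% overflow target, a single channel suffices, since P(N > 1) ≈ 0.26 for Poisson(1).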
Theory of normal and superconducting properties of fullerene-based solids
Cohen, M.L.
1992-10-01
Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The latter experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a "standard model" of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes.
Studying thin film damping in a micro-beam resonator based on non-classical theories
NASA Astrophysics Data System (ADS)
Ghanbari, Mina; Hossainpour, Siamak; Rezazadeh, Ghader
2016-06-01
In this paper, a mathematical model is presented for studying thin-film damping of the surrounding fluid in an in-plane oscillating micro-beam resonator. The model consists of a clamped-clamped micro-beam bound between two fixed layers; the micro-gap between the micro-beam and the fixed layers is filled with air. As classical theories cannot properly predict the size-dependent behavior of the micro-beam or the behavior of micro-scale fluid media, in the presented model the equation of motion governing the longitudinal displacement of the micro-beam is derived based on non-local elasticity theory. Furthermore, the fluid field is modeled based on micro-polar theory. These coupled equations are simplified using the Newton-Laplace and continuity equations. After transforming to non-dimensional form and linearizing, the equations are discretized and solved simultaneously using a Galerkin-based reduced-order model. Considering slip boundary conditions and applying a complex frequency approach, the equivalent damping ratio and quality factor of the micro-beam resonator are obtained and compared with those based on classical theories. We show that the non-classical theories yield lower values of the quality factor than the classical ones. The effects of the geometrical parameters of the micro-beam and of the micro-scale fluid field on the quality factor of the resonator are also investigated.
Hernán, Miguel A
2015-01-15
The relative weights of empirical facts (data) and assumptions (theory) in causal inference vary across disciplines. Typically, disciplines that ask more complex questions tend to better tolerate a greater role of theory and modeling in causal inference. As epidemiologists move toward increasingly complex questions, Marshall and Galea (Am J Epidemiol. 2015;181(2):92-99) support a reweighting of data and theory in epidemiologic research via the use of agent-based modeling. The parametric g-formula can be viewed as an intermediate step between traditional epidemiologic methods and agent-based modeling and therefore is a method that can ease the transition toward epidemiologic methods that rely heavily on modeling. PMID:25480820
Mixture theory-based poroelasticity as a model of interstitial tissue growth.
Cowin, Stephen C; Cardoso, Luis
2012-01-01
This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481
Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates
ERIC Educational Resources Information Center
Raju, Nambury S.; Oshima, T.C.
2005-01-01
Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
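For reference, the classical Spearman-Brown prophecy formula that one of the two new formulas reduces to is ρ_k = kρ / (1 + (k − 1)ρ) for a test whose length is multiplied by k; a minimal sketch (the IRT-based variants themselves are not reproduced here):

```python
# Spearman-Brown prophecy: predicted reliability of a test lengthened
# (k > 1) or shortened (k < 1) by factor k, given current reliability rho.

def spearman_brown(rho, k):
    """Predicted reliability when test length is multiplied by k."""
    return k * rho / (1.0 + (k - 1.0) * rho)
```

Doubling a test with reliability 0.60 predicts a reliability of 0.75; the formula is monotone in k, so lengthening always predicts higher reliability.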
ERIC Educational Resources Information Center
Schuitema, Jaap; Peetsma, Thea; van der Veen, Ineke
2014-01-01
The authors investigated the effects of an intervention developed to enhance student motivation in the first years of secondary education. The intervention, based on future time perspective (FTP) theory, has been found to be effective in prevocational secondary education (T. T. D. Peetsma & I. Van der Veen, 2008, 2009). The authors extend the…
Predicting Study Abroad Intentions Based on the Theory of Planned Behavior
ERIC Educational Resources Information Center
Schnusenberg, Oliver; de Jong, Pieter; Goel, Lakshmi
2012-01-01
The emphasis on study abroad programs is growing in the academic context as U.S.-based universities seek to incorporate a global perspective in education. Using a model that has underpinnings in the theory of planned behavior (TPB), we predict students' intention to participate in a short-term study abroad program. We use TPB to identify behavioral,…
ERIC Educational Resources Information Center
Woods, Carol M.; Thissen, David
2006-01-01
The purpose of this paper is to introduce a new method for fitting item response theory models with the latent population distribution estimated from the data using splines. A spline-based density estimation system provides a flexible alternative to existing procedures that use a normal distribution, or a different functional form, for the…
A Theory-Based Approach to Reading Assessment in the Army. Technical Report 625.
ERIC Educational Resources Information Center
Oxford-Carpenter, Rebecca L.; Schultz-Shiner, Linda J.
Noting that the United States Army Research Institute for the Behavioral and Social Sciences (ARI) has been involved in research on reading assessment in the Army from both practical and theoretical perspectives, this paper addresses practical Army problems in reading assessment from a theory base that reflects the most recent and most sound…
ERIC Educational Resources Information Center
Li, Zhenying
2012-01-01
Based on Constructivism Theory, this paper investigates the application of online multimedia courseware to college English teaching. Through experiments and student feedback, some experience has been accumulated, some problems have been discovered, and certain insights have been gained in English teaching practice, which pave the…
ERIC Educational Resources Information Center
Campbell, Chris; MacPherson, Seonaigh; Sawkins, Tanis
2014-01-01
This case study describes how sociocultural and activity theory were applied in the design of a publicly funded, Canadian Language Benchmark (CLB)-based English as a Second Language (ESL) credential program and curriculum for immigrant and international students in postsecondary institutions in British Columbia, Canada. The ESL Pathways Project…
Science Teaching Based on Cognitive Load Theory: Engaged Students, but Cognitive Deficiencies
ERIC Educational Resources Information Center
Meissner, Barbara; Bogner, Franz X.
2012-01-01
To improve science learning under demanding conditions, we designed an out-of-school lesson in compliance with cognitive load theory (CLT). We extracted student clusters based on individual effectiveness, and compared instructional efficiency, mental effort, and persistence of learning. The present study analyses students' engagement. 50.0% of our…
Applications of Cognitive Load Theory to Multimedia-Based Foreign Language Learning: An Overview
ERIC Educational Resources Information Center
Chen, I-Jung; Chang, Chi-Cheng; Lee, Yen-Chang
2009-01-01
This article reviews the multimedia instructional design literature based on cognitive load theory (CLT) in the context of foreign language learning. Multimedia are of particular importance in language learning materials because they incorporate text, image, and sound, thus offering an integrated learning experience of the four language skills…
ERIC Educational Resources Information Center
Liaw, Shu-Sheng; Hatala, Marek; Huang, Hsiu-Mei
2010-01-01
Mobile devices could facilitate human interaction and access to knowledge resources anytime and anywhere. With respect to wide application possibilities of mobile learning, investigating learners' acceptance towards it is an essential issue. Based on activity theory approach, this research explores positive factors for the acceptance of m-learning…
ERIC Educational Resources Information Center
Gabriel, Rachael
2011-01-01
In 1999, Ball and Cohen proposed a practice-based theory of professional education, which would replace inadequate professional development efforts with a more comprehensive approach. Their work has been referenced over the past decade, yet there have been limited attempts to actualize their ideals and research their implications. In this article, I…
A Three Year Outcome Evaluation of a Theory Based Drink Driving Education Program.
ERIC Educational Resources Information Center
Sheehan, Mary; And Others
1996-01-01
Reports on the impact of a "drink driving education program" taught to tenth-grade students. The program, which involved twelve lessons, used strategies based on the Ajzen and Madden theory of planned behavior. Students (N=1,774) were trained to use alternatives to drinking and driving and to use safer passenger behaviors, and were followed up…
Theory and Utility-Key Themes in Evidence-Based Assessment: Comment on the Special Section
ERIC Educational Resources Information Center
McFall, Richard M.
2005-01-01
This article focuses on two key themes in the four featured reviews on evidence-based assessment. The first theme is the essential role of theory in psychological assessment. An overview of this complex, multilayered role is presented. The second theme is the need for a common metric with which to gauge the utility of specific psychological tests…
ERIC Educational Resources Information Center
Nezhnov, Peter; Kardanova, Elena; Vasilyeva, Marina; Ludlow, Larry
2015-01-01
The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky's theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was…
Web-Support for Activating Use of Theory in Group-Based Learning.
ERIC Educational Resources Information Center
van der Veen, Jan; van Riemsdijk, Maarten; Laagland, Eelco; Gommer, Lisa; Jones, Val
This paper describes a series of experiments conducted within the context of a course on organizational theory that is taught at the Department of Management Sciences at the University of Twente (Netherlands). In 1997, a group-based learning approach was adopted, but after the first year it was apparent that acquisition and application of theory…
Effects of Guided Writing Strategies on Students' Writing Attitudes Based on Media Richness Theory
ERIC Educational Resources Information Center
Lan, Yu-Feng; Hung, Chun-Ling; Hsu, Hung-Ju
2011-01-01
The purpose of this paper is to develop different guided writing strategies based on media richness theory and further evaluate the effects of these writing strategies on younger students' writing attitudes in terms of motivation, enjoyment and anxiety. A total of 66 sixth-grade elementary students with an average age of twelve were invited to…
Theory-Based Development and Testing of an Adolescent Tobacco-Use Awareness Program.
ERIC Educational Resources Information Center
Smith, Dennis W.; Colwell, Brian; Zhang, James J.; Brimer, Jennifer; McMillan, Catherine; Stevens, Stacey
2002-01-01
The Adolescent Tobacco Use Awareness and Cessation Program trial, based on social cognitive theory and the transtheoretical model, was designed to develop, evaluate, and disseminate effective cessation programming related to Texas legislation. Data from participants and site facilitators indicated that significantly more participants were in the…
The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology
ERIC Educational Resources Information Center
Wang, Greg G.; Swanson, Richard A.
2008-01-01
Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…
ERIC Educational Resources Information Center
Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.
Current interest in the assessment of measurement equivalence emphasizes two methods of analysis: linear and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis, or CFA) and nonlinear (item-response-theory-based differential item functioning, or IRT-Based…
A Recursive, Reflective Instructional Design Model Based on Constructivist-Interpretivist Theory.
ERIC Educational Resources Information Center
Willis, Jerry, Ed.
1995-01-01
Discussion of instructional design focuses on a foundation for an alternative instructional design model based on social sciences theories from the constructivist family and on an interpretivist philosophy of science. Highlights include the role of language; and the nature of truth, or alternative conceptions of reality. (LRW)
Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders
ERIC Educational Resources Information Center
Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel
2012-01-01
This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…
Aligning Theory and Web-Based Instructional Design Practice with Design Patterns.
ERIC Educational Resources Information Center
Frizell, Sherri S.; Hubscher, Roland
Designing instructionally sound Web courses is a difficult task for instructors who lack experience in interaction and Web-based instructional design. Learning theories and instructional strategies can provide course designers with principles and design guidelines associated with effective instruction that can be utilized in the design of…
ERIC Educational Resources Information Center
Henderson, Ronald W.; And Others
Theory-based prototype computer-video instructional modules were developed to serve as an instructional supplement for students experiencing difficulty in learning mathematics, with special consideration given to students underrepresented in mathematics (particularly women and minorities). Modules focused on concepts and operations for factors,…
The Triarchic Theory of Intelligence and Computer-based Inquiry Learning.
ERIC Educational Resources Information Center
Howard, Bruce C.; McGee, Steven; Shin, Namsoo; Shia, Regina
2001-01-01
Discussion of the triarchic theory of intelligence focuses on a study of ninth graders that explored the relationships between student abilities and the cognitive and attitudinal outcomes that resulted from student immersion in a computer-based inquiry environment. Examines outcome variables related to content understanding, problem solving, and…
From Theory to Practice: Concept-Based Inquiry in a High School Art Classroom
ERIC Educational Resources Information Center
Walker, Margaret A.
2014-01-01
This study examines what an emerging educational theory looks like when put into practice in an art classroom. It explores the teaching methodology of a high school art teacher who has utilized concept-based inquiry in the classroom to engage his students in artmaking and analyzes the influence this methodology has had on his adolescent students.…
ERIC Educational Resources Information Center
Liaw, Shu-Sheng; Huang, Hsiu-Mei
2016-01-01
This paper investigates the use of e-books as learning tools in terms of learner satisfaction, usefulness, behavioral intention, and learning effectiveness. Based on the activity theory approach, this research develops a research model to understand learner attitudes toward e-books in two physical sizes: 10-inch and 7-inch. Results suggest that screen…
Implications and Applications of Modern Test Theory in the Context of Outcomes Based Education.
ERIC Educational Resources Information Center
Andrich, David
2002-01-01
Uses a framework previously developed to relate outcomes based education and B. Bloom's "Taxonomy of Educational Objectives" to consider ways in which modern test theory can be used to connect aspects of assessment to the curriculum framework and to consider insights this connection might provide. (SLD)
ERIC Educational Resources Information Center
Min, Shangchao; He, Lianzhen
2014-01-01
This study examined the relative effectiveness of the multidimensional bi-factor model and multidimensional testlet response theory (TRT) model in accommodating local dependence in testlet-based reading assessment with both dichotomously and polytomously scored items. The data used were 14,089 test-takers' item-level responses to the…
Supporting Self-Regulated Personalised Learning through Competence-Based Knowledge Space Theory
ERIC Educational Resources Information Center
Steiner, Christina M.; Nussbaumer, Alexander; Albert, Dietrich
2009-01-01
This article presents two current research trends in e-learning that at first sight appear to compete. Competence-Based Knowledge Space Theory (CBKST) provides a knowledge representation framework which, since its invention by Doignon & Falmagne, has been successfully applied in various e-learning systems (for example, Adaptive Learning with…
Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model
ERIC Educational Resources Information Center
de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.
2011-01-01
Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…
Poverty Lines Based on Fuzzy Sets Theory and Its Application to Malaysian Data
ERIC Educational Resources Information Center
Abdullah, Lazim
2011-01-01
Defining the poverty line has been acknowledged as being highly variable by the majority of published literature. Despite long discussions and successes, poverty line has a number of problems due to its arbitrary nature. This paper proposes three measurements of poverty lines using membership functions based on fuzzy set theory. The three…
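As a minimal sketch of the fuzzy-set idea behind such poverty measures (the linear membership form and the threshold values below are hypothetical, not the article's three measurements), a membership function grades poverty instead of imposing one arbitrary crisp line:

```python
def poverty_membership(income, z_lower, z_upper):
    """Degree of membership in the 'poor' fuzzy set.

    Fully poor (1.0) at or below z_lower, definitely not poor (0.0) at or
    above z_upper, linearly graded in between, so no single arbitrary
    poverty line is needed.
    """
    if income <= z_lower:
        return 1.0
    if income >= z_upper:
        return 0.0
    return (z_upper - income) / (z_upper - z_lower)

# A household halfway between the two thresholds is poor to degree 0.5.
mid_degree = poverty_membership(1200, 800, 1600)
```

Aggregating such degrees over households yields a fuzzy poverty index rather than a crisp headcount.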
Glacier mapping based on rough set theory in the Manas River watershed
NASA Astrophysics Data System (ADS)
Yan, Lili; Wang, Jian; Hao, Xiaohua; Tang, Zhiguang
2014-04-01
Precise glacier information is important for assessing climate change in remote mountain areas. To obtain more accurate glacier mapping, rough set theory, which can deal with vague and uncertain information, was introduced to obtain optimal knowledge rules for glacier mapping. Optical images, thermal infrared band data, texture information and morphometric parameters were combined to build a decision table used in our proposed rough set theory method. After discretizing the real-value attributes, decision rules were calculated through the decision rule generation algorithm for glacier mapping. A decision classifier based on the generated rules classified the multispectral image into glacier and non-glacier areas. The result of maximum likelihood classification (MLC) was compared with the result of the classification based on rough set theory. A confusion matrix and visual interpretation were used to evaluate the overall accuracy of the results of the two methods. The rough set method and maximum likelihood classification yielded overall accuracies of 94.15% and 93.88%, respectively. Comparing the glacier areas from the two methods against visual interpretation showed that the rough set method gave the smaller area difference. The high accuracy for glacier mapping and the small glacier area difference demonstrated that the rough set method is effective and promising for glacier mapping.
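The overall-accuracy figures quoted in the abstract come from confusion matrices; the metric itself is simply the trace divided by the total count. A sketch with hypothetical glacier / non-glacier counts (not the paper's data):

```python
def overall_accuracy(confusion):
    """Overall accuracy from a square confusion matrix (rows: reference
    classes, columns: predicted classes): correct predictions / total."""
    total = sum(sum(row) for row in confusion)
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    return correct / total

# Hypothetical counts: 470 glacier and 475 non-glacier pixels correct
# out of 1000 reference pixels.
acc = overall_accuracy([[470, 30],
                        [25, 475]])   # (470 + 475) / 1000 = 0.945
```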
Critically Evaluating Competing Theories: An Exercise Based on the Kitty Genovese Murder
ERIC Educational Resources Information Center
Sagarin, Brad J.; Lawler-Sagarin, Kimberly A.
2005-01-01
We describe an exercise based on the 1964 murder of Catherine Genovese--a murder observed by 38 witnesses, none of whom called the police. Students read a summary of the murder and worked in small groups to design an experiment to test the competing theories for the inaction of the witnesses (Americans' selfishness and insensitivity vs. diffusion…
Evidence-Based Practice in Kinesiology: The Theory to Practice Gap Revisited
ERIC Educational Resources Information Center
Knudson, Duane
2005-01-01
As evidence-based practice sweeps the applied health professions, it is a good time to evaluate the generation of knowledge in Kinesiology and its transmission to professionals and the public. Knowledge transmission has been debated in the past from the perspectives of the theory-to-practice gap and the discipline versus profession emphasis.…
ERIC Educational Resources Information Center
Fein, Lance; Jones, Don
2015-01-01
This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…
Examining Instruction in MIDI-based Composition through a Critical Theory Lens
ERIC Educational Resources Information Center
Louth, Paul
2013-01-01
This paper considers the issue of computer-assisted composition in formal music education settings from the perspective of critical theory. The author examines the case of MIDI-based software applications and suggests that the greatest danger from the standpoint of ideology critique is not the potential for circumventing a traditional…
English Textbooks Based on Research and Theory--A Possible Dream.
ERIC Educational Resources Information Center
Suhor, Charles
1984-01-01
Research-based text materials will probably never dominate the textbook market. To begin with, translating theory and research into practice is a chancy business. There are also creative problems, such as the inherent oversimplification involved in textbook writing. Every textbook writer who has been a classroom teacher will acknowledge that such…
Portuguese Public University Student Satisfaction: A Stakeholder Theory-Based Approach
ERIC Educational Resources Information Center
Mainardes, Emerson; Alves, Helena; Raposo, Mario
2013-01-01
In accordance with the importance of the student stakeholder to universities, the objective of this research project was to evaluate student satisfaction at Portuguese public universities as regards their self-expressed core expectations. The research was based both on stakeholder theory itself and on previous studies of university stakeholders.…
Theory-based behavior change interventions: comments on Hobbis and Sutton.
Fishbein, Martin; Ajzen, Icek
2005-01-01
Hobbis and Sutton (this issue) suggest that Cognitive Behavior Therapy (CBT) techniques can be used in interventions based on the Theory of Planned Behavior (TPB). Although this suggestion has merit, CBT is only one of many applicable methods for producing belief and behavior change. Moreover, CBT's primary purpose is to help people carry out intended behaviors, not to influence intentions, and it is more useful in face-to-face than in community-level interventions. Contrary to Hobbis and Sutton's critique, TPB can accommodate core beliefs or fundamental assumptions, but the theory suggests that interventions targeted at such beliefs are less effective than interventions targeted at behavior-specific beliefs. PMID:15576497
A method for calculating strain energy release rate based on beam theory
NASA Technical Reports Server (NTRS)
Sun, C. T.; Pandey, R. K.
1993-01-01
The Timoshenko beam theory was used to model cracked beams and to calculate the total strain energy release rate. The root rotations of the beam segments at the crack tip were estimated based on an approximate 2D elasticity solution. By including the strain energy released due to the root rotations of the beams during crack extension, the strain energy release rate obtained using beam theory agrees very well with the 2D finite element solution. Numerical examples were given for various beam geometries and loading conditions. Comparisons with existing beam models were also given.
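For orientation, the classical Euler-Bernoulli result for a double cantilever beam (DCB), which Timoshenko-plus-root-rotation models such as this one refine, can be evaluated directly; the specimen numbers below are hypothetical:

```python
def dcb_err_simple_beam(P, a, E, B, h):
    """Strain energy release rate G of a double cantilever beam from
    classical (Euler-Bernoulli) beam theory:

        G = 12 * P**2 * a**2 / (E * B**2 * h**3)

    P: load per arm [N], a: crack length [m], E: Young's modulus [Pa],
    B: width [m], h: thickness of each arm [m]. Accounting for shear
    deformation and crack-tip root rotations increases G relative to
    this simple estimate.
    """
    return 12.0 * P**2 * a**2 / (E * B**2 * h**3)

# Hypothetical aluminium specimen: P = 100 N, a = 50 mm, B = 25 mm, h = 3 mm
G = dcb_err_simple_beam(100.0, 0.05, 70e9, 0.025, 0.003)   # ≈ 254 J/m^2
```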
NASA Astrophysics Data System (ADS)
He, Xi; Wang, Wei; Liu, Xinyu; Ji, Yong
This paper proposes a new risk assessment method based on the attribute reduction theory of rough sets and multiclass SVM classification. Rough set theory is introduced for data attribute reduction, and a multiclass SVM is used for automatic assessment of risk levels. Deleting redundant data features reduces the computational complexity of the multiclass SVM and improves its learning and generalization ability. A multiclass SVM trained with empirical data can predict the risk level. Experiments show that the predicted results have relatively high precision and that the method is valid for power network risk assessment.
Learning control system design based on 2-D theory - An application to parallel link manipulator
NASA Technical Reports Server (NTRS)
Geng, Z.; Carroll, R. L.; Lee, J. D.; Haynes, L. H.
1990-01-01
An approach to iterative learning control system design based on two-dimensional system theory is presented. A two-dimensional model for the iterative learning control system which reveals the connections between learning control systems and two-dimensional system theory is established. A learning control algorithm is proposed, and the convergence of learning using this algorithm is guaranteed by two-dimensional stability. The learning algorithm is applied successfully to the trajectory tracking control problem for a parallel link robot manipulator. The excellent performance of this learning algorithm is demonstrated by the computer simulation results.
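The trial-to-trial learning idea can be illustrated with a generic P-type iterative learning control update on a toy scalar plant; this sketch is not the paper's 2-D formulation or the manipulator dynamics, and the plant, gain, and horizon are assumptions:

```python
def ilc_demo(gamma=1.0, trials=25, N=20):
    """P-type iterative learning control on the toy plant
    x[t+1] = 0.3*x[t] + u[t], tracking r[t] = 1.0.

    Each trial replays the same task from the same initial condition;
    afterwards the input is corrected with the previous trial's error:
    u[t] += gamma * e[t+1]. With unit input gain and gamma = 1.0 the
    error is driven to zero one time step further along the trajectory
    on every trial. Returns the max |error| on the last trial.
    """
    r = [1.0] * (N + 1)
    u = [0.0] * N
    e = [0.0] * (N + 1)
    for _ in range(trials):
        x = [0.0] * (N + 1)          # same initial condition every trial
        for t in range(N):
            x[t + 1] = 0.3 * x[t] + u[t]
        e = [r[t] - x[t] for t in range(N + 1)]
        u = [u[t] + gamma * e[t + 1] for t in range(N)]  # P-type update
    return max(abs(v) for v in e[1:])
```

Running the loop for more trials than the horizon length leaves only floating-point residue in the tracking error.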
Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin
2015-01-01
Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of the diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied an information-motivation-behavioral skills (IMB) model, and interventions were targeted at three determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation, including a pilot test, was planned. The DSME-OK was structured so that the three determinants of the IMB model were addressed to achieve the behavior objectives in each session. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol in developing a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based and behavior outcome-focused health education programs. PMID:26062288
Liu, Yanbin; Liu, Mengying; Sun, Peihua
2014-01-01
A typical hypersonic vehicle model has complicated dynamics, such as unstable states, nonminimum-phase behavior, and strong input-output coupling. As a result, designing a robust stabilization controller is essential to implement the anticipated tasks. This paper presents a robust stabilization controller based on guardian maps theory for a hypersonic vehicle. First, guardian maps theory is introduced to explain the constraint relations between open subsets of the complex plane and the eigenvalues of the state matrix of the closed-loop control system. Then, a general control structure based on guardian maps theory is proposed to achieve the expected design demands. Furthermore, the robust stabilization control law, depending on the given general control structure, is designed for the longitudinal model of the hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods. PMID:24795535
A closure for meso-scale eddy fluxes based on linear instability theory
NASA Astrophysics Data System (ADS)
Eden, Carsten
Linear instability theory is used to predict the lateral diffusivity K for eddy buoyancy fluxes in an idealized channel model, following a suggestion by Killworth (1997). The vertical structure and magnitude of K agree approximately with the non-linear model results. The lateral structure of K from linear theory lacks minima within eddy-driven zonal jets, pointing towards a non-linear mechanism for mixing barriers in the channel model. This effect can be accounted for by a modification of K from linear theory by the kinematic effect of the background flow following a recent suggestion by Ferrari and Nikurashin (2010). Implementation of this closure for K in an eddy mixing framework based on potential vorticity mixing in a zonally averaged model version yields approximate agreement with the zonally resolved version over a certain range of external parameters, in particular with respect to the reproduction of eddy-driven zonal jets.
Experimental investigation and kinetic-theory-based model of a rapid granular shear flow
NASA Astrophysics Data System (ADS)
Wildman, R. D.; Martin, T. W.; Huntley, J. M.; Jenkins, J. T.; Viswanathan, H.; Fen, X.; Parker, D. J.
An experimental investigation of an idealized rapidly sheared granular flow was performed to test the predictions of a model based on the kinetic theory of dry granular media. Glass ballotini beads were placed in an annular shear cell and the lower boundary rotated to induce a shearing motion in the bed. A single particle was tracked using the positron emission particle tracking (PEPT) technique, a method that determines the location of a particle through the triangulation of gamma photons emitted by a radioactive tracer particle. The packing fraction and velocity fields within the three-dimensional flow were measured and compared to the predictions of a model developed using the conservation and balance equations applicable to dissipative systems, and solved incorporating constitutive relations derived from kinetic theory. The comparison showed that kinetic theory is able to capture the general features of a rapid shear flow reasonably well over a wide range of shear rates and confining pressures.
Analysis and synthesis of phase shifting algorithms based on linear systems theory
NASA Astrophysics Data System (ADS)
Servin, M.; Estrada, J. C.
2012-08-01
We review and update a recently published formalism for the theory of linear Phase Shifting Algorithms (PSAs) based on linear filtering (systems) theory, mainly using the Frequency Transfer Function (FTF). The FTF has been for decades the standard tool in electrical engineering for analyzing and synthesizing linear systems. Given the well-defined FTF approach (matured over the last century), it clarifies, in our view, many properties of PSAs that are not fully understood. We present easy formulae for the spectra of the PSAs (the FTF magnitude), their signal-to-noise (S/N) power-ratio gain, their detuning robustness, and their harmonic rejection in terms of the FTF. This paper has more practical appeal than previous publications by the same authors, hoping to enrich the understanding of this PSA theory as applied to the analysis and synthesis of temporal interferometry algorithms in optical metrology.
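As a concrete instance of the kind of algorithm such an FTF analyzes, the textbook four-step PSA with π/2 phase steps recovers the phase from four fringe samples; this generic demo is not one of the authors' specific designs:

```python
import math

def four_step_psa(I0, I1, I2, I3):
    """Classical four-step phase shifting algorithm with pi/2 steps.

    Samples I_n = a + b*cos(phi + n*pi/2) satisfy
        I3 - I1 = 2*b*sin(phi),   I0 - I2 = 2*b*cos(phi),
    so the phase is phi = atan2(I3 - I1, I0 - I2), independent of the
    background a and the modulation b.
    """
    return math.atan2(I3 - I1, I0 - I2)

# Synthetic fringes: background a = 2.0, modulation b = 1.0, phase 0.7 rad
a, b, phi = 2.0, 1.0, 0.7
samples = [a + b * math.cos(phi + n * math.pi / 2) for n in range(4)]
est = four_step_psa(*samples)   # ≈ 0.7
```

Properties such as detuning robustness and harmonic rejection of this and longer PSAs are exactly what the FTF magnitude makes visible.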
Classification of PolSAR image based on quotient space theory
NASA Astrophysics Data System (ADS)
An, Zhihui; Yu, Jie; Liu, Xiaomeng; Liu, Limin; Jiao, Shuai; Zhu, Teng; Wang, Shaohua
2015-12-01
In order to improve the classification accuracy, quotient space theory was applied in the classification of polarimetric SAR (PolSAR) images. Firstly, the Yamaguchi decomposition method is adopted to obtain the polarimetric characteristics of the image. At the same time, the Gray Level Co-occurrence Matrix (GLCM) and Gabor wavelets are used to obtain texture features. Secondly, combining texture features and polarimetric characteristics, a Support Vector Machine (SVM) classifier is used for initial classification to establish different granularity spaces. Finally, according to quotient space granularity synthesis theory, we merge and reason over the different quotient spaces to obtain the comprehensive classification result. The method proposed in this paper is tested with L-band AIRSAR data of San Francisco Bay. The results show that the comprehensive classification result based on quotient space theory is superior to the classification result of a single granularity space.
A comparison of design variables for control theory based airfoil optimization
NASA Technical Reports Server (NTRS)
Reuther, James; Jameson, Antony
1995-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work in the area it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using either the potential flow or the Euler equations with either a conformal mapping or a general coordinate system. We have also explored three-dimensional extensions of these formulations recently. The goal of our present work is to demonstrate the versatility of the control theory approach by designing airfoils using both Hicks-Henne functions and B-spline control points as design variables. The research also demonstrates that the parameterization of the design space is an open question in aerodynamic design.
Performance of Four-Leg VSC based DSTATCOM using Single Phase P-Q Theory
NASA Astrophysics Data System (ADS)
Jampana, Bangarraju; Veramalla, Rajagopal; Askani, Jayalaxmi
2016-06-01
This paper presents a single-phase P-Q theory for a four-leg VSC-based distributed static compensator (DSTATCOM) in the distribution system. The proposed DSTATCOM maintains unity power factor at the source, provides zero voltage regulation, eliminates current harmonics, and performs load balancing and neutral-current compensation. The advantage of the four-leg VSC-based DSTATCOM is that it eliminates the isolated/non-isolated transformer connection at the point of common coupling (PCC) for neutral-current compensation, which reduces the cost of the DSTATCOM. The single-phase P-Q control algorithm extracts the fundamental components of the active and reactive currents to generate the reference source currents, based on the indirect current control method. The proposed DSTATCOM is modelled and the results are validated for various consumer loads under unity-power-factor and zero-voltage-regulation modes in MATLAB R2013a using the SimPowerSystems toolbox.
Magic bases, metric ansaetze and generalized graph theories in the Virasoro master equation
Halpern, M.B.; Obers, N.A.
1991-11-15
The authors define a class of magic Lie group bases in which the Virasoro master equation admits a class of simple metric ansaetze (g_metric), whose structure is visible in the high-level expansion. When a magic basis is real on compact g, the corresponding g_metric is a large system of unitary, generically irrational conformal field theories. Examples in this class include the graph-theory ansatz SO(n)_diag in the Cartesian basis of SO(n) and the ansatz SU(n)_metric in the Pauli-like basis of SU(n). A new phenomenon is observed in the high-level comparison of SU(n)_metric: due to the trigonometric structure constants of the Pauli-like basis, irrational central charge is clearly visible at finite order of the expansion. The authors also define the sine-area graphs of SU(n), which label the conformal field theories of SU(n)_metric, and note that, in a similar fashion, each magic basis of g defines a generalized graph theory on g which labels the conformal field theories of g_metric.
NASA Astrophysics Data System (ADS)
Rahmani, O.; Jandaghian, A. A.
2015-06-01
In this paper, a general third-order beam theory that accounts for nanostructure-dependent size effects and two-constituent material variation through the nanobeam thickness (i.e., a functionally graded material (FGM) beam) is presented. The material properties of FG nanobeams are assumed to vary through the thickness according to a power law. A detailed derivation of the equations of motion based on Eringen's nonlocal theory using Hamilton's principle is presented, and a closed-form solution is derived for the buckling behavior of the new model with various boundary conditions. The nonlocal elasticity theory includes a material length scale parameter that can capture the size effect in a functionally graded material. The proposed model is efficient in predicting the shear effect in FG nanobeams by applying third-order shear deformation theory. The approach is validated by comparing the obtained results with benchmark results available in the literature. A parametric study is then conducted to investigate the influences of the length scale parameter, gradient index, and length-to-thickness ratio on the buckling of FG nanobeams, and the improvement of the nonlocal third-order shear deformation theory over the classical (local) beam model is shown. It is found that the length scale parameter is crucial in studying the stability behavior of nanobeams.
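The power-law grading rule mentioned in this abstract is standard for FG beams; below is a minimal sketch, assuming ceramic/metal constituents and the usual convention z ∈ [−h/2, h/2] (symbol names and the sample moduli are illustrative, not the paper's values).

```python
def fgm_property(z, h, P_c, P_m, k):
    """Power-law through-thickness variation for an FG beam:
    P(z) = (P_c - P_m) * (z/h + 1/2)**k + P_m,  z in [-h/2, h/2],
    where P_c, P_m are the ceramic/metal constituent properties and
    k is the gradient index."""
    V_c = (z / h + 0.5) ** k          # ceramic volume fraction
    return (P_c - P_m) * V_c + P_m
```

At z = h/2 the property equals the ceramic value, at z = −h/2 the metal value, and the gradient index k controls how sharply the transition occurs through the thickness.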
Gourlan, M; Bernard, P; Bortolon, C; Romain, A J; Lareyre, O; Carayol, M; Ninot, G; Boiché, J
2016-01-01
Implementing theory-based interventions is an effective way to influence physical activity (PA) behaviour in the population. This meta-analysis aimed to (1) determine the global effect of theory-based randomised controlled trials dedicated to the promotion of PA among adults, (2) measure the actual efficacy of interventions against their theoretical objectives and (3) compare the efficacy of single- versus combined-theory interventions. A systematic search through databases and review articles was carried out. Our results show that theory-based interventions (k = 82) significantly impact the PA behaviour of participants (d = 0.31, 95% CI [0.24, 0.37]). While moderation analyses revealed no efficacy difference between theories, interventions based on a single theory (d = 0.35; 95% CI [0.26, 0.43]) reported a higher impact on PA behaviour than those based on a combination of theories (d = 0.21; 95% CI [0.11, 0.32]). In spite of the global positive effect of theory-based interventions on PA behaviour, further research is required to better identify the specificities, overlaps or complementarities of the components of interventions based on relevant theories. PMID:25402606
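The pooled effect sizes reported above (d with a 95% CI) can be illustrated with a minimal fixed-effect inverse-variance pooling sketch; meta-analyses of this kind often use random-effects models instead, so this is a simplified stand-in, not the authors' procedure.

```python
import numpy as np

def pool_effect_sizes(d, var):
    """Fixed-effect inverse-variance pooling of standardized mean differences.
    d: per-study effect sizes; var: per-study sampling variances."""
    d = np.asarray(d, dtype=float)
    w = 1.0 / np.asarray(var, dtype=float)    # weight = inverse variance
    d_pooled = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))             # standard error of the pooled effect
    return d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se)
```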
Testing a priority-based queue model with Linux command histories
NASA Astrophysics Data System (ADS)
Baek, Seung Ki; Kim, Tae Young; Kim, Beom Jun
2008-06-01
We study human dynamics by analyzing Linux history files. The goodness-of-fit test shows that most of the collected datasets belong to the universality class suggested in the literature by a variable-length queuing process based on priority. In order to check the validity of this model, we design two tests based on mutual information between time intervals and a mathematical relationship known as the arcsine law. Since the previously suggested queuing process fails to pass these tests, the result suggests that the modelling of human dynamics should properly consider the statistical dependency in the temporal dimension.
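The priority-based queuing model referenced here can be illustrated with a minimal fixed-length variant (the paper tests a variable-length process, so this sketch is only suggestive): each step executes the highest-priority task, and waiting times develop a heavy tail because low-priority tasks linger.

```python
import random

def simulate_priority_queue(steps=100000, L=2, seed=1):
    """Fixed-length priority queue: at every step execute the task with the
    highest priority and replace it with a fresh random-priority arrival.
    Returns the recorded waiting times (execution step - arrival step)."""
    rng = random.Random(seed)
    queue = [(rng.random(), 0) for _ in range(L)]   # (priority, arrival step)
    waits = []
    for t in range(steps):
        i = max(range(L), key=lambda j: queue[j][0])
        waits.append(t - queue[i][1])
        queue[i] = (rng.random(), t)                # new arrival fills the slot
    return waits
```

Most tasks are served almost immediately, but a task that draws a very low priority can wait for an extremely long time, producing the heavy-tailed inter-event statistics the universality class predicts.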
The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission.
Kiani, Mehdi; Ghovanloo, Maysam
2012-09-01
Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368
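The reflected load theory (RLT) analysis described above amounts to solving the two coupled mesh equations of the primary and secondary loops; here is a minimal numerical sketch with illustrative component values (not the authors' link parameters).

```python
import numpy as np

def wpt_efficiency(f, L1, L2, C1, C2, R1, R2, RL, k):
    """Solve the two coupled series-RLC mesh equations for a 1 V source and
    return the power transfer efficiency P_load / P_in."""
    w = 2 * np.pi * f
    M = k * np.sqrt(L1 * L2)                        # mutual inductance
    Z1 = R1 + 1j * w * L1 + 1 / (1j * w * C1)       # primary loop impedance
    Z2 = R2 + RL + 1j * w * L2 + 1 / (1j * w * C2)  # secondary loop impedance
    Zm = 1j * w * M
    # mesh equations: [Z1, -Zm; -Zm, Z2] @ [I1; I2] = [1; 0]
    I1, I2 = np.linalg.solve(np.array([[Z1, -Zm], [-Zm, Z2]]),
                             np.array([1.0 + 0j, 0.0 + 0j]))
    P_in = 0.5 * np.real(np.conj(I1))               # source power for V = 1 V
    P_load = 0.5 * np.abs(I2) ** 2 * RL             # power delivered to the load
    return P_load / P_in
```

At the shared resonance, stronger coupling (larger k) sharply increases efficiency, which is the regime where CMT and RLT predictions coincide.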
Membrane-Based Characterization of a Gas Component — A Transient Sensor Theory
Lazik, Detlef
2014-01-01
Based on a multi-gas solution-diffusion problem for a dense symmetrical membrane this paper presents a transient theory of a planar, membrane-based sensor cell for measuring gas from both initial conditions: dynamic and thermodynamic equilibrium. Using this theory, the ranges for which previously developed, simpler approaches are valid will be discussed; these approaches are of vital interest for membrane-based gas sensor applications. Finally, a new theoretical approach is introduced to identify varying gas components by arranging sensor cell pairs resulting in a concentration independent gas-specific critical time. Literature data for the N2, O2, Ar, CH4, CO2, H2 and C4H10 diffusion coefficients and solubilities for a polydimethylsiloxane membrane were used to simulate gas specific sensor responses. The results demonstrate the influence of (i) the operational mode; (ii) sensor geometry and (iii) gas matrices (air, Ar) on that critical time. Based on the developed theory the case-specific suitable membrane materials can be determined and both operation and design options for these sensors can be optimized for individual applications. The results of mixing experiments for different gases (O2, CO2) in a gas matrix of air confirmed the theoretical predictions. PMID:24608004
Otal, Begonya; Alonso, Luis; Verikoukis, Christos
2011-01-01
The aging population and the high quality-of-life expectations in our society lead to the need for more efficient and affordable healthcare solutions. For this reason, this paper aims at the optimization of Medium Access Control (MAC) protocols for biomedical wireless sensor networks, or wireless Body Sensor Networks (BSNs). The schemes presented here always keep in mind the efficient management of channel resources and the overall minimization of the sensors' energy consumption in order to prolong battery life. The fact that the IEEE 802.15.4 MAC does not fully satisfy BSN requirements highlights the need for new scalable MAC solutions, which guarantee low power consumption for the maximum number of body sensors in high-density areas (i.e., in saturation conditions). In order to emphasize the limitations of the IEEE 802.15.4 MAC, this article presents a detailed overview of this de facto standard for Wireless Sensor Networks (WSNs), which serves as a link to the introduction and initial description of our proposed Distributed Queuing (DQ) MAC protocol for BSN scenarios. Within this framework, an extensive DQ MAC energy-consumption analysis in saturation conditions is presented to evaluate its performance relative to the IEEE 802.15.4 MAC in highly dense BSNs. The obtained results show that the proposed scheme outperforms the IEEE 802.15.4 MAC in average energy consumption per information bit, thus providing a better overall performance that scales appropriately to BSNs under high traffic conditions. These benefits are obtained by eliminating back-off periods and collisions in data packet transmissions, while minimizing the control overhead. PMID:22319351
ERIC Educational Resources Information Center
Carroll, Jan B.
1993-01-01
In social learning theory, perception of self-efficacy derives from accomplishments, vicarious experiences, verbal persuasion, and physiological states. Four corresponding instructional methods (practice, modeling, suggestion, climate setting) can be designed to ensure transfer of learning. (SK)
Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne
2015-01-01
Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support. PMID:25778674
Agent-based modeling: a new approach for theory building in social psychology.
Smith, Eliot R; Conrey, Frederica R
2007-02-01
Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach. PMID:18453457
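A minimal example in the ABM spirit is a bounded-confidence opinion model (Deffuant-style): autonomous agents interact pairwise, and consensus emerges only as a collective pattern of repeated interactions. This is a generic illustration of the approach, not a model from the article; all parameters are assumptions.

```python
import random

def run_abm(n_agents=100, steps=30000, mu=0.3, eps=0.5, seed=42):
    """Bounded-confidence opinion dynamics: two random agents interact each
    step; their opinions converge by a fraction mu only if they already lie
    within `eps` of each other. Returns the final opinions."""
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_agents)]
    for _ in range(steps):
        i, j = rng.sample(range(n_agents), 2)
        if abs(opinions[i] - opinions[j]) < eps:
            oi, oj = opinions[i], opinions[j]
            opinions[i] = oi + mu * (oj - oi)   # each moves toward the other
            opinions[j] = oj + mu * (oi - oj)
    return opinions
```

No agent decides to "reach consensus"; clustering of opinions is an emergent pattern of the local interaction rule, which is exactly the kind of dynamic, interactive process the authors argue variable-based modeling struggles to capture.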
Comparison of inlet suppressor data with approximate theory based on cutoff ratio
NASA Astrophysics Data System (ADS)
Rice, E. J.; Heidelberg, L. J.
1980-01-01
This paper presents the first quantitative comparison of inlet suppressor far-field directivity suppression with that predicted using an approximate liner design and evaluation method based upon mode cutoff ratio. The experimental data were obtained using a series of cylindrical point-reacting inlet liners on an Avco-Lycoming YF102 engine. The theoretical prediction program is based upon simplified sound propagation concepts derived from exact calculations. These indicate that all of the controlling phenomena can be approximately correlated with mode cutoff ratio, which is itself intimately related to the angles of propagation within the duct. The objective of the theory-data comparisons is to point out possible deficiencies in the approximate theory so that they may be corrected. After all theoretical refinements have been made, empirical corrections can be applied.
Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping
Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E
2014-01-01
Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. PMID:24811880
An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.
Salim, Shelly; Moh, Sangman
2016-01-01
A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290
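The spectrum characterization step with a Markov chain can be illustrated by a two-state (idle/busy) channel model whose stationary idle probability ranks candidate channels; the function names and the ranking criterion are illustrative assumptions, not the EGSD algorithm itself.

```python
def idle_probability(p_ib, p_bi):
    """Stationary idle probability of a two-state (idle/busy) Markov chain,
    with p_ib = P(idle -> busy) and p_bi = P(busy -> idle)."""
    return p_bi / (p_ib + p_bi)

def rank_channels(transitions):
    """Order channel indices by expected availability (most idle first).
    transitions: list of (p_ib, p_bi) pairs, one per channel."""
    return sorted(range(len(transitions)),
                  key=lambda c: idle_probability(*transitions[c]),
                  reverse=True)
```

Preferring channels with high stationary idle probability reduces the number of forced spectrum handoffs, which is one way such a scheme saves energy.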
A new method of excitation control based on fuzzy set theory
Handschin, E.; Hoffmann, W.; Reyer, F.; Stephanblome, T.; Schluecking, U.; Westermann, D.; Ahmed, S.S.
1994-02-01
The synthesis of the PSS structure and its parameterization are based entirely on methods of linear system theory. Thus the desired effect of the PSS is limited to a bounded area around one system operating point. The use of a controller based on fuzzy set theory introduces an event-controlled excitation of the synchronous machine that takes the power system operation into account. The desired response of the fuzzy controller is given by a set of rules obtained from the limits of the voltage regulator and the undesired performance of conventional excitation control. A fuzzy controller has been developed, and simulation results are provided. These results support the concept of a fuzzy controller for excitation control and show that a well-designed fuzzy controller is superior to fast excitation control with an additional PSS.
Comparison of inlet suppressor data with approximate theory based on cutoff ratio
NASA Technical Reports Server (NTRS)
Rice, E. J.; Heidelberg, L. J.
1979-01-01
Inlet suppressor far-field directivity suppression was quantitatively compared with that predicted using an approximate liner design and evaluation method based upon mode cutoff ratio. The experimental data were obtained using a series of cylindrical point-reacting inlet liners on a YF102 engine. The theoretical prediction program is based upon simplified sound propagation concepts derived from exact calculations. These indicate that all of the controlling phenomena can be approximately correlated with mode cutoff ratio, which is itself intimately related to the angles of propagation within the duct. The theory-data comparisons are intended to point out possible deficiencies in the approximate theory which may be corrected. Once all theoretical refinements are made, empirical corrections can be applied.
Design of Flexure-based Precision Transmission Mechanisms using Screw Theory
Hopkins, J B; Panas, R M
2011-02-07
This paper enables the synthesis of flexure-based transmission mechanisms that possess multiple decoupled inputs and outputs of any type (e.g. rotations, translations, and/or screw motions), which are linked by designer-specified transmission ratios. A comprehensive library of geometric shapes is utilized from which every feasible concept that possesses the desired transmission characteristics may be rapidly conceptualized and compared before an optimal concept is selected. These geometric shapes represent the rigorous mathematics of screw theory and uniquely link a body's desired motions to the flexible constraints that enable those motions. This paper's impact is most significant to the design of nano-positioners, microscopy stages, optical mounts, and sensors. A flexure-based microscopy stage was designed, fabricated, and tested to demonstrate the utility of the theory.
NASA Astrophysics Data System (ADS)
Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.
2016-06-01
The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
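Below is a minimal sketch of the maximum-information/minimum-redundancy idea, using histogram estimates of entropy and mutual information and a greedy selection over candidate cross sections. The scoring rule shown (entropy minus mean mutual information with the already-selected set) is one plausible formulation, not necessarily the authors' exact objective.

```python
import numpy as np

def entropy(x, bins=10):
    """Histogram estimate of Shannon entropy (bits) of one series."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_info(x, y, bins=10):
    """Histogram estimate of mutual information (bits) between two series."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

def select_sections(series, n_pick, bins=10):
    """Greedy subset selection: repeatedly add the candidate whose series has
    maximum entropy minus mean redundancy with the chosen set."""
    chosen, remaining = [], list(range(len(series)))
    while len(chosen) < n_pick and remaining:
        def score(c):
            h = entropy(series[c], bins)
            red = (np.mean([mutual_info(series[c], series[s], bins)
                            for s in chosen]) if chosen else 0.0)
            return h - red
        best = max(remaining, key=score)
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

Given simulated water levels at candidate sections, the greedy pass avoids picking two sections whose series carry essentially the same information.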
A theory-based logic model for innovation policy and evaluation.
Jordan, Gretchen B.
2010-04-01
Current policy and program rationale, objectives, and evaluation use a fragmented picture of the innovation process. This presents a challenge since in the United States officials in both the executive and legislative branches of government see innovation, whether that be new products or processes or business models, as the solution to many of the problems the country faces. The logic model is a popular tool for developing and describing the rationale for a policy or program and its context. This article sets out to describe generic logic models of both the R&D process and the diffusion process, building on existing theory-based frameworks. Then a combined, theory-based logic model for the innovation process is presented. Examples of the elements of the logic, each a possible leverage point or intervention, are provided, along with a discussion of how this comprehensive but simple model might be useful for both evaluation and policy development.
Eight myths on motivating social services workers: theory-based perspectives.
Latting, J K
1991-01-01
A combination of factors has made formal motivational and reward systems rare in human service organizations generally and virtually non-existent in social service agencies. The author reviews eight of these myths by reference to eight motivational theories which refute them: need theory, expectancy theory, feedback theory, equity theory, reinforcement theory, cognitive evaluation theory, goal setting theory, and social influence theory. Although most of these theories have been developed and applied in the private sector, relevant research has also been conducted in social service agencies. The author concludes with a summary of guidelines suggested by the eight theories for motivating human service workers. PMID:10114292
Rajendran, Arvind
2008-03-28
The design of simulated moving bed processes under reduced purity requirements for systems whose isotherm is linear is considered. Based on the equilibrium theory of chromatography, explicit equations to uniquely identify the separation region that will ensure specified extract and raffinate purities are derived. The identification of the region requires only the knowledge of Henry constants of the solutes, the concentration of the solutes in the feed and the purity specifications. These results are validated using numerical simulations. PMID:18281052
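For a linear isotherm, the equilibrium-theory ("triangle theory") complete-separation condition can be written directly from the Henry constants; the sketch below encodes only the classical condition (the paper's contribution, extending this region under reduced purity requirements, is not reproduced here).

```python
def complete_separation_region(H_A, H_B, m2, m3):
    """Classical triangle-theory check for an SMB unit with a linear isotherm
    and H_A > H_B: complete separation of extract and raffinate requires
    H_B < m2 <= m3 < H_A, where m_j is the net fluid-to-solid flow-rate
    ratio in zone j."""
    return H_B < m2 <= m3 < H_A
```

Relaxing the purity targets enlarges this operating region beyond the triangle, which is exactly what the explicit equations derived in the paper characterize.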
Theory of plasticity based on a new invariant of stress tensor. Two-dimensional stress
NASA Astrophysics Data System (ADS)
Revuzhenko, A. F.; Mikenina, O. A.
2015-10-01
The authors introduce a new stress tensor invariant proportional to the squared intensity of shear stresses divided by the maximum shear stress. The invariant represents a shear stress averaged over three fan-shaped areas aligned with the three principal stresses of the stress tensor. The theory is based on this invariant and the associated flow rule. The article gives the equations of the generalized two-dimensional stress state and an analysis of their types. The authors solve an axisymmetric problem on the limit state around a hole.
Applying Educational Theory to Simulation-Based Training and Assessment in Surgery.
Chauvin, Sheila W
2015-08-01
Considerable progress has been made regarding the range of simulator technologies and simulation formats. Similarly, results from research in human learning and behavior have facilitated the development of best practices in simulation-based training (SBT) and surgical education. Today, SBT is a common curriculum component in surgical education that can significantly complement clinical learning, performance, and patient care experiences. Beginning with important considerations for selecting appropriate forms of simulation, several relevant educational theories of learning are described. PMID:26210964
A simplified orthotropic formulation of the viscoplasticity theory based on overstress
NASA Technical Reports Server (NTRS)
Sutcu, M.; Krempl, E.
1988-01-01
An orthotropic, small strain viscoplasticity theory based on overstress is presented. In each preferred direction the stress is composed of time (rate) independent (or plastic) and viscous (or rate dependent) contributions. Tension-compression asymmetry can depend on direction and is included in the model. Upon a proper choice of a material constant one preferred direction can exhibit linear elastic response while the other two deform in a viscoplastic manner.
A fast algorithm for attribute reduction based on Trie tree and rough set theory
NASA Astrophysics Data System (ADS)
Hu, Feng; Wang, Xiao-yan; Luo, Chuan-jiang
2013-03-01
Attribute reduction is an important issue in rough set theory. Many efficient algorithms have been proposed; however, few of them can process huge data sets quickly. In this paper, algorithms that use a Trie tree to compute the positive region of a decision table are proposed. Building on these, a new Trie-based algorithm for attribute reduction is developed, which can process the attribute reduction of large data sets quickly. Experimental results show its high efficiency.
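The positive region underlying such reduction algorithms can be sketched by collapsing the trie of condition-attribute values into a dictionary keyed by value tuples; objects in groups with a unique decision value belong to the positive region. Function and variable names here are illustrative, not the paper's algorithm.

```python
def positive_region(table, cond_idx, dec_idx):
    """Objects whose condition-attribute values uniquely determine the
    decision. table: list of rows; cond_idx: condition-attribute columns;
    dec_idx: decision column."""
    groups = {}
    for i, row in enumerate(table):
        key = tuple(row[a] for a in cond_idx)       # path through the trie
        groups.setdefault(key, []).append(i)
    pos = []
    for members in groups.values():
        decisions = {table[i][dec_idx] for i in members}
        if len(decisions) == 1:                     # consistent group
            pos.extend(members)
    return sorted(pos)
```

An attribute subset is a reduct candidate when it preserves the positive region of the full attribute set, so this computation is the inner loop of the reduction.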
A theory-based approach to thermal field-flow fractionation of polyacrylates.
Runyon, J Ray; Williams, S Kim Ratanathanawongs
2011-09-28
A theory-based approach is presented for the development of thermal field-flow fractionation (ThFFF) of polyacrylates. The use of ThFFF for polymer analysis has been limited by an incomplete understanding of the thermal diffusion which plays an important role in retention and separation. Hence, a tedious trial-and-error approach to method development has been the normal practice when analyzing new materials. In this work, thermal diffusion theories based on temperature dependent osmotic pressure gradient and polymer-solvent interaction parameters were used to estimate thermal diffusion coefficients (D(T)) and retention times (t(r)) for different polymer-solvent pairs. These calculations identified methyl ethyl ketone as a solvent that would cause significant retention of poly(n-butyl acrylate) (PBA) and poly(methyl acrylate) (PMA). Experiments confirmed retention of these two polymers that have not been previously analyzed by ThFFF. Theoretical and experimental D(T)s and t(r)s for PBA, PMA, and polystyrene in different solvents agreed to within 20% and demonstrate the feasibility of this theory-based approach. PMID:21872869
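The link between D(T) and retention time t(r) follows the classic FFF retention equations: λ = D/(D_T ΔT) and R = 6λ(coth(1/2λ) − 2λ), with t_r = t0/R. A minimal sketch (symbol names and the sample values are illustrative, not the paper's data):

```python
import math

def retention_time(t0, D, D_T, dT):
    """Classic FFF retention: lam = D / (D_T * dT),
    R = 6*lam*(coth(1/(2*lam)) - 2*lam), t_r = t0 / R,
    where t0 is the void time, D the ordinary diffusion coefficient,
    D_T the thermal diffusion coefficient, and dT the applied
    temperature difference."""
    lam = D / (D_T * dT)
    R = 6 * lam * (1 / math.tanh(1 / (2 * lam)) - 2 * lam)  # retention ratio
    return t0 / R
```

A larger D_T (stronger thermal diffusion) gives a smaller λ, a smaller retention ratio R, and therefore a longer retention time, which is why theory-based D_T estimates let one screen polymer-solvent pairs before running the instrument.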
A general theory to analyse and design wireless power transfer based on impedance matching
NASA Astrophysics Data System (ADS)
Liu, Shuo; Chen, Linhui; Zhou, Yongchun; Cui, Tie Jun
2014-10-01
We propose a general theory to analyse and design wireless power transfer (WPT) systems based on impedance matching. We take two commonly used structures as examples, the transformer-coupling-based WPT and the series/parallel capacitor-based WPT, to show how to design the impedance matching network (IMN) to obtain the maximum transfer efficiency and the maximum output power. Using impedance matching theory (IMT), we derive a simple expression for the overall transfer efficiency in terms of the coils' quality factors and the coupling coefficient, which agrees closely with full-circuit simulations. Full-wave electromagnetic software, CST Microwave Studio, has been used to extract the coil parameters, providing a comprehensive way to simulate WPT systems directly from the coils' physical model. We have also discussed the relationship between the output power and the transfer efficiency, and found that the maximum output power and the maximum transfer efficiency may occur at different frequencies; hence, both power and efficiency should be considered in real WPT applications. To validate the proposed theory, two types of WPT experiments have been conducted using 30 cm-diameter coils, lighting a 20 W light bulb with 60% efficiency over a distance of 50 cm. The experimental results show very good agreement with the theoretical predictions.
Ghobadi, Ahmadreza F.; Elliott, J. Richard
2013-12-21
In this work, we aim to develop a version of the Statistical Associating Fluid Theory (SAFT)-γ equation of state (EOS) that is compatible with united-atom force fields, rather than experimental data. We rely on the accuracy of the force fields to provide the relation to experimental data. Although our objective is a transferable theory of interfacial properties for soft and fused heteronuclear chains, we first clarify the details of the SAFT-γ approach in terms of site-based simulations for homogeneous fluids. We show that a direct comparison of Helmholtz free energy to molecular simulation, in the framework of a third order Weeks-Chandler-Andersen perturbation theory, leads to an EOS that takes force field parameters as input and reproduces simulation results for Vapor-Liquid Equilibria (VLE) calculations. For example, saturated liquid density and vapor pressure of n-alkanes ranging from methane to dodecane deviate from those of the Transferable Potential for Phase Equilibria (TraPPE) force field by about 0.8% and 4%, respectively. Similar agreement between simulation and theory is obtained for critical properties and second virial coefficients. The EOS also reproduces simulation data of mixtures with about 5% deviation in bubble point pressure. Extension to inhomogeneous systems and united-atom site types beyond those used in the description of n-alkanes will be addressed in succeeding papers.
NASA Astrophysics Data System (ADS)
Li, Zhenglin; Zhang, Renhe; Li, Fenghua
2010-09-01
Ocean reverberation in shallow water is often the predominant background interference in active sonar applications. It is still an open problem in underwater acoustics. In recent years, an oscillation phenomenon of the reverberation intensity, due to the interference of the normal modes, has been observed in many experiments. A coherent reverberation theory has been developed and used to explain this oscillation phenomenon [F. Li et al., Journal of Sound and Vibration, 252(3), 457-468, 2002]. However, the published coherent reverberation theory is for the range independent environment. Following the derivations by F. Li and Ellis [D. D. Ellis, J. Acoust. Soc. Am., 97(5), 2804-2814, 1995], a general reverberation model based on the adiabatic normal mode theory in a range dependent shallow water environment is presented. From this theory the coherent or incoherent reverberation field caused by sediment inhomogeneity and surface roughness can be predicted. Observations of reverberation from the 2001 Asian Sea International Acoustic Experiment (ASIAEX) in the East China Sea are used to test the model. Model/data comparison shows that the coherent reverberation model can predict the experimental oscillation phenomenon of reverberation intensity and the vertical correlation of reverberation very well.
Theory of plasma contactors in ground-based experiments and low Earth orbit
NASA Technical Reports Server (NTRS)
Gerver, M. J.; Hastings, Daniel E.; Oberhardt, M. R.
1990-01-01
Previous theoretical work on plasma contactors as current collectors has fallen into two categories: collisionless double layer theory (describing space charge limited contactor clouds) and collisional quasineutral theory. Ground based experiments at low current are well explained by double layer theory, but this theory does not scale well to power generation by electrodynamic tethers in space, since very high anode potentials are needed to draw a substantial ambient electron current across the magnetic field in the absence of collisions (or effective collisions due to turbulence). Isotropic quasineutral models of contactor clouds, extending over a region where the effective collision frequency ν_e exceeds the electron cyclotron frequency ω_ce, have low anode potentials, but would collect very little ambient electron current, much less than the emitted ion current. A new model is presented, for an anisotropic contactor cloud oriented along the magnetic field, with ν_e less than ω_ce. The electron motion along the magnetic field is nearly collisionless, forming double layers in that direction, while across the magnetic field the electrons diffuse collisionally and the potential profile is determined by quasineutrality. Using a simplified expression for ν_e due to ion acoustic turbulence, an analytic solution has been found for this model, which should be applicable to current collection in space. The anode potential is low and the collected ambient electron current can be several times the emitted ion current.
A theory-based approach to understanding suicide risk in shelter-seeking women.
Wolford-Clevenger, Caitlin; Smith, Phillip N
2015-04-01
Women seeking shelter from intimate partner violence are at an increased risk for suicide ideation and attempts compared to women in the general population. Control-based violence, which is common among shelter-seeking women, may play a pivotal role in the development of suicide ideation and attempts. Current risk assessment and management practices for shelter-seeking women are limited by the lack of an empirically grounded understanding of increased risk in this population. We argue that in order to more effectively promote risk assessment and management, an empirically supported theory that is sensitive to the experiences of shelter-seeking women is needed. Such a theory-driven approach has the benefits of identifying and prioritizing targetable areas for intervention. Here, we review the evidence for the link between coercive control and suicide ideation and attempts from the perspective of Baumeister's escape theory of suicide. This theory has the potential to explain the role of coercive control in the development of suicide ideation and eventual attempts in shelter-seeking women. Implications for suicide risk assessment and prevention in domestic violence shelters are discussed. PMID:24415137
Bao, Peng
2013-01-01
An interaction energy decomposition analysis method based on the block-localized wavefunction (BLW-ED) approach is described. The first main feature of the BLW-ED method is that it combines concepts of valence bond and molecular orbital theories such that the intermediate and physically intuitive electron-localized states are variationally optimized by self-consistent field calculations. Furthermore, the block-localization scheme can be used both in wave function theory and in density functional theory, providing a useful tool to gain insights on intermolecular interactions that would otherwise be difficult to obtain using the delocalized Kohn–Sham DFT. These features allow broad applications of the BLW method to energy decomposition (BLW-ED) analysis for intermolecular interactions. In this perspective, we outline theoretical aspects of the BLW-ED method, and illustrate its applications in hydrogen-bonding and π–cation intermolecular interactions as well as metal–carbonyl complexes. Future prospects on the development of a multistate density functional theory (MSDFT) are presented, making use of block-localized electronic states as the basis configurations. PMID:21369567
Designing a Project-based Learning in a University with New Theory of Learning
NASA Astrophysics Data System (ADS)
Mima, Noyuri
The new theory of learning indicates that "learning" is a process of interaction occurring in social relationships within a community containing a multitude of things beyond any single individual. From this point of view, "project-based learning" is one of the new methods of teaching and learning at university. The method of project-based learning includes team learning, team teaching, portfolio assessment, open space, and faculty development. This paper discusses the potential of a university to become a learning community through this method, along with results of the educational practice at Future University-Hakodate.
Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study
Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J.
2014-01-01
Mobility is a key factor in the performance of many everyday tasks required for independent living as a person grows older. The purpose of this mixed methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assessing the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3 month and 6 month visits (examples: FES, GDS-SF, Mini-cog). Semi-structured interviews to characterize acceptability of the technology were conducted at 3 month and 6 month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation. PMID:23938159
Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets
NASA Astrophysics Data System (ADS)
Cifter, Atilla
2011-06-01
This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
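The peaks-over-threshold step underlying such a model can be sketched as follows. For simplicity this uses a plain empirical quantile as the threshold (the paper derives it from a wavelet decomposition) and a method-of-moments GPD fit rather than the paper's estimation procedure:

```python
import numpy as np

def gpd_fit_mom(excesses):
    """Method-of-moments fit of a generalized Pareto distribution (GPD)
    to threshold excesses: returns (shape xi, scale sigma)."""
    m, v = excesses.mean(), excesses.var()
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (1.0 + m * m / v)
    return xi, sigma

def var_evt(losses, q=0.99, threshold_q=0.95):
    """Peaks-over-threshold VaR estimate. The threshold here is a plain
    empirical quantile; the paper instead derives it from wavelets."""
    losses = np.asarray(losses)
    u = np.quantile(losses, threshold_q)
    exc = losses[losses > u] - u
    xi, sigma = gpd_fit_mom(exc)
    tail_ratio = (1.0 - q) * losses.size / exc.size
    if abs(xi) < 1e-6:                      # exponential-tail limit
        return u - sigma * np.log(tail_ratio)
    return u + sigma / xi * (tail_ratio ** (-xi) - 1.0)

rng = np.random.default_rng(0)
losses = rng.exponential(1.0, 20000)        # true 99% quantile = ln(100) ~ 4.61
print(round(var_evt(losses), 2))
```

Backtesting such a model by counting VaR violations, as in the abstract, then amounts to checking how often realized losses exceed the estimate.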
Størset, Elisabet; Holford, Nick; Hennig, Stefanie; Bergmann, Troels K; Bergan, Stein; Bremer, Sara; Åsberg, Anders; Midtvedt, Karsten; Staatz, Christine E
2014-01-01
Aims The aim was to develop a theory-based population pharmacokinetic model of tacrolimus in adult kidney transplant recipients and to externally evaluate this model and two previous empirical models. Methods Data were obtained from 242 patients with 3100 tacrolimus whole blood concentrations. External evaluation was performed by examining model predictive performance using Bayesian forecasting. Results Pharmacokinetic disposition parameters were estimated based on tacrolimus plasma concentrations, predicted from whole blood concentrations, haematocrit and literature values for tacrolimus binding to red blood cells. Disposition parameters were allometrically scaled to fat free mass. Tacrolimus whole blood clearance/bioavailability standardized to haematocrit of 45% and fat free mass of 60 kg was estimated to be 16.1 l h−1 [95% CI 12.6, 18.0 l h−1]. Tacrolimus clearance was 30% higher (95% CI 13, 46%) and bioavailability 18% lower (95% CI 2, 29%) in CYP3A5 expressers compared with non-expressers. An Emax model described decreasing tacrolimus bioavailability with increasing prednisolone dose. The theory-based model was superior to the empirical models during external evaluation displaying a median prediction error of −1.2% (95% CI −3.0, 0.1%). Based on simulation, Bayesian forecasting led to 65% (95% CI 62, 68%) of patients achieving a tacrolimus average steady-state concentration within a suggested acceptable range. Conclusion A theory-based population pharmacokinetic model was superior to two empirical models for prediction of tacrolimus concentrations and seemed suitable for Bayesian prediction of tacrolimus doses early after kidney transplantation. PMID:25279405
Web-Based Learning Environment: A Theory-Based Design Process for Development and Evaluation
ERIC Educational Resources Information Center
Nam, Chang S.; Smith-Jackson, Tonya L.
2007-01-01
Web-based courses and programs have increasingly been developed by many academic institutions, organizations, and companies worldwide due to their benefits for both learners and educators. However, many of the developmental approaches lack two important considerations needed for implementing Web-based learning applications: (1) integration of the…
An optimization program based on the method of feasible directions: Theory and users guide
NASA Technical Reports Server (NTRS)
Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.
1994-01-01
The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960's, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.
Kothe, E J; Mullan, B A; Butow, P
2012-06-01
This study evaluated the efficacy of a theory of planned behaviour (TPB) based intervention to increase fruit and vegetable consumption. The extent to which fruit and vegetable consumption and change in intake could be explained by the TPB was also examined. Participants were randomly assigned to two levels of intervention frequency matched for intervention content (low frequency n=92, high frequency n=102). Participants received TPB-based email messages designed to increase fruit and vegetable consumption; the messages targeted attitude, subjective norm and perceived behavioural control (PBC). Baseline and post-intervention measures of TPB variables and behaviour were collected. Across the entire study cohort, fruit and vegetable consumption increased by 0.83 servings/day between baseline and follow-up. Intention, attitude, subjective norm and PBC also increased (p<.05). The TPB successfully modelled fruit and vegetable consumption at both time points but not behaviour change. The increase in fruit and vegetable consumption is a promising preliminary finding for those primarily interested in increasing fruit and vegetable consumption. However, those interested in theory development may have concerns about the use of this model to explain behaviour change in this context. More high quality experimental tests of the theory are needed to confirm this result. PMID:22349778
NASA Astrophysics Data System (ADS)
Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping
2014-09-01
Due to the sparse multipath property of the channel, a channel estimation method based on a partial superimposed training sequence and compressed sensing theory is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed from the training symbols at the receiver. Finally, channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partial superimposed training sequence not only improves the spectral efficiency but also reduces the influence on information symbols. In addition, compared with classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory based channel estimation method can improve both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
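The sparse reconstruction step can be illustrated with orthogonal matching pursuit (OMP), one common compressed sensing recovery algorithm. The real-valued toy channel and the Gaussian observation matrix below are illustrative assumptions, not the paper's exact setup (which uses training symbols and complex signals):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y ~= A @ x."""
    resid, support = y.astype(float), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ resid)))   # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        resid = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n_taps, n_obs, sparsity = 64, 32, 3
h = np.zeros(n_taps)
h[[5, 17, 40]] = [1.0, -0.7, 0.4]          # sparse channel impulse response
A = rng.standard_normal((n_obs, n_taps)) / np.sqrt(n_obs)  # observation matrix
y = A @ h                                   # noiseless measurements
h_hat = omp(A, y, sparsity)
print(np.round(h_hat[[5, 17, 40]], 2))
```

The key point the abstract makes is captured here: far fewer observations than channel taps (32 versus 64) suffice when the impulse response is sparse.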
Student learning in interprofessional practice-based environments: what does theory say?
Roberts, Chris; Kumar, Koshila
2015-01-01
Student learning in interprofessional practice-based environments has garnered significant attention in the last decade, and is reflected in a corresponding increase in published literature on the topic. We review the current empirical literature with specific attention to the theoretical frameworks that have been used to illustrate how and why student learning occurs in interprofessional practice-based environments. Our findings show there are relatively few theoretical-based studies available to guide educators and researchers alike. We recommend a more considered and consistent use of theory and suggest that professional identity and socio-cultural frameworks offer promising avenues for advancing understandings of student learning and professional identity development within interprofessional practice-based environments. PMID:26611786
Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa
2016-01-01
The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing. PMID:27462287
NASA Technical Reports Server (NTRS)
Waszak, Martin R.
1992-01-01
The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.
Formula for the rms blur circle radius of Wolter telescope based on aberration theory
NASA Technical Reports Server (NTRS)
Shealy, David L.; Saha, Timo T.
1990-01-01
A formula for the rms blur circle for Wolter telescopes has been derived using the transverse ray aberration expressions of Saha (1985), Saha (1984), and Saha (1986). The resulting formula for the rms blur circle radius over an image plane and a formula for the surface of best focus based on third-, fifth-, and seventh-order aberration theory predict results in good agreement with exact ray tracing. It has also been shown that one of the two terms in the empirical formula of VanSpeybroeck and Chase (1972) for the rms blur circle radius of a Wolter I telescope can be justified by the aberration theory results. Numerical results are given comparing the rms blur radius and the surface of best focus vs the half-field angle computed by skew ray tracing and from analytical formulas for grazing incidence Wolter I-II telescopes and a normal incidence Cassegrain telescope.
Theory of energy transfer interactions near sphere and nanoshell based plasmonic nanostructures
NASA Astrophysics Data System (ADS)
Shishodia, Manmohan S.; Fainberg, Boris D.; Nitzan, Abraham
2011-10-01
A theory of energy transfer interactions between a pair of two-level molecules in a molecular nanojunction, including the surface plasmon (SP) dressed interaction of a plasmonic nanostructure replicating metallic leads, is presented. Results on the modification of the bare dipolar interaction, known to be responsible for molecular energy transfer processes, in the proximity of a metallic nanosystem are presented. Specifically, the manuscript includes a theoretical investigation of nanosphere (NSP) monomer, nanoshell (NSH) monomer, and coupled nanosphere pair (dimer) based nanosystems. Closed form analytical expressions for NSP and NSH structures tailored to the molecular nanojunction geometry are derived in the theoretical framework of the multipole spectral expansion (MSE) method, which is straightforwardly extendible to dimers and multimers. The role of size and dielectric environment on energy transfer is investigated and interpreted. The theory predicts that monomers and dimers both enhance the dipolar interaction; yet the dimer geometry is favorable due to its spectral tuning potential, which originates from plasmon hybridization, and its closer resemblance to typical molecular nanojunctions.
Dynamically Incremental K-means++ Clustering Algorithm Based on Fuzzy Rough Set Theory
NASA Astrophysics Data System (ADS)
Li, Wei; Wang, Rujing; Jia, Xiufang; Jiang, Qing
Because the classic K-means++ clustering algorithm applies only to static data, a dynamically incremental K-means++ clustering algorithm (DK-Means++) based on fuzzy rough set theory is presented in this paper. Firstly, in the DK-Means++ clustering algorithm, the similarity formula is improved with weights computed from the importance degree of attributes, which are reduced on the basis of rough fuzzy set theory. Secondly, new data need only be matched to granules already clustered by the K-means++ algorithm; only rarely is new data clustered by the classic K-means++ algorithm over the global data. In this way, re-clustering all data each time the dynamic data set grows is avoided, so the efficiency of clustering is improved. Our experiments show that the DK-Means++ algorithm can objectively and efficiently deal with the clustering problem of dynamically incremental data.
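The incremental assignment step can be sketched as follows. This is a minimal illustration that uses plain Euclidean distance in place of the rough-set-weighted similarity, and the `radius` acceptance threshold is a hypothetical stand-in for the granule-matching criterion:

```python
import numpy as np

def nearest_centroid(x, centroids):
    """Return the index of and distance to the closest centroid."""
    d = np.linalg.norm(centroids - x, axis=1)
    return int(np.argmin(d)), float(d.min())

def incremental_assign(x, centroids, counts, radius):
    """Assign a new point to an existing granule (cluster) when it lies
    within `radius` of the closest centroid, updating that centroid as a
    running mean; otherwise report that a full re-clustering is needed."""
    j, dist = nearest_centroid(x, centroids)
    if dist <= radius:
        counts[j] += 1
        centroids[j] += (x - centroids[j]) / counts[j]   # running-mean update
        return j
    return -1   # caller falls back to classic K-means++ on the global data

centroids = np.array([[0.0, 0.0], [10.0, 10.0]])
counts = np.array([50, 50])
print(incremental_assign(np.array([0.4, -0.2]), centroids, counts, radius=2.0))
print(incremental_assign(np.array([5.0, 5.0]), centroids, counts, radius=2.0))
```

Only the second point, far from every existing granule, would trigger the expensive global re-clustering path.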
Hybrid framework based on evidence theory for blood cell image segmentation
NASA Astrophysics Data System (ADS)
Baghli, Ismahan; Nakib, Amir; Sellam, Elie; Benazzouz, Mourtada; Chikh, Amine; Petit, Eric
2014-03-01
The segmentation of microscopic images is an important issue in biomedical image processing. Many works can be found in the literature; however, there is no gold standard method able to provide good results for all kinds of microscopic images. Authors therefore propose methods for a given kind of microscopic image. This paper presents a new segmentation framework based on evidence theory, called ESA (Evidential Segmentation Algorithm), to segment blood cell images. The proposed algorithm solves the segmentation problem for blood cell images. Herein, our goal is to extract the components of a given cell image by using evidence theory, which allows more flexibility in classifying the pixels. The obtained results show the efficiency of the proposed algorithm compared to other competing methods.
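Evidential pixel classification of this kind rests on Dempster-Shafer theory. As a minimal sketch of the underlying machinery, the following combines two basic mass assignments with Dempster's rule; the feature names and mass values are hypothetical, not taken from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments
    (dicts mapping frozensets of class labels to masses)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass assigned to incompatible sets
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two features giving partial, possibly ambiguous evidence about a pixel
nucleus, cytoplasm = frozenset({"nucleus"}), frozenset({"cytoplasm"})
either = nucleus | cytoplasm
m_color = {nucleus: 0.6, either: 0.4}
m_texture = {nucleus: 0.5, cytoplasm: 0.3, either: 0.2}
print(dempster_combine(m_color, m_texture))
```

The ability to place mass on the non-singleton set `either` is exactly the flexibility the abstract refers to: a feature can decline to commit to a single class.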
NASA Astrophysics Data System (ADS)
Sourki, R.; Hoseini, S. A. H.
2016-04-01
This paper investigates the free transverse vibration of a cracked microbeam based on the modified couple stress theory within the framework of Euler-Bernoulli beam theory. The governing equation and the related boundary conditions are derived by using Hamilton's principle. The cracked beam is modeled by dividing the beam into two segments connected by a rotational spring located at the cracked section. This model invokes the consideration of the additional strain energy caused by the crack and introduces a discontinuity in the bending slope. In this investigation, the influence of crack position, crack severity, material length scale parameter, and Poisson's ratio on the natural frequencies is studied. A comparison with previously published studies is made, in which good agreement is observed. The results illustrate that the aforementioned parameters play a significant role in the dynamic behavior of the microbeam.
A 3-D elasticity theory based model for acoustic radiation from multilayered anisotropic plates.
Shen, C; Xin, F X; Lu, T J
2014-05-01
A theoretical model built upon three-dimensional elasticity theory is developed to investigate the acoustic radiation from multilayered anisotropic plates subjected to a harmonic point force excitation. The Fourier transform technique and the stationary phase method are combined to predict the far-field radiated sound pressure of a plate immersed in water on one side. Compared to equivalent single-layer plate models, the present model based on elasticity theory can differentiate the radiated sound pressure between dry-side and wet-side excited cases, as well as discrepancies induced by different layer sequences for multilayered anisotropic plates. These results highlight the superiority of the present theoretical model, especially for handling multilayered anisotropic structures. PMID:24815294
Control theory based airfoil design for potential flow and a finite volume discretization
NASA Technical Reports Server (NTRS)
Reuther, J.; Jameson, A.
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.
An English Vocabulary Learning System Based on Fuzzy Theory and Memory Cycle
NASA Astrophysics Data System (ADS)
Wang, Tzone I.; Chiu, Ti Kai; Huang, Liang Jun; Fu, Ru Xuan; Hsieh, Tung-Cheng
This paper proposes an English vocabulary learning system based on fuzzy theory and the memory cycle theory to help learners memorize vocabulary easily. By using fuzzy inferences and personal memory cycles, it is possible to find the article that best suits a learner. After reading an article, a quiz is provided for the learner to reinforce memory of the vocabulary in the article. Earlier research used only explicit responses (e.g., quiz exams) to update the memory cycles of newly learned vocabulary; beyond that approach, this paper proposes a methodology that also implicitly modifies the memory cycles of learned words. By intensive reading of the articles recommended by our approach, a learner learns new words quickly and also reviews learned words implicitly, so the learner's vocabulary ability improves efficiently.
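The dual update of memory cycles can be sketched as follows. This is a hypothetical minimal model (the growth and boost factors are illustrative, not the paper's): an explicit quiz-driven update sits alongside a smaller implicit update that fires when a word is merely re-encountered while reading a recommended article:

```python
from dataclasses import dataclass

@dataclass
class WordMemory:
    word: str
    interval_days: float = 1.0   # current memory-cycle length

    def explicit_review(self, correct: bool, growth: float = 2.0):
        """Quiz result: expand the cycle on success, reset it on failure."""
        self.interval_days = self.interval_days * growth if correct else 1.0

    def implicit_review(self, boost: float = 1.2):
        """Re-encountering the word in a recommended article gives a
        smaller, implicit extension of the memory cycle."""
        self.interval_days *= boost

w = WordMemory("serendipity")
w.explicit_review(correct=True)   # quiz answered correctly
w.implicit_review()               # word seen again while reading
print(w.interval_days)
```

Articles would then be selected so that words whose cycles are about to expire appear in the recommended reading.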
Characterization of degeneration process in combustion instability based on dynamical systems theory
NASA Astrophysics Data System (ADS)
Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru
2015-11-01
We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011), 10.1063/1.3563577; Chaos 22, 043128 (2012), 10.1063/1.4766589]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics.
Paek, Hye-Jin; Hilyard, Karen; Freimuth, Vicki; Barge, J Kevin; Mindlin, Michele
2010-06-01
Recent natural and human-caused disasters have awakened public health officials to the importance of emergency preparedness. Guided by health behavior and media effects theories, the analysis of a statewide survey in Georgia reveals that self-efficacy, subjective norm, and emergency news exposure are positively associated with the respondents' possession of emergency items and their stages of emergency preparedness. Practical implications suggest less focus on demographics as the sole predictor of emergency preparedness and more comprehensive measures of preparedness, including both a person's cognitive stage of preparedness and checklists of emergency items on hand. We highlight the utility of theory-based approaches for understanding and predicting public emergency preparedness as a way to enable more effective health and risk communication. PMID:20574880
Modeling of two-phase magnetic materials based on Jiles-Atherton theory of hysteresis
NASA Astrophysics Data System (ADS)
Raghunathan, A.; Melikhov, Y.; Snyder, J. E.; Jiles, D. C.
2012-01-01
The Jiles-Atherton (JA) theory of hysteresis has been extended in the present paper to model hysteresis in two-phase magnetic materials. Two-phase materials are those that exhibit two magnetic phases in one hysteresis cycle: one at lower fields and the other at higher fields. In magnetic hysteresis, the transition from one phase to the other, i.e., from the low-field phase to the high-field phase, depends mainly on the exchange field. Hence, the material-dependent microstructural parameters of JA theory (spontaneous magnetization MS, pinning factor k, domain density a, domain coupling α, and reversibility factor c) are represented as functions of the exchange field. Several cases based on this model have been discussed and compared with measured data from the existing literature. The shapes of the calculated and measured hysteresis loops are in excellent agreement.
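As an illustrative aside (not taken from the paper), the classic single-phase JA model that the abstract extends can be sketched in a few lines. Parameter names follow the abstract (Ms, a, α, k, c); the numerical values, the Euler integration, and the overshoot guard are assumptions of this sketch, not fitted or published choices:

```python
import numpy as np

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x, with a series expansion near 0."""
    return x / 3.0 if abs(x) < 1e-4 else 1.0 / np.tanh(x) - 1.0 / x

def ja_loop(H_path, Ms=1.6e6, a=1100.0, alpha=1.6e-3, k=400.0, c=0.2):
    """Integrate the classic single-phase Jiles-Atherton model along a field
    path (A/m).  Ms: spontaneous magnetization, a: domain density,
    alpha: domain coupling, k: pinning, c: reversibility (abstract's notation).
    Values are illustrative only."""
    Mirr, M = 0.0, 0.0
    out = [0.0]
    for i in range(1, len(H_path)):
        H, dH = H_path[i], H_path[i] - H_path[i - 1]
        delta = 1.0 if dH >= 0 else -1.0
        He = H + alpha * M                  # effective (exchange-coupled) field
        Man = Ms * langevin(He / a)         # anhysteretic magnetization
        diff = Man - Mirr
        if diff * dH < 0:                   # pinning blocks motion away from Man
            step = 0.0
        else:
            denom = delta * k - alpha * diff
            # hypothetical regularization: jump to Man where the ODE is stiff
            step = diff if delta * denom <= 0 else diff / denom * dH
            if abs(step) > abs(diff):       # crude guard against Euler overshoot
                step = diff
        Mirr += step
        M = (1.0 - c) * Mirr + c * Man      # reversible + irreversible parts
        out.append(M)
    return np.array(out)

# One major loop: magnetize up, sweep down, sweep back up.
H = np.concatenate([np.linspace(0.0, 5e3, 1000),
                    np.linspace(5e3, -5e3, 2000),
                    np.linspace(-5e3, 5e3, 2000)])
M = ja_loop(H)
```

The two-phase extension described in the abstract would then make Ms, k, a, α and c functions of the exchange field He rather than constants.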
Social judgment theory based model on opinion formation, polarization and evolution
NASA Astrophysics Data System (ADS)
Chau, H. F.; Wong, C. Y.; Chow, F. K.; Fung, Chi-Hang Fred
2014-12-01
The dynamical origin of opinion polarization in the real world is an interesting topic that physical scientists may help to understand. To properly model the dynamics, the theory must be fully compatible with findings by social psychologists on microscopic opinion change. Here we introduce a generic model of opinion formation with homogeneous agents based on the well-known social judgment theory in social psychology by extending a similar model proposed by Jager and Amblard. The agents’ opinions will eventually cluster around extreme and/or moderate opinions forming three phases in a two-dimensional parameter space that describes the microscopic opinion response of the agents. The dynamics of this model can be qualitatively understood by mean-field analysis. More importantly, first-order phase transition in opinion distribution is observed by evolving the system under a slow change in the system parameters, showing that punctuated equilibria in public opinion can occur even in a fully connected social network.
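A minimal sketch of a social-judgment-style opinion model of the kind described is shown below: pairwise updates with assimilation below a latitude of acceptance u and contrast beyond a latitude of rejection t. The specific update rule, parameter values, and clipping to [-1, 1] are assumptions of this sketch, not the paper's exact model:

```python
import numpy as np

def simulate(n=200, u=0.4, t=1.4, mu=0.1, steps=20000, seed=1):
    """Pairwise opinion updates on a fully connected network.
    |x_i - x_j| < u  -> attraction (latitude of acceptance)
    |x_i - x_j| > t  -> repulsion  (latitude of rejection)
    otherwise        -> no change  (latitude of non-commitment)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, n)
    for _ in range(steps):
        i, j = rng.choice(n, size=2, replace=False)
        d = x[j] - x[i]
        if abs(d) < u:            # assimilate: move toward each other
            x[i] += mu * d
            x[j] -= mu * d
        elif abs(d) > t:          # contrast: move apart, clipped to [-1, 1]
            x[i] = np.clip(x[i] - mu * d, -1.0, 1.0)
            x[j] = np.clip(x[j] + mu * d, -1.0, 1.0)
    return x

x_final = simulate()
```

Varying (u, t) over a two-dimensional grid is what produces the extreme/moderate clustering phases the abstract refers to; with a very wide latitude of acceptance the population simply reaches consensus.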
Slender-Body Theory Based On Approximate Solution of the Transonic Flow Equation
NASA Technical Reports Server (NTRS)
Spreiter, John R.; Alksne, Alberta Y.
1959-01-01
Approximate solutions of the nonlinear equations of the small disturbance theory of transonic flow are found for the pressure distribution on pointed slender bodies of revolution for flows with free-stream Mach number 1, and for flows that are either purely subsonic or purely supersonic. These results are obtained by application of a method based on local linearization that was introduced recently in the analysis of similar problems in two-dimensional flows. The theory is developed for bodies of arbitrary shape, and specific results are given for cone-cylinders and for parabolic-arc bodies at zero angle of attack. All results are compared either with existing theoretical results or with experimental data.
Paying for Express Checkout: Competition and Price Discrimination in Multi-Server Queuing Systems
Deck, Cary; Kimbrough, Erik O.; Mongrain, Steeve
2014-01-01
We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting cost-based-price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus. PMID:24667809
NASA Astrophysics Data System (ADS)
Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.
2012-01-01
Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. Objectives of this work include (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures at two thoroughly-monitored agricultural watersheds located in different hydroclimatic zones with similar land use, and (b) to elucidate and quantify temporal and spatial scale effects on those measures. We selected two USDA experimental watersheds to serve as case study examples, including the Little River experimental watershed (LREW) in Tifton, Georgia and the Sleepers River experimental watershed (SREW) in North Danville, Vermont. Both watersheds possess several nested sub-watersheds and more than 30 years of continuous data records of precipitation and streamflow. Information content measures (metric entropy and mean information gain) and complexity measures (effective measure complexity and fluctuation complexity) were computed based on the binary encoding of 5-year streamflow and precipitation time series data. We quantified patterns of streamflow using probabilities of joint or sequential appearances of the binary symbol sequences. Results of our analysis illustrate that information content measures of streamflow time series are much smaller than those for precipitation data, and the streamflow data also exhibit higher complexity, suggesting that the watersheds effectively act as filters of the precipitation information that leads to the observed additional complexity in streamflow measures. Correlation coefficients between the information-theory-based measures and time intervals are close to 0.9, demonstrating the significance of temporal scale effects on streamflow patterns. Moderate spatial scale effects on streamflow patterns are observed with absolute values of correlation coefficients between the measures and sub-watershed area
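The two information-content measures named above can be sketched for a binary-encoded series as follows. The median-threshold encoding, the word length L = 4, and the surrogate "precipitation"/"streamflow" series are assumptions of this sketch, not the study's data:

```python
import numpy as np
from collections import Counter

def binarize(x):
    """Binary encoding: 1 where the series exceeds its median, else 0."""
    return (np.asarray(x) > np.median(x)).astype(int)

def block_entropy(s, L):
    """Shannon entropy (bits) of the distribution of words of length L."""
    words = Counter(tuple(s[i:i + L]) for i in range(len(s) - L + 1))
    n = sum(words.values())
    p = np.array([c / n for c in words.values()])
    return float(-(p * np.log2(p)).sum())

def metric_entropy(s, L=4):
    """Block entropy per symbol; lies in [0, 1] for a binary alphabet."""
    return block_entropy(s, L) / L

def mean_information_gain(s, L=4):
    """Average new information per symbol: H(L) - H(L-1)."""
    return block_entropy(s, L) - block_entropy(s, L - 1)

rng = np.random.default_rng(0)
noise = rng.normal(size=5000)                                # irregular, rain-like
smooth = np.convolve(noise, np.ones(50) / 50, mode="valid")  # filtered, flow-like
```

Consistent with the abstract's finding, the smoothed (streamflow-like) series yields lower information content than the irregular (precipitation-like) one, reflecting the watershed's filtering role.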
ERIC Educational Resources Information Center
Bohart, Arthur C.
There is relatively little theory on how psychotherapy clients self-heal since most theories of therapy stress the magic of the therapist's interventions. Of the theories that exist, this paper briefly discusses Carl Rogers' theory of self-actualization; and the dialectical theories of Greenberg and his colleagues, Jenkins, and Rychlak. Gendlin's…
NASA Astrophysics Data System (ADS)
Chaudhuri, Reaz A.; Kabir, Humayun R.
1992-11-01
A new methodology based on classical shallow shell theories is presented for solution to the static response and eigenvalue problems, involving a system of one fourth-order and two third-order highly coupled linear partial differential equations with the SS2-type simply supported boundary conditions. A comparison with solutions based on the first-order shear deformation theory made it possible to establish the upper limit of validity of the present classical lamination theory (CLT) based natural frequencies for angle-ply panels. Data obtained confirmed that introduction of transverse shear stress resultants into the two surface-parallel force equilibrium equations without concomitant changes in the kinematic relations constitutes little improvement over Donnell's or Sanders' shell theories, and all four classical shallow shell theories furnish virtually indistinguishable numerical results.
Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)
2006-10-01
Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
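A toy version of such a sampling-based strategy can illustrate the mechanics: a basic probability assignment places mass on focal elements (here, intervals), each focal element's image under the model is estimated by naive sampling, and belief/plausibility of an output condition are accumulated. The model f, the focal elements, and all numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical basic probability assignment: interval focal elements with masses.
focal_x = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]

def f(x):
    """Stand-in for a computationally expensive model."""
    return x ** 2 + 1.0

def bel_pl_exceeds(focal, threshold, n_samples=1000):
    """Belief/plausibility that f(x) > threshold.  Each focal element's image
    range is estimated by naive Monte Carlo sampling within the element."""
    bel = pl = 0.0
    for (lo, hi), m in focal:
        y = f(rng.uniform(lo, hi, n_samples))
        if y.min() > threshold:   # whole image above threshold: supports belief
            bel += m
        if y.max() > threshold:   # image touches the region: adds plausibility
            pl += m
    return bel, pl

bel, pl = bel_pl_exceeds(focal_x, threshold=2.0)
```

The gap between belief and plausibility is the hallmark of the less restrictive uncertainty specification the abstract describes; the sophistication of the actual strategy lies in spending far fewer model evaluations than this naive loop.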
ERIC Educational Resources Information Center
Al-Amri, Mohammed
2010-01-01
Discipline-Based Art Education (DBAE), a theory developed in the USA, has been influential but also used in Art Education institutions world-wide. One of its stated goals was to develop the quality of teaching art education. Today, it is used as a theory for identifying and assessing good practices in the field of Art Education. The purpose of…
ERIC Educational Resources Information Center
Sung, Dia; You, Yeongmahn; Song, Ji Hoon
2008-01-01
The purpose of this research is to explore the possibility of viable learning organizations based on identifying viable organizational learning mechanisms. Two theoretical foundations, complex system theory and viable system theory, have been integrated to provide the rationale for building the sustainable organizational learning mechanism. The…
ERIC Educational Resources Information Center
Colakoglu, Ozgur M.; Akdemir, Omur
2012-01-01
The ARCS Motivation Theory was proposed to guide instructional designers and teachers who develop their own instruction to integrate motivational design strategies into the instruction. There is a lack of literature supporting the idea that instruction for blended courses if designed based on the ARCS Motivation Theory provides different…
How Does an Activity Theory Model Help to Know Better about Teaching with Electronic-Exercise-Bases?
ERIC Educational Resources Information Center
Abboud-Blanchard, Maha; Cazes, Claire
2012-01-01
The research presented in this paper relies on Activity Theory and particularly on Engestrom's model, to better understand the use of Electronic-Exercise-Bases (EEB) by mathematics teachers. This theory provides a holistic approach to illustrate the complexity of the EEB integration. The results highlight reasons and ways of using EEB and show…
ERIC Educational Resources Information Center
Fukuhara, Hirotaka; Kamata, Akihito
2011-01-01
A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…
ERIC Educational Resources Information Center
Barnhardt, Bradford; Ginns, Paul
2014-01-01
This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…
NASA Technical Reports Server (NTRS)
Krempl, Erhard; Hong, Bor Zen
1989-01-01
A macromechanics analysis is presented for the in-plane, anisotropic time-dependent behavior of metal matrix laminates. The small deformation, orthotropic viscoplasticity theory based on overstress represents lamina behavior in a modified simple laminate theory. Material functions and constants can be identified in principle from experiments with laminae. Orthotropic invariants can be repositories for tension-compression asymmetry and for linear elasticity in one direction while the other directions behave in a viscoplastic manner. Computer programs are generated and tested for either unidirectional or symmetric laminates under in-plane loading. Correlations with the experimental results on metal matrix composites are presented.
Simple Models for Airport Delays During Transition to a Trajectory-Based Air Traffic System
NASA Astrophysics Data System (ADS)
Brooker, Peter
It is now widely recognised that a paradigm shift in air traffic control concepts is needed. This requires state-of-the-art innovative technologies, making much better use of the information in the air traffic management (ATM) system. These paradigm shifts go under the names of NextGen in the USA and SESAR in Europe, which inter alia will make dramatic changes to the nature of airport operations. A vital part of moving from an existing system to a new paradigm is the operational implications of the transition process. There would be business incentives for early aircraft fitment, it is generally safer to introduce new technologies gradually, and researchers are already proposing potential transition steps to the new system. Simple queuing theory models are used to establish rough quantitative estimates of the impact of the transition to a more efficient time-based navigational and ATM system. Such models are approximate, but they do offer insight into the broad implications of system change and its significant features. 4D-equipped aircraft in essence have a contract with the airport runway and, in return, they would get priority over any other aircraft waiting for use of the runway. The main operational feature examined here is the queuing delays affecting non-4D-equipped arrivals. These get a reasonable service if the proportion of 4D-equipped aircraft is low, but this can deteriorate markedly for high proportions, and be economically unviable. Preventative measures would be to limit the additional growth of 4D-equipped flights and/or to modify their contracts to provide sufficient space for the non-4D-equipped flights to operate without excessive delays. There is a potential for non-Poisson models, for which there is little in the literature, and for more complex models, e.g. grouping a succession of 4D-equipped aircraft as a batch.
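The kind of rough estimate described can be reproduced with the standard nonpreemptive-priority M/M/1 formulas, treating 4D-equipped aircraft as the priority class at a single runway. The arrival and service rates below are illustrative assumptions, not figures from the study:

```python
def priority_waits(lam, mu, p_equipped):
    """Mean queuing delays (hours) at an M/M/1 runway where a fraction
    p_equipped of arrivals (4D-equipped) has nonpreemptive priority.
    lam: total arrival rate per hour, mu: runway service rate per hour."""
    lam1 = p_equipped * lam               # priority (4D-equipped) arrivals
    rho, rho1 = lam / mu, lam1 / mu
    assert rho < 1.0, "queue must be stable"
    R = lam / mu**2                       # mean residual service (exponential)
    w_priority = R / (1.0 - rho1)
    w_other = R / ((1.0 - rho1) * (1.0 - rho))
    return w_priority, w_other

# Illustrative: runway serves 40 aircraft/h, total demand 36/h.
for p in (0.1, 0.5, 0.9):
    w1, w2 = priority_waits(36.0, 40.0, p)
    print(f"equipped share {p:.0%}: equipped wait {60*w1:5.1f} min, "
          f"non-equipped wait {60*w2:5.1f} min")
```

The formulas reproduce the qualitative conclusion above: non-4D-equipped arrivals see reasonable delays while the equipped share is low, but their waits deteriorate sharply as that share grows, since the class-weighted average delay is conserved while the priority class is protected.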
Equation of state of detonation products based on statistical mechanical theory
NASA Astrophysics Data System (ADS)
Zhao, Yanhong; Liu, Haifeng; Zhang, Gongmu; Song, Haifeng
2015-06-01
The equation of state (EOS) of gaseous detonation products is calculated using Ross's modification of hard-sphere variational theory and the improved one-fluid van der Waals mixture model. The condensed phase of carbon is a mixture of graphite, diamond, graphite-like liquid and diamond-like liquid. For a mixed system of detonation products, the free energy minimization principle is used to calculate the equilibrium compositions of detonation products by solving chemical equilibrium equations. Meanwhile, a chemical equilibrium code is developed based on the theory proposed in this article, and it is then applied to typical calculations as follows: (i) calculation of detonation parameters of explosives, for which the calculated detonation velocity, detonation pressure and detonation temperature are in good agreement with experimental values; (ii) calculation of the isentropic unloading line of RDX explosive, whose starting point is the CJ point. In comparison with the results of the JWL EOS, it is found that the value of gamma calculated with the theory presented in this paper decreases monotonically, while a double-peak phenomenon appears with the JWL EOS.
Toward a limited realism for psychiatric nosology based on the coherence theory of truth.
Kendler, K S
2015-04-01
A fundamental debate in the philosophy of science is whether our central concepts are true or only useful instruments to help predict and manipulate the world. The first position is termed 'realism' and the second 'instrumentalism'. Strong support for the instrumentalist position comes from the 'pessimistic induction' (PI) argument. Given that many key scientific concepts once considered true (e.g., humors, ether, epicycles, phlogiston) are now considered false, how, the argument goes, can we assert that our current concepts are true? The PI argument applies strongly to psychiatric diagnoses. Given our long history of abandoned diagnoses, arguments that we have finally 'gotten it right' and developed definitive psychiatric categories that correspond to observer-independent reality are difficult to defend. For our current diagnostic categories, we should settle for a less ambitious vision of truth. For this, the coherence theory, which postulates that something is true when it fits well with the other things we confidently know about the world, can serve us well. Using the coherence theory, a diagnosis is real to the extent that it is well integrated into our accumulating scientific database. Furthermore, the coherence theory establishes a framework for us to evaluate our diagnostic categories and can provide a set of criteria, closely related to our concept of validators, for deciding when they are getting better. Finally, we need to be much less skeptical about the truth status of the aggregate concept of psychiatric illness than we are regarding the specific categories in our current nosology. PMID:25181016
Wang, Huaqing; Chen, Peng
2009-01-01
This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021
Evidence-based medicine: a Kuhnian perspective of a transvestite non-theory.
Couto, J S
1998-11-01
Evidence-based medicine (EBM) has been presented by its protagonists as a new paradigm for medical practice. In this article that claim is analysed through the theory of scientific development proposed by Thomas S. Kuhn in 1962. Traditional medical paradigms are discussed, as well as the assumptions of the supposedly 'new' paradigm of EBM. The value of the results of randomized clinical trials (RCTs) for the elaboration of clinical guidelines is analysed within the context of the assumptions of EBM and the paradigm concept of Thomas S. Kuhn. It is argued that the results of RCTs, whenever contradicted by fundamental medical theory, constitute inadmissible evidence for the development of clinical guidelines. The supremacy of results of clinical trials over traditional medical paradigms, advocated by the protagonists of EBM, is rejected. Fundamental contradictions of EBM are also exposed: the fact that there is no evidence to support the utility of EBM and its call for a new type of authoritarianism in medicine. Finally, it is suggested that 'epidemiology-based medical practice' is a better, rhetoric-free designation for what is currently termed 'evidence-based medicine'. It is concluded that EBM is not what it claims to be and that its assumptions are simply irrational. PMID:9927237
Third-order theory of the Risley-prism-based beam steering system.
Li, Yajun
2011-02-10
Nonparaxial ray tracing is performed to investigate the field scanned out by a single beam passing through two rotatable thick prisms with different parameters. A general solution is obtained and then expanded into a power series to establish the third-order theory for Risley prisms. This theory paves the way to investigate topics of interest such as optical distortions in the scan pattern and an analytical solution of the inverse problem of a Risley-prism-based laser beam steering system, i.e., the problem of how to direct a laser beam to any specified direction within the angular range of the system. PMID:21343989
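For intuition only, the familiar first-order (thin-prism, paraxial) approximation of a Risley pair, the lowest-order version of what the paper carries to third order, can be sketched as follows; the refractive index and wedge angles are illustrative assumptions:

```python
import numpy as np

def thin_prism_deviation(n, wedge_deg):
    """First-order deviation (degrees) of a thin wedge prism: (n - 1) * alpha."""
    return (n - 1.0) * wedge_deg

def risley_pointing(theta1_deg, theta2_deg, d1, d2):
    """First-order beam deviation: vector sum of the two prism deviations,
    each rotated to its prism's current roll angle.  Returns (x, y) in deg."""
    t1, t2 = np.radians(theta1_deg), np.radians(theta2_deg)
    x = d1 * np.cos(t1) + d2 * np.cos(t2)
    y = d1 * np.sin(t1) + d2 * np.sin(t2)
    return x, y

d1 = thin_prism_deviation(1.517, 10.0)   # illustrative BK7-like wedges
d2 = thin_prism_deviation(1.517, 10.0)

# Spin prism 2 against a fixed prism 1: the pointing radius sweeps
# from |d1 - d2| (prisms opposed) to d1 + d2 (prisms aligned).
angles = np.linspace(0.0, 360.0, 721)
r = [np.hypot(*risley_pointing(0.0, a, d1, d2)) for a in angles]
```

The third-order theory in the paper corrects exactly the distortions this first-order picture hides, e.g. the fact that the true scan pattern is not a perfect circle at large deviation angles.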
Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method
Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui
2014-01-01
A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159
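As a sketch of the entropy weight step mentioned above (the extension-theory part is omitted), the standard entropy weight method assigns larger weights to indices with greater dispersion across samples; the inspection-score matrix below is hypothetical:

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from a decision matrix X (rows: samples,
    columns: benefit-type indices), via the standard entropy weight method."""
    X = np.asarray(X, dtype=float)
    m = X.shape[0]
    P = X / X.sum(axis=0)                 # share of each sample per index
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)    # normalized entropy of each index
    d = 1.0 - e                           # degree of diversification
    return d / d.sum()                    # weights, summing to 1

# Hypothetical inspection scores for four bogie samples on three indices.
# Indices 0 and 2 are identical across samples; index 1 discriminates.
X = [[0.9, 0.2, 0.5],
     [0.9, 0.8, 0.5],
     [0.9, 0.4, 0.5],
     [0.9, 0.6, 0.5]]
w = entropy_weights(X)
```

An index on which all samples score identically carries no discriminating information (maximum entropy), so it receives zero weight; the whole weight concentrates on the index that varies.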
Yannopapas, Vassilios
2007-12-15
A rigorous theory for the determination of the van der Waals interactions in colloidal systems is presented. The method is based on fluctuational electrodynamics and a multiple-scattering method which provides the electromagnetic Green's tensor. In particular, expressions for the Green's tensor are presented for arbitrary, finite collections of colloidal particles, for infinitely periodic or defected crystals, as well as for finite slabs of crystals. The presented formalism allows for ab initio calculations of the van der Waals interactions in colloidal systems since it takes fully into account retardation, many-body, multipolar, and near-field effects.
Density functional theory based simulations of silicon nanowire field effect transistors
NASA Astrophysics Data System (ADS)
Shin, Mincheol; Jeong, Woo Jin; Lee, Jaehyun
2016-04-01
First-principles density functional theory (DFT) based, atomistic, self-consistent device simulations are performed for realistically sized Si nanowire field effect transistors (NW FETs) having tens of thousands of atoms. Through mode space transformation, DFT Hamiltonian and overlap matrices are reduced in size from a few thousand to around one hundred. Ultra-efficient quantum-mechanical transport calculations in the non-equilibrium Green's function formalism in a non-orthogonal basis are therefore made possible. The n-type and p-type Si NW FETs are simulated and found to exhibit similar device performance in the nanoscale regime.
Constraints on Neutron Star Radii Based on Chiral Effective Field Theory Interactions
Hebeler, K.; Lattimer, J. M.; Pethick, C. J.; Schwenk, A.
2010-10-15
We show that microscopic calculations based on chiral effective field theory interactions constrain the properties of neutron-rich matter below nuclear densities to a much higher degree than is reflected in commonly used equations of state. Combined with observed neutron star masses, our results lead to a radius R = 9.7-13.9 km for a 1.4 M⊙ star, where the theoretical range is due, in about equal amounts, to uncertainties in many-body forces and to the extrapolation to high densities.
NASA Astrophysics Data System (ADS)
Cao, Xianzhong; Wang, Feng; Zheng, Zhongmei
The paper reports an educational experiment on an e-Learning instructional design model based on Cognitive Flexibility Theory; the experiment was conducted to explore the feasibility and effectiveness of the model in promoting learning quality in ill-structured domains. The study performed the experiment on two groups of students: one group learned through the system designed by the model and the other learned by the traditional method. The results of the experiment indicate that the e-Learning designed through the model is helpful in promoting intrinsic motivation, learning quality in ill-structured domains, the ability to resolve ill-structured problems and the creative thinking ability of the students.
Unique laminar-flow stability limit based on shallow-water theory
Chen, Cheng-lung
1993-01-01
Two approaches are generally taken in deriving the stability limit of the Froude number (Fs) for laminar sheet flow. The first approach uses the Orr-Sommerfeld equation, while the second uses the cross-section-averaged equations of continuity and motion. Because both approaches are based on shallow-water theory, the values of Fs obtained from both approaches should be identical, yet in the literature they are not. This suggests that a defect exists in at least one of the two approaches. After examining the governing equations used in both approaches, one finds that the existing cross-section-averaged equation of motion is dependent on the frame of reference.
Evaluation of a preschool nutrition education program based on the theory of multiple intelligences.
Cason, K L
2001-01-01
This report describes the evaluation of a preschool nutrition education program based on the theory of multiple intelligences. Forty-six nutrition educators provided a series of 12 lessons to 6102 preschool-age children. The program was evaluated using a pretest/post-test design to assess differences in fruit and vegetable identification, healthy snack choices, willingness to taste foods, and eating behaviors. Subjects showed significant improvement in food identification and recognition, healthy snack identification, willingness to taste foods, and frequency of fruit, vegetable, meat, and dairy consumption. The evaluation indicates that the program was an effective approach for educating preschool children about nutrition. PMID:11953232
A description of the mechanical behavior of composite solid propellants based on molecular theory
NASA Technical Reports Server (NTRS)
Landel, R. F.
1976-01-01
Both the investigation and the representation of the stress-strain response (including rupture) of gum and filled elastomers can be based on a simple functional statement. Internally consistent experiments are used to sort out the effects of time, temperature, strain and crosslink density on gum rubbers. All effects are readily correlated and shown to be essentially independent of the elastomer when considered in terms of non-dimensionalized stress, strain and time. A semiquantitative molecular theory is developed to explain this result. The introduction of fillers modifies the response, but, guided by the framework thus provided, their effects can be readily accounted for.
NASA Astrophysics Data System (ADS)
Zhang, Bing; Lin, Zhen; Zhang, Xiao; Yu, Xiang; Wei, Jiali; Wang, Xiaoping
2014-05-01
Based on an innovative application of van der Pauw's theory, a system was developed for the absolute measurement of electrolytic conductivity in aqueous solutions. An electrolytic conductivity meter was designed that uses a four-electrode system with an axial-radial two-dimensional adjustment structure coupled to an ac voltage excitation source and signal collecting circuit. The measurement accuracy, resolution and repeatability of the measurement system were examined through a series of experiments. Moreover, the measurement system and a high-precision electrolytic conductivity meter were compared using some actual water samples.
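For reference, the core of van der Pauw's theory is a transcendental relation between two four-terminal resistance readings and the sheet resistance, which a simple bisection solver can handle; this generic sketch is not the paper's measurement system, and the mapping to electrolytic conductivity via the liquid depth is noted as an assumption:

```python
import math

def sheet_resistance(R_a, R_b, tol=1e-12):
    """Solve the van der Pauw relation
        exp(-pi * R_a / R_s) + exp(-pi * R_b / R_s) = 1
    for the sheet resistance R_s (ohms per square) by bisection.
    R_a, R_b are the two four-terminal resistance readings."""
    f = lambda Rs: (math.exp(-math.pi * R_a / Rs)
                    + math.exp(-math.pi * R_b / Rs) - 1.0)
    lo, hi = 1e-9, 1.0
    while f(hi) < 0.0:            # f increases monotonically with Rs
        hi *= 2.0
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Symmetric readings admit the closed form R_s = pi * R / ln 2 as a check.
Rs = sheet_resistance(100.0, 100.0)
```

For an electrolyte layer of uniform depth t, the conductivity would then follow as sigma = 1 / (Rs * t), which is the absolute-measurement character the abstract emphasizes: no cell constant calibration is needed.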
Electron-deuteron scattering based on the Chiral Effective Field Theory
NASA Astrophysics Data System (ADS)
Rozpȩdzik, Dagmara
2014-06-01
Based on the Chiral Effective Field Theory (ChEFT) dynamical picture, the two-pion exchange (TPE) contributions to the nuclear current operator, which appear at higher orders of the chiral expansion, were considered. Their role in electron-deuteron scattering reactions was studied, and the chiral predictions were compared with those obtained in the conventional framework. Results for the cross section and various polarization observables are presented. The bound and scattering states were calculated with five different chiral nucleon-nucleon (NN) potentials, which leads to the so-called theoretical uncertainty bands for the predicted results.
The Stability Analysis for an Extended Car Following Model Based on Control Theory
NASA Astrophysics Data System (ADS)
Ge, Hong-Xia; Meng, Xiang-Pei; Zhu, Ke-Qiang; Cheng, Rong-Jun
2014-08-01
A new method is proposed to study the stability of the car-following model considering traffic interruption probability. The stability condition for the extended car-following model is obtained by using the Lyapunov function and the condition for no traffic jam is also given based on the control theory. Numerical simulations are conducted to demonstrate and verify the analytical results. Moreover, numerical simulations show that the traffic interruption probability has an influence on driving behavior and confirm the effectiveness of the method on the stability of traffic flow.
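The extended model itself is not specified in the abstract, so the following is a minimal sketch using the classic optimal velocity (OV) car-following model on a ring road, whose linear stability condition a > 2 V'(h*) plays the role of the control-theoretic condition discussed here; all parameter values are illustrative.

```python
import math

def simulate_ov(n=20, road=40.0, a=3.0, t_end=200.0, dt=0.05):
    """Optimal-velocity model on a ring:  dv_i/dt = a * (V(h_i) - v_i),
    with V(h) = tanh(h - 2) + tanh(2) and headway h_i = x_{i+1} - x_i.
    One car is nudged off the uniform state; the returned headway spread
    shrinks back toward 0 when the flow is linearly stable (a > 2 V'(h*))."""
    h_star = road / n                       # uniform headway (here 2.0)
    V = lambda h: math.tanh(h - 2.0) + math.tanh(2.0)
    x = [i * h_star for i in range(n)]
    x[0] += 0.5                             # localized perturbation
    v = [V(h_star)] * n
    for _ in range(int(t_end / dt)):
        head = [(x[(i + 1) % n] - x[i]) % road for i in range(n)]
        acc = [a * (V(head[i]) - v[i]) for i in range(n)]
        x = [(x[i] + v[i] * dt) % road for i in range(n)]
        v = [v[i] + acc[i] * dt for i in range(n)]
    head = [(x[(i + 1) % n] - x[i]) % road for i in range(n)]
    return max(head) - min(head)
```

With V'(2) = 1, sensitivity a = 3 is above the threshold 2 and the perturbation decays; a = 1 is below it and a jam grows instead.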
NASA Astrophysics Data System (ADS)
Liu, Jun; Xu, Hui; Liu, Yaping; Xu, Yang
With increasing pressure for energy conservation and emissions reduction, a new energy revolution in China is imminent. The implementation of electric energy substitution and cleaner alternatives is an important way to resolve the contradiction among economic growth, energy saving and emission reduction. Based on decoupling theory, this article demonstrates that China is in the second stage, in which energy consumption and GDP grow together while energy consumption intensity declines. At the same time, the new energy revolution needs to be realized through increases in carbon productivity and in the proportion of new energy.
Jiang, Ping; Yang, Huajun; Mao, Shengqian
2015-10-01
A Cassegrain antenna system and an optical fiber coupling system consisting of a plano-concave lens and a plano-convex lens are designed based on the vector theory of reflection and refraction, so as to improve the transmission performance of the optical antenna and fiber coupling system. Three-dimensional ray-tracing simulations are performed, and the results of the optical aberration calculations and the experimental tests show that the aberrations caused by on-axis defocusing, off-axis defocusing and deflection of the receiving antenna can be well corrected by the optical fiber coupling system. PMID:26480125
Superior coexistence: systematically regulating land subsidence based on set pair theory
NASA Astrophysics Data System (ADS)
Chen, Y.; Gong, S.-L.
2015-11-01
Anthropogenic land subsidence is an environmental side effect of exploring and using natural resources in the course of economic development. The key to systematically controlling land subsidence is cooperation and superior coexistence among economic development, the exploration and use of natural resources, and geological environmental safety. Using the theory and method of set pair analysis (SPA), this article anatomises the factors, effects, and transformation of land subsidence. Based on the principle of superior coexistence, it then proposes a technical approach to systematically controlling land subsidence, in order to improve the prevention and control of geological hazards.
Research on the city's water affairs dispatchment system based on rough sets theory
NASA Astrophysics Data System (ADS)
Li, Xuwu; Hua, Jixue; Li, Chenghai; Li, Yanlei
2007-11-01
According to the main characteristics of the city's water affairs dispatchment, a structure for water affairs dispatchment based on rough set theory was proposed. After all factors were considered comprehensively, a knowledge expression system was set up, and the water affairs dispatchment control rules were reduced and acquired. To some extent, this is a new method for processing uncertain information in water affairs dispatchment. An example demonstrates that this method reduces the dispatchment control rules and that the acquired rules are objective, so it can satisfactorily solve the control problem of the city's water affairs dispatchment.
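The core rough-set operation alluded to here, reducing a decision table to minimal attribute subsets that still determine the decision, can be sketched as follows. The attribute names and the tiny decision table are made up for illustration; the paper's actual dispatchment attributes are not given in the abstract.

```python
from itertools import combinations

def partition(rows, attrs):
    """Indiscernibility classes of `rows` (dicts) under an attribute subset."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(i)
    return list(classes.values())

def consistent(rows, attrs, decisions):
    """True if every indiscernibility class carries a single decision value."""
    return all(len({decisions[i] for i in cls}) == 1
               for cls in partition(rows, attrs))

def reducts(rows, attrs, decisions):
    """All minimal attribute subsets that preserve decision consistency."""
    found = []
    for r in range(1, len(attrs) + 1):
        for subset in combinations(attrs, r):
            if consistent(rows, subset, decisions) and \
               not any(set(f) <= set(subset) for f in found):
                found.append(subset)
    return found
```

On a toy table where attribute "a" alone determines the decision, `reducts` returns just `("a",)`, i.e. attribute "b" is redundant and the rule set shrinks accordingly.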
Nicol, Ginger E; Morrato, Elaine H; Johnson, Mark C; Campagna, Elizabeth; Yingling, Michael D; Pham, Victor; Newcomer, John W
2011-01-01
There is public health interest in the identification and treatment of modifiable cardiometabolic risk factors among patients treated with antipsychotic medications. However, best-practice screening recommendations endorsed by multiple medical organizations have not translated into real-world clinical practice. Quality improvement strategies may help to address the gap between policy and implementation. This column describes the successful implementation of a best-practice glucose screening program in a large network of community mental health centers that was based on Six Sigma and diffusion of innovation theory. PMID:21209293
The Advancement of Family Therapy Theory Based on the Science of Self-Organizing Complex Systems.
NASA Astrophysics Data System (ADS)
Ramsey-Kemper, Valerie Ann
1995-01-01
Problem. The purpose of this study was to review the literature which presents the latest advancements in the field of family therapy theory. Because such advancement has relied on scientific developments in the study of autopoietic self-organizing complex systems, the review began with a historical overview of the development of these natural scientific concepts. The study then examined how the latest scientific concepts have been integrated with family therapy practice. The document is built on the theory that individuals are living, complex, self-organizing, autopoietic systems. When individual systems interact with other individual systems (such as in family interaction, or in interaction between therapist and client), a third system emerges, which is the relationship. It is through interaction in the relationship that transformation of an individual system can occur. Method. The historical antecedents of the field of family therapy were outlined. It was demonstrated, via literature review, that the field of family therapy has traditionally paralleled developments in the hard sciences. Further, it was demonstrated via literature review that the newest understandings of the development of individuals, family systems, and therapeutic systems also parallel recent natural science developments, namely those based on the science of self-organizing complex systems. Outcome. The results of the study are twofold. First, the study articulates an expanded theory of the therapist, individual, and family as autopoietic self-organizing complex systems. Second, the study provides an expanded hypothesis which concerns recommendations for future research which will further advance the latest theories of family therapy. More precisely, the expanded hypothesis suggests that qualitative research, rather than quantitative research, is the method of choice for studying the effectiveness of phenomenological therapy.
A queuing model for designing multi-modality buried target detection systems: preliminary results
NASA Astrophysics Data System (ADS)
Malof, Jordan M.; Morton, Kenneth D.; Collins, Leslie M.; Torrione, Peter A.
2015-05-01
Many remote sensing modalities have been developed for buried target detection, each one offering its own relative advantages over the others. As a result there has been interest in combining several modalities into a single detection platform that benefits from the advantages of each constituent sensor, without suffering from their weaknesses. Traditionally this involves collecting data continuously on all sensors and then performing data, feature, or decision level fusion. While this is effective for lowering false alarm rates, this strategy neglects the potential benefits of a more general system-level fusion architecture. Such an architecture can involve dynamically changing which modalities are in operation. For example, a large standoff modality such as a forward-looking infrared (FLIR) camera can be employed until an alarm is encountered, at which point a high performance (but short standoff) sensor, such as ground penetrating radar (GPR), is employed. Because the system is dynamically changing its rate of advance and sensors, it becomes difficult to evaluate the expected false alarm rate and advance rate. In this work, a probabilistic model is proposed that can be used to estimate these quantities based on a provided operating policy. In this model the system consists of a set of states (e.g., sensors employed) and conditions encountered (e.g., alarm locations). The predictive accuracy of the model is evaluated using a collection of collocated FLIR and GPR data and the results indicate that the model is effective at predicting the desired system metrics.
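A Monte Carlo version of such a system-level model can be sketched in a few lines. The rates, times, and probabilities below are hypothetical placeholders, not values from the paper: the platform advances with FLIR until a Poisson-distributed alarm, then halts for a fixed-duration GPR interrogation, which declares a fraction of false alarms.

```python
import random

def simulate_system(track_m=10_000.0, flir_speed=2.0, alarm_rate=0.05,
                    gpr_time=30.0, gpr_fa_prob=0.2, seed=1):
    """Monte Carlo of a two-mode platform: advance with FLIR until an alarm,
    then stop and interrogate with GPR.  FLIR alarms are Poisson along the
    track (alarm_rate per metre).  Returns (effective advance rate in m/s,
    declared false alarms per metre)."""
    rng = random.Random(seed)
    pos, time, declared = 0.0, 0.0, 0
    while pos < track_m:
        gap = rng.expovariate(alarm_rate)      # metres to the next FLIR alarm
        step = min(gap, track_m - pos)
        pos += step
        time += step / flir_speed
        if gap <= step:                        # alarm inside the track: run GPR
            time += gpr_time
            if rng.random() < gpr_fa_prob:
                declared += 1
    return pos / time, declared / track_m
```

With these numbers the analytic expectations are an advance rate of 1/(1/v + lambda*t_gpr) = 0.5 m/s and 0.05 * 0.2 = 0.01 declared false alarms per metre, which the simulation approaches; a policy change (e.g. a faster but noisier sensor) is evaluated by re-running with different parameters.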
A Monte Carlo exploration of threefold base geometries for 4d F-theory vacua
NASA Astrophysics Data System (ADS)
Taylor, Washington; Wang, Yi-Nan
2016-01-01
We use Monte Carlo methods to explore the set of toric threefold bases that support elliptic Calabi-Yau fourfolds for F-theory compactifications to four dimensions, and study the distribution of geometrically non-Higgsable gauge groups, matter, and quiver structure. We estimate the number of distinct threefold bases in the connected set studied to be ~10^48. The distribution of bases peaks around h^{1,1} ~ 82. All bases encountered after "thermalization" have some geometric non-Higgsable structure. We find that the number of non-Higgsable gauge group factors grows roughly linearly in h^{1,1} of the threefold base. Typical bases have ~6 isolated gauge factors as well as several larger connected clusters of gauge factors with jointly charged matter. Approximately 76% of the bases sampled contain connected two-factor gauge group products of the form SU(3) × SU(2), which may act as the non-Abelian part of the standard model gauge group. SU(3) × SU(2) is the third most common connected two-factor product group, following SU(2) × SU(2) and G_2 × SU(2), which arise more frequently.
Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain
Brette, Romain
2015-01-01
Does the brain use a firing rate code or a spike timing code? Considering this controversial question from an epistemological perspective, I argue that progress has been hampered by its problematic phrasing. It takes the perspective of an external observer looking at whether those two observables vary with stimuli, and thereby misses the relevant question: which one has a causal role in neural activity? When rephrased in a more meaningful way, the rate-based view appears as an ad hoc methodological postulate, one that is practical but with virtually no empirical or theoretical support. PMID:26617496
A theory-based evaluation of a community-based funding scheme in a disadvantaged suburban city area.
Hickey, Gráinne; McGilloway, Sinead; O'Brien, Morgan; Leckey, Yvonne; Devlin, Maurice
2015-10-01
Community-driven development (CDD) initiatives frequently involve funding schemes which are aimed at channelling financial investment into local need and fostering community participation and engagement. This exploratory study examined, through a program theory approach, the design and implementation of a small-scale, community-based fund in Ireland. Observations, documentary analysis, interviews and group discussions with 19 participants were utilized to develop a detailed understanding of the program mechanisms, activities and processes, as well as the experiences of key stakeholders engaged with the funding scheme and its implementation. The findings showed that there were positive perceptions of the scheme and its function within the community. Overall, the availability of funding was perceived by key stakeholders as being beneficial. However, there were concerns over the accessibility of the scheme for more marginalized members of the community, as well as dissatisfaction with the openness and transparency surrounding funding eligibility. Lessons for the implementation of small-scale CDD funds are elaborated and the utility of program theory approaches for evaluators and planners working with programs that fund community-based initiatives is outlined. PMID:25933408
Nikolaev, Evgeni V; Sontag, Eduardo D
2016-04-01
Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of "on-off" switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular ("intrinsic") or environmental ("extrinsic") noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a "majority-vote" correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such design that has mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of "monotone" dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable ("chaotic") behaviors and even sustained oscillations in the limit of convergence. These results are valid regardless of the values of the parameters, and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation analysis, performed for a
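The paper's quorum-sensing circuit is not specified in the abstract; as a minimal, generic illustration of the bistable "on-off" memory it builds on, here is the classical two-gene toggle switch (mutual repression), with purely illustrative parameters. Different initial conditions settle into different stable states, which is exactly the memory that noise can corrupt and a majority-vote circuit must protect.

```python
def toggle_steady_state(x0, y0, a=10.0, n=2, t_end=50.0, dt=0.01):
    """Euler-integrate the mutual-repression toggle switch
       dx/dt = a / (1 + y**n) - x,   dy/dt = a / (1 + x**n) - y
    and return the (approximate) steady state reached from (x0, y0)."""
    x, y = x0, y0
    for _ in range(int(t_end / dt)):
        dx = a / (1.0 + y ** n) - x
        dy = a / (1.0 + x ** n) - y
        x, y = x + dx * dt, y + dy * dt
    return x, y
```

Starting x-high gives an x-dominant steady state (~(9.9, 0.1)); starting y-high gives the mirror image: two coexisting attractors, i.e. a one-bit memory.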
A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study
Voth, Elizabeth C; Oelke, Nelly D
2016-01-01
Background Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η2=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. Conclusions The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps
Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign
Taguri, Masataka; Ishikawa, Yoshiki
2016-01-01
Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated population attributable fractions for 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief, “I could quit smoking if my husband or significant other recommended it” suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
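The paper estimates the population attributable fraction (PAF) from a marginal structural model; as a sketch of the quantity itself (not the authors' estimator), here are the simple closed-form (Levin) version and the counterfactual definition, with made-up numbers.

```python
def paf_levin(exposure_prev, risk_ratio):
    """Levin's closed-form population attributable fraction:
    share of the outcome removed if the targeted belief were absent."""
    excess = exposure_prev * (risk_ratio - 1.0)
    return excess / (1.0 + excess)

def paf_counterfactual(p_observed, p_if_belief_removed):
    """PAF as the relative drop in outcome prevalence under the
    counterfactual in which the targeted belief is removed."""
    return (p_observed - p_if_belief_removed) / p_observed
```

For example, if a belief held by 40% of the population carries a risk ratio of 1.5 for low quit intention, Levin's formula gives PAF = 0.4*0.5/(1 + 0.4*0.5) = 1/6; a PAF of 0.12, as for the highlighted belief, means intention rates could improve by up to 12% if messaging shifted that belief.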
Theory of chemical kinetics and charge transfer based on nonequilibrium thermodynamics.
Bazant, Martin Z
2013-05-21
Advances in the fields of catalysis and electrochemical energy conversion often involve nanoparticles, which can have kinetics surprisingly different from the bulk material. Classical theories of chemical kinetics assume independent reactions in dilute solutions, whose rates are determined by mean concentrations. In condensed matter, strong interactions alter chemical activities and create variations that can dramatically affect the reaction rate. The extreme case is that of a reaction coupled to a phase transformation, whose kinetics must depend not only on the order parameter but also on its gradients at phase boundaries. Reaction-driven phase transformations are common in electrochemistry, when charge transfer is accompanied by ion intercalation or deposition in a solid phase. Examples abound in Li-ion, metal-air, and lead-acid batteries, as well as metal electrodeposition-dissolution. Despite complex thermodynamics, however, the standard kinetic model is the Butler-Volmer equation, based on a dilute solution approximation. The Marcus theory of charge transfer likewise considers isolated reactants and neglects elastic stress, configurational entropy, and other nonidealities in condensed phases. The limitations of existing theories recently became apparent for the Li-ion battery material LixFePO4 (LFP). It has a strong tendency to separate into Li-rich and Li-poor solid phases, which scientists believe limits its performance. Chemists first modeled phase separation in LFP as an isotropic "shrinking core" within each particle, but experiments later revealed striped phase boundaries on the active crystal facet. This raised the question: What is the reaction rate at a surface undergoing a phase transformation? Meanwhile, dramatic rate enhancement was attained with LFP nanoparticles, and classical battery models could not predict the roles of phase separation and surface modification. In this Account, I present a general theory of chemical kinetics, developed over
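The "standard kinetic model" criticized here, the Butler-Volmer equation, is easy to state concretely; the sketch below evaluates it with illustrative parameter values (exchange current density i0, transfer coefficients alpha) to show the baseline the Account generalizes.

```python
import math

F = 96485.33212    # Faraday constant, C/mol
R = 8.314462618    # gas constant, J/(mol K)

def butler_volmer(eta, i0=1.0, alpha_a=0.5, alpha_c=0.5, T=298.15):
    """Butler-Volmer current density as a function of overpotential eta (V):
       i = i0 * (exp(alpha_a * f * eta) - exp(-alpha_c * f * eta)),
    with f = F / (R * T).  Dilute-solution approximation throughout."""
    f = F / (R * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))
```

At zero overpotential the anodic and cathodic terms cancel, and with symmetric transfer coefficients the curve is antisymmetric in eta; the point of the Account is that none of these dilute-solution assumptions survive at a phase boundary in a material like LixFePO4.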
Hanbury, Andria; Thompson, Carl; Mannion, Russell
2011-07-01
Tailored implementation strategies targeting health professionals' adoption of evidence-based recommendations are currently being developed. Research has focused on how to select an appropriate theoretical base, how to use that theoretical base to explore the local context, and how to translate theoretical constructs associated with the key factors found to influence innovation adoption into feasible and tailored implementation strategies. The reasons why an intervention is thought not to have worked are often cited as being: inappropriate choice of theoretical base; unsystematic development of the implementation strategies; and a poor evidence base to guide the process. One area of implementation research that is commonly overlooked is how to synthesize the data collected in a local context in order to identify what factors to target with the implementation strategies. This is suggested to be a critical process in the development of a theory-based intervention. The potential of multilevel modelling techniques to synthesize data collected at different hierarchical levels, for example, individual attitudes and team level variables, is discussed. Future research is needed to explore further the potential of multilevel modelling for synthesizing contextual data in implementation studies, as well as techniques for synthesizing qualitative and quantitative data. PMID:21543383
Prior, Maria; Burr, Jennifer M; Ramsay, Craig R; Jenkinson, David; Campbell, Susan
2012-01-01
Objective To identify factors associated with intention to attend a hypothetical eye health test and provide an evidence base for developing an intervention to maximise attendance, for use in studies evaluating glaucoma screening programmes. Design Theory-based cross-sectional survey, based on an extended Theory of Planned Behaviour (TPB) and the Common Sense Self-Regulation Model, conducted in June 2010. Participants General population including oversampling from low socioeconomic areas. Setting Aberdeenshire and the London Boroughs of Lewisham and Southwark, UK. Results From 867 questionnaires posted, 327 completed questionnaires were returned (38%). In hierarchical regression analysis, the three theoretical predictors in the TPB (Attitude, Subjective norm and Perceived Behavioural Control) accounted for two-thirds of the variance in intention scores (adjusted R2=0.65). All three predictors contributed significantly to prediction. Adding ‘Anticipated regret’ as a factor in the TPB model resulted in a significant increase in prediction (adjusted R2=0.74). In the Common Sense Self-Regulation Model, only illness representations concerning the personal consequences of glaucoma (How much do you think glaucoma would affect your life?) and illness concern (How concerned are you about getting glaucoma?) significantly predicted intention. The final model explained 75% of the variance in intention scores, with ethnicity significantly contributing to prediction. Conclusions In this population-based sample (including over-representation of lower socioeconomic groupings), the main predictors of intention to attend a hypothetical eye health test were Attitude, Perceived control over attendance, Anticipated regret about not attending and black ethnicity. This evidence informs the design of a behavioural intervention with components targeting low intentions and predicted to influence health-related behaviours. PMID:22382121
Dissemination of a theory-based online bone health program: Two intervention approaches.
Nahm, Eun-Shim; Resnick, Barbara; Bellantoni, Michele; Zhu, Shijun; Brown, Clayton; Brennan, Patricia F; Charters, Kathleen; Brown, Jeanine; Rietschel, Matthew; Pinna, Joanne; An, Minjeong; Park, Bu Kyung; Plummer, Lisa
2015-06-01
With the increasing nationwide emphasis on eHealth, there has been a rapid growth in the use of the Internet to deliver health promotion interventions. Although there has been a great deal of research in this field, little information is available regarding the methodologies to develop and implement effective online interventions. This article describes two social cognitive theory-based online health behavior interventions used in a large-scale dissemination study (N = 866), their implementation processes, and the lessons learned during implementation. The two interventions were a short-term (8-week) intensive online Bone Power program and a longer term (12-month) Bone Power Plus program, consisting of the Bone Power program followed by a 10-month online booster intervention (biweekly eHealth newsletters). The study used a small-group approach (32 intervention groups), and to manage those groups effectively, an eLearning management program was used as an upper layer of the Web intervention. Both interventions were implemented successfully with high retention rates (80.7% at 18 months). The theory-based approaches and the online infrastructure used in this study show promising potential as an effective platform for online behavior studies. Further replication studies with different samples and settings are needed to validate the utility of this intervention structure. PMID:26021668
Coding theory based models for protein translation initiation in prokaryotic organisms.
May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.
2003-03-01
Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli-based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli, including Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the organism most closely related taxonomically to E. coli, resembled that of the non-translated sequence group.
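The minimum-Hamming-distance decoding idea can be sketched in a few lines. The single-codeword codebook "AGGAGG" below is a drastic toy simplification of the paper's 16S rRNA-derived codebook, used only to show how a sliding-window distance profile dips at a Shine-Dalgarno-like site.

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def min_distance(window, codebook):
    """Distance from a window to the nearest valid codeword."""
    return min(hamming(window, c) for c in codebook)

def scan(mrna, codebook, k):
    """Minimum Hamming distance of every k-base window along the sequence;
    dips toward 0 flag regions that 'decode' well against the codebook."""
    return [min_distance(mrna[i:i + k], codebook)
            for i in range(len(mrna) - k + 1)]
```

For example, `scan("UUAGGAGGUU", ["AGGAGG"], 6)` gives a profile whose minimum (0) sits at the window containing the embedded consensus, mirroring how translated leaders and non-translated sequences are distinguished by their distance statistics.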
A Development of Very Short-Term Load Forecasting Based on Chaos Theory
NASA Astrophysics Data System (ADS)
Kawauchi, Seiji; Sugihara, Hiroaki; Sasaki, Hiroshi
It is indispensable to accurately perform short-term load forecasting 10 minutes ahead in order to avoid undesirable disturbances in power system operations. The authors have previously developed such a forecasting method based on conventional chaos theory. However, that approach cannot give accurate forecasts when the loads exceed the historical maximum or fall below the historical minimum. Electric furnace loads with steep fluctuations have been another factor degrading forecast accuracy. This paper presents an improved forecasting method based on chaos theory. In particular, the potential of the Local Fuzzy Reconstruction Method, a variant of the localized reconstruction methods, is fully exploited to achieve forecasts that are as accurate as possible. To resolve the forecast deterioration caused by suddenly changing loads, such as electric furnaces, these loads are separated from the rest and smoothing operations are carried out afterwards; the separated loads are forecasted independently of the remaining components. Several error-correction methods are incorporated to enhance the proposed forecasting method. Furthermore, a consistent measure for obtaining the optimal combination of parameters used in the forecasting method is given. The effectiveness of the proposed methods is verified using real load data for one year.
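The Local Fuzzy Reconstruction Method itself is not detailed in the abstract; the sketch below shows the plain local (nearest-neighbor) forecasting in Takens delay coordinates that such localized reconstruction methods refine: embed the load history, find past states resembling the present one, and average what followed them.

```python
def embed(series, dim=3, tau=1):
    """Takens delay-coordinate vectors of a scalar time series."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]

def local_forecast(series, dim=3, tau=1, k=3):
    """Predict the next value: find the k past delay vectors nearest to the
    current one and average the value that followed each of them."""
    vecs = embed(series, dim, tau)
    query = vecs[-1]
    # candidates must have a known successor, so exclude the final vector
    cands = list(enumerate(vecs[:-1]))
    dist = lambda v: sum((a - b) ** 2 for a, b in zip(v, query))
    cands.sort(key=lambda iv: dist(iv[1]))
    succ = (dim - 1) * tau + 1          # offset of the value after vector i
    return sum(series[i + succ] for i, _ in cands[:k]) / k
```

On a purely periodic toy series the method recovers the next value exactly; real 10-minute load data would require choosing dim, tau, and k, and (as the paper stresses) separating furnace-like components first.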
Improving breast cancer control among Latinas: evaluation of a theory-based educational program.
Mishra, S I; Chavez, L R; Magaña, J R; Nava, P; Burciaga Valdez, R; Hubbell, F A
1998-10-01
The study evaluated a theory-based breast cancer control program specially developed for less acculturated Latinas. The authors used a quasi-experimental design with random assignment of Latinas into experimental (n = 51) or control (n = 37) groups that completed one pretest and two posttest surveys. The experimental group received the educational program, which was based on Bandura's self-efficacy theory and Freire's empowerment pedagogy. Outcome measures included knowledge, perceived self-efficacy, attitudes, breast self-examination (BSE) skills, and mammogram use. At posttest 1, controlling for pretest scores, the experimental group was significantly more likely than the control group to have more medically recognized knowledge (sum of squares [SS] = 17.0, F = 6.58, p < .01), have less medically recognized knowledge (SS = 128.8, F = 39.24, p < .001), a greater sense of perceived self-efficacy (SS = 316.5, F = 9.63, p < .01), and greater adeptness in the conduct of BSE (SS = 234.8, F = 153.33, p < .001). Cancer control programs designed for less acculturated women should use informal and interactive educational methods that incorporate skill-enhancing and empowering techniques. PMID:9768384
Chen, Yu; Song, Guobao; Yang, Fenglin; Zhang, Shushen; Zhang, Yun; Liu, Zhenyu
2012-01-01
According to risk systems theory and the characteristics of the chemical industry, an index system was established for risk assessment of enterprises in chemical industrial parks (CIPs) based on the inherent risk of the source, effectiveness of the prevention and control mechanism, and vulnerability of the receptor. A comprehensive risk assessment method based on catastrophe theory was then proposed and used to analyze the risk levels of ten major chemical enterprises in the Songmu Island CIP, China. According to the principle of equal distribution function, the chemical enterprise risk level was divided into the following five levels: 1.0 (very safe), 0.8 (safe), 0.6 (generally recognized as safe, GRAS), 0.4 (unsafe), 0.2 (very unsafe). The results revealed five enterprises (50%) with an unsafe risk level, and another five enterprises (50%) at the generally recognized as safe risk level. This method solves the multi-objective evaluation and decision-making problem. Additionally, this method involves simple calculations and provides an effective technique for risk assessment and hierarchical risk management of enterprises in CIPs. PMID:23208298
Adapting evidence-based interventions using a common theory, practices, and principles.
Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D
2014-01-01
Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed. PMID:24079747
An information theory criteria based blind method for enumerating active users in DS-CDMA system
NASA Astrophysics Data System (ADS)
Samsami Khodadad, Farid; Abed Hodtani, Ghosheh
2014-11-01
In this paper, a new blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. Two main categories of information criteria are widely used for active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Owing to this difference, MDL is a consistent enumerator with better performance at higher signal-to-noise ratios (SNRs), whereas AIC is preferred at lower SNRs. We therefore propose an SNR-compliant method, based on subspace analysis and a genetic-algorithm-based training stage, that combines the advantages of both. Moreover, our method uses only a single antenna, unlike previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, demonstrating its efficiency.
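The AIC/MDL enumeration described above is commonly computed from the eigenvalues of the sample covariance matrix of the received signal (the classical Wax-Kailath form). The sketch below assumes that form with illustrative inputs; it is not the authors' single-antenna subspace/genetic-algorithm method.

```python
import math

def enumerate_sources(eigvals, n_snapshots):
    """Estimate the number of active sources from covariance eigenvalues
    using the AIC and MDL criteria (Wax-Kailath form).
    Returns (k_aic, k_mdl)."""
    lam = sorted(eigvals, reverse=True)
    p, N = len(lam), n_snapshots
    aic, mdl = [], []
    for k in range(p):
        tail = lam[k:]                      # the p-k smallest (noise) eigenvalues
        m = p - k
        # log of (arithmetic mean / geometric mean) of the noise eigenvalues
        log_ratio = math.log(sum(tail) / m) - sum(math.log(v) for v in tail) / m
        ll = N * m * log_ratio              # likelihood term, shared by both criteria
        aic.append(2.0 * ll + 2.0 * k * (2 * p - k))          # AIC penalty
        mdl.append(ll + 0.5 * k * (2 * p - k) * math.log(N))  # MDL penalty
    return aic.index(min(aic)), mdl.index(min(mdl))

# Two strong "signal" eigenvalues above a flat noise floor:
# both criteria report two active users.
estimates = enumerate_sources([10.0, 8.0, 1.0, 1.0, 1.0, 1.0], n_snapshots=1000)
```

The differing penalty terms are exactly the distinction the abstract draws: MDL's penalty grows with log N, making it consistent, while AIC's fixed penalty favors detection at low SNR.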
Credibility theory based dynamic control bound optimization for reservoir flood limited water level
NASA Astrophysics Data System (ADS)
Jiang, Zhiqiang; Sun, Ping; Ji, Changming; Zhou, Jianzhong
2015-10-01
Dynamic control operation of the reservoir flood limited water level (FLWL) can resolve the contradiction between reservoir flood control and beneficial operation, and it is an important measure for ensuring flood-control security while utilizing flood water. The dynamic control bound of the FLWL is a fundamental element for implementing reservoir dynamic control operation. In order to optimize the dynamic control bound of the FLWL while accounting for flood forecasting error, this paper treats the forecasting error as a fuzzy variable and describes it using credibility theory, which has emerged in recent years. By combining this with a flood forecasting error quantification model, a credibility-based fuzzy chance-constrained model for optimizing the dynamic control bound is proposed, and fuzzy simulation technology is used to solve the model. The FENGTAN reservoir in China is selected as a case study, and the results show that, compared with the original operation water level, the initial operation water level (IOWL) of the FENGTAN reservoir can be raised by 4 m, 2 m and 5.5 m, respectively, in the three division stages of the flood season, without increasing flood control risk. In addition, the rationality and feasibility of the proposed forecasting error quantification model and the credibility-based dynamic control bound optimization model are verified by the calculation results of extreme risk theory.
Massive Yang-Mills theory based on the nonlinearly realized gauge group
Bettinelli, D.; Ferrari, R.; Quadri, A.
2008-02-15
We propose a subtraction scheme for a massive Yang-Mills theory realized via a nonlinear representation of the gauge group [here SU(2)]. It is based on the subtraction of the poles in D - 4 of the amplitudes, in dimensional regularization, after a suitable normalization has been performed. Perturbation theory is in the number of loops, and the procedure is stable under iterative subtraction of the poles. The unphysical Goldstone bosons, the Faddeev-Popov ghosts, and the unphysical mode of the gauge field are expected to cancel out in the unitarity equation. The spontaneous symmetry breaking parameter is not a physical variable. We use the tools already tested in the nonlinear sigma model: hierarchy in the number of Goldstone boson legs and weak-power-counting property (finite number of independent divergent amplitudes at each order). It is intriguing that the model is naturally based on the symmetry SU(2)_L (local) × SU(2)_R (global). By construction the physical amplitudes depend on the mass and on the self-coupling constant of the gauge particle and moreover on the scale parameter of the radiative corrections. The Feynman rules are in the Landau gauge.
Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice
2008-01-01
Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future. PMID:18935883
A Theory-Based Approach to Teaching Young Children about Health: A Recipe for Understanding
ERIC Educational Resources Information Center
Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley
2011-01-01
The theory-theory account of conceptual development posits that children's concepts are integrated into theories. Concept-learning studies have documented the central role that theories play in children's learning of experimenter-defined categories but have yet to extensively examine complex, real-world concepts, such as health. The present study…
Region-based perceptual grouping: a cooperative approach based on Dempster-Shafer theory
NASA Astrophysics Data System (ADS)
Zlatoff, Nicolas; Tellez, Bruno; Baskurt, Atilla
2006-02-01
As the segmentation step does not allow semantic objects to be recovered, perceptual grouping is often used to overcome its shortcomings. Perceptual grouping refers to the ability of the human visual system to impose structure and regularity on signal-based data. Gestalt psychologists have exhibited properties which seem to be at work in perceptual grouping, and some implementations have been proposed in computer vision. However, few of these works model the use of several properties together to trigger a grouping, even though doing so can increase robustness. We propose a cooperative approach to perceptual grouping that combines the influence of several Gestalt properties for each hypothesis. We make use of the Dempster-Shafer formalism, as it can prevent conflicting hypotheses from jamming the grouping process.
Re-Examining of Moffitt’s Theory of Delinquency through Agent Based Modeling
Leaw, Jia Ning; Ang, Rebecca P.; Huan, Vivien S.; Chan, Wei Teng; Cheong, Siew Ann
2015-01-01
Moffitt’s theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence- limited group and the life-course-persistent group, predetermined at a young age, and social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, to find indeed the two groups emerging in our simulations. Moreover, through an intervention simulation where we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life course outcome. PMID:26062022
NASA Technical Reports Server (NTRS)
Nemeth, Michael P.
2014-01-01
Nonlinear and bifurcation buckling equations for elastic, stiffened, geometrically perfect, right-circular cylindrical, anisotropic shells subjected to combined loads are presented that are based on Sanders' shell theory. Based on these equations, a three-parameter approximate Rayleigh-Ritz solution and a classical solution to the buckling problem are presented for cylinders with simply supported edges. Extensive comparisons of results obtained from these solutions with published results are also presented for a wide range of cylinder constructions. These comparisons include laminated-composite cylinders with a wide variety of shell-wall orthotropies and anisotropies. Numerous results are also given that show the discrepancies between the results obtained by using Donnell's equations and variants of Sanders' equations. For some cases, nondimensional parameters are identified and "master" curves are presented that facilitate the concise representation of results.
Greeley, J.; Norskov, J.; Center for Nanoscale Materials; Technical Univ. of Denmark
2009-03-26
A density functional theory (DFT)-based combinatorial search for improved oxygen reduction reaction (ORR) catalysts is presented. A descriptor-based approach to estimate the ORR activity of binary surface alloys, wherein alloying occurs only in the surface layer, is described, and rigorous, potential-dependent computational tests of the stability of these alloys in aqueous, acidic environments are presented. These activity and stability criteria are applied to a database of DFT calculations on nearly 750 binary transition metal surface alloys; of these, many are predicted to be active for the ORR but, with few exceptions, they are found to be thermodynamically unstable in the acidic environments typical of low-temperature fuel cells. The results suggest that, absent other thermodynamic or kinetic mechanisms to stabilize the alloys, surface alloys are unlikely to serve as useful ORR catalysts over extended periods of operation.
NASA Technical Reports Server (NTRS)
Rolfes, R.; Noor, A. K.; Sparr, H.
1998-01-01
A postprocessing procedure is presented for the evaluation of the transverse thermal stresses in laminated plates. The analytical formulation is based on the first-order shear deformation theory and the plate is discretized by using a single-field displacement finite element model. The procedure is based on neglecting the derivatives of the in-plane forces and the twisting moments, as well as the mixed derivatives of the bending moments, with respect to the in-plane coordinates. The calculated transverse shear stiffnesses reflect the actual stacking sequence of the composite plate. The distributions of the transverse stresses through-the-thickness are evaluated by using only the transverse shear forces and the thermal effects resulting from the finite element analysis. The procedure is implemented into a postprocessing routine which can be easily incorporated into existing commercial finite element codes. Numerical results are presented for four- and ten-layer cross-ply laminates subjected to mechanical and thermal loads.
Predictive models based on sensitivity theory and their application to practical shielding problems
Bhuiyan, S.I.; Roussin, R.W.; Lucius, J.L.; Bartine, D.E.
1983-01-01
Two new calculational models based on the use of cross-section sensitivity coefficients have been devised for calculating radiation transport in relatively simple shields. The two models, one an exponential model and the other a power model, have been applied, together with the traditional linear model, to 1- and 2-m-thick concrete-slab problems in which the water content, reinforcing-steel content, or composition of the concrete was varied. Comparing the results obtained with the three models with those obtained from exact one-dimensional discrete-ordinates transport calculations indicates that the exponential model, named the BEST model (for basic exponential shielding trend), is a particularly promising predictive tool for shielding problems dominated by exponential attenuation. When applied to a deep-penetration sodium problem, the BEST model also yields better results than do calculations based on second-order sensitivity theory.
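As a rough illustration of how such sensitivity-based predictive models extrapolate a response, the sketch below implements generic linear, exponential, and power forms built from a relative sensitivity coefficient s = (p/R) dR/dp and a fractional parameter change x = Δp/p. The exact functional forms used in the paper (e.g., the BEST model) may differ; all three families agree to first order in x, and diverge for the large perturbations where the exponential form tracks attenuation-dominated problems.

```python
import math

def predict(r0, s, x, model):
    """Extrapolate a baseline shielding response r0 under a fractional
    parameter change x = dp/p, using a relative sensitivity coefficient
    s = (p/R) dR/dp.  The three forms are generic sketches of the
    linear / exponential / power model families named in the abstract."""
    if model == "linear":
        return r0 * (1.0 + s * x)
    if model == "exponential":
        return r0 * math.exp(s * x)
    if model == "power":
        return r0 * (1.0 + x) ** s
    raise ValueError("unknown model: %s" % model)

# For a small 1% parameter change the three families nearly coincide.
small = [predict(1.0, 2.0, 0.01, m) for m in ("linear", "exponential", "power")]
```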
Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics
NASA Technical Reports Server (NTRS)
Xu, Kun
1998-01-01
A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and the accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on gas-kinetic theory is presented. The flux construction strategy may shed some light on possible modifications of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for a non-strictly hyperbolic system.
Control Theory based Shape Design for the Incompressible Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Cowles, G.; Martinelli, L.
2003-12-01
A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible. An additional adjoint system must be solved which requires the cost of a single steady state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite thickness wings as well as zero thickness wings which can be considered a model of yacht sails.
NASA Astrophysics Data System (ADS)
Zha, Guofeng; Wang, Hongqiang; Yang, Zhaocheng; Cheng, Yongqiang; Qin, Yuliang
2016-03-01
As a complementary imaging technology, coincidence imaging radar (CIR) achieves super-resolution in real-aperture staring radar imagery by employing temporal-spatial independent array detecting (TSIAD) signals. The characteristics of the TSIAD signals depend on the array geometry, and the imaging performance is influenced by the imaging position relative to the antenna array. In this paper, the effect of array geometry on the CIR system is investigated in detail based on the judgment criteria of effective rank theory. In the course of analyzing these influences, useful system design guidance on the array geometry is provided for the CIR system. Following this guidance, target images are reconstructed using the Tikhonov regularization algorithm. Simulation results are presented to validate the analysis and the efficiency of the design guidance.
Optimization of a photovoltaic pumping system based on the optimal control theory
Betka, A.; Attali, A.
2010-07-15
This paper shows how optimal operation of a photovoltaic pumping system, based on an induction motor driving a centrifugal pump, can be realized. The optimization problem consists in maximizing the daily pumped water quantity by optimizing the motor efficiency at every operating point. The proposed structure allows, at the same time, the minimization of machine losses, field-oriented control, and maximum power tracking of the photovoltaic array. This is attained using multi-input multi-output optimal regulator theory. The effectiveness of the proposed algorithm is demonstrated by simulation, and the obtained results are compared to those of a system working with a constant air-gap flux. (author)
Coupled mode theory analysis for circular photonic crystal ring resonator-based add-drop filter
NASA Astrophysics Data System (ADS)
Robinson, Savarimuthu; Nakkeeran, Rangaswamy
2012-11-01
A two-dimensional circular photonic crystal ring resonator (PCRR)-based add-drop filter (ADF) is designed for ITU-T G.694.2 eight-channel coarse wavelength division multiplexing systems. The resonant wavelength and pass-band width of the ADF are 1491 nm and 13 nm, respectively. Close to 100% coupling and dropping efficiencies and a quality factor of 114.69 are observed through simulation. A coupled mode theory (CMT) analysis of the circular PCRR-based ADF is then carried out, and the CMT response is compared with the response simulated by the finite-difference time-domain method. The overall size of the device is small (11.4 × 11.4 μm), which makes it highly suitable for photonic integrated circuits and all-optical photonic network applications.
Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K
2012-05-01
The feasibility of using the cucurbituril host molecule as a candidate actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed, and based on binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structure, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in the hexavalent and pentavalent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of the actinyls, whereas fluorination decreases the binding affinities compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation state of the three actinyls. PMID:22471316
Multi-source remote sensing image fusion classification based on DS evidence theory
NASA Astrophysics Data System (ADS)
Liu, Chunping; Ma, Xiaohu; Cui, Zhiming
2007-11-01
A new adaptive remote sensing image fusion classification method based on the Dempster-Shafer theory of evidence is presented. This method uses a limited number of prototypes as items of evidence, which are automatically generated by a modified Fuzzy Kohonen Clustering Network (FKCN). The class fuzzy membership of each prototype is also determined using a reference pattern set. For each input vector, a basic probability assignment (BPA) function is computed based on the distances to the prototypes and on their degrees of membership to each class. Lastly, this evidence is combined using Dempster's rule. The proposed method can be implemented as a modified FKCN with a specific architecture consisting of an input layer, a prototype layer, a BPA layer, a combination-and-output layer, and a decision layer. The experimental results show excellent classification performance compared to the existing FKCN and basic DS fusion techniques.
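The evidence-combination step in the Dempster-Shafer records above is Dempster's rule. A minimal sketch, with each BPA represented as a dict mapping frozensets (subsets of the frame of discernment) to masses; the land-cover class names in the usage example are hypothetical:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments (BPAs), each a dict mapping frozenset -> mass over
    subsets of the frame of discernment."""
    raw, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + x * y
        else:
            conflict += x * y                 # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    k = 1.0 - conflict                        # normalization constant
    return {s: v / k for s, v in raw.items()}

# Hypothetical land-cover evidence from two sources:
m1 = {frozenset({"water"}): 0.6, frozenset({"water", "forest"}): 0.4}
m2 = {frozenset({"water"}): 0.5, frozenset({"forest"}): 0.5}
fused = combine(m1, m2)                       # water: 5/7, forest: 2/7
```

Renormalizing by 1 - conflict is what lets the formalism discard conflicting hypotheses instead of letting them jam the decision, as the grouping paper above notes.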
Study on the salary system for IT enterprise based on double factor motivation theory
NASA Astrophysics Data System (ADS)
Zhuang, Chen; Qian, Wu
2005-12-01
To address the fact that IT enterprises' salary and compensation systems often fail to motivate staff efficiently, a salary system based on Herzberg's double factor motivation theory and the characteristics of such enterprises is presented. The salary system includes a salary model, an assessment model and a performance model. The system links cash incentives to staff performance and emphasizes that salary alone is not a motivating factor; health care, for example, may also play a positive motivating role. Based on this system, a scientific and reasonable salary and compensation management system was established and applied in an IT enterprise, where it was found to promote the enterprise's overall performance and competitive power.
Unit Template Synchronous Reference Frame Theory Based Control Algorithm for DSTATCOM
NASA Astrophysics Data System (ADS)
Bangarraju, J.; Rajagopal, V.; Jayalaxmi, A.
2014-04-01
This article proposes new, simplified unit templates in place of the standard phase-locked loop (PLL) for the synchronous reference frame theory (SRFT) control algorithm. Extracting the synchronizing components (sinθ and cosθ) for the Park and inverse Park transformations using a standard PLL takes more execution time, which delays the generation of the reference source currents. The standard PLL not only takes more execution time but also increases the reactive power burden on the distribution static compensator (DSTATCOM). This work proposes a unit-template-based SRFT control algorithm for a four-leg, insulated-gate-bipolar-transistor-based voltage source converter used as a DSTATCOM in distribution systems, which reduces the execution time and the reactive power burden on the DSTATCOM. The proposed DSTATCOM suppresses harmonics and regulates the terminal voltage along with neutral current compensation. The DSTATCOM with the proposed control algorithm is modeled and simulated in MATLAB using the Simulink and SimPowerSystems toolboxes.
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; Chacon, Juan; Solomatine, Dimitri
2016-04-01
The EC-FP7 WeSenseIt project proposes the development of a Citizen Observatory of Water, aiming at enhancing environmental monitoring and forecasting with the help of citizens equipped with low-cost sensors and personal devices such as smartphones and smart umbrellas. In this regard, Citizen Observatories may complement the limited availability of data in terms of spatial and temporal density, which is of interest, among other areas, for improving hydraulic and hydrological models. At this point, the following question arises: how can citizens who are part of a citizen observatory be optimally guided so that the data they collect and send are useful for improving modelling and water management? This research proposes a new methodology to identify the optimal location and timing of potential observations coming from moving sensors of hydrological variables. The methodology is based on Information Theory, which has been widely used in hydrometric monitoring design [1-4]. In particular, it uses the concept of Joint Entropy as a measure of the amount of information contained in a set of random variables, which, in our case, correspond to the time series of hydrological variables captured at given locations in a catchment. The methodology presented is a step forward in the state of the art because it solves the multiobjective optimisation problem of simultaneously finding the minimum number of informative and non-redundant sensors needed at a given time, so that the best configuration of monitoring sites is found at every particular moment in time. To this end, the existing algorithms have been improved to make them efficient. The method is applied to cases in The Netherlands, UK and Italy and proves to have great potential to complement the existing in-situ monitoring networks. [1] Alfonso, L., A. Lobbrecht, and R. Price (2010a), Information theory-based approach for location of monitoring water level gauges in polders, Water Resour. Res., 46(3), W03528 [2] Alfonso, L., A
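A minimal sketch of the Joint Entropy idea: estimate the joint Shannon entropy of discretized sensor time series from empirical joint frequencies, then greedily add the site whose inclusion increases it most. This illustrates only the informative-yet-non-redundant selection principle; the paper's actual multiobjective algorithm and data are more elaborate, and the toy site names are hypothetical.

```python
from collections import Counter
import math

def joint_entropy(series):
    """Joint Shannon entropy (bits) of a set of discretized time series
    (one sequence per sensor), from empirical joint frequencies."""
    symbols = list(zip(*series))              # one joint symbol per time step
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in Counter(symbols).values())

def greedy_select(series_by_site, n_sensors):
    """Greedily pick the site that maximizes the joint entropy of the
    selected set -- redundant sites add no entropy, so they are skipped."""
    chosen, remaining = [], list(series_by_site)
    while remaining and len(chosen) < n_sensors:
        best = max(remaining, key=lambda s: joint_entropy(
            [series_by_site[k] for k in chosen + [s]]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy catchment: site B duplicates A, site C adds new information,
# so a two-sensor selection skips the redundant site.
sites = {"A": [0, 0, 1, 1], "B": [0, 0, 1, 1], "C": [0, 1, 0, 1]}
picked = greedy_select(sites, 2)
```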
Queuing and Other Idiosyncrasies.
ERIC Educational Resources Information Center
Algeo, John
1989-01-01
Examines the less obvious differences between British and American English in regard to semantics and grammar. A comparison is made, to see how American and British styles differ for public notice, in an experiment in which speakers of American English were asked to paraphrase notices from a British public utility office. (Author/OD)
Optimal control of ICU patient discharge: from theory to implementation.
Mallor, Fermín; Azcárate, Cristina; Barado, Julio
2015-09-01
This paper deals with the management of scarce health care resources. We consider a control problem in which the objective is to minimize the rate of patient rejection due to service saturation. The scope of decisions is limited, in terms both of the amount of resources to be used, which are assumed to be fixed, and of the patient arrival pattern, which is assumed to be uncontrollable. This means that the only potential areas of control are the speed or completeness of service. By means of queuing theory and optimization techniques, we provide a theoretical solution expressed in terms of service rates. In order to make this theoretical analysis useful for the effective control of the healthcare system, however, further steps in the analysis of the solution are required: physicians need flexible and medically meaningful operative rules for shortening patient length of service to the degree needed to achieve the service rates dictated by the theoretical analysis. The main contribution of this paper is to discuss how the theoretical solutions can be transformed into effective management rules to guide doctors' decisions. The study examines three types of rules based on intuitive interpretations of the theoretical solution. The rules are evaluated through implementation in a simulation model. We compare the service rates provided by the different policies with those dictated by the theoretical solution. Probabilistic analysis is also included to support rule validity. An Intensive Care Unit is used to illustrate this control problem. The study focuses on the Markovian case before moving on to consider more realistic length-of-service distributions (Weibull, Lognormal and Phase-type). PMID:25763761
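The abstract does not state which queuing model is used, but patient rejection under saturation is often approximated by an M/M/c/c loss system. As an illustrative sketch under that assumption (not the authors' model), the Erlang-B recursion gives the rejection probability for c beds and offered load a = λ/μ:

```python
def erlang_b(c, offered_load):
    """Erlang-B blocking (rejection) probability for an M/M/c/c loss
    system with c servers (beds) and offered load a = lambda/mu,
    computed with the standard numerically stable recursion
    B(k) = a*B(k-1) / (k + a*B(k-1)), B(0) = 1."""
    b = 1.0
    for k in range(1, c + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# One bed with arrivals as fast as discharges (a = 1): half the
# patients are rejected; a second bed drops rejection to 20%.
p1 = erlang_b(1, 1.0)
p2 = erlang_b(2, 1.0)
```

Raising the service rate (lowering a) or adding capacity both reduce rejection, which is the trade-off the theoretical service-rate solution in the paper navigates.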
Hiyoshi, Ayako; Fukuda, Yoshiharu; Shipley, Martin J; Bartley, Mel; Brunner, Eric J
2013-06-01
Studies of health inequalities in Japan have increased since the millennium. However, there remains a lack of an accepted theory-based classification to measure occupation-related social position for Japan. This study attempts to derive such a classification based on the National Statistics Socio-economic Classification in the UK. Using routinely collected data from the nationally representative Comprehensive Survey of the Living Conditions of People on Health and Welfare, the Japanese Socioeconomic Classification was derived using two variables - occupational group and employment status. Validation analyses were conducted using household income, home ownership, self-rated good or poor health, and Kessler 6 psychological distress (n ≈ 36,000). After adjustment for age, marital status, and area (prefecture), one step lower social class was associated with mean 16% (p < 0.001) lower income, and a risk ratio of 0.93 (p < 0.001) for home ownership. The probability of good health showed a trend in men and women (risk ratio 0.94 and 0.93, respectively, for one step lower social class, p < 0.001). The trend for poor health was significant in women (odds ratio 1.12, p < 0.001) but not in men. Kessler 6 psychological distress showed significant trends in men (risk ratio 1.03, p = 0.044) and in women (1.05, p = 0.004). We propose the Japanese Socioeconomic Classification, derived from basic occupational and employment status information, as a meaningful, theory-based and standard classification system suitable for monitoring occupation-related health inequalities in Japan. PMID:23631782
Jankowski, K; Nowakowski, K; Grabowski, I; Wasilewski, J
2009-04-28
The problem of linking the dynamic electron correlation effects defined in traditional ab initio methods [or wave function theories (WFTs)] with the structure of the individual density functional theory (DFT) exchange and correlation functionals has been analyzed for the Ne atom, for which nondynamic correlation effects play a negligible role. A density-based approach directly hinged on difference radial-density (DRD) distributions defined with respect to the Hartree-Fock radial density has been employed for analyzing the impact of dynamic correlation effects on the density. Attention has been paid to the elimination of basis-set incompleteness errors. The DRD distributions calculated by several ab initio methods have been compared to their DFT counterparts generated for representatives of several generations of broadly used exchange-correlation functionals and for the recently developed orbital-dependent OEP2 exchange-correlation functional [Bartlett et al., J. Chem. Phys. 122, 034104 (2005)]. For the local, generalized-gradient, and hybrid functionals it has been found that the dynamic correlation effects are to a large extent accounted for by densities resulting from exchange-only calculations. Additional calculations with self-interaction corrected exchange potentials indicate that this finding cannot be explained as an artifact caused by the self-interaction error. It has been demonstrated that the VWN5 and LYP correlation functionals do not represent any substantial dynamical correlation effects on the electron density, whereas these effects are well represented by the orbital-dependent OEP2 correlation functional. Critical comparison of the present results with their counterparts reported in the literature has been made. Some attention has been paid to demonstrating the differences between the energy- and density-based perspectives. They indicate the usefulness of density-based criteria for developing new exchange-correlation functionals. PMID:19405556
NASA Astrophysics Data System (ADS)
Jankowski, K.; Nowakowski, K.; Grabowski, I.; Wasilewski, J.
2009-04-01
The problem of linking the dynamic electron correlation effects defined in traditional ab initio methods [or wave function theories (WFTs)] with the structure of the individual density functional theory (DFT) exchange and correlation functionals has been analyzed for the Ne atom, for which nondynamic correlation effects play a negligible role. A density-based approach directly hinged on difference radial-density (DRD) distributions defined with respect to the Hartree-Fock radial density has been employed for analyzing the impact of dynamic correlation effects on the density. Attention has been paid to the elimination of basis-set incompleteness errors. The DRD distributions calculated by several ab initio methods have been compared to their DFT counterparts generated for representatives of several generations of broadly used exchange-correlation functionals and for the recently developed orbital-dependent OEP2 exchange-correlation functional [Bartlett et al., J. Chem. Phys. 122, 034104 (2005)]. For the local, generalized-gradient, and hybrid functionals it has been found that the dynamic correlation effects are to a large extent accounted for by densities resulting from exchange-only calculations. Additional calculations with self-interaction corrected exchange potentials indicate that this finding cannot be explained as an artifact caused by the self-interaction error. It has been demonstrated that the VWN5 and LYP correlation functionals do not represent any substantial dynamical correlation effects on the electron density, whereas these effects are well represented by the orbital-dependent OEP2 correlation functional. Critical comparison of the present results with their counterparts reported in the literature has been made. Some attention has been paid to demonstrating the differences between the energy- and density-based perspectives. They indicate the usefulness of density-based criteria for developing new exchange-correlation functionals.
Shirazi, Mandana; Emami, Amir Hosein; Mirmoosavi, Seyed Jamal; Alavinia, Seyed Mohammad; Zamanian, Hadi; Fathollahbeigi, Faezeh; Masiello, Italo
2014-01-01
Background: Effective leadership is of prime importance in any organization, and leadership models evolve in line with accepted health promotion and behavior change theory. Although there are many leadership styles, transformational leadership, which emphasizes supportive leadership behaviors, seems an appropriate style in many settings, particularly in the health care and educational sectors, which are pressured by high turnover and safety demands. Iran has been moving rapidly forward, and its authorities have recognized the importance of matching leadership styles with effective and competent care for success in health care organizations. This study aimed to develop the Supportive Leadership Behaviors Scale based on accepted health and educational theories and to test it psychometrically in the Iranian context. Methods: The instrument was based on items from established questionnaires. A pilot study validated the instrument, which was also cross-validated via re-translation. After validation, 731 participants answered the questionnaire. Results: Exploratory factor analysis yielded a final 20-item questionnaire with four factors, support for development, integrity, sincerity and recognition, explaining supportive leadership behaviors (all loadings above 0.6). Mapping these four measures of leadership behavior can help determine whether effective leadership could support innovation and improvement in medical education and health care organizations at the national level. Reliability, measured as Cronbach's alpha, was 0.84. Conclusion: The new instrument yielded four factors, support for development, integrity, sincerity and recognition, which are applicable in health and educational settings and helpful in improving self-efficacy among health and academic staff. PMID:25679004
Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm
2015-01-01
A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are among the most dangerous, as they aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using the Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) in Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001
[Material base on Chinese medical theory of 'Fei and Dachang being interior-exteriorly correlated'].
Li, Jie; Cheng, Xin; Jia, Yu-Hua
2011-02-01
A review of the pertinent literature revealed several shortcomings in studies of the material basis of the Chinese medical theory that "Fei and Dachang are interior-exteriorly related": low-efficacy research methods; neglect of the intestinal and respiratory microhabitats and of the Chinese medical functional condition; and research designs that do not conform to the requirements of evidence-based medicine. The authors therefore argue that research methods starting from a single substance or a single linkage path should be abandoned. A new research strategy should instead be built on the features of the lung-large intestine network of connective structure: starting from correlated changes at the two terminals (the respiratory system and intestinal tissue) and at the intermediate key node of connection (blood serum), screening the relevant materials in high throughput with microecological, proteomic and metabonomic techniques, and capturing as many nodes of the network as possible. On this basis, the coordinating mechanism of the network can be characterized and a new strategy established for future research. PMID:21425586
Debanne, T; Laffaye, G
2015-08-01
This study was based on the naturalistic decision-making paradigm and regulatory focus theory. Its aim was to model coaches' decision-making processes for handball teams' defensive systems based on relevant cues of the reward structure, and to determine the weight of each of these cues. We collected raw data by video-recording 41 games that were selected using a simple random method. We considered the defensive strategy (DEF: aligned or staged) to be the dependent variable, and the three independent variables were (a) numerical difference between the teams; (b) score difference between the teams; and (c) game periods. We used a logistic regression design (logit model) and a multivariate logistic model to explain the link between DEF and the three category independent variables. Each factor was weighted differently during the decision-making process to select the defensive system, and combining these variables increased the impact on this process; for instance, a staged defense is 43 times more likely to be chosen during the final period in an unfavorable situation and in a man advantage. Finally, this shows that the coach's decision-making process could be based on a simple match or could require a diagnosis of the situation based on the relevant cues. PMID:25262855
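A logit model of the kind the study describes links the probability of a staged defense to the independent variables through the log-odds, and a figure such as "43 times more likely" is what an odds ratio looks like when several coefficients combine. The coefficients below are illustrative stand-ins, not the paper's estimates:

```python
import math

# Illustrative (NOT the paper's) coefficients on the log-odds scale for
# choosing a staged defense, one per independent variable.
beta = {
    "intercept": -2.0,
    "final_period": 1.6,        # game period indicator
    "unfavorable_score": 1.2,   # score difference indicator
    "man_advantage": 0.96,      # numerical difference indicator
}

def p_staged(x):
    """Logit model: P(DEF = staged) = 1 / (1 + exp(-(b0 + sum b_i * x_i)))."""
    z = beta["intercept"] + sum(beta[k] * v for k, v in x.items())
    return 1.0 / (1.0 + math.exp(-z))

def odds(p):
    return p / (1.0 - p)

base = p_staged({"final_period": 0, "unfavorable_score": 0, "man_advantage": 0})
combo = p_staged({"final_period": 1, "unfavorable_score": 1, "man_advantage": 1})
# In a logit model the combined odds ratio is exactly exp(sum of coefficients),
# here exp(1.6 + 1.2 + 0.96) = exp(3.76), i.e. roughly 43.
print(round(odds(combo) / odds(base), 1))
```

This shows why combining the three cues multiplies their impact on the decision: odds ratios of individual predictors compose multiplicatively.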
Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.
Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat
2015-01-01
A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are among the most dangerous, as they aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using the Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) in Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001
Effective meson masses in nuclear matter based on a cutoff field theory
Nakano, M.; Noda, N.; Mitsumori, T.; Koide, K.; Kouno, H.; Hasegawa, A.
1997-02-01
Effective masses of σ, ω, π, and ρ mesons in nuclear matter are calculated based on a cutoff field theory. Instead of the traditional density-Feynman representation, we adopt the particle-hole-antiparticle representation for nuclear propagators so that unphysical components are not included in the meson self-energies. For an estimation of the contribution from the divergent particle-antiparticle excitations, i.e., vacuum polarization in nuclear matter, the idea of the renormalization group method is adopted. In this cutoff field theory, all the counterterms are finite and calculated numerically. It is shown that the predicted meson masses converge even if the cutoff Λ is changed, as long as Λ is sufficiently large, and that the prescription works well also for so-called nonrenormalized mesons such as π and ρ. According to this method, it is concluded that meson masses in nuclear matter have a weak dependence on the baryon density. © 1997 The American Physical Society
NASA Astrophysics Data System (ADS)
Muscettola, Nicola; Smith, Steven S.
1996-09-01
This final report summarizes research performed under NASA contract NCC 2-531 toward generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work into theories and techniques for solution of this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system is modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem solving machinery that is provided. Next, the specific scheduler developed within this software architecture for detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.
General Formalism of Decision Making Based on Theory of Open Quantum Systems
NASA Astrophysics Data System (ADS)
Asano, M.; Ohya, M.; Basieva, I.; Khrennikov, A.
2013-01-01
We present the general formalism of decision making based on the theory of open quantum systems. A person (decision maker), say Alice, is considered a quantum-like system, i.e., a system whose information processing follows the laws of quantum information theory. To make a decision, Alice interacts with a huge mental bath. Depending on the context of decision making, this bath can include her social environment, mass media (TV, newspapers, the Internet), and memory. The dynamics of an ensemble of such Alices is described by the Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) equation. We speculate that, in the course of evolution, biosystems (especially human beings) developed such "mental Hamiltonians" and GKSL operators that any solution of the corresponding GKSL equation stabilizes to a diagonal density operator (in the basis of decision making). This limiting density operator describes a population in which all superpositions of possible decisions have already been resolved. In principle, this approach can be used for the prediction of the distribution of possible decisions in human populations.
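The relaxation of decision superpositions to a diagonal density operator can be illustrated with the simplest GKSL equation. The sketch below is our own construction (a two-decision system under pure dephasing, with an assumed Hamiltonian and bath coupling), not the paper's model:

```python
import numpy as np

# A two-decision system with basis states |0>, |1> (the decision basis).
# The "mental bath" is modeled as pure dephasing via L = sigma_z, which
# kills off-diagonal elements of rho while leaving populations untouched.
H = np.diag([0.0, 1.0])       # assumed "mental Hamiltonian"
L = np.diag([1.0, -1.0])      # dephasing (sigma_z) Lindblad operator
gamma, dt, steps = 0.5, 0.01, 2000

def lindblad_rhs(rho):
    """GKSL right-hand side: -i[H, rho] + gamma (L rho L+ - {L+L, rho}/2)."""
    comm = -1j * (H @ rho - rho @ H)
    diss = gamma * (L @ rho @ L.conj().T
                    - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

rho = np.full((2, 2), 0.5, dtype=complex)   # equal superposition of decisions
for _ in range(steps):
    rho = rho + dt * lindblad_rhs(rho)      # explicit Euler step

# Off-diagonals have decayed: the superposition is resolved into a
# classical 50/50 mixture over the two decisions.
print(np.round(rho.real, 3))
```

Explicit Euler suffices here because the dynamics is linear and the step size is well inside the stability region; a production code would use an exponential or higher-order integrator.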
NASA Astrophysics Data System (ADS)
Hashemi-Dezaki, Hamed; Mohammadalizadeh-Shabestary, Masoud; Askarian-Abyaneh, Hossein; Rezaei-Jegarluei, Mohammad
2014-01-01
In electrical distribution systems, a great amount of power is wasted across the lines, and the power factors, voltage profiles and total harmonic distortions (THDs) of most loads are not as good as desired. These system parameters therefore play a highly important role in wasting money and energy, while both consumers and sources suffer from high distortion rates and even instabilities. Active power filters (APFs) are an innovative remedy for this problem and have recently employed instantaneous reactive power theory. In this paper, a novel method is proposed to optimize the allocation of APFs. The introduced method is based on the instantaneous reactive power theory in vectorial representation, which makes it possible to assess different compensation strategies. Proper placement of APFs in the system also plays a crucial role in reducing loss costs and improving power quality. To optimize APF placement, a new objective function is defined on the basis of five terms: total losses, power factor, voltage profile, THD and cost. A genetic algorithm is used to solve the optimization problem. The results of applying this method to a distribution network illustrate its advantages.
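The optimization step can be sketched as a genetic algorithm searching over candidate placements with a weighted objective. The toy network below, its per-bus benefits and the single installation-cost term are assumptions for illustration; the paper's objective combines five terms (losses, power factor, voltage profile, THD and cost):

```python
import random

# Toy stand-in for APF placement: each bus either gets a filter (1) or not (0).
# A per-bus "benefit" stands in for the combined improvement in losses, power
# factor, voltage profile and THD; COST is the per-filter installation cost.
random.seed(1)
N_BUS, N_POP, GENS = 12, 30, 60
benefit = [random.uniform(0.5, 2.0) for _ in range(N_BUS)]  # assumed data
COST = 1.0

def fitness(placement):
    # Maximize total compensation benefit minus total installation cost.
    return sum(b for b, on in zip(benefit, placement) if on) - COST * sum(placement)

def mutate(p):
    # Flip one random placement bit.
    i = random.randrange(N_BUS)
    q = list(p)
    q[i] ^= 1
    return q

pop = [[random.randint(0, 1) for _ in range(N_BUS)] for _ in range(N_POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)       # rank by objective
    elite = pop[: N_POP // 2]                 # elitist selection
    pop = elite + [mutate(random.choice(elite)) for _ in elite]

best = max(pop, key=fitness)
print(best, round(fitness(best), 2))
```

Because this toy objective is separable, the ideal answer is simply to place filters where benefit exceeds cost; a real placement objective couples buses through the power-flow equations, which is precisely why a metaheuristic like a GA is used.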
Process Reengineering for Quality Improvement in ICU Based on Taylor's Management Theory.
Tao, Ziqi
2015-06-01
Using methods including questionnaire-based surveys and control analysis, we analyzed the improvements in the efficiency of ICU rescue, service quality, and patient satisfaction in Xuzhou Central Hospital after the implementation of fine management, with the aim of further introducing the concept of fine management and implementing brand construction. Originating in Taylor's "Theory of Scientific Management" (1982), fine management uses programmed, standardized, digitalized, and informational approaches to ensure each unit of an organization runs with great accuracy, high efficiency, strong coordination, and sustained consistency (Wang et al., Fine Management, 2007). In essence, fine management is a process that breaks down strategy and goals and executes them, with strategic planning taking place at every step of the process. Fine management holds that everybody has a role to play in the management process, every area must be examined through the management process, and everything has to be managed (Zhang et al., The Experience of Hospital Nursing Precise Management, 2006). In other words, this management theory requires everyone to be involved in the entire process (Liu and Chen, Med Inf, 2007). As public hospital reform becomes more widespread, it is imperative to "build a unified and efficient public hospital management system" and "improve the quality of medical services" (Guidelines on the Pilot Reform of Public Hospitals, 2010). The execution of fine management is important for optimizing the medical process, improving medical services and building a prestigious hospital brand. PMID:25548005
NASA Astrophysics Data System (ADS)
Timme, Marc; Geisel, Theo; Wolf, Fred
2006-03-01
We analyze the dynamics of networks of spiking neural oscillators. First, we present an exact linear stability theory of the synchronous state for networks of arbitrary connectivity. For general neuron rise functions, stability is determined by multiple operators, for which standard analysis is not suitable. We describe a general nonstandard solution to the multioperator problem. Subsequently, we derive a class of neuronal rise functions for which all stability operators become degenerate and standard eigenvalue analysis becomes a suitable tool. Interestingly, this class is found to consist of networks of leaky integrate-and-fire neurons. For random networks of inhibitory integrate-and-fire neurons, we then develop an analytical approach, based on the theory of random matrices, to precisely determine the eigenvalue distributions of the stability operators. This yields the asymptotic relaxation time for perturbations to the synchronous state which provides the characteristic time scale on which neurons can coordinate their activity in such networks. For networks with finite in-degree, i.e., finite number of presynaptic inputs per neuron, we find a speed limit to coordinating spiking activity. Even with arbitrarily strong interaction strengths neurons cannot synchronize faster than at a certain maximal speed determined by the typical in-degree.
Informational Theory of Aging: The Life Extension Method Based on the Bone Marrow Transplantation
Karnaukhov, Alexey V.; Karnaukhova, Elena V.; Sergievich, Larisa A.; Karnaukhova, Natalia A.; Bogdanenko, Elena V.; Manokhina, Irina A.; Karnaukhov, Valery N.
2015-01-01
A method of lifespan extension is proposed as a practical application of the informational theory of aging. In this theory, the degradation (error accumulation) of the genetic information in cells is considered the main cause of aging. Accordingly, our method is based on the transplantation of genetically identical (or similar) stem cells, carrying fewer genomic errors, into old recipients. For humans and large mammals, this method can be realized by cryopreservation of their own stem cells, taken at a young age, for later autologous transplantation in old age. To test this method experimentally, we chose a laboratory animal with a relatively short lifespan (the mouse). Because it is difficult to isolate the required amount of stem cells (e.g., bone marrow) without significant harm to the animals, we used bone marrow transplantation from sacrificed inbred young donors. It is shown that the lifespan extension of recipients depends on the level of their genetic similarity (syngeneity) with donors. We achieved a 34% increase in the lifespan of the experimental mice when bone marrow with a high level of genetic similarity was transplanted. PMID:26491435
Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M
1996-01-01
Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153
A system model for ultrasonic NDT based on the Physical Theory of Diffraction (PTD).
Darmon, M; Dorval, V; Kamta Djakou, A; Fradkin, L; Chatillon, S
2016-01-01
Simulation of ultrasonic Non Destructive Testing (NDT) is helpful for evaluating the performance of inspection techniques and requires the modelling of waves scattered by defects. Two classical flaw-scattering models have traditionally been employed and evaluated for the inspection of planar defects: the Kirchhoff approximation (KA) for simulating reflection and the Geometrical Theory of Diffraction (GTD) for simulating diffraction. Combining them so as to retain the advantages of both, the Physical Theory of Diffraction (PTD), initially developed in electromagnetism, has recently been extended to elastodynamics. In this paper a PTD-based system model is proposed for simulating the ultrasonic response of crack-like defects. It is also extended to provide a good description of the regions surrounding critical rays, where the shear diffracted waves and head waves interfere. Both numerical and experimental validation of the PTD model is carried out in various practical NDT configurations, such as pulse-echo and Time of Flight Diffraction (TOFD), involving both crack-tip and corner echoes. The numerical validation compares this model with KA and GTD as well as with the Finite-Element Method (FEM). PMID:26323548
An integrated finite element simulation of cardiomyocyte function based on triphasic theory
Hatano, Asuka; Okada, Jun-Ichi; Washio, Takumi; Hisada, Toshiaki; Sugiura, Seiryo
2015-01-01
In numerical simulations of cardiac excitation-contraction coupling, the intracellular potential distribution and mobility of cytosol and ions have been mostly ignored. Although the intracellular potential gradient is small, during depolarization it can be a significant driving force for ion movement, and is comparable to diffusion in terms of net flux. Furthermore, fluid in the t-tubules is thought to advect ions to facilitate their exchange with the extracellular space. We extend our previous finite element model that was based on triphasic theory to examine the significance of these factors in cardiac physiology. Triphasic theory allows us to study the behavior of solids (proteins), fluids (cytosol) and ions governed by mechanics and electrochemistry in detailed subcellular structures, including myofibrils, mitochondria, the sarcoplasmic reticulum, membranes, and t-tubules. Our simulation results predicted an electrical potential gradient inside the t-tubules at the onset of depolarization, which corresponded to the Na+ channel distribution therein. Ejection and suction of fluid between the t-tubules and the extracellular compartment during isometric contraction were observed. We also examined the influence of t-tubule morphology and mitochondrial location on the electrophysiology and mechanics of the cardiomyocyte. Our results confirm that the t-tubule structure is important for synchrony of Ca2+ release, and suggest that mitochondria in the sub-sarcolemmal region might serve to cancel Ca2+ inflow through surface sarcolemma, thereby maintaining the intracellular Ca2+ environment in equilibrium. PMID:26539124
A linear viscoelastic biphasic model for soft tissues based on the Theory of Porous Media.
Ehlers, W; Markert, B
2001-10-01
Based on the Theory of Porous Media (mixture theories extended by the concept of volume fractions), a model describing the mechanical behavior of hydrated soft tissues such as articular cartilage is presented. As usual, the tissue will be modeled as a materially incompressible binary medium of one linear viscoelastic porous solid skeleton saturated by a single viscous pore-fluid. The contribution of this paper is to combine a descriptive representation of the linear viscoelasticity law for the organic solid matrix with an efficient numerical treatment of the strongly coupled solid-fluid problem. Furthermore, deformation-dependent permeability effects are considered. Within the finite element method (FEM), the weak forms of the governing model equations are set up in a system of differential algebraic equations (DAE) in time. Thus, appropriate embedded error-controlled time integration methods can be applied that allow for a reliable and efficient numerical treatment of complex initial boundary-value problems. The applicability and the efficiency of the presented model are demonstrated within canonical, numerical examples, which reveal the influence of the intrinsic dissipation on the general behavior of hydrated soft tissues, exemplarily on articular cartilage. PMID:11601726
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Smith, Steven S.
1996-01-01
This final report summarizes research performed under NASA contract NCC 2-531 toward generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work into theories and techniques for solution of this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system is modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem solving machinery that is provided. Next, the specific scheduler developed within this software architecture for detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.
An integrated finite element simulation of cardiomyocyte function based on triphasic theory.
Hatano, Asuka; Okada, Jun-Ichi; Washio, Takumi; Hisada, Toshiaki; Sugiura, Seiryo
2015-01-01
In numerical simulations of cardiac excitation-contraction coupling, the intracellular potential distribution and mobility of cytosol and ions have been mostly ignored. Although the intracellular potential gradient is small, during depolarization it can be a significant driving force for ion movement, and is comparable to diffusion in terms of net flux. Furthermore, fluid in the t-tubules is thought to advect ions to facilitate their exchange with the extracellular space. We extend our previous finite element model that was based on triphasic theory to examine the significance of these factors in cardiac physiology. Triphasic theory allows us to study the behavior of solids (proteins), fluids (cytosol) and ions governed by mechanics and electrochemistry in detailed subcellular structures, including myofibrils, mitochondria, the sarcoplasmic reticulum, membranes, and t-tubules. Our simulation results predicted an electrical potential gradient inside the t-tubules at the onset of depolarization, which corresponded to the Na(+) channel distribution therein. Ejection and suction of fluid between the t-tubules and the extracellular compartment during isometric contraction were observed. We also examined the influence of t-tubule morphology and mitochondrial location on the electrophysiology and mechanics of the cardiomyocyte. Our results confirm that the t-tubule structure is important for synchrony of Ca(2+) release, and suggest that mitochondria in the sub-sarcolemmal region might serve to cancel Ca(2+) inflow through surface sarcolemma, thereby maintaining the intracellular Ca(2+) environment in equilibrium. PMID:26539124
Grand unification and proton stability based on a chiral SU(8) theory
Deshpande, N.G.; Mannheim, P.D.
1980-06-01
A grand-unified model of the strong, electromagnetic, and weak interactions is presented based on a local SU(8)_L × SU(8)_R gauge theory that possesses a global U(8)_L × U(8)_R invariance. The model is spontaneously broken by the recently introduced neutrino pairing mechanism, in which a Higgs field that transforms like a pair of right-handed neutrinos acquires a vacuum expectation value. This neutrino pairing breaks the model down to the standard Weinberg-Salam phenomenology. Further, the neutrino pairing causes the two initial global currents of the model, fermion number and axial fermion number, to mix with the non-Abelian local currents to leave unbroken two new global currents, namely, baryon number and a particular lepton number that counts charged leptons and left-handed neutrinos only. The exact conservations of these two resulting currents ensure the absolute stability of the proton, the masslessness of the observed left-handed neutrinos, and the standard lepton number conservation of the usual weak interactions. A further feature of the model is the simultaneous absence of both strong CP violations and of observable axions. The model has a testable prediction, namely, the existence of an absolutely stable, relatively light, massive neutral lepton generated entirely from the right-handed neutrino sector of the theory.
Gao, Kai; Chung, Eric T.; Gibson, Richard L.; Fu, Shubin; Efendiev, Yalchin
2015-06-05
The development of reliable methods for upscaling fine scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.
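The finely layered baseline against which such a homogenization can be checked has a classical closed form, the Backus average, which maps a stack of thin isotropic layers to effective transversely isotropic (VTI) constants. A minimal sketch, assuming isotropic layers with illustrative Lamé parameters (this is the analytical reference case, not the authors' multiscale code):

```python
import numpy as np

def backus_average(lam, mu, h):
    """Backus-average effective VTI constants for a stack of thin
    isotropic layers with Lame parameters lam, mu and thicknesses h."""
    w = np.asarray(h, float)
    w = w / w.sum()                      # thickness weights
    lam, mu = np.asarray(lam, float), np.asarray(mu, float)
    m = lam + 2.0 * mu                   # P-wave modulus of each layer
    avg = lambda x: np.sum(w * x)        # thickness-weighted mean
    c33 = 1.0 / avg(1.0 / m)
    c44 = 1.0 / avg(1.0 / mu)
    c13 = avg(lam / m) * c33
    c11 = avg(4.0 * mu * (lam + mu) / m) + avg(lam / m) ** 2 * c33
    c66 = avg(mu)
    return dict(c11=c11, c13=c13, c33=c33, c44=c44, c66=c66)

# Two alternating layers, stiff over soft (values assumed for illustration)
c = backus_average(lam=[10e9, 4e9], mu=[8e9, 2e9], h=[0.5, 0.5])
print(c)  # c11 > c33: the layering induces VTI anisotropy
```

For identical layers the effective constants collapse to the isotropic values (c11 = c33 = λ + 2μ), which makes the routine easy to sanity-check.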
A variable-order laminated plate theory based on the variational-asymptotical method
NASA Technical Reports Server (NTRS)
Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.
1993-01-01
The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.
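The through-the-thickness step reduces to a Sturm-Liouville eigenproblem in the thickness coordinate, with a piecewise-constant coefficient carrying the ply nonhomogeneity. A finite-difference sketch of such an eigenproblem, simplified to fixed-end conditions and a single scalar coefficient (the setup is illustrative, not the paper's exact formulation):

```python
import numpy as np

def through_thickness_modes(p, n=200):
    """Eigenpairs of -(p(z) u')' = lam * u on z in [0, 1] with
    u(0) = u(1) = 0, where p(z) is piecewise constant (one value per
    ply). At ply interfaces p*u' stays continuous, so u' jumps --
    the kind of derivative discontinuity the eigenfunctions carry."""
    h = 1.0 / (n + 1)
    z_mid = (np.arange(n + 1) + 0.5) * h          # midpoint grid
    ply = np.minimum((z_mid * len(p)).astype(int), len(p) - 1)
    pm = np.asarray(p, float)[ply]                # p at midpoints
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = (pm[i] + pm[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -pm[i] / h**2
        if i < n - 1:
            A[i, i + 1] = -pm[i + 1] / h**2
    lam, U = np.linalg.eigh(A)                    # sorted eigenvalues
    return lam, U

# Two-ply laminate: stiff ply over soft ply (assumed ratio)
lam, U = through_thickness_modes([10.0, 1.0])
print(lam[:3])  # lowest eigenvalues shift relative to a homogeneous plate
```

For a homogeneous coefficient p = 1 the lowest eigenvalue approaches the analytic value π², which provides a quick verification of the discretization.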
Skelton, JA; Buehler, C; Irby, MB; Grzywacz, JG
2014-01-01
Family-based approaches to pediatric obesity treatment are considered the ‘gold-standard,’ and are recommended for facilitating behavior change to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach for addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of theories in intervention studies. Overlapping concepts across theories include: families are a system, with interdependence of units; the idea that families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of or more effectively treat obesity. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment. PMID:22531090
NASA Astrophysics Data System (ADS)
Han, Yue; Wu, Z. L.; Rosenshein, Joseph S.; Thomsen, Marshall; Zhao, Qiang; Moncur, Kent
1999-12-01
We present a comprehensive theoretical model suitable for treating the effect of pulsed collinear photothermal deflection spectroscopy (PDS). The work is an extension of the theoretical model previously developed for the mirage effect, which can take into account both photothermal deflection and photothermal diffraction effects based on the Fresnel diffraction theory. With the diffraction model, both the collinear PDS and the photothermal lensing spectroscopy techniques can be treated in a unified manner. The model provides a detailed analysis of the laser-induced optical diffraction effect and can be used to optimize experimental parameters. The modeled results are presented in detail, with an emphasis on the advantages of using a near-field detection scheme for achieving the best sensitivity to local temperature change and better experimental stability against environmental noise.
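The core building block of any Fresnel-diffraction treatment of a probe beam crossing a thermally induced refractive-index profile is a free-space propagation step. A generic angular-spectrum sketch in one dimension (all beam and phase parameters are assumed values; this is not the authors' full unified model):

```python
import numpy as np

def fresnel_propagate(u0, dx, wavelength, z):
    """Angular-spectrum (Fresnel) propagation of a 1-D complex field
    u0 sampled at spacing dx over a distance z."""
    n = len(u0)
    fx = np.fft.fftfreq(n, d=dx)                      # spatial frequencies
    H = np.exp(-1j * np.pi * wavelength * z * fx**2)  # Fresnel transfer fn
    return np.fft.ifft(np.fft.fft(u0) * H)

x = np.linspace(-2e-3, 2e-3, 2048)
probe = np.exp(-(x / 3e-4) ** 2)                 # Gaussian probe beam
thermal_phase = 0.3 * np.exp(-(x / 1e-4) ** 2)   # assumed thermal-lens phase
u = fresnel_propagate(probe * np.exp(1j * thermal_phase),
                      x[1] - x[0], 633e-9, 0.05)
print(np.abs(u).max())
```

Because the transfer function is pure phase, the step conserves beam energy, and comparing the near-field and far-field intensity patterns is how detection-plane placement can be explored numerically.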
NASA Astrophysics Data System (ADS)
You, W. J.; Zhang, Y. L.
2015-08-01
The Huaihe River is one of the seven largest rivers in China, and floods occur frequently in its basin. These disasters cause heavy casualties and property losses and have made the basin known for its high social vulnerability to floods. Based on the latest social-economic data, an index system of social vulnerability to floods was constructed, and the catastrophe theory method was used in the assessment process. The results show that social vulnerability, as a basic attribute of the urban environment, changes significantly from city to city across the Huaihe River basin, with different distribution characteristics in population, economic and flood-prevention vulnerability. Further study of social vulnerability is therefore important, and will play a positive role in disaster prevention and in improving the comprehensive ability to respond to disasters.
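Catastrophe-theory assessments of this kind are commonly implemented via "catastrophe progression" normalization formulas that combine ranked control indicators without subjective weights. A minimal sketch for a cusp system with two indicators (the indicator values and the complementarity choice are assumptions for illustration):

```python
def cusp_progression(a, b, complementary=True):
    """Catastrophe-progression value for a cusp system with two
    normalized control indicators a, b in [0, 1]. Membership values
    follow the usual normalization formulas x_a = a**(1/2),
    x_b = b**(1/3); complementary indicators are averaged, otherwise
    the minimum (non-complementarity rule) is taken."""
    xa, xb = a ** 0.5, b ** (1.0 / 3.0)
    return (xa + xb) / 2.0 if complementary else min(xa, xb)

# Hypothetical normalized population and economy indicators
print(cusp_progression(0.25, 0.125))  # -> 0.5
```

Higher-order systems (swallowtail, butterfly) extend the same scheme with exponents 1/4 and 1/5 for additional indicators.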
A three year outcome evaluation of a theory based drink driving education program.
Sheehan, M; Schonfeld, C; Ballard, R; Schofield, F; Najman, J; Siskind, V
1996-01-01
This study reports on the impact of a "drink driving education program" taught to grade ten high school students. The program, which involves twelve lessons, uses strategies based on the Ajzen and Madden theory of planned behavior. Students were trained to use alternatives to drink driving and passenger behaviors. A total of 1774 students in randomly assigned control and intervention schools were followed up three years later. There had been a major reduction in drink driving behaviors in both intervention and control students. In addition to this cohort change, there was a trend toward reduced drink driving in the intervention group and a significant reduction in passenger behavior in this group. Readiness to use alternatives suggested that the major impact of the program was on students who were experimenting with the behavior at the time the program was taught. The program seems to have optimized concurrent social attitude and behavior change. PMID:8952213
Bernoulli-Euler beam model based on a modified couple stress theory
NASA Astrophysics Data System (ADS)
Park, S. K.; Gao, X.-L.
2006-11-01
A new model for the bending of a Bernoulli-Euler beam is developed using a modified couple stress theory. A variational formulation based on the principle of minimum total potential energy is employed. The new model contains an internal material length scale parameter and can capture the size effect, unlike the classical Bernoulli-Euler beam model. The former reduces to the latter in the absence of the material length scale parameter. As a direct application of the new model, a cantilever beam problem is solved. It is found that the bending rigidity of the cantilever beam predicted by the newly developed model is larger than that predicted by the classical beam model. The difference between the deflections predicted by the two models is very significant when the beam thickness is small, but is diminishing with the increase of the beam thickness. A comparison shows that the predicted size effect agrees fairly well with that observed experimentally.
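The size effect described above can be reproduced with the commonly quoted closed form in which the bending rigidity is augmented by a couple-stress term μAl². A sketch with assumed epoxy-like values (illustrative numbers, not the paper's exact variational solution):

```python
def tip_deflection(P, L, E, nu, b, h, l=0.0):
    """Cantilever tip deflection under end load P. With l > 0 the
    bending rigidity is augmented by the couple-stress term mu*A*l^2
    (a common closed form for this class of models); l = 0 recovers
    the classical Bernoulli-Euler result."""
    mu = E / (2.0 * (1.0 + nu))      # shear modulus
    I = b * h**3 / 12.0              # second moment of area
    A = b * h                        # cross-sectional area
    D = E * I + mu * A * l**2        # size-dependent bending rigidity
    return P * L**3 / (3.0 * D)

# Assumed epoxy-like values: modulus, Poisson ratio, width, load, length scale
E, nu, b, P, l = 1.44e9, 0.38, 2e-5, 1e-6, 17.6e-6
for h in (20e-6, 40e-6, 80e-6):            # beam thickness sweep
    d_classical = tip_deflection(P, 20 * h, E, nu, b, h)
    d_couple = tip_deflection(P, 20 * h, E, nu, b, h, l=l)
    print(h, d_couple / d_classical)       # ratio < 1, tending to 1 as h grows
```

The deflection ratio depends on thickness only through 6l²/((1+ν)h²), so the stiffening is pronounced when h is comparable to l and vanishes for thick beams, matching the trend the abstract reports.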
Detection and control of combustion instability based on the concept of dynamical system theory.
Gotoda, Hiroshi; Shinoda, Yuta; Kobayashi, Masaki; Okuno, Yuta; Tachibana, Shigeru
2014-02-01
We propose an online method of detecting combustion instability based on the concept of dynamical system theory, including the characterization of the dynamic behavior of combustion instability. As an important case study relevant to combustion instability encountered in fundamental and practical combustion systems, we deal with the combustion dynamics close to lean blowout (LBO) in a premixed gas-turbine model combustor. The relatively regular pressure fluctuations generated by thermoacoustic oscillations transition to low-dimensional intermittent chaos owing to the intermittent appearance of bursts as the equivalence ratio decreases. The translation error, which quantifies the degree of parallelism of trajectories in the phase space, can be used as a control variable to prevent LBO. PMID:25353548
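A translation error of the kind used here measures how parallel the short-time displacements of neighbouring phase-space trajectories are after delay embedding. A minimal Wayland-style sketch (the embedding parameters, neighbour count, and averaging variant are assumptions, not the paper's settings):

```python
import numpy as np

def translation_error(x, dim=4, tau=1, k=4, n_ref=100, seed=0):
    """Translation error of a scalar series x: embed with delay tau,
    take the k nearest neighbours of randomly chosen reference points,
    advance all of them one step, and measure the spread of the
    displacement vectors relative to their mean. Small values indicate
    deterministic (nearly parallel) flow; noise gives large values."""
    x = np.asarray(x, float)
    m = len(x) - (dim - 1) * tau - 1
    emb = np.column_stack([x[i * tau:i * tau + m + 1] for i in range(dim)])
    rng = np.random.default_rng(seed)
    errs = []
    for r in rng.integers(0, m, size=n_ref):
        d = np.linalg.norm(emb[:m] - emb[r], axis=1)
        d[r] = np.inf                        # exclude the point itself
        idx = np.append(np.argsort(d)[:k], r)
        v = emb[idx + 1] - emb[idx]          # one-step displacements
        vbar = v.mean(axis=0)
        errs.append(np.mean(np.linalg.norm(v - vbar, axis=1))
                    / np.linalg.norm(vbar))
    return float(np.mean(errs))

t = np.arange(4000) * 0.1
print(translation_error(np.sin(t)))          # small: deterministic orbit
print(translation_error(
    np.random.default_rng(1).standard_normal(4000)))  # larger: noise
```

Tracking this statistic online as the equivalence ratio drifts is the kind of use the abstract describes: the value stays low for regular oscillations and rises as intermittent bursts appear.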
Webster, R; Michie, S; Estcourt, C; Gerressu, M; Bailey, J V
2016-09-01
Increasing condom use to prevent sexually transmitted infections is a key public health goal. Interventions are more likely to be effective if they are theory- and evidence-based. The Behaviour Change Wheel (BCW) provides a framework for intervention development. To provide an example of how the BCW was used to develop an intervention to increase condom use in heterosexual men (the MenSS website), the steps of the BCW intervention development process were followed, incorporating evidence from the research literature and views of experts and the target population. Capability (e.g. knowledge) and motivation (e.g. beliefs about pleasure) were identified as important targets of the intervention. We devised ways to address each intervention target, including selecting interactive features and behaviour change techniques. The BCW provides a useful framework for integrating sources of evidence to inform intervention content and deciding which influences on behaviour to target. PMID:27528531