Science.gov

Sample records for queuing theory based

  1. Queuing Theory and Reference Transactions.

    ERIC Educational Resources Information Center

    Terbille, Charles

    1995-01-01

    Examines the implications of applying the queuing theory to three different reference situations: (1) random patron arrivals; (2) random durations of transactions; and (3) use of two librarians. Tables and figures represent results from spreadsheet calculations of queues for each reference situation. (JMV)

  2. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Li, Y. P.; Huang, G. H.

    2012-06-01

    In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.

  3. Queuing Theory Based Co-Channel Interference Analysis Approach for High-Density Wireless Local Area Networks.

    PubMed

    Zhang, Jie; Han, Guangjie; Qian, Yujie

    2016-01-01

    Increased co-channel interference (CCI) in wireless local area networks (WLANs) is bringing serious resource constraints to today's high-density wireless environments. CCI in IEEE 802.11-based networks is inevitable because of the carrier sensing mechanism; however, it can be reduced by resource optimization approaches. CCI analysis is therefore fundamental, and crucial for efficient resource management. In this article, we present a novel CCI analysis approach based on queuing theory, which considers the randomness of end users' behavior and the irregularity and complexity of network traffic in high-density WLANs, and which adopts the M/M/c queuing model for CCI analysis. Most CCI occurs when multiple networks overlap and trigger channel contention; therefore, we use the ratio of signal-overlapped area to signal coverage as a probabilistic factor in the queuing model to analyze CCI impacts in highly overlapped WLANs. With the queuing model, we perform simulations to see how CCI influences the quality of service (QoS) in high-density WLANs. PMID:27563896

  4. Queuing Theory Based Co-Channel Interference Analysis Approach for High-Density Wireless Local Area Networks

    PubMed Central

    Zhang, Jie; Han, Guangjie; Qian, Yujie

    2016-01-01

    Increased co-channel interference (CCI) in wireless local area networks (WLANs) is bringing serious resource constraints to today's high-density wireless environments. CCI in IEEE 802.11-based networks is inevitable because of the carrier sensing mechanism; however, it can be reduced by resource optimization approaches. CCI analysis is therefore fundamental, and crucial for efficient resource management. In this article, we present a novel CCI analysis approach based on queuing theory, which considers the randomness of end users' behavior and the irregularity and complexity of network traffic in high-density WLANs, and which adopts the M/M/c queuing model for CCI analysis. Most CCI occurs when multiple networks overlap and trigger channel contention; therefore, we use the ratio of signal-overlapped area to signal coverage as a probabilistic factor in the queuing model to analyze CCI impacts in highly overlapped WLANs. With the queuing model, we perform simulations to see how CCI influences the quality of service (QoS) in high-density WLANs. PMID:27563896

  5. Queuing Theory Based Co-Channel Interference Analysis Approach for High-Density Wireless Local Area Networks.

    PubMed

    Zhang, Jie; Han, Guangjie; Qian, Yujie

    2016-08-23

    Increased co-channel interference (CCI) in wireless local area networks (WLANs) is bringing serious resource constraints to today's high-density wireless environments. CCI in IEEE 802.11-based networks is inevitable because of the carrier sensing mechanism; however, it can be reduced by resource optimization approaches. CCI analysis is therefore fundamental, and crucial for efficient resource management. In this article, we present a novel CCI analysis approach based on queuing theory, which considers the randomness of end users' behavior and the irregularity and complexity of network traffic in high-density WLANs, and which adopts the M/M/c queuing model for CCI analysis. Most CCI occurs when multiple networks overlap and trigger channel contention; therefore, we use the ratio of signal-overlapped area to signal coverage as a probabilistic factor in the queuing model to analyze CCI impacts in highly overlapped WLANs. With the queuing model, we perform simulations to see how CCI influences the quality of service (QoS) in high-density WLANs.
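
    The abstract above does not reproduce the paper's formulas. As a rough sketch of the underlying idea, the snippet below scales an assumed contention arrival rate by the overlap-to-coverage ratio and feeds it to a standard M/M/c wait-probability (Erlang C) calculation; all numbers, and the exact way the factor enters the model, are assumptions rather than values taken from the paper.

    ```python
    import math

    def erlang_c(lam, mu, c):
        """Probability that an arriving request must wait in an M/M/c queue (Erlang C)."""
        rho = lam / (c * mu)
        if rho >= 1:
            raise ValueError("queue is unstable (utilization >= 1)")
        a = lam / mu  # offered load in Erlangs
        tail = a**c / (math.factorial(c) * (1 - rho))
        p0_inv = sum(a**k / math.factorial(k) for k in range(c)) + tail
        return tail / p0_inv

    # Hypothetical numbers: the ratio of signal-overlapped area to coverage area is used
    # here simply to scale the frame rate that actually contends for the medium.
    overlap_ratio = 0.4                       # assumed, not from the paper
    contending_rate = overlap_ratio * 1200.0  # frames/s that experience contention (assumed)
    service_rate = 500.0                      # frames/s per channel, c = 3 channels (assumed)
    print(f"P(wait) = {erlang_c(contending_rate, service_rate, 3):.3f}")
    ```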

  6. An application of queuing theory to waterfowl migration

    USGS Publications Warehouse

    Sojda, Richard S.; Cornely, John E.; Fredrickson, Leigh H.; Rizzoli, A.E.; Jakeman, A.J.

    2002-01-01

    There has always been great interest in the migration of waterfowl and other birds. We have applied queuing theory to modelling waterfowl migration, beginning with a prototype system for the Rocky Mountain Population of trumpeter swans (Cygnus buccinator) in Western North America. The queuing model can be classified as a D/BB/28 system, and we describe the input sources, service mechanism, and network configuration of queues and servers. The intrinsic nature of queuing theory is to represent the spatial and temporal characteristics of entities and how they move, are placed in queues, and are serviced. The service mechanism in our system is an algorithm representing how swans move through the flyway based on seasonal life cycle events. The system uses an observed number of swans at each of 27 areas for a breeding season as input and simulates their distribution through four seasonal steps. The result is a simulated distribution of birds for the subsequent year's breeding season. The model was built as a multiagent system with one agent handling movement algorithms, one facilitating the user interface, and one to seven agents representing specific geographic areas for which swan management interventions can be implemented. The many parallels in queuing model servers and service mechanisms with waterfowl management areas and annual life cycle events made the transfer of the theory to practical application straightforward.

  7. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. Because the models require only coarse detail about network traffic rates, traffic patterns, and the hardware used to implement the networks, the impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed with them. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
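
    The report itself supplies the model details in its appendices; as a minimal sketch of the kind of arithmetic such spreadsheet models encode, the fragment below treats each hop (LAN segment, gateway, backbone channel) as an M/M/1 service centre and sums the residence times. The rates and service times are invented for illustration and are not taken from the report.

    ```python
    def mm1_residence(service_time, arrival_rate):
        """Mean residence time (queueing + service) at an M/M/1 service centre."""
        utilization = arrival_rate * service_time
        if utilization >= 1:
            raise ValueError("centre saturated")
        return service_time / (1 - utilization)

    arrival_rate = 40.0                    # messages per second offered to the path (assumed)
    service_times = [0.004, 0.010, 0.002]  # seconds per message: LAN, gateway, backbone (assumed)
    response = sum(mm1_residence(s, arrival_rate) for s in service_times)
    print(f"end-to-end mean response: {response * 1000:.2f} ms")
    ```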

  8. Queuing Theory: An Alternative Approach to Educational Research.

    ERIC Educational Resources Information Center

    Huyvaert, Sarah H.

    Queuing theory is examined in this paper in order to determine if the theory could be applied in educational settings. It is defined as a form of operations research that uses mathematical formulas and/or computer simulation to study wait and congestion in a system and, through the study of these visible phenomena, to discover malfunctions within…

  9. Application of queuing theory in production-inventory optimization

    NASA Astrophysics Data System (ADS)

    Rashid, Reza; Hoseini, Seyed Farzad; Gholamian, M. R.; Feizabadi, Mohammad

    2015-07-01

    This paper presents a mathematical model for an inventory control system in which customers' demands and suppliers' service time are considered as stochastic parameters. The proposed problem is solved through queuing theory for a single item. In this case, transitional probabilities are calculated in steady state. Afterward, the model is extended to the case of multi-item inventory systems. Then, to deal with the complexity of this problem, a new heuristic algorithm is developed. Finally, the presented bi-level inventory-queuing model is implemented as a case study in Electroestil Company.

  10. Application of queuing theory in inventory systems with substitution flexibility

    NASA Astrophysics Data System (ADS)

    Seyedhoseini, S. M.; Rashid, Reza; Kamalpour, Iman; Zangeneh, Erfan

    2015-01-01

    Considering the competition in today's business environment, tactical planning of a supply chain becomes more complex than before. In many multi-product inventory systems, substitution flexibility can improve profits. This paper aims to develop a comprehensive substitution inventory model, in which an inventory system with two substitute products with negligible lead time is considered and the effects of simultaneous ordering are examined. Customer demand for both products is treated as a stochastic parameter, and queuing theory is used to construct a mathematical model. The model has been coded in C++ and analyzed on a real example, and the results indicate the efficiency of the proposed model.

  11. Using Queuing Theory and Simulation Model to Optimize Hospital Pharmacy Performance

    PubMed Central

    Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan

    2014-01-01

    Background: Hospital pharmacy is responsible for controlling and monitoring the medication use process and ensures timely access to safe, effective and economical use of drugs and medicines for patients and hospital staff. Objectives: This study aimed to optimize the management of the studied outpatient pharmacy using queuing theory and simulation techniques. Patients and Methods: A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the data needed to determine the arrival rate, service rate, and other quantities required to calculate patient flow and queuing network performance variables. After the initial analysis of collected data using the software SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data and to provide appropriate solutions, the queuing system of the current situation for both shifts was modeled and simulated using the software ARENA 12, and 4 scenarios were explored. Results: Results showed that the queue characteristics of the studied pharmacy at the time of the analysis were very undesirable in both morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 and 14.66 in the morning and evening, respectively. The average times spent in the system by clients were 39 minutes in the morning and 35 minutes in the evening. The system utilization in the morning and evening were, respectively, 25% and 21%. The simulation results showed that reducing the staff at the prescription-receiving stage in the morning from 2 to 1 did not change the queue performance indicators. Adding one staff member at the prescription-filling stage could decrease the average queue length by 10 persons and the average waiting time by 18 minutes and 14 seconds. On the other hand, simulation
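
    The study's model was built in ARENA and its parameters are only partly quoted above; the toy discrete-event simulation below (a FIFO M/M/c queue driven by the Kiefer-Wolfowitz recursion) merely illustrates the kind of what-if experiment described, comparing two versus three staff at a single service stage. The rates are placeholders, not the hospital's data.

    ```python
    import random

    def mean_queue_wait(lam, mu, c, n_customers, seed=1):
        """Crude FIFO M/M/c simulation: each arrival is assigned to the server that
        frees earliest, which preserves first-come-first-served order."""
        rng = random.Random(seed)
        t = 0.0
        free_at = [0.0] * c
        total_wait = 0.0
        for _ in range(n_customers):
            t += rng.expovariate(lam)                     # next Poisson arrival
            k = min(range(c), key=lambda i: free_at[i])   # earliest-free server
            start = max(t, free_at[k])
            total_wait += start - t
            free_at[k] = start + rng.expovariate(mu)      # exponential service
        return total_wait / n_customers

    lam, mu = 0.48, 0.30   # patients/min and prescriptions/min per staff member (assumed)
    for c in (2, 3):
        print(f"{c} staff -> mean wait in queue ~ {mean_queue_wait(lam, mu, c, 50000):.1f} min")
    ```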

  12. An Integrated Model of Patient and Staff Satisfaction Using Queuing Theory

    PubMed Central

    Mousavi, Ali; Clarkson, P. John; Young, Terry

    2015-01-01

    This paper investigates the connection between patient satisfaction, waiting time, staff satisfaction, and service time. It uses a variety of models to enable improvement against experiential and operational health service goals. Patient satisfaction levels are estimated using a model based on waiting (waiting times). Staff satisfaction levels are estimated using a model based on the time spent with patients (service time). An integrated model of patient and staff satisfaction, the effective satisfaction level model, is then proposed (using queuing theory). This links patient satisfaction, waiting time, staff satisfaction, and service time, connecting two important concepts, namely, experience and efficiency in care delivery and leading to a more holistic approach in designing and managing health services. The proposed model will enable healthcare systems analysts to objectively and directly relate elements of service quality to capacity planning. Moreover, as an instrument used jointly by healthcare commissioners and providers, it affords the prospect of better resource allocation. PMID:27170899

  13. Application of Queuing Analytic Theory to Decrease Waiting Times in Emergency Department: Does it Make Sense?

    PubMed Central

    Alavi-Moghaddam, Mostafa; Forouzanfar, Reza; Alamdari, Shahram; Shahrami, Ali; Kariman, Hamid; Amini, Afshin; Pourbabaee, Shokooh; Shirvani, Armin

    2012-01-01

    Background Patients who receive care in an emergency department (ED) are usually unattended while waiting in queues. Objectives This study was done to determine whether the application of queuing theory analysis might shorten the waiting times of patients admitted to emergency wards. Patients and Methods This was an operational study using queuing theory analysis in the ED. In the first phase, a field study was conducted to delineate the performance of the ED and enter the data obtained into simulator software. In the second phase, "ARENA" software was used for modeling, analysis, creating a simulation and improving the movement of patients in the ED. Validity of the model was confirmed through comparison of the results with the real data using the same instrument. The third phase of the study concerned modeling in order to assess the effect of various operational strategies on the queue waiting time of patients who were receiving care in the ED. Results In the first phase, it was shown that 47.7% of the 3000 patient records were cases referred for trauma treatment, and the remaining 52.3% were referred for non-trauma services. A total of 56% of the cases were male and 44% female. Maximum input was 4.5 patients per hour and the minimum input was 0.5 per hour. The average length of stay for patients in the trauma section was three hours, while for the non-trauma section it was four hours. In the second phase, modeling was tested with common scenarios. In the third phase, the scenario with the addition of one or more senior emergency resident(s) on each shift resulted in a decreased length of stay from 4 to 3.75 hours. Moreover, the addition of one bed to the Intensive Care Unit (ICU) and/or Critical Care Unit (CCU) in the study hospital reduced the occupancy rate of the nursing service from 76% to 67%. By adding another clerk to take electrocardiograms (ECG) in the ED, the average time from a request to performing the procedure is reduced from 26 to 18 minutes

  14. Application of queuing theory to patient satisfaction at a tertiary hospital in Nigeria

    PubMed Central

    Ameh, Nkeiruka; Sabo, B.; Oyefabi, M. O.

    2013-01-01

    Background: Queuing theory is the mathematical approach to the analysis of waiting lines in any setting where the arrival rate of subjects is faster than the system can handle. It is applicable to healthcare settings where the systems have excess capacity to accommodate random variations. Materials and Methods: A cross-sectional descriptive survey was done. Questionnaires were administered to patients who attended the general outpatient department. Observations were also made on the queuing model and the service discipline at the clinic. Questions were meant to obtain demographic characteristics and the time spent on the queue by patients before being seen by a doctor, time spent with the doctor, their views about the time spent on the queue and useful suggestions on how to reduce the time spent on the queue. A total of 210 patients were surveyed. Results: The majority of the patients (164, 78.1%) spent 2 h or less on the queue before being seen by a doctor and less than 1 h to see the doctor. The majority of the patients (144, 68.5%) were satisfied with the time they spent on the queue before being seen by a doctor. Useful suggestions proffered by the patients to decrease the time spent on the queue before seeing a doctor at the clinic included: that more doctors be employed (46, 21.9%), that doctors should come to work on time (25, 11.9%), that first-come-first-served be observed strictly (32, 15.2%); others suggested that the records staff should desist from collecting bribes from patients in order to place their cards before others. The queuing method employed at the clinic is the multiple single-channel type and the service discipline is priority service. The patients who spent less time on the queue (<1 h) before seeing the doctor were more satisfied than those who spent more time (P < 0.05). Conclusion: The study has revealed that the majority of the patients were satisfied with the practice at the general outpatient department. However, there is a need to employ

  15. Spreadsheet Analysis Of Queuing In A Computer Network

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1992-01-01

    Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.

  16. Effects of diversity and procrastination in priority queuing theory: the different power law regimes.

    PubMed

    Saichev, A; Sornette, D

    2010-01-01

    Empirical analyses show that after the update of a browser, or the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law approximately 1/t(alpha) with 0 < alpha <= 1 over a time scale of years. We present a simple model for this persistence phenomenon, framed within the standard priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a "time deficit" control parameter beta and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail approximately 1/t(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t(alpha), with alpha an element of (0.5, infinity), including the well-known case 1/t. We also study the effect of "procrastination," defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence.

  17. Effects of diversity and procrastination in priority queuing theory: The different power law regimes

    NASA Astrophysics Data System (ADS)

    Saichev, A.; Sornette, D.

    2010-01-01

    Empirical analyses show that after the update of a browser, or the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^α with 0<α≤1 over a time scale of years. We present a simple model for this persistence phenomenon, framed within the standard priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a “time deficit” control parameter β and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail ~1/t^(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t^α, with α∈(0.5,∞), including the well-known case 1/t. We also study the effect of “procrastination,” defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence.
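
    A quick way to see where the 1/t^(1/2) tail comes from is to simulate first-passage times to zero of a critical random walk, the discrete stand-in for the Wiener process mentioned above; the survival probability should track c/sqrt(t). This is only a numerical illustration of that single ingredient, not of the full priority-queue model with the time-deficit parameter.

    ```python
    import random

    def first_passage_steps(rng, cap=10**5):
        """Steps until a symmetric +/-1 random walk started at 1 first hits 0 (capped)."""
        x, n = 1, 0
        while x > 0 and n < cap:
            x += 1 if rng.random() < 0.5 else -1
            n += 1
        return n

    rng = random.Random(0)
    samples = [first_passage_steps(rng) for _ in range(10000)]
    for t in (10, 100, 1000, 10000):
        tail = sum(s > t for s in samples) / len(samples)
        # For this walk, P(T > t) ~ sqrt(2 / (pi * t)) ~ 0.80 / sqrt(t)
        print(f"P(T > {t:>5}) = {tail:.3f}   vs   0.80/sqrt(t) = {0.80 / t**0.5:.3f}")
    ```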

  18. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    SciTech Connect

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

    2015-03-10

    Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to an instantaneous increase or decrease of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server threshold-based infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.
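
    The paper's contribution is an analytic method for the steady-state probabilities of such a hysteresis queue; as a complementary, purely illustrative sketch, the loop below shows the qualitative behaviour being modelled: two queue-length thresholds with a dead zone between them, plus a setup delay, so the pool of active servers does not chase instantaneous load changes. All thresholds and the load trace are invented.

    ```python
    def scale_decision(queue_len, active, pending, up_at, down_at, min_srv, max_srv):
        """Hysteresis rule: add a server above up_at, remove one below down_at,
        do nothing in between (and while another server is still being set up)."""
        if pending:
            return 0
        if queue_len >= up_at and active < max_srv:
            return +1
        if queue_len <= down_at and active > min_srv:
            return -1
        return 0

    trace = [2, 5, 9, 14, 18, 16, 12, 7, 4, 2, 1]   # toy queue-length observations
    active, pending, setup_left = 2, 0, 0
    for q in trace:
        if pending and setup_left == 0:
            active, pending = active + pending, 0    # setup finished: server comes online
        d = scale_decision(q, active, pending, up_at=12, down_at=3, min_srv=1, max_srv=6)
        if d > 0:
            pending, setup_left = 1, 2               # non-instantaneous activation: 2 ticks
        elif d < 0:
            active -= 1
        setup_left = max(0, setup_left - 1)
        print(f"queue={q:>2}  active={active}  pending={pending}")
    ```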

  19. Improving queuing service at McDonald's

    NASA Astrophysics Data System (ADS)

    Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.

    2014-07-01

    Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract a good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing theory in Que is based upon the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results are presented.

  20. Using Queuing Theory and Simulation Modelling to Reduce Waiting Times in An Iranian Emergency Department

    PubMed Central

    Haghighinejad, Hourvash Akbari; Kharazmi, Erfan; Hatam, Nahid; Yousefi, Sedigheh; Hesami, Seyed Ali; Danaei, Mina; Askarian, Mehrdad

    2016-01-01

    Background: Hospital emergencies have an essential role in health care systems. In the last decade, developed countries have paid great attention to the overcrowding crisis in emergency departments. Simulation analysis of complex models whose conditions change over time is much more effective than analytical solutions, and the emergency department (ED) is one of the most complex models for analysis. This study aimed to determine the number of patients waiting and the waiting time in emergency department services in an Iranian hospital ED and to propose scenarios to reduce its queue and waiting time. Methods: This is a cross-sectional study in which simulation software (Arena, version 14) was used. The input information was extracted from the hospital database as well as through sampling. The objective was to evaluate the response variables of waiting time, number waiting and utilization of each server and to test the three scenarios to improve them. Results: Running the models for 30 days revealed that a total of 4088 patients left the ED after being served and 1238 patients waited in the queue for admission to the ED bed area at the end of the run (these patients actually received services beyond the defined capacity). The first scenario showed that the number of beds had to be increased from 81 to 179 for the number waiting at the "bed area" server to become almost zero. The second scenario, which attempted to limit hospitalization time in the ED bed area to the third quartile of the serving time distribution, could decrease the number waiting to 586 patients. Conclusion: Doubling the bed capacity in the emergency department, and scaling other resources and capacity appropriately, can solve the problem. This includes bed capacity requirements for both critically ill and less critically ill patients. Classification of ED internal sections based on severity of illness instead of medical specialty is another solution. PMID:26793727

  1. A soft computing-based approach to optimise queuing-inventory control problem

    NASA Astrophysics Data System (ADS)

    Alaghebandha, Mohammad; Hajipour, Vahid

    2015-04-01

    In this paper, a multi-product continuous review inventory control problem within a batch arrival queuing approach (MQr/M/1) is developed to find the optimal quantities of maximum inventory. The objective function is to minimise the sum of ordering, holding and shortage costs under warehouse space, service level and expected lost-sales shortage cost constraints from the retailer and warehouse viewpoints. Since the proposed model is NP-hard, an efficient imperialist competitive algorithm (ICA) is proposed to solve the model. To justify the proposed ICA, both a genetic algorithm and a simulated annealing algorithm are utilised. In order to determine the values of the algorithm parameters that result in a better solution, a fine-tuning procedure is executed. Finally, the performance of the proposed ICA is analysed using some numerical illustrations.

  2. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    PubMed Central

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in a large-scale disaster-affected area. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  3. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    PubMed

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in a large-scale disaster-affected area. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.

  4. Using queuing theory to analyse the government's 4-H completion time target in accident and emergency departments.

    PubMed

    Mayhew, L; Smith, D

    2008-03-01

    This paper uses a queuing model to evaluate completion times in Accident and Emergency (A&E) departments in the light of the Government target of completing and discharging 98% of patients inside 4 h. It illustrates how flows through an A&E can be accurately represented as a queuing process, how outputs can be used to visualise and interpret the 4-h Government target in a simple way and how the model can be used to assess the practical achievability of A&E targets in the future. The paper finds that A&E targets have resulted in significant improvements in completion times and thus deal with a major source of complaint by users of the National Health Service in the U.K. It suggests that whilst some of this improvement is attributable to better management, some is also due to the way some patients in A&E are designated and therefore counted through the system. It finds, for example, that the current target would not have been possible without some form of patient re-designation or re-labelling taking place. Further, it finds that the current target is so demanding that the integrity of reported performance is open to question. Related incentives and demand management issues resulting from the target are also briefly discussed. PMID:18390164
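
    The abstract does not give the authors' queuing formulation, but the flavour of the calculation can be shown with the simplest possible surrogate: in an M/M/1 queue the time in the system is exponential with rate mu - lambda, so the fraction completed within a target time follows directly. The arrival and service rates below are invented.

    ```python
    import math

    def fraction_completed_within(target_hours, arrivals_per_hour, service_per_hour):
        """P(time in system <= target) for an M/M/1 queue (sojourn time ~ Exp(mu - lambda))."""
        if arrivals_per_hour >= service_per_hour:
            raise ValueError("unstable queue")
        return 1 - math.exp(-(service_per_hour - arrivals_per_hour) * target_hours)

    # With one patient per hour of spare capacity, roughly 98% finish within 4 h:
    print(f"{fraction_completed_within(4, 9.0, 10.0):.1%}")
    # Conversely, hitting 98% needs mu - lambda >= -ln(0.02)/4 ~ 0.98 patients/h of headroom.
    print(f"required headroom: {-math.log(0.02) / 4:.2f} patients/h")
    ```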

  5. Human Factors of Queuing: A Library Circulation Model.

    ERIC Educational Resources Information Center

    Mansfield, Jerry W.

    1981-01-01

    Classical queuing theories and their accompanying service facilities totally disregard the human factors in the name of efficiency. As library managers we need to be more responsive to human needs in the design of service points and make every effort to minimize queuing and queue frustration. Five references are listed. (Author/RAA)

  6. A queuing model for road traffic simulation

    SciTech Connect

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-03-10

    We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model, and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
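
    For reference, the generic product-form solution of an M/G/c/c state-dependent queue (the building block named above) can be coded in a few lines once a relative service-speed profile f(n) is chosen; the linear slowdown used below is a common textbook choice and is an assumption here, not the paper's density-flow variant.

    ```python
    import math

    def mgcc_state_dependent(lam, mean_service_alone, f):
        """Steady-state distribution of an M/G/c/c state-dependent queue.
        f[n-1] is the relative service speed with n customers present (f[0] == 1).
        Returns (probabilities p[0..c], blocking probability, throughput)."""
        c = len(f)
        weights, prod_f = [1.0], 1.0
        for n in range(1, c + 1):
            prod_f *= f[n - 1]
            weights.append((lam * mean_service_alone) ** n / (math.factorial(n) * prod_f))
        norm = sum(weights)
        p = [w / norm for w in weights]
        return p, p[c], lam * (1 - p[c])

    C = 10                                          # section capacity (assumed)
    f = [(C - n + 1) / C for n in range(1, C + 1)]  # linear slowdown as the section fills
    p, blocking, throughput = mgcc_state_dependent(lam=2.0, mean_service_alone=1.0, f=f)
    print(f"blocking = {blocking:.3f}, throughput = {throughput:.3f}")
    ```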

  7. Development of Markov Chain-Based Queuing Model and Wireless Infrastructure for EV to Smart Meter Communication in V2G

    NASA Astrophysics Data System (ADS)

    Santoshkumar; Udaykumar, R. Y.

    2015-04-01

    Electric vehicles (EVs) can be connected to the grid for power transaction. Vehicle-to-grid (V2G) operation supports the grid requirements and helps in maintaining the load demands. The grid control center (GCC), the aggregator and the EV are the three key entities in V2G communication. The GCC sends information about power requirements to the aggregator. The aggregator, after receiving the information from the GCC, sends it to the EVs. Based on this information, interested EV owners participate in power transaction with the grid. The aggregator facilitates the EVs by providing parking and charging slots. In this paper a queuing model for EVs connected to the grid and the development of a wireless infrastructure for EV to Smart Meter communication are proposed. The queuing model is developed and simulated. The path loss models for WiMAX are analyzed and compared. Also, the physical layer of the WiMAX protocol is modeled and simulated for EV to Smart Meter communication in V2G.

  8. Capacity Utilization Study for Aviation Security Cargo Inspection Queuing System

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E; Brumback, Daryl L

    2010-01-01

    In this paper, we conduct a performance evaluation study of an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and its response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.

  9. Modeling and simulation of M/M/c queuing pharmacy system with adjustable parameters

    NASA Astrophysics Data System (ADS)

    Rashida, A. R.; Fadzli, Mohammad; Ibrahim, Safwati; Goh, Siti Rohana

    2016-02-01

    This paper studies discrete event simulation (DES) as a computer-based modelling approach that imitates the real system of a pharmacy unit. M/M/c queuing theory is used to model and analyse the characteristics of the queuing system at the pharmacy unit of Hospital Tuanku Fauziah, Kangar in Perlis, Malaysia. The input of this model is based on statistical data collected over 20 working days in June 2014. Currently, the patient waiting time at the pharmacy unit is more than 15 minutes. The actual operation of the pharmacy unit is a mixed queuing server with an M/M/2 queuing model, where the pharmacists are the servers. The DES approach and the ProModel simulation software are used to simulate the queuing model and to propose improvements to the queuing system of this pharmacy. The waiting time for each server is analysed, and Counters 3 and 4 are found to have the highest waiting times, 16.98 and 16.73 minutes respectively. Three scenarios, M/M/3, M/M/4 and M/M/5, are simulated, and the waiting times of the actual and experimental queuing models are compared. The simulation results show that adding a server (pharmacist) reduces patient waiting time appreciably: average patient waiting time falls by almost 50% when one pharmacist is added to the counter. However, it is not necessary to fully utilize all counters because even though M/M/4 and M/M/5 produce further reductions in patient waiting time, they are ineffective since Counter 5 is rarely used.

  10. Application of queuing model in Dubai's busiest megaplex

    NASA Astrophysics Data System (ADS)

    Bhagchandani, Maneesha; Bajpai, Priti

    2013-09-01

    This paper provides a study and analysis of the extremely busy booking counters at the Megaplex in Dubai using a queuing model and simulation. Dubai is an emirate in the UAE with a multicultural population; the majority of the population is foreign born. Cinema is one of the major forms of entertainment. There are more than 13 megaplexes, each with a number of screens ranging from 3 to 22, screening movies in English, Arabic, Hindi and other languages. It has been observed that during weekends megaplexes attract large crowds, resulting in long queues at the booking counters. One of the busiest megaplexes was selected for the study. The queuing model fits well when tested against the real-time situation. The concepts of arrival rate, service rate, utilization rate, waiting time in the system and average number of people in the queue, using Little's Theorem and the M/M/s queuing model along with simulation software, have been used to suggest an empirical solution. The aim of the paper is twofold: to assess the present situation at the Megaplex and to give recommendations to optimize the use of booking counters.
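
    As a reminder of the arithmetic referred to above, Little's theorem (L = lambda * W) and the M/M/s utilization formula take one line each; the counter figures below are invented for illustration and are not the study's data.

    ```python
    arrivals_per_min = 3.2    # customers joining the ticket queue per minute (assumed)
    mean_wait_min = 7.5       # average time from joining the queue to reaching a counter (assumed)
    mean_queue_length = arrivals_per_min * mean_wait_min   # Little's theorem: L = lambda * W
    print(f"expected queue length: {mean_queue_length:.0f} customers")

    s, mean_service_min = 6, 1.5                            # s counters, avg service time (assumed)
    utilization = arrivals_per_min * mean_service_min / s   # rho = lambda / (s * mu)
    print(f"counter utilization: {utilization:.0%}")
    ```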

  11. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital

    PubMed Central

    Luo, Li; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, emergency patients (EPs) have the highest priority in the queuing system and thus general patients (GPs) have to wait for a long time. This leads to a low degree of satisfaction among patients as a whole. The aim of this study is to improve patients' satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in one, the urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model); in the other, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies. We compare the performance of the different strategies on the basis of the simulation results. The strategy in which patients have dynamic priorities for queuing decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting time and increases their satisfaction degrees. PMID:27547237

  12. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital.

    PubMed

    Luo, Li; Liu, Hangjiang; Liao, Huchang; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, emergency patients (EPs) have the highest priority in the queuing system and thus general patients (GPs) have to wait for a long time. This leads to a low degree of satisfaction among patients as a whole. The aim of this study is to improve patients' satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in one, the urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model); in the other, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies. We compare the performance of the different strategies on the basis of the simulation results. The strategy in which patients have dynamic priorities for queuing decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting time and increases their satisfaction degrees.

  13. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital.

    PubMed

    Luo, Li; Liu, Hangjiang; Liao, Huchang; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, emergency patients (EPs) have the highest priority in the queuing system and thus general patients (GPs) have to wait for a long time. This leads to a low degree of satisfaction among patients as a whole. The aim of this study is to improve patients' satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in one, the urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model); in the other, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies. We compare the performance of the different strategies on the basis of the simulation results. The strategy in which patients have dynamic priorities for queuing decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting time and increases their satisfaction degrees. PMID:27547237
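
    The abstract above does not spell out the dynamic-priority rule; the fragment below shows one hypothetical way such a rule is often implemented, an aging scheme in which a patient's effective priority improves with waiting time so long-waiting general patients eventually overtake newly arrived urgent ones. The patient types, aging rate and example values are all assumptions.

    ```python
    def next_patient(waiting, now, aging_per_min=0.2):
        """Pick the next patient for the scanner (lower score is served first).
        Base score: 0 emergency, 1 urgent, 2 general; waiting time lowers the score."""
        def score(p):
            base = {"emergency": 0, "urgent": 1, "general": 2}[p["type"]]
            return base - aging_per_min * (now - p["arrival"])
        waiting.sort(key=score)
        return waiting.pop(0)

    queue = [
        {"id": "G1", "type": "general", "arrival": 10},
        {"id": "U1", "type": "urgent",  "arrival": 55},
        {"id": "G2", "type": "general", "arrival": 50},
    ]
    print(next_patient(queue, now=60)["id"])   # G1: has waited long enough to outrank U1
    ```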

  14. Queuing Models of Tertiary Storage

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1996-01-01

    Large scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSL's) for near-line access. A characteristic of RSL's is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSL's can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well-defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/

  15. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  16. Modeling Patient Flows Using a Queuing Network with Blocking

    PubMed Central

    KUNO, ERI; SMITH, TONY E.

    2015-01-01

    The downsizing and closing of state mental health institutions in Philadelphia in the 1990’s led to the development of a continuum care network of residential-based services. Although the diversity of care settings increased, congestion in facilities caused many patients to unnecessarily spend extra days in intensive facilities. This study applies a queuing network system with blocking to analyze such congestion processes. “Blocking” denotes situations where patients are turned away from accommodations to which they are referred, and are thus forced to remain in their present facilities until space becomes available. Both mathematical and simulation results are presented and compared. Although queuing models have been used in numerous healthcare studies, the inclusion of blocking is still rare. We found that, in Philadelphia, the shortage of a particular type of facilities may have created “upstream blocking”. Thus removal of such facility-specific bottlenecks may be the most efficient way to reduce congestion in the system as a whole. PMID:15782512

  17. Improving hospital bed occupancy and resource utilization through queuing modeling and evolutionary computation.

    PubMed

    Belciug, Smaranda; Gorunescu, Florin

    2015-02-01

    Scarce healthcare resources require carefully made policies ensuring optimal bed allocation, quality healthcare service, and adequate financial support. This paper proposes a complex analysis of resource allocation in a hospital department by integrating in the same framework a queuing system, a compartmental model, and an evolutionary-based optimization. The queuing system shapes the flow of patients through the hospital, the compartmental model offers a feasible structure of the hospital department in accordance with the queuing characteristics, and the evolutionary paradigm provides the means to optimize bed-occupancy management and resource utilization using a genetic algorithm approach. The paper also focuses on a "What-if analysis", providing a flexible tool to explore the effects on the outcomes of the queuing system and resource utilization through systematic changes in the input parameters. The methodology was illustrated using a simulation based on real data collected from a geriatric department of a hospital in London, UK. In addition, the paper explores the possibility of adapting the methodology to different medical departments (surgery, stroke, and mental illness). Moreover, the paper also focuses on the practical use of the model from the healthcare point of view, by presenting a simulated application.

  18. Some queuing network models of computer systems

    NASA Technical Reports Server (NTRS)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.

  19. Case study: applying management policies to manage distributed queuing systems

    NASA Astrophysics Data System (ADS)

    Neumair, Bernhard; Wies, René

    1996-06-01

    The increasing deployment of workstations and high performance endsystems in addition to the operation of mainframe computers leads to a situation where many companies can no longer afford for their expensive workstations to run idle for long hours during the night or with little load during daytime. Distributed queuing systems and batch systems (DQSs) provide an efficient basis to make use of these unexploited resources and allow corporations to replace expensive supercomputers with clustered workstations running DQSs. To employ these innovative DQSs on a large scale, the management policies for scheduling jobs, configuring queues, etc must be integrated in the overall management process for the IT infrastructure. For this purpose, the concepts of application management and management policies are introduced and discussed. The definition, automatic transformation, and implementation of policies on management platforms to effectively manage DQSs will show that policy-based application management is already possible using the existing management functionality found in today's systems.

  20. Agent-Based Literacy Theory

    ERIC Educational Resources Information Center

    McEneaney, John E.

    2006-01-01

    The purpose of this theoretical essay is to explore the limits of traditional conceptualizations of reader and text and to propose a more general theory based on the concept of a literacy agent. The proposed theoretical perspective subsumes concepts from traditional theory and aims to account for literacy online. The agent-based literacy theory…

  1. Queuing network approach for building evacuation planning

    NASA Astrophysics Data System (ADS)

    Ishak, Nurhanis; Khalid, Ruzelan; Baten, Md. Azizul; Nawawi, Mohd. Kamal Mohd.

    2014-12-01

    The complex behavior of pedestrians in a limited space layout can explicitly be modeled using an M/G/C/C state-dependent queuing network. This paper implements the approach to study pedestrian flows through various corridors in a topological network. The best arrival rates and their impacts on the corridors' performance in terms of throughput, blocking probability, expected number of occupants in the system and expected travel time were first measured using the M/G/C/C analytical model. These best arrival rates were then fed to a Network Flow Programming model to find the best arrival rates to source corridors and the routes optimizing the network's total throughput. The analytical results were then validated using a simulation model. The various results of this study can be used to support current Standard Operating Procedures (SOP) to efficiently and safely evacuate people in emergency cases.

  2. NQS - NETWORK QUEUING SYSTEM, VERSION 2.0 (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Walter, H.

    1994-01-01

    The Network Queuing System, NQS, is a versatile batch and device queuing facility for a single Unix computer or a group of networked computers. With the Unix operating system as a common interface, the user can invoke the NQS collection of user-space programs to move batch and device jobs freely around the different computer hardware tied into the network. NQS provides facilities for remote queuing, request routing, remote status, queue status controls, batch request resource quota limits, and remote output return. This program was developed as part of an effort aimed at tying together diverse UNIX based machines into NASA's Numerical Aerodynamic Simulator Processing System Network. This revision of NQS allows for creating, deleting, adding and setting of complexes that aid in limiting the number of requests to be handled at one time. It also has improved device-oriented queues along with some revision of the displays. NQS was designed to meet the following goals: 1) Provide for the full support of both batch and device requests. 2) Support all of the resource quotas enforceable by the underlying UNIX kernel implementation that are relevant to any particular batch request and its corresponding batch queue. 3) Support remote queuing and routing of batch and device requests throughout the NQS network. 4) Support queue access restrictions through user and group access lists for all queues. 5) Enable networked output return of both output and error files to possibly remote machines. 6) Allow mapping of accounts across machine boundaries. 7) Provide friendly configuration and modification mechanisms for each installation. 8) Support status operations across the network, without requiring a user to log in on remote target machines. 9) Provide for file staging or copying of files for movement to the actual execution machine. To support batch and device requests, NQS v.2 implements three queue types--batch, device and pipe. Batch queues hold and prioritize batch requests

  3. A queueing theory based model for business continuity in hospitals.

    PubMed

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

    2013-01-01

    Clinical activities can be seen as the result of a precise and defined succession of events, where every single phase is characterized by a waiting time which includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load only is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used for evaluating the possible interventions and protecting the whole system from technology failures. This paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queuing theory model, for defining the proper number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals. PMID:24109839

  4. A queueing theory based model for business continuity in hospitals.

    PubMed

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

    2013-01-01

    Clinical activities can be seen as the result of a precise and defined succession of events, where every single phase is characterized by a waiting time which includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load only is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used for evaluating the possible interventions and protecting the whole system from technology failures. This paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queuing theory model, for defining the proper number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals.

  5. An application of a queuing model for sea states

    NASA Astrophysics Data System (ADS)

    Loffredo, L.; Monbaliu, J.; Anderson, C.

    2012-04-01

    Unimodal approaches in design practice have shown inconsistencies in terms of directionality and limitations for accurate sea state description. Spectral multimodality needs to be included in the description of the wave climate. It can provide information about the coexistence of different wave systems originating from different meteorological events, such as locally generated wind waves and swell systems from distant storms. A 20-year dataset (1989-2008) for a location in the North Sea (K13, 53.2°N 3.2°E) has been retrieved from the ECMWF ERA-Interim re-analysis data archive, providing a consistent and homogeneous dataset. The work focuses on the joint and conditional probability distributions of wind sea and swell systems. For marine operations and design applications, critical combinations of wave systems may exist. We define a critical sea state on the basis of a set of thresholds, which are not necessarily extreme; the emphasis is on dangerous combinations of different wave systems for certain operations (i.e. small-vessel navigation, dredging). The distribution of non-operability windows is described by a point process model with random and independent events, whose occurrences and lengths can be described only probabilistically. These characteristics allow the emerging patterns to be treated as part of a queuing system. According to this theory, generally adopted for several applications including traffic flows and waiting lines, the input process describes the sequence of requests for a service and the service mechanism the length of time that these requests will occupy the facilities. For weather-driven processes at sea, an alternating renewal process appears to be a suitable model. It consists of a sequence of critical events (periods of inoperability), each of random duration, separated by calms, also of random duration. Inoperability periods and calms are assumed independent. In this model it is not possible more than one critical
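
    The long-run operability implied by an alternating renewal model follows from the renewal-reward theorem: E[calm] / (E[calm] + E[inoperable]). The check below uses exponential durations with invented means, not distributions fitted to the K13 record.

    ```python
    import random

    random.seed(0)
    mean_calm_h, mean_crit_h = 60.0, 9.0   # assumed mean calm / inoperability durations (hours)
    operable = total = 0.0
    for _ in range(20000):                 # 20000 calm/critical cycles
        calm = random.expovariate(1 / mean_calm_h)
        crit = random.expovariate(1 / mean_crit_h)
        operable += calm
        total += calm + crit
    theory = mean_calm_h / (mean_calm_h + mean_crit_h)
    print(f"simulated operability {operable / total:.3f}  vs  theoretical {theory:.3f}")
    ```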

  6. Using multi-class queuing network to solve performance models of e-business sites.

    PubMed

    Zheng, Xiao-ying; Chen, De-ren

    2004-01-01

    Due to e-business's variety of customers with different navigational patterns and demands, a multi-class queuing network is a natural performance model for it. Open multi-class queuing network (QN) models are based on the assumption that no service center is saturated as a result of the combined loads of all the classes. Several formulas are used to calculate performance measures, including throughput, residence time, queue length, response time and the average number of requests. The solution technique for closed multi-class QN models is an approximate mean value analysis (MVA) algorithm based on three key equations, because the exact algorithm has huge time and space requirements. Since mixed multi-class QN models include both open and closed classes, the open classes should be eliminated to create a closed multi-class QN so that the closed-model algorithm can be applied. Corresponding examples are given to show how to apply the algorithms mentioned in this article. These examples indicate that a multi-class QN is a reasonably accurate model of e-business sites and can be solved efficiently.
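
    The closed-model solution mentioned above is an approximate multi-class MVA; as a simpler, hedged sketch of the same family of algorithms, the code below implements exact single-class mean value analysis for a closed product-form network. The service demands for the three tiers are hypothetical.

```python
def mva_single_class(demands, n_customers):
    """Exact MVA for a closed, single-class product-form queuing network.
    demands[k] = service demand at center k; returns (throughput, residence times).
    Assumes n_customers >= 1."""
    queue = [0.0] * len(demands)          # mean queue lengths, empty network
    for n in range(1, n_customers + 1):
        residence = [d * (1.0 + q) for d, q in zip(demands, queue)]
        throughput = n / sum(residence)
        queue = [throughput * r for r in residence]
    return throughput, residence

if __name__ == "__main__":
    # hypothetical e-business site: web server, application server, database
    X, R = mva_single_class(demands=[0.008, 0.020, 0.015], n_customers=50)
    print("throughput:", X, "response time:", sum(R))
```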

  7. Modified weighted fair queuing for packet scheduling in mobile WiMAX networks

    NASA Astrophysics Data System (ADS)

    Satrya, Gandeva B.; Brotoharsono, Tri

    2013-03-01

    The increase in user mobility and the need for data access anytime have also increased interest in broadband wireless access (BWA). The best available quality of experience for mobile data service users is assured for IEEE 802.16e-based users. The main problem in assuring a high QoS value is how to allocate available resources among users in order to meet the QoS requirements for criteria such as delay, throughput, packet loss and fairness. No specific standard scheduling mechanism is stated by the IEEE standards, which leaves it open for implementer differentiation. There are five QoS service classes defined by IEEE 802.16: Unsolicited Grant Scheme (UGS), Extended Real Time Polling Service (ertPS), Real Time Polling Service (rtPS), Non Real Time Polling Service (nrtPS) and Best Effort Service (BE). Each class has different QoS parameter requirements for throughput and delay/jitter constraints. This paper proposes a Modified Weighted Fair Queuing (MWFQ) scheduling scenario based on Weighted Round Robin (WRR) and Weighted Fair Queuing (WFQ). The performance of MWFQ was assessed using the above QoS criteria. The simulation shows that using the concept of total packet size calculation improves the network's performance.
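
    The MWFQ scheduler itself is not specified in the abstract; as a hedged baseline only, the sketch below shows the classic weighted fair queuing discipline it builds on, serving packets in order of their virtual finish times. The flow weights and packet sizes are invented for illustration.

```python
import heapq

class WeightedFairQueue:
    """Minimal weighted fair queuing: packets are served in increasing virtual
    finish time, giving each flow bandwidth roughly proportional to its weight."""
    def __init__(self, weights):
        self.weights = weights                 # flow id -> weight
        self.finish = {f: 0.0 for f in weights}
        self.heap = []                         # (virtual finish time, seq, flow, size)
        self.seq = 0

    def enqueue(self, flow, size):
        # simplification: uses the flow's last finish time only and ignores
        # the system-wide virtual clock of full-fledged WFQ
        start = self.finish[flow]
        self.finish[flow] = start + size / self.weights[flow]
        heapq.heappush(self.heap, (self.finish[flow], self.seq, flow, size))
        self.seq += 1

    def dequeue(self):
        _, _, flow, size = heapq.heappop(self.heap)
        return flow, size

if __name__ == "__main__":
    wfq = WeightedFairQueue({"rtPS": 4, "nrtPS": 2, "BE": 1})
    for flow in ["BE", "rtPS", "nrtPS", "rtPS", "BE"]:
        wfq.enqueue(flow, size=1500)           # bytes
    while wfq.heap:
        print(wfq.dequeue())
```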

  8. Based on Regular Expression Matching of Evaluation of the Task Performance in WSN: A Queue Theory Approach

    PubMed Central

    Cui, Kai; Zhou, Kuanjiu; Yu, Yanshuo

    2014-01-01

    Due to the limited resources of wireless sensor networks, the low efficiency of real-time communication scheduling, poor safety, and other defects, a queuing performance evaluation approach based on regular expression matching is proposed; the method consists of a matching preprocessing phase, a validation phase, and a queuing-model performance evaluation phase. Firstly, a subset of related sequences is generated in the preprocessing phase, guiding the distributed matching in the validation phase. Secondly, in the validation phase, the feature subsets are clustered and a compressed matching table makes distributed parallel matching more convenient. Finally, based on the queuing model, the dynamic task-scheduling performance of the sensor network is evaluated. Experiments show that the approach ensures accurate matching and a computational efficiency of more than 70%; it not only effectively detects data packets and access control, but also uses the queuing method to determine the task-scheduling parameters in wireless sensor networks. The method has good applicability for medium-scale or large-scale distributed wireless nodes. PMID:25401151

  9. NAS Requirements Checklist for Job Queuing/Scheduling Software

    NASA Technical Reports Server (NTRS)

    Jones, James Patton

    1996-01-01

    The increasing reliability of parallel systems and clusters of computers has resulted in these systems becoming more attractive for true production workloads. Today, the primary obstacle to production use of clusters of computers is the lack of a functional and robust Job Management System for parallel applications. This document provides a checklist of NAS requirements for job queuing and scheduling in order to make the most efficient use of parallel systems and clusters for parallel applications. Future requirements are also identified to assist software vendors with design planning.

  10. Theory-Based Evaluation: Reflections Ten Years On. Theory-Based Evaluation: Past, Present, and Future

    ERIC Educational Resources Information Center

    Rogers, Patricia J.; Weiss, Carol H.

    2007-01-01

    This chapter begins with a brief introduction by Rogers, in which she highlights the continued salience of Carol Weiss's decade-old questions about theory-based evaluation. Theory-based evaluation has developed significantly since Carol Weiss's chapter was first published ten years ago. In 1997 Weiss pointed to theory-based evaluation being mostly…

  11. A message-queuing framework for STAR's online monitoring and metadata collection

    SciTech Connect

    Arkhipkin, D.; Lauret, J.; Betts, W.

    2011-12-23

    We report our experience migrating STAR's Online Services (Run Control System, Data Acquisition System, Slow Control System and Subsystem Monitoring) from direct read/write database accesses to a modern non-blocking message-oriented infrastructure. Based on the Advanced Message Queuing Protocol (AMQP) standard, this novel approach does not specify the message data structure, allowing great flexibility in its use. After careful consideration, we chose Google Protocol Buffers as our primary (de)serialization format for structured data exchange. This migration allows us to reduce the overall system complexity and greatly improve the reliability of the metadata collection and the performance of our online services in general. We present this new framework through an overview of its software architecture, providing details about our staged and non-disruptive migration process as well as details of the implementation of pluggable components to provide future improvements without compromising the stability and availability of services.
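
    As a generic illustration of message-oriented metadata collection over AMQP (not the STAR framework's actual code), the snippet below publishes a small monitoring record to a broker with the pika client. The queue name and payload fields are hypothetical, and the real system serializes structured data with Google Protocol Buffers rather than the JSON stand-in used here.

```python
import json
import pika  # AMQP 0-9-1 client

def publish_metadata(record, queue="star.monitoring", host="localhost"):
    """Publish one monitoring record to an AMQP broker (hypothetical queue name)."""
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=host))
    channel = connection.channel()
    channel.queue_declare(queue=queue, durable=True)
    channel.basic_publish(exchange="", routing_key=queue,
                          body=json.dumps(record).encode("utf-8"))
    connection.close()

if __name__ == "__main__":
    publish_metadata({"subsystem": "slow_control", "voltage": 1450.2, "run": 12345678})
```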

  12. Final Report for "Queuing Network Models of Performance of High End Computing Systems"

    SciTech Connect

    Buckwalter, J

    2005-09-28

    The primary objective of this project is to perform general research into queuing network models of performance of high end computing systems. A related objective is to investigate and predict how an increase in the number of nodes of a supercomputer will decrease the running time of a user's software package, which is often referred to as the strong scaling problem. We investigate the large, MPI-based Linux cluster MCR at LLNL, running the well-known NAS Parallel Benchmark (NPB) applications. Data is collected directly from NPB and also from the low-overhead LLNL profiling tool mpiP. For a run, we break the wall clock execution time of the benchmark into four components: switch delay, MPI contention time, MPI service time, and non-MPI computation time. Switch delay is estimated from message statistics. MPI service time and non-MPI computation time are calculated directly from measurement data. MPI contention is estimated by means of a queuing network model (QNM), based in part on MPI service time. This model of execution time validates reasonably well against the measured execution time, usually within 10%. Since the number of nodes used to run the application is a major input to the model, we can use the model to predict application execution times for various numbers of nodes. We also investigate how the four components of execution time scale individually as the number of nodes increases. Switch delay and MPI service time scale regularly. MPI contention is estimated by the QNM submodel and also has a fairly regular pattern. However, non-MPI compute time has a somewhat irregular pattern, possibly due to caching effects in the memory hierarchy. In contrast to some other performance modeling methods, this method is relatively fast to set up, fast to calculate, simple for data collection, and yet accurate enough to be quite useful.
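
    As a toy sketch of the decomposition described above (not the report's validated QNM), the function below adds the four components, estimating the MPI contention term with a simple M/M/1-style waiting factor driven by the measured MPI service time and an assumed network utilization; all inputs are illustrative.

```python
def predicted_runtime(compute, mpi_service, switch_delay, utilization):
    """Wall-clock estimate = non-MPI compute + MPI service + switch delay + contention.
    Contention is modeled here with a crude M/M/1-style factor rho/(1-rho) applied
    to the MPI service time; the actual report uses a full queuing network model."""
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    contention = mpi_service * utilization / (1.0 - utilization)
    return compute + mpi_service + switch_delay + contention

if __name__ == "__main__":
    # illustrative numbers (seconds) for one NPB-like run
    print(predicted_runtime(compute=120.0, mpi_service=30.0,
                            switch_delay=5.0, utilization=0.4))
```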

  13. Design and development of cell queuing, processing, and scheduling modules for the iPOINT input-buffered ATM testbed

    NASA Astrophysics Data System (ADS)

    Duan, Haoran

    1997-12-01

    This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to building high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other less-loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Using Field Programmable Gate Array (FPGA) devices, SRAM modules, and integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves the output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a

  14. Queuing model of a traffic bottleneck with bimodal arrival rate

    NASA Astrophysics Data System (ADS)

    Woelki, Marko

    2016-06-01

    This paper revisits the problem of tuning the density in a traffic bottleneck by reduction of the arrival rate when the queue length exceeds a certain threshold, studied recently for variants of totally asymmetric simple exclusion process (TASEP) and Burgers equation. In the present approach, a simple finite queuing system is considered and its contrasting “phase diagram” is derived. One can observe one jammed region, one low-density region and one region where the queue length is equilibrated around the threshold. Despite the simplicity of the model the physics is in accordance with the previous approach: The density is tuned at the threshold if the exit rate lies in between the two arrival rates.
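
    To make the mechanism concrete, here is a minimal discrete-event sketch (with assumed rates, not the paper's parameters) of a finite queue whose arrival rate drops from a high to a low value whenever the queue length exceeds a threshold, the regime in which the density equilibrates around that threshold.

```python
import random

def simulate_bottleneck(lam_high, lam_low, mu, threshold, capacity, steps, seed=0):
    """Finite queue with a bimodal arrival rate: lam_high below the threshold,
    lam_low above it. Returns the time-averaged queue length."""
    rng = random.Random(seed)
    q, t, area = 0, 0.0, 0.0
    for _ in range(steps):
        lam = lam_low if q > threshold else lam_high
        rate = lam + mu
        dt = rng.expovariate(rate)
        area += q * dt
        t += dt
        if rng.random() < lam / rate:      # arrival
            q = min(q + 1, capacity)
        elif q > 0:                        # departure
            q -= 1
    return area / t

if __name__ == "__main__":
    # exit rate between the two arrival rates -> density pinned near the threshold
    print(simulate_bottleneck(lam_high=1.2, lam_low=0.4, mu=0.8,
                              threshold=20, capacity=100, steps=200_000))
```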

  15. Evaluation of Job Queuing/Scheduling Software: Phase I Report

    NASA Technical Reports Server (NTRS)

    Jones, James Patton

    1996-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, the Numerical Aerodynamic Simulation (NAS) supercomputer facility compiled a requirements checklist for job queuing/scheduling software. Next, NAS began an evaluation of the leading job management system (JMS) software packages against the checklist. This report describes the three-phase evaluation process, and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still insufficient, even in the leading JMSs. However, by ranking each JMS evaluated against the requirements, we provide data that will be useful to other sites in selecting a JMS.

  16. Second Evaluation of Job Queuing/Scheduling Software. Phase 1

    NASA Technical Reports Server (NTRS)

    Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)

    1997-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation, and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.

  17. Adaptive neural models of queuing and timing in fluent action.

    PubMed

    Bullock, Daniel

    2004-09-01

    In biological cognition, specialized representations and associated control processes solve the temporal problems inherent in skilled action. Recent data and neural circuit models highlight three distinct levels of temporal structure: sequence preparation, velocity scaling, and state-sensitive timing. Short sequences of actions are prepared collectively in prefrontal cortex, then queued for performance by a cyclic competitive process that operates on a parallel analog representation. Successful acts like ball-catching depend on coordinated scaling of effector velocities, and velocity scaling, mediated by the basal ganglia, may be coupled to perceived time-to-contact. Making acts accurate at high speeds requires state-sensitive and precisely timed activations of muscle forces in patterns that accelerate and decelerate the effectors. The cerebellum may provide a maximally efficient representational basis for learning to generate such timed activation patterns.

  18. Jigsaw Cooperative Learning: Acid-Base Theories

    ERIC Educational Resources Information Center

    Tarhan, Leman; Sesen, Burcin Acar

    2012-01-01

    This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…

  19. Spectrally queued feature selection for robotic visual odometery

    NASA Astrophysics Data System (ADS)

    Pirozzo, David M.; Frederick, Philip A.; Hunt, Shawn; Theisen, Bernard; Del Rose, Mike

    2011-01-01

    Over the last two decades, research in Unmanned Vehicles (UV) has rapidly progressed and become more influenced by the field of biological sciences. Researchers have been investigating mechanical aspects of various species to improve UV air and ground intrinsic mobility; they have been exploring the computational aspects of the brain for the development of pattern recognition and decision algorithms; and they have been exploring the perception capabilities of numerous animals and insects. This paper describes a 3-month exploratory applied research effort performed at the US Army Research, Development and Engineering Command's (RDECOM) Tank Automotive Research, Development and Engineering Center (TARDEC) in the area of biologically inspired, spectrally augmented feature selection for robotic visual odometry. The motivation for this applied research was to develop a feasibility analysis of multi-spectrally queued feature selection, with improved temporal stability, for the purposes of visual odometry. The intended application is future semi-autonomous Unmanned Ground Vehicle (UGV) control, as the richness of data sets required to enable human-like behavior in these systems has yet to be defined.

  20. An Educational Theory Based on Game Theory. Preliminary Work.

    ERIC Educational Resources Information Center

    Chapman, Laura Hill

    Game theory may be a fruitful basis for educational theory. After describing games and related concepts like "utility," the author sets up examples of how games may be used as a basis for selecting teaching strategies. For instance, a teacher may decide to change many aspects of his teaching style, rather than just one at a time, if there is a…

  1. Assessing the Queuing Process Using Data Envelopment Analysis: an Application in Health Centres.

    PubMed

    Safdar, Komal A; Emrouznejad, Ali; Dey, Prasanta K

    2016-01-01

    Queuing is one of the very important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most studies of patient flow systems have had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor and pharmacy. The DEA results indicated that the waiting times and other related queuing variables need considerable minimisation at both stages.
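
    For readers unfamiliar with DEA, the sketch below solves the standard input-oriented CCR model for one decision-making unit with scipy's linear programming routine; the two-input, one-output data (waiting time and service time per stage versus patients served) are invented, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(inputs, outputs, unit):
    """Input-oriented CCR efficiency of one DMU via the multiplier LP formulation.
    inputs: (n_dmu, n_in) array, outputs: (n_dmu, n_out) array."""
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n_in, n_out = X.shape[1], Y.shape[1]
    c = np.concatenate([-Y[unit], np.zeros(n_in)])            # maximize u . y_o
    A_eq = np.concatenate([np.zeros(n_out), X[unit]])[None]   # v . x_o = 1
    A_ub = np.hstack([Y, -X])                                 # u . y_j - v . x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(X)),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")
    return -res.fun

if __name__ == "__main__":
    # hypothetical clinics: inputs = [mean wait (min), service time (min)], output = [patients/day]
    X = [[35, 12], [50, 10], [20, 15], [40, 9]]
    Y = [[180], [150], [160], [200]]
    for k in range(len(X)):
        print(k, round(dea_ccr_efficiency(X, Y, k), 3))
```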

  2. The Scope of Usage-Based Theory

    PubMed Central

    Ibbotson, Paul

    2013-01-01

    Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the “cognitive commitment” of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works. PMID:23658552

  3. The scope of usage-based theory.

    PubMed

    Ibbotson, Paul

    2013-01-01

    Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the "cognitive commitment" of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works.

  4. Evacuation time estimate for total pedestrian evacuation using a queuing network model and volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Kunwar, Bharat; Simini, Filippo; Johansson, Anders

    2016-02-01

    Estimating city evacuation time is a nontrivial problem due to the interaction between thousands of individual agents, giving rise to various collective phenomena, such as bottleneck formation, intermittent flow, and stop-and-go waves. We present a mean field approach to draw relationships between road network spatial attributes, the number of evacuees, and the resultant evacuation time estimate (ETE). Using volunteered geographic information, we divide 50 United Kingdom cities into a total of 704 catchment areas (CAs) which we define as an area where all agents share the same nearest exit node. 90% of the agents are within ≈6,847 m of CA exit nodes with ≈13,778 agents/CA. We establish a characteristic flow rate from catchment area attributes (population, distance to exit node, and exit node width) and a mean flow rate in a free-flow regime by simulating total evacuations using an agent based "queuing network" model. We use these variables to determine a relationship between catchment area attributes and resultant ETEs. This relationship could enable emergency planners to make a rapid appraisal of evacuation strategies and help support decisions in the run up to a crisis.
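
    As a back-of-the-envelope sketch of the kind of relationship the paper seeks (not its fitted mean-field model), the function below estimates an evacuation time for one catchment area as the free-flow walking time to the exit plus the time for the population to discharge through the exit. The default walking speed and specific flow, and the 10 m exit width, are assumptions; the population and distance mirror the averages quoted in the abstract only for scale.

```python
def evacuation_time_estimate(population, distance_m, exit_width_m,
                             walking_speed=1.34, specific_flow=1.3):
    """Crude ETE (seconds) for one catchment area: walking time to the exit node
    plus the time for the whole population to pass a bottleneck of the given width.
    walking_speed (m/s) and specific_flow (persons per metre per second) are assumed defaults."""
    walking_time = distance_m / walking_speed
    discharge_time = population / (specific_flow * exit_width_m)
    return walking_time + discharge_time

if __name__ == "__main__":
    # roughly the paper's average catchment scale: ~13,778 agents, ~6,847 m, 10 m exit (assumed width)
    print(evacuation_time_estimate(13778, 6847, 10.0) / 3600, "hours")
```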

  5. Impact of Mandatory HIV Screening in the Emergency Department: A Queuing Study.

    PubMed

    Liu, Nan; Stone, Patricia W; Schnall, Rebecca

    2016-04-01

    To improve HIV screening rates, New York State in 2010 mandated that all persons 13-64 years receiving health care services, including care in emergency departments (EDs), be offered HIV testing. Little attention has been paid to the effect of screening on patient flow. Time-stamped ED visit data from patients eligible for HIV screening, 7,844 of whom were seen by providers and 767 who left before being seen by providers, were retrieved from electronic health records in one adult ED. During day shifts, 10% of patients left without being seen, and during evening shifts, 5% left without being seen. All patients seen by providers were offered testing, and 6% were tested for HIV. Queuing models were developed to evaluate the effect of HIV screening on ED length of stay, patient waiting time, and rate of leaving without being seen. Base case analysis was conducted using actual testing rates, and sensitivity analyses were conducted to evaluate the impact of increasing the testing rate. Length of ED stay of patients who received HIV tests was 24 minutes longer on day shifts and 104 minutes longer on evening shifts than for patients not tested for HIV. Increases in HIV testing rate were estimated to increase waiting time for all patients, including those who left without being seen. Our simulation suggested that incorporating HIV testing into ED patient visits not only adds to practitioner workload but also increases patient waiting time significantly during busy shifts, which may increase the rate of leaving without being seen.
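
    As a hedged illustration of how a longer service time propagates into waiting time (a generic M/M/c calculation, not the study's fitted ED model), the snippet below compares expected waits with and without extra minutes added to the mean visit length; the arrival rate, staffing level, and baseline visit length are made up.

```python
import math

def mmc_mean_wait(lam, mu, c):
    """Mean time in queue (Wq) for an M/M/c system with arrival rate lam,
    per-server service rate mu, and c servers; requires lam < c*mu."""
    a = lam / mu
    rho = a / c
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                + a**c / (math.factorial(c) * (1 - rho)))
    lq = p0 * a**c * rho / (math.factorial(c) * (1 - rho)**2)
    return lq / lam

if __name__ == "__main__":
    lam = 4.0           # patients per hour (assumed)
    c = 4               # treatment spaces / providers (assumed)
    base_visit_h = 0.5  # 30-minute baseline visit (assumed)
    for extra_min in (0, 10, 24):
        mu = 1.0 / (base_visit_h + extra_min / 60.0)
        print(extra_min, "min extra ->", round(60 * mmc_mean_wait(lam, mu, c), 1), "min wait")
```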

  6. Modelling pedestrian travel time and the design of facilities: a queuing approach.

    PubMed

    Rahman, Khalidur; Ghani, Noraida Abdul; Kamil, Anton Abdulbasah; Mustafa, Adli; Kabir Chowdhury, Md Ahmed

    2013-01-01

    Pedestrian movements are the consequence of several complex and stochastic factors. The modelling of pedestrian movements and the ability to predict travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and level of service into pedestrian movement models. In this paper, a queuing-based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service that it is possible to provide the pedestrians. It also has the ability to provide clear suggestions on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility makes comfortable and convenient pedestrian movements possible. A sensitivity analysis is also performed to see the impact of some crucial parameters of the developed model on the performance of pedestrian facilities.

  7. A curriculum based on systems theory.

    PubMed

    Schemm, R L; Corcoran, M; Kolodner, E; Schaaf, R

    1993-07-01

    This paper describes an entry-level curriculum based on systems theory that was designed to promote integrated thinking and a shared image of practice among all of the members of an educational community that included students, faculty, and clinicians. Initiated in 1983, the program integrates occupational therapy theory, critical thinking, and knowledge about person-environmental transactions with traditional medical, biological, psychological, and sociological course work to create a unique educational experience. The curriculum model is based on a spiral learning process that encourages integrated thinking. Furthermore, all concepts are systematically tied to the occupation core, the central theme of the program. Fieldwork is used to reinforce ideas presented in the classroom and features discrete learning experiences where students demonstrate their integration of knowledge and skills. In an evaluation of the program, responses from 78 clinician, 51 alumni, and 132 student questionnaires; feedback from 132 fieldwork supervisors; and longitudinal data from 33 alumni confirmed that graduates are critical thinkers who appreciate the diverse needs of clients while demonstrating an appreciation for the curative effect of meaningful, goal-directed activities. PMID:8322883

  8. Streamflow Prediction based on Chaos Theory

    NASA Astrophysics Data System (ADS)

    Li, X.; Wang, X.; Babovic, V. M.

    2015-12-01

    Chaos theory is a popular method for hydrologic time series prediction. The local model (LM) based on this theory uses time-delay embedding to reconstruct the phase-space diagram. The efficacy of this method depends on the embedding parameters, i.e. the embedding dimension, time lag, and number of nearest neighbours. The optimal estimation of these parameters is thus critical to the application of the local model. However, these embedding parameters are conventionally estimated using Average Mutual Information (AMI) and False Nearest Neighbors (FNN) separately. This may lead to local optimization and thus limits the prediction accuracy. Considering these limitations, this paper applies a local model combined with simulated annealing (SA) to find the global optimum of the embedding parameters. It is also compared with another global optimization approach, the Genetic Algorithm (GA). These proposed hybrid methods are applied to daily and monthly streamflow time series for examination. The results show that global optimization can help the local model provide more accurate predictions than local optimization. The LM combined with SA shows further advantages in terms of computational efficiency. The proposed scheme can also be applied to other fields such as the prediction of hydro-climatic time series, error correction, etc.
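
    As a minimal sketch of the local-model idea described above (with fixed embedding parameters rather than the paper's SA/GA optimization), the code reconstructs a phase space by time-delay embedding and predicts the next value as the average one-step evolution of the nearest neighbours; the series here is synthetic.

```python
import numpy as np

def delay_vectors(x, dim, lag):
    """Matrix of delay vectors [x_t, x_{t-lag}, ..., x_{t-(dim-1)lag}]."""
    start = (dim - 1) * lag
    return np.column_stack([x[start - j * lag: len(x) - j * lag] for j in range(dim)])

def local_model_predict(x, dim=3, lag=2, k=5):
    """One-step-ahead prediction with a nearest-neighbour local model."""
    x = np.asarray(x, float)
    V = delay_vectors(x, dim, lag)          # V[i] is the state at time start + i
    query, history = V[-1], V[:-1]          # last state vs. all earlier states
    targets = x[(dim - 1) * lag + 1:]       # value one step after each earlier state
    dists = np.linalg.norm(history - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return targets[nearest].mean()

if __name__ == "__main__":
    t = np.arange(0, 60, 0.1)
    flow = 50 + 10 * np.sin(t) + 3 * np.sin(3.3 * t)   # synthetic "streamflow"
    print("forecast:", local_model_predict(flow[:-1]), "actual:", flow[-1])
```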

  9. Impact of Mandatory HIV Screening in the Emergency Department: A Queuing Study.

    PubMed

    Liu, Nan; Stone, Patricia W; Schnall, Rebecca

    2016-04-01

    To improve HIV screening rates, New York State in 2010 mandated that all persons 13-64 years receiving health care services, including care in emergency departments (EDs), be offered HIV testing. Little attention has been paid to the effect of screening on patient flow. Time-stamped ED visit data from patients eligible for HIV screening, 7,844 of whom were seen by providers and 767 who left before being seen by providers, were retrieved from electronic health records in one adult ED. During day shifts, 10% of patients left without being seen, and during evening shifts, 5% left without being seen. All patients seen by providers were offered testing, and 6% were tested for HIV. Queuing models were developed to evaluate the effect of HIV screening on ED length of stay, patient waiting time, and rate of leaving without being seen. Base case analysis was conducted using actual testing rates, and sensitivity analyses were conducted to evaluate the impact of increasing the testing rate. Length of ED stay of patients who received HIV tests was 24 minutes longer on day shifts and 104 minutes longer on evening shifts than for patients not tested for HIV. Increases in HIV testing rate were estimated to increase waiting time for all patients, including those who left without being seen. Our simulation suggested that incorporating HIV testing into ED patient visits not only adds to practitioner workload but also increases patient waiting time significantly during busy shifts, which may increase the rate of leaving without being seen. PMID:26829415

  10. Feature-Based Binding and Phase Theory

    ERIC Educational Resources Information Center

    Antonenko, Andrei

    2012-01-01

    Current theories of binding cannot provide a uniform account for many facts associated with the distribution of anaphors, such as long-distance binding effects and the subject-orientation of monomorphemic anaphors. Further, traditional binding theory is incompatible with minimalist assumptions. In this dissertation I propose an analysis of…

  11. State variable theories based on Hart's formulation

    SciTech Connect

    Korhonen, M.A.; Hannula, S.P.; Li, C.Y.

    1985-01-01

    In this paper a review of the development of a state variable theory for nonelastic deformation is given. The physical and phenomenological basis of the theory and the constitutive equations describing macroplastic, microplastic, anelastic and grain boundary sliding enhanced deformation are presented. The experimental and analytical evaluation of different parameters in the constitutive equations are described in detail followed by a review of the extensive experimental work on different materials. The technological aspects of the state variable approach are highlighted by examples of the simulative and predictive capabilities of the theory. Finally, a discussion of general capabilities, limitations and future developments of the theory and particularly the possible extensions to cover an even wider range of deformation or deformation-related phenomena is presented.

  12. Discrete-time Queuing Analysis of Opportunistic Spectrum Access: Single User Case

    NASA Astrophysics Data System (ADS)

    Wang, Jin-long; Xu, Yu-hua; Gao, Zhan; Wu, Qi-hui

    2011-11-01

    This article studies the discrete-time queuing dynamics of opportunistic spectrum access (OSA) systems, in which the secondary user seeks spectrum vacancies between bursty transmissions of the primary user to communicate. Since spectrum sensing and data transmission cannot be performed simultaneously, the secondary user employs a sensing-then-transmission strategy to detect the presence of the primary user before accessing the licensed channel. Consequently, the transmission of the secondary user is periodically suspended for spectrum sensing. To capture the discontinuous transmission nature of the secondary user, we introduce a discrete-time queuing model subject to bursty preemption to describe the behavior of the secondary user. Specifically, we derive some important metrics of the secondary user, including the secondary spectrum utilization ratio, buffer length, packet delay and packet dropping ratio. Finally, simulation results validate the proposed theoretical model and reveal that the theoretical results fit the simulated results well.
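
    To illustrate the sensing-then-transmission dynamics described above (a simplified sketch, not the paper's analytical model), the simulation below suspends the secondary user's service during periodic sensing frames and whenever the primary user is active, then reports the mean buffer length and packet-drop ratio for assumed parameters.

```python
import random

def simulate_osa_queue(frames=200_000, sense_every=10, p_arrival=0.3,
                       p_primary_on=0.25, buffer_size=50, seed=7):
    """Discrete-time queue of a secondary user: one packet can be served per frame,
    except in sensing frames or when the primary user occupies the channel."""
    rng = random.Random(seed)
    q = dropped = arrived = total_q = 0
    for frame in range(frames):
        if rng.random() < p_arrival:              # packet arrival at the secondary user
            arrived += 1
            if q < buffer_size:
                q += 1
            else:
                dropped += 1
        sensing = (frame % sense_every == 0)      # periodic suspension for sensing
        primary_busy = rng.random() < p_primary_on
        if q > 0 and not sensing and not primary_busy:
            q -= 1                                # successful secondary transmission
        total_q += q
    return total_q / frames, dropped / arrived

if __name__ == "__main__":
    mean_buffer, drop_ratio = simulate_osa_queue()
    print("mean buffer length:", round(mean_buffer, 2), "drop ratio:", round(drop_ratio, 4))
```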

  13. MODELING AND PERFORMANCE EVALUATION FOR AVIATION SECURITY CARGO INSPECTION QUEUING SYSTEM

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Rose, Terri A; Brumback, Daryl L

    2009-01-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we conduct a performance evaluation study for an aviation security cargo inspection queuing system for material flow and accountability. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, and throughput. These metrics are performance indicators of the system's ability to service current needs and response capacity to additional requests. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures will reduce the overall cost and shipping delays associated with the new inspection requirements.

  14. An Improved Call Admission Control Mechanism with Prioritized Handoff Queuing Scheme for BWA Networks

    NASA Astrophysics Data System (ADS)

    Chowdhury, Prasun; Saha Misra, Iti

    2014-10-01

    Nowadays, due to the increased demand for using Broadband Wireless Access (BWA) networks in a satisfactory manner, a promised Quality of Service (QoS) is required to manage the seamless transmission of heterogeneous handoff calls. To this end, this paper proposes an improved Call Admission Control (CAC) mechanism with a prioritized handoff queuing scheme that aims to reduce the dropping probability of handoff calls. Handoff calls are queued when no bandwidth is available, even after the allowable bandwidth degradation of the ongoing calls, and are admitted into the network, with a higher priority than newly originated calls, when an ongoing call is terminated. An analytical Markov model for the proposed CAC mechanism is developed to analyze various performance parameters. Analytical results show that the proposed CAC with the handoff queuing scheme prioritizes handoff calls effectively and reduces the dropping probability of the system by 78.57% for real-time traffic without increasing the number of failed new call attempts. This results in increased bandwidth utilization of the network.
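
    As a simplified sketch of the admission policy described above (a simulation stand-in for the paper's Markov model, with invented rates and without the bandwidth-degradation step), the code below admits new and handoff calls into a fixed pool of channels, queues handoff calls that find the system full, and serves the queue first whenever a channel is released, then reports the blocking and dropping ratios.

```python
import random

def simulate_cac(lam_new=0.8, lam_handoff=0.4, mu=0.05, channels=20,
                 queue_size=5, events=500_000, seed=3):
    """Gillespie-style simulation of a CAC policy: new calls are blocked when all
    channels are busy; handoff calls wait in a finite priority queue and take the
    next released channel. Returns (new-call blocking, handoff dropping) ratios."""
    rng = random.Random(seed)
    n = h = 0
    new_tot = new_blocked = ho_tot = ho_dropped = 0
    for _ in range(events):
        total = lam_new + lam_handoff + n * mu
        u = rng.random() * total
        if u < lam_new:                            # new call arrival
            new_tot += 1
            if n < channels:
                n += 1
            else:
                new_blocked += 1
        elif u < lam_new + lam_handoff:            # handoff arrival
            ho_tot += 1
            if n < channels:
                n += 1
            elif h < queue_size:
                h += 1                             # wait for a released channel
            else:
                ho_dropped += 1
        else:                                      # call completion
            n -= 1
            if h > 0:                              # queued handoff gets the channel
                h -= 1
                n += 1
    return new_blocked / new_tot, ho_dropped / ho_tot

if __name__ == "__main__":
    blocking, dropping = simulate_cac()
    print("new-call blocking:", round(blocking, 4), "handoff dropping:", round(dropping, 4))
```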

  15. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  16. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  17. Theory of fracture mechanics based upon plasticity

    NASA Technical Reports Server (NTRS)

    Lee, J. D.

    1976-01-01

    A theory of fracture mechanics is formulated on the foundation of continuum mechanics. Fracture surface is introduced as an unknown quantity and is incorporated into boundary and initial conditions. Surface energy is included in the global form of energy conservation law and the dissipative mechanism is formulated into constitutive equations which indicate the thermodynamic irreversibility and the irreversibility of fracture process as well.

  18. The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory

    ERIC Educational Resources Information Center

    Anil, Duygu

    2008-01-01

    In this study, the predictive power of item characteristics based on experts' predictions, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed according to classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…

  19. THEORY OF REGENERATION BASED ON MASS ACTION.

    PubMed

    Loeb, J

    1923-07-20

    1. The writer's older experiment, proving that equal masses of isolated sister leaves of Bryophyllum regenerate under equal conditions and in equal time equal masses (in dry weight) of shoots and roots, is confirmed. It is shown that in the dark this regeneration is reduced to a small fraction of that observed in light. 2. The writer's former observation is confirmed, that when a piece of stem inhibits or diminishes the regeneration in a leaf, the dry weight of the stem increases by as much or more than the weight by which the regeneration in the leaf is diminished. It is shown that this is also true when the axillary bud in the stem is removed or when the regeneration occurs in the dark. 3. These facts show that the regeneration of an isolated leaf of Bryophyllum is determined by the mass of material available or formed in the leaf during the experiment and that such a growth does not occur in a leaf connected with a normal plant for the reason that in the latter case the material available or formed in the leaf flows into the stem where it is consumed for normal growth. 4. It is shown that the sap sent out by a leaf in the descending current of a stem is capable of increasing also the rate of growth of shoots in the basal parts of the leaf when the sap has an opportunity to reach the anlagen for such shoots. 5. The fact that a defoliated piece of stem forms normally no shoots in its basal part therefore demands an explanation of the polar character of regeneration which lays no or less emphasis on the chemical difference between ascending and descending sap than does Sachs' theory of specific root- or shoot-forming substances (though such substances may in reality exist), but which uses as a basis the general mass relation as expressed in the first three statements of this summary. 6. It is suggested that the polar character of the regeneration in a stem of Bryophyllum is primarily due to the fact that the descending sap reaches normally only the root

  20. Theory of friction based on brittle fracture

    USGS Publications Warehouse

    Byerlee, J.D.

    1967-01-01

    A theory of friction is presented that may be more applicable to geologic materials than the classic Bowden and Tabor theory. In the model, surfaces touch at the peaks of asperities and sliding occurs when the asperities fail by brittle fracture. The coefficient of friction, μ, was calculated from the strength of asperities of certain ideal shapes; for cone-shaped asperities, μ is about 0.1 and for wedge-shaped asperities, μ is about 0.15. For actual situations which seem close to the ideal model, observed μ was found to be very close to 0.1, even for materials such as quartz and calcite with widely differing strengths. If surface forces are present, the theory predicts that μ should decrease with load and that it should be higher in a vacuum than in air. In the presence of a fluid film between sliding surfaces, μ should depend on the area of the surfaces in contact. Both effects are observed. The character of wear particles produced during sliding and the way in which μ depends on normal load, roughness, and environment lend further support to the model of friction presented here. © 1967 The American Institute of Physics.

  1. Graph-based linear scaling electronic structure theory.

    PubMed

    Niklasson, Anders M N; Mniszewski, Susan M; Negre, Christian F A; Cawkwell, Marc J; Swart, Pieter J; Mohd-Yusof, Jamal; Germann, Timothy C; Wall, Michael E; Bock, Nicolas; Rubensson, Emanuel H; Djidjev, Hristo

    2016-06-21

    We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations. PMID:27334148

  2. Task-Based Language Teaching and Expansive Learning Theory

    ERIC Educational Resources Information Center

    Robertson, Margaret

    2014-01-01

    Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…

  3. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  4. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  5. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  6. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  7. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 23 Highways 1 2011-04-01 2011-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  8. Unifying ecology and macroevolution with individual-based theory

    PubMed Central

    Rosindell, James; Harmon, Luke J; Etienne, Rampal S

    2015-01-01

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. PMID:25818618

  9. Unifying ecology and macroevolution with individual-based theory.

    PubMed

    Rosindell, James; Harmon, Luke J; Etienne, Rampal S

    2015-05-01

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology.

  10. Measurement Theory in Deutsch's Algorithm Based on the Truth Values

    NASA Astrophysics Data System (ADS)

    Nagata, Koji; Nakamura, Tadao

    2016-08-01

    We propose a new measurement theory for handling qubits, based on the truth values, i.e., the truth T (1) for true and the falsity F (0) for false. The results of measurement are either 0 or 1. To implement Deutsch's algorithm, we need both observability and controllability of a quantum state. The new measurement theory can satisfy these two requirements. In particular, we systematically describe our assertion based on more mathematical analysis using raw data in a thoughtful experiment.
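
    For context on the algorithm the record refers to, here is a textbook Deutsch's algorithm worked out with plain matrix algebra in numpy, using standard projective measurement rather than the authors' truth-value measurement theory: measuring the first qubit yields 0 for a constant oracle and 1 for a balanced one.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def oracle(f):
    """Unitary U_f |x>|y> = |x>|y XOR f(x)> for f: {0,1} -> {0,1}."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Return 0 if f is constant, 1 if f is balanced, using one oracle call."""
    state = np.kron(np.array([1, 0]), np.array([0, 1]))   # |0>|1>
    state = np.kron(H, H) @ state
    state = oracle(f) @ state
    state = np.kron(H, I) @ state
    p1 = np.sum(np.abs(state[2:]) ** 2)   # probability first qubit measures as 1
    return int(round(p1))

if __name__ == "__main__":
    print(deutsch(lambda x: 0))   # constant -> 0
    print(deutsch(lambda x: x))   # balanced -> 1
```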

  11. Aviation security cargo inspection queuing simulation model for material flow and accountability

    SciTech Connect

    Olama, Mohammed M; Allgood, Glenn O; Rose, Terri A; Brumback, Daryl L

    2009-01-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we develop an aviation security cargo inspection queuing simulation model for material flow and accountability that will allow cargo managers to conduct impact studies of current and proposed business practices as they relate to inspection procedures, material flow, and accountability.

  12. Theories about consensus-based conservation.

    PubMed

    Leach, William D

    2006-04-01

    "Conservation and the Myth of Consensus" (Peterson et al. 2005) levels several serious indictments against consensus-based approaches to environmental decision making. Namely, the authors argue that consensus processes (1) reinforce apathy and ignorance of conservation issues; (2) legitimize damage to the environment; (3) quash public debate about conservation; (4) solidify the existing balance of power in favor of prodevelopment forces; and (5) block progress toward an ecologically sustainable future. Careful scrutiny of consensus-based approaches is important, especially considering their surging use in conservation policy. In the spirit of advancing the debate further, I review some of the limitations of the essay and its modes of inquiry.

  13. Static analysis of rectangular nanoplates using trigonometric shear deformation theory based on nonlocal elasticity theory.

    PubMed

    Nami, Mohammad Rahim; Janghorban, Maziar

    2013-12-30

    In this article, a new higher order shear deformation theory based on trigonometric shear deformation theory is developed. In order to consider the size effects, the nonlocal elasticity theory is used. An analytical method is adopted to solve the governing equations for static analysis of simply supported nanoplates. In the present theory, the transverse shear stresses satisfy the traction free boundary conditions of the rectangular plates and these stresses can be calculated from the constitutive equations. The effects of different parameters such as nonlocal parameter and aspect ratio are investigated on both nondimensional deflections and deflection ratios. It may be important to mention that the present formulations are general and can be used for isotropic, orthotropic and anisotropic nanoplates.

  14. Elastic theory of origami-based metamaterials

    NASA Astrophysics Data System (ADS)

    Brunck, V.; Lechenault, F.; Reid, A.; Adda-Bedia, M.

    2016-03-01

    Origami offers the possibility for new metamaterials whose overall mechanical properties can be programed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would like to determine the shape taken by the structure at rest and its mechanical response. In this article, we introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n -creases meeting at a single vertex. Computations of the equilibrium states are then carried out explicitly in two special cases: the generalized waterbomb base and the Miura-Ori. For the waterbomb, we show a generic bistability for any number of creases. For the Miura folding, however, we uncover a phase transition from monostable to bistable states that explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. Moreover, the analysis shows that geometric frustration induces residual stresses in origami structures that should be taken into account in determining their mechanical response. This formalism can be extended to a general crease network, ordered or otherwise, and so opens new perspectives for the mechanics and the physics of origami-based metamaterials.

  15. Elastic theory of origami-based metamaterials.

    PubMed

    Brunck, V; Lechenault, F; Reid, A; Adda-Bedia, M

    2016-03-01

    Origami offers the possibility for new metamaterials whose overall mechanical properties can be programed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would like to determine the shape taken by the structure at rest and its mechanical response. In this article, we introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n-creases meeting at a single vertex. Computations of the equilibrium states are then carried out explicitly in two special cases: the generalized waterbomb base and the Miura-Ori. For the waterbomb, we show a generic bistability for any number of creases. For the Miura folding, however, we uncover a phase transition from monostable to bistable states that explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. Moreover, the analysis shows that geometric frustration induces residual stresses in origami structures that should be taken into account in determining their mechanical response. This formalism can be extended to a general crease network, ordered or otherwise, and so opens new perspectives for the mechanics and the physics of origami-based metamaterials.

  16. Kinetic energy decomposition scheme based on information theory.

    PubMed

    Imamura, Yutaka; Suzuki, Jun; Nakai, Hiromi

    2013-12-15

    We proposed a novel kinetic energy decomposition analysis based on information theory. Since the Hirshfeld partitioning for electron densities can be formulated in terms of the Kullback-Leibler information deficiency in information theory, a similar partitioning for kinetic energy densities was newly proposed. The numerical assessments confirm that the current kinetic energy decomposition scheme provides reasonable chemical pictures for ionic and covalent molecules, and can also estimate atomic energies using a correction with virial ratios.

  17. Complexity measurement based on information theory and kolmogorov complexity.

    PubMed

    Lui, Leong Ting; Terrazas, Germán; Zenil, Hector; Alexander, Cameron; Krasnogor, Natalio

    2015-01-01

    In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules.
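
    As a hedged, simplified illustration of combining the two notions mentioned above (not the article's actual measure), the snippet below computes the Shannon block entropy of an elementary cellular automaton's output and a Kolmogorov-complexity proxy obtained from zlib compression; the chosen rules and block size are arbitrary.

```python
import math
import zlib
from collections import Counter

def eca_run(rule, width=256, steps=256):
    """Evolve a 1D elementary cellular automaton from a single seed cell
    and return all rows concatenated as a 0/1 string."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = []
    for _ in range(steps):
        rows.append(cells[:])
        cells = [(rule >> (4 * cells[(i - 1) % width]
                           + 2 * cells[i]
                           + cells[(i + 1) % width])) & 1
                 for i in range(width)]
    return "".join(str(c) for row in rows for c in row)

def shannon_entropy(s, block=8):
    """Shannon entropy (bits per block) over non-overlapping blocks."""
    blocks = [s[i:i + block] for i in range(0, len(s) - block + 1, block)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_complexity(s):
    """Crude Kolmogorov-complexity proxy: compressed size / original size."""
    data = s.encode("ascii")
    return len(zlib.compress(data, 9)) / len(data)

if __name__ == "__main__":
    for rule in (30, 110, 250):   # chaotic, complex, and periodic rules
        s = eca_run(rule)
        print(rule, round(shannon_entropy(s), 3), round(compression_complexity(s), 3))
```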

  18. Evaluating hydrological model performance using information theory-based metrics

    Technology Transfer Automated Retrieval System (TEKTRAN)

The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  19. Congestion at Card and Book Catalogs--A Queuing-Theory Approach

    ERIC Educational Resources Information Center

    Bookstein, Abraham

    1972-01-01

    This paper attempts to analyze the problem of congestion, using a mathematical model shown to be of value in other similar applications. Three criteria of congestion are considered, and it is found that the conclusion one can draw is sensitive to which of these criteria is paramount. (8 references) (Author/NH)
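
The abstract does not specify the queuing model, but the flavour of such a congestion analysis can be sketched with a single-server M/M/1 queue for one catalog position: patrons arrive at rate lambda, consultations take 1/mu on average, and the standard steady-state formulas give utilisation, mean number present, and mean waiting time. The arrival and service rates below are hypothetical.

```python
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state metrics of an M/M/1 queue (requires arrival_rate < service_rate)."""
    rho = arrival_rate / service_rate          # utilisation of the catalog position
    if rho >= 1.0:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    L = rho / (1.0 - rho)                      # mean number of patrons present
    W = L / arrival_rate                       # mean time in system (Little's law)
    Wq = W - 1.0 / service_rate                # mean time spent waiting in line
    return {"utilisation": rho, "mean_in_system": L,
            "mean_time_in_system_hours": W, "mean_wait_hours": Wq}


# Hypothetical figures: 20 patrons/hour consult the catalog, each for 2 minutes on average.
print(mm1_metrics(arrival_rate=20.0, service_rate=30.0))
```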

  20. Congestion at Card and Book Catalogs--A Queuing Theory Approach.

    ERIC Educational Resources Information Center

    Bookstein, Abraham

The question of whether a library's catalog should consist of cards arranged in a single alphabetical order (the "dictionary catalog") or be segregated as a separate file is discussed. Development is extended to encompass related problems involved in the creation of a book catalog. A model to study the effects of congestion at the catalog is…

  1. Toward an instructionally oriented theory of example-based learning.

    PubMed

    Renkl, Alexander

    2014-01-01

    Learning from examples is a very effective means of initial cognitive skill acquisition. There is an enormous body of research on the specifics of this learning method. This article presents an instructionally oriented theory of example-based learning that integrates theoretical assumptions and findings from three research areas: learning from worked examples, observational learning, and analogical reasoning. This theory has descriptive and prescriptive elements. The descriptive subtheory deals with (a) the relevance and effectiveness of examples, (b) phases of skill acquisition, and (c) learning processes. The prescriptive subtheory proposes instructional principles that make full exploitation of the potential of example-based learning possible.

  2. Toward an Instructionally Oriented Theory of Example-Based Learning

    ERIC Educational Resources Information Center

    Renkl, Alexander

    2014-01-01

    Learning from examples is a very effective means of initial cognitive skill acquisition. There is an enormous body of research on the specifics of this learning method. This article presents an instructionally oriented theory of example-based learning that integrates theoretical assumptions and findings from three research areas: learning from…

  3. A Memory-Based Theory of Verbal Cognition

    ERIC Educational Resources Information Center

    Dennis, Simon

    2005-01-01

    The syntagmatic paradigmatic model is a distributed, memory-based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long-term memory and the resolution of these…

  4. Project-Based Language Learning: An Activity Theory Analysis

    ERIC Educational Resources Information Center

    Gibbes, Marina; Carson, Lorna

    2014-01-01

    This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…

  5. A Natural Teaching Method Based on Learning Theory.

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    1991-01-01

    The natural teaching method is active and student-centered, based on schema and constructivist theories, and informed by research in neuroplasticity. A schema is a mental picture or understanding of something we have learned. Humans can have knowledge only to the degree to which they have constructed schemas from learning experiences and practice.…

  6. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  7. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

    NASA Astrophysics Data System (ADS)

    Zhao, Cunyou; Shi, Dongyan; Wu, Han

The main ways of product development are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining the methods of conceptual design and TRIZ theory. A process system model of innovating design is constructed that includes requirement analysis, total function analysis and decomposition, engineering problem analysis, finding solutions to the engineering problems, and preliminary design; this establishes the basis for the innovative design of existing products.

  8. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  9. Evidence for an expectancy-based theory of avoidance behaviour.

    PubMed

    Declercq, Mieke; De Houwer, Jan; Baeyens, Frank

    2008-01-01

    In most studies on avoidance learning, participants receive an aversive unconditioned stimulus after a warning signal is presented, unless the participant performs a particular response. Lovibond (2006) recently proposed a cognitive theory of avoidance learning, according to which avoidance behaviour is a function of both Pavlovian and instrumental conditioning. In line with this theory, we found that avoidance behaviour was based on an integration of acquired knowledge about, on the one hand, the relation between stimuli and, on the other hand, the relation between behaviour and stimuli.

  10. Modeling Air Traffic Management Technologies with a Queuing Network Model of the National Airspace System

    NASA Technical Reports Server (NTRS)

    Long, Dou; Lee, David; Johnson, Jesse; Gaier, Eric; Kostiuk, Peter

    1999-01-01

    This report describes an integrated model of air traffic management (ATM) tools under development in two National Aeronautics and Space Administration (NASA) programs -Terminal Area Productivity (TAP) and Advanced Air Transport Technologies (AATT). The model is made by adjusting parameters of LMINET, a queuing network model of the National Airspace System (NAS), which the Logistics Management Institute (LMI) developed for NASA. Operating LMINET with models of various combinations of TAP and AATT will give quantitative information about the effects of the tools on operations of the NAS. The costs of delays under different scenarios are calculated. An extension of Air Carrier Investment Model (ACIM) under ASAC developed by the Institute for NASA maps the technologies' impacts on NASA operations into cross-comparable benefits estimates for technologies and sets of technologies.
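
LMINET itself is not reproduced here, but the basic idea of a queuing-network delay estimate can be sketched by treating each stage of a flight's route (departure runway, en-route sector, arrival runway) as an independent M/M/1 queue and summing the expected waits, a strong simplification of the real model. The stations and rates below are hypothetical.

```python
# Minimal sketch of a queuing-network delay estimate, assuming each stage of a
# flight's route behaves as an independent M/M/1 queue (a strong simplification).

def mm1_wait(arrival_rate: float, service_rate: float) -> float:
    """Mean waiting time (excluding service) in an M/M/1 queue, in hours."""
    if arrival_rate >= service_rate:
        raise ValueError("stage is saturated")
    rho = arrival_rate / service_rate
    return rho / (service_rate - arrival_rate)  # Wq = rho / (mu - lambda)


# Hypothetical route stages; rates are aircraft per hour.
route = [
    ("departure runway", 55.0, 60.0),
    ("en-route sector",  80.0, 100.0),
    ("arrival runway",   50.0, 58.0),
]

total_delay_hours = 0.0
for name, lam, mu in route:
    wq = mm1_wait(lam, mu)
    total_delay_hours += wq
    print(f"{name:18s} expected wait = {wq * 60:5.1f} min")
print(f"total expected delay = {total_delay_hours * 60:.1f} min per flight")
```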

  11. Local Rule-Based Theory of Virus Shell Assembly

    NASA Astrophysics Data System (ADS)

    Berger, Bonnie; Shor, Peter W.; Tucker-Kellogg, Lisa; King, Jonathan

    1994-08-01

    A local rule-based theory is developed which shows that the self-assembly of icosahedral virus shells may depend on only the lower-level interactions of a protein subunit with its neighbors-i.e., on local rules rather than on larger structural building blocks. The local rule theory provides a framework for understanding the assembly of icosahedral viruses. These include both viruses that fall in the quasiequivalence theory of Caspar and Klug and the polyoma virus structure, which violates quasi-equivalence and has puzzled researchers since it was first observed. Local rules are essentially templates for energetically favorable arrangements. The tolerance margins for these rules are investigated through computer simulations. When these tolerance margins are exceeded in a particular way, the result is a "spiraling" malformation that has been observed in nature.

  12. Computer-based Training in Medicine and Learning Theories.

    PubMed

    Haag, Martin; Bauch, Matthias; Garde, Sebastian; Heid, Jörn; Weires, Thorsten; Leven, Franz-Josef

    2005-01-01

    Computer-based training (CBT) systems can efficiently support modern teaching and learning environments. In this paper, we demonstrate on the basis of the case-based CBT system CAMPUS that current learning theories and design principles (Bloom's Taxonomy and practice fields) are (i) relevant to CBT and (ii) are feasible to implement using computer-based training and adequate learning environments. Not all design principles can be fulfilled by the system alone, the integration of the system in adequate teaching and learning environments therefore is essential. Adequately integrated, CBT programs become valuable means to build or support practice fields for learners that build domain knowledge and problem-solving skills. Learning theories and their design principles can support in designing these systems as well as in assessing their value.

  13. Correlation theory-based signal processing method for CMF signals

    NASA Astrophysics Data System (ADS)

    Shen, Yan-lin; Tu, Ya-qing

    2016-06-01

    Signal processing precision of Coriolis mass flowmeter (CMF) signals affects measurement accuracy of Coriolis mass flowmeters directly. To improve the measurement accuracy of CMFs, a correlation theory-based signal processing method for CMF signals is proposed, which is comprised of the correlation theory-based frequency estimation method and phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral period sampling signals on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation and has better estimation performance than the adaptive notch filter, discrete Fourier transform and autocorrelation methods in terms of frequency estimation and the data extension-based correlation, Hilbert transform, quadrature delay estimator and discrete Fourier transform methods in terms of phase difference estimation, which contributes to improving the measurement accuracy of Coriolis mass flowmeters.
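
The paper's estimators are not reproduced here, but the underlying correlation idea can be sketched for two noisy sinusoids of the same frequency: for zero-mean signals, the normalised zero-lag cross-correlation approximates cos of the phase difference, so the phase difference follows from an arccosine, and a coarse frequency estimate comes from the FFT peak. The sampling rate, frequency, and noise level below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical CMF-like sensor signals: same frequency, unknown phase difference.
fs, f0, true_dphi = 2000.0, 100.3, 0.45          # Hz, Hz, rad
t = np.arange(4096) / fs
s1 = np.sin(2 * np.pi * f0 * t) + 0.05 * rng.standard_normal(t.size)
s2 = np.sin(2 * np.pi * f0 * t - true_dphi) + 0.05 * rng.standard_normal(t.size)

# Remove the means so the correlations approximate those of pure sinusoids.
x1, x2 = s1 - s1.mean(), s2 - s2.mean()

# For zero-mean sinusoids of equal frequency, R12(0)/sqrt(R11(0)*R22(0)) ~ cos(dphi).
r12, r11, r22 = np.dot(x1, x2), np.dot(x1, x1), np.dot(x2, x2)
dphi_est = np.arccos(np.clip(r12 / np.sqrt(r11 * r22), -1.0, 1.0))

# A coarse frequency estimate from the FFT peak (the paper refines this step).
spectrum = np.abs(np.fft.rfft(x1))
f_est = np.fft.rfftfreq(x1.size, d=1.0 / fs)[np.argmax(spectrum)]

print(f"estimated frequency       : {f_est:7.2f} Hz (true {f0})")
print(f"estimated phase difference: {dphi_est:7.4f} rad (true {true_dphi})")
```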

  14. Infrared small target detection based on Danger Theory

    NASA Astrophysics Data System (ADS)

    Lan, Jinhui; Yang, Xiao

    2009-11-01

To solve the problem that traditional methods cannot detect small objects whose local SNR is less than 2 in IR images, a Danger Theory-based model for detecting infrared small targets is presented in this paper. First, by analogy with immunology, definitions are given for such terms as danger signal, antigen, APC, and antibody, and the matching rule between antigen and antibody is improved. Prior to training the detection model and detecting the targets, the IR images are processed with an adaptive smoothing filter to decrease the stochastic noise. Then, during the training process, the deletion rule, generation rule, crossover rule, and mutation rule are established after a large number of experiments in order to realize rapid convergence and obtain good antibodies. The Danger Theory-based model is built after the training process, and this model can detect targets whose local SNR is only 1.5.

  15. Ensemble method: Community detection based on game theory

    NASA Astrophysics Data System (ADS)

    Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.

    2014-08-01

Timely and cost-effective analytics over social networks has emerged as a key ingredient for success in many businesses and government endeavors. Community detection is an active research area of relevance to analyzing online social networks. The problem of selecting a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network. The choice of a given methodology could affect the outcome of the experiments because different algorithms have different advantages and depend on tuning specific parameters. In this paper, we propose a community division model based on the notion of game theory, which can effectively combine the advantages of previous algorithms to obtain a better community classification result. Experiments on standard datasets verify that our game-theory-based community detection model is valid and performs better.

  16. Research on Capturing of Customer Requirements Based on Innovation Theory

    NASA Astrophysics Data System (ADS)

    junwu, Ding; dongtao, Yang; zhenqiang, Bao

To capture customer requirements information exactly and effectively, a new method for modeling the capture of customer requirements was proposed. Based on the analysis of the functional requirement models of previous products and the application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement units and confirming the direction of evolutionary design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.

  17. A danger-theory-based immune network optimization algorithm.

    PubMed

    Zhang, Ruirui; Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms reflect a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated from changes of environments will guide different levels of immune responses, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through its own danger signals and then triggers immune responses of self-regulation. So the population diversity can be maintained. Experimental results show that the algorithm has more advantages in the solution quality and diversity of the population. Compared with influential optimization algorithms, CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions to meet the accuracies within the specified function evaluation times.

  18. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.

  19. Ranking streamflow model performance based on Information theory metrics

    NASA Astrophysics Data System (ADS)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. The symbol alphabet was used. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series. Watersheds served as information filters, and streamflow time series were less random and more complex than those of precipitation. This reflects the fact that the watershed acts as an information filter in the hydrologic conversion process from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased as the complexity of the models increased, but in many cases several models had efficiency values that were not statistically different from each other. In such cases, ranking models by the closeness of the information theory-based parameters of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
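
As a sketch of the symbolisation step and one of the metrics named above, the code below maps a series to quantile-based symbols and computes the mean information gain as the difference between block entropies of length two and length one (a first-order version of the block-based definition). The synthetic "streamflow-like" series and the alphabet of four symbols are hypothetical choices for illustration.

```python
import numpy as np
from collections import Counter


def symbolize(series: np.ndarray, n_symbols: int = 4) -> np.ndarray:
    """Map values to quantile-based symbols 0..n_symbols-1."""
    quantiles = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(series, quantiles)


def block_entropy(symbols, block_len: int) -> float:
    """Shannon entropy (bits) of overlapping blocks of the given length."""
    blocks = [tuple(symbols[i:i + block_len]) for i in range(len(symbols) - block_len + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum((c / n) * np.log2(c / n) for c in counts.values())


def mean_information_gain(symbols, block_len: int = 1) -> float:
    """H(blocks of length L+1) - H(blocks of length L): new information per symbol."""
    return block_entropy(symbols, block_len + 1) - block_entropy(symbols, block_len)


# Hypothetical 'streamflow-like' series: a smooth seasonal signal plus noise.
rng = np.random.default_rng(1)
t = np.arange(3650)
flow = 10 + 5 * np.sin(2 * np.pi * t / 365) + rng.gamma(shape=2.0, scale=1.0, size=t.size)

symbols = symbolize(flow, n_symbols=4)
print(f"mean information gain: {mean_information_gain(symbols):.3f} bits")
print(f"(pure-noise reference: {mean_information_gain(rng.integers(0, 4, t.size)):.3f} bits)")
```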

  20. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen out the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, which makes the samples' features better reflect and represent the totals to some extent. The forewarning level is judged by the maximum probability rule, and management strategies are then proposed according to local conditions so as to reduce heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, gives reasonable results with a simple structure, and has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk.
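
Only the Bayesian updating step is sketched below: a discrete prior over forewarning levels is combined with the likelihood of an observed indicator and the level is chosen by the maximum-probability rule. The level names, prior, and likelihood values are hypothetical and are not the Taihu Basin figures.

```python
# Minimal sketch of a Bayes-rule forewarning update, assuming hypothetical
# discrete warning levels and likelihoods (not the values used in the study).

levels = ["light", "moderate", "heavy", "severe"]
prior = {"light": 0.40, "moderate": 0.30, "heavy": 0.20, "severe": 0.10}

# P(observed indicator band | warning level), e.g. from historical/simulated data.
likelihood = {"light": 0.05, "moderate": 0.15, "heavy": 0.35, "severe": 0.60}

# Bayes' rule: posterior is proportional to prior times likelihood, then normalise.
unnormalised = {lvl: prior[lvl] * likelihood[lvl] for lvl in levels}
total = sum(unnormalised.values())
posterior = {lvl: p / total for lvl, p in unnormalised.items()}

for lvl in levels:
    print(f"{lvl:9s} prior={prior[lvl]:.2f}  posterior={posterior[lvl]:.2f}")

# Maximum-probability rule for the forewarning level.
print("forewarning level:", max(posterior, key=posterior.get))
```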

  1. A model of resurgence based on behavioral momentum theory.

    PubMed

    Shahan, Timothy A; Sweeney, Mary M

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforcement during extinction of a target response acts as both an additional source of disruption during extinction and as a source of reinforcement in the context that increases the future strength of the target response. The model does a good job accounting for existing data in the resurgence literature and makes novel and testable predictions. Thus, the model appears to provide a framework for understanding resurgence and serves to integrate the phenomenon into the existing theoretical account of persistence provided by behavioral momentum theory. In addition, we discuss some potential implications of the model for further development of behavioral momentum theory. PMID:21541118

  2. A model of resurgence based on behavioral momentum theory.

    PubMed

    Shahan, Timothy A; Sweeney, Mary M

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforcement during extinction of a target response acts as both an additional source of disruption during extinction and as a source of reinforcement in the context that increases the future strength of the target response. The model does a good job accounting for existing data in the resurgence literature and makes novel and testable predictions. Thus, the model appears to provide a framework for understanding resurgence and serves to integrate the phenomenon into the existing theoretical account of persistence provided by behavioral momentum theory. In addition, we discuss some potential implications of the model for further development of behavioral momentum theory.

  3. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze decision making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision making modes, the cost of decision making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.

  4. A communication-theory based view on telemedical communication.

    PubMed

    Schall, Thomas; Roeckelein, Wolfgang; Mohr, Markus; Kampshoff, Joerg; Lange, Tim; Nerlich, Michael

    2003-01-01

    Communication theory based analysis sheds new light on the use of health telematics. This analysis of structures in electronic medical communication shows communicative structures with special features. Current and evolving telemedical applications are analyzed. The methodology of communicational theory (focusing on linguistic pragmatics) is used to compare it with its conventional counterpart. The semiotic model, the roles of partners, the respective message and their relation are discussed. Channels, sender, addressee, and other structural roles are analyzed for different types of electronic medical communication. The communicative processes are shown as mutual, rational action towards a common goal. The types of communication/texts are analyzed in general. Furthermore the basic communicative structures of medical education via internet are presented with their special features. The analysis shows that electronic medical communication has special features compared to everyday communication: A third participant role often is involved: the patient. Messages often are addressed to an unspecified partner or to an unspecified partner within a group. Addressing in this case is (at least partially) role-based. Communication and message often directly (rather than indirectly) influence actions of the participants. Communication often is heavily regulated including legal implications like liability, and more. The conclusion from the analysis is that the development of telemedical applications so far did not sufficiently take communicative structures into consideration. Based on these results recommendations for future developments of telemedical applications/services are given.

  5. Transportation optimization with fuzzy trapezoidal numbers based on possibility theory.

    PubMed

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In this paper, a parametric method is introduced to solve fuzzy transportation problem. Considering that parameters of transportation problem have uncertainties, this paper develops a generalized fuzzy transportation problem with fuzzy supply, demand and cost. For simplicity, these parameters are assumed to be fuzzy trapezoidal numbers. Based on possibility theory and consistent with decision-makers' subjectiveness and practical requirements, the fuzzy transportation problem is transformed to a crisp linear transportation problem by defuzzifying fuzzy constraints and objectives with application of fractile and modality approach. Finally, a numerical example is provided to exemplify the application of fuzzy transportation programming and to verify the validity of the proposed methods.
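
One common way to reduce such a problem to a crisp one is to defuzzify each trapezoidal number and solve the resulting linear program; the sketch below uses a simple expected-value defuzzification rather than the fractile/modality approach of the paper, and the costs, supplies, and demands are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog  # assumed available


def defuzzify(trap):
    """Expected-value defuzzification of a trapezoidal fuzzy number (a, b, c, d)."""
    a, b, c, d = trap
    return (a + b + c + d) / 4.0


# Hypothetical trapezoidal costs, supplies, and demands (2 sources x 3 destinations).
fuzzy_cost = [[(3, 4, 5, 6), (6, 7, 8, 9), (2, 3, 4, 5)],
              [(4, 5, 6, 7), (3, 4, 5, 6), (7, 8, 9, 10)]]
fuzzy_supply = [(18, 20, 22, 24), (30, 32, 34, 36)]
fuzzy_demand = [(14, 16, 18, 20), (18, 20, 22, 24), (13, 15, 17, 19)]

cost = np.array([[defuzzify(t) for t in row] for row in fuzzy_cost])
supply = [defuzzify(t) for t in fuzzy_supply]      # [21.0, 33.0]
demand = [defuzzify(t) for t in fuzzy_demand]      # [17.0, 21.0, 16.0] (balanced: 54 = 54)

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                                 # each source ships exactly its supply
    row = np.zeros(m * n)
    row[i * n:(i + 1) * n] = 1.0
    A_eq.append(row)
    b_eq.append(supply[i])
for j in range(n):                                 # each destination receives its demand
    col = np.zeros(m * n)
    col[j::n] = 1.0
    A_eq.append(col)
    b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, None)] * (m * n), method="highs")
print("total transportation cost:", round(res.fun, 2))
print("shipping plan:\n", res.x.reshape(m, n).round(2))
```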

  6. Identifying influential nodes in weighted networks based on evidence theory

    NASA Astrophysics Data System (ADS)

    Wei, Daijun; Deng, Xinyang; Zhang, Xiaoge; Deng, Yong; Mahadevan, Sankaran

    2013-05-01

    The design of an effective ranking method to identify influential nodes is an important problem in the study of complex networks. In this paper, a new centrality measure is proposed based on the Dempster-Shafer evidence theory. The proposed measure trades off between the degree and strength of every node in a weighted network. The influences of both the degree and the strength of each node are represented by basic probability assignment (BPA). The proposed centrality measure is determined by the combination of these BPAs. Numerical examples are used to illustrate the effectiveness of the proposed method.
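
The centrality values depend on the network data, but the core operation, combining two basic probability assignments with Dempster's rule, can be sketched directly. The two-element frame of discernment and the example BPAs below (one notionally derived from degree, one from strength) are hypothetical.

```python
from itertools import product

# Frame of discernment for a node's influence: {high}, {low}, or uncertain {high, low}.
HIGH, LOW, THETA = frozenset({"h"}), frozenset({"l"}), frozenset({"h", "l"})

# Hypothetical BPAs for one node: one built from its degree, one from its strength.
m_degree   = {HIGH: 0.6, LOW: 0.3, THETA: 0.1}
m_strength = {HIGH: 0.5, LOW: 0.2, THETA: 0.3}


def dempster_combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule: intersect focal elements, renormalise by 1 - conflict."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {focal: w / (1.0 - conflict) for focal, w in combined.items()}


m = dempster_combine(m_degree, m_strength)
for focal, mass in sorted(m.items(), key=lambda kv: -kv[1]):
    print(set(focal), round(mass, 4))
```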

  7. Master equation based steady-state cluster perturbation theory

    NASA Astrophysics Data System (ADS)

    Nuss, Martin; Dorn, Gerhard; Dorda, Antonius; von der Linden, Wolfgang; Arrigoni, Enrico

    2015-09-01

    A simple and efficient approximation scheme to study electronic transport characteristics of strongly correlated nanodevices, molecular junctions, or heterostructures out of equilibrium is provided by steady-state cluster perturbation theory. In this work, we improve the starting point of this perturbative, nonequilibrium Green's function based method. Specifically, we employ an improved unperturbed (so-called reference) state ρ̂S, constructed as the steady state of a quantum master equation within the Born-Markov approximation. This resulting hybrid method inherits beneficial aspects of both the quantum master equation as well as the nonequilibrium Green's function technique. We benchmark this scheme on two experimentally relevant systems in the single-electron transistor regime: an electron-electron interaction based quantum diode and a triple quantum dot ring junction, which both feature negative differential conductance. The results of this method improve significantly with respect to the plain quantum master equation treatment at modest additional computational cost.

  8. Feature Selection with Neighborhood Entropy-Based Cooperative Game Theory

    PubMed Central

    Zeng, Kai; She, Kun; Niu, Xinzheng

    2014-01-01

    Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods will ignore some features which have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features by using neighborhood entropy. Then the neighborhood entropy-based feature contribution is proposed under the framework of cooperative game. The evaluative criteria of features can be formalized as the product of contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that neighborhood entropy-based cooperative game theory model (NECGT) yield better performance than classical ones. PMID:25276120

  9. Feature selection with neighborhood entropy-based cooperative game theory.

    PubMed

    Zeng, Kai; She, Kun; Niu, Xinzheng

    2014-01-01

    Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods will ignore some features which have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features by using neighborhood entropy. Then the neighborhood entropy-based feature contribution is proposed under the framework of cooperative game. The evaluative criteria of features can be formalized as the product of contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that neighborhood entropy-based cooperative game theory model (NECGT) yield better performance than classical ones.

  10. Risk Matrix Integrating Risk Attitudes Based on Utility Theory.

    PubMed

    Ruan, Xin; Yin, Zhiyi; Frangopol, Dan M

    2015-08-01

    Recent studies indicate that absence of the consideration of risk attitudes of decisionmakers in the risk matrix establishment process has become a major limitation. In order to evaluate risk in a more comprehensive manner, an approach to establish risk matrices that integrates risk attitudes based on utility theory is proposed. There are three main steps within this approach: (1) describing risk attitudes of decisionmakers by utility functions, (2) bridging the gap between utility functions and the risk matrix by utility indifference curves, and (3) discretizing utility indifference curves. A complete risk matrix establishment process based on practical investigations is introduced. This process utilizes decisionmakers' answers to questionnaires to formulate required boundary values for risk matrix establishment and utility functions that effectively quantify their respective risk attitudes.

  11. Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis

    ERIC Educational Resources Information Center

    Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.

    2012-01-01

    Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundation and provided an exploratory examination of established learning theories. However, we found research seldom addressed the development of the use or failure to use learning-theory foundations and…

  12. Classification of topological crystalline insulators based on representation theory

    NASA Astrophysics Data System (ADS)

    Dong, Xiao-Yu; Liu, Chao-Xing

    2016-01-01

Topological crystalline insulators define a new class of topological insulator phases with gapless surface states protected by crystalline symmetries. In this work, we present a general theory to classify topological crystalline insulator phases based on the representation theory of space groups. Our approach is to directly identify possible nontrivial surface states in a semi-infinite system with a specific surface, of which the symmetry property can be described by 17 two-dimensional space groups. We reproduce the existing results of topological crystalline insulators, such as mirror Chern insulators in the pm or pmm groups, Cnv topological insulators in the p4m, p31m, and p6m groups, and topological nonsymmorphic crystalline insulators in the pg and pmg groups. Aside from these existing results, we also obtain the following results: (1) there are two integer mirror Chern numbers (Z²) in the pm group but only one (Z) in the cm or p3m1 group for both the spinless and spinful cases; (2) for the pmm (cmm) groups, there is no topological classification in the spinless case but Z4 (Z2) classifications in the spinful case; (3) we show how the topological crystalline insulator phase in the pg group is related to that in the pm group; (4) we identify the topological classification of the p4m, p31m, and p6m groups for the spinful case; (5) we find topological nonsymmorphic crystalline insulators also existing in the pgg and p4g groups, which exhibit new features compared to those in the pg and pmg groups. We emphasize the importance of the irreducible representations for the states at some specific high-symmetry momenta in the classification of topological crystalline phases. Our theory can serve as a guide for the search of topological crystalline insulator phases in realistic materials.

  13. A molecularly based theory for electron transfer reorganization energy

    SciTech Connect

    Zhuang, Bilin; Wang, Zhen-Gang

    2015-12-14

    Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule’s permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory.

  14. Operator-based analytic theory of decoherence in NMR

    NASA Astrophysics Data System (ADS)

    Pandey, Manoj Kumar; Ramachandran, Ramesh

    2011-06-01

The operator-based analytic description of polarization transfer in NMR spectroscopy is often fraught with difficulty due to (a) the dimension and (b) the non-commuting nature of the spin Hamiltonians. In this article, an analytic model is presented to elucidate the mechanism of polarization transfer between dilute spins I1 and I2 coupled to a reservoir of abundant S-spins in the solid state. Specifically, the factors responsible for the decoherence observed in double cross-polarization (DCP) experiments are outlined in terms of operators via effective Floquet Hamiltonians. The interplay between the various anisotropic interactions is thoroughly investigated by comparing the simulations from the analytic theory with exact numerical methods. The analytical theory presents a framework for incorporating multi-spin effects within a reduced subspace spanned by spins I1 and I2. The simulation results from the analytic model comprising eight spins are in excellent agreement with the numerical methods and present an attractive tool for understanding the phenomenon of decoherence in NMR.

  15. Quantum Hall transitions: An exact theory based on conformal restriction

    NASA Astrophysics Data System (ADS)

    Bettelheim, E.; Gruzberg, I. A.; Ludwig, A. W. W.

    2012-10-01

    We revisit the problem of the plateau transition in the integer quantum Hall effect. Here we develop an analytical approach for this transition, and for other two-dimensional disordered systems, based on the theory of “conformal restriction.” This is a mathematical theory that was recently developed within the context of the Schramm-Loewner evolution which describes the “stochastic geometry” of fractal curves and other stochastic geometrical fractal objects in two-dimensional space. Observables elucidating the connection with the plateau transition include the so-called point-contact conductances (PCCs) between points on the boundary of the sample, described within the language of the Chalker-Coddington network model for the transition. We show that the disorder-averaged PCCs are characterized by a classical probability distribution for certain geometric objects in the plane (which we call pictures), occurring with positive statistical weights, that satisfy the crucial so-called restriction property with respect to changes in the shape of the sample with absorbing boundaries; physically, these are boundaries connected to ideal leads. At the transition point, these geometrical objects (pictures) become fractals. Upon combining this restriction property with the expected conformal invariance at the transition point, we employ the mathematical theory of “conformal restriction measures” to relate the disorder-averaged PCCs to correlation functions of (Virasoro) primary operators in a conformal field theory (of central charge c=0). We show how this can be used to calculate these functions in a number of geometries with various boundary conditions. Since our results employ only the conformal restriction property, they are equally applicable to a number of other critical disordered electronic systems in two spatial dimensions, including for example the spin quantum Hall effect, the thermal metal phase in symmetry class D, and classical diffusion in two

  16. A memory-based theory of verbal cognition.

    PubMed

    Dennis, Simon

    2005-03-01

    The syntagmatic paradigmatic model is a distributed, memory-based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long-term memory and the resolution of these constraints in working memory. Lexical information is extracted directly from text using a version of the expectation maximization algorithm. In this article, the model is described and then illustrated on a number of phenomena, including sentence processing, semantic categorization and rating, short-term serial recall, and analogical and logical inference. Subsequently, the model is used to answer questions about a corpus of tennis news articles taken from the Internet. The model's success demonstrates that it is possible to extract propositional information from naturally occurring text without employing a grammar, defining a set of heuristics, or specifying a priori a set of semantic roles. PMID:21702771

  17. Invulnerability of power grids based on maximum flow theory

    NASA Astrophysics Data System (ADS)

    Fan, Wenli; Huang, Shaowei; Mei, Shengwei

    2016-11-01

    The invulnerability analysis against cascades is of great significance in evaluating the reliability of power systems. In this paper, we propose a novel cascading failure model based on the maximum flow theory to analyze the invulnerability of power grids. In the model, node initial loads are built on the feasible flows of nodes with a tunable parameter γ used to control the initial node load distribution. The simulation results show that both the invulnerability against cascades and the tolerance parameter threshold αT are affected by node load distribution greatly. As γ grows, the invulnerability shows the distinct change rules under different attack strategies and different tolerance parameters α respectively. These results are useful in power grid planning and cascading failure prevention.
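
A minimal illustration of the maximum-flow building block is given below on a toy directed network, using networkx (assumed to be available); the paper's construction of node initial loads from feasible flows and the cascading dynamics are not reproduced. The topology and line capacities are hypothetical.

```python
import networkx as nx  # assumed available; any max-flow routine would do

# Toy directed network standing in for a power grid; capacities are line limits (MW).
G = nx.DiGraph()
edges = [
    ("gen", "a", 100), ("gen", "b", 80),
    ("a", "b", 30), ("a", "load", 70),
    ("b", "load", 90),
]
for u, v, cap in edges:
    G.add_edge(u, v, capacity=cap)

# Maximum feasible transfer from the generator node to the load node.
flow_value, flow_dict = nx.maximum_flow(G, "gen", "load")
print("max feasible flow gen -> load:", flow_value)

# Removing a line (a crude stand-in for a failure) and recomputing shows the capacity loss.
G_failed = G.copy()
G_failed.remove_edge("b", "load")
print("after losing line b->load   :", nx.maximum_flow_value(G_failed, "gen", "load"))
```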

  18. Intelligent control based on fuzzy logic and neural net theory

    NASA Technical Reports Server (NTRS)

    Lee, Chuen-Chien

    1991-01-01

    In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.

  19. Theory for a gas composition sensor based on acoustic properties.

    PubMed

    Phillips, Scott; Dain, Yefim; Lueptow, Richard M

    2003-01-01

    Sound travelling through a gas propagates at different speeds and its intensity attenuates to different degrees depending upon the composition of the gas. Theoretically, a real-time gaseous composition sensor could be based on measuring the sound speed and the acoustic attenuation. To this end, the speed of sound was modelled using standard relations, and the acoustic attenuation was modelled using the theory for vibrational relaxation of gas molecules. The concept for a gas composition sensor is demonstrated theoretically for nitrogen-methane-water and hydrogen-oxygen-water mixtures. For a three-component gas mixture, the measured sound speed and acoustic attenuation each define separate lines in the composition plane of two of the gases. The intersection of the two lines defines the gas composition. It should also be possible to use the concept for mixtures of more than three components, if the nature of the gas composition is known to some extent. PMID:14552356
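
The "standard relations" for the sound speed can be sketched for an ideal-gas mixture: mole-fraction-averaged molar mass and heat capacities give an effective gamma, and c = sqrt(gamma*R*T/M). The vibrational-relaxation attenuation model is far more involved and is not attempted here; the heat-capacity values and composition below are approximate room-temperature figures used only for illustration.

```python
import math

R = 8.314462618  # J/(mol*K), universal gas constant

# Per-species molar mass (kg/mol) and approximate ideal-gas cp, cv (J/(mol*K)).
species = {
    "N2":  (0.0280134, 29.1, 20.8),
    "CH4": (0.0160425, 35.7, 27.4),
    "H2O": (0.0180153, 33.6, 25.3),
}


def sound_speed(mole_fractions: dict, temperature: float) -> float:
    """Ideal-gas sound speed (m/s) of a mixture at the given temperature (K)."""
    M = sum(x * species[name][0] for name, x in mole_fractions.items())
    cp = sum(x * species[name][1] for name, x in mole_fractions.items())
    cv = sum(x * species[name][2] for name, x in mole_fractions.items())
    gamma = cp / cv
    return math.sqrt(gamma * R * temperature / M)


# Hypothetical nitrogen-methane-water composition at 20 degrees C.
mix = {"N2": 0.80, "CH4": 0.18, "H2O": 0.02}
print(f"speed of sound ~ {sound_speed(mix, 293.15):.1f} m/s")
```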

  20. A Lie based 4-dimensional higher Chern-Simons theory

    NASA Astrophysics Data System (ADS)

    Zucchini, Roberto

    2016-05-01

    We present and study a model of 4-dimensional higher Chern-Simons theory, special Chern-Simons (SCS) theory, instances of which have appeared in the string literature, whose symmetry is encoded in a skeletal semistrict Lie 2-algebra constructed from a compact Lie group with non discrete center. The field content of SCS theory consists of a Lie valued 2-connection coupled to a background closed 3-form. SCS theory enjoys a large gauge and gauge for gauge symmetry organized in an infinite dimensional strict Lie 2-group. The partition function of SCS theory is simply related to that of a topological gauge theory localizing on flat connections with degree 3 second characteristic class determined by the background 3-form. Finally, SCS theory is related to a 3-dimensional special gauge theory whose 2-connection space has a natural symplectic structure with respect to which the 1-gauge transformation action is Hamiltonian, the 2-curvature map acting as moment map.

  1. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  2. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  3. A Theory of Conditioning: Inductive Learning within Rule-Based Default Hierarchies.

    ERIC Educational Resources Information Center

    Holyoak, Keith J.; And Others

    1989-01-01

    A theory of classical conditioning is presented, which is based on a parallel, rule-based performance system integrated with mechanisms for inductive learning. A major inferential heuristic incorporated into the theory involves "unusualness," which is focused on novel cues. The theory is implemented via computer simulation. (TJH)

  4. Experimental Energy Consumption of Frame Slotted ALOHA and Distributed Queuing for Data Collection Scenarios

    PubMed Central

    Tuset-Peiro, Pere; Vazquez-Gallego, Francisco; Alonso-Zarate, Jesus; Alonso, Luis; Vilajosana, Xavier

    2014-01-01

Data collection is a key scenario for the Internet of Things because it enables gathering sensor data from distributed nodes that use low-power and long-range wireless technologies to communicate in a single-hop approach. In this kind of scenario, the network is composed of one coordinator that covers a particular area and a large number of nodes, typically hundreds or thousands, that transmit data to the coordinator upon request. Considering this scenario, in this paper we experimentally validate the energy consumption of two Medium Access Control (MAC) protocols, Frame Slotted ALOHA (FSA) and Distributed Queuing (DQ). We model both protocols as a state machine and conduct experiments to measure the average energy consumption in each state and the average number of times that a node has to be in each state in order to transmit a data packet to the coordinator. The results show that FSA is more energy efficient than DQ if the number of nodes is known a priori because the number of slots per frame can be adjusted accordingly. However, in such scenarios the number of nodes cannot be easily anticipated, leading to additional packet collisions and a higher energy consumption due to retransmissions. In contrast, DQ does not require knowing the number of nodes in advance because it is able to efficiently construct an ad hoc network schedule for each collection round. This kind of schedule ensures that there are no packet collisions during data transmission, thus leading to an energy consumption reduction above 10% compared to FSA. PMID:25061839
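
The per-state energy model described above can be sketched generically as E equal to the sum over states of (expected visits) times (time in state) times (power in state); the state names, currents, durations, and visit counts below are hypothetical placeholders rather than the measured values of the paper.

```python
# Minimal sketch of the per-state energy model:
# E = sum over states of (expected visits) * (time in state) * (power in state).
# All numbers below are hypothetical placeholders, not measured values.

VOLTAGE = 3.0  # volts

# state: (current draw in mA, time per visit in ms)
states = {
    "sleep":     (0.005, 50.0),
    "rx_listen": (12.0,  2.0),
    "tx_data":   (15.0,  4.0),
    "rx_ack":    (12.0,  1.5),
}


def packet_energy(visits: dict) -> float:
    """Expected energy (millijoules) to deliver one packet, given mean visits per state."""
    energy_mj = 0.0
    for state, n in visits.items():
        current_ma, time_ms = states[state]
        energy_mj += n * VOLTAGE * current_ma * time_ms / 1000.0  # V * mA * ms / 1000 = mJ
    return energy_mj


# Hypothetical mean visit counts per delivered packet for the two MAC protocols.
fsa_visits = {"sleep": 1.0, "rx_listen": 3.2, "tx_data": 1.4, "rx_ack": 1.0}
dq_visits  = {"sleep": 1.0, "rx_listen": 2.1, "tx_data": 1.0, "rx_ack": 1.0}

print(f"FSA: {packet_energy(fsa_visits):.3f} mJ/packet")
print(f"DQ : {packet_energy(dq_visits):.3f} mJ/packet")
```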

  5. The Energetic Assessment of Frictional Instability Based on Rowe's Theory

    NASA Astrophysics Data System (ADS)

    Hirata, M.; Muto, J.; Nagahama, H.

    2015-12-01

Frictional instability, which controls the occurrence of unstable slips, has been related to (1) the rate- and state-dependent friction law (Dieterich, 1979; Ruina, 1983) and (2) shear localization in a gouge layer (e.g., Byerlee et al., 1978; Logan et al., 1979). Ikari et al. (2011) indicated that the transitions of frictional parameters obtained from the rate- and state-dependent friction law involve shear localization. However, the underlying theoretical background for this link has been unknown. Therefore, in this study, we investigate their relation theoretically and experimentally based on Rowe's theory of constant minimum energy ratio (Rowe, 1962), which describes particle deformations quantitatively by energetic analysis. In a theoretical analysis using analytical dynamics and irreversible thermodynamics, an energetic criterion for frictional instability is obtained: unstable slip occurs at energy ratios below 1. In friction experiments using a gas-medium apparatus, the simulated fault gouge deforms obeying Rowe's theory. Additionally, the energy ratios change gradually with shear and fall below 1 before the occurrence of unstable slip. Moreover, energy ratios are derived from volume changes. The transition of energy ratios from increase to decrease, which has been confirmed at the end of compaction, indicates the onset of volume increase toward the occurrence of unstable slip. The volume increases likely correspond to the formation of R1-shears with open-mode character, which occur prior to the unstable slip. Shear localization leads to a change in the internal friction angle, which is a statistical parameter that constitutes the energy ratio. In short, changes in the internal friction angle play an important role in the evolution from frictionally stable to unstable behavior. From these results, the physical and energetic background for the link between the frictional parameters and shear localization becomes clear.

  6. SARA: A Text-Based and Reader-Based Theory of Signaling

    ERIC Educational Resources Information Center

    Lemarie, Julie; Lorch, Robert F., Jr.; Eyrolle, Helene; Virbel, Jacques

    2008-01-01

    We propose a two-component theory of text signaling devices. The first component is a text-based analysis that characterizes any signaling device along four dimensions: (a) the type of information it makes available, (b) its scope, (c) how it is realized in the text, and (d) its location with respect to the content it cues. The second component is…

  7. Simulation-based learning: From theory to practice.

    PubMed

    DeCaporale-Ryan, Lauren N; Dadiz, Rita; Peyre, Sarah E

    2016-06-01

Comments on the article, "Stimulating Reflective Practice Using Collaborative Reflective Training in Breaking Bad News Simulations," by Kim, Hernandez, Lavery, and Denmark (see record 2016-18380-001). Kim et al. are applauded for engaging and supporting the development of simulation-based education, and for their efforts to create an interprofessional learning environment. However, we hope further work on alternative methods of debriefing will leverage the already inherent activation of learners that builds on previous experience, fosters reflection, and builds skills. What is needed is the transference of learning theories into our educational research efforts that measure the effectiveness, validity, and reliability of behavior-based performance change. The majority of breaking bad news (BBN) curricula limit program evaluations to reports of learner satisfaction, confidence, and self-efficacy, rather than determining the successful translation of effective and humanistic interpersonal skills into long-term clinical practice (Rosenbaum et al., 2004). Research is needed to investigate how educational programs affect provider-patient-family interaction, and ultimately patient and family understanding, to better inform our teaching of BBN skills. PMID:27270248

  8. Image integrity authentication scheme based on fixed point theory.

    PubMed

    Li, Xu; Sun, Xingming; Liu, Quansheng

    2015-02-01

Based on fixed point theory, this paper proposes a new scheme for image integrity authentication, which is very different from digital signature and fragile watermarking. In the new scheme, the sender transforms an original image into a fixed point image (very close to the original one) of a well-chosen transform and sends the fixed point image (instead of the original one) to the receiver; using the same transform, the receiver checks the integrity of the received image by testing whether it is a fixed point image and locates the tampered areas if the image has been modified during transmission. A realization of the new scheme is based on the Gaussian convolution and deconvolution (GCD) transform, for which an existence theorem of fixed points is proved. The semifragility is analyzed via the commutativity of transforms, and three commutativity theorems are found for the GCD transform. Three iterative algorithms are presented for finding a fixed point image within a small number of iterations, and for the whole procedure of image integrity authentication, a fragile authentication system and a semifragile one are separately built. Experiments show that both systems have good performance in transparency, fragility, security, and tampering localization. In particular, the semifragile system can perfectly resist rotation by a multiple of 90°, flipping, and brightness attacks.

  9. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A. E-mail: ruth.pachter@wpafb.af.mil; Pachter, Ruth E-mail: ruth.pachter@wpafb.af.mil; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  10. Density functional theory based generalized effective fragment potential method

    NASA Astrophysics Data System (ADS)

    Nguyen, Kiet A.; Pachter, Ruth; Day, Paul N.

    2014-06-01

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  11. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated into the IMMAN software (http://mobiosd-hub.com/imman-soft/), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
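
    As a concrete illustration of the kind of measure IMMAN ranks features by, the hedged Python sketch below computes information gain after equal-interval discretization and orders features by it. The function names, bin count, and toy data are illustrative assumptions and do not reflect IMMAN's actual API.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a discrete label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(feature, labels, n_bins=10):
    """H(class) - H(class | feature) after equal-interval discretization."""
    edges = np.linspace(feature.min(), feature.max(), n_bins + 1)
    binned = np.digitize(feature, edges[1:-1])
    h_cond = 0.0
    for b in np.unique(binned):
        mask = binned == b
        h_cond += mask.mean() * entropy(labels[mask])
    return entropy(labels) - h_cond

def rank_features(X, y, n_bins=10):
    """Feature indices sorted by decreasing information gain."""
    gains = np.array([information_gain(X[:, j], y, n_bins) for j in range(X.shape[1])])
    return np.argsort(gains)[::-1], gains

# Toy check: only feature 0 carries the class signal, so it should rank first.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(int)
order, gains = rank_features(X, y)
print(order, gains.round(3))
```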

  12. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient in combining evidence from different sensors. However, when the evidence is highly conflicting, it may yield a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
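
    The fusion recipe described above (weight each sensor's evidence, average, then combine) can be sketched compactly. The hedged Python example below implements Dempster's rule over basic probability assignments and a weighted-average pre-combination step; the frame of discernment, the sensor reports, and the reliability weights are illustrative, whereas the paper derives the weights from an evidence-distance measure combined with belief entropy.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two BPAs given as {frozenset: mass} dicts."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    k = 1.0 - conflict        # assumes the evidence is not totally conflicting
    return {s: v / k for s, v in combined.items()}

def weighted_average_bpa(bpas, weights):
    """Discount-and-average step applied before fusion."""
    total = sum(weights)
    avg = {}
    for m, w in zip(bpas, weights):
        for s, v in m.items():
            avg[s] = avg.get(s, 0.0) + (w / total) * v
    return avg

# Three sensor reports over faults {F1, F2}; the second report conflicts.
F1, F2, F12 = frozenset({"F1"}), frozenset({"F2"}), frozenset({"F1", "F2"})
reports = [{F1: 0.8, F12: 0.2}, {F2: 0.9, F12: 0.1}, {F1: 0.7, F12: 0.3}]
weights = [0.45, 0.10, 0.45]                 # illustrative reliabilities
avg = weighted_average_bpa(reports, weights)
fused = avg
for _ in range(len(reports) - 1):            # combine the averaged BPA n-1 times
    fused = dempster_combine(fused, avg)
print({tuple(sorted(s)): round(v, 3) for s, v in fused.items()})
```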

  13. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    PubMed

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-18

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient in combining evidence from different sensors. However, when the evidence is highly conflicting, it may yield a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.

  14. LSST Telescope Alignment Plan Based on Nodal Aberration Theory

    NASA Astrophysics Data System (ADS)

    Sebag, J.; Gressler, W.; Schmid, T.; Rolland, J. P.; Thompson, K. P.

    2012-04-01

    The optical alignment of the Large Synoptic Survey Telescope (LSST) is potentially challenging, due to its fast three-mirror optical design and its large 3.5° field of view (FOV). It is highly advantageous to align the three-mirror optical system prior to the integration of the complex science camera on the telescope, which corrects the FOV via three refractive elements and includes the operational wavefront sensors. A telescope alignment method based on nodal aberration theory (NAT) is presented here to address this challenge. Without the science camera installed on the telescope, the on-axis imaging performance of the telescope is diffraction-limited, but the field of view is not corrected. The nodal properties of the three-mirror telescope design have been analyzed and an alignment approach has been developed using the intrinsically linear nodal behavior, which is linked via sensitivities to the misalignment parameters. Since mirror figure errors will exist in any real application, a methodology to introduce primary-mirror figure errors into the analysis has been developed and is also presented.

  15. An Approach to Theory-Based Youth Programming

    ERIC Educational Resources Information Center

    Duerden, Mat D.; Gillard, Ann

    2011-01-01

    A key but often overlooked aspect of intentional, out-of-school-time programming is the integration of a guiding theoretical framework. The incorporation of theory in programming can provide practitioners valuable insights into essential processes and principles of successful programs. While numerous theories exist that relate to youth development…

  16. Information and communication theory. Citations from the NTIS data base

    NASA Astrophysics Data System (ADS)

    Carrigan, B.

    1980-04-01

    This bibliography cites Government-sponsored research on information and communication theory, including the coding, decoding, and transmission of signals. Individual studies are cited on radio, television, and digital communication systems. Pure theory is also included. This updated bibliography contains 187 abstracts, 78 of which are new entries to the previous edition.

  17. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    PubMed Central

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete event simulation (DES) software. We designed an approach to address this limitation and used it to construct an M/G/C/C state-dependent queuing model in the Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time, and the expected number of entities in a complex network topology. Results indicated that, for each network, there is a range of arrival rates in which the simulation results fluctuate drastically across replications, causing the simulation results and the analytical results to exhibit discrepancies. Detailed results showing how closely the simulation results tally with the analytical results, in both abstract and graphical forms, along with scientific justifications for these discrepancies, have been documented and discussed. PMID:23560037
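
    For reference, when service times are taken at their mean the M/G/C/C state-dependent model reduces to a birth-death computation, which is what the simulation results are usually checked against. The Python sketch below computes throughput, blocking probability, expected occupancy, and (via Little's law) expected traversal time for a user-supplied relative-speed function; the linear slowdown and the numbers in the example are illustrative assumptions, not values from the paper.

```python
import math

def mgcc_state_dependent(lam, capacity, t1, speed_ratio):
    """
    lam         : arrival rate (entities per unit time)
    capacity    : C, maximum number of entities in the section
    t1          : mean traversal time of a lone entity
    speed_ratio : f(n) = V_n / V_1, relative speed with n entities present
    """
    log_p = [0.0]                                  # log of unnormalized P(n)
    for n in range(1, capacity + 1):
        log_p.append(log_p[-1] + math.log(lam * t1) - math.log(n * speed_ratio(n)))
    z = max(log_p)
    weights = [math.exp(lp - z) for lp in log_p]   # stabilized against overflow
    total = sum(weights)
    p = [w / total for w in weights]
    blocking = p[capacity]
    throughput = lam * (1.0 - blocking)
    mean_n = sum(n * pn for n, pn in enumerate(p))
    mean_time = mean_n / throughput                # Little's law
    return throughput, blocking, mean_n, mean_time

# Example: capacity 40, linear slowdown to 10% of free-flow speed when full.
f = lambda n: max(1.0 - 0.9 * (n - 1) / 39.0, 0.1)
print(mgcc_state_dependent(lam=5.0, capacity=40, t1=1.0, speed_ratio=f))
```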

  18. Point-based rigid registration: clinical validation of theory

    NASA Astrophysics Data System (ADS)

    West, Jay B.; Fitzpatrick, J. Michael

    2000-06-01

    simulations that the FLE is isotropic, i.e. that it has the same distribution in each coordinate direction. In the present work, we extend the validation beyond simulated data sets to clinically acquired head images from a set of 86 patients. We use the actual localizations of skull-implanted, visible fiducial markers in the images to compare the observed TRE values with those estimated by theory. This approach provides a clinically relevant estimate of the usefulness of the theoretical predictions. We also make a comparison between the observed TRE values and those given by numerical simulation: this allows us to determine whether the assumptions we use for the derivation of our results are good ones in practice. Although the distributions of observed and theoretical values appear different, ROC analysis shows that the theoretical values are good predictors of the observed ones. This gives some validity to the assumptions we make governing the point-based registration process (e.g., that the FLE is isotropic, and that it is independent and identically distributed at each fiducial point), and shows that our theory has practical use in a clinical setting.
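
    Two ingredients of the comparison above are easy to state in code: the rigid registration estimated from fiducial pairs and the widely used first-order approximation that predicts RMS TRE from the fiducial localization error (FLE) and the fiducial geometry. The Python sketch below is a hedged illustration of both; the fiducial coordinates, target, and FLE value in the demo are invented for illustration.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst."""
    cs, cd = src.mean(0), dst.mean(0)
    u, _, vt = np.linalg.svd((src - cs).T @ (dst - cd))
    d = np.sign(np.linalg.det(vt.T @ u.T))
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, cd - r @ cs

def predicted_tre_rms(fiducials, target, fle_rms):
    """First-order RMS TRE approximation at a target point."""
    centroid = fiducials.mean(0)
    centered = fiducials - centroid
    _, _, axes = np.linalg.svd(centered)           # rows = principal axes
    ratio = 0.0
    for k in range(3):
        # target / RMS fiducial distance from principal axis k
        dk = np.linalg.norm(np.cross(target - centroid, axes[k]))
        fk = np.sqrt(np.mean(np.linalg.norm(np.cross(centered, axes[k]), axis=1) ** 2))
        ratio += (dk / fk) ** 2
    return fle_rms * np.sqrt((1.0 + ratio / 3.0) / len(fiducials))

# Toy demo: six invented fiducials (mm), a known rigid motion, and a target.
rng = np.random.default_rng(1)
fids = rng.uniform(-50.0, 50.0, size=(6, 3))
r, t = rigid_register(fids, fids @ np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]]).T + 5.0)
print(np.round(r, 2), np.round(t, 2))
print(predicted_tre_rms(fids, target=np.array([0.0, 0.0, 80.0]), fle_rms=0.4))
```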

  19. The Development of an Attribution-Based Theory of Motivation: A History of Ideas

    ERIC Educational Resources Information Center

    Weiner, Bernard

    2010-01-01

    The history of ideas guiding the development of an attribution-based theory of motivation is presented. These influences include the search for a "grand" theory of motivation (from drive and expectancy/value theory), an attempt to represent how the past may influence the present and the future (as Thorndike accomplished), and the incorporation of…

  20. The Application of Carl Rogers' Person-Centered Learning Theory to Web-Based Instruction.

    ERIC Educational Resources Information Center

    Miller, Christopher T.

    This paper provides a review of literature that relates research on Carl Rogers' person-centered learning theory to Web-based learning. Based on the review of the literature, a set of criteria is described that can be used to determine how closely a Web-based course matches the different components of Rogers' person-centered learning theory. Using…

  1. Optimisation of a honeybee-colony's energetics via social learning based on queuing delays

    NASA Astrophysics Data System (ADS)

    Thenius, Ronald; Schmickl, Thomas; Crailsheim, Karl

    2008-06-01

    Natural selection shaped the foraging-related processes of honeybees in such a way that a colony can react optimally to changing environmental conditions. To investigate this complex dynamic social system, we developed a multi-agent model of the nectar flow inside and outside of a honeybee colony. In a honeybee colony, a temporal caste collects nectar in the environment. These foragers bring their harvest into the colony, where they unload their nectar loads to one or more storer bees. Our model predicts that a cohort of foragers, collecting nectar from a single nectar source, is able to detect changes in the quality of other food sources they have never visited, via the nectar processing system of the colony. We identified two novel pathways of forager-to-forager communication. Foragers can gain information about changes in the nectar flow in the environment via changes in their mean waiting time for unloadings and in the number of multiple unloadings they experience. In this way, two distinct groups of foragers that forage on different nectar sources and that never communicate directly can share information via a third cohort of worker bees. We show that this noisy and loosely knotted social network allows a colony to perform collective information processing, so that a single forager has all the information necessary to 'tune' its social behaviour, such as dancing or dance-following. In this way, the net nectar gain of the colony is increased.

  2. Stochastic extension of cellular manufacturing systems: a queuing-based analysis

    NASA Astrophysics Data System (ADS)

    Fardis, Fatemeh; Zandi, Afagh; Ghezavati, Vahidreza

    2013-07-01

    Clustering parts and machines into part families and machine cells is a major decision in the design of cellular manufacturing systems, known as cell formation. This paper presents a non-linear mixed integer programming model for designing cellular manufacturing systems which assumes that the arrival rate of parts into cells and the machine service rates are stochastic parameters described by exponential distributions. Uncertain situations may create a queue behind each machine; therefore, we consider the average waiting time of parts behind each machine in order to have an efficient system. The objective function minimizes the sum of the machine idleness cost, the subcontracting cost for exceptional parts, the non-utilized machine cost, and the holding cost of parts in the cells. Finally, the linearized model is solved by the Cplex solver of GAMS, and sensitivity analysis is performed to illustrate the effects of the parameters.
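
    The queuing ingredient of the model is the familiar M/M/1 result for the expected wait behind a machine with Poisson arrivals at rate lambda and exponential service at rate mu, W_q = lambda / (mu (mu - lambda)). The short Python sketch below shows that term and how it could feed a holding-cost component; the cost weight and rates are illustrative only.

```python
def mm1_wait_in_queue(lam, mu):
    """Expected time a part waits behind one machine (M/M/1, lam < mu)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return lam / (mu * (mu - lam))

def holding_cost_rate(lam, mu, unit_holding_cost):
    """Holding-cost contribution of one machine: Lq = lam * Wq parts on average."""
    return unit_holding_cost * lam * mm1_wait_in_queue(lam, mu)

print(mm1_wait_in_queue(4.0, 5.0))        # 0.8 time units of queueing delay
print(holding_cost_rate(4.0, 5.0, 2.0))   # cost rate for parts waiting at this machine
```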

  3. Queued History based Mediator Identification for an Incentive Attached peer to peer Electronic Coupon System

    NASA Astrophysics Data System (ADS)

    Shojima, Taiki; Ikkai, Yoshitomo; Komoda, Norihisa

    An incentive-attached peer-to-peer (P2P) electronic coupon system is proposed in which users forward e-coupons to potential users and the forwarding mediators are rewarded with incentives. A service provider needs to acquire the distribution history for incentive payment by recording UserIDs (UIDs) in the e-coupons, since the system is intended for a pure P2P environment. This raises the problem of the distribution history being dishonestly altered. To solve this problem, the distribution history is realized as a pair of queues: a UID queue and a public key queue. At the initial state, each element of the UID queue consists of an index, a secret key, and a digital signature. When a user's UID is recorded, the encrypted UID is enqueued to the UID queue with a new digital signature created with the secret key of the dequeued element, so that no UID can be altered. The public key queue provides the functionality of validating digital signatures on mobile devices. This method makes it possible to certify both each UID and their sequence. The method is evaluated by quantifying risk reduction using Fault Tree Analysis (FTA), and it is found to be better than common encryption methods.
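
    The queue-of-signatures idea can be illustrated with a toy. The hedged Python sketch below uses HMAC as a stand-in for the public-key signatures of the actual scheme (which also carries a parallel public-key queue so that mobile devices can verify); all names, key-derivation choices, and the verification flow are illustrative assumptions, not the paper's protocol.

```python
import hmac, hashlib
from collections import deque

def sign(key, message):
    """HMAC-SHA256 as a toy stand-in for a real digital signature."""
    return hmac.new(key, message, hashlib.sha256).digest()

def make_coupon(n_hops, master_secret=b"issuer-secret"):
    """Pre-load the queue with n_hops elements: (index, per-hop key, placeholder)."""
    queue = deque()
    for i in range(n_hops):
        key = sign(master_secret, b"hop-key-%d" % i)
        queue.append((i, key, b""))
    return {"uids": [], "queue": queue}

def record_mediator(coupon, uid):
    """Dequeue one pre-loaded element and use its key to sign the appended UID."""
    index, key, _ = coupon["queue"].popleft()
    prev_sig = coupon["uids"][-1][2] if coupon["uids"] else b""
    coupon["uids"].append((index, uid, sign(key, prev_sig + uid.encode())))

def verify(coupon, master_secret=b"issuer-secret"):
    """Issuer-side check that every UID and their order are untampered."""
    prev_sig = b""
    for index, uid, sig in coupon["uids"]:
        key = sign(master_secret, b"hop-key-%d" % index)
        if not hmac.compare_digest(sig, sign(key, prev_sig + uid.encode())):
            return False
        prev_sig = sig
    return True

c = make_coupon(3)
for user in ("alice", "bob"):
    record_mediator(c, user)
print(verify(c))   # True; altering any UID or reordering them breaks verification
```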

  4. Using activity-based costing and theory of constraints to guide continuous improvement in managed care.

    PubMed

    Roybal, H; Baxendale, S J; Gupta, M

    1999-01-01

    Activity-based costing and the theory of constraints have been applied successfully in many manufacturing organizations. Recently, those concepts have been applied in service organizations. This article describes the application of activity-based costing and the theory of constraints in a managed care mental health and substance abuse organization. One of the unique aspects of this particular application was the integration of activity-based costing and the theory of constraints to guide process improvement efforts. This article describes the activity-based costing model and the application of the theory of constraints' focusing steps with an emphasis on unused capacities of activities in the organization.

  5. Using activity-based costing and theory of constraints to guide continuous improvement in managed care.

    PubMed

    Roybal, H; Baxendale, S J; Gupta, M

    1999-01-01

    Activity-based costing and the theory of constraints have been applied successfully in many manufacturing organizations. Recently, those concepts have been applied in service organizations. This article describes the application of activity-based costing and the theory of constraints in a managed care mental health and substance abuse organization. One of the unique aspects of this particular application was the integration of activity-based costing and the theory of constraints to guide process improvement efforts. This article describes the activity-based costing model and the application of the theory of constraints' focusing steps with an emphasis on unused capacities of activities in the organization. PMID:10350791

  6. Queuing for Union Jobs and the Social Return to Schooling. Institute for Research on Poverty Discussion Papers. Report 360-76.

    ERIC Educational Resources Information Center

    Bishop, John

    An analysis of the argument that a market imperfection (wage differentials and queuing caused by unions) raises the marginal social product (MSP) of college education above the average before-tax private wage premium (APP) for college (this discrepancy is called a union-Q-nality) focuses on verifying five hypotheses: (1) Workers with identical…

  7. [Peplau's theory of interpersonal relations: an analysis based on Barnum].

    PubMed

    de Almeida, Vitória de Cássia Félix; Lopes, Marcos Venícios de Oliveira; Damasceno, Marta Maria Coelho

    2005-06-01

    The use of theories in Nursing reflects a movement in the profession towards the autonomy and delimitation of its actions. It is, therefore, extremely relevant that theories be analyzed for their applicability to practice. The object of this study is an analytical-descriptive study of Peplau's Theory of Interpersonal Relations in Nursing based on the model of analysis proposed by Barbara Barnum. Among the structural components of a theory that may be analyzed, the element "process" was chosen, a method recommended for the development of nursing actions, which was submitted to Barnum's criteria of usefulness. The assessment showed that Peplau's theoretical presuppositions are operational and may serve as a basis in any situation in which nurses communicate and interact with their patients.

  8. Toward a brain-based theory of beauty.

    PubMed

    Ishizu, Tomohiro; Zeki, Semir

    2011-01-01

    We wanted to learn whether activity in the same area(s) of the brain correlate with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1-9, with 9 being the most beautiful. This allowed us to select three sets of stimuli--beautiful, indifferent and ugly--which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the strength of the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources--musical and visual--and probably by other sources as well. This has led us to formulate a brain-based theory of beauty.

  9. Toward A Brain-Based Theory of Beauty

    PubMed Central

    Ishizu, Tomohiro; Zeki, Semir

    2011-01-01

    We wanted to learn whether activity in the same area(s) of the brain correlate with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1–9, with 9 being the most beautiful. This allowed us to select three sets of stimuli–beautiful, indifferent and ugly–which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the strength of the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources–musical and visual–and probably by other sources as well. This has led us to formulate a brain-based theory of beauty. PMID:21755004

  10. Design of traveling wave tubes based on field theory

    SciTech Connect

    Vanderplaats, N.R.; Kodis, M.A. . Vacuum Electronics Branch); Freund, H.P. )

    1994-07-01

    A method is described for the design of helix traveling wave tubes (TWT) which is based on the linear field analysis of the coupled beam-wave system. The dispersion relations are obtained by matching of radial admittances at boundaries instead of the individual field components. This approach provides flexibility in modeling various beam and circuit configurations with relative ease by choosing the appropriate admittance functions for each case. The method is illustrated for the case of a solid beam inside a sheath helix which is loaded externally by lossy dielectric material, a conducting cylinder, and axial vanes. Extension of the analysis to include a thin tape helix model is anticipated in the near future. The TWT model may be divided into axial regions to include velocity tapers, lossy materials and severs, with the helix geometry in each region varied arbitrarily. The relations between the ac velocities, current densities, and axial electric fields are used to derive a general expression for the new amplitudes of the three forward waves at each axial boundary. The sum of the fields for the three forward waves (two waves in a drift region) is followed to the circuit output. Numerical results of the field analysis are compared with the coupled-mode Pierce theory. A method is suggested for applying the field analysis to accurate design of practical TWT's that have a more complex circuit geometry, which starts with a simple measurement of the dispersion of the helix circuit. The field analysis may then be used to generate a circuit having properties very nearly equivalent to those of the actual circuit.

  11. Trends in information theory-based chemical structure codification.

    PubMed

    Barigye, Stephen J; Marrero-Ponce, Yovani; Pérez-Giménez, Facundo; Bonchev, Danail

    2014-08-01

    This report offers a chronological review of the most relevant applications of information theory in the codification of chemical structure information, through the so-called information indices. Basically, these are derived from the analysis of the statistical patterns of molecular structure representations, which include primitive global chemical formulae, chemical graphs, or matrix representations. Finally, new approaches that attempt to go "back to the roots" of information theory, in order to integrate other information-theoretic measures in chemical structure coding are discussed.

  12. Quantum mechanical embedding theory based on a unique embedding potential

    SciTech Connect

    Chen Huang; Pavone, Michele; Carter, Emily A.

    2011-04-21

    We remove the nonuniqueness of the embedding potential that exists in most previous quantum mechanical embedding schemes by letting the environment and embedded region share a common embedding (interaction) potential. To efficiently solve for the embedding potential, an optimized effective potential method is derived. This embedding potential, which eschews use of approximate kinetic energy density functionals, is then used to describe the environment while a correlated wavefunction (CW) treatment of the embedded region is employed. We first demonstrate the accuracy of this new embedded CW (ECW) method by calculating the van der Waals binding energy curve between a hydrogen molecule and a hydrogen chain. We then examine the prototypical adsorption of CO on a metal surface, here the Cu(111) surface. In addition to obtaining proper site ordering (top site most stable) and binding energies within this theory, the ECW exhibits dramatic changes in the p-character of the CO 4σ and 5σ orbitals upon adsorption that agree very well with x-ray emission spectra, providing further validation of the theory. Finally, we generalize our embedding theory to spin-polarized quantum systems and discuss the connection between our theory and partition density functional theory.

  13. Enhancing Student Learning in Knowledge-Based Courses: Integrating Team-Based Learning in Mass Communication Theory Classes

    ERIC Educational Resources Information Center

    Han, Gang; Newell, Jay

    2014-01-01

    This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…

  14. Development of StopAdvisor: A theory-based interactive internet-based smoking cessation intervention.

    PubMed

    Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert

    2012-09-01

    Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change. PMID:24073123

  15. A Theory-Driven Integrative Process/Outcome Evaluation of a Concept-Based Nursing Curriculum

    ERIC Educational Resources Information Center

    Fromer, Rosemary F.

    2013-01-01

    The current trend in curriculum revision in nursing education is concept-based learning, but little research has been done on concept-based curricula in nursing education. The study used a theory-driven integrative process/outcome evaluation. Embedded in this theory-driven integrative process/outcome evaluation was a causal comparative…

  16. How Is a Science Lesson Developed and Implemented Based on Multiple Intelligences Theory?

    ERIC Educational Resources Information Center

    Kaya, Osman Nafiz

    2008-01-01

    The purpose of this study is to present the whole process step-by-step of how a science lesson can be planned and implemented based on Multiple Intelligences (MI) theory. First, it provides the potential of the MI theory for science teaching and learning. Then an MI science lesson that was developed based on a modified model in the literature and…

  17. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  18. Capacity and Delay Estimation for Roundabouts Using Conflict Theory

    PubMed Central

    Qu, Zhaowei; Duan, Yuzhou; Hu, Hongyu; Song, Xianmin

    2014-01-01

    To estimate the capacity of roundabouts more accurately, the priority rank of each stream is determined through the classification technique given in the Highway Capacity Manual 2010 (HCM2010), which is based on macroscopic analysis of the relationship between entry flow and circulating flow. Then a conflict matrix is established using the additive conflict flow method and by considering the impacts of traffic characteristics and limited priority with high volume. Correspondingly, the conflict relationships of streams are built using probability theory. Furthermore, the entry capacity model of roundabouts is built, and sensitivity analysis is conducted on the model parameters. Finally, the entrance delay model is derived using queuing theory, and the proposed capacity model is compared with the model proposed by Wu and that in the HCM2010. The results show that the capacity calculated by the proposed model is lower than the others for an A-type roundabout, while it is basically consistent with the estimated values from HCM2010 for a B-type roundabout. PMID:24982982
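
    The delay step can be illustrated once an entry capacity is available: treating the entry as an M/M/1 queue with service rate equal to the capacity C and demand q gives the familiar steady-state delay 1/(C - q). The Python sketch below shows that relation with illustrative volumes; it is a simplification of the delay model actually derived in the paper.

```python
def entry_delay_seconds(demand_vph, capacity_vph):
    """Mean steady-state delay per vehicle for an undersaturated entry."""
    if demand_vph >= capacity_vph:
        raise ValueError("oversaturated entry: the steady-state queue grows without bound")
    return 3600.0 / (capacity_vph - demand_vph)

# Illustrative entry with a 1000 veh/h capacity under increasing demand.
for q in (300, 600, 900):
    print(q, "veh/h ->", round(entry_delay_seconds(q, capacity_vph=1000), 1), "s/veh")
```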

  19. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  20. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  1. Logical Thinking in Children; Research Based on Piaget's Theory.

    ERIC Educational Resources Information Center

    Sigel, Irving E., Ed.; Hooper, Frank H., Ed.

    Theoretical and empirical research derived from Piagetian theory is collected on the intellectual development of the elementary school child and his acquisition and utilization of conservation concepts. The articles present diversity of method and motive in the results of replication (validation studies of the description of cognitive growth) and…

  2. Automatic Trading Agent. RMT Based Portfolio Theory and Portfolio Selection

    NASA Astrophysics Data System (ADS)

    Snarska, M.; Krzych, J.

    2006-11-01

    Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio arising from lack of information, uncertainty, and incomplete knowledge of reality, which together forbid a perfect prediction of future price changes. Despite its many advantages, this tool is not well known and not widely used among investors on the Warsaw Stock Exchange. The main reason for abandoning this method is its high level of complexity and the immense calculations involved. The aim of this paper is to introduce an automatic decision-making system that allows a single investor to use complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so that future risk and return are better estimated and controlled. These concepts are applied to the Warsaw Stock Exchange Simulator (http://gra.onet.pl). The simulation yields an 18% gain, compared with a respective 10% loss of the Warsaw Stock Exchange main index WIG.
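
    A hedged sketch of the RMT cleaning step described above: eigenvalues of the sample correlation matrix lying below the Marchenko-Pastur upper edge are treated as noise and flattened to their average before minimum-variance weights are computed. The toy return matrix, the absence of a market-mode correction, and the plain minimum-variance objective are simplifying assumptions, not the authors' full procedure.

```python
import numpy as np

def clean_correlation(returns):
    """Flatten eigenvalues below the Marchenko-Pastur edge of a pure-noise matrix."""
    t, n = returns.shape
    corr = np.corrcoef(returns, rowvar=False)
    evals, evecs = np.linalg.eigh(corr)
    lam_max = (1.0 + np.sqrt(n / t)) ** 2
    noise = evals < lam_max
    if noise.any():
        evals[noise] = evals[noise].mean()
    cleaned = evecs @ np.diag(evals) @ evecs.T
    np.fill_diagonal(cleaned, 1.0)
    return cleaned

def min_variance_weights(returns):
    """Minimum-variance weights from the cleaned covariance matrix."""
    vols = returns.std(axis=0)
    cov = clean_correlation(returns) * np.outer(vols, vols)
    w = np.linalg.solve(cov, np.ones(cov.shape[0]))
    return w / w.sum()

rng = np.random.default_rng(0)
toy_returns = rng.normal(size=(500, 50))        # 500 days, 50 assets, pure noise
print(min_variance_weights(toy_returns).round(3))
```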

  3. Effective Contraceptive Use: An Exploration of Theory-Based Influences

    ERIC Educational Resources Information Center

    Peyman, N.; Oakley, D.

    2009-01-01

    The purpose of this study was to explore factors that influence oral contraceptive (OC) use among women in Iran using the Theory of Planned Behavior (TPB) and concept of self-efficacy (SE). The study sample consisted of 360 married OC users, aged 18-49 years recruited at public health centers of Mashhad, 900 km east of Tehran. SE had the strongest…

  4. Prior individual training and self-organized queuing during group emergency escape of mice from water pool.

    PubMed

    Saloma, Caesar; Perez, Gay Jane; Gavile, Catherine Ann; Ick-Joson, Jacqueline Judith; Palmes-Saloma, Cynthia

    2015-01-01

    We study the impact of prior individual training during group emergency evacuation using mice that escape from an enclosed water pool to a dry platform via any of two possible exits. Experimenting with mice avoids serious ethical and legal issues that arise when dealing with unwitting human participants while minimizing concerns regarding the reliability of results obtained from simulated experiments using 'actors'. First, mice were trained separately and their individual escape times measured over several trials. Mice learned quickly to swim towards an exit-they achieved their fastest escape times within the first four trials. The trained mice were then placed together in the pool and allowed to escape. No two mice were permitted in the pool beforehand and only one could pass through an exit opening at any given time. At first trial, groups of trained mice escaped seven and five times faster than their corresponding control groups of untrained mice at pool occupancy rate ρ of 11.9% and 4%, respectively. Faster evacuation happened because trained mice: (a) had better recognition of the available pool space and took shorter escape routes to an exit, (b) were less likely to form arches that blocked an exit opening, and (c) utilized the two exits efficiently without preference. Trained groups achieved continuous egress without an apparent leader-coordinator (self-organized queuing)-a collective behavior not experienced during individual training. Queuing was unobserved in untrained groups where mice were prone to wall seeking, aimless swimming and/or blind copying that produced circuitous escape routes, biased exit use and clogging. The experiments also reveal that faster and less costly group training at ρ = 4%, yielded an average individual escape time that is comparable with individualized training. However, group training in a more crowded pool (ρ = 11.9%) produced a longer average individual escape time.

  5. Local control theory in trajectory-based nonadiabatic dynamics

    SciTech Connect

    Curchod, Basile F. E.; Penfold, Thomas J.; Rothlisberger, Ursula; Tavernelli, Ivano

    2011-10-15

    In this paper, we extend the implementation of nonadiabatic molecular dynamics within the framework of time-dependent density-functional theory in an external field described in Tavernelli et al.[Phys. Rev. A 81, 052508 (2010)] by calculating on-the-fly pulses to control the population transfer between electronic states using local control theory. Using Tully's fewest switches trajectory surface hopping method, we perform MD to control the photoexcitation of LiF and compare the results to quantum dynamics (QD) calculations performed within the Heidelberg multiconfiguration time-dependent Hartree package. We show that this approach is able to calculate a field that controls the population transfer between electronic states. The calculated field is in good agreement with that obtained from QD, and the differences that arise are discussed in detail.

  6. Ground movement analysis based on stochastic medium theory.

    PubMed

    Fei, Meng; Wu, Li-chun; Zhang, Jia-sheng; Deng, Guo-dong; Ni, Zhi-hui

    2014-01-01

    In order to calculate the ground movement induced by displacement piles driven into horizontal layered strata, an axisymmetric model was built and then the vertical and horizontal ground movement functions were deduced using stochastic medium theory. Results show that the vertical ground movement obeys normal distribution function, while the horizontal ground movement is an exponential function. Utilizing field measured data, parameters of these functions can be obtained by back analysis, and an example was employed to verify this model. Result shows that stochastic medium theory is suitable for calculating the ground movement in pile driving, and there is no need to consider the constitutive model of soil or contact between pile and soil. This method is applicable in practice. PMID:24701184

  7. Predicting the Number of Public Computer Terminals Needed for an On-Line Catalog: A Queuing Theory Approach.

    ERIC Educational Resources Information Center

    Knox, A. Whitney; Miller, Bruce A.

    1980-01-01

    Describes a method for estimating the number of cathode ray tube terminals needed for public use of an online library catalog. Authors claim method could also be used to estimate needed numbers of microform readers for a computer output microform (COM) catalog. Formulae are included. (Author/JD)
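
    The same kind of estimate is straightforward with the standard M/M/c (Erlang C) formulas: pick the smallest number of terminals that keeps the probability of waiting and the mean wait below chosen targets. The Python sketch below illustrates the calculation; the arrival rate, session length, and service targets are illustrative, and the article's own formulae may differ in detail.

```python
import math

def erlang_c(c, a):
    """Probability that an arrival must wait, offered load a = lam/mu Erlangs."""
    inv_b = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / math.factorial(c) * c / (c - a)
    return top / (inv_b + top)

def terminals_needed(arrivals_per_hour, mean_session_min, max_wait_min=2.0, max_p_wait=0.2):
    a = arrivals_per_hour * mean_session_min / 60.0   # offered load in Erlangs
    c = int(a) + 1                                    # need c > a for stability
    while True:
        p_wait = erlang_c(c, a)
        mean_wait = p_wait * mean_session_min / (c - a)   # Wq for M/M/c, in minutes
        if p_wait <= max_p_wait and mean_wait <= max_wait_min:
            return c, round(p_wait, 3), round(mean_wait, 2)
        c += 1

# 60 patrons/hour, 5-minute average sessions, modest waiting targets.
print(terminals_needed(arrivals_per_hour=60, mean_session_min=5))
```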

  8. Gravitational Cherenkov losses in theories based on modified Newtonian dynamics.

    PubMed

    Milgrom, Mordehai

    2011-03-18

    Survival of high-energy cosmic rays (HECRs) against gravitational Cherenkov losses is shown not to cast strong constraints on modified Newtonian dynamics (MOND) theories that are compatible with general relativity (GR): theories that coincide with GR for accelerations ≫ a_0 (a_0 is the MOND constant). The energy-loss rate, Ė, is many orders of magnitude smaller than those derived in the literature for theories with no extra scale. The modification to GR, which underlies Ė, enters only beyond the MOND radius of the particle: r_M = (Gp/ca_0)^(1/2). The spectral cutoff, entering Ė quadratically, is thus r_M^(-1), not k_dB = p/ℏ. Thus, Ė is smaller than published rates, which use k_dB, by a factor ∼(r_M k_dB)^2 ≈ 10^39 (cp/3×10^11 GeV)^3. Losses are important only beyond D_loss ≈ q ℓ_M, where q is a dimensionless factor and ℓ_M = c^2/a_0 is the MOND length, which is ≈2π times the Hubble distance. PMID:21469855
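
    As a rough numeric check of the quoted suppression factor, the short Python script below evaluates (r_M k_dB)^2 for cp = 3×10^11 GeV using standard constants and the conventional a_0 ≈ 1.2×10^-10 m/s^2; it reproduces the stated order of magnitude (~10^39). The constant values are assumptions of this check, not taken from the paper.

```python
import math

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34     # SI units
a0 = 1.2e-10                                   # assumed MOND acceleration, m/s^2
p = (3e11 * 1.602e-10) / c                     # momentum for cp = 3e11 GeV, kg m/s

r_M = math.sqrt(G * p / (c * a0))              # MOND radius of the particle
k_dB = p / hbar                                # de Broglie wavenumber
print(f"(r_M * k_dB)^2 ~ {(r_M * k_dB) ** 2:.1e}")   # ~1e39 in order of magnitude
```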

  9. Circuit theory and model-based inference for landscape connectivity

    USGS Publications Warehouse

    Hanks, Ephraim M.; Hooten, Mevin B.

    2013-01-01

    Circuit theory has seen extensive recent use in the field of ecology, where it is often applied to study functional connectivity. The landscape is typically represented by a network of nodes and resistors, with the resistance between nodes a function of landscape characteristics. The effective distance between two locations on a landscape is represented by the resistance distance between the nodes in the network. Circuit theory has been applied to many other scientific fields for exploratory analyses, but parametric models for circuits are not common in the scientific literature. To model circuits explicitly, we demonstrate a link between Gaussian Markov random fields and contemporary circuit theory using a covariance structure that induces the necessary resistance distance. This provides a parametric model for second-order observations from such a system. In the landscape ecology setting, the proposed model provides a simple framework where inference can be obtained for effects that landscape features have on functional connectivity. We illustrate the approach through a landscape genetics study linking gene flow in alpine chamois (Rupicapra rupicapra) to the underlying landscape.
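
    The quantity being modeled, the resistance distance, is computed directly from the Moore-Penrose pseudoinverse of the graph Laplacian of the resistor network. The Python sketch below shows that computation for a toy four-node landscape graph; the graph and conductances are illustrative, and the statistical model linking them to landscape covariates is not reproduced here.

```python
import numpy as np

def resistance_distance(conductance):
    """All-pairs resistance distances for a symmetric conductance (weight) matrix."""
    lap = np.diag(conductance.sum(axis=1)) - conductance
    lplus = np.linalg.pinv(lap)                # Moore-Penrose pseudoinverse
    d = np.diag(lplus)
    return d[:, None] + d[None, :] - 2.0 * lplus

# Four landscape cells in a line; edge conductances reflect landscape permeability.
g = np.zeros((4, 4))
for i, j, w in [(0, 1, 1.0), (1, 2, 0.5), (2, 3, 1.0)]:
    g[i, j] = g[j, i] = w
print(resistance_distance(g).round(2))         # e.g., R(0,3) = 1 + 2 + 1 = 4
```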

  10. Collective learning modeling based on the kinetic theory of active particles

    NASA Astrophysics Data System (ADS)

    Burini, D.; De Lillo, S.; Gibelli, L.

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.

  11. Collective learning modeling based on the kinetic theory of active particles.

    PubMed

    Burini, D; De Lillo, S; Gibelli, L

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.

  12. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  13. Interactive image segmentation framework based on control theory

    NASA Astrophysics Data System (ADS)

    Zhu, Liangjia; Kolesov, Ivan; Ratner, Vadim; Karasev, Peter; Tannenbaum, Allen

    2015-03-01

    Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated.

  14. A precepted leadership course based on Bandura's social learning theory.

    PubMed

    Haddock, K S

    1994-01-01

    Transition from student to registered nurse (RN) has long been cited as a difficult time for new graduates entering health care. Bandura's (1977) theory of social learning guided a revision of a nursing leadership course required of baccalaureate student nurses (BSNs) in their final semester. The preceptorship allowed students to work closely with and to practice modeled behaviors of RNs and then receive feedback and reinforcement from both the preceptor and the supervising faculty member. Students were thus prepared to function better in the reality of the practice setting. Positive outcomes were experienced by students, BSN preceptors, faculty, and nurse administrators.

  15. Wireless network traffic modeling based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Liu, Chunfeng; Shu, Yantai; Yang, Oliver W. W.; Liu, Jiakun; Dong, Linfang

    2006-10-01

    In this paper, Extreme Value Theory (EVT) is applied to analyze wireless network traffic. The role of EVT is to allow the development of scientifically and statistically rational procedures for estimating the extreme behavior of random processes. There are two primary methods for studying extremes: the Block Maximum (BM) method and the Points Over Threshold (POT) method. Using only the limited traffic data that exceed the threshold value, our experiments and analysis show that the wireless network traffic model obtained with EVT fits the empirical traffic distribution well, illustrating that EVT has good application prospects in the analysis of wireless network traffic.
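
    A hedged sketch of the POT workflow mentioned above: keep the samples exceeding a high threshold, fit a generalized Pareto distribution to the exceedances, and read extreme quantiles off the fitted tail. The synthetic lognormal "traffic" series, the 95% threshold, and the target exceedance probability are illustrative stand-ins for real traces and for the paper's choices.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
traffic = rng.lognormal(mean=3.0, sigma=0.6, size=10_000)   # synthetic load samples

threshold = np.quantile(traffic, 0.95)
exceedances = traffic[traffic > threshold] - threshold
shape, _, scale = genpareto.fit(exceedances, floc=0)        # fit the tail only

# Level expected to be exceeded once per 100,000 samples, read off the fitted tail.
p_exceed = len(exceedances) / len(traffic)
target = 1e-5
extreme_level = threshold + genpareto.ppf(1 - target / p_exceed, shape, loc=0, scale=scale)
print(round(threshold, 1), round(extreme_level, 1))
```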

  16. An eddy tracking algorithm based on dynamical systems theory

    NASA Astrophysics Data System (ADS)

    Conti, Daniel; Orfila, Alejandro; Mason, Evan; Sayol, Juan Manuel; Simarro, Gonzalo; Balle, Salvador

    2016-11-01

    This work introduces a new method for ocean eddy detection that applies concepts from stationary dynamical systems theory. The method is composed of three steps: first, the centers of eddies are obtained from fixed points and their linear stability analysis; second, the size of the eddies is estimated from the vorticity between the eddy center and its neighboring fixed points, and, third, a tracking algorithm connects the different time frames. The tracking algorithm has been designed to avoid mismatching connections between eddies at different frames. Eddies are detected for the period between 1992 and 2012 using geostrophic velocities derived from AVISO altimetry and a new database is provided for the global ocean.
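
    A hedged sketch of the first two steps described above: locate grid cells where both velocity components change sign (candidate fixed points) and keep those whose local velocity-gradient Jacobian has rotation-dominated (complex) eigenvalues. The sign-change test, the classification criterion, and the synthetic vortex are simplifying assumptions and are not the authors' exact algorithm; the vorticity-based sizing and the tracking step are omitted.

```python
import numpy as np

def candidate_eddy_centers(u, v, dx=1.0, dy=1.0):
    """Cells where both velocity components change sign and rotation dominates."""
    dudy, dudx = np.gradient(u, dy, dx)
    dvdy, dvdx = np.gradient(v, dy, dx)
    centers = []
    for j in range(u.shape[0] - 1):
        for i in range(u.shape[1] - 1):
            cu, cv = u[j:j + 2, i:i + 2], v[j:j + 2, i:i + 2]
            if cu.min() < 0 < cu.max() and cv.min() < 0 < cv.max():
                jac = np.array([[dudx[j, i], dudy[j, i]],
                                [dvdx[j, i], dvdy[j, i]]])
                eigs = np.linalg.eigvals(jac)
                if np.all(np.abs(eigs.imag) > np.abs(eigs.real)):   # elliptic point
                    centers.append((j, i))
    return centers

# Synthetic vortex on a grid chosen so the zero crossing falls between samples.
coords = np.linspace(-1.95, 1.95, 40)
y, x = np.meshgrid(coords, coords, indexing="ij")
r2 = x ** 2 + y ** 2
print(candidate_eddy_centers(-y * np.exp(-r2), x * np.exp(-r2), dx=0.1, dy=0.1))
```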

  17. Curriculum Design for Junior Life Sciences Based Upon the Theories of Piaget and Skinner. Final Report.

    ERIC Educational Resources Information Center

    Pearce, Ella Elizabeth

    Four seventh grade life science classes, given curriculum materials based upon Piagetian theories of intellectual development and Skinner's theories of secondary reinforcement, were compared with four control classes from the same school districts. Nine students from each class, who (at the pretest) were at the concrete operations stage of…

  18. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    ERIC Educational Resources Information Center

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  19. Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact

    ERIC Educational Resources Information Center

    McQuillin, Samuel D.; Lyons, Michael D.

    2016-01-01

    This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…

  20. Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory

    ERIC Educational Resources Information Center

    Johnson, David W.; Johnson, Roger T.; Smith, Karl A.

    2014-01-01

    Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…

  1. Social Learning Theory Parenting Intervention Promotes Attachment-Based Caregiving in Young Children: Randomized Clinical Trial

    ERIC Educational Resources Information Center

    O'Connor, Thomas G.; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

    2013-01-01

    Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social…

  2. The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation

    ERIC Educational Resources Information Center

    Moorer, Cleamon, Jr.

    2014-01-01

    This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…

  3. Applying Ecological Theory to Advance the Science and Practice of School-Based Prejudice Reduction Interventions

    ERIC Educational Resources Information Center

    McKown, Clark

    2005-01-01

    Several school-based racial prejudice-reduction interventions have demonstrated some benefit. Ecological theory serves as a framework within which to understand the limits and to enhance the efficacy of prejudice-reduction interventions. Using ecological theory, this article examines three prejudice-reduction approaches, including social cognitive…

  4. Teachers' Private Theories and their Design of Technology-Based Learning

    ERIC Educational Resources Information Center

    Churchill, Daniel

    2006-01-01

    This study explores the private theories of four vocational education teachers in Singapore who have engaged in the design of technology-based learning for their own classes. The understanding of teachers' private theories is important in the context of contemporary educational reforms, which emphasise the shift towards student-centred practices…

  5. Assembly models for Papovaviridae based on tiling theory

    NASA Astrophysics Data System (ADS)

    Keef, T.; Taormina, A.; Twarock, R.

    2005-09-01

    A vital constituent of a virus is its protein shell, called the viral capsid, that encapsulates and hence provides protection for the viral genome. Assembly models are developed for viral capsids built from protein building blocks that can assume different local bonding structures in the capsid. This situation occurs, for example, for viruses in the family of Papovaviridae, which are linked to cancer and are hence of particular interest for the health sector. More specifically, the viral capsids of the (pseudo-) T = 7 particles in this family consist of pentamers that exhibit two different types of bonding structures. While this scenario cannot be described mathematically in terms of Caspar-Klug theory (Caspar D L D and Klug A 1962 Cold Spring Harbor Symp. Quant. Biol. 27 1), it can be modelled via tiling theory (Twarock R 2004 J. Theor. Biol. 226 477). The latter is used to encode the local bonding environment of the building blocks in a combinatorial structure, called the assembly tree, which is a basic ingredient in the derivation of assembly models for Papovaviridae along the lines of the equilibrium approach of Zlotnick (Zlotnick A 1994 J. Mol. Biol. 241 59). A phase space formalism is introduced to characterize the changes in the assembly pathways and intermediates triggered by the variations in the association energies characterizing the bonds between the building blocks in the capsid. Furthermore, the assembly pathways and concentrations of the statistically dominant assembly intermediates are determined. The example of Simian virus 40 is discussed in detail.

  6. Assembly models for Papovaviridae based on tiling theory.

    PubMed

    Keef, T; Taormina, A; Twarock, R

    2005-09-01

    A vital constituent of a virus is its protein shell, called the viral capsid, that encapsulates and hence provides protection for the viral genome. Assembly models are developed for viral capsids built from protein building blocks that can assume different local bonding structures in the capsid. This situation occurs, for example, for viruses in the family of Papovaviridae, which are linked to cancer and are hence of particular interest for the health sector. More specifically, the viral capsids of the (pseudo-) T = 7 particles in this family consist of pentamers that exhibit two different types of bonding structures. While this scenario cannot be described mathematically in terms of Caspar-Klug theory (Caspar D L D and Klug A 1962 Cold Spring Harbor Symp. Quant. Biol. 27 1), it can be modelled via tiling theory (Twarock R 2004 J. Theor. Biol. 226 477). The latter is used to encode the local bonding environment of the building blocks in a combinatorial structure, called the assembly tree, which is a basic ingredient in the derivation of assembly models for Papovaviridae along the lines of the equilibrium approach of Zlotnick (Zlotnick A 1994 J. Mol. Biol. 241 59). A phase space formalism is introduced to characterize the changes in the assembly pathways and intermediates triggered by the variations in the association energies characterizing the bonds between the building blocks in the capsid. Furthermore, the assembly pathways and concentrations of the statistically dominant assembly intermediates are determined. The example of Simian virus 40 is discussed in detail. PMID:16224123

  7. A reactive mobile robot based on a formal theory of action

    SciTech Connect

    Baral, C. Floriano, L.; Gabaldon, A.

    1996-12-31

    One agenda behind research in reasoning about actions is to develop autonomous agents (robots) that can act in a dynamic world. The early attempts to use theories of reasoning about actions and planning to formulate a robot control architecture were not successful for several reasons: the early theories based on STRIPS and its extensions allowed only observations about the initial state, and a robot control architecture using these theories was usually of the form (i) make observations, (ii) use the action theory to construct a plan to achieve the goal, and (iii) execute the plan.
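
    The control architecture criticized in the abstract can be summarized in a few lines. The sketch below is a hypothetical observe-plan-execute loop; the goal, sense, plan and do callables are placeholders introduced here for illustration, not part of the authors' system.

```python
# Sketch of the classical "observe -> plan -> execute" control loop the
# abstract describes (and criticizes for dynamic worlds). The goal,
# sense, plan and do callables are hypothetical placeholders, not part
# of the authors' architecture.
def control_loop(goal, sense, plan, do, max_cycles=10):
    for _ in range(max_cycles):
        state = sense()                  # (i) make observations
        if goal(state):
            return state                 # goal already satisfied
        for action in plan(state, goal): # (ii) plan from the action theory
            do(action)                   # (iii) execute the plan blindly --
                                         #       no re-observation, which is the
                                         #       weakness in a changing world
    raise RuntimeError("goal not reached within cycle budget")

# Toy usage: reach a counter value of 3 by repeatedly incrementing.
world = {"x": 0}
final = control_loop(
    goal=lambda s: s >= 3,
    sense=lambda: world["x"],
    plan=lambda s, goal: ["inc"] * (3 - s),
    do=lambda a: world.update(x=world["x"] + 1),
)
print("final state:", final)
```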

  8. Determination of the Sediment Carrying Capacity Based on Perturbed Theory

    PubMed Central

    Ni, Zhi-hui; Zeng, Qiang; Li-chun, Wu

    2014-01-01

    According to previous studies of sediment carrying capacity, a new method for determining sediment carrying capacity based on perturbed theory was proposed. By taking into account the average water depth, average flow velocity, settling velocity, and other influencing factors, and introducing the median grain size as one main influencing factor in deriving the new formula, we established a new sediment carrying capacity formula. The coefficients were determined by the principle of dimensional analysis, the multiple linear regression method, and the least squares method. After that, the new formula was verified against measured data from natural rivers and flume tests, and the results were compared with those calculated by the Cao, Zhang, Li, Engelund-Hansen, Ackers-White, and Yang formulas. The comparison shows that the new method is of high accuracy. It could be a useful reference for the determination of sediment carrying capacity. PMID:25136652
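
    As an illustration of the regression step described above, the sketch below fits the coefficients of an assumed power-law carrying-capacity formula by least squares on log-transformed synthetic data. The functional form and the data are assumptions for demonstration only; they are not the formula or measurements of the paper.

```python
# Hedged sketch of the regression step: fitting the coefficients of an
# assumed power-law carrying-capacity formula
#     S = k * (U**3 / (g*h*w))**m * (d50/h)**p
# by least squares on log-transformed data. The functional form and the
# synthetic data are illustrative only, not the paper's formula or
# measurements.
import numpy as np

g = 9.81
rng = np.random.default_rng(0)

# synthetic observations: velocity U, depth h, settling velocity w, grain size d50
U = rng.uniform(0.5, 3.0, 50)
h = rng.uniform(1.0, 10.0, 50)
w = rng.uniform(0.01, 0.05, 50)
d50 = rng.uniform(1e-4, 1e-3, 50)
S = 0.05 * (U**3 / (g * h * w))**0.9 * (d50 / h)**0.1 * rng.lognormal(0.0, 0.05, 50)

# log-linearize: ln S = ln k + m*ln(U^3/(g h w)) + p*ln(d50/h)
X = np.column_stack([np.ones_like(S), np.log(U**3 / (g * h * w)), np.log(d50 / h)])
coef, *_ = np.linalg.lstsq(X, np.log(S), rcond=None)
k, m, p = np.exp(coef[0]), coef[1], coef[2]
print(f"fitted k={k:.3f}, m={m:.3f}, p={p:.3f}")
```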

  9. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site group or the conventional Web sites group. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise, and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions. PMID:20978408

  10. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site group or the conventional Web sites group. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise, and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions.

  11. Theory-based Low-Sodium Diet Education for Heart Failure Patients

    PubMed Central

    Welsh, Darlene; Marcinek, Regina; Abshire, Demetrius; Lennie, Terry; Biddle, Martha; Bentley, Brooke; Moser, Debra

    2010-01-01

    Theory-based teaching strategies for promoting adherence to a low-sodium diet among patients with heart failure are presented in this manuscript. The strategies, which are based on the theory of planned behavior, address patient attitude, subjective norm, and perceived control as they learn how to follow a low-sodium diet. Home health clinicians can select a variety of the instructional techniques presented to meet individual patient learning needs. PMID:20592543

  12. Interlaminar Stresses by Refined Beam Theories and the Sinc Method Based on Interpolation of Highest Derivative

    NASA Technical Reports Server (NTRS)

    Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander

    2010-01-01

    Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.
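
    The post-processing idea described above, recovering interlaminar stresses by integrating the three-dimensional equilibrium equations through the thickness, can be sketched as follows. The axial-stress gradient used in the demo is an assumed illustrative field, not output of the beam theories discussed in the abstract.

```python
# Minimal sketch of recovering the transverse shear stress tau_xz(z) by
# integrating the 3-D equilibrium equation through the thickness,
#     d(tau_xz)/dz = -d(sigma_xx)/dx,
# which is the post-processing idea the abstract describes. The axial
# stress field used here is an assumed illustrative function, not output
# of the beam theories in the paper.
import numpy as np

def recover_tau_xz(dsigma_dx, z, tau_bottom=0.0):
    """dsigma_dx: d(sigma_xx)/dx sampled at the z stations;
    z: through-thickness coordinates (bottom to top);
    returns tau_xz at the same stations, enforcing tau_xz = tau_bottom
    at the bottom free surface."""
    tau = np.empty_like(z)
    tau[0] = tau_bottom
    for i in range(1, len(z)):
        # trapezoidal integration of -d(sigma_xx)/dx over [z[i-1], z[i]]
        tau[i] = tau[i-1] - 0.5 * (dsigma_dx[i] + dsigma_dx[i-1]) * (z[i] - z[i-1])
    return tau

# Illustrative linear-through-thickness axial stress gradient (pure bending-like)
z = np.linspace(-0.5, 0.5, 101)
tau = recover_tau_xz(dsigma_dx=200.0 * z, z=z)
print(f"max |tau_xz| = {np.abs(tau).max():.2f} (parabolic, zero at both faces)")
```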

  13. Neuroanatomical and neurochemical bases of theory of mind.

    PubMed

    Abu-Akel, Ahmad; Shamay-Tsoory, Simone

    2011-09-01

    This paper presents a novel neurobiological model of theory of mind (ToM) that incorporates both neuroanatomical and neurochemical levels of specificity. Within this model, cortical and subcortical regions are functionally organized into networks that subserve the ability to represent cognitive and affective mental states to both self and other. The model maintains that (1) cognitive and affective aspects of ToM are subserved by dissociable, yet interacting, prefrontal networks. The cognitive ToM network primarily engages the dorsomedial prefrontal cortex, the dorsal anterior cingulate cortex and the dorsal striatum; and the affective ToM network primarily engages the ventromedial and orbitofrontal cortices, the ventral anterior cingulate cortex, the amygdala and the ventral striatum; (2) self and other mental-state representation is processed by distinct brain regions within the mentalizing network, and that the ability to distinguish between self and other mental states is modulated by a functionally interactive dorsal and ventral attention/selection systems at the temporoparietal junction and the anterior cingulate cortex; and (3) ToM functioning is dependent on the integrity of the dopaminergic and serotonergic systems which are primarily engaged in the maintenance and application processes of represented mental states. In addition to discussing the mechanisms involved in mentalizing in terms of its component processes, we discuss the model's implications to pathologies that variably impact one's ability to represent, attribute and apply mental states. PMID:21803062

  14. Seismic data reconstruction based on CS and Fourier theory

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Chen, Xiao-Hong; Wu, Xin-Min

    2013-06-01

    Traditional seismic data sampling follows the Nyquist sampling theorem. In this paper, we introduce the theory of compressive sensing (CS), breaking through the limitations of the traditional Nyquist sampling theorem, rendering the coherent aliases of regular undersampling into harmless incoherent random noise using random undersampling, and effectively turning the reconstruction problem into a much simpler denoising problem. We introduce the projections onto convex sets (POCS) algorithm in the data reconstruction process, apply the exponential decay threshold parameter in the iterations, and modify the traditional reconstruction process that performs forward and reverse transforms in the time and space domain. We propose a new method that uses forward and reverse transforms in the space domain. The proposed method uses less computer memory and improves computational speed. We also analyze the antinoise and anti-aliasing ability of the proposed method, and compare the 2D and 3D data reconstruction. Theoretical models and real data show that the proposed method is effective and of practical importance, as it can reconstruct missing traces and reduce the exploration cost of complex data acquisition.
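
    A minimal sketch of the POCS-style iteration described above is given below: re-insert the observed samples each pass and hard-threshold the Fourier coefficients with an exponentially decaying threshold. The 1-D signal, mask and threshold schedule are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of the POCS-style reconstruction loop described in the
# abstract: random undersampling, Fourier-domain thresholding with an
# exponentially decaying threshold, and re-insertion of the observed
# samples each iteration. A 1-D signal stands in for seismic data; the
# threshold schedule and sizes are illustrative assumptions.
import numpy as np

def pocs_reconstruct(observed, mask, n_iter=100, lam_max=0.99, lam_min=0.01):
    x = observed.copy()
    for k in range(n_iter):
        # exponentially decaying threshold between lam_max and lam_min
        lam = lam_max * (lam_min / lam_max) ** (k / max(n_iter - 1, 1))
        X = np.fft.fft(x)
        thresh = lam * np.abs(X).max()
        X[np.abs(X) < thresh] = 0.0              # hard-threshold small coefficients
        x = np.real(np.fft.ifft(X))
        x = mask * observed + (1 - mask) * x     # keep the measured samples
    return x

# demo: a two-component harmonic "trace" with roughly half the samples missing
n = 256
t = np.arange(n) / n
true = np.sin(2 * np.pi * 12 * t) + 0.5 * np.sin(2 * np.pi * 30 * t)
mask = (np.random.default_rng(1).random(n) < 0.5).astype(float)
rec = pocs_reconstruct(true * mask, mask)
print("relative error:", np.linalg.norm(rec - true) / np.linalg.norm(true))
```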

  15. Improved routing strategy based on gravitational field theory

    NASA Astrophysics Data System (ADS)

    Song, Hai-Quan; Guo, Jin

    2015-10-01

    Routing and path selection are crucial for many communication and logistic applications. We study the interaction between nodes and packets and establish a simple model for describing the attraction of the node to the packet in transmission process by using the gravitational field theory, considering the real and potential congestion of the nodes. On the basis of this model, we propose a gravitational field routing strategy that considers the attractions of all of the nodes on the travel path to the packet. In order to illustrate the efficiency of proposed routing algorithm, we introduce the order parameter to measure the throughput of the network by the critical value of phase transition from a free flow phase to a congested phase, and study the distribution of betweenness centrality and traffic jam. Simulations show that, compared with the shortest path routing strategy, the gravitational field routing strategy considerably enhances the throughput of the network and balances the traffic load, and nearly all of the nodes are used efficiently. Project supported by the Technology and Development Research Project of China Railway Corporation (Grant No. 2012X007-D) and the Key Program of Technology and Development Research Foundation of China Railway Corporation (Grant No. 2012X003-A).

  16. Non-fragile H∞ synchronization of memristor-based neural networks using passivity theory.

    PubMed

    Mathiyalagan, K; Anbuvithya, R; Sakthivel, R; Park, Ju H; Prakash, P

    2016-02-01

    In this paper, we formulate and investigate the mixed H∞ and passivity based synchronization criteria for memristor-based recurrent neural networks with time-varying delays. Some sufficient conditions are obtained to guarantee the synchronization of the considered neural network based on the master-slave concept, differential inclusions theory and Lyapunov-Krasovskii stability theory. Also, the memristive neural network is considered with two different types of memductance functions and two types of gain variations. The results for non-fragile observer-based synchronization are derived in terms of linear matrix inequalities (LMIs). Finally, the effectiveness of the proposed criterion is demonstrated through numerical examples.

  17. Boundary based on exchange symmetry theory for multilevel simulations. I. Basic theory.

    PubMed

    Shiga, Motoyuki; Masia, Marco

    2013-07-28

    In this paper, we lay the foundations for a new method that allows multilevel simulations of a diffusive system, i.e., a system where a flux of particles through the boundaries might disrupt the primary region. The method is based on the use of flexible restraints that maintain the separation between inner and outer particles. It is shown that, by introducing a bias potential that accounts for the exchange symmetry of the system, the correct statistical distribution is preserved. Using a toy model consisting of non-interacting particles in an asymmetric potential well, we prove that the method is formally exact, and that it could be simplified by considering only up to a couple of particle exchanges without a loss of accuracy. A real-world test is then made by considering a hybrid MM(∗)/MM calculation of cesium ion in water. In this case, the single exchange approximation is sound enough that the results superimpose to the exact solutions. Potential applications of this method to many different hybrid QM/MM systems are discussed, as well as its limitations and strengths in comparison to existing approaches. PMID:23901973

  18. Learning Trajectory Based Instruction: Toward a Theory of Teaching

    ERIC Educational Resources Information Center

    Sztajn, Paola; Confrey, Jere; Wilson, P. Holt; Edgington, Cynthia

    2012-01-01

    In this article, we propose a theoretical connection between research on learning and research on teaching through recent research on students' learning trajectories (LTs). We define learning trajectory based instruction (LTBI) as teaching that uses students' LTs as the basis for instructional decisions. We use mathematics as the context for our…

  19. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
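
    The entropy descriptor contrasted with color histograms above reduces to the Shannon entropy of an intensity histogram. A minimal sketch, assuming 8-bit gray levels, is given below.

```python
# Minimal sketch of the image-entropy descriptor the abstract contrasts
# with color histograms: Shannon entropy of the gray-level (or per-channel)
# histogram. The 8-bit quantization is an assumption for illustration.
import numpy as np

def image_entropy(image, bins=256):
    """Shannon entropy (bits) of an image's intensity histogram."""
    hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, 255))
    p = hist.astype(float) / hist.sum()
    p = p[p > 0]                       # drop empty bins (0*log 0 := 0)
    return -np.sum(p * np.log2(p))

# demo: a flat image has zero entropy, uniform noise approaches 8 bits
flat = np.full((64, 64), 128, dtype=np.uint8)
noise = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
print(image_entropy(flat), image_entropy(noise))
```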

  20. An Efficacious Theory-Based Intervention for Stepfamilies

    ERIC Educational Resources Information Center

    Forgatch, Marion S.; DeGarmo, David S.; Beldavs, Zintars G.

    2005-01-01

    This article evaluates the efficacy of the Oregon model of Parent Management Training (PMTO) in the stepfamily context. Sixty-seven of 110 participants in the Marriage and Parenting in Stepfamilies (MAPS) program received a PMTO-based intervention. Participants in the randomly assigned experimental group displayed a large effect in benefits to…

  1. A Conceptual Framework Based on Activity Theory for Mobile CSCL

    ERIC Educational Resources Information Center

    Zurita, Gustavo; Nussbaum, Miguel

    2007-01-01

    There is a need for collaborative group activities that promote student social interaction in the classroom. Handheld computers interconnected by a wireless network allow people who work on a common task to interact face to face while maintaining the mediation afforded by a technology-based system. Wirelessly interconnected handhelds open up new…

  2. Integrated Models of School-Based Prevention: Logic and Theory

    ERIC Educational Resources Information Center

    Domitrovich, Celene E.; Bradshaw, Catherine P.; Greenberg, Mark T.; Embry, Dennis; Poduska, Jeanne M.; Ialongo, Nicholas S.

    2010-01-01

    School-based prevention programs can positively impact a range of social, emotional, and behavioral outcomes. Yet the current climate of accountability pressures schools to restrict activities that are not perceived as part of the core curriculum. Building on models from public health and prevention science, we describe an integrated approach to…

  3. Scale-invariant entropy-based theory for dynamic ordering

    SciTech Connect

    Mahulikar, Shripad P. E-mail: spm@aero.iitb.ac.in; Kumari, Priti

    2014-09-01

    Dynamically Ordered self-organized dissipative structure exists in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence based on its structured mass and/or energy interactions with the surroundings is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations.

  4. Model-based resolution: applying the theory in quantitative microscopy.

    PubMed

    Santos, A; Young, I T

    2000-06-10

    Model-based image processing techniques have been proposed as a way to increase the resolution of optical microscopes. Here a model based on the microscope's point-spread function is analyzed, and the resolution limits achieved with a proposed goodness-of-fit criterion are quantified. Several experiments were performed to evaluate the possibilities and limitations of this method: (a) experiments with an ideal (diffraction-limited) microscope, (b) experiments with simulated dots and a real microscope, and (c) experiments with real dots acquired with a real microscope. The results show that a threefold increase over classical resolution (e.g., Rayleigh) is possible. These results can be affected by model misspecifications, whereas model corruption, as seen in the effect of Poisson noise, seems to be unimportant. This research can be considered to be preliminary with the final goal being the accurate measurement of various cytogenetic properties, such as gene distributions, in labeled preparations.

  5. INTEGRATED MODELS OF SCHOOL-BASED PREVENTION: LOGIC AND THEORY

    PubMed Central

    DOMITROVICH, CELENE E.; BRADSHAW, CATHERINE P.; GREENBERG, MARK T.; EMBRY, DENNIS; PODUSKA, JEANNE M.; IALONGO, NICHOLAS S.

    2011-01-01

    School-based prevention programs can positively impact a range of social, emotional, and behavioral outcomes. Yet the current climate of accountability pressures schools to restrict activities that are not perceived as part of the core curriculum. Building on models from public health and prevention science, we describe an integrated approach to school-based prevention. These models leverage the most effective structural and content components of social-emotional and behavioral health prevention interventions. Integrated interventions are expected to have additive and synergistic effects that result in greater impacts on multiple student outcomes. Integrated programs are also expected to be more efficient to deliver, easier to implement with high quality and integrity, and more sustainable. We provide a detailed example of the process through which the PAX-Good Behavior Game and the Promoting Alternative Thinking Strategies (PATHS) curriculum were integrated into the PATHS to PAX model. Implications for future research are proposed. PMID:27182089

  6. Transient response of lattice structures based on exact member theory

    NASA Technical Reports Server (NTRS)

    Anderson, Melvin S.

    1989-01-01

    The computer program BUNVIS-RG, which treats vibration and buckling of lattice structures using exact member stiffness matrices, has been extended to calculate the exact modal mass and stiffness quantities that can be used in a conventional transient response analysis based on modes. The exact nature of the development allows inclusion of local member response without introduction of any interior member nodes. Results are given for several problems in which significant interaction between local and global response occurs.
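
    A conventional mode-based transient response analysis of the kind the exported modal quantities feed into can be sketched as a set of uncoupled single-degree-of-freedom equations integrated in time. The two-mode data and the central-difference integrator below are illustrative assumptions, not BUNVIS-RG output.

```python
# Hedged sketch of a conventional mode-based transient response analysis
# using modal mass/stiffness of the kind BUNVIS-RG exports: each retained
# mode is an uncoupled SDOF oscillator
#     m_i * q_i'' + k_i * q_i = phi_i^T f(t),
# integrated here with a simple central-difference scheme. The two-mode
# data are illustrative, not from the program.
import numpy as np

def modal_transient(modal_mass, modal_stiff, modal_force, dt, n_steps):
    """Return modal coordinates q[mode, step] for zero initial conditions."""
    m, k = np.asarray(modal_mass, float), np.asarray(modal_stiff, float)
    q = np.zeros((len(m), n_steps))
    q_prev = np.zeros(len(m))
    for s in range(1, n_steps):
        acc = (modal_force(s * dt) - k * q[:, s - 1]) / m
        q_next = 2 * q[:, s - 1] - q_prev + acc * dt**2   # central difference
        q_prev, q[:, s] = q[:, s - 1], q_next
    return q

# two illustrative modes (omega = 10 and 40 rad/s) under a step modal load
m_i, k_i = np.array([1.0, 1.0]), np.array([100.0, 1600.0])
q = modal_transient(m_i, k_i, lambda t: np.array([1.0, 0.3]), dt=1e-3, n_steps=2000)
# physical response = mode shapes @ q ; here we just report peak modal amplitudes
print("peak modal amplitudes:", q.max(axis=1))
```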

  7. Research on e-learning services based on ontology theory

    NASA Astrophysics Data System (ADS)

    Liu, Rui

    2013-07-01

    E-learning services can realize network learning resource sharing and interoperability, but they cannot realize automatic discovery, implementation and integration of services. This paper proposes a framework of e-learning services based on ontology; ontology technology is applied to the publication and discovery processes of e-learning services in order to realize accurate and efficient retrieval and utilization of e-learning services.

  8. Ab initio theory of iron-based superconductors

    NASA Astrophysics Data System (ADS)

    Essenberger, F.; Sanna, A.; Buczek, P.; Ernst, A.; Sandratskii, L.; Gross, E. K. U.

    2016-07-01

    We report a first-principles study of the superconducting critical temperature and other properties of Fe-based superconductors taking into account, on equal footing, phonon, charge, and spin-fluctuation mediated Cooper pairing. We show that in FeSe this leads to a modulated s ± gap symmetry and that the antiferromagnetic paramagnons are the leading mechanism for superconductivity in FeSe, overcoming the strong repulsive effect of both phonons and charge pairing.

  9. Venture Capital Investment Based on Grey Relational Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Xubo

    This paper builds a venture capital investment project selection evaluation model based on risk-weighted investment return using grey relational analysis. The risks and returns in the venture capital investment project selection process are analysed; they mainly concern management ability, operation ability, market ability, exit gains and investment cost. The 18 sub-indicators are the impact factors contributing to these five evaluation aspects. Grey relational analysis is used to evaluate the venture capital investment selection and to obtain the optimal solution of the risk-weighted double-objective investment selection evaluation model. An example is used to demonstrate the model in this paper.
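
    A minimal sketch of the grey relational analysis step is given below: candidate projects are scored by their grey relational grade against an ideal reference series. The indicator matrix, the distinguishing coefficient rho = 0.5 and the equal weights are illustrative assumptions, not the paper's 18 sub-indicators.

```python
# Hedged sketch of the grey relational analysis (GRA) step: score candidate
# investment projects against an ideal reference series. The indicator
# matrix and weights below are made-up illustrations, not the paper's 18
# sub-indicators.
import numpy as np

def grey_relational_grades(data, rho=0.5, weights=None):
    """data: (n_projects, n_indicators), larger-is-better after preprocessing.
    Returns one grey relational grade per project (higher = closer to ideal)."""
    data = np.asarray(data, float)
    # normalize each indicator to [0, 1] (larger-is-better convention)
    lo, hi = data.min(axis=0), data.max(axis=0)
    norm = (data - lo) / np.where(hi > lo, hi - lo, 1.0)
    ref = norm.max(axis=0)                         # ideal reference series
    delta = np.abs(norm - ref)                     # absolute differences
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    w = np.full(data.shape[1], 1.0 / data.shape[1]) if weights is None else np.asarray(weights)
    return xi @ w                                  # weighted grade per project

scores = grey_relational_grades([[0.8, 0.6, 0.7],   # project A
                                 [0.5, 0.9, 0.6],   # project B
                                 [0.7, 0.7, 0.9]])  # project C
print(scores, "-> best project:", "ABC"[int(np.argmax(scores))])
```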

  10. Theory of zwitterionic molecular-based organic magnets

    NASA Astrophysics Data System (ADS)

    Shelton, William A.; Aprà, Edoardo; Sumpter, Bobby G.; Saraiva-Souza, Aldilene; Souza Filho, Antonio G.; Nero, Jordan Del; Meunier, Vincent

    2011-08-01

    We describe a class of organic molecular magnets based on zwitterionic molecules (betaine derivatives) possessing donor, π bridge, and acceptor groups. Using extensive electronic structure calculations we show that the electronic ground state in these systems is magnetic. In addition, we show that the large energy differences computed for the various magnetic states indicate a high Néel temperature. The quantum mechanical nature of the magnetic properties originates from the conjugated π bridge (only p electrons) in cooperation with the molecular donor-acceptor character. The exchange interactions between electron spins are strong, local, and independent of the length of the π bridge.

  11. An explanation of flocculation using Lewis acid-base theory

    SciTech Connect

    Brown, P.M.; Stanley, D.A.; Scheiner, B.J.

    1988-01-01

    This paper describes a Bureau of Mines-developed method of dewatering clay slurries based on flocculation by high-molecular-weight polymers and water removal from the formed flocs using a trommel or hydrosieve. The exchange ion on the clays affects their dewaterability. Metal ions in solution and on the exchange sites of smectite clays are known to act as Lewis acids. Recent work has determined that these ions can be titrated with high-molecular-weight polymers. The relative acidity of the exchange ion and the basicity of the polymer determined by the new method give insight into the dewatering mechanism.

  12. Discussion of remote sensing image classification method based on evidence theory

    NASA Astrophysics Data System (ADS)

    Deng, Wensheng; Shao, Xiaoli; Guan, Zequn

    2005-10-01

    Remote sensing image classification is an important and complex problem. Conventional remote sensing image classification methods are mostly based on Bayes' subjective probability theory. Because these methods have many shortcomings in handling uncertainty, a new tendency is to apply the mathematical theory of evidence to remote sensing image classification. This paper first introduces the differences between Dempster-Shafer (D-S) evidence theory and Bayes' subjective probability theory in handling uncertainty, and then the main definitions and algorithms of D-S evidence theory. In particular, degree of belief, degree of plausibility and degree of support are the bridges through which D-S evidence theory is applied in other fields. The support function through which D-S evidence theory is used in pattern recognition is emphasized, and degree of support is applied to classification. Degree-of-support surfaces are obtained for broad classes such as urban land, farmland, forest land, and water, and a "hard classification" is then used to obtain an initial classification result. If the initial classification accuracy does not meet the requirement, the degree-of-support surfaces below a threshold are reclassified until the final classification result reaches a satisfactory accuracy. The main advantages of this method are that reclassification can follow classification and that its classification accuracy is very high. The method has a dependable theoretical basis, broad applicability, easy operation and research potential.
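
    The combination step at the heart of D-S evidence theory can be sketched directly. The example below implements Dempster's rule of combination for two mass functions over a toy frame of discernment (urban, farmland, water); the mass values are invented for illustration.

```python
# Minimal sketch of Dempster's rule of combination from D-S evidence
# theory, which underlies the classification approach described above.
# Frame of discernment and mass values are illustrative only.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dicts."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                     # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# two sources of evidence over classes {urban, farmland, water}
U, F, W = frozenset({"urban"}), frozenset({"farmland"}), frozenset({"water"})
m1 = {U: 0.6, F: 0.3, U | F | W: 0.1}
m2 = {U: 0.5, W: 0.3, U | F | W: 0.2}
print(dempster_combine(m1, m2))
```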

  13. The boundaries of instance-based learning theory for explaining decisions from experience.

    PubMed

    Gonzalez, Cleotilde

    2013-01-01

    Most demonstrations of how people make decisions in risky situations rely on decisions from description, where outcomes and their probabilities are explicitly stated. But recently, more attention has been given to decisions from experience where people discover these outcomes and probabilities through exploration. More importantly, risky behavior depends on how decisions are made (from description or experience), and although prospect theory explains decisions from description, a comprehensive model of decisions from experience is yet to be found. Instance-based learning theory (IBLT) explains how decisions are made from experience through interactions with dynamic environments (Gonzalez et al., 2003). The theory has shown robust explanations of behavior across multiple tasks and contexts, but it is becoming unclear what the theory is able to explain and what it does not. The goal of this chapter is to start addressing this problem. I will introduce IBLT and a recent cognitive model based on this theory: the IBL model of repeated binary choice; then I will discuss the phenomena that the IBL model explains and those that the model does not. The argument is for the theory's robustness but also for clarity in terms of concrete effects that the theory can or cannot account for.

  14. Safety models incorporating graph theory based transit indicators.

    PubMed

    Quintero, Liliana; Sayed, Tarek; Wahba, Mohamed M

    2013-01-01

    There is a considerable need for tools to enable the evaluation of the safety of transit networks at the planning stage. One interesting approach for the planning of public transportation systems is the study of networks. Network techniques involve the analysis of systems by viewing them as a graph composed of a set of vertices (nodes) and edges (links). Once the transport system is visualized as a graph, various network properties can be evaluated based on the relationships between the network elements. Several indicators can be calculated including connectivity, coverage, directness and complexity, among others. The main objective of this study is to investigate the relationship between network-based transit indicators and safety. The study develops macro-level collision prediction models that explicitly incorporate transit physical and operational elements and transit network indicators as explanatory variables. Several macro-level (zonal) collision prediction models were developed using a generalized linear regression technique, assuming a negative binomial error structure. The models were grouped into four main themes: transit infrastructure, transit network topology, transit route design, and transit performance and operations. The safety models showed that collisions were significantly associated with transit network properties such as: connectivity, coverage, overlapping degree and the Local Index of Transit Availability. As well, the models showed a significant relationship between collisions and some transit physical and operational attributes such as the number of routes, frequency of routes, bus density, length of bus and 3+ priority lanes.
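
    A few of the network indicators named above can be computed directly from a graph representation of the transit system, as in the sketch below. The five-stop toy network and the choice of indicators (gamma index, beta index, betweenness centrality) are illustrative assumptions, not the study's data.

```python
# Hedged sketch of graph-theory indicators of the kind used as explanatory
# variables in the safety models (connectivity, coverage, centrality).
# The toy five-stop transit graph is illustrative only; it is not the
# study's network.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("B", "D"), ("D", "E")])

v, e = G.number_of_nodes(), G.number_of_edges()
gamma = e / (3 * (v - 2))                 # gamma index: observed / max planar edges
beta = e / v                              # beta index: average edges per node
betweenness = nx.betweenness_centrality(G)

print(f"connectivity: gamma={gamma:.2f}, beta={beta:.2f}")
print("most central stop:", max(betweenness, key=betweenness.get))
```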

  15. [Training -- competency-based education -- learning theory and practice].

    PubMed

    Breuer, Georg

    2013-11-01

    A lifelong learning process is necessarily the basis for specialization and expertise in the field of anesthesiology. Competency as a physician is thus a complex, multidimensional construct of knowledge, skills and attitudes needed to solve and withstand the complex daily work challenges in a flexible and responsible way. Experts therefore show flexible and intuitive capabilities in pursuing their profession. Accordingly, modern competency-based learning objectives are very helpful. The DGAI Commission for “Further Education” already thought ahead in defining a competency-based curriculum for specialization in the field of anesthesiology, which could be integrated into the frameworks of the German Medical Association. In addition to the curricular framework, elements of assessment are necessary. A single oral exam is consequently not representative of different levels of competencies. However, besides the learners' responsibility for their own learning process, there is also a strong obligation on clinical teachers to attend to the learning process and to ensure a positive learning atmosphere with scope for feedback. Some competencies could potentially be better learned in a “sheltered”, simulation-based setting outside the OR, for example to train for rare incidents or emergency procedures. In general, there should be an ongoing effort to enhance the process of expertise development, also in the context of patient safety and quality management.

  16. Following Human Footsteps: Proposal of a Decision Theory Based on Human Behavior

    NASA Technical Reports Server (NTRS)

    Mahmud, Faisal

    2011-01-01

    Human behavior is complex in nature; it depends on circumstances, and decisions vary from time to time as well as from place to place. The way a decision is made is either directly or indirectly related to the availability of options. These options, though they appear random in nature, follow a solid directional pattern for decision making. In this paper, a decision theory is proposed which is based on human behavior. The theory is structured with model sets that show all possible combinations for making a decision. A virtual, simulated environment is considered to show the results of the proposed decision theory.

  17. String-theory-based predictions for nonhydrodynamic collective modes in strongly interacting Fermi gases

    NASA Astrophysics Data System (ADS)

    Bantilan, H.; Brewer, J. T.; Ishii, T.; Lewis, W. E.; Romatschke, P.

    2016-09-01

    Very different strongly interacting quantum systems such as Fermi gases, quark-gluon plasmas formed in high-energy ion collisions, and black holes studied theoretically in string theory are known to exhibit quantitatively similar damping of hydrodynamic modes. It is not known if such similarities extend beyond the hydrodynamic limit. Do nonhydrodynamic collective modes in Fermi gases with strong interactions also match those from string theory calculations? In order to answer this question, we use calculations based on string theory to make predictions for modes outside the hydrodynamic regime in trapped Fermi gases. These predictions are amenable to direct testing with current state-of-the-art cold atom experiments.

  18. The Effect Of The Materials Based On Multiple Intelligence Theory Upon The Intelligence Groups' Learning Process

    NASA Astrophysics Data System (ADS)

    Oral, I.; Dogan, O.

    2007-04-01

    The aim of this study is to find out the effect of course materials based on Multiple Intelligence Theory upon the intelligence groups' learning process. In conclusion, the results proved that the materials prepared according to Multiple Intelligence Theory have a considerable effect on the students' learning process. This effect was particularly seen in the student groups with musical-rhythmic, verbal-linguistic, interpersonal-social and naturalist intelligences.

  19. Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory

    NASA Astrophysics Data System (ADS)

    Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui

    The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for solving contradictions, but they are not systematized. Combined with the technique-system conception, this paper summarizes an integrated solution method for contradictions based on TRIZ contradiction theory. According to the method, a flowchart of the integrated solution method for contradictions is given. As a case study, the method of fusion jointing PE pipe is analysed.

  20. [The Chinese urban metabolisms based on the emergy theory].

    PubMed

    Song, Tao; Cai, Jian-Ming; Ni, Pan; Yang, Zhen-Shan

    2014-04-01

    By using emergy indices of urban metabolisms, this paper analyzed 31 Chinese urban metabolisms' systematic structures and characteristics in 2000 and 2010. The results showed that Chinese urban metabolisms were characterized as resource consumption and coastal external dependency. Non-renewable resource emergy accounted for a higher proportion of the total emergy in the inland cities' urban metabolisms. The emergy of imports and exports accounted for the vast majority of urban metabolic systems in metropolises and coastal cities such as Beijing and Shanghai, showing a significant externally-oriented metabolic characteristic. Based on that, the related policies were put forward: to develop the renewable resource and energy industry; to improve the non-renewable resource and energy utilization efficiencies; to optimize the import and export structure of services, cargo and fuel; and to establish the flexible management mechanism of urban metabolisms. PMID:25011303

  1. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at the bootload level from the perception of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restrains service providers and users from violating the service level agreement (SLA). Significantly, the problems of cold start and whitewashing are addressed by the proposed method. In addition, appropriate mapping of the cloud user's application to a cloud service provider for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of conventional methods and the proposed method. Several metrics such as execution time, accuracy, error identification, and undecidability of the resources were considered. PMID:26380365
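
    The game-theoretic ingredient can be illustrated with a small example: finding pure-strategy Nash equilibria by best-response checking. The 2x2 payoffs for honouring or violating the SLA are invented for illustration and are not the trust model of the paper.

```python
# Hedged sketch of the game-theoretic ingredient: finding pure-strategy
# Nash equilibria of a small two-player game by best-response checking.
# The 2x2 "honour / violate the SLA" payoffs are invented for
# illustration and are not the paper's trust model.
import numpy as np

def pure_nash_equilibria(payoff_user, payoff_provider):
    """Payoff matrices indexed [user_action, provider_action]."""
    eqs = []
    for i in range(payoff_user.shape[0]):
        for j in range(payoff_user.shape[1]):
            user_ok = payoff_user[i, j] >= payoff_user[:, j].max()
            prov_ok = payoff_provider[i, j] >= payoff_provider[i, :].max()
            if user_ok and prov_ok:          # neither side gains by deviating
                eqs.append((i, j))
    return eqs

# actions: 0 = honour the SLA, 1 = violate it
user = np.array([[3, 0],
                 [2, 1]])
provider = np.array([[3, 2],
                     [0, 1]])
print("pure NE (user, provider):", pure_nash_equilibria(user, provider))
```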

  2. Game Theory Based Trust Model for Cloud Environment.

    PubMed

    Gokulnath, K; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at the bootload level from the perception of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restrains service providers and users from violating the service level agreement (SLA). Significantly, the problems of cold start and whitewashing are addressed by the proposed method. In addition, appropriate mapping of the cloud user's application to a cloud service provider for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of conventional methods and the proposed method. Several metrics such as execution time, accuracy, error identification, and undecidability of the resources were considered.

  3. Evaluating the Evidence Base for Relational Frame Theory: A Citation Analysis

    ERIC Educational Resources Information Center

    Dymond, Simon; May, Richard J.; Munnelly, Anita; Hoon, Alice E.

    2010-01-01

    Relational frame theory (RFT) is a contemporary behavior-analytic account of language and cognition. Since it was first outlined in 1985, RFT has generated considerable controversy and debate, and several claims have been made concerning its evidence base. The present study sought to evaluate the evidence base for RFT by undertaking a citation…

  4. Strategies for Integrating Computer-Based Training in College Music Theory Courses.

    ERIC Educational Resources Information Center

    Hess, George J., Jr.

    During the fall semester of 1993, a curriculum-based computer-based training (CBT) program was used to replace all in-class drills in intervals and chord identification for one section of freshman music theory at the University of Northern Colorado. This study was conducted to determine whether aural skills can be taught as effectively through the…

  5. Constraint-Based Modeling: From Cognitive Theory to Computer Tutoring--and Back Again

    ERIC Educational Resources Information Center

    Ohlsson, Stellan

    2016-01-01

    The ideas behind the constraint-based modeling (CBM) approach to the design of intelligent tutoring systems (ITSs) grew out of attempts in the 1980's to clarify how declarative and procedural knowledge interact during skill acquisition. The learning theory that underpins CBM was based on two conceptual innovations. The first innovation was to…

  6. Using Game Theory and Competition-Based Learning to Stimulate Student Motivation and Performance

    ERIC Educational Resources Information Center

    Burguillo, Juan C.

    2010-01-01

    This paper introduces a framework for using Game Theory tournaments as a base to implement Competition-based Learning (CnBL), together with other classical learning techniques, to motivate the students and increase their learning performance. The paper also presents a description of the learning activities performed along the past ten years of a…

  7. Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment

    ERIC Educational Resources Information Center

    Chen, Jing

    2012-01-01

    Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…

  8. Kinematics and dynamics of deployable structures with scissor-like-elements based on screw theory

    NASA Astrophysics Data System (ADS)

    Sun, Yuantao; Wang, Sanmin; Mills, James K.; Zhi, Changjian

    2014-07-01

    Because deployable structures are complex multi-loop structures, and because methods of derivation that lead to simpler kinematic and dynamic equations of motion are the subject of ongoing research effort, the kinematics and dynamics of deployable structures with scissor-like elements are presented based on screw theory and the principle of virtual work, respectively. According to the geometric characteristics of the deployable structure examined, the basic structural unit is the common scissor-like element (SLE). First, a spatial deployable structure comprised of three SLEs is defined, and the constraint topology graph is obtained. The equations of motion are then derived based on screw theory and the geometric nature of scissor elements. Second, to develop the dynamics of the whole deployable structure, the local coordinates of the SLEs and the Jacobian matrices of the center of mass of the deployable structure are derived. The equivalent forces are then assembled and added to the equations of motion based on the principle of virtual work. Finally, the dynamic behavior and unfolding process of the deployable structure are simulated, and plots of velocity, acceleration and input torque are obtained from the simulation results. Screw theory not only provides an efficient solution formulation and theoretical guidance for complex multi-closed-loop deployable structures, but also extends the method to solving the dynamics of deployable structures. As an efficient mathematical tool, screw theory yields simpler equations of motion.

  9. Time dependent mechanical modeling for polymers based on network theory

    NASA Astrophysics Data System (ADS)

    Billon, Noëlle

    2016-05-01

    Despite many attempts in recent years, the complex mechanical behaviour of polymers remains incompletely modelled, making industrial design of structures under complex, cyclic and severe loadings not totally reliable. The non-linear, dissipative viscoelastic and viscoplastic behaviour of these materials requires taking into account non-linear and combined effects of mechanical and thermal phenomena. In this view, a visco-hyperelastic, viscoplastic model based on a network description of the material has recently been developed and designed in a complete thermodynamic framework in order to take into account these main thermo-mechanical couplings. A way to account for the coupled effects of strain rate and temperature was also suggested. First experimental validations, conducted in the 1D limit on amorphous rubbery PMMA under isothermal conditions, led to pretty good results. In this paper a more complete formalism is presented and validated for semi-crystalline polymers; a PA66 and a PET (either amorphous or semi-crystalline) are used. The protocol for identification of the constitutive parameters is described. It is concluded that this new approach should be the route to accurately model the thermo-mechanical behaviour of polymers using a reduced number of parameters with some physical meaning.

  10. Microfluidic, Bead-Based Assay: Theory and Experiments

    PubMed Central

    Thompson, Jason A.; Bau, Haim H.

    2009-01-01

    Microbeads are frequently used as a solid support for biomolecules such as proteins and nucleic acids in heterogeneous microfluidic assays. However, relatively few studies investigate the binding kinetics on modified bead surfaces in a microfluidics context. In this study, a customized hot embossing technique is used to stamp microwells in a thin plastic substrate where streptavidin-coated agarose beads are selectively placed and subsequently immobilized within a conduit. Biotinylated quantum dots are used as a label to monitor target analyte binding to the bead's surface. Three-dimensional finite element simulations are carried out to model the binding kinetics on the bead's surface. The model accounts for surface exclusion effects resulting from a single quantum dot occluding multiple receptor sites. The theoretical predictions are compared and favorably agree with experimental observations. The theoretical simulations provide a useful tool to predict how varying parameters affect microbead reaction kinetics and sensor performance. This study enhances our understanding of bead-based microfluidic assays and provides a design tool for developers of point-of-care, lab-on-chip devices for medical diagnosis, food and water quality inspection, and environmental monitoring. PMID:19766545
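
    The binding kinetics modelled above can be sketched in reduced form as a Langmuir-type rate equation with a steric-exclusion factor representing one label occluding several receptor sites. All rate constants, concentrations and the exclusion factor below are illustrative assumptions, not the study's fitted values.

```python
# Hedged sketch of a bead-surface binding model in the spirit of the
# study: Langmuir-type kinetics with a steric-exclusion factor SIGMA
# representing one quantum-dot label occluding several receptor sites.
# All rate constants, concentrations and SIGMA are illustrative
# assumptions, not fitted values from the paper.
from scipy.integrate import solve_ivp

K_ON, K_OFF = 1e5, 1e-5      # association 1/(M s), dissociation 1/s (assumed)
C_BULK = 1e-9                # bulk analyte concentration, M (assumed)
SIGMA = 4.0                  # receptor sites blocked per bound label (assumed)

def coverage_rate(t, theta):
    """theta = fraction of receptor sites occupied; exclusion caps it at 1/SIGMA."""
    return K_ON * C_BULK * (1.0 - SIGMA * theta) - K_OFF * theta

sol = solve_ivp(coverage_rate, t_span=(0.0, 2.0e4), y0=[0.0], max_step=50.0)
print(f"coverage after {sol.t[-1]:.0f} s: {sol.y[0, -1]:.4f} "
      f"(at most 1/SIGMA = {1.0 / SIGMA:.2f})")
```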

  11. Semiclassical Transition-State Theory Based on Fourth-Order Vibrational Perturbation Theory: The Symmetrical Eckart Barrier.

    PubMed

    Stanton, John F

    2016-07-21

    Semiclassical transition-state theory based on fourth-order vibrational perturbation theory (VPT4-SCTST) is applied to compute the barrier transmission coefficient for the symmetric Eckart potential. For a barrier parametrized to mimic the H2 + H exchange reaction, the results obtained are in excellent agreement with exact quantum calculations over a range of energy that extends down to roughly 1% of the barrier height, V0, where tunneling is negligible. The VPT2-SCTST treatment, which is commonly used in chemical kinetics studies, also performs quite well but already shows an error of a few percent at ca. 0.8 V0 where tunneling is still important. This suggests that VPT4-SCTST could offer an improvement over VPT2-SCTST in applications studies. However, the computational effort for VPT4-SCTST treatments of molecules is excessive, and any improvement gained is unlikely to warrant the increased effort. Nevertheless, the treatment of the symmetric Eckart barrier problem here suggests a simple modification of the usual VPT2-SCTST protocol that warrants further investigation. PMID:27358083

  12. An approach for leukemia classification based on cooperative game theory.

    PubMed

    Torkaman, Atefeh; Charkari, Nasrollah Moghaddam; Aghaeipour, Mahnaz

    2011-01-01

    Hematological malignancies are the types of cancer that affect blood, bone marrow and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood tissues, especially the bone marrow, where the blood is made. Research shows that leukemia is one of the common cancers in the world, so an emphasis on diagnostic techniques and the best treatments can provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set of different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). In total, the data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to the different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent. This means that it can be used to classify a population according to their contributions; in other words, it applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared to 90.16% for a decision tree (C4.5). The results demonstrate that the cooperative game approach is very promising for direct classification of leukemia as part of an active medical decision support system for the interpretation of flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by preparing suggestions, and this could improve the treatment of leukemic…
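
    One standard way to turn a cooperative game into per-marker weights is the Shapley value, sketched below for a toy three-marker game; the characteristic function and marker names are illustrative stand-ins, not the flow cytometry game used in the paper.

```python
# Hedged sketch of a cooperative-game weighting step: Shapley values of
# a small coalition game distribute the "credit" of a group of markers.
# The characteristic function below is a toy stand-in, not the
# flow-cytometry game used in the paper.
from itertools import permutations

def shapley_values(players, v):
    """Exact Shapley values by enumerating all orderings (small games only)."""
    contrib = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = v(frozenset(coalition))
            coalition.add(p)
            contrib[p] += v(frozenset(coalition)) - before
    return {p: c / len(orders) for p, c in contrib.items()}

# toy game: three markers, value = classification "accuracy" of the subset
accuracy = {frozenset(): 0.0,
            frozenset({"CD34"}): 0.5, frozenset({"CD19"}): 0.4, frozenset({"CD33"}): 0.3,
            frozenset({"CD34", "CD19"}): 0.75, frozenset({"CD34", "CD33"}): 0.65,
            frozenset({"CD19", "CD33"}): 0.55, frozenset({"CD34", "CD19", "CD33"}): 0.9}
print(shapley_values(["CD34", "CD19", "CD33"], lambda s: accuracy[s]))
```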

  13. A Christian faith-based recovery theory: understanding God as sponsor.

    PubMed

    Timmons, Shirley M

    2012-12-01

    This article reports the development of a substantive theory to explain an evangelical Christian-based process of recovery from addiction. Faith-based, 12-step, mutual aid programs can improve drug abstinence by offering: (a) an intervention option alone and/or in conjunction with secular programs and (b) an opportunity for religious involvement. Although literature on religion, spirituality, and addiction is voluminous, traditional 12-step programs fail to explain the mechanism that underpins the process of Christian-based recovery (CR). This pilot study used grounded theory to explore and describe the essence of recovery of 10 former crack cocaine-addicted persons voluntarily enrolled in a CR program. Data were collected from in-depth interviews during 4 months of 2008. Audiotapes were transcribed verbatim, and the constant comparative method was used to analyze data resulting in the basic social process theory, understanding God as sponsor. The theory was determined through writing theoretical memos that generated key elements that allow persons to recover: acknowledging God-centered crises, communicating with God, and planning for the future. Findings from this preliminary study identify important factors that can help persons in recovery to sustain sobriety and program administrators to benefit from theory that guides the development of evidence-based addiction interventions.

  14. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

    NASA Astrophysics Data System (ADS)

    Matsumura, Koki; Kawamoto, Masaru

    This paper proposes a new technique that constructs strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), which is one of the evolutionary computation methods. Moreover, this paper proposes a method using prospect theory, from behavioral finance theory, to set a psychological bias for profit and loss, and attempts to select the appropriate option strike price for higher investment efficiency. As a result, this technique produced good results and demonstrated the effectiveness of the trading model through the optimized dealing strategy.
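
    The prospect-theory ingredient mentioned above is typically encoded with the Kahneman-Tversky value function, which treats losses more steeply than gains. A minimal sketch, using the commonly cited parameter estimates rather than anything fitted in the paper, is given below.

```python
# Minimal sketch of the prospect-theory value function (Kahneman & Tversky)
# that the abstract uses to encode asymmetric psychological bias toward
# profits and losses. The parameter values are the commonly cited
# estimates, used here purely for illustration.
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value of a gain/loss x relative to the reference point (x = 0)."""
    if x >= 0:
        return x ** alpha                  # diminishing sensitivity to gains
    return -lam * ((-x) ** beta)           # losses loom larger (loss aversion)

for outcome in (100, -100, 50, -50):
    print(outcome, round(prospect_value(outcome), 2))
```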

  15. Social learning theory parenting intervention promotes attachment-based caregiving in young children: randomized clinical trial.

    PubMed

    O'Connor, Thomas G; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

    2013-01-01

    Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social learning theory-based treatment promoted change in qualities of parent-child relationship derived from attachment theory. In a randomized clinical trial, 174 four- to six-year-olds selected from a high-need urban area and stratified by conduct problems were assigned to a parenting program plus a reading intervention (n = 88) or a nonintervention condition (n = 86). In-home observations of parent-child interactions were assessed in three tasks: (a) free play, (b) challenge task, and (c) tidy up. Parenting behavior was coded according to behavior theory using standard count measures of positive and negative parenting, and for attachment theory using measures of sensitive responding and mutuality; children's attachment narratives were also assessed. Compared to the parents in the nonintervention group, parents allocated to the intervention showed increases in the positive behavioral counts and sensitive responding; change in behavioral count measures overlapped modestly with attachment-based changes. There was no reliable change in children's attachment narratives associated with the intervention. The findings demonstrate that standard social learning theory-based parenting interventions can change broader aspects of parent-child relationship quality and raise clinical and conceptual questions about the distinctiveness of existing treatment models in parenting research.

  16. The Scientific Value of Cognitive Load Theory: A Research Agenda Based on the Structuralist View of Theories

    ERIC Educational Resources Information Center

    Gerjets, Peter; Scheiter, Katharina; Cierniak, Gabriele

    2009-01-01

    In this paper, two methodological perspectives are used to elaborate on the value of cognitive load theory (CLT) as a scientific theory. According to the more traditional critical rationalism of Karl Popper, CLT cannot be considered a scientific theory because some of its fundamental assumptions cannot be tested empirically and are thus not…

  17. A Theory-based Approach to the Measurement of Foreign Language Learning Ability: The CANAL-F Theory and Test.

    ERIC Educational Resources Information Center

    Grigorenko, Elena L.; Sternberg, Robert J.; Ehrman, Madeline E.

    2000-01-01

    Presents a rationale, description, and partial construct validation of a new theory of foreign language aptitude: CANAL-F--Cognitive Ability for Novelty in Acquisition of Language (foreign). The theory was applied and implemented in a test of foreign language aptitude (CANAL-FT). Outlines the CANAL-F theory and details of its instrumentation…

  18. Perturbation theory for multicomponent fluids based on structural properties of hard-sphere chain mixtures

    NASA Astrophysics Data System (ADS)

    Hlushak, Stepan

    2015-09-01

    An analytical expression for the Laplace transform of the radial distribution function of a mixture of hard-sphere chains of arbitrary segment size and chain length is used to rigorously formulate the first-order Barker-Henderson perturbation theory for the contribution of the segment-segment dispersive interactions into thermodynamics of the Lennard-Jones chain mixtures. Based on this approximation, a simple variant of the statistical associating fluid theory is proposed and used to predict properties of several mixtures of chains of different lengths and segment sizes. The theory treats the dispersive interactions more rigorously than the conventional theories and provides means for more accurate description of dispersive interactions in the mixtures of highly asymmetric components.

  19. Perturbation theory for multicomponent fluids based on structural properties of hard-sphere chain mixtures

    SciTech Connect

    Hlushak, Stepan

    2015-09-28

    An analytical expression for the Laplace transform of the radial distribution function of a mixture of hard-sphere chains of arbitrary segment size and chain length is used to rigorously formulate the first-order Barker-Henderson perturbation theory for the contribution of the segment-segment dispersive interactions into thermodynamics of the Lennard-Jones chain mixtures. Based on this approximation, a simple variant of the statistical associating fluid theory is proposed and used to predict properties of several mixtures of chains of different lengths and segment sizes. The theory treats the dispersive interactions more rigorously than the conventional theories and provides means for more accurate description of dispersive interactions in the mixtures of highly asymmetric components.

  20. Power optimization of chemically driven heat engine based on first and second order reaction kinetic theory and probability theory

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Chen, Lingen; Sun, Fengrui

    2016-03-01

    The finite-time thermodynamic method based on probability analysis can more accurately describe various performance parameters of thermodynamic systems. Based on the relation between optimal efficiency and power output of a generalized Carnot heat engine with a finite high-temperature heat reservoir (heat source) and an infinite low-temperature heat reservoir (heat sink) and with the only irreversibility of heat transfer, this paper studies the problem of power optimization of chemically driven heat engine based on first and second order reaction kinetic theory, puts forward a model of the coupling heat engine which can be run periodically and obtains the effects of the finite-time thermodynamic characteristics of the coupling relation between chemical reaction and heat engine on the power optimization. The results show that the first order reaction kinetics model can use fuel more effectively, and can provide heat engine with higher temperature heat source to increase the power output of the heat engine. Moreover, the power fluctuation bounds of the chemically driven heat engine are obtained by using the probability analysis method. The results may provide some guidelines for the character analysis and power optimization of the chemically driven heat engines.
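
    As a hedged, self-contained sketch of the kinetic laws the abstract contrasts (it does not reproduce the paper's coupled engine model, and the rate constant and initial concentration are assumed values), the following Python snippet compares fuel consumption under first- and second-order kinetics:

      import numpy as np

      # Illustrative rate constant and initial fuel concentration (assumed values).
      k, c0 = 1.0, 1.0
      t = np.linspace(0.0, 5.0, 501)

      # First-order kinetics:  dc/dt = -k*c    ->  c(t) = c0*exp(-k*t)
      c_first = c0 * np.exp(-k * t)
      # Second-order kinetics: dc/dt = -k*c**2 ->  c(t) = c0/(1 + k*c0*t)
      c_second = c0 / (1.0 + k * c0 * t)

      # The instantaneous heat release fed to the engine would be proportional to -dc/dt.
      print("fuel remaining at t = 5:", round(float(c_first[-1]), 4), round(float(c_second[-1]), 4))

    With these (arbitrary) parameters the first-order fuel is consumed far more completely over the same interval, which is the qualitative point the abstract makes about fuel utilization.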

  1. Theory of normal and superconducting properties of fullerene-based solids

    SciTech Connect

    Cohen, M.L.

    1992-10-01

    Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The latter experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a "standard model" of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes.

  3. Revisited: The South Dakota Board of Nursing theory-based regulatory decisioning model.

    PubMed

    Damgaard, Gloria; Bunkers, Sandra Schmidt

    2012-07-01

    The authors of this column describe the South Dakota Board of Nursing's 11 year journey utilizing a humanbecoming theory-based regulatory decisioning model. The column revisits the model with an emphasis on the cocreation of a strategic plan guiding the work of the South Dakota Board of Nursing through 2014. The strategic plan was influenced by the latest refinements of the humanbecoming postulates and the humanbecoming community change concepts. A graphic picture of the decisioning model is presented along with future plans for the theory-based model.

  4. Studying thin film damping in a micro-beam resonator based on non-classical theories

    NASA Astrophysics Data System (ADS)

    Ghanbari, Mina; Hossainpour, Siamak; Rezazadeh, Ghader

    2016-06-01

    In this paper, a mathematical model is presented for studying thin film damping of the surrounding fluid in an in-plane oscillating micro-beam resonator. The proposed model for this study is made up of a clamped-clamped micro-beam bound between two fixed layers. The micro-gap between the micro-beam and fixed layers is filled with air. As classical theories are not properly capable of predicting the size dependence behaviors of the micro-beam, and also behavior of micro-scale fluid media, hence in the presented model, equation of motion governing longitudinal displacement of the micro-beam has been extracted based on non-local elasticity theory. Furthermore, the fluid field has been modeled based on micro-polar theory. These coupled equations have been simplified using Newton-Laplace and continuity equations. After transforming to non-dimensional form and linearizing, the equations have been discretized and solved simultaneously using a Galerkin-based reduced order model. Considering slip boundary conditions and applying a complex frequency approach, the equivalent damping ratio and quality factor of the micro-beam resonator have been obtained. The obtained values for the quality factor have been compared to those based on classical theories. We have shown that applying non-classical theories underestimate the values of the quality factor obtained based on classical theories. The effects of geometrical parameters of the micro-beam and micro-scale fluid field on the quality factor of the resonator have also been investigated.
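
    For readers unfamiliar with the complex-frequency step mentioned above, the standard (paper-independent) relation is: if the reduced-order model yields a complex eigenfrequency \omega = \omega_R + i\,\omega_I for a time dependence e^{i\omega t}, then

      \zeta \approx \frac{\omega_I}{|\omega|}, \qquad Q = \frac{1}{2\zeta} \approx \frac{|\omega|}{2\,\omega_I},

    so the imaginary part produced by the fluid damping directly sets the equivalent damping ratio and quality factor.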

  5. An anisotropic constitutive equation for the stress tensor of blood based on mixture theory

    SciTech Connect

    Massoudi, M.; Antaki, J.

    2008-01-01

    Based on ideas proposed by Massoudi and Rajagopal (M-R), we develop a model for blood using the theory of interacting continua, that is, the mixture theory. We first provide a brief review of mixture theory, and then discuss certain issues in constitutive modeling of a two-component mixture. In the present formulation, we ignore the biochemistry of blood and assume that blood is composed of red blood cells (RBCs) suspended in plasma, where the plasma behaves as a linearly viscous fluid and the RBCs are modeled as an anisotropic nonlinear density-gradient-type fluid. We obtain a constitutive relation for blood, based on the simplified constitutive relations derived for plasma and RBCs. A simple shear flow is discussed, and an exact solution is obtained for a very special case; for more general cases, it is necessary to solve the nonlinear coupled equations numerically.

  6. An anisotropic constitutive equation for the stress tensor of blood based on mixture theory

    SciTech Connect

    Massoudi, Mehrdad; Antaki, J.F.

    2008-09-12

    Based on ideas proposed by Massoudi and Rajagopal (M-R), we develop a model for blood using the theory of interacting continua, that is, the mixture theory. We first provide a brief review of mixture theory, and then discuss certain issues in constitutive modeling of a two-component mixture. In the present formulation, we ignore the biochemistry of blood and assume that blood is composed of red blood cells (RBCs) suspended in plasma, where the plasma behaves as a linearly viscous fluid and the RBCs are modeled as an anisotropic nonlinear density-gradient-type fluid. We obtain a constitutive relation for blood, based on the simplified constitutive relations derived for plasma and RBCs. A simple shear flow is discussed, and an exact solution is obtained for a very special case; for more general cases, it is necessary to solve the nonlinear coupled equations numerically.

  7. Energy-Efficiency Analysis of a Distributed Queuing Medium Access Control Protocol for Biomedical Wireless Sensor Networks in Saturation Conditions

    PubMed Central

    Otal, Begonya; Alonso, Luis; Verikoukis, Christos

    2011-01-01

    The aging population and the high quality-of-life expectations in our society lead to the need for more efficient and affordable healthcare solutions. For this reason, this paper aims at the optimization of Medium Access Control (MAC) protocols for biomedical wireless sensor networks, or wireless Body Sensor Networks (BSNs). The schemes presented here target the efficient management of channel resources and the overall minimization of sensors' energy consumption in order to prolong battery life. The fact that the IEEE 802.15.4 MAC does not fully satisfy BSN requirements highlights the need for the design of new scalable MAC solutions, which guarantee low-power consumption to the maximum number of body sensors in high-density areas (i.e., in saturation conditions). In order to emphasize IEEE 802.15.4 MAC limitations, this article presents a detailed overview of this de facto standard for Wireless Sensor Networks (WSNs), which serves as a link to the introduction and initial description of our proposed Distributed Queuing (DQ) MAC protocol for BSN scenarios. Within this framework, an extensive DQ MAC energy-consumption analysis in saturation conditions is presented in order to evaluate its performance relative to the IEEE 802.15.4 MAC in highly dense BSNs. The obtained results show that the proposed scheme outperforms the IEEE 802.15.4 MAC in average energy consumption per information bit, thus providing a better overall performance that scales appropriately to BSNs under high traffic conditions. These benefits are obtained by eliminating back-off periods and collisions in data packet transmissions, while minimizing the control overhead. PMID:22319351

  8. Mixture theory-based poroelasticity as a model of interstitial tissue growth

    PubMed Central

    Cowin, Stephen C.; Cardoso, Luis

    2011-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  9. Mixture theory-based poroelasticity as a model of interstitial tissue growth.

    PubMed

    Cowin, Stephen C; Cardoso, Luis

    2012-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues.

  10. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    ERIC Educational Resources Information Center

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  11. Applying Unidimensional and Multidimensional Item Response Theory Models in Testlet-Based Reading Assessment

    ERIC Educational Resources Information Center

    Min, Shangchao; He, Lianzhen

    2014-01-01

    This study examined the relative effectiveness of the multidimensional bi-factor model and multidimensional testlet response theory (TRT) model in accommodating local dependence in testlet-based reading assessment with both dichotomously and polytomously scored items. The data used were 14,089 test-takers' item-level responses to the…

  12. Application of Online Multimedia Courseware in College English Teaching Based on Constructivism Theory

    ERIC Educational Resources Information Center

    Li, Zhenying

    2012-01-01

    Based on Constructivism Theory, this paper aims to investigate the application of online multimedia courseware to college English teaching. By making experiments and students' feedback, some experience has been accumulated, and some problems are discovered and certain revelations are acquired as well in English teaching practice, which pave the…

  13. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…

  14. Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders

    ERIC Educational Resources Information Center

    Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel

    2012-01-01

    This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…

  15. Critically Evaluating Competing Theories: An Exercise Based on the Kitty Genovese Murder

    ERIC Educational Resources Information Center

    Sagarin, Brad J.; Lawler-Sagarin, Kimberly A.

    2005-01-01

    We describe an exercise based on the 1964 murder of Catherine Genovese--a murder observed by 38 witnesses, none of whom called the police. Students read a summary of the murder and worked in small groups to design an experiment to test the competing theories for the inaction of the witnesses (Americans' selfishness and insensitivity vs. diffusion…

  16. Predicting Study Abroad Intentions Based on the Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Schnusenberg, Oliver; de Jong, Pieter; Goel, Lakshmi

    2012-01-01

    The emphasis on study abroad programs is growing in the academic context as U.S. based universities seek to incorporate a global perspective in education. Using a model that has underpinnings in the theory of planned behavior (TPB), we predict students' intention to participate in short-term study abroad program. We use TPB to identify behavioral,…

  17. Examining Instruction in MIDI-Based Composition through a Critical Theory Lens

    ERIC Educational Resources Information Center

    Louth, Paul

    2013-01-01

    This paper considers the issue of computer-assisted composition in formal music education settings from the perspective of critical theory. The author examines the case of MIDI-based software applications and suggests that the greatest danger from the standpoint of ideology critique is not the potential for circumventing a traditional…

  19. English Textbooks Based on Research and Theory--A Possible Dream.

    ERIC Educational Resources Information Center

    Suhor, Charles

    1984-01-01

    Research based text materials will probably never dominate the textbook market. To begin with, translating theory and research into practice is a chancy business. There are also creative problems such as the inherent oversimplification involved in textbook writing. Every textbook writer who has been a classroom teacher will acknowledge that such…

  20. Integrating the Demonstration Orientation and Standards-Based Models of Achievement Goal Theory

    ERIC Educational Resources Information Center

    Wynne, Heather Marie

    2014-01-01

    Achievement goal theory and thus, the empirical measures stemming from the research, are currently divided on two conceptual approaches, namely the reason versus aims-based models of achievement goals. The factor structure and predictive utility of goal constructs from the Patterns of Adaptive Learning Strategies (PALS) and the latest two versions…

  1. Portuguese Public University Student Satisfaction: A Stakeholder Theory-Based Approach

    ERIC Educational Resources Information Center

    Mainardes, Emerson; Alves, Helena; Raposo, Mario

    2013-01-01

    In accordance with the importance of the student stakeholder to universities, the objective of this research project was to evaluate student satisfaction at Portuguese public universities as regards their self-expressed core expectations. The research was based both on stakeholder theory itself and on previous studies of university stakeholders.…

  2. Item Response Theory with Estimation of the Latent Population Distribution Using Spline-Based Densities

    ERIC Educational Resources Information Center

    Woods, Carol M.; Thissen, David

    2006-01-01

    The purpose of this paper is to introduce a new method for fitting item response theory models with the latent population distribution estimated from the data using splines. A spline-based density estimation system provides a flexible alternative to existing procedures that use a normal distribution, or a different functional form, for the…

  3. A Three Year Outcome Evaluation of a Theory Based Drink Driving Education Program.

    ERIC Educational Resources Information Center

    Sheehan, Mary; And Others

    1996-01-01

    Reports on the impact of a "drink driving education program" taught to tenth-grade students. The program, which involved twelve lessons, used strategies based on the Ajzen and Madden theory of planned behavior. Students (N=1,774) were trained to use alternatives to drinking and driving and to use safer passenger behaviors, and were followed-up…

  4. Theory Presentation and Assessment in a Problem-Based Learning Group.

    ERIC Educational Resources Information Center

    Glenn, Phillip J.; Koschmann, Timothy; Conlee, Melinda

    A study used conversational analysis to examine the reasoning students use in a Problem-Based Learning (PBL) environment as they formulate a theory (in medical contexts, a diagnosis) which accounts for evidence (medical history and symptoms). A videotaped group interaction was analyzed and transcribed. In the segment of interaction examined, the…

  5. Aligning Theory and Web-Based Instructional Design Practice with Design Patterns.

    ERIC Educational Resources Information Center

    Frizell, Sherri S.; Hubscher, Roland

    Designing instructionally sound Web courses is a difficult task for instructors who lack experience in interaction and Web-based instructional design. Learning theories and instructional strategies can provide course designers with principles and design guidelines associated with effective instruction that can be utilized in the design of…

  6. Optimal guidance law for cooperative attack of multiple missiles based on optimal control theory

    NASA Astrophysics Data System (ADS)

    Sun, Xiao; Xia, Yuanqing

    2012-08-01

    This article considers the problem of optimal guidance laws for cooperative attack of multiple missiles based on the optimal control theory. New guidance laws are presented such that multiple missiles attack a single target simultaneously. Simulation results show the effectiveness of the proposed algorithms.

  7. The Triarchic Theory of Intelligence and Computer-based Inquiry Learning.

    ERIC Educational Resources Information Center

    Howard, Bruce C.; McGee, Steven; Shin, Namsoo; Shia, Regina

    2001-01-01

    Discussion of the triarchic theory of intelligence focuses on a study of ninth graders that explored the relationships between student abilities and the cognitive and attitudinal outcomes that resulted from student immersion in a computer-based inquiry environment. Examines outcome variables related to content understanding, problem solving, and…

  8. A Theory-based Faculty Development Program for Clinician-Educators.

    ERIC Educational Resources Information Center

    Hewson, Mariana G.

    2000-01-01

    Describes development, implementation, and evaluation of a theory-based faculty development program for physician-educators in medicine and pediatrics at the Cleveland Clinic (Ohio). The program includes a 12-hour course focused on precepting skills, bedside teaching, and effective feedback; on-site coaching; and innovative projects in clinical…

  9. Theory-Based Development and Testing of an Adolescent Tobacco-Use Awareness Program.

    ERIC Educational Resources Information Center

    Smith, Dennis W.; Colwell, Brian; Zhang, James J.; Brimer, Jennifer; McMillan, Catherine; Stevens, Stacey

    2002-01-01

    The Adolescent Tobacco Use Awareness and Cessation Program trial, based on social cognitive theory and transtheoretical model, was designed to develop, evaluate, and disseminate effective cessation programming related to Texas legislation. Data from participants and site facilitators indicated that significantly more participants were in the…

  10. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  11. Preparing Students for Education, Work, and Community: Activity Theory in Task-Based Curriculum Design

    ERIC Educational Resources Information Center

    Campbell, Chris; MacPherson, Seonaigh; Sawkins, Tanis

    2014-01-01

    This case study describes how sociocultural and activity theory were applied in the design of a publicly funded, Canadian Language Benchmark (CLB)-based English as a Second Language (ESL) credential program and curriculum for immigrant and international students in postsecondary institutions in British Columbia, Canada. The ESL Pathways Project…

  12. Operationalizing Levels of Academic Mastery Based on Vygotsky's Theory: The Study of Mathematical Knowledge

    ERIC Educational Resources Information Center

    Nezhnov, Peter; Kardanova, Elena; Vasilyeva, Marina; Ludlow, Larry

    2015-01-01

    The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky's theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was…

  13. Theory and Utility-Key Themes in Evidence-Based Assessment: Comment on the Special Section

    ERIC Educational Resources Information Center

    McFall, Richard M.

    2005-01-01

    This article focuses on two key themes in the four featured reviews on evidence-based assessment. The first theme is the essential role of theory in psychological assessment. An overview of this complex, multilayered role is presented. The second theme is the need for a common metric with which to gauge the utility of specific psychological tests…

  14. Imitation dynamics of vaccine decision-making behaviours based on the game theory.

    PubMed

    Yang, Junyuan; Martcheva, Maia; Chen, Yuming

    2016-01-01

    Based on game theory, we propose an age-structured model to investigate the imitation dynamics of vaccine uptake. We first obtain the existence and local stability of equilibria. We show that Hopf bifurcation can occur. We also establish the global stability of the boundary equilibria and persistence of the disease. The theoretical results are supported by numerical simulations.
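
    As a minimal, non-age-structured sketch of imitation dynamics of this general kind (the SIR coupling and all parameter values below are illustrative assumptions, not the paper's model), vaccine coverage x can be driven by a replicator-style payoff comparison whose benefit term grows with prevalence:

      # Simple SIR model coupled to imitation dynamics for vaccine uptake (illustrative only).
      beta, gamma, mu = 0.3, 0.1, 0.0001       # transmission, recovery, birth/turnover rates
      kappa, omega, cost_v = 0.1, 50.0, 1.0    # imitation rate, risk sensitivity, vaccination cost
      dt, steps = 0.05, 8000

      S, I, R, x = 0.95, 0.05, 0.0, 0.1        # x = fraction choosing to vaccinate
      for _ in range(steps):
          dS = mu * (1 - x) - beta * S * I - mu * S
          dI = beta * S * I - gamma * I - mu * I
          dR = mu * x + gamma * I - mu * R
          # Imitation (replicator) dynamics: switch toward vaccination when its payoff is higher.
          dx = kappa * x * (1 - x) * (-cost_v + omega * I)
          S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
          x = min(max(x + dt * dx, 0.0), 1.0)  # guard against Euler overshoot
      print(f"final coverage x = {x:.3f}, prevalence I = {I:.4f}")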

  15. Investigating Learner Attitudes toward E-Books as Learning Tools: Based on the Activity Theory Approach

    ERIC Educational Resources Information Center

    Liaw, Shu-Sheng; Huang, Hsiu-Mei

    2016-01-01

    This paper investigates the use of e-books as learning tools in terms of learner satisfaction, usefulness, behavioral intention, and learning effectiveness. Based on the activity theory approach, this research develops a research model to understand learner attitudes toward e-books in two physical sizes: 10″ and 7″. Results suggest that screen…

  16. Poverty Lines Based on Fuzzy Sets Theory and Its Application to Malaysian Data

    ERIC Educational Resources Information Center

    Abdullah, Lazim

    2011-01-01

    Defining the poverty line has been acknowledged as being highly variable by the majority of published literature. Despite long discussions and successes, poverty line has a number of problems due to its arbitrary nature. This paper proposes three measurements of poverty lines using membership functions based on fuzzy set theory. The three…

  17. Web-Support for Activating Use of Theory in Group-Based Learning.

    ERIC Educational Resources Information Center

    van der Veen, Jan; van Riemsdijk, Maarten; Laagland, Eelco; Gommer, Lisa; Jones, Val

    This paper describes a series of experiments conducted within the context of a course on organizational theory that is taught at the Department of Management Sciences at the University of Twente (Netherlands). In 1997, a group-based learning approach was adopted, but after the first year it was apparent that acquisition and application of theory…

  18. A Practice-Based Theory of Professional Education: Teach For America's Professional Development Model

    ERIC Educational Resources Information Center

    Gabriel, Rachael

    2011-01-01

    In 1999, Ball and Cohen proposed a practice-based theory of professional education, which would end inadequate professional development efforts with a more comprehensive approach. Their work has been referenced over the past decade, yet there have been limited attempts to actualize their ideals and research their implications. In this article, I…

  19. From Theory to Practice: Concept-Based Inquiry in a High School Art Classroom

    ERIC Educational Resources Information Center

    Walker, Margaret A.

    2014-01-01

    This study examines what an emerging educational theory looks like when put into practice in an art classroom. It explores the teaching methodology of a high school art teacher who has utilized concept-based inquiry in the classroom to engage his students in artmaking and analyzes the influence this methodology has had on his adolescent students.…

  20. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  1. Investigating Acceptance toward Mobile Learning to Assist Individual Knowledge Management: Based on Activity Theory Approach

    ERIC Educational Resources Information Center

    Liaw, Shu-Sheng; Hatala, Marek; Huang, Hsiu-Mei

    2010-01-01

    Mobile devices could facilitate human interaction and access to knowledge resources anytime and anywhere. With respect to wide application possibilities of mobile learning, investigating learners' acceptance towards it is an essential issue. Based on activity theory approach, this research explores positive factors for the acceptance of m-learning…

  2. Social Theory, Sacred Text, and Sing-Sing Prison: A Sociology of Community-Based Reconciliation.

    ERIC Educational Resources Information Center

    Erickson, Victoria Lee

    2002-01-01

    Examines the sociological component of the urban community-based professional education programs at New York Theological Seminary offered at Sing-Sing Prison. Explores the simultaneous use of social theory and sacred texts as teaching tools and intervention strategies in the educational and personal transformation processes of men incarcerated for…

  3. Issues in Inquiry-Based Science Education Seen through Dewey's Theory of Inquiry

    ERIC Educational Resources Information Center

    Won, Mihye

    2009-01-01

    To understand the issues of inquiry-based education, I adopted John Dewey's theory of inquiry as the analytical framework to examine science learning activities, students' interactions, and education standards. Educators have tried to engage students in meaningful learning, but the analysis revealed that the meaning of inquiry was diverse:…

  4. Effect of Cognitive-Behavioral-Theory-Based Skill Training on Academic Procrastination Behaviors of University Students

    ERIC Educational Resources Information Center

    Toker, Betül; Avci, Rasit

    2015-01-01

    This study examined the effectiveness of a cognitive-behavioral theory (CBT) psycho-educational group program on the academic procrastination behaviors of university students and the persistence of any training effect. This was a quasi-experimental study based on an experimental and control group pretest, posttest, and follow-up test model.…

  5. Supporting Self-Regulated Personalised Learning through Competence-Based Knowledge Space Theory

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Nussbaumer, Alexander; Albert, Dietrich

    2009-01-01

    This article presents two current research trends in e-learning that at first sight appear to compete. Competence-Based Knowledge Space Theory (CBKST) provides a knowledge representation framework which, since its invention by Doignon & Falmagne, has been successfully applied in various e-learning systems (for example, Adaptive Learning with…

  6. Effects of Guided Writing Strategies on Students' Writing Attitudes Based on Media Richness Theory

    ERIC Educational Resources Information Center

    Lan, Yu-Feng; Hung, Chun-Ling; Hsu, Hung-Ju

    2011-01-01

    The purpose of this paper is to develop different guided writing strategies based on media richness theory and further evaluate the effects of these writing strategies on younger students' writing attitudes in terms of motivation, enjoyment and anxiety. A total of 66 sixth-grade elementary students with an average age of twelve were invited to…

  7. Evidence-Based Practice in Kinesiology: The Theory to Practice Gap Revisited

    ERIC Educational Resources Information Center

    Knudson, Duane

    2005-01-01

    As evidence-based practice sweeps the applied health professions, it is a good time to evaluate the generation of knowledge in Kinesiology and its transmission to professionals and the public. Knowledge transmission has been debated in the past from the perspectives of the theory-to-practice gap and the discipline versus profession emphasis.…

  8. Can attentional theory explain the inverse base rate effect? Comment on Kruschke (2001).

    PubMed

    Winman, Anders; Wennerholm, Pia; Juslin, Peter

    2003-11-01

    In J. K. Kruschke's (2001; see record 2001-18940-005) study, it is argued that attentional theory is the sole satisfactory explanation of the inverse base rate effect and that eliminative inference (P. Juslin, P. Wennerholm, & A. Winman, 2001; see record 2001-07828-016) plays no role in the phenomenon. In this comment, the authors demonstrate that, in contrast to the central tenets of attentional theory, (a) rapid attention shifts as implemented in ADIT decelerate learning in the inverse base-rate task and (b) the claim that the inverse base-rate effect is directly caused by an attentional asymmetry is refuted by data. It is proposed that a complete account of the inverse base-rate effect needs to integrate attention effects with inference rules that are flexibly used for both induction and elimination. PMID:14622069

  9. [The model of the reward choice basing on the theory of reinforcement learning].

    PubMed

    Smirnitskaia, I A; Frolov, A A; Merzhanova, G Kh

    2007-01-01

    We developed a model of an alimentary instrumental conditioned bar-pressing reflex for cats making a choice between either an immediate small reinforcement ("impulsive behavior") or a delayed, more valuable reinforcement ("self-control behavior"). Our model is based on reinforcement learning theory. We emulated the dopamine contribution by the discount coefficient of this theory (a subjective decrease in the value of a delayed reinforcement). The results of computer simulation showed that "cats" with a large discount coefficient demonstrated "self-control behavior", whereas a small discount coefficient was associated with "impulsive behavior". These data are in agreement with the experimental data indicating that impulsive behavior is due to a decreased amount of dopamine in the striatum.
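
    A minimal sketch of the choice rule implied by the abstract (exponential discounting with a discount coefficient gamma; the reward sizes and delays are invented for illustration):

      # Exponentially discounted value of a reward delivered after 'delay' steps.
      def discounted_value(reward, delay, gamma):
          return (gamma ** delay) * reward

      small_now = (1.0, 0)      # (reward, delay): small immediate reinforcement
      large_later = (4.0, 10)   # larger but delayed reinforcement

      for gamma in (0.80, 0.95):               # small vs large discount coefficient
          v_now = discounted_value(*small_now, gamma)
          v_later = discounted_value(*large_later, gamma)
          choice = "self-control (wait)" if v_later > v_now else "impulsive (take now)"
          print(f"gamma = {gamma:.2f}: V_now = {v_now:.2f}, V_later = {v_later:.2f} -> {choice}")

    With gamma = 0.95 the delayed reward keeps most of its value and the simulated animal waits; with gamma = 0.80 the delayed value collapses and the immediate reward wins, matching the abstract's mapping between the discount coefficient and impulsive versus self-control behavior.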

  10. Theory-based behavior change interventions: comments on Hobbis and Sutton.

    PubMed

    Fishbein, Martin; Ajzen, Icek

    2005-01-01

    Hobbis and Sutton (this issue) suggest that Cognitive Behavior Therapy (CBT) techniques can be used in interventions based on the Theory of Planned Behavior (TPB). Although this suggestion has merit, CBT is only one of many applicable methods for producing belief and behavior change. Moreover, CBT's primary purpose is to help people carry out intended behaviors, not to influence intentions, and it is more useful in face-to-face than in community-level interventions. Contrary to Hobbis and Sutton's critique, TPB can accommodate core beliefs or fundamental assumptions, but the theory suggests that interventions targeted at such beliefs are less effective than interventions targeted at behavior-specific beliefs. PMID:15576497

  11. Learning control system design based on 2-D theory - An application to parallel link manipulator

    NASA Technical Reports Server (NTRS)

    Geng, Z.; Carroll, R. L.; Lee, J. D.; Haynes, L. H.

    1990-01-01

    An approach to iterative learning control system design based on two-dimensional system theory is presented. A two-dimensional model for the iterative learning control system which reveals the connections between learning control systems and two-dimensional system theory is established. A learning control algorithm is proposed, and the convergence of learning using this algorithm is guaranteed by two-dimensional stability. The learning algorithm is applied successfully to the trajectory tracking control problem for a parallel link robot manipulator. The excellent performance of this learning algorithm is demonstrated by the computer simulation results.
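
    A minimal sketch of the iterative learning idea on a scalar plant (a P-type update on an assumed first-order system, not the paper's 2-D formulation or the parallel-link manipulator):

      import numpy as np

      # Plant: x[t+1] = a*x[t] + b*u[t], y[t] = x[t]  (illustrative parameters).
      a, b, T, trials = 0.3, 0.5, 50, 8
      y_ref = np.sin(np.linspace(0, 2 * np.pi, T))   # desired trajectory, repeated every trial
      u = np.zeros(T)
      gamma = 1.2                                    # learning gain; converges when |1 - gamma*b| < 1

      for k in range(trials):
          x, y = 0.0, np.zeros(T)
          for t in range(T):
              y[t] = x
              x = a * x + b * u[t]
          e = y_ref - y
          u[:-1] += gamma * e[1:]                    # P-type ILC: correct u_k(t) with e_k(t+1)
          print(f"trial {k}: max |tracking error| = {np.abs(e).max():.4f}")

    The trial index and the time index play the roles of the two dimensions in the 2-D system view: stability along time gives boundedness within a trial, while the contraction condition |1 - gamma*b| < 1 gives convergence across trials.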

  12. A novel trust evaluation method for Ubiquitous Healthcare based on cloud computational theory.

    PubMed

    Athanasiou, Georgia; Fengou, Maria-Anna; Beis, Antonios; Lymberopoulos, Dimitrios

    2014-01-01

    The notion of trust is considered to be the cornerstone of the patient-psychiatrist relationship. Thus, a trustful background is a fundamental requirement for the provision of effective Ubiquitous Healthcare (UH) services. In this paper, the issue of trust evaluation of UH providers when they register in the UH environment is addressed. For that purpose a novel trust evaluation method is proposed, based on cloud theory and exploiting User Profile attributes. This theory mimics human thinking regarding trust evaluation and captures the fuzziness and randomness of this uncertain reasoning. Two case studies are investigated through simulation in MATLAB software in order to verify the effectiveness of this novel method.

  13. A method for calculating strain energy release rate based on beam theory

    NASA Technical Reports Server (NTRS)

    Sun, C. T.; Pandey, R. K.

    1993-01-01

    The Timoshenko beam theory was used to model cracked beams and to calculate the total strain energy release rate. The root rotation of the beam segments at the crack tip were estimated based on an approximate 2D elasticity solution. By including the strain energy released due to the root rotations of the beams during crack extension, the strain energy release rate obtained using beam theory agrees very well with the 2D finite element solution. Numerical examples were given for various beam geometries and loading conditions. Comparisons with existing beam models were also given.

  14. Improved method for calculating strain energy release rate based on beam theory

    NASA Technical Reports Server (NTRS)

    Sun, C. T.; Pandey, R. K.

    1994-01-01

    The Timoshenko beam theory was used to model cracked beams and to calculate the total strain-energy release rate. The root rotations of the beam segments at the crack tip were estimated based on an approximate two-dimensional elasticity solution. By including the strain energy released due to the root rotations of the beams during crack extension, the strain-energy release rate obtained using beam theory agrees very well with the two-dimensional finite element solution. Numerical examples were given for various beam geometries and loading conditions. Comparisons with existing beam models were also given.
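
    For context (a standard relation, not the correction derived in the paper), the strain energy release rate of a cracked beam specimen follows from its load-point compliance C = \delta / P via

      G = \frac{P^{2}}{2b}\,\frac{\mathrm{d}C}{\mathrm{d}a},

    where P is the applied load, b the specimen width, and a the crack length. For a double cantilever beam with Euler-Bernoulli arms, C = 2a^{3}/(3EI) and hence G = P^{2}a^{2}/(bEI); the Timoshenko-plus-root-rotation treatment in the paper refines C(a), and therefore G, toward the two-dimensional elasticity result.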

  15. Risk Assessment of Communication Network of Power Company Based on Rough Set Theory and Multiclass SVM

    NASA Astrophysics Data System (ADS)

    He, Xi; Wang, Wei; Liu, Xinyu; Ji, Yong

    This paper proposes a new risk assessment method based on the attribute reduction theory of rough sets and multiclass SVM classification. Rough set theory is introduced for data attribute reduction, and a multiclass SVM is used for automatic assessment of risk levels. Deleting redundant features of the data reduces the computational complexity of the multiclass SVM and improves its learning and generalization ability. The multiclass SVM, trained with empirical data, can then predict the risk level. Experiments show that the predicted results have relatively high precision and that the method is valid for power-network risk assessment.
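
    A hedged sketch of the two-stage idea (the greedy wrapper below is only a stand-in for a genuine rough-set reduct computation, and the data, attributes, and tolerance are invented):

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 8))                                   # 8 candidate risk attributes (synthetic)
      y = (X[:, 0] + 2 * X[:, 3] > 0).astype(int) + (X[:, 5] > 1)     # 3 risk levels: 0, 1, 2

      def greedy_reduct(X, y, tol=0.01):
          """Drop attributes whose removal barely changes class separability (reduct stand-in)."""
          keep = list(range(X.shape[1]))
          base = SVC(kernel="rbf").fit(X, y).score(X, y)
          for j in sorted(keep, reverse=True):
              trial = [c for c in keep if c != j]
              if SVC(kernel="rbf").fit(X[:, trial], y).score(X[:, trial], y) >= base - tol:
                  keep = trial
          return keep

      keep = greedy_reduct(X, y)
      Xtr, Xte, ytr, yte = train_test_split(X[:, keep], y, random_state=0)
      clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(Xtr, ytr)   # multiclass SVM
      print("kept attributes:", keep, "| test accuracy:", round(clf.score(Xte, yte), 3))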

  17. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of the diabetes self-management program for older Koreans (DSME-OK) using intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied an information-motivation-behavioral skills model, and interventions were targeted to 3 determinants to change health behaviors. Specific methods were selected to achieve each objective guided by IM protocol. As the final step, program evaluation was planned including a pilot test. The DSME-OK was structured as the 3 determinants of the IMB model were intervened to achieve behavior objectives in each session. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol in developing a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based and behavior outcome-focused health education programs. PMID:26062288

  18. Is social projection based on simulation or theory? Why new methods are needed for differentiating.

    PubMed

    Bazinger, Claudia; Kühberger, Anton

    2012-12-01

    The literature on social cognition reports many instances of a phenomenon titled 'social projection' or 'egocentric bias'. These terms indicate egocentric predictions, i.e., an over-reliance on the self when predicting the cognition, emotion, or behavior of other people. The classic method to diagnose egocentric prediction is to establish high correlations between our own and other people's cognition, emotion, or behavior. We argue that this method is incorrect because there is a different way to come to a correlation between own and predicted states, namely, through the use of theoretical knowledge. Thus, the use of correlational measures is not sufficient to identify the source of social predictions. Based on the distinction between simulation theory and theory theory, we propose the following alternative methods for inferring prediction strategies: independent vs. juxtaposed predictions, the use of 'hot' mental processes, and the use of participants' self-reports.

  19. Chaos theory as a planning tool for community-based educational experiences for health students.

    PubMed

    Velde, Beth P; Greer, Annette G; Lynch, Deirdre C; Escott-Stump, Sylvia

    2002-01-01

    Educational and community health systems are social systems composed of a group or collection of entities for which there is a unifying principle. The purpose of this paper is to briefly explain chaos theory and to apply it to the Interdisciplinary Rural Health Training Program (IRHTP) as a case study. The IRHTP is an existing rural, community based educational program for baccalaureate and graduate health care students. Chaos theory attempts to understand the underlying order in processes that appear to not have any guidelines or principles. These processes typically involve the interaction of several elements over time. Chaos theory provided the university with a method of anticipating the natural flux between order and chaos to allow the system to function at its highest level. To thrive in such a complex dynamic environment the authors recommend application of Ockerman's Five Factors. PMID:12227265

  20. Robust stabilization control based on guardian maps theory for a longitudinal model of hypersonic vehicle.

    PubMed

    Liu, Yanbin; Liu, Mengying; Sun, Peihua

    2014-01-01

    A typical model of hypersonic vehicle has the complicated dynamics such as the unstable states, the nonminimum phases, and the strong coupling input-output relations. As a result, designing a robust stabilization controller is essential to implement the anticipated tasks. This paper presents a robust stabilization controller based on the guardian maps theory for hypersonic vehicle. First, the guardian maps theories are provided to explain the constraint relations between the open subsets of complex plane and the eigenvalues of the state matrix of closed-loop control system. Then, a general control structure in relation to the guardian maps theories is proposed to achieve the respected design demands. Furthermore, the robust stabilization control law depending on the given general control structure is designed for the longitudinal model of hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods.

  1. Is social projection based on simulation or theory? Why new methods are needed for differentiating

    PubMed Central

    Bazinger, Claudia; Kühberger, Anton

    2012-01-01

    The literature on social cognition reports many instances of a phenomenon titled ‘social projection’ or ‘egocentric bias’. These terms indicate egocentric predictions, i.e., an over-reliance on the self when predicting the cognition, emotion, or behavior of other people. The classic method to diagnose egocentric prediction is to establish high correlations between our own and other people's cognition, emotion, or behavior. We argue that this method is incorrect because there is a different way to come to a correlation between own and predicted states, namely, through the use of theoretical knowledge. Thus, the use of correlational measures is not sufficient to identify the source of social predictions. Based on the distinction between simulation theory and theory theory, we propose the following alternative methods for inferring prediction strategies: independent vs. juxtaposed predictions, the use of ‘hot’ mental processes, and the use of participants’ self-reports. PMID:23209342

  2. A comparison of design variables for control theory based airfoil optimization

    NASA Technical Reports Server (NTRS)

    Reuther, James; Jameson, Antony

    1995-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work in the area it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using either the potential flow or the Euler equations with either a conformal mapping or a general coordinate system. We have also explored three-dimensional extensions of these formulations recently. The goal of our present work is to demonstrate the versatility of the control theory approach by designing airfoils using both Hicks-Henne functions and B-spline control points as design variables. The research also demonstrates that the parameterization of the design space is an open question in aerodynamic design.
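
    A minimal sketch of the Hicks-Henne "sine bump" design variables mentioned above, applied to an assumed NACA 0012 half-thickness baseline (the bump centres and amplitudes are arbitrary; this illustrates only the parameterization, not the adjoint-based optimization):

      import numpy as np

      def hicks_henne_bump(x, x_peak, width=3.0):
          """Bump that vanishes at x = 0 and x = 1 and peaks at x_peak."""
          m = np.log(0.5) / np.log(x_peak)
          return np.sin(np.pi * x ** m) ** width

      x = np.linspace(1e-6, 1.0, 201)                       # chordwise stations (avoid x = 0 exactly)
      y_base = (0.12 / 0.2) * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                               + 0.2843 * x**3 - 0.1015 * x**4)   # NACA 0012 half-thickness
      amplitudes = [0.002, -0.001, 0.0015]                  # design variables
      centers = [0.2, 0.5, 0.8]
      y_new = y_base + sum(a * hicks_henne_bump(x, xp) for a, xp in zip(amplitudes, centers))
      print("max surface perturbation:", round(float(np.max(np.abs(y_new - y_base))), 5))

    A B-spline parameterization would instead move a small set of control points, trading the localized support of the bumps for smoother global control; the abstract's point is that either choice spans a different design space for the same control-theory (adjoint) gradient machinery.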

  3. Robust Stabilization Control Based on Guardian Maps Theory for a Longitudinal Model of Hypersonic Vehicle

    PubMed Central

    Liu, Mengying; Sun, Peihua

    2014-01-01

    A typical model of hypersonic vehicle has the complicated dynamics such as the unstable states, the nonminimum phases, and the strong coupling input-output relations. As a result, designing a robust stabilization controller is essential to implement the anticipated tasks. This paper presents a robust stabilization controller based on the guardian maps theory for hypersonic vehicle. First, the guardian maps theories are provided to explain the constraint relations between the open subsets of complex plane and the eigenvalues of the state matrix of closed-loop control system. Then, a general control structure in relation to the guardian maps theories is proposed to achieve the respected design demands. Furthermore, the robust stabilization control law depending on the given general control structure is designed for the longitudinal model of hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods. PMID:24795535

  4. Experimental investigation and kinetic-theory-based model of a rapid granular shear flow

    NASA Astrophysics Data System (ADS)

    Wildman, R. D.; Martin, T. W.; Huntley, J. M.; Jenkins, J. T.; Viswanathan, H.; Fen, X.; Parker, D. J.

    An experimental investigation of an idealized rapidly sheared granular flow was performed to test the predictions of a model based on the kinetic theory of dry granular media. Glass ballotini beads were placed in an annular shear cell and the lower boundary rotated to induce a shearing motion in the bed. A single particle was tracked using the positron emission particle tracking (PEPT) technique, a method that determines the location of a particle through the triangulation of gamma photons emitted by a radioactive tracer particle. The packing fraction and velocity fields within the three-dimensional flow were measured and compared to the predictions of a model developed using the conservation and balance equations applicable to dissipative systems, and solved incorporating constitutive relations derived from kinetic theory. The comparison showed that kinetic theory is able to capture the general features of a rapid shear flow reasonably well over a wide range of shear rates and confining pressures.

  5. Analysis and synthesis of phase shifting algorithms based on linear systems theory

    NASA Astrophysics Data System (ADS)

    Servin, M.; Estrada, J. C.

    2012-08-01

    We review and update a recently published formalism for the theory of linear Phase Shifting Algorithms (PSAs) based on linear filtering (systems) theory, mainly using the Frequency Transfer Function (FTF). The FTF has been for decades the standard tool in Electrical Engineering to analyze and synthesize their linear systems. Given the well defined FTF approach (matured over the last century), it clarifies, in our view, many not fully understood properties of PSAs. We present easy formulae for the spectra of the PSAs (the FTF magnitude), their Signal to Noise (S/N) power-ratio gain, their detuning robustness, and their harmonic rejection in terms of the FTF. This paper has more practical appeal than previous publications by the same authors, hoping to enrich the understanding of this PSA's theory as applied to the analysis and synthesis of temporal interferometry algorithms in Optical Metrology.
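
    A small self-contained example of the Frequency Transfer Function idea (the classical four-step, 90-degree PSA is used here as an illustration; it is not claimed to be one of the paper's algorithms):

      import numpy as np

      # FTF of a linear PSA: H(w) = sum_n c_n * exp(-i*w*n).
      c = np.exp(1j * np.arange(4) * np.pi / 2)            # c_n = exp(i*n*pi/2) -> [1, i, -1, -i]

      def ftf(w, coeffs):
          n = np.arange(len(coeffs))
          return np.sum(coeffs * np.exp(-1j * np.outer(np.atleast_1d(w), n)), axis=1)

      w0 = np.pi / 2
      probe = np.array([0.0, -w0, w0])                     # background (DC), conjugate signal, signal
      print(np.round(np.abs(ftf(probe, c)), 6))            # -> [0, 0, 4]: DC and conjugate are rejected

    The spectrum |H(w)| read off this way is what the paper uses to quantify signal-to-noise gain, detuning robustness (how wide and flat the zero at w = -w0 is), and harmonic rejection.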

  6. Performance of Four-Leg VSC based DSTATCOM using Single Phase P-Q Theory

    NASA Astrophysics Data System (ADS)

    Jampana, Bangarraju; Veramalla, Rajagopal; Askani, Jayalaxmi

    2016-06-01

    This paper presents single-phase P-Q theory for a four-leg VSC-based distributed static compensator (DSTATCOM) in the distribution system. The proposed DSTATCOM maintains unity power factor at the source, provides zero voltage regulation, eliminates current harmonics, and performs load balancing and neutral current compensation. The advantage of using a four-leg VSC-based DSTATCOM is that it eliminates the isolated/non-isolated transformer connection at the point of common coupling (PCC) for neutral current compensation, which reduces the cost of the DSTATCOM. The single-phase P-Q theory control algorithm, based on the indirect current control method, is used to extract the fundamental active and reactive current components for the generation of reference source currents. The proposed DSTATCOM is modelled, and the results are validated with various consumer loads under unity power factor and zero voltage regulation modes in the MATLAB R2013a environment using the SimPowerSystems toolbox.
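
    A rough sketch of the single-phase P-Q idea behind such a controller (sign conventions vary; a practical controller would build the 90-degree-shifted components with a quarter-cycle delay or a SOGI rather than an offline Hilbert transform, and would use a low-pass filter instead of a window mean; all values below are illustrative):

      import numpy as np
      from scipy.signal import hilbert

      f, fs = 50.0, 10_000.0
      t = np.arange(0.0, 0.2, 1.0 / fs)
      v = 325.0 * np.sin(2 * np.pi * f * t)                                          # PCC voltage
      i_load = 40 * np.sin(2 * np.pi * f * t - 0.5) + 8 * np.sin(6 * np.pi * f * t)  # lagging + 3rd harmonic

      # Orthogonal (90-degree shifted) components via the analytic signal.
      v_a, v_b = v, np.imag(hilbert(v))
      i_a, i_b = i_load, np.imag(hilbert(i_load))

      p = v_a * i_a + v_b * i_b               # instantaneous active power (one common convention)
      q = v_b * i_a - v_a * i_b               # instantaneous reactive power
      p_dc = np.mean(p)                       # fundamental active component (stand-in for a low-pass filter)

      i_src_ref = p_dc * v_a / (v_a**2 + v_b**2)   # sinusoidal, in-phase reference source current
      i_comp = i_load - i_src_ref                  # current the DSTATCOM must inject
      print("load current peak:", round(float(np.max(np.abs(i_load))), 1),
            "-> reference source current peak:", round(float(np.max(np.abs(i_src_ref))), 1))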

  7. Evaluating clinical simulations for learning procedural skills: a theory-based approach.

    PubMed

    Kneebone, Roger

    2005-06-01

    Simulation-based learning is becoming widely established within medical education. It offers obvious benefits to novices learning invasive procedural skills, especially in a climate of decreasing clinical exposure. However, simulations are often accepted uncritically, with undue emphasis being placed on technological sophistication at the expense of theory-based design. The author proposes four key areas that underpin simulation-based learning, and summarizes the theoretical grounding for each. These are (1) gaining technical proficiency (psychomotor skills and learning theory, the importance of repeated practice and regular reinforcement), (2) the place of expert assistance (a Vygotskian interpretation of tutor support, where assistance is tailored to each learner's needs), (3) learning within a professional context (situated learning and contemporary apprenticeship theory), and (4) the affective component of learning (the effect of emotion on learning). The author then offers four criteria for critically evaluating new or existing simulations, based on the theoretical framework outlined above. These are: (1) Simulations should allow for sustained, deliberate practice within a safe environment, ensuring that recently-acquired skills are consolidated within a defined curriculum which assures regular reinforcement; (2) simulations should provide access to expert tutors when appropriate, ensuring that such support fades when no longer needed; (3) simulations should map onto real-life clinical experience, ensuring that learning supports the experience gained within communities of actual practice; and (4) simulation-based learning environments should provide a supportive, motivational, and learner-centered milieu which is conducive to learning.

  8. Personality and Psychopathology: a Theory-Based Revision of Eysenck's PEN Model.

    PubMed

    van Kampen, Dirk

    2009-12-08

    The principal aim of this paper is to investigate whether it is possible to create a personality taxonomy of clinical relevance out of Eysenck's original PEN model by repairing the various shortcomings that can be noted in Eysenck's personality theory, particularly in relation to P or Psychoticism. Addressing three approaches that have been followed to answer the question 'which personality factors are basic?', we list arguments showing that particularly the theory-informed approach, originally defended by Eysenck, may lead to scientific progress. However, also noting the many deficiencies in the nomological network surrounding P, the peculiar situation arises that we adhere to Eysenck's theory-informed methodology but criticize his theory. These arguments and criticisms led to the replacement of P by three orthogonal and theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), which, together with the dimensions E (Extraversion) and N (Neuroticism) retained from Eysenck's PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions.

  9. Remote sensing image classification method based on evidence theory and decision tree

    NASA Astrophysics Data System (ADS)

    Li, Xuerong; Xing, Qianguo; Kang, Lingyan

    2010-11-01

    Remote sensing image classification is an important and complex problem. Conventional remote sensing image classification methods are mostly based on Bayesian subjective probability theory, which has well-known difficulties in handling uncertainty. This paper first introduces evidence theory and the decision tree method, and then focuses on the support degree function used when evidence theory is applied to pattern recognition. Combining D-S evidence theory with the decision tree algorithm, a D-S evidence theory decision tree method is proposed, in which the support degree function acts as the tie between the two. The method is used to classify classes such as water, urban land and green land, taking the exclusive spectral feature parameters as input values and producing three support-degree classification images. A proper threshold value is then chosen and the corresponding images are binarized. These binary images are overlaid according to class to obtain the initial result, which is then subjected to an accuracy assessment. If the initial classification accuracy does not meet the requirement, the pixels with support degree below the threshold are reclassified until the final classification meets the accuracy requirements. Compared to Bayesian classification, the main advantages of this method are that it allows reclassification and reaches a very high accuracy. The method is finally used to classify land use in the Yantai Economic and Technological Development Zone into four classes, including urban land, green land and water, and effectively supports the classification task.
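
    Because the record above hinges on combining evidence from several sources, a small sketch of Dempster's rule of combination may help; the mass assignments below are invented numbers for illustration only, not values from the study.

      # Dempster's rule of combination for two mass functions over the same frame
      def combine(m1, m2):
          combined, conflict = {}, 0.0
          for a, wa in m1.items():
              for b, wb in m2.items():
                  inter = tuple(sorted(set(a) & set(b)))
                  if inter:
                      combined[inter] = combined.get(inter, 0.0) + wa * wb
                  else:
                      conflict += wa * wb                 # mass assigned to the empty set
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      m_spectral = {("water",): 0.6, ("urban", "green"): 0.3, ("water", "urban", "green"): 0.1}
      m_texture = {("water",): 0.5, ("green",): 0.2, ("water", "urban", "green"): 0.3}
      print(combine(m_spectral, m_texture))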

  10. Buckling analysis of functionally graded nanobeams based on a nonlocal third-order shear deformation theory

    NASA Astrophysics Data System (ADS)

    Rahmani, O.; Jandaghian, A. A.

    2015-06-01

    In this paper, a general third-order beam theory that accounts for nanostructure-dependent size effects and two-constituent material variation through the nanobeam thickness, i.e., a functionally graded material (FGM) beam, is presented. The material properties of FG nanobeams are assumed to vary through the thickness according to the power law. A detailed derivation of the equations of motion based on Eringen nonlocal theory using Hamilton's principle is presented, and a closed-form solution is derived for the buckling behavior of the new model with various boundary conditions. The nonlocal elasticity theory includes a material length scale parameter that can capture the size effect in a functionally graded material. The proposed model is efficient in predicting the shear effect in FG nanobeams by applying third-order shear deformation theory. The proposed approach is validated by comparing the obtained results with benchmark results available in the literature. In the following, a parametric study is conducted to investigate the influences of the length scale parameter, gradient index, and length-to-thickness ratio on the buckling of FG nanobeams, and the improvement of the nonlocal third-order shear deformation theory over the classical (local) beam model is shown. It is found that the length scale parameter is crucial in studying the stability behavior of nanobeams.
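
    The power-law gradation mentioned above is easy to state explicitly; the short sketch below evaluates the commonly used form P(z) = (P_c - P_m)(z/h + 1/2)^k + P_m through the thickness, with typical ceramic/metal moduli inserted as assumed values rather than the paper's data.

      import numpy as np

      E_c, E_m = 380e9, 70e9          # assumed ceramic (alumina) / metal (aluminium) moduli, Pa
      h, k = 100e-9, 2.0              # beam thickness and power-law (gradient) index
      z = np.linspace(-h / 2, h / 2, 5)
      E = (E_c - E_m) * (z / h + 0.5) ** k + E_m   # Young's modulus through the thickness
      print(E)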

  11. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission

    PubMed Central

    Kiani, Mehdi; Ghovanloo, Maysam

    2014-01-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368
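
    For readers who want a numerical feel for the reflected-load-theory result discussed above, the sketch below uses one common RLT closed form for a two-coil link; both the formula's exact normalization and the parameter values are assumptions for illustration, not the paper's measurements.

      # eta = (k^2 Q1 Q2L / (1 + k^2 Q1 Q2L)) * (Q2L / QL), with Q2L = Q2*QL/(Q2+QL)
      def rlt_efficiency(k, Q1, Q2, QL):
          Q2L = Q2 * QL / (Q2 + QL)
          return (k ** 2 * Q1 * Q2L / (1.0 + k ** 2 * Q1 * Q2L)) * (Q2L / QL)

      print(rlt_efficiency(k=0.05, Q1=200.0, Q2=150.0, QL=10.0))   # roughly 0.77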

  12. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission.

    PubMed

    Kiani, Mehdi; Ghovanloo, Maysam

    2012-09-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368

  13. Paying for express checkout: competition and price discrimination in multi-server queuing systems.

    PubMed

    Deck, Cary; Kimbrough, Erik O; Mongrain, Steeve

    2014-01-01

    We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting-cost-based price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus.
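
    The waiting-time trade-off at the heart of this record can be illustrated with textbook queueing formulas; the sketch below (a generic M/M/c comparison, not the authors' market model) contrasts one pooled two-server queue with two separate single-server queues at the same total load.

      from math import factorial

      def erlang_c_wait(lam, mu, c):
          """Expected time in queue for an M/M/c system (Erlang C)."""
          a, rho = lam / mu, lam / (c * mu)
          p0 = 1.0 / (sum(a ** k / factorial(k) for k in range(c))
                      + a ** c / (factorial(c) * (1 - rho)))
          prob_wait = a ** c / (factorial(c) * (1 - rho)) * p0
          return prob_wait / (c * mu - lam)

      lam, mu = 1.5, 1.0                          # assumed arrival and service rates
      print(erlang_c_wait(lam, mu, 2))            # pooled two-server queue (~1.29)
      print((lam / 2) / (mu * (mu - lam / 2)))    # each of two separate M/M/1 queues (3.0)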

  14. Paying for Express Checkout: Competition and Price Discrimination in Multi-Server Queuing Systems

    PubMed Central

    Deck, Cary; Kimbrough, Erik O.; Mongrain, Steeve

    2014-01-01

    We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting-cost-based price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus. PMID:24667809

  15. Membrane-Based Characterization of a Gas Component — A Transient Sensor Theory

    PubMed Central

    Lazik, Detlef

    2014-01-01

    Based on a multi-gas solution-diffusion problem for a dense symmetrical membrane this paper presents a transient theory of a planar, membrane-based sensor cell for measuring gas from both initial conditions: dynamic and thermodynamic equilibrium. Using this theory, the ranges for which previously developed, simpler approaches are valid will be discussed; these approaches are of vital interest for membrane-based gas sensor applications. Finally, a new theoretical approach is introduced to identify varying gas components by arranging sensor cell pairs resulting in a concentration independent gas-specific critical time. Literature data for the N2, O2, Ar, CH4, CO2, H2 and C4H10 diffusion coefficients and solubilities for a polydimethylsiloxane membrane were used to simulate gas specific sensor responses. The results demonstrate the influence of (i) the operational mode; (ii) sensor geometry and (iii) gas matrices (air, Ar) on that critical time. Based on the developed theory the case-specific suitable membrane materials can be determined and both operation and design options for these sensors can be optimized for individual applications. The results of mixing experiments for different gases (O2, CO2) in a gas matrix of air confirmed the theoretical predictions. PMID:24608004

  16. Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping

    PubMed Central

    Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E

    2014-01-01

    Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. PMID:24811880

  17. Agent-based modeling: a new approach for theory building in social psychology.

    PubMed

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach. PMID:18453457
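
    As a concrete, if deliberately trivial, illustration of the ABM idea described above (a generic toy model, not one from the article), the sketch below lets agents repeatedly interact in pairs and shift their continuous attitudes toward their partner's, so that a group-level pattern, convergence, emerges from purely local interactions.

      import random

      random.seed(1)
      agents = [random.uniform(-1.0, 1.0) for _ in range(100)]   # initial attitudes

      for step in range(5000):
          i, j = random.sample(range(len(agents)), 2)            # random pairwise interaction
          agents[i] += 0.1 * (agents[j] - agents[i])             # assumed influence rule
          agents[j] += 0.1 * (agents[i] - agents[j])

      print(round(max(agents) - min(agents), 4))                 # emergent convergence of attitudes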

  18. Comparison of inlet suppressor data with approximate theory based on cutoff ratio

    NASA Technical Reports Server (NTRS)

    Rice, E. J.; Heidelberg, L. J.

    1980-01-01

    This paper represents the initial quantitative comparison of inlet suppressor far-field directivity suppression with that predicted using an approximate liner design and evaluation method based upon mode cutoff ratio. The experimental data were obtained using a series of cylindrical point-reacting inlet liners on an Avco-Lycoming YF102 engine. The theoretical prediction program is based upon simplified sound propagation concepts derived from exact calculations. These indicate that all of the controlling phenomena can be approximately correlated with mode cutoff ratio, which itself is intimately related to the angles of propagation within the duct. The objective of the theory-data comparisons is to point out possible deficiencies in the approximate theory which may be corrected. After all theoretical refinements have been made, then empirical corrections can be applied.

  19. Comparison of inlet suppressor data with approximate theory based on cutoff ratio

    NASA Technical Reports Server (NTRS)

    Rice, E. J.; Heidelberg, L. J.

    1979-01-01

    Inlet suppressor far-field directivity suppression was quantitatively compared with that predicted using an approximate liner design and evaluation method based upon mode cutoff ratio. The experimental data were obtained using a series of cylindrical point-reacting inlet liners on a YF102 engine. The theoretical prediction program is based upon simplified sound propagation concepts derived from exact calculations. These indicate that all of the controlling phenomena can be approximately correlated with mode cutoff ratio, which itself is intimately related to the angles of propagation within the duct. The theory-data comparisons are intended to point out possible deficiencies in the approximate theory which may be corrected. After all theoretical refinements are made, then empirical corrections can be applied.

  20. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.

    PubMed

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  1. A theory-based logic model for innovation policy and evaluation.

    SciTech Connect

    Jordan, Gretchen B.

    2010-04-01

    Current policy and program rationale, objectives, and evaluation use a fragmented picture of the innovation process. This presents a challenge since in the United States officials in both the executive and legislative branches of government see innovation, whether that be new products or processes or business models, as the solution to many of the problems the country faces. The logic model is a popular tool for developing and describing the rationale for a policy or program and its context. This article sets out to describe generic logic models of both the R&D process and the diffusion process, building on existing theory-based frameworks. Then a combined, theory-based logic model for the innovation process is presented. Examples of the elements of the logic, each a possible leverage point or intervention, are provided, along with a discussion of how this comprehensive but simple model might be useful for both evaluation and policy development.

  2. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks

    PubMed Central

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  3. Design of Flexure-based Precision Transmission Mechanisms using Screw Theory

    SciTech Connect

    Hopkins, J B; Panas, R M

    2011-02-07

    This paper enables the synthesis of flexure-based transmission mechanisms that possess multiple decoupled inputs and outputs of any type (e.g. rotations, translations, and/or screw motions), which are linked by designer-specified transmission ratios. A comprehensive library of geometric shapes is utilized from which every feasible concept that possesses the desired transmission characteristics may be rapidly conceptualized and compared before an optimal concept is selected. These geometric shapes represent the rigorous mathematics of screw theory and uniquely link a body's desired motions to the flexible constraints that enable those motions. This paper's impact is most significant to the design of nano-positioners, microscopy stages, optical mounts, and sensors. A flexure-based microscopy stage was designed, fabricated, and tested to demonstrate the utility of the theory.

  4. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.

    PubMed

    Salim, Shelly; Moh, Sangman

    2016-06-30

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead.

  5. Theory-based evaluation of a comprehensive Latino education initiative: an interactive evaluation approach.

    PubMed

    Nesman, Teresa M; Batsche, Catherine; Hernandez, Mario

    2007-08-01

    Latino student access to higher education has received significant national attention in recent years. This article describes a theory-based evaluation approach used with ENLACE of Hillsborough, a 5-year project funded by the W.K. Kellogg Foundation for the purpose of increasing Latino student graduation from high school and college. Theory-based evaluation guided planning, implementation as well as evaluation through the process of developing consensus on the Latino population of focus, adoption of culturally appropriate principles and values to guide the project, and identification of strategies to reach, engage, and impact outcomes for Latino students and their families. The approach included interactive development of logic models that focused the scope of interventions and guided evaluation designs for addressing three stages of the initiative. Challenges and opportunities created by the approach are discussed, as well as ways in which the initiative impacted Latino students and collaborating educational institutions.

  6. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.

    2016-06-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
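
    The "maximum information content, minimum redundancy" objective lends itself to a simple greedy sketch; the code below is a generic illustration on synthetic series (it does not reproduce the paper's objective function or its handling of bridges).

      import numpy as np

      def entropy(x, bins=10):
          p, _ = np.histogram(x, bins=bins)
          p = p[p > 0] / p.sum()
          return -(p * np.log2(p)).sum()

      def mutual_info(x, y, bins=10):
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = pxy / pxy.sum()
          px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
          nz = pxy > 0
          return (pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum()

      rng = np.random.default_rng(0)
      series = rng.normal(size=(20, 500)).cumsum(axis=1)   # synthetic water levels at 20 candidate sections

      selected = [int(np.argmax([entropy(s) for s in series]))]
      while len(selected) < 5:
          scores = [-np.inf if idx in selected else
                    entropy(s) - max(mutual_info(s, series[j]) for j in selected)
                    for idx, s in enumerate(series)]
          selected.append(int(np.argmax(scores)))
      print(selected)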

  7. Eight myths on motivating social services workers: theory-based perspectives.

    PubMed

    Latting, J K

    1991-01-01

    A combination of factors has made formal motivational and reward systems rare in human service organizations generally and virtually non-existent in social service agencies. The author reviews eight of these myths by reference to eight motivational theories which refute them: need theory, expectancy theory, feedback theory, equity theory, reinforcement theory, cognitive evaluation theory, goal setting theory, and social influence theory. Although most of these theories have been developed and applied in the private sector, relevant research has also been conducted in social service agencies. The author concludes with a summary of guidelines suggested by the eight theories for motivating human service workers.

  8. Eight myths on motivating social services workers: theory-based perspectives.

    PubMed

    Latting, J K

    1991-01-01

    A combination of factors has made formal motivational and reward systems rare in human service organizations generally and virtually non-existent in social service agencies. The author reviews eight of these myths by reference to eight motivational theories which refute them: need theory, expectancy theory, feedback theory, equity theory, reinforcement theory, cognitive evaluation theory, goal setting theory, and social influence theory. Although most of these theories have been developed and applied in the private sector, relevant research has also been conducted in social service agencies. The author concludes with a summary of guidelines suggested by the eight theories for motivating human service workers. PMID:10114292

  9. Applying Educational Theory to Simulation-Based Training and Assessment in Surgery.

    PubMed

    Chauvin, Sheila W

    2015-08-01

    Considerable progress has been made regarding the range of simulator technologies and simulation formats. Similarly, results from research in human learning and behavior have facilitated the development of best practices in simulation-based training (SBT) and surgical education. Today, SBT is a common curriculum component in surgical education that can significantly complement clinical learning, performance, and patient care experiences. Beginning with important considerations for selecting appropriate forms of simulation, several relevant educational theories of learning are described. PMID:26210964

  10. Practice of Improving Roll Deformation Theory in Strip Rolling Process Based on Boundary Integral Equation Method

    NASA Astrophysics Data System (ADS)

    Yuan, Zhengwen; Xiao, Hong; Xie, Hongbiao

    2014-02-01

    Precise strip-shape control theory is important for improving rolled strip quality, and roll flattening theory is a primary part of the strip-shape theory. To improve the accuracy of roll flattening calculation based on the semi-infinite body model, a new and more accurate roll flattening model is proposed in this paper, derived using the boundary integral equation method. The displacement fields of the finite-length semi-infinite body on the left and right sides are simulated using the finite element method (FEM), and displacement decay functions on the left and right sides are established. Based on the new roll flattening model, a new 4Hi mill deformation model is established and verified by FEM. The new model is compared with the Foppl formula and the semi-infinite body model for different strip widths, roll shifting values and bending forces. The results show that the pressure and flattening between rolls calculated by the new model are more precise than those of the other two models, especially near the two roll barrel edges.

  11. Galloping of iced quad-conductors bundles based on curved beam theory

    NASA Astrophysics Data System (ADS)

    Yan, Zhitao; Savory, Eric; Li, Zhengliang; Lin, William E.

    2014-03-01

    Galloping refers to wind-induced, low-frequency, large-amplitude oscillations that have been more frequently observed for a bundle conductor than for a single conductor. In the present work two different models are built to investigate the galloping of a bundle conductor: (1) a finite curved beam element method and (2) a hybrid model based on curved beam element theory. The finite curved beam element model is effective in dealing with the spacers between the bundled conductors and the joint between the conductors and spacers, which can be simulated as a rigid joint or a hinge. Furthermore, the finite curved beam element model can be used to deal with large deformation. The hybrid model invokes the small deformation hypothesis and has a high computational efficiency. A hybrid model based on conventional cable element theory is also programmed to be compared with the aforementioned models based on curved beam element theory. Numerical examples are presented to assess the accuracy of the different models in predicting the equilibrium conductor position, natural frequencies and galloping amplitude. The results show that the curved beam element models, which involve more degrees of freedom and the coupling of translational and torsional motion, are more accurate at simulating the static and dynamic characteristics of an iced quad-conductor bundle. The use of hinges, rather than rigid connections, reduces the structural response amplitudes of a galloping conductor bundle.

  12. A general theory to analyse and design wireless power transfer based on impedance matching

    NASA Astrophysics Data System (ADS)

    Liu, Shuo; Chen, Linhui; Zhou, Yongchun; Cui, Tie Jun

    2014-10-01

    We propose a general theory to analyse and design the wireless power transfer (WPT) systems based on impedance matching. We take two commonly used structures as examples, the transformer-coupling-based WPT and the series/parallel capacitor-based WPT, to show how to design the impedance matching network (IMN) to obtain the maximum transfer efficiency and the maximum output power. Using the impedance matching theory (IMT), we derive a simple expression for the overall transfer efficiency in terms of the coils' quality factors and the coupling coefficient, which has perfect accuracy compared to full-circuit simulations. Full-wave electromagnetic software, CST Microwave Studio, has been used to extract the parameters of the coils, providing a comprehensive way to simulate WPT systems directly from the coils' physical model. We have also discussed the relationship between the output power and the transfer efficiency, and found that the maximum output power and the maximum transfer efficiency may occur at different frequencies. Hence, both power and efficiency should be considered in real WPT applications. To validate the proposed theory, two types of WPT experiments have been conducted using 30 cm-diameter coils for lighting a 20 W light bulb with 60% efficiency over a distance of 50 cm. The experimental results show very good agreement with the theoretical predictions.
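
    A widely used summary of what impedance matching can achieve in such links is the matched-load efficiency bound below; the closed form is a standard result expressed through the figure of merit U = k*sqrt(Q1*Q2) and is given here as an assumed illustration, not necessarily the exact expression derived in the paper.

      from math import sqrt

      def eta_max(k, Q1, Q2):
          """Maximum link efficiency with an impedance-matched load (standard form)."""
          U = k * sqrt(Q1 * Q2)
          return U ** 2 / (1.0 + sqrt(1.0 + U ** 2)) ** 2

      print(eta_max(k=0.02, Q1=300.0, Q2=300.0))   # illustrative loosely coupled coils, ~0.72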

  13. A queuing model for designing multi-modality buried target detection systems: preliminary results

    NASA Astrophysics Data System (ADS)

    Malof, Jordan M.; Morton, Kenneth D.; Collins, Leslie M.; Torrione, Peter A.

    2015-05-01

    Many remote sensing modalities have been developed for buried target detection, each one offering its own relative advantages over the others. As a result there has been interest in combining several modalities into a single detection platform that benefits from the advantages of each constituent sensor, without suffering from their weaknesses. Traditionally this involves collecting data continuously on all sensors and then performing data, feature, or decision level fusion. While this is effective for lowering false alarm rates, this strategy neglects the potential benefits of a more general system-level fusion architecture. Such an architecture can involve dynamically changing which modalities are in operation. For example, a large standoff modality such as a forward-looking infrared (FLIR) camera can be employed until an alarm is encountered, at which point a high performance (but short standoff) sensor, such as ground penetrating radar (GPR), is employed. Because the system is dynamically changing its rate of advance and sensors, it becomes difficult to evaluate the expected false alarm rate and advance rate. In this work, a probabilistic model is proposed that can be used to estimate these quantities based on a provided operating policy. In this model the system consists of a set of states (e.g., sensors employed) and conditions encountered (e.g., alarm locations). The predictive accuracy of the model is evaluated using a collection of collocated FLIR and GPR data and the results indicate that the model is effective at predicting the desired system metrics.
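
    A very small renewal-reward calculation conveys the kind of trade-off such a model captures; the structure and all numbers below are assumptions for illustration and are not the probabilistic model fitted in the paper.

      v_flir = 2.0          # assumed advance rate while scanning with FLIR only, m/s
      alarm_rate = 0.05     # assumed FLIR alarms per metre (targets plus false alarms)
      t_gpr = 30.0          # assumed seconds of GPR interrogation per alarm
      p_dismiss = 0.8       # assumed fraction of FLIR alarms dismissed by GPR

      time_per_metre = 1.0 / v_flir + alarm_rate * t_gpr
      print(1.0 / time_per_metre)                    # effective advance rate, m/s
      print(alarm_rate * (1.0 - p_dismiss) * 1000)   # declared alarms per km after GPR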

  14. Vervet monkeys solve a multiplayer "forbidden circle game" by queuing to learn restraint.

    PubMed

    Fruteau, Cécile; van Damme, Eric; Noë, Ronald

    2013-04-22

    In social dilemmas, the ability of individuals to coordinate their actions is crucial to reach group optima. Unless exacted by power or force, coordination in humans relies on a common understanding of the problem, which is greatly facilitated by communication. The lack of means of consultation about the nature of the problem and how to solve it may explain why multiagent coordination in nonhuman vertebrates has commonly been observed only when multiple individuals react instantaneously to a single stimulus, either natural or experimentally simulated, for example a predator, a prey, or a neighboring group. Here we report how vervet monkeys solved an experimentally induced coordination problem. In each of three groups, we trained a low-ranking female, the "provider," to open a container holding a large amount of food, which the providers only opened when all individuals dominant to them ("dominants") stayed outside an imaginary "forbidden circle" around it. Without any human guidance, the dominants learned restraint one by one, in hierarchical order from high to low. Once all dominants showed restraint immediately at the start of the trial, the providers opened the container almost instantly, saving all individuals opportunity costs due to lost foraging time. Solving this game required trial-and-error learning based on individual feedback from the provider to each dominant, and all dominants being patient enough to wait outside the circle while others learned restraint. Communication, social learning, and policing by high-ranking animals played no perceptible role.

  15. Theory of plasma contactors in ground-based experiments and low Earth orbit

    NASA Technical Reports Server (NTRS)

    Gerver, M. J.; Hastings, Daniel E.; Oberhardt, M. R.

    1990-01-01

    Previous theoretical work on plasma contactors as current collectors has fallen into two categories: collisionless double layer theory (describing space charge limited contactor clouds) and collisional quasineutral theory. Ground based experiments at low current are well explained by double layer theory, but this theory does not scale well to power generation by electrodynamic tethers in space, since very high anode potentials are needed to draw a substantial ambient electron current across the magnetic field in the absence of collisions (or effective collisions due to turbulence). Isotropic quasineutral models of contactor clouds, extending over a region where the effective collision frequency ν_e exceeds the electron cyclotron frequency ω_ce, have low anode potentials, but would collect very little ambient electron current, much less than the emitted ion current. A new model is presented, for an anisotropic contactor cloud oriented along the magnetic field, with ν_e less than ω_ce. The electron motion along the magnetic field is nearly collisionless, forming double layers in that direction, while across the magnetic field the electrons diffuse collisionally and the potential profile is determined by quasineutrality. Using a simplified expression for ν_e due to ion acoustic turbulence, an analytic solution has been found for this model, which should be applicable to current collection in space. The anode potential is low and the collected ambient electron current can be several times the emitted ion current.

  16. Analysis of dislocation pile-ups using a dislocation-based continuum theory

    NASA Astrophysics Data System (ADS)

    Schulz, K.; Dickel, D.; Schmitt, S.; Sandfeld, S.; Weygand, D.; Gumbsch, P.

    2014-03-01

    The increasing demand for materials with well-defined microstructure, accompanied by the advancing miniaturization of devices, is the reason for the growing interest in physically motivated, dislocation-based continuum theories of plasticity. In recent years, various advanced continuum theories have been introduced that are able to describe the motion of straight and curved dislocation lines. The focus of this paper is the question of how to include fundamental properties of discrete dislocations during their motion and interaction in a continuum dislocation dynamics (CDD) theory. In our CDD model, we obtain elastic interaction stresses for the bundles of dislocations by a mean-field stress, which represents long-range stress components, and a short-range corrective stress component, which represents the gradients of the local dislocation density. The attracting and repelling behavior of bundles of straight dislocations of the same and opposite sign are analyzed. Furthermore, considering different dislocation pile-up systems, we show that the CDD formulation can solve various fundamental problems of micro-plasticity. To obtain a mesh size independent formulation (which is a prerequisite for further application of the theory to more complex situations), we propose a discretization dependent scaling of the short range interaction stress. CDD results are compared to analytical solutions and benchmark data obtained from discrete dislocation simulations.

  17. Energy decomposition analysis based on a block-localized wavefunction and multistate density functional theory

    PubMed Central

    Bao, Peng

    2013-01-01

    An interaction energy decomposition analysis method based on the block-localized wavefunction (BLW-ED) approach is described. The first main feature of the BLW-ED method is that it combines concepts of valence bond and molecular orbital theories such that the intermediate and physically intuitive electron-localized states are variationally optimized by self-consistent field calculations. Furthermore, the block-localization scheme can be used both in wave function theory and in density functional theory, providing a useful tool to gain insights on intermolecular interactions that would otherwise be difficult to obtain using the delocalized Kohn–Sham DFT. These features allow broad applications of the BLW method to energy decomposition (BLW-ED) analysis for intermolecular interactions. In this perspective, we outline theoretical aspects of the BLW-ED method, and illustrate its applications in hydrogen-bonding and π–cation intermolecular interactions as well as metal–carbonyl complexes. Future prospects on the development of a multistate density functional theory (MSDFT) are presented, making use of block-localized electronic states as the basis configurations. PMID:21369567

  18. Adolescent decision making: a broadly based theory and its application to the prevention of early pregnancy.

    PubMed

    Gordon, C P

    1996-01-01

    The purpose of this paper is to present a broadly based theory of adolescent decision making including all the necessary components of the subject: cognitive development, social and psychological factors, and, perhaps most importantly, cultural and societal influences. Previous theories and applications have often focused on only one or two aspects. This theory is then applied to the problem of prevention of early pregnancy at an inner-city high school. Use of this theory, combined with an open-ended data-gathering format made possible some of the unexpected findings of this study: most of the young women at this school desire their pregnancies; many of them prefer single parenthood to traditional family structure; and low academic skills and poverty often result in pregnancy, rather than pregnancy causing high school dropouts and a life of poverty. Prevention programs will necessarily differ for sexually active adolescents who do and do not want pregnancy and for younger versus older adolescents. In designing such programs, we need to focus on pregnancy as the problem rather than on adolescent sexuality.

  19. Three new branched chain equations of state based on Wertheim's perturbation theory

    NASA Astrophysics Data System (ADS)

    Marshall, Bennett D.; Chapman, Walter G.

    2013-05-01

    In this work, we present three new branched chain equations of state (EOS) based on Wertheim's perturbation theory. The first represents a slightly approximate general branched chain solution of Wertheim's second order perturbation theory (TPT2) for athermal hard chains, and the second represents the extension of first order perturbation theory with a dimer reference fluid (TPT1-D) to branched athermal hard chain molecules. Each athermal branched chain EOS was shown to give improved results over their linear counterparts when compared to simulation data for branched chain molecules with the branched TPT1-D EOS being the most accurate. Further, it is shown that the branched TPT1-D EOS can be extended to a Lennard-Jones dimer reference system to obtain an equation of state for branched Lennard-Jones chains. The theory is shown to accurately predict the change in phase diagram and vapor pressure which results from branching as compared to experimental data for n-octane and corresponding branched isomers.

  20. A Research of Weapon System Storage Reliability Simulation Method Based on Fuzzy Theory

    NASA Astrophysics Data System (ADS)

    Shi, Yonggang; Wu, Xuguang; Chen, Haijian; Xu, Tingxue

    Aimed at the problem of storage reliability analysis for new, complicated weapon equipment systems, this paper investigates methods of fuzzy fault tree analysis and fuzzy system storage reliability simulation, discusses the approach of treating the weapon system as a fuzzy system, and studies the storage reliability of the weapon system based on fuzzy theory, providing a storage reliability research method for new, complicated weapon equipment systems. As an example, the fuzzy fault tree of one type of missile control instrument is built based on function analysis, and the fuzzy system storage reliability simulation method is used to analyze the storage reliability index of the control instrument.

  1. Identification of a novel V1-type AVP receptor based on the molecular recognition theory.

    PubMed Central

    Herrera, V. L.; Ruiz-Opazo, N.

    2001-01-01

    BACKGROUND: The molecular recognition theory predicts that binding domains of peptide hormones and their corresponding receptor binding domains evolved from complementary strands of genomic DNA, and that a process of selective evolutionary mutational events within these primordial domains gave rise to the high affinity and high specificity of peptide hormone-receptor interactions observed today in different peptide hormone-receptor systems. Moreover, this theory has been broadened as a general hypothesis that could explain the evolution of intermolecular protein-protein and intramolecular peptide interactions. MATERIALS AND METHODS: Applying a molecular cloning strategy based on the molecular recognition theory, we screened a rat kidney cDNA library with a vasopressin (AVP) antisense oligonucleotide probe, expecting to isolate potential AVP receptors. RESULTS: We isolated a rat kidney cDNA encoding a functional V1-type vasopressin receptor. Structural analysis identified a 135 amino acid-long polypeptide with a single transmembrane domain, quite distinct from the rhodopsin-based G protein-coupled receptor superfamily. Functional analysis of the expressed V1-type receptor in Cos-1 cells revealed AVP-specific binding, AVP-specific coupling to Ca2+ mobilizing transduction system, and characteristic V1-type antagonist inhibition. CONCLUSIONS: This is the second AVP receptor cDNA isolated using AVP antipeptide-based oligonucleotide screening, thus providing compelling evidence in support of the molecular recognition theory as the basis of the evolution of this peptide hormone-receptor system, as well as adds molecular complexity and diversity to AVP receptor systems. PMID:11683375

  2. Extended weighted fair queuing (EWFQ) algorithm for broadband applications including multicast traffic

    NASA Astrophysics Data System (ADS)

    Tufail, Mudassir; Cousin, Bernard

    1997-10-01

    Ensuring end-to-end bounded delay and fair allocation of bandwidth to a backlogged session are no longer the only criteria for declaring a queue service scheme good. With the evolution of packet-switched networks, more and more distributed and multimedia applications are being developed. These applications demand that the service offered to them be homogeneously distributed at all instants, in contrast to the back-to-back serving of packets in the WFQ scheme. There are two reasons for this demand for homogeneous service: (1) In feedback-based congestion control algorithms, sources constantly sample the network state using feedback from the receiver. The source modifies its emission rate in accordance with the feedback message. A reliable feedback message is only possible if the packet service is homogeneous. (2) In multicast applications, where packet replication is performed at switches, replicated packets are likely to be served at different rates if the service offered to them, at different output ports, is not homogeneous. This is not desirable, as the replication of packets to different multicast branches at a switch has to be carried out at a homogeneous speed for the following two important reasons: (1) heterogeneous service rates of replicated multicast packets result in different feedback information from different destinations (of the same multicast session), and thus lead to unstable and less efficient network control; (2) in a switch architecture, the buffer requirement can be reduced if replication and serving of multicast packets are done at a homogeneous rate. Thus, there is a need for a service discipline that not only serves applications at no less than their guaranteed rates but also assures a homogeneous service to packets. Homogeneous service to an application may precisely be translated in terms of maintaining good inter-packet spacing. The EWFQ scheme is identical to the WFQ scheme except that a packet is stamped with delayed
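
    The finish-time stamping that WFQ-type schedulers rely on is summarized by F_i^k = max(F_i^{k-1}, V(a)) + L/w_i; the sketch below illustrates that stamping rule only (it is not the EWFQ scheme itself, and the GPS virtual time V(a) is approximated by the arrival time, which is exact only for this toy trace).

      def wfq_order(packets, weights):
          """packets: (arrival, session, length) tuples; returns (finish_tag, session) in service order."""
          last_finish = {s: 0.0 for s in weights}
          stamped = []
          for arrival, session, length in sorted(packets):
              f = max(last_finish[session], arrival) + length / weights[session]
              last_finish[session] = f
              stamped.append((f, session))
          return sorted(stamped)

      pkts = [(0.0, "A", 100), (0.0, "B", 100), (50.0, "A", 100), (60.0, "B", 100)]
      print(wfq_order(pkts, weights={"A": 2.0, "B": 1.0}))   # session A is served twice as often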

  3. Simple Models for Airport Delays During Transition to a Trajectory-Based Air Traffic System

    NASA Astrophysics Data System (ADS)

    Brooker, Peter

    It is now widely recognised that a paradigm shift in air traffic control concepts is needed. This requires state-of-the-art innovative technologies, making much better use of the information in the air traffic management (ATM) system. These paradigm shifts go under the names of NextGen in the USA and SESAR in Europe, which inter alia will make dramatic changes to the nature of airport operations. A vital part of moving from an existing system to a new paradigm is the operational implications of the transition process. There would be business incentives for early aircraft fitment, it is generally safer to introduce new technologies gradually, and researchers are already proposing potential transition steps to the new system. Simple queuing theory models are used to establish rough quantitative estimates of the impact of the transition to a more efficient time-based navigational and ATM system. Such models are approximate, but they do offer insight into the broad implications of system change and its significant features. 4D-equipped aircraft in essence have a contract with the airport runway and, in return, they would get priority over any other aircraft waiting for use of the runway. The main operational feature examined here is the queuing delays affecting non-4D-equipped arrivals. These get a reasonable service if the proportion of 4D-equipped aircraft is low, but this can deteriorate markedly for high proportions, and be economically unviable. Preventative measures would be to limit the additional growth of 4D-equipped flights and/or to modify their contracts to provide sufficient space for the non-4D-equipped flights to operate without excessive delays. There is a potential for non-Poisson models, for which there is little in the literature, and for more complex models, e.g. grouping a succession of 4D-equipped aircraft as a batch.
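
    The deterioration of delays for non-4D-equipped arrivals can be roughed out with the textbook non-preemptive priority M/M/1 formulas; the parameter values below are purely illustrative and do not reproduce the paper's models.

      def priority_waits(lam_total, frac_equipped, mu):
          """Mean queueing delay (hours) for priority class 1 (4D-equipped) and class 2."""
          lam1, lam2 = lam_total * frac_equipped, lam_total * (1 - frac_equipped)
          rho1, rho2 = lam1 / mu, lam2 / mu
          w0 = (lam1 + lam2) / mu ** 2          # mean residual work (exponential service)
          w1 = w0 / (1 - rho1)
          w2 = w0 / ((1 - rho1) * (1 - rho1 - rho2))
          return w1, w2

      mu = 40.0                                  # assumed runway service rate, aircraft/hour
      for frac in (0.2, 0.5, 0.8, 0.9):
          w1, w2 = priority_waits(lam_total=36.0, frac_equipped=frac, mu=mu)
          print(frac, round(60 * w1, 1), round(60 * w2, 1))   # delays in minutes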

  4. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that the wavelet-based extreme value theory increases predictive performance of financial forecasting according to number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
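
    Once the generalized Pareto tail has been fitted above a (here, wavelet-derived) threshold, the VaR follows from the standard peaks-over-threshold formula; all parameter values below are invented for illustration and are not estimates from the two markets studied.

      def gpd_var(u, sigma, xi, n, n_exceed, q):
          """VaR_q = u + (sigma/xi) * ((n/n_exceed * (1 - q))**(-xi) - 1)."""
          return u + (sigma / xi) * ((n / n_exceed * (1 - q)) ** (-xi) - 1)

      print(gpd_var(u=0.02, sigma=0.008, xi=0.15, n=2500, n_exceed=120, q=0.99))   # one-day 99% VaR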

  5. Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study

    PubMed Central

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J.

    2014-01-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person grows older. The purpose of this mixed methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assessing the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3 month and 6 month visits (examples: FES, GDS-SF, Mini-cog). Semi-structured interviews to characterize acceptability of the technology were conducted at 3 month and 6 month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation. PMID:23938159

  6. The theory of community based health and safety programs: a critical examination

    PubMed Central

    Nilsen, P

    2006-01-01

    This paper examines the theoretical underpinning of the community based approach to health and safety programs. Drawing upon the literature, a theory is constructed by elucidating assumptions of community based programs. The theory is then put to test by analyzing the extent to which the assumptions are supported by empirical evidence and the extent to which the assumptions have been applied in community based injury prevention practice. Seven principles representing key assumptions of the community based approach to health and safety programs are identified. The analysis suggests that some of the principles may have important shortcomings. Programs overwhelmingly define geographical or geopolitical units as communities, which is problematic considering that these entities can be heterogeneous and characterized by a weak sense of community. This may yield insufficient community mobilization and inadequate program reach. At the same time, none of the principles identified as most plausible appears to be widely or fully applied in program practice. The implication is that many community based health and safety programs do not function at an optimum level, which could explain some of the difficulties in demonstrating effectiveness seen with many of these programs. PMID:16751442

  7. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.

  8. Promoting fruit and vegetable consumption. Testing an intervention based on the theory of planned behaviour.

    PubMed

    Kothe, E J; Mullan, B A; Butow, P

    2012-06-01

    This study evaluated the efficacy of a theory of planned behaviour (TPB) based intervention to increase fruit and vegetable consumption. The extent to which fruit and vegetable consumption and change in intake could be explained by the TPB was also examined. Participants were randomly assigned to two levels of intervention frequency matched for intervention content (low frequency n=92, high frequency n=102). Participants received TPB-based email messages designed to increase fruit and vegetable consumption, messages targeted attitude, subjective norm and perceived behavioural control (PBC). Baseline and post-intervention measures of TPB variables and behaviour were collected. Across the entire study cohort, fruit and vegetable consumption increased by 0.83 servings/day between baseline and follow-up. Intention, attitude, subjective norm and PBC also increased (p<.05). The TPB successfully modelled fruit and vegetable consumption at both time points but not behaviour change. The increase of fruit and vegetable consumption is a promising preliminary finding for those primarily interested in increasing fruit and vegetable consumption. However, those interested in theory development may have concerns about the use of this model to explain behaviour change in this context. More high quality experimental tests of the theory are needed to confirm this result.

  9. Compressed sensing theory-based channel estimation for optical orthogonal frequency division multiplexing communication system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping

    2014-09-01

    Due to the sparse multipath property of the channel, a channel estimation method, based on a partial superimposed training sequence and compressed sensing theory, is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of the orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed using the training symbols at the receiver. Finally, the channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partial superimposed training sequence not only improves the spectral efficiency but also reduces the influence on the information symbols. In addition, compared with the classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory-based channel estimation method improves both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
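
    The abstract does not name the sparse reconstruction algorithm, so the sketch below uses orthogonal matching pursuit purely as a representative sparse-recovery routine for a synthetic sparse channel; the observation matrix here is random rather than built from the paper's superimposed training sequence.

      import numpy as np

      def omp(Phi, y, k):
          """Orthogonal matching pursuit: recover a k-sparse x from y = Phi @ x."""
          residual, support = y.copy(), []
          for _ in range(k):
              support.append(int(np.argmax(np.abs(Phi.T @ residual))))
              x_ls, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
              residual = y - Phi[:, support] @ x_ls
          x = np.zeros(Phi.shape[1])
          x[support] = x_ls
          return x

      rng = np.random.default_rng(3)
      L, taps, M = 64, 4, 20                     # channel length, sparsity, number of observations
      h = np.zeros(L)
      h[rng.choice(L, taps, replace=False)] = rng.normal(size=taps)
      Phi = rng.normal(size=(M, L)) / np.sqrt(M)
      y = Phi @ h + 0.01 * rng.normal(size=M)
      print(np.linalg.norm(h - omp(Phi, y, taps)) / np.linalg.norm(h))   # relative recovery error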

  10. Promoting fruit and vegetable consumption. Testing an intervention based on the theory of planned behaviour.

    PubMed

    Kothe, E J; Mullan, B A; Butow, P

    2012-06-01

    This study evaluated the efficacy of a theory of planned behaviour (TPB) based intervention to increase fruit and vegetable consumption. The extent to which fruit and vegetable consumption and change in intake could be explained by the TPB was also examined. Participants were randomly assigned to two levels of intervention frequency matched for intervention content (low frequency n=92, high frequency n=102). Participants received TPB-based email messages designed to increase fruit and vegetable consumption; the messages targeted attitude, subjective norm and perceived behavioural control (PBC). Baseline and post-intervention measures of TPB variables and behaviour were collected. Across the entire study cohort, fruit and vegetable consumption increased by 0.83 servings/day between baseline and follow-up. Intention, attitude, subjective norm and PBC also increased (p<.05). The TPB successfully modelled fruit and vegetable consumption at both time points but not behaviour change. The increase in fruit and vegetable consumption is a promising preliminary finding for those primarily interested in increasing fruit and vegetable consumption. However, those interested in theory development may have concerns about the use of this model to explain behaviour change in this context. More high-quality experimental tests of the theory are needed to confirm this result. PMID:22349778

  11. Web-Based Learning Environment: A Theory-Based Design Process for Development and Evaluation

    ERIC Educational Resources Information Center

    Nam, Chang S.; Smith-Jackson, Tonya L.

    2007-01-01

    Web-based courses and programs have increasingly been developed by many academic institutions, organizations, and companies worldwide due to their benefits for both learners and educators. However, many of the developmental approaches lack two important considerations needed for implementing Web-based learning applications: (1) integration of the…

  12. When learning order affects sensitivity to base rates: challenges for theories of causal learning.

    PubMed

    Reips, Ulf-Dietrich; Waldmann, Michael R

    2008-01-01

    In three experiments we investigated whether two procedures of acquiring knowledge about the same causal structure, predictive learning (from causes to effects) versus diagnostic learning (from effects to causes), would lead to different base-rate use in diagnostic judgments. Results showed that learners are capable of incorporating base-rate information in their judgments regardless of the direction in which the causal structure is learned. However, this only holds true for relatively simple scenarios. When complexity was increased, base rates were only used after diagnostic learning, but were largely neglected after predictive learning. It could be shown that this asymmetry is not due to a failure of encoding base rates in predictive learning because participants in all conditions were fairly good at reporting them. The findings present challenges for all theories of causal learning.
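
    For reference, the normative benchmark behind such judgments is a one-line Bayes calculation in which the base rate of the cause enters directly; the numbers below are hypothetical and only illustrate how strongly the diagnostic probability depends on that base rate.

base_rate = 0.1              # P(cause)
p_effect_given_cause = 0.8   # P(effect | cause)
p_effect_given_nocause = 0.2 # P(effect | no cause)

p_effect = p_effect_given_cause * base_rate + p_effect_given_nocause * (1 - base_rate)
posterior = p_effect_given_cause * base_rate / p_effect
print(round(posterior, 3))   # 0.308 -- far below 0.8, because the cause is rare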

  13. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach

    PubMed Central

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing. PMID:27462287

  14. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, and the solution approach has some degrees of freedom that may be further exploited to reduce it.

  15. Student learning in interprofessional practice-based environments: what does theory say?

    PubMed

    Roberts, Chris; Kumar, Koshila

    2015-01-01

    Student learning in interprofessional practice-based environments has garnered significant attention in the last decade, and is reflected in a corresponding increase in published literature on the topic. We review the current empirical literature with specific attention to the theoretical frameworks that have been used to illustrate how and why student learning occurs in interprofessional practice-based environments. Our findings show there are relatively few theory-based studies available to guide educators and researchers alike. We recommend a more considered and consistent use of theory and suggest that professional identity and socio-cultural frameworks offer promising avenues for advancing understandings of student learning and professional identity development within interprofessional practice-based environments. PMID:26611786

  18. The modification of multiple articulation errors based on distinctive feature theory.

    PubMed

    Costello, J; Onstine, J M

    1976-05-01

    The effectiveness of articulation remediation procedures based on distinctive feature theory was evaluated through the administration of an articulation program designed for this purpose. Two preschool children with multiple phoneme errors which could be described by a distinctive feature analysis were the subjects. Both children substituted stop phonemes for most continuant phonemes. Each child was individually administered the distinctive feature program which is described in full. Data are presented which indicate the adequacy of the treatment program, the acquisition of correct articulation of the two directly treated target phonemes, and the concurrent improvement of five other nontreated error phonemes. Such across-phoneme generalization was predicted by distinctive feature theory. Certain modifications in the treatment program are suggested and theoretical/empirical questions regarding articulation remediation from a distinctive features viewpoint are discussed.

  20. Dynamically Incremental K-means++ Clustering Algorithm Based on Fuzzy Rough Set Theory

    NASA Astrophysics Data System (ADS)

    Li, Wei; Wang, Rujing; Jia, Xiufang; Jiang, Qing

    Because the classic K-means++ clustering algorithm applies only to static data, a dynamically incremental K-means++ clustering algorithm (DK-Means++) based on fuzzy rough set theory is presented in this paper. First, in the DK-Means++ clustering algorithm, the similarity formula is improved with weights computed from the importance degree of attributes, which are reduced on the basis of rough fuzzy set theory. Second, new data only need to be matched to a granule already clustered by the K-means++ algorithm; only occasionally is new data clustered by the classic K-means++ algorithm over the global data set. In this way, re-clustering all data each time the dynamic data set changes is avoided, so clustering efficiency is improved. Our experiments show that the DK-Means++ algorithm can deal with the clustering of dynamically incremental data objectively and efficiently.
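
    The incremental idea can be sketched as follows: a new point that falls inside an existing granule (here approximated as lying within a fixed radius of the nearest centroid) is simply assigned to it, and only otherwise is a full K-means++ re-clustering of the global data triggered. The toy class below, built on scikit-learn's k-means++ initialization, is an illustration under those assumptions; the attribute-importance weights derived from fuzzy rough set theory are omitted and the radius rule is hypothetical.

import numpy as np
from sklearn.cluster import KMeans

class IncrementalKMeans:
    def __init__(self, n_clusters, radius):
        self.n_clusters, self.radius = n_clusters, radius
        self.model, self.data = None, None

    def fit(self, X):
        self.data = np.asarray(X, dtype=float)
        self.model = KMeans(n_clusters=self.n_clusters, init="k-means++", n_init=10).fit(self.data)
        return self

    def add(self, x):
        x = np.asarray(x, dtype=float)
        self.data = np.vstack([self.data, x])
        dists = np.linalg.norm(self.model.cluster_centers_ - x, axis=1)
        if dists.min() <= self.radius:            # point matches an existing granule
            return int(dists.argmin())
        self.model = KMeans(n_clusters=self.n_clusters, init="k-means++", n_init=10).fit(self.data)
        return int(self.model.predict(x.reshape(1, -1))[0])

rng = np.random.default_rng(1)
X = rng.normal(size=(90, 2)) + rng.integers(0, 3, size=(90, 1)) * 5.0   # three rough clusters
clusterer = IncrementalKMeans(n_clusters=3, radius=1.5).fit(X)
print(clusterer.add([0.2, 0.1]))                  # assigned to the nearby granule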

  1. Characterization of degeneration process in combustion instability based on dynamical systems theory.

    PubMed

    Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru

    2015-11-01

    We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011); Chaos 22, 043128 (2012)]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics. PMID:26651761

  2. Social judgment theory based model on opinion formation, polarization and evolution

    NASA Astrophysics Data System (ADS)

    Chau, H. F.; Wong, C. Y.; Chow, F. K.; Fung, Chi-Hang Fred

    2014-12-01

    The dynamical origin of opinion polarization in the real world is an interesting topic that physical scientists may help to understand. To properly model the dynamics, the theory must be fully compatible with findings by social psychologists on microscopic opinion change. Here we introduce a generic model of opinion formation with homogeneous agents based on the well-known social judgment theory in social psychology by extending a similar model proposed by Jager and Amblard. The agents’ opinions will eventually cluster around extreme and/or moderate opinions forming three phases in a two-dimensional parameter space that describes the microscopic opinion response of the agents. The dynamics of this model can be qualitatively understood by mean-field analysis. More importantly, first-order phase transition in opinion distribution is observed by evolving the system under a slow change in the system parameters, showing that punctuated equilibria in public opinion can occur even in a fully connected social network.
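
    The microscopic rule such models build on (after Jager and Amblard) can be written down compactly: opinions within the latitude of acceptance attract, opinions beyond the latitude of rejection repel, and differences in between leave both agents unchanged. The sketch below uses illustrative parameter values and a fully mixed population, not the paper's exact specification.

import numpy as np

def step(opinions, rng, u=0.2, t=1.0, mu=0.1):
    i, j = rng.choice(len(opinions), size=2, replace=False)
    diff = opinions[j] - opinions[i]
    if abs(diff) < u:        # within the latitude of acceptance: assimilate
        opinions[i] += mu * diff
        opinions[j] -= mu * diff
    elif abs(diff) > t:      # beyond the latitude of rejection: repel
        opinions[i] -= mu * diff
        opinions[j] += mu * diff
    np.clip(opinions, -1.0, 1.0, out=opinions)

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
for _ in range(50000):
    step(x, rng)
print(np.histogram(x, bins=5, range=(-1, 1))[0])  # opinions pile up in a few clusters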

  3. Formula for the rms blur circle radius of Wolter telescope based on aberration theory

    NASA Technical Reports Server (NTRS)

    Shealy, David L.; Saha, Timo T.

    1990-01-01

    A formula for the rms blur circle for Wolter telescopes has been derived using the transverse ray aberration expressions of Saha (1985), Saha (1984), and Saha (1986). The resulting formula for the rms blur circle radius over an image plane and a formula for the surface of best focus based on third-, fifth-, and seventh-order aberration theory predict results in good agreement with exact ray tracing. It has also been shown that one of the two terms in the empirical formula of VanSpeybroeck and Chase (1972) for the rms blur circle radius of a Wolter I telescope can be justified by the aberration theory results. Numerical results are given comparing the rms blur radius and the surface of best focus vs the half-field angle computed by skew ray tracing and from analytical formulas for grazing incidence Wolter I-II telescopes and a normal incidence Cassegrain telescope.

  4. A 3-D elasticity theory based model for acoustic radiation from multilayered anisotropic plates.

    PubMed

    Shen, C; Xin, F X; Lu, T J

    2014-05-01

    A theoretical model built upon three-dimensional elasticity theory is developed to investigate the acoustic radiation from multilayered anisotropic plates subjected to a harmonic point force excitation. Fourier transform technique and stationary phase method are combined to predict the far-field radiated sound pressure of one-side water immersed plate. Compared to equivalent single-layer plate models, the present model based on elasticity theory can differentiate radiated sound pressure between dry-side and wet-side excited cases, as well as discrepancies induced by different layer sequences for multilayered anisotropic plates. These results highlight the superiority of the present theoretical model especially for handling multilayered anisotropic structures. PMID:24815294

  5. Control theory based airfoil design for potential flow and a finite volume discretization

    NASA Technical Reports Server (NTRS)

    Reuther, J.; Jameson, A.

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.

  6. Slender-Body Theory Based On Approximate Solution of the Transonic Flow Equation

    NASA Technical Reports Server (NTRS)

    Spreiter, John R.; Alksne, Alberta Y.

    1959-01-01

    Approximate solutions of the nonlinear equations of the small disturbance theory of transonic flow are found for the pressure distribution on pointed slender bodies of revolution for flows with free-stream Mach number 1, and for flows that are either purely subsonic or purely supersonic. These results are obtained by application of a method based on local linearization that was introduced recently in the analysis of similar problems in two-dimensional flows. The theory is developed for bodies of arbitrary shape, and specific results are given for cone-cylinders and for parabolic-arc bodies at zero angle of attack. All results are compared either with existing theoretical results or with experimental data.

  7. Theory-based approaches to understanding public emergency preparedness: implications for effective health and risk communication.

    PubMed

    Paek, Hye-Jin; Hilyard, Karen; Freimuth, Vicki; Barge, J Kevin; Mindlin, Michele

    2010-06-01

    Recent natural and human-caused disasters have awakened public health officials to the importance of emergency preparedness. Guided by health behavior and media effects theories, the analysis of a statewide survey in Georgia reveals that self-efficacy, subjective norm, and emergency news exposure are positively associated with the respondents' possession of emergency items and their stages of emergency preparedness. Practical implications suggest less focus on demographics as the sole predictor of emergency preparedness and more comprehensive measures of preparedness, including both a person's cognitive stage of preparedness and checklists of emergency items on hand. We highlight the utility of theory-based approaches for understanding and predicting public emergency preparedness as a way to enable more effective health and risk communication. PMID:20574880

  8. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. Objectives of this work include (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures at two thoroughly-monitored agricultural watersheds located in different hydroclimatic zones with similar land use, and (b) to elucidate and quantify temporal and spatial scale effects on those measures. We selected two USDA experimental watersheds to serve as case study examples, including the Little River experimental watershed (LREW) in Tifton, Georgia and the Sleepers River experimental watershed (SREW) in North Danville, Vermont. Both watersheds possess several nested sub-watersheds and more than 30 years of continuous data records of precipitation and streamflow. Information content measures (metric entropy and mean information gain) and complexity measures (effective measure complexity and fluctuation complexity) were computed based on the binary encoding of 5-year streamflow and precipitation time series data. We quantified patterns of streamflow using probabilities of joint or sequential appearances of the binary symbol sequences. Results of our analysis illustrate that information content measures of streamflow time series are much smaller than those for precipitation data, and the streamflow data also exhibit higher complexity, suggesting that the watersheds effectively act as filters of the precipitation information that leads to the observed additional complexity in streamflow measures. Correlation coefficients between the information-theory-based measures and time intervals are close to 0.9, demonstrating the significance of temporal scale effects on streamflow patterns. Moderate spatial scale effects on streamflow patterns are observed with absolute values of correlation coefficients between the measures and sub-watershed area
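
    The symbol-sequence measures named above can be illustrated with a small script: a series is binary-encoded about its median, the Shannon entropy of length-L words gives a (normalized) metric entropy, and the mean information gain is taken here as the conditional entropy H(L) - H(L-1). These are common textbook definitions offered as an illustration, not necessarily the exact estimators used in the study.

import numpy as np
from collections import Counter

def block_entropy(symbols, L):
    words = ["".join(map(str, symbols[i:i + L])) for i in range(len(symbols) - L + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def symbol_measures(series, L=5):
    symbols = (np.asarray(series) > np.median(series)).astype(int)   # binary encoding
    H_L, H_Lm1 = block_entropy(symbols, L), block_entropy(symbols, L - 1)
    return {"metric entropy": H_L / L, "mean information gain": H_L - H_Lm1}

rng = np.random.default_rng(0)
print(symbol_measures(rng.normal(size=5000)))           # noisy series: both close to 1 bit
print(symbol_measures(np.sin(np.arange(5000) / 20.0)))  # smooth series: much lower values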

  9. An Implicational View of Self-Healing and Personality Change Based on Gendlin's Theory of Experiencing.

    ERIC Educational Resources Information Center

    Bohart, Arthur C.

    There is relatively little theory on how psychotherapy clients self-heal since most theories of therapy stress the magic of the therapist's interventions. Of the theories that exist, this paper briefly discusses Carl Rogers' theory of self-actualization; and the dialectical theories of Greenberg and his colleagues, Jenkins, and Rychlak. Gendlin's…

  10. How Does an Activity Theory Model Help to Know Better about Teaching with Electronic-Exercise-Bases?

    ERIC Educational Resources Information Center

    Abboud-Blanchard, Maha; Cazes, Claire

    2012-01-01

    The research presented in this paper relies on Activity Theory and particularly on Engestrom's model, to better understand the use of Electronic-Exercise-Bases (EEB) by mathematics teachers. This theory provides a holistic approach to illustrate the complexity of the EEB integration. The results highlight reasons and ways of using EEB and show…

  11. The Conceptual Mechanism for Viable Organizational Learning Based on Complex System Theory and the Viable System Model

    ERIC Educational Resources Information Center

    Sung, Dia; You, Yeongmahn; Song, Ji Hoon

    2008-01-01

    The purpose of this research is to explore the possibility of viable learning organizations based on identifying viable organizational learning mechanisms. Two theoretical foundations, complex system theory and viable system theory, have been integrated to provide the rationale for building the sustainable organizational learning mechanism. The…

  12. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    SciTech Connect

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
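
    The strategy can be illustrated with a toy problem: each focal element of the uncertain input is an interval carrying a basic probability assignment, the model's range over each interval is bounded by simple random sampling, and belief and plausibility of an output event are accumulated from those bounds. The model, focal elements, threshold, and sample size below are hypothetical stand-ins for the computationally intensive analyses discussed above.

import numpy as np

def model(x):
    return x ** 2 + 0.5 * x              # stand-in for an expensive simulation

def belief_plausibility(focal_elements, threshold, n_samples=2000, seed=0):
    """Belief/plausibility of the event {model(x) > threshold}."""
    rng = np.random.default_rng(seed)
    bel = pl = 0.0
    for (lo, hi), mass in focal_elements:
        y = model(rng.uniform(lo, hi, n_samples))   # sampled image of the focal element
        if y.min() > threshold:                     # image entirely inside the event
            bel += mass
        if y.max() > threshold:                     # image intersects the event
            pl += mass
    return bel, pl

# Input uncertainty stated as three overlapping intervals with BPA masses summing to 1.
focal = [((0.0, 1.0), 0.3), ((0.5, 2.0), 0.5), ((1.5, 3.0), 0.2)]
print(belief_plausibility(focal, threshold=2.0))    # (belief, plausibility)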

  13. Motivational Measure of the Instruction Compared: Instruction Based on the ARCS Motivation Theory vs Traditional Instruction in Blended Courses

    ERIC Educational Resources Information Center

    Colakoglu, Ozgur M.; Akdemir, Omur

    2012-01-01

    The ARCS Motivation Theory was proposed to guide instructional designers and teachers who develop their own instruction to integrate motivational design strategies into the instruction. There is a lack of literature supporting the idea that instruction for blended courses if designed based on the ARCS Motivation Theory provides different…

  14. An Alienation-Based Framework for Student Experience in Higher Education: New Interpretations of Past Observations in Student Learning Theory

    ERIC Educational Resources Information Center

    Barnhardt, Bradford; Ginns, Paul

    2014-01-01

    This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…

  15. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

    ERIC Educational Resources Information Center

    Fukuhara, Hirotaka; Kamata, Akihito

    2011-01-01

    A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

  16. Evaluating Art Studio Courses at Sultan Qaboos University in Light of the Discipline Based Art Education Theory

    ERIC Educational Resources Information Center

    Al-Amri, Mohammed

    2010-01-01

    Discipline-Based Art Education (DBAE), a theory developed in the USA, has been influential but also used in Art Education institutions world-wide. One of its stated goals was to develop the quality of teaching art education. Today, it is used as a theory for identifying and assessing good practices in the field of Art Education. The purpose of…

  17. A simple laminate theory using the orthotropic viscoplasticity theory based on overstress. I - In-plane stress-strain relationships for metal matrix composites

    NASA Technical Reports Server (NTRS)

    Krempl, Erhard; Hong, Bor Zen

    1989-01-01

    A macromechanics analysis is presented for the in-plane, anisotropic time-dependent behavior of metal matrix laminates. The small deformation, orthotropic viscoplasticity theory based on overstress represents lamina behavior in a modified simple laminate theory. Material functions and constants can be identified in principle from experiments with laminae. Orthotropic invariants can be repositories for tension-compression asymmetry and for linear elasticity in one direction while the other directions behave in a viscoplastic manner. Computer programs are generated and tested for either unidirectional or symmetric laminates under in-plane loading. Correlations with the experimental results on metal matrix composites are presented.

  18. Symmetry-based theory for mean velocities in the flat plate turbulent boundary layer

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Hussain, Fazle; She, Zhen-Su

    2014-11-01

    A major difference from channel and pipe flow in the zero-pressure-gradient turbulent boundary layer (ZPG-TBL) is the streamwise development of the mean velocity components. We report a symmetry-based theory for the ZPG-TBL, which yields a complete prediction for both the streamwise and vertical mean velocities, i.e. U(x, y) and V(x, y). A significant result is the identification of a bulk flow constant κb, which achieves a highly accurate description of U above y+ ~ 150; for a set of DNS data (Schlatter et al. 2010), the relative error is bounded within 0.1%. It is found that κb has a non-trivial streamwise development and asymptotes to 0.45 for large Re; the latter is consistent with the true Karman constant recently discovered for channel and pipe flows. The theory assumes a fractional scaling for the total stress, which yields, for the first time, an analytical prediction for V, the Reynolds stress profile, the friction coefficient and the shape factor in the ZPG-TBL, in good agreement with both DNS and experimental data. In conclusion, a complete analytical theory is viable for both laminar (i.e. Blasius) and turbulent boundary layers.

  20. Toward a limited realism for psychiatric nosology based on the coherence theory of truth.

    PubMed

    Kendler, K S

    2015-04-01

    A fundamental debate in the philosophy of science is whether our central concepts are true or only useful instruments to help predict and manipulate the world. The first position is termed 'realism' and the second 'instrumentalism'. Strong support for the instrumentalist position comes from the 'pessimistic induction' (PI) argument. Given that many key scientific concepts once considered true (e.g., humors, ether, epicycles, phlogiston) are now considered false, how, the argument goes, can we assert that our current concepts are true? The PI argument applies strongly to psychiatric diagnoses. Given our long history of abandoned diagnoses, arguments that we have finally 'gotten it right' and developed definitive psychiatric categories that correspond to observer-independent reality are difficult to defend. For our current diagnostic categories, we should settle for a less ambitious vision of truth. For this, the coherence theory, which postulates that something is true when it fits well with the other things we confidently know about the world, can serve us well. Using the coherence theory, a diagnosis is real to the extent that it is well integrated into our accumulating scientific data base. Furthermore, the coherence theory establishes a framework for us to evaluate our diagnostic categories and can provide a set of criteria, closely related to our concept of validators, for deciding when they are getting better. Finally, we need be much less skeptical about the truth status of the aggregate concept of psychiatric illness than we are regarding the specific categories in our current nosology. PMID:25181016

  1. Removing barriers to rehabilitation: Theory-based family intervention in community settings after brain injury.

    PubMed

    Stejskal, Taryn M

    2012-01-01

    Rehabilitation professionals have become increasingly aware that family members play a critical role in the recovery process of individuals after brain injury. In addition, researchers have begun to identify a relationship between family member caregivers' well-being and survivors' outcomes. The idea of a continuum of care or following survivors from inpatient care to community reintegration has become an important model of treatment across many hospital and community-based settings. In concert with the continuum of care, present research literature indicates that family intervention may be a key component to successful rehabilitation after brain injury. Yet, clinicians interacting with family members and survivors often feel confounded about how exactly to intervene with the broader family system beyond the individual survivor. Drawing on the systemic nature of the field of marriage and family therapy (MFT), this article provides information to assist clinicians in effectively intervening with families using theory-based interventions in community settings. First, a rationale for the utilization of systems-based, as opposed to individual-based, therapies will be uncovered. Second, historically relevant publications focusing on family psychotherapy and intervention after brain injury are reviewed and their implications discussed. Recommendations for the utilization of systemic theory-based principles and strategies, specifically cognitive behavioral therapy (CBT), narrative therapy (NT), and solution-focused therapy (SFT) will be examined. Descriptions of common challenges families and couples face will be presented along with case examples to illustrate how these theoretical frameworks might be applied to these special concerns postinjury. Finally, the article concludes with an overview of the ideas presented in this manuscript to assist practitioners and systems of care in community-based settings to more effectively intervene with the family system as a whole

  2. General theory based on fluctuational electrodynamics for van der Waals interactions in colloidal systems

    SciTech Connect

    Yannopapas, Vassilios

    2007-12-15

    A rigorous theory for the determination of the van der Waals interactions in colloidal systems is presented. The method is based on fluctuational electrodynamics and a multiple-scattering method which provides the electromagnetic Green's tensor. In particular, expressions for the Green's tensor are presented for arbitrary, finite collections of colloidal particles, for infinitely periodic or defected crystals, as well as for finite slabs of crystals. The presented formalism allows for ab initio calculations of the van der Waals interactions in colloidal systems since it takes fully into account retardation, many-body, multipolar, and near-field effects.

  3. The Study of Relationship and Strategy Between New Energy and Economic Development Based on Decoupling Theory

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Xu, Hui; Liu, Yaping; Xu, Yang

    With the increasing pressure for energy conservation and emissions reduction, the new energy revolution in China is imminent. The implementation of electric energy substitution and cleaner alternatives is an important way to resolve the contradiction among economic growth, energy saving and emission reduction. Based on decoupling theory, this article demonstrates that China is in the second stage of decoupling, in which energy consumption and GDP increase together while energy consumption intensity declines. At the same time, the new energy revolution needs to be realized by increasing carbon productivity and the proportion of new energy.

  4. Design optical antenna and fiber coupling system based on the vector theory of reflection and refraction.

    PubMed

    Jiang, Ping; Yang, Huajun; Mao, Shengqian

    2015-10-01

    A Cassegrain antenna system and an optical fiber coupling system which consists of a plano-concave lens and a plano-convex lens are designed based on the vector theory of reflection and refraction, so as to improve the transmission performance of the optical antenna and fiber coupling system. Three-dimensional ray tracing simulation are performed and results of the optical aberrations calculation and the experimental test show that the aberrations caused by on-axial defocusing, off-axial defocusing and deflection of receiving antenna can be well corrected by the optical fiber coupling system. PMID:26480125
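
    The geometric core of such a ray-trace design is the vector form of the laws of reflection and refraction: for a unit incident direction d and unit surface normal n, the reflected ray is d - 2(d.n)n, and Snell's law gives the refracted ray for an index ratio eta = n1/n2. The snippet below is generic textbook geometry, not the authors' antenna-specific code.

import numpy as np

def reflect(d, n):
    d, n = np.asarray(d, float), np.asarray(n, float)
    return d - 2.0 * np.dot(d, n) * n

def refract(d, n, eta):
    d, n = np.asarray(d, float), np.asarray(n, float)
    cos_i = -np.dot(d, n)
    k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
    if k < 0.0:
        return None                       # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

d = np.array([0.0, -np.sqrt(0.5), -np.sqrt(0.5)])    # 45-degree incident ray
n = np.array([0.0, 0.0, 1.0])                        # surface normal
print(reflect(d, n), refract(d, n, eta=1.0 / 1.5))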

  5. Constraints on Neutron Star Radii Based on Chiral Effective Field Theory Interactions

    SciTech Connect

    Hebeler, K.; Lattimer, J. M.; Pethick, C. J.; Schwenk, A.

    2010-10-15

    We show that microscopic calculations based on chiral effective field theory interactions constrain the properties of neutron-rich matter below nuclear densities to a much higher degree than is reflected in commonly used equations of state. Combined with observed neutron star masses, our results lead to a radius R = 9.7-13.9 km for a 1.4 M☉ star, where the theoretical range is due, in about equal amounts, to uncertainties in many-body forces and to the extrapolation to high densities.

  7. A description of the mechanical behavior of composite solid propellants based on molecular theory

    NASA Technical Reports Server (NTRS)

    Landel, R. F.

    1976-01-01

    Both the investigation and the representation of the stress-strain response (including rupture) of gum and filled elastomers can be based on a simple functional statement. Internally consistent experiments are used to sort out the effects of time, temperature, strain and crosslink density on gum rubbers. All effects are readily correlated and shown to be essentially independent of the elastomer when considered in terms of non-dimensionalized stress, strain and time. A semiquantitative molecular theory is developed to explain this result. The introduction of fillers modifies the response, but, guided by the framework thus provided, their effects can be readily accounted for.

  8. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159
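
    The entropy weight part of such an evaluation is easy to make concrete: indicator columns are normalized, each column's Shannon entropy is computed, and weights are set proportional to one minus that entropy, so that more dispersed indicators receive more weight. The indicator matrix below is made up purely for illustration.

import numpy as np

def entropy_weights(X):
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                           # column-wise normalization
    k = 1.0 / np.log(X.shape[0])
    e = -k * (P * np.where(P > 0, np.log(P), 0.0)).sum(axis=0)   # entropy per indicator
    d = 1.0 - e                                     # degree of divergence
    return d / d.sum()

# Rows = inspected bogie samples, columns = hypothetical risk indicators
# (e.g., crack count, wear depth in mm, vibration level).
X = [[0.2, 30.0, 1.1],
     [0.8, 28.0, 1.0],
     [0.4, 55.0, 3.2],
     [0.3, 31.0, 1.2]]
print(entropy_weights(X).round(3))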

  9. An information theory based search for homogeneity on the largest accessible scale

    NASA Astrophysics Data System (ADS)

    Sarkar, Suman; Pandey, Biswajit

    2016-11-01

    We analyze the SDSS DR12 quasar catalogue to test the large-scale smoothness in the quasar distribution. We quantify the degree of inhomogeneity in the quasar distribution using information theory based measures and find that the degree of inhomogeneity diminishes with increasing length scales which finally reach a plateau at ~250 h^{-1} Mpc. The residual inhomogeneity at the plateau is consistent with that expected for a Poisson point process. Our results indicate that the quasar distribution is homogeneous beyond length scales of 250 h^{-1} Mpc.

  10. Data collection method for mobile sensor networks based on the theory of thermal fields.

    PubMed

    Macuha, Martin; Tariq, Muhammad; Sato, Takuro

    2011-01-01

    Many sensor applications are aimed at mobile objects, where conventional routing approaches to data delivery might fail. Such applications include habitat monitoring, human probes and vehicular sensing systems. This paper targets such applications and proposes a lightweight proactive distributed data collection scheme for Mobile Sensor Networks (MSN) based on the theory of thermal fields. By proper mapping, we create a distribution function that allows the characteristics of a sensor node to be taken into account. We show the functionality of our proposed forwarding method when adapted to the energy of a sensor node. We also propose an enhancement in order to maximize the lifetime of the sensor nodes. We thoroughly evaluate the proposed solution and discuss the tradeoffs. PMID:22164011

  11. Unique laminar-flow stability limit based on shallow-water theory

    USGS Publications Warehouse

    Chen, Cheng-lung

    1993-01-01

    Two approaches are generally taken in deriving the stability limit of the Froude number (Fs) for laminar sheet flow. The first approach uses the Orr-Sommerfeld equation, while the second uses the cross-section-averaged equations of continuity and motion. Because both approaches are based on shallow-water theory, the values of Fs obtained from both approaches should be identical, yet in the literature they are not. This suggests that a defect exists in at least one of the two approaches. After examining the governing equations used in both approaches, one finds that the existing cross-section-averaged equation of motion is dependent on the frame of reference.

  12. Superior coexistence: systematically regulating land subsidence based on set pair theory

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Gong, S.-L.

    2015-11-01

    Anthropogenic land subsidence is an environmental side effect of exploring and using natural resources in the process of economic development. The key points of the system for controlling land subsidence include cooperation and superior coexistence while the economy develops, exploring and using natural resources, and geological environmental safety. Using the theory and method of set pair analysis (SPA), this article anatomises the factors, effects, and transformation of land subsidence. Based on the principle of superior coexistence, this paper promotes a technical approach to the system for controlling land subsidence, in order to improve the prevention and control of geological hazards.

  13. Systematic errors analysis for a large dynamic range aberrometer based on aberration theory.

    PubMed

    Wu, Peng; Liu, Sheng; DeHoog, Edward; Schwiegerling, Jim

    2009-11-10

    In Ref. 1, it was demonstrated that the significant systematic errors of a type of large dynamic range aberrometer are strongly related to the power error (defocus) in the input wavefront. In this paper, a generalized theoretical analysis based on vector aberration theory is presented, and local shift errors of the SH spot pattern as a function of the lenslet position and the local wavefront tilt over the corresponding lenslet are derived. Three special cases, a spherical wavefront, a crossed cylindrical wavefront, and a cylindrical wavefront, are analyzed and the possibly affected Zernike terms in the wavefront reconstruction are investigated. The simulation and experimental results are illustrated to verify the theoretical predictions.

  14. Wavefront sensing based on phase contrast theory and coherent optical processing

    NASA Astrophysics Data System (ADS)

    Lei, Huang; Qi, Bian; Chenlu, Zhou; Tenghao, Li; Mali, Gong

    2016-07-01

    A novel wavefront sensing method based on phase contrast theory and coherent optical processing is proposed. The wavefront gradient field in the object plane is modulated into intensity distribution in a gang of patterns, making high-density detection available. By applying the method, we have also designed a wavefront sensor. It consists of a classical coherent optical processing system, a CCD detector array, two pieces of orthogonal composite sinusoidal gratings, and a mechanical structure that can perform real-time linear positioning. The simulation results prove and demonstrate the validity of the method and the sensor in high-precision measurement of the wavefront gradient field.

  15. An ISAR imaging algorithm for the space satellite based on empirical mode decomposition theory

    NASA Astrophysics Data System (ADS)

    Zhao, Tao; Dong, Chun-zhu

    2014-11-01

    Currently, high resolution imaging of space satellites is a popular topic in the field of radar technology. In contrast with regular targets, a satellite target moves along its trajectory while its solar panel substrate changes orientation toward the sun to obtain energy. Aiming at this imaging problem, a signal separation and imaging approach based on empirical mode decomposition (EMD) theory is proposed; the approach separates the signals of the two parts of the satellite target, the main body and the solar panel substrate, and forms an image of the target. Simulation experiments demonstrate the validity of the proposed method.
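
    The separation idea can be sketched with a synthetic signal: a slowly varying component stands in for the main-body return and a faster oscillation for the rotating solar-panel return, and empirical mode decomposition splits them by frequency. The sketch assumes the third-party PyEMD package (pip install EMD-signal) and is only an illustration of the decomposition step, not the paper's full imaging chain.

import numpy as np
from PyEMD import EMD   # assumes the PyEMD (EMD-signal) package is installed

t = np.linspace(0.0, 1.0, 2000)
body = np.cos(2 * np.pi * 3 * t)             # slow "main body" component
panel = 0.5 * np.cos(2 * np.pi * 60 * t)     # fast "solar panel" component
signal = body + panel

imfs = EMD().emd(signal, t)                  # intrinsic mode functions, fast to slow
fast = imfs[0]                               # highest-frequency IMF ~ panel return
slow = imfs[1:].sum(axis=0)                  # remaining IMFs + residue ~ body return
print(np.corrcoef(fast, panel)[0, 1], np.corrcoef(slow, body)[0, 1])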

  16. Best practices: implementation of a glucose screening program based on diffusion of innovation theory methods.

    PubMed

    Nicol, Ginger E; Morrato, Elaine H; Johnson, Mark C; Campagna, Elizabeth; Yingling, Michael D; Pham, Victor; Newcomer, John W

    2011-01-01

    There is public health interest in the identification and treatment of modifiable cardiometabolic risk factors among patients treated with antipsychotic medications. However, best-practice screening recommendations endorsed by multiple medical organizations have not translated into real-world clinical practice. Quality improvement strategies may help to address the gap between policy and implementation. This column describes the successful implementation of a best-practice glucose screening program in a large network of community mental health centers that was based on Six Sigma and diffusion of innovation theory.

  17. Evaluation of a preschool nutrition education program based on the theory of multiple intelligences.

    PubMed

    Cason, K L

    2001-01-01

    This report describes the evaluation of a preschool nutrition education program based on the theory of multiple intelligences. Forty-six nutrition educators provided a series of 12 lessons to 6102 preschool-age children. The program was evaluated using a pretest/post-test design to assess differences in fruit and vegetable identification, healthy snack choices, willingness to taste foods, and eating behaviors. Subjects showed significant improvement in food identification and recognition, healthy snack identification, willingness to taste foods, and frequency of fruit, vegetable, meat, and dairy consumption. The evaluation indicates that the program was an effective approach for educating preschool children about nutrition. PMID:11953232

  18. A Monte Carlo exploration of threefold base geometries for 4d F-theory vacua

    NASA Astrophysics Data System (ADS)

    Taylor, Washington; Wang, Yi-Nan

    2016-01-01

    We use Monte Carlo methods to explore the set of toric threefold bases that support elliptic Calabi-Yau fourfolds for F-theory compactifications to four dimensions, and study the distribution of geometrically non-Higgsable gauge groups, matter, and quiver structure. We estimate the number of distinct threefold bases in the connected set studied to be ~10^48. The distribution of bases peaks around h^{1,1} ~ 82. All bases encountered after "thermalization" have some geometric non-Higgsable structure. We find that the number of non-Higgsable gauge group factors grows roughly linearly in h^{1,1} of the threefold base. Typical bases have ~6 isolated gauge factors as well as several larger connected clusters of gauge factors with jointly charged matter. Approximately 76% of the bases sampled contain connected two-factor gauge group products of the form SU(3) × SU(2), which may act as the non-Abelian part of the standard model gauge group. SU(3) × SU(2) is the third most common connected two-factor product group, following SU(2) × SU(2) and G_2 × SU(2), which arise more frequently.

  19. A Monte Carlo exploration of threefold base geometries for 4d F-theory vacua

    DOE PAGES

    Taylor, Washington; Wang, Yi-Nan

    2016-01-22

    Here, we use Monte Carlo methods to explore the set of toric threefold bases that support elliptic Calabi-Yau fourfolds for F-theory compactifications to four dimensions, and study the distribution of geometrically non-Higgsable gauge groups, matter, and quiver structure. We estimate the number of distinct threefold bases in the connected set studied to be ~10^48. Moreover, the distribution of bases peaks around h^{1,1} ~ 82. All bases encountered after "thermalization" have some geometric non-Higgsable structure. We also find that the number of non-Higgsable gauge group factors grows roughly linearly in h^{1,1} of the threefold base. Typical bases have ~6 isolated gauge factors as well as several larger connected clusters of gauge factors with jointly charged matter. Approximately 76% of the bases sampled contain connected two-factor gauge group products of the form SU(3) x SU(2), which may act as the non-Abelian part of the standard model gauge group. SU(3) x SU(2) is the third most common connected two-factor product group, following SU(2) x SU(2) and G2 x SU(2), which arise more frequently.

  20. Quorum-Sensing Synchronization of Synthetic Toggle Switches: A Design Based on Monotone Dynamical Systems Theory.

    PubMed

    Nikolaev, Evgeni V; Sontag, Eduardo D

    2016-04-01

    Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of "on-off" switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular ("intrinsic") or environmental ("extrinsic") noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a "majority-vote" correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such a design that has mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of "monotone" dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable ("chaotic") behaviors and even sustained oscillations in the limit of convergence. These results are valid no matter what are the values of parameters, and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation analysis, performed for a
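
    The building block underneath such designs is the bistable genetic toggle switch; a minimal Gardner-Collins-style model of two mutually repressing genes, integrated with SciPy, is sketched below. The quorum-sensing coupling across a cell population and the monotone-systems analysis described above are beyond this illustration, and the parameter values are made up.

import numpy as np
from scipy.integrate import solve_ivp

def toggle(t, z, alpha=10.0, n=2.0):
    x, y = z
    dx = alpha / (1.0 + y ** n) - x   # x is repressed by y
    dy = alpha / (1.0 + x ** n) - y   # y is repressed by x
    return [dx, dy]

# Two different initial conditions settle into the two opposite stable states.
for z0 in ([5.0, 0.1], [0.1, 5.0]):
    sol = solve_ivp(toggle, (0.0, 50.0), z0, rtol=1e-8)
    print(z0, "->", np.round(sol.y[:, -1], 2))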

  1. Quorum-Sensing Synchronization of Synthetic Toggle Switches: A Design Based on Monotone Dynamical Systems Theory

    PubMed Central

    Nikolaev, Evgeni V.

    2016-01-01

    Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of “on-off” switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular (“intrinsic”) or environmental (“extrinsic”) noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a “majority-vote” correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such a design that has mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of “monotone” dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable (“chaotic”) behaviors and even sustained oscillations in the limit of convergence. These results are valid no matter what are the values of parameters, and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation

  3. A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study

    PubMed Central

    Voth, Elizabeth C; Oelke, Nelly D

    2016-01-01

    Background Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η2=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. Conclusions The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps

  4. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign

    PubMed Central

    Taguri, Masataka; Ishikawa, Yoshiki

    2016-01-01

    Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated a population attributable fraction of 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief, “I could quit smoking if my husband or significant other recommended it” suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
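
    As a toy illustration of the quantity being reported (not the study's marginal-structural-model estimate), the snippet below computes a population attributable fraction from an assumed exposure prevalence and risk ratio using the standard Levin-type formula; both numbers are invented.

```python
def population_attributable_fraction(prevalence: float, risk_ratio: float) -> float:
    """Levin-type PAF for a single binary exposure (hypothetical inputs)."""
    excess = prevalence * (risk_ratio - 1.0)
    return excess / (1.0 + excess)

# e.g. an exposure held by 40% of the population that raises the risk of the
# outcome 1.5-fold accounts for roughly 17% of cases.
print(round(population_attributable_fraction(0.40, 1.5), 3))
```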

  5. A general theory of evolution based on energy efficiency: its implications for diseases.

    PubMed

    Yun, Anthony J; Lee, Patrick Y; Doux, John D; Conley, Buford R

    2006-01-01

    We propose a general theory of evolution based on energy efficiency. Life represents an emergent property of energy. The earth receives energy from cosmic sources such as the sun. Biologic life can be characterized by the conversion of available energy into complex systems. Direct energy converters such as photosynthetic microorganisms and plants transform light energy into high-energy phosphate bonds that fuel biochemical work. Indirect converters such as herbivores and carnivores predominantly feed off the food chain supplied by these direct converters. Improving energy efficiency confers competitive advantage in the contest among organisms for energy. We introduce a term, return on energy (ROE), as a measure of energy efficiency. We define ROE as a ratio of the amount of energy acquired by a system to the amount of energy consumed to generate that gain. Life-death cycling represents a tactic to sample the environment for innovations that allow increases in ROE to develop over generations rather than an individual lifespan. However, the variation-selection strategem of Darwinian evolution may define a particular tactic rather than an overarching biological paradigm. A theory of evolution based on competition for energy and driven by improvements in ROE both encompasses prior notions of evolution and portends post-Darwinian mechanisms. Such processes may involve the exchange of non-genetic traits that improve ROE, as exemplified by cognitive adaptations or memes. Under these circumstances, indefinite persistence may become favored over life-death cycling, as increases in ROE may then occur more efficiently within a single lifespan rather than over multiple generations. The key to this transition may involve novel methods to address the promotion of health and cognitive plasticity. We describe the implications of this theory for human diseases. PMID:16122878

  6. Theory of chemical kinetics and charge transfer based on nonequilibrium thermodynamics.

    PubMed

    Bazant, Martin Z

    2013-05-21

    Advances in the fields of catalysis and electrochemical energy conversion often involve nanoparticles, which can have kinetics surprisingly different from the bulk material. Classical theories of chemical kinetics assume independent reactions in dilute solutions, whose rates are determined by mean concentrations. In condensed matter, strong interactions alter chemical activities and create variations that can dramatically affect the reaction rate. The extreme case is that of a reaction coupled to a phase transformation, whose kinetics must depend not only on the order parameter but also on its gradients at phase boundaries. Reaction-driven phase transformations are common in electrochemistry, when charge transfer is accompanied by ion intercalation or deposition in a solid phase. Examples abound in Li-ion, metal-air, and lead-acid batteries, as well as metal electrodeposition-dissolution. Despite complex thermodynamics, however, the standard kinetic model is the Butler-Volmer equation, based on a dilute solution approximation. The Marcus theory of charge transfer likewise considers isolated reactants and neglects elastic stress, configurational entropy, and other nonidealities in condensed phases. The limitations of existing theories recently became apparent for the Li-ion battery material LixFePO4 (LFP). It has a strong tendency to separate into Li-rich and Li-poor solid phases, which scientists believe limits its performance. Chemists first modeled phase separation in LFP as an isotropic "shrinking core" within each particle, but experiments later revealed striped phase boundaries on the active crystal facet. This raised the question: What is the reaction rate at a surface undergoing a phase transformation? Meanwhile, dramatic rate enhancement was attained with LFP nanoparticles, and classical battery models could not predict the roles of phase separation and surface modification. In this Account, I present a general theory of chemical kinetics, developed over
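
    The abstract names the Butler-Volmer equation as the standard kinetic model; for readers unfamiliar with it, the sketch below evaluates the textbook dilute-solution form (this is the baseline that the Account generalizes, not the author's new theory), with an assumed exchange current density and symmetric transfer coefficients.

```python
import math

def butler_volmer(eta, i0=1e-3, alpha_a=0.5, alpha_c=0.5, T=298.15):
    """Textbook Butler-Volmer current density (A/cm^2) at overpotential eta (V)."""
    F, R = 96485.0, 8.314           # Faraday and gas constants
    f = F / (R * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))

print(butler_volmer(0.05))          # small anodic overpotential -> small positive current
```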

  7. Theory of chemical kinetics and charge transfer based on nonequilibrium thermodynamics.

    PubMed

    Bazant, Martin Z

    2013-05-21

    Advances in the fields of catalysis and electrochemical energy conversion often involve nanoparticles, which can have kinetics surprisingly different from the bulk material. Classical theories of chemical kinetics assume independent reactions in dilute solutions, whose rates are determined by mean concentrations. In condensed matter, strong interactions alter chemical activities and create variations that can dramatically affect the reaction rate. The extreme case is that of a reaction coupled to a phase transformation, whose kinetics must depend not only on the order parameter but also on its gradients at phase boundaries. Reaction-driven phase transformations are common in electrochemistry, when charge transfer is accompanied by ion intercalation or deposition in a solid phase. Examples abound in Li-ion, metal-air, and lead-acid batteries, as well as metal electrodeposition-dissolution. Despite complex thermodynamics, however, the standard kinetic model is the Butler-Volmer equation, based on a dilute solution approximation. The Marcus theory of charge transfer likewise considers isolated reactants and neglects elastic stress, configurational entropy, and other nonidealities in condensed phases. The limitations of existing theories recently became apparent for the Li-ion battery material LixFePO4 (LFP). It has a strong tendency to separate into Li-rich and Li-poor solid phases, which scientists believe limits its performance. Chemists first modeled phase separation in LFP as an isotropic "shrinking core" within each particle, but experiments later revealed striped phase boundaries on the active crystal facet. This raised the question: What is the reaction rate at a surface undergoing a phase transformation? Meanwhile, dramatic rate enhancement was attained with LFP nanoparticles, and classical battery models could not predict the roles of phase separation and surface modification. In this Account, I present a general theory of chemical kinetics, developed over

  8. A theory-based evaluation of a community-based funding scheme in a disadvantaged suburban city area.

    PubMed

    Hickey, Gráinne; McGilloway, Sinead; O'Brien, Morgan; Leckey, Yvonne; Devlin, Maurice

    2015-10-01

    Community-driven development (CDD) initiatives frequently involve funding schemes which are aimed at channelling financial investment into local need and fostering community participation and engagement. This exploratory study examined, through a program theory approach, the design and implementation of a small-scale, community-based fund in Ireland. Observations, documentary analysis, interviews and group discussions with 19 participants were utilized to develop a detailed understanding of the program mechanisms, activities and processes, as well as the experiences of key stakeholders engaged with the funding scheme and its implementation. The findings showed that there were positive perceptions of the scheme and its function within the community. Overall, the availability of funding was perceived by key stakeholders as being beneficial. However, there were concerns over the accessibility of the scheme for more marginalized members of the community, as well as dissatisfaction with the openness and transparency surrounding funding eligibility. Lessons for the implementation of small-scale CDD funds are elaborated and the utility of program theory approaches for evaluators and planners working with programs that fund community-based initiatives is outlined.

  9. A theory-based evaluation of a community-based funding scheme in a disadvantaged suburban city area.

    PubMed

    Hickey, Gráinne; McGilloway, Sinead; O'Brien, Morgan; Leckey, Yvonne; Devlin, Maurice

    2015-10-01

    Community-driven development (CDD) initiatives frequently involve funding schemes which are aimed at channelling financial investment into local need and fostering community participation and engagement. This exploratory study examined, through a program theory approach, the design and implementation of a small-scale, community-based fund in Ireland. Observations, documentary analysis, interviews and group discussions with 19 participants were utilized to develop a detailed understanding of the program mechanisms, activities and processes, as well as the experiences of key stakeholders engaged with the funding scheme and its implementation. The findings showed that there were positive perceptions of the scheme and its function within the community. Overall, the availability of funding was perceived by key stakeholders as being beneficial. However, there were concerns over the accessibility of the scheme for more marginalized members of the community, as well as dissatisfaction with the openness and transparency surrounding funding eligibility. Lessons for the implementation of small-scale CDD funds are elaborated and the utility of program theory approaches for evaluators and planners working with programs that fund community-based initiatives is outlined. PMID:25933408

  10. Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain

    PubMed Central

    Brette, Romain

    2015-01-01

    Does the brain use a firing rate code or a spike timing code? Considering this controversial question from an epistemological perspective, I argue that progress has been hampered by its problematic phrasing. It takes the perspective of an external observer looking at whether those two observables vary with stimuli, and thereby misses the relevant question: which one has a causal role in neural activity? When rephrased in a more meaningful way, the rate-based view appears as an ad hoc methodological postulate, one that is practical but with virtually no empirical or theoretical support. PMID:26617496

  11. Adapting evidence-based interventions using a common theory, practices, and principles.

    PubMed

    Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D

    2014-01-01

    Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed. PMID:24079747

  12. Coding theory based models for protein translation initiation in prokaryotic organisms.

    SciTech Connect

    May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.

    2003-03-01

    Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli including: Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis, and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the more taxonomically related to E. coli, resembled that of the non-translated sequence group.
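
    A stripped-down sketch of the decoding idea described above: slide a window along an mRNA leader and record the minimum Hamming distance to a small codebook. The codebook and sequence here are made up; in the model the codewords are generated from the 16S rRNA template.

```python
def hamming(a: str, b: str) -> int:
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def min_distance_profile(leader: str, codebook, k: int):
    """Minimum Hamming distance of every k-mer in `leader` to the codebook."""
    return [min(hamming(leader[i:i + k], w) for w in codebook)
            for i in range(len(leader) - k + 1)]

codebook = ["AGGAGG", "GGAGGT", "AGGAGA"]          # hypothetical codewords
leader = "UUAAGGAGGUAAC".replace("U", "T")         # toy 5' untranslated leader
print(min_distance_profile(leader, codebook, 6))   # low values flag Shine-Dalgarno-like sites
```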

  13. Carbon nanotube-reinforced composites: frequency analysis theories based on the matrix stiffness

    NASA Astrophysics Data System (ADS)

    Amin, Sara Shayan; Dalir, Hamid; Farshidianfar, Anooshirvan

    2009-03-01

    Strong and versatile carbon nanotubes are finding new applications in improving conventional polymer-based fibers and films. This paper studies the influence of matrix stiffness and the intertube radial displacements on free vibration of an individual double-walled carbon nanotube (DWNT). For this, a double elastic beam model is presented for frequency analysis in a DWNT embedded in an elastic matrix. The analysis is based on both the Euler-Bernoulli and Timoshenko beam theories, the latter of which accounts for shear deformation and rotary inertia, and on both concentric and non-concentric assumptions that consider intertube radial displacements and the related internal degrees of freedom. New intertube resonant frequencies and the associated non-coaxial vibrational modes are calculated. Detailed results are demonstrated for the dependence of resonant frequencies and mode shapes on the matrix stiffness. The results indicate that internal radial displacement and surrounding matrix stiffness could substantially affect resonant frequencies, especially for longer double-walled carbon nanotubes of larger innermost radius at higher resonant frequencies, and thus the latter does not keep the otherwise concentric structure at ultrahigh frequencies. Therefore, depending on the matrix stiffness, different analysis techniques should be used for carbon nanotube-reinforced composites, while the aspect ratio of carbon nanotubes has little effect on which analysis theory should be selected.
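
    As a much simpler cousin of the double-beam model above (a single simply supported Euler-Bernoulli beam on a Winkler foundation, with invented numbers), the sketch below shows the basic trend the paper examines: the foundation (matrix) stiffness enters the frequency expression additively and raises every mode.

```python
import math

def natural_frequencies(E, I, rho, A, L, k_matrix, n_modes=3):
    """Simply supported Euler-Bernoulli beam on a Winkler foundation (rad/s)."""
    return [math.sqrt((E * I * (n * math.pi / L) ** 4 + k_matrix) / (rho * A))
            for n in range(1, n_modes + 1)]

# Hypothetical nanotube-like numbers, used only to illustrate the trend.
soft = natural_frequencies(1e12, 1e-37, 2300.0, 2e-18, 50e-9, k_matrix=0.0)
stiff = natural_frequencies(1e12, 1e-37, 2300.0, 2e-18, 50e-9, k_matrix=1e7)
print([f"{w:.3e}" for w in soft])
print([f"{w:.3e}" for w in stiff])
```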

  14. Improving breast cancer control among Latinas: evaluation of a theory-based educational program.

    PubMed

    Mishra, S I; Chavez, L R; Magaña, J R; Nava, P; Burciaga Valdez, R; Hubbell, F A

    1998-10-01

    The study evaluated a theory-based breast cancer control program specially developed for less acculturated Latinas. The authors used a quasi-experimental design with random assignment of Latinas into experimental (n = 51) or control (n = 37) groups that completed one pretest and two posttest surveys. The experimental group received the educational program, which was based on Bandura's self-efficacy theory and Freire's empowerment pedagogy. Outcome measures included knowledge, perceived self-efficacy, attitudes, breast self-examination (BSE) skills, and mammogram use. At posttest 1, controlling for pretest scores, the experimental group was significantly more likely than the control group to have more medically recognized knowledge (sum of square [SS] = 17.0, F = 6.58, p < .01), have less medically recognized knowledge (SS = 128.8, F = 39.24, p < .001), greater sense of perceived self-efficacy (SS = 316.5, F = 9.63, p < .01), and greater adeptness in the conduct of BSE (SS = 234.8, F = 153.33, p < .001). Cancer control programs designed for less acculturated women should use informal and interactive educational methods that incorporate skill-enhancing and empowering techniques. PMID:9768384

  15. Credibility theory based dynamic control bound optimization for reservoir flood limited water level

    NASA Astrophysics Data System (ADS)

    Jiang, Zhiqiang; Sun, Ping; Ji, Changming; Zhou, Jianzhong

    2015-10-01

    The dynamic control operation of the reservoir flood limited water level (FLWL) can resolve the contradictions between reservoir flood control and beneficial operation, and it is an important measure for ensuring flood-control security and realizing flood utilization. The dynamic control bound of the FLWL is a fundamental key element for implementing reservoir dynamic control operation. In order to optimize the dynamic control bound of the FLWL by considering flood forecasting error, this paper treated the forecasting error as a fuzzy variable and described it with credibility theory, which has emerged in recent years. By combining it with the flood forecasting error quantitative model, a credibility-based fuzzy chance-constrained model for optimizing the dynamic control bound was proposed, and fuzzy simulation technology was used to solve the model. The FENGTAN reservoir in China was selected as a case study, and the results show that, compared with the original operation water level, the initial operation water level (IOWL) of FENGTAN reservoir can be raised by 4 m, 2 m and 5.5 m, respectively, in the three division stages of the flood season without increasing flood control risk. In addition, the rationality and feasibility of the proposed forecasting error quantitative model and credibility-based dynamic control bound optimization model are verified by the calculation results of extreme risk theory.
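
    For readers unfamiliar with credibility theory, the sketch below implements the standard credibility measure of a triangular fuzzy variable, which is the kind of quantity a credibility-based fuzzy chance constraint (e.g. Cr{forecast error <= e_max} >= 0.9) is built on. The triangular form and the numbers are assumptions for illustration, not the paper's fitted error model.

```python
def credibility_leq(x, a, b, c):
    """Cr{xi <= x} for a triangular fuzzy variable xi = (a, b, c), a < b < c."""
    if x <= a:
        return 0.0
    if x <= b:
        return (x - a) / (2.0 * (b - a))        # average of possibility and necessity
    if x <= c:
        return (x - 2.0 * b + c) / (2.0 * (c - b))
    return 1.0

# A chance constraint Cr{error <= 0.8} >= 0.9 holds for the fuzzy error (-1, 0, 1):
print(credibility_leq(0.8, -1.0, 0.0, 1.0))     # -> 0.9
```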

  16. Practical application of game theory based production flow planning method in virtual manufacturing networks

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2016-08-01

    Modern enterprises have to react quickly to dynamic changes in the market, due to changing customer requirements and expectations. One of the key areas of production management that must continuously evolve by searching for new methods and tools for increasing the efficiency of manufacturing systems is production flow planning and control. These aspects are closely connected with the ability to implement the concepts of Virtual Enterprises (VE) and Virtual Manufacturing Networks (VMN), in which an integrated infrastructure of flexible resources is created. In the proposed approach, the role of the players is performed by objects associated with the objective functions, which makes it possible to solve multiobjective production flow planning problems using game theory, i.e., the theory of strategic situations. For defined production system and production order models, ways of solving the production route planning problem in a VMN are presented on computational examples for different variants of production flow. Possible decision strategies, together with an analysis of the calculation results, are shown.
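
    As a toy version of the game-theoretic idea (objectives acting as players that choose among candidate production routes), the sketch below finds the pure-strategy Nash equilibria of a small bimatrix game. The payoff matrices are invented; the real method's player, strategy, and payoff definitions are richer.

```python
import numpy as np

# "Player" 1 = throughput objective, "player" 2 = cost objective; rows/columns are
# candidate production routes through the virtual manufacturing network (toy payoffs).
A = np.array([[3, 5, 2],
              [4, 6, 1]])
B = np.array([[2, 4, 5],
              [3, 6, 1]])

def pure_nash(A, B):
    """All (row, col) cells where neither player gains by deviating unilaterally."""
    return [(i, j)
            for i in range(A.shape[0]) for j in range(A.shape[1])
            if A[i, j] == A[:, j].max() and B[i, j] == B[i, :].max()]

print(pure_nash(A, B))   # stable route choices that balance the two objectives
```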

  17. Dissemination of a theory-based online bone health program: Two intervention approaches.

    PubMed

    Nahm, Eun-Shim; Resnick, Barbara; Bellantoni, Michele; Zhu, Shijun; Brown, Clayton; Brennan, Patricia F; Charters, Kathleen; Brown, Jeanine; Rietschel, Matthew; Pinna, Joanne; An, Minjeong; Park, Bu Kyung; Plummer, Lisa

    2015-06-01

    With the increasing nationwide emphasis on eHealth, there has been a rapid growth in the use of the Internet to deliver health promotion interventions. Although there has been a great deal of research in this field, little information is available regarding the methodologies to develop and implement effective online interventions. This article describes two social cognitive theory-based online health behavior interventions used in a large-scale dissemination study (N = 866), their implementation processes, and the lessons learned during the implementation processes. The two interventions were a short-term (8-week) intensive online Bone Power program and a longer term (12-month) Bone Power Plus program, including the Bone Power program followed by a 10-month online booster intervention (biweekly eHealth newsletters). This study used a small-group approach (32 intervention groups), and to effectively manage those groups, an eLearning management program was used as an upper layer of the Web intervention. Both interventions were implemented successfully with high retention rates (80.7% at 18 months). The theory-based approaches and the online infrastructure used in this study showed a promising potential as an effective platform for online behavior studies. Further replication studies with different samples and settings are needed to validate the utility of this intervention structure. PMID:26021668

  18. Massive Yang-Mills theory based on the nonlinearly realized gauge group

    SciTech Connect

    Bettinelli, D.; Ferrari, R.; Quadri, A.

    2008-02-15

    We propose a subtraction scheme for a massive Yang-Mills theory realized via a nonlinear representation of the gauge group [here SU(2)]. It is based on the subtraction of the poles in D-4 of the amplitudes, in dimensional regularization, after a suitable normalization has been performed. Perturbation theory is in the number of loops, and the procedure is stable under iterative subtraction of the poles. The unphysical Goldstone bosons, the Faddeev-Popov ghosts, and the unphysical mode of the gauge field are expected to cancel out in the unitarity equation. The spontaneous symmetry breaking parameter is not a physical variable. We use the tools already tested in the nonlinear sigma model: hierarchy in the number of Goldstone boson legs and weak-power-counting property (finite number of independent divergent amplitudes at each order). It is intriguing that the model is naturally based on the symmetry SU(2){sub L} local x SU(2){sub R} global. By construction the physical amplitudes depend on the mass and on the self-coupling constant of the gauge particle and moreover on the scale parameter of the radiative corrections. The Feynman rules are in the Landau gauge.

  19. Looking to the future of new media in health marketing: deriving propositions based on traditional theories.

    PubMed

    Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice

    2008-01-01

    Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future.

  20. Research on the alternatives in a strategic environmental assessment based on the extension theory.

    PubMed

    Du, Jing; Yang, Yang; Xu, Ling; Zhang, Shushen; Yang, Fenglin

    2012-09-01

    The main purpose of a strategic environmental assessment (SEA) is to facilitate the early consideration of potential environmental impacts in decision-making processes. SEA alternative identification is a core issue within the SEA framework. However, the current methods of SEA alternative formulation and selection are constrained by the limited setting range and lack of scientific evaluation. Thus, the current paper attempts to provide a new methodology based on the extension theory to identify a range of alternatives and screen the best one. Extension planning is applied to formulate a set of alternatives that satisfy the reasonable interests of the stakeholders. Extension priority evaluation is used to assess and optimize the alternatives and present a scientific methodology for the SEA alternative study. Thereafter, the urban traffic plan of Dalian City is used as an example to demonstrate the feasibility of the new method. The traffic planning scheme and the environmental protection scheme are organically combined based on the extension theory, and the reliability and practicality of this approach are examined.

  1. Utilizing measure-based feedback in control-mastery theory: A clinical error.

    PubMed

    Snyder, John; Aafjes-van Doorn, Katie

    2016-09-01

    Clinical errors and ruptures are an inevitable part of clinical practice. Often times, therapists are unaware that a clinical error or rupture has occurred, leaving no space for repair, and potentially leading to patient dropout and/or less effective treatment. One way to overcome our blind spots is by frequently and systematically collecting measure-based feedback from the patient. Patient feedback measures that focus on the process of psychotherapy such as the Patient's Experience of Attunement and Responsiveness scale (PEAR) can be used in conjunction with treatment outcome measures such as the Outcome Questionnaire 45.2 (OQ-45.2) to monitor the patient's therapeutic experience and progress. The regular use of these types of measures can aid clinicians in the identification of clinical errors and the associated patient deterioration that might otherwise go unnoticed and unaddressed. The current case study describes an instance of clinical error that occurred during the 2-year treatment of a highly traumatized young woman. The clinical error was identified using measure-based feedback and subsequently understood and addressed from the theoretical standpoint of the control-mastery theory of psychotherapy. An alternative hypothetical response is also presented and explained using control-mastery theory. (PsycINFO Database Record PMID:27631857

  2. A Theory-Based Approach to Teaching Young Children about Health: A Recipe for Understanding

    ERIC Educational Resources Information Center

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children's concepts are integrated into theories. Concept-learning studies have documented the central role that theories play in children's learning of experimenter-defined categories but have yet to extensively examine complex, real-world concepts, such as health. The present study…

  3. Study on the salary system for IT enterprise based on double factor motivation theory

    NASA Astrophysics Data System (ADS)

    Zhuang, Chen; Qian, Wu

    2005-12-01

    To address the fact that IT enterprises' salary & compensation systems often fail to motivate staff efficiently, a salary system based on Herzberg's double factor motivation theory and the enterprise's characteristics is presented. The salary system includes a salary model, an assessment model and a performance model. The system is connected with a cash incentive based on the staff's performance and emphasizes that salary alone is not a motivating factor. Health care, for example, may also play a positive motivating role. According to this system, a scientific and reasonable salary & compensation management system was established and applied in an IT enterprise. It was found to promote the enterprise's overall performance and competitive power.

  4. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

    PubMed

    Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

    2012-05-01

    The feasibility of using the cucurbituril host molecule as a candidate actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed and, based on the binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structure, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in hexa-valent and penta-valent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of actinyls, whereas fluorination decreases the binding affinities as compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation state of the three actinyls.
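
    The binding energies discussed above are simple differences of DFT total energies; the bookkeeping is shown below with invented numbers (real values come from an electronic-structure code, not from this snippet).

```python
# Hypothetical total energies in hartree; not values from the paper.
E_complex = -2412.731     # functionalized cucurbituril + actinyl complex
E_host = -1985.402        # free functionalized cucurbituril
E_guest = -427.286        # free actinyl cation

E_bind = E_complex - (E_host + E_guest)   # more negative = stronger binding
print(f"{E_bind:.3f} hartree  ({E_bind * 627.5:.1f} kcal/mol)")
```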

  5. Re-Examining of Moffitt's Theory of Delinquency through Agent Based Modeling.

    PubMed

    Leaw, Jia Ning; Ang, Rebecca P; Huan, Vivien S; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt's theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, and indeed found the two groups emerging in our simulations. Moreover, through an intervention simulation where we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life course outcome. PMID:26062022
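
    A heavily simplified toy in the spirit of the description above (a maturity gap setting the payoff of antisocial behavior, and imitation of more successful peers), not the authors' implementation; the group sizes, payoff function, and imitation and experimentation rates are all invented. With these settings the antisocial count rises while the maturity gap pays off and then recedes toward the persistent core.

```python
import random

random.seed(1)
N, STEPS = 200, 60
# The first 20 agents are "life-course-persistent" and stay antisocial throughout.
agents = [{"persistent": i < 20, "antisocial": i < 20, "payoff": 0.0} for i in range(N)]

def reward(t, antisocial):
    maturity_gap = max(0.0, 1.0 - t / 40.0)      # shrinks as adolescence ends
    return 1.5 * maturity_gap if antisocial else 0.5

for t in range(STEPS):
    for a in agents:
        a["payoff"] = reward(t, a["antisocial"])
    for a in agents:
        if a["persistent"]:
            continue
        if random.random() < 0.05:               # occasional experimentation
            a["antisocial"] = not a["antisocial"]
        else:
            peer = random.choice(agents)
            if peer["payoff"] > a["payoff"]:     # imitate a more successful peer
                a["antisocial"] = peer["antisocial"]
    if t % 15 == 0:
        print(t, sum(a["antisocial"] for a in agents))
```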

  6. A Novel Tone Mapping Based on Double-Anchoring Theory for Displaying HDR Images

    NASA Astrophysics Data System (ADS)

    Wang, Jinhua; Xu, De; Li, Bing

    In this paper, we present a Double-Anchoring Based Tone Mapping (DABTM) algorithm for displaying high dynamic range (HDR) images. First, two anchoring values are obtained using the double-anchoring theory. Second, we use the two values to formulate the compressing operator, which achieves tone mapping directly. A new method based on accelerated K-means for the decomposition of HDR images into groups (frameworks) is proposed. Most importantly, a group of piecewise-overlap linear functions is put forward to define the belongingness of pixels to the frameworks in which they are located. Experiments show that our algorithm is capable of achieving dynamic range compression, while preserving fine details and avoiding common artifacts such as gradient reversals, halos, or loss of local contrast.

  7. Predictive models based on sensitivity theory and their application to practical shielding problems

    SciTech Connect

    Bhuiyan, S.I.; Roussin, R.W.; Lucius, J.L.; Bartine, D.E.

    1983-01-01

    Two new calculational models based on the use of cross-section sensitivity coefficients have been devised for calculating radiation transport in relatively simple shields. The two models, one an exponential model and the other a power model, have been applied, together with the traditional linear model, to 1- and 2-m-thick concrete-slab problems in which the water content, reinforcing-steel content, or composition of the concrete was varied. Comparing the results obtained with the three models with those obtained from exact one-dimensional discrete-ordinates transport calculations indicates that the exponential model, named the BEST model (for basic exponential shielding trend), is a particularly promising predictive tool for shielding problems dominated by exponential attenuation. When applied to a deep-penetration sodium problem, the BEST model also yields better results than do calculations based on second-order sensitivity theory.
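
    The general idea of a sensitivity-based predictive model can be illustrated with a few lines of arithmetic: given first-order relative sensitivities of a shield response, the linear, exponential ("BEST"-like) and power models extrapolate the response to perturbed parameters in different ways. The sensitivity values and perturbations below are invented; in practice they come from adjoint transport calculations.

```python
import math

S = {"water": -0.8, "steel": -0.3}     # hypothetical relative sensitivities dR/R per dp/p
R0 = 1.0e-4                            # reference response (arbitrary units)
dp = {"water": 0.25, "steel": 0.10}    # +25% water content, +10% reinforcing steel

linear = R0 * (1.0 + sum(S[k] * dp[k] for k in S))
exponential = R0 * math.exp(sum(S[k] * dp[k] for k in S))       # "BEST"-style trend
power = R0 * math.prod((1.0 + dp[k]) ** S[k] for k in S)

print(f"{linear:.3e}  {exponential:.3e}  {power:.3e}")
```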

  8. Buckling Analysis for Stiffened Anisotropic Circular Cylinders Based on Sanders Nonlinear Shell Theory

    NASA Technical Reports Server (NTRS)

    Nemeth, Michael P.

    2014-01-01

    Nonlinear and bifurcation buckling equations for elastic, stiffened, geometrically perfect, right-circular cylindrical, anisotropic shells subjected to combined loads are presented that are based on Sanders' shell theory. Based on these equations, a three-parameter approximate Rayleigh-Ritz solution and a classical solution to the buckling problem are presented for cylinders with simply supported edges. Extensive comparisons of results obtained from these solutions with published results are also presented for a wide range of cylinder constructions. These comparisons include laminated-composite cylinders with a wide variety of shell-wall orthotropies and anisotropies. Numerous results are also given that show the discrepancies between the results obtained by using Donnell's equations and variants of Sanders' equations. For some cases, nondimensional parameters are identified and "master" curves are presented that facilitate the concise representation of results.

  9. Re-Examining of Moffitt’s Theory of Delinquency through Agent Based Modeling

    PubMed Central

    Leaw, Jia Ning; Ang, Rebecca P.; Huan, Vivien S.; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt’s theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, and indeed found the two groups emerging in our simulations. Moreover, through an intervention simulation where we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life course outcome. PMID:26062022

  10. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

    PubMed

    Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

    2012-05-01

    The feasibility of using the cucurbituril host molecule as a candidate actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed and, based on the binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structure, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in hexa-valent and penta-valent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of actinyls, whereas fluorination decreases the binding affinities as compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation state of the three actinyls. PMID:22471316

  11. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    PubMed

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  12. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies

    PubMed Central

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/. PMID:26313379

  13. Unit Template Synchronous Reference Frame Theory Based Control Algorithm for DSTATCOM

    NASA Astrophysics Data System (ADS)

    Bangarraju, J.; Rajagopal, V.; Jayalaxmi, A.

    2014-04-01

    This article proposes new and simplified unit templates in place of the standard phase-locked loop (PLL) for the synchronous reference frame theory (SRFT) control algorithm. The extraction of the synchronizing components (sinθ and cosθ) for Park's and inverse Park's transformations using a standard PLL takes more execution time. This execution time in the control algorithm delays the generation of the reference source currents. The standard PLL not only takes more execution time but also increases the reactive power burden on the Distributed Static Compensator (DSTATCOM). This work proposes a unit-template-based SRFT control algorithm for a four-leg insulated gate bipolar transistor based voltage source converter for DSTATCOM in distribution systems. This will reduce the execution time and the reactive power burden on the DSTATCOM. The proposed DSTATCOM suppresses harmonics and regulates the terminal voltage along with neutral current compensation. The DSTATCOM in distribution systems with the proposed control algorithm is modeled and simulated in MATLAB using the Simulink and SimPowerSystems toolboxes.
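
    The unit-template construction itself is only a few lines: normalize the sensed phase voltages by the terminal-voltage amplitude so that, for a balanced sinusoidal set, the templates equal the cos(theta) signals a PLL would otherwise supply to the Park transformation. The sketch below shows that common construction with made-up 50 Hz samples; the paper's full algorithm (quadrature templates, reference current extraction, four-leg switching) is not reproduced.

```python
import numpy as np

def unit_templates(va, vb, vc):
    """In-phase unit templates of the PCC phase voltages (arrays of samples)."""
    vt = np.sqrt((2.0 / 3.0) * (va**2 + vb**2 + vc**2))   # terminal voltage amplitude
    return va / vt, vb / vt, vc / vt

t = np.linspace(0.0, 0.04, 400)                 # two cycles at 50 Hz
theta = 2 * np.pi * 50 * t
va = 325 * np.cos(theta)
vb = 325 * np.cos(theta - 2 * np.pi / 3)
vc = 325 * np.cos(theta + 2 * np.pi / 3)

ua, ub, uc = unit_templates(va, vb, vc)         # equal cos(theta), cos(theta -/+ 120 deg)
print(ua[:3])
```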

  14. Calculation of thermal expansion coefficient of glasses based on topological constraint theory

    NASA Astrophysics Data System (ADS)

    Zeng, Huidan; Ye, Feng; Li, Xiang; Wang, Ling; Yang, Bin; Chen, Jianding; Zhang, Xianghua; Sun, Luyi

    2016-10-01

    In this work, the thermal expansion behavior and the structure configuration evolution of glasses were studied. The degree of freedom based on the topological constraint theory is correlated with configuration evolution; considering the chemical composition and the configuration change, the analytical equation for calculating the thermal expansion coefficient of glasses from the degree of freedom was derived. The thermal expansion of typical silicate and chalcogenide glasses was examined by calculating their thermal expansion coefficients (TEC) using the approach stated above. The results showed that this approach was energetically favorable for glass materials and revealed the corresponding underlying essence from the viewpoint of configuration entropy. This work establishes a configuration-based methodology to calculate the thermal expansion coefficient of glasses that lack periodic order.
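
    The degree of freedom mentioned above comes from topological constraint counting; the sketch below does the standard Phillips-Thorpe count (r/2 bond-stretching plus 2r - 3 bond-bending constraints per atom of coordination r) for SiO2. The paper's analytical mapping from degrees of freedom to the thermal expansion coefficient is its own contribution and is not reproduced here.

```python
def constraints_per_atom(composition):
    """composition: {species: (mole fraction, coordination number)} -> constraints per atom."""
    return sum(frac * (r / 2.0 + (2.0 * r - 3.0)) for frac, r in composition.values())

sio2 = {"Si": (1.0 / 3.0, 4), "O": (2.0 / 3.0, 2)}
n_c = constraints_per_atom(sio2)
f = 3.0 - n_c                       # atomic degrees of freedom in three dimensions
print(round(n_c, 2), round(f, 2))   # ~3.67 constraints/atom, f ~ -0.67 (over-constrained)
```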

  15. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe- type method, and the accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on the gas-kinetic theory is presented. The flux construction strategy may shed some light on the possible modification of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as to the development of new schemes for a non-strictly hyperbolic system.

  16. Evaluation of Transverse Thermal Stresses in Composite Plates Based on First-Order Shear Deformation Theory

    NASA Technical Reports Server (NTRS)

    Rolfes, R.; Noor, A. K.; Sparr, H.

    1998-01-01

    A postprocessing procedure is presented for the evaluation of the transverse thermal stresses in laminated plates. The analytical formulation is based on the first-order shear deformation theory and the plate is discretized by using a single-field displacement finite element model. The procedure is based on neglecting the derivatives of the in-plane forces and the twisting moments, as well as the mixed derivatives of the bending moments, with respect to the in-plane coordinates. The calculated transverse shear stiffnesses reflect the actual stacking sequence of the composite plate. The distributions of the transverse stresses through-the-thickness are evaluated by using only the transverse shear forces and the thermal effects resulting from the finite element analysis. The procedure is implemented into a postprocessing routine which can be easily incorporated into existing commercial finite element codes. Numerical results are presented for four- and ten-layer cross-ply laminates subjected to mechanical and thermal loads.

  17. Combinatorial density functional theory-based screening of surface alloys for the oxygen reduction reaction.

    SciTech Connect

    Greeley, J.; Norskov, J.; Center for Nanoscale Materials; Technical Univ. of Denmark

    2009-03-26

    A density functional theory (DFT) -based, combinatorial search for improved oxygen reduction reaction (ORR) catalysts is presented. A descriptor-based approach to estimate the ORR activity of binary surface alloys, wherein alloying occurs only in the surface layer, is described, and rigorous, potential-dependent computational tests of the stability of these alloys in aqueous, acidic environments are presented. These activity and stability criteria are applied to a database of DFT calculations on nearly 750 binary transition metal surface alloys; of these, many are predicted to be active for the ORR but, with few exceptions, they are found to be thermodynamically unstable in the acidic environments typical of low-temperature fuel cells. The results suggest that, absent other thermodynamic or kinetic mechanisms to stabilize the alloys, surface alloys are unlikely to serve as useful ORR catalysts over extended periods of operation.

  18. Optimal design of hydrometric monitoring networks with dynamic components based on Information Theory

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Chacon, Juan; Solomatine, Dimitri

    2016-04-01

    The EC-FP7 WeSenseIt project proposes the development of a Citizen Observatory of Water, aiming at enhancing environmental monitoring and forecasting with the help of citizens equipped with low-cost sensors and personal devices such as smartphones and smart umbrellas. In this regard, Citizen Observatories may complement the limited data availability in terms of spatial and temporal density, which is of interest, among other areas, to improve hydraulic and hydrological models. At this point, the following question arises: how can citizens, who are part of a citizen observatory, be optimally guided so that the data they collect and send is useful to improve modelling and water management? This research proposes a new methodology to identify the optimal location and timing of potential observations coming from moving sensors of hydrological variables. The methodology is based on Information Theory, which has been widely used in hydrometric monitoring design [1-4]. In particular, the concepts of Joint Entropy, as a measure of the amount of information that is contained in a set of random variables, which, in our case, correspond to the time series of hydrological variables captured at given locations in a catchment. The methodology presented is a step forward in the state of the art because it solves the multiobjective optimisation problem of getting simultaneously the minimum number of informative and non-redundant sensors needed for a given time, so that the best configuration of monitoring sites is found at every particular moment in time. To this end, the existing algorithms have been improved to make them efficient. The method is applied to cases in The Netherlands, UK and Italy and proves to have a great potential to complement the existing in-situ monitoring networks. [1] Alfonso, L., A. Lobbrecht, and R. Price (2010a), Information theory-based approach for location of monitoring water level gauges in polders, Water Resour. Res., 46(3), W03528 [2] Alfonso, L., A
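
    The core computation can be sketched in a few lines: estimate the joint entropy of quantized time series by counting joint symbol tuples, and add stations greedily so that each new one maximizes the joint entropy of the selected set. The three synthetic records below are invented; the actual method additionally handles redundancy, multiple objectives, and moving sensors.

```python
import numpy as np
from collections import Counter

def joint_entropy(series):
    """Joint entropy in bits of one or more quantized time series (list of arrays)."""
    counts = np.array(list(Counter(zip(*series)).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def greedy_select(candidates, n_sensors):
    """Add stations one by one, each time maximizing the joint entropy of the set."""
    chosen = []
    for _ in range(n_sensors):
        best = max((k for k in candidates if k not in chosen),
                   key=lambda k: joint_entropy([candidates[c] for c in chosen + [k]]))
        chosen.append(best)
    return chosen

rng = np.random.default_rng(0)
base = rng.integers(0, 4, 500)                       # three synthetic, quantized records
candidates = {"A": base,
              "B": (base + rng.integers(0, 2, 500)) % 4,
              "C": rng.integers(0, 4, 500)}
print(greedy_select(candidates, 2))                  # the independent station "C" gets picked
```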

  19. Understanding exercise behaviour during home-based cardiac rehabilitation: a theory of planned behaviour perspective.

    PubMed

    Blanchard, Christopher

    2008-01-01

    Although home-based cardiac rehabilitation (CR) programs have been shown to produce significant increases in exercise capacity, obtaining patient adherence to these programs has been challenging. It is therefore critical to identify key theoretical determinants of exercise during home-based CR in order to inform the development of behavioural interventions that improve adherence. The present study examined the utility of the theory of planned behaviour (TPB) in explaining exercise behaviour during home-based CR. Seventy-six patients who were receiving 6 months of home-based CR completed a TPB questionnaire at the beginning and mid-point of the program and a physical activity scale at the mid-point and end of the program. Path analyses showed that attitude and perceived behavioural control significantly predicted intention for both time intervals (baseline to 3 months, and 3 months to 6 months), whereas subjective norm only predicted intention within the 1st 3 months. Intention significantly predicted implementation intention, which, in turn, significantly predicted exercise for both time intervals. Finally, several underlying accessible beliefs were significantly related to exercise for both time intervals. Therefore, results suggest that the TPB is a potentially useful framework for understanding exercise behaviour during home-based CR.

  20. Coverage theories for metagenomic DNA sequencing based on a generalization of Stevens' theorem.

    PubMed

    Wendl, Michael C; Kota, Karthik; Weinstock, George M; Mitreva, Makedonka

    2013-11-01

    Metagenomic project design has relied variously upon speculation, semi-empirical and ad hoc heuristic models, and elementary extensions of single-sample Lander-Waterman expectation theory, all of which are demonstrably inadequate. Here, we propose an approach based upon a generalization of Stevens' Theorem for randomly covering a domain. We extend this result to account for the presence of multiple species, from which are derived useful probabilities for fully recovering a particular target microbe of interest and for average contig length. These show improved specificities compared to older measures and recommend deeper data generation than the levels chosen by some early studies, supporting the view that poor assemblies were due at least somewhat to insufficient data. We assess predictions empirically by generating roughly 4.5 Gb of sequence from a twelve member bacterial community, comparing coverage for two particular members, Selenomonas artemidis and Enterococcus faecium, which are the least ([Formula: see text]3 %) and most ([Formula: see text]12 %) abundant species, respectively. Agreement is reasonable, with differences likely attributable to coverage biases. We show that, in some cases, bias is simple in the sense that a small reduction in read length to simulate less efficient covering brings data and theory into essentially complete accord. Finally, we describe two applications of the theory. One plots coverage probability over the relevant parameter space, constructing essentially a "metagenomic design map" to enable straightforward analysis and design of future projects. The other gives an overview of the data requirements for various types of sequencing milestones, including a desired number of contact reads and contig length, for detection of a rare viral species.
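
    For orientation, the classical single-species form of Stevens' theorem (the probability that n uniformly placed arcs of fractional length a cover a circle) can be evaluated directly; the multi-species generalization derived in the paper is not reproduced here, and the read and genome sizes below are invented. The alternating series is summed in log space and truncated once the terms become negligible.

```python
from math import exp, floor, lgamma, log

def stevens_coverage(n: int, a: float, tol: float = 1e-15) -> float:
    """P{n random arcs of fractional length a cover a circle} (Stevens' theorem)."""
    total = 0.0
    for j in range(floor(1.0 / a) + 1):
        frac = 1.0 - j * a
        if frac <= 0.0:
            break
        term = exp(lgamma(n + 1) - lgamma(j + 1) - lgamma(n - j + 1)
                   + (n - 1) * log(frac))
        total += term if j % 2 == 0 else -term
        if j > 0 and term < tol:             # at these coverages, later terms only shrink
            break
    return total

a = 100 / 2_000_000                          # 100 b reads on a toy 2 Mb circular target
for n in (300_000, 400_000, 500_000):
    print(n, round(stevens_coverage(n, a), 6))
```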

  1. Programmatic assessment of competency-based workplace learning: when theory meets practice

    PubMed Central

    2013-01-01

    Background In competency-based medical education emphasis has shifted towards outcomes, capabilities, and learner-centeredness. Together with a focus on sustained evidence of professional competence this calls for new methods of teaching and assessment. Recently, medical educators advocated the use of a holistic, programmatic approach towards assessment. Besides maximum facilitation of learning it should improve the validity and reliability of measurements and documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice. Methods In a development study including evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012 quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts and emerging topics were organised into a list of lessons learned. Results The programme mainly focuses on the integration of learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies to enable aggregation of information in a structured and meaningful way. Assessments that were designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method for formative feedback. Social interaction and external guidance seemed to be of crucial importance to scaffold self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges. Conclusions A programme of assessment with low-stakes assessments providing simultaneously formative feedback and input for summative decisions proved not easy to implement

  2. Motivational cues predict the defensive system in team handball: A model based on regulatory focus theory.

    PubMed

    Debanne, T; Laffaye, G

    2015-08-01

    This study was based on the naturalistic decision-making paradigm and regulatory focus theory. Its aim was to model coaches' decision-making processes for handball teams' defensive systems based on relevant cues of the reward structure, and to determine the weight of each of these cues. We collected raw data by video-recording 41 games that were selected using a simple random method. We considered the defensive strategy (DEF: aligned or staged) to be the dependent variable, and the three independent variables were (a) numerical difference between the teams; (b) score difference between the teams; and (c) game periods. We used a logistic regression design (logit model) and a multivariate logistic model to explain the link between DEF and the three categorical independent variables. Each factor was weighted differently during the decision-making process to select the defensive system, and combining these variables increased the impact on this process; for instance, a staged defense is 43 times more likely to be chosen during the final period in an unfavorable situation and in a man advantage. Finally, this shows that the coach's decision-making process could be based on a simple match or could require a diagnosis of the situation based on the relevant cues.
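
    As a sketch of the kind of multivariate logit model described above (not the authors' data or coding), the snippet below fits a logistic regression of a binary defensive choice on three hypothetical categorical cues and reports odds ratios, the scale on which a statement such as "43 times more likely" is expressed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Hypothetical observations: one row per observed defensive sequence.
df = pd.DataFrame({
    # staged defence = 1, aligned defence = 0 (dependent variable DEF)
    "staged": rng.integers(0, 2, n),
    # categorical cues from the reward structure (codings are illustrative)
    "numerical_diff": rng.choice(["man_down", "equal", "man_up"], n),
    "score_diff": rng.choice(["losing", "tied", "winning"], n),
    "period": rng.choice(["first", "middle", "final"], n),
})

# Multivariate logit: log-odds of choosing a staged defence as a function
# of the three categorical cues.
model = smf.logit("staged ~ C(numerical_diff) + C(score_diff) + C(period)",
                  data=df).fit(disp=False)
print(model.summary())

# Odds ratios; for a combination of cue levels, multiply the relevant ratios.
print(np.exp(model.params))
```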

  3. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  4. Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm

    PubMed Central

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  5. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs.

  6. Motivational cues predict the defensive system in team handball: A model based on regulatory focus theory.

    PubMed

    Debanne, T; Laffaye, G

    2015-08-01

    This study was based on the naturalistic decision-making paradigm and regulatory focus theory. Its aim was to model coaches' decision-making processes for handball teams' defensive systems based on relevant cues of the reward structure, and to determine the weight of each of these cues. We collected raw data by video-recording 41 games that were selected using a simple random method. We considered the defensive strategy (DEF: aligned or staged) to be the dependent variable, and the three independent variables were (a) numerical difference between the teams; (b) score difference between the teams; and (c) game periods. We used a logistic regression design (logit model) and a multivariate logistic model to explain the link between DEF and the three category independent variables. Each factor was weighted differently during the decision-making process to select the defensive system, and combining these variables increased the impact on this process; for instance, a staged defense is 43 times more likely to be chosen during the final period in an unfavorable situation and in a man advantage. Finally, this shows that the coach's decision-making process could be based on a simple match or could require a diagnosis of the situation based on the relevant cues. PMID:25262855

  7. Toward a general theory of indifference to research-based evidence.

    PubMed

    Lewis, Steven

    2007-07-01

    Evidence-based medicine (EBM) and evidence-based decision-making (EBDM) were intended to revolutionize health care and health policy. Thus far they have not. A great deal of research has demonstrated the persistent ubiquity of error in health care, wide and unjustifiable variations in practice and the minimal impact of decision aids such as clinical practice guidelines. This paper attempts to explain why EBM and EBDM have remained largely unrealized ambitions. It advances 10 propositions that together constitute a general theory of indifference to research-based evidence. Some of these propositions are conceptual (e.g. the epistemic resistance to the randomized trial), some are empirical (e.g. the impact of the corruption of science by industry), some are cognitive (e.g. human problems are holistic while science is typically fragmented and narrative free) and some are normative (e.g. the primary goal is not adherence to methods, but to make better decisions with better outcomes, irrespective of their origins). EBM and EBDM over-reached, and their failure was, as a consequence, inevitable. However, with corrective action on a number of fronts, research-based evidence can and should be more influential. The first step is to reconceive EBM and EBDM as habits of mind rather than a toolbox and to recognize that the sociology of knowledge is as important as its technical content.

  8. Integrative study on chromosome evolution of mammals, ants and wasps based on the minimum interaction theory.

    PubMed

    Imai, H T; Satta, Y; Takahata, N

    2001-06-21

    There is well-known evidence that in many eukaryotes, different species have different karyotypes (e.g. n=1-47 in ants and n=3-51 in mammals). Alternative (fusion and fission) hypotheses have been proposed to interpret this chromosomal diversity. Although the former has long been accepted, accumulating molecular genetics evidence seems to support the latter. We investigated this problem from a stochastic viewpoint using the Monte Carlo simulation method under the minimum interaction theory. We found that the results of simulations consistently interpreted the chromosomal diversity observed in mammals, ants and wasps, and concluded that karyotypes tend to evolve as a whole toward increasing chromosome numbers by centric fission. Accordingly, our results support the fission hypothesis. We discussed the process of chromosome evolution based on the latest theory of the molecular structure of chromosomes, and reconfirmed that the fission burst is the prime motive force in long-term chromosome evolution, and is effective in minimizing the genetic risks due to deleterious reciprocal translocations and in increasing the potential of genetic divergence. Centric fusion plays a biological role in eliminating heterochromatin (C-bands), but is only a local reverse flow in contrast to the previously held views.

  9. Speed of synchronization in complex networks of neural oscillators: analytic results based on Random Matrix Theory.

    PubMed

    Timme, Marc; Geisel, Theo; Wolf, Fred

    2006-03-01

    We analyze the dynamics of networks of spiking neural oscillators. First, we present an exact linear stability theory of the synchronous state for networks of arbitrary connectivity. For general neuron rise functions, stability is determined by multiple operators, for which standard analysis is not suitable. We describe a general nonstandard solution to the multioperator problem. Subsequently, we derive a class of neuronal rise functions for which all stability operators become degenerate and standard eigenvalue analysis becomes a suitable tool. Interestingly, this class is found to consist of networks of leaky integrate-and-fire neurons. For random networks of inhibitory integrate-and-fire neurons, we then develop an analytical approach, based on the theory of random matrices, to precisely determine the eigenvalue distributions of the stability operators. This yields the asymptotic relaxation time for perturbations to the synchronous state which provides the characteristic time scale on which neurons can coordinate their activity in such networks. For networks with finite in-degree, i.e., finite number of presynaptic inputs per neuron, we find a speed limit to coordinating spiking activity. Even with arbitrarily strong interaction strengths neurons cannot synchronize faster than at a certain maximal speed determined by the typical in-degree.
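
    The stability operators analyzed in this work are specific to the pulse-coupled model, so the snippet below is only a generic numerical illustration of the setting: it samples a random inhibitory coupling matrix with a fixed in-degree and examines the bulk of its eigenvalue spectrum, whose radius scales roughly like 1/sqrt(in-degree) (a circular-law-style estimate). Network size, in-degree, and normalization are arbitrary assumptions, not the paper's operators.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons = 500      # network size (arbitrary)
in_degree = 20       # presynaptic inputs per neuron (finite in-degree)

# Random inhibitory coupling matrix: each neuron receives exactly
# `in_degree` inputs, normalized so that the total input per neuron is -1.
J = np.zeros((n_neurons, n_neurons))
others = np.arange(n_neurons)
for i in range(n_neurons):
    pre = rng.choice(np.delete(others, i), size=in_degree, replace=False)
    J[i, pre] = -1.0 / in_degree

eigvals = np.linalg.eigvals(J)

# One eigenvalue sits at -1 (uniform row sums); the remaining bulk has a
# radius of roughly 1/sqrt(in_degree) in this toy construction.
bulk = np.sort(np.abs(eigvals))[:-1]
print("largest bulk eigenvalue modulus:", round(float(bulk.max()), 3))
print("1/sqrt(in_degree)             :", round(1 / np.sqrt(in_degree), 3))
```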

  10. Investigations into Generalization of Constraint-Based Scheduling Theories with Applications to Space Telescope Observation Scheduling

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Smith, Steven S.

    1996-01-01

    This final report summarizes research performed under NASA contract NCC 2-531 toward generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work into theories and techniques for solution of this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system is modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem solving machinery that is provided. Next, the specific scheduler developed within this software architecture for detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.

  11. The use of theory based semistructured elicitation questionnaires: formative research for CDC's Prevention Marketing Initiative.

    PubMed Central

    Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M

    1996-01-01

    Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153

  12. General Formalism of Decision Making Based on Theory of Open Quantum Systems

    NASA Astrophysics Data System (ADS)

    Asano, M.; Ohya, M.; Basieva, I.; Khrennikov, A.

    2013-01-01

    We present a general formalism of decision making based on the theory of open quantum systems. A person (decision maker), say Alice, is considered as a quantum-like system, i.e., a system whose information processing follows the laws of quantum information theory. To make a decision, Alice interacts with a huge mental bath. Depending on the context of decision making, this bath can include her social environment, mass media (TV, newspapers, the Internet), and memory. The dynamics of an ensemble of such Alices is described by the Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) equation. We speculate that, in the course of evolution, biosystems (especially human beings) developed "mental Hamiltonians" and GKSL operators such that any solution of the corresponding GKSL equation stabilizes to a density operator that is diagonal in the decision basis. This limiting density operator describes a population in which all superpositions of possible decisions have already been resolved. In principle, this approach can be used to predict the distribution of possible decisions in human populations.
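
    A minimal numerical sketch of the kind of GKSL (Lindblad) dynamics described above, for a single two-decision "Alice": a pure-dephasing Lindblad operator drives any initial superposition to a density matrix that is diagonal in the decision basis. The Hamiltonian, coupling rate, and basis choice are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Pauli matrices and a two-decision basis {|0>, |1>}.
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

H = 0.5 * sx            # "mental Hamiltonian" (illustrative)
L = sz                  # dephasing operator modelling the mental bath
gamma = 1.0             # coupling rate to the bath (illustrative)

def gksl_rhs(rho):
    """Right-hand side of the GKSL (Lindblad) master equation."""
    unitary = -1j * (H @ rho - rho @ H)
    dissip = gamma * (L @ rho @ L.conj().T
                      - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return unitary + dissip

# Start from an equal superposition of the two possible decisions.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

dt, steps = 1e-3, 20000
for _ in range(steps):
    rho = rho + dt * gksl_rhs(rho)   # simple Euler integration step

print("final density matrix (decision basis):")
print(np.round(rho, 3))
print("residual off-diagonal coherence |rho01| =", round(abs(rho[0, 1]), 4))
```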

  13. Process Reengineering for Quality Improvement in ICU Based on Taylor's Management Theory.

    PubMed

    Tao, Ziqi

    2015-06-01

    Using methods including questionnaire-based surveys and control analysis, we analyzed the improvements in the efficiency of ICU rescue, service quality, and patients' satisfaction in Xuzhou Central Hospital after the implementation of fine management, with the aim of further introducing the concept of fine management and supporting brand construction. Originating in Taylor's "Theory of Scientific Management" (1982), fine management uses programmed, standardized, digitalized, and informational approaches to ensure that each unit of an organization runs with great accuracy, high efficiency, strong coordination, and sustained duration (Wang et al., Fine Management, 2007). In essence, fine management is a process that breaks the strategy and goals down and executes them. Strategic planning takes place at every part of the process. Fine management demonstrates that everybody has a role to play in the management process, every area must be examined through the management process, and everything has to be managed (Zhang et al., The Experience of Hospital Nursing Precise Management, 2006). In other words, this kind of management theory demands that all people be involved in the entire process (Liu and Chen, Med Inf, 2007). As public hospital reform becomes more widespread, it is imperative to "build a unified and efficient public hospital management system" and "improve the quality of medical services" (Guidelines on the Pilot Reform of Public Hospitals, 2010). The execution of fine management is important for optimizing the medical process, improving medical services and building a prestigious hospital brand. PMID:25548005

  14. Investigations into Generalization of Constraint-Based Scheduling Theories with Applications to Space Telescope Observation Scheduling

    NASA Astrophysics Data System (ADS)

    Muscettola, Nicola; Smith, Steven S.

    1996-09-01

    This final report summarizes research performed under NASA contract NCC 2-531 toward generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work into theories and techniques for solution of this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system is modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem solving machinery that is provided. Next, the specific scheduler developed within this software architecture for detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.

  15. Process Reengineering for Quality Improvement in ICU Based on Taylor's Management Theory.

    PubMed

    Tao, Ziqi

    2015-06-01

    Using methods including questionnaire-based surveys and control analysis, we analyzed the improvements in the efficiency of ICU rescue, service quality, and patients' satisfaction in Xuzhou Central Hospital after the implementation of fine management, with the aim of further introducing the concept of fine management and supporting brand construction. Originating in Taylor's "Theory of Scientific Management" (1982), fine management uses programmed, standardized, digitalized, and informational approaches to ensure that each unit of an organization runs with great accuracy, high efficiency, strong coordination, and sustained duration (Wang et al., Fine Management, 2007). In essence, fine management is a process that breaks the strategy and goals down and executes them. Strategic planning takes place at every part of the process. Fine management demonstrates that everybody has a role to play in the management process, every area must be examined through the management process, and everything has to be managed (Zhang et al., The Experience of Hospital Nursing Precise Management, 2006). In other words, this kind of management theory demands that all people be involved in the entire process (Liu and Chen, Med Inf, 2007). As public hospital reform becomes more widespread, it is imperative to "build a unified and efficient public hospital management system" and "improve the quality of medical services" (Guidelines on the Pilot Reform of Public Hospitals, 2010). The execution of fine management is important for optimizing the medical process, improving medical services and building a prestigious hospital brand.

  16. New Approach to Optimize the Apfs Placement Based on Instantaneous Reactive Power Theory by Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Hashemi-Dezaki, Hamed; Mohammadalizadeh-Shabestary, Masoud; Askarian-Abyaneh, Hossein; Rezaei-Jegarluei, Mohammad

    2014-01-01

    In electrical distribution systems, a great amount of power is wasted across the lines, and the power factors, voltage profiles and total harmonic distortions (THDs) of most loads are not as would be desired. These important system parameters therefore play a major role in wasted money and energy, and both consumers and sources suffer from high levels of distortion and even instability. Active power filters (APFs) are an innovative means of addressing this problem and have recently been based on instantaneous reactive power theory. In this paper, a novel method is proposed to optimize the allocation of APFs. The introduced method is based on the instantaneous reactive power theory in vectorial representation. By use of this representation, it is possible to assess different compensation strategies. Proper placement of APFs in the system also plays a crucial role in both reducing loss costs and improving power quality. To optimize the APF placement, a new objective function has been defined on the basis of five terms: total losses, power factor, voltage profile, THD and cost. A genetic algorithm has been used to solve the optimization problem. The results of applying this method to a distribution network illustrate the method's advantages.
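
    The snippet below sketches the generic genetic-algorithm machinery for a binary placement problem of this kind. The feeder data, the toy two-term objective, and all weights are made-up placeholders; the paper's actual objective combines losses, power factor, voltage profile, THD and cost computed from a power-flow model.

```python
import random

random.seed(42)

N_BUSES = 12
# Hypothetical per-bus THD levels (%) of a small feeder; placeholders only.
thd = [2.1, 7.5, 3.0, 9.2, 4.4, 8.1, 2.8, 6.3, 5.0, 7.9, 3.6, 8.8]
APF_COST = 1.0             # relative cost of one filter
W_THD, W_COST = 1.0, 2.0   # objective weights (illustrative)

def fitness(placement):
    """Lower is better: residual THD plus installation cost."""
    residual = sum(t * (0.2 if placed else 1.0)   # assume an APF removes ~80% of local THD
                   for t, placed in zip(thd, placement))
    cost = APF_COST * sum(placement)
    return W_THD * residual + W_COST * cost

def tournament(pop):
    a, b = random.sample(pop, 2)
    return min(a, b, key=fitness)

def crossover(p1, p2):
    cut = random.randrange(1, N_BUSES)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.05):
    return [1 - g if random.random() < rate else g for g in ind]

pop = [[random.randint(0, 1) for _ in range(N_BUSES)] for _ in range(40)]
for _ in range(100):   # generations
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(40)]

best = min(pop, key=fitness)
print("best placement:", best, "objective:", round(fitness(best), 2))
```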

  17. Cartographic generalization of urban street networks based on gravitational field theory

    NASA Astrophysics Data System (ADS)

    Liu, Gang; Li, Yongshu; Li, Zheng; Guo, Jiawei

    2014-05-01

    The automatic generalization of urban street networks is a constant and important aspect of geographical information science. Previous studies show that the dual graph for street-street relationships more accurately reflects the overall morphological properties and importance of streets than do other methods. In this study, we construct a dual graph to represent street-street relationship and propose an approach to generalize street networks based on gravitational field theory. We retain the global structural properties and topological connectivity of an original street network and borrow from gravitational field theory to define the gravitational force between nodes. The concept of multi-order neighbors is introduced and the gravitational force is taken as the measure of the importance contribution between nodes. The importance of a node is defined as the result of the interaction between a given node and its multi-order neighbors. Degree distribution is used to evaluate the level of maintaining the global structure and topological characteristics of a street network and to illustrate the efficiency of the suggested method. Experimental results indicate that the proposed approach can be used in generalizing street networks and retaining their density characteristics, connectivity and global structure.
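
    A minimal sketch of the gravitational importance idea on a toy dual graph (streets as nodes, intersections as edges). The graph, the use of degree as the "mass" proxy, the gravitational constant, and the neighbor-order cutoff are illustrative assumptions rather than the paper's definitions.

```python
from collections import deque

# Toy dual graph: nodes are streets, edges join streets that intersect.
graph = {
    "A": ["B", "C"], "B": ["A", "C", "D"], "C": ["A", "B", "E"],
    "D": ["B", "E", "F"], "E": ["C", "D"], "F": ["D"],
}
mass = {n: len(nbrs) for n, nbrs in graph.items()}  # degree as the "mass" proxy

def neighbors_within(graph, src, max_order):
    """Breadth-first search: topological distance to every node within max_order."""
    dist, queue = {src: 0}, deque([src])
    while queue:
        u = queue.popleft()
        if dist[u] == max_order:
            continue
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    dist.pop(src)
    return dist

def importance(graph, node, max_order=2, G=1.0):
    """Sum of gravitational forces G*m_i*m_j/d^2 over multi-order neighbors."""
    return sum(G * mass[node] * mass[v] / d ** 2
               for v, d in neighbors_within(graph, node, max_order).items())

ranking = sorted(graph, key=lambda n: importance(graph, n), reverse=True)
print({n: round(importance(graph, n), 2) for n in ranking})
# Generalization would keep the highest-ranked streets and drop the rest.
```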

  18. Effective meson masses in nuclear matter based on a cutoff field theory

    SciTech Connect

    Nakano, M.; Noda, N.; Mitsumori, T.; Koide, K.; Kouno, H.; Hasegawa, A.

    1997-02-01

    Effective masses of σ, ω, π, and ρ mesons in nuclear matter are calculated based on a cutoff field theory. Instead of the traditional density-Feynman representation, we adopt the particle-hole-antiparticle representation for nuclear propagators so that unphysical components are not included in the meson self-energies. For an estimation of the contribution from the divergent particle-antiparticle excitations, i.e., vacuum polarization in nuclear matter, the idea of the renormalization group method is adopted. In this cutoff field theory, all the counterterms are finite and calculated numerically. It is shown that the predicted meson masses converge even if the cutoff Λ is changed as long as Λ is sufficiently large and that the prescription works well also for so-called nonrenormalized mesons such as π and ρ. According to this method, it is concluded that meson masses in nuclear matter have a weak dependence on the baryon density. © 1997 The American Physical Society.

  19. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.

  20. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    SciTech Connect

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; Fu, Shubin; Efendiev, Yalchin

    2015-06-05

    The development of reliable methods for upscaling fine scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.

  1. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    DOE PAGESBeta

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; Fu, Shubin; Efendiev, Yalchin

    2015-06-05

    The development of reliable methods for upscaling fine scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.

  2. Robust method for infrared small-target detection based on Boolean map visual theory.

    PubMed

    Qi, Shengxiang; Ming, Delie; Ma, Jie; Sun, Xiao; Tian, Jinwen

    2014-06-20

    In this paper, we present an infrared small target detection method based on Boolean map visual theory. The scheme is inspired by the phenomenon that small targets can often attract human attention due to two characteristics: brightness and Gaussian-like shape in the local context area. Motivated by this observation, we perform the task under a visual attention framework with Boolean map theory, which reveals that an observer's visual awareness corresponds to one Boolean map via a selected feature at any given instant. Formally, the infrared image is separated into two feature channels, including a color channel with the original gray intensity map and an orientation channel with the orientation texture maps produced by a designed second order directional derivative filter. For each feature map, Boolean maps delineating targets are computed from hierarchical segmentations. Small targets are then extracted from the target enhanced map, which is obtained by fusing the weighted Boolean maps of the two channels. In experiments, a set of real infrared images covering typical backgrounds with sky, sea, and ground clutters are tested to verify the effectiveness of our method. The results demonstrate that it outperforms the state-of-the-art methods with good performance.
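
    The snippet below sketches only the generic Boolean-map step on a synthetic frame: threshold a feature channel at several levels, average the resulting Boolean maps into a target-enhanced map, and take its peak. The orientation channel built from the paper's second-order directional-derivative filter and the exact fusion weights are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic infrared-like frame: smooth background clutter plus a small
# bright Gaussian blob standing in for the target.
H = W = 64
y, x = np.mgrid[0:H, 0:W]
background = 20 + 5 * np.sin(x / 9.0) + rng.normal(0, 1.0, (H, W))
target = 40 * np.exp(-((x - 45) ** 2 + (y - 20) ** 2) / (2 * 1.5 ** 2))
frame = background + target

def boolean_maps(channel, n_levels=8):
    """Threshold the feature channel at evenly spaced levels (Boolean map theory)."""
    levels = np.linspace(channel.min(), channel.max(), n_levels + 2)[1:-1]
    return [(channel > t).astype(float) for t in levels]

# Fuse the Boolean maps of the intensity channel into a target-enhanced map
# (equal weights here; the paper fuses intensity and orientation channels).
enhanced = np.mean(boolean_maps(frame), axis=0)
peak = np.unravel_index(np.argmax(enhanced), enhanced.shape)
print("detected peak at (row, col):", peak, " true target near (20, 45)")
```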

  3. Where are family theories in family-based obesity treatment?: conceptualizing the study of families in pediatric weight management.

    PubMed

    Skelton, J A; Buehler, C; Irby, M B; Grzywacz, J G

    2012-07-01

    Family-based approaches to pediatric obesity treatment are considered the 'gold-standard,' and are recommended for facilitating behavior change to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach for addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of theories in intervention studies. Overlapping concepts across theories include: families are a system, with interdependence of units; the idea that families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of or more effectively treat obesity. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment.

  4. The Study and Design of Adaptive Learning System Based on Fuzzy Set Theory

    NASA Astrophysics Data System (ADS)

    Jia, Bing; Zhong, Shaochun; Zheng, Tianyang; Liu, Zhiyong

    Adaptive learning is an effective way to improve learning outcomes; that is, the selection and presentation of learning content should be adapted to each learner's learning context, level and ability. An Adaptive Learning System (ALS) can provide effective support for adaptive learning. This paper proposes a new ALS based on fuzzy set theory. It can effectively estimate the learner's knowledge level through testing against the learner's target, and then takes the learner's cognitive ability and preferences into consideration to achieve self-organization and a knowledge push plan. This paper focuses on the design and implementation of the domain model and the user model in the ALS. Experiments confirmed that the system, by providing adaptive content, can effectively help learners memorize the content and improve their comprehension.
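
    A small sketch of the fuzzy knowledge-level estimate described above, using triangular membership functions over a 0-100 test score. The level names, breakpoints, and the adaptation rule are illustrative assumptions, not the system's actual domain or user model.

```python
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Fuzzy partition of the test score into knowledge levels (illustrative).
levels = {
    "novice":       lambda s: triangular(s, -1, 0, 40),
    "intermediate": lambda s: triangular(s, 30, 55, 80),
    "advanced":     lambda s: triangular(s, 70, 100, 101),
}

def knowledge_profile(score):
    return {name: round(mu(score), 2) for name, mu in levels.items()}

score = 62                      # learner's test score on the current target
profile = knowledge_profile(score)
print("fuzzy knowledge profile:", profile)

# A simple adaptation rule: push remedial content when membership in the
# lower levels is high (threshold values are arbitrary).
if profile["novice"] > 0.3 or profile["intermediate"] > 0.5:
    print("recommend: review prerequisite material before new content")
else:
    print("recommend: proceed to the next knowledge unit")
```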

  5. Detection and control of combustion instability based on the concept of dynamical system theory

    NASA Astrophysics Data System (ADS)

    Gotoda, Hiroshi; Shinoda, Yuta; Kobayashi, Masaki; Okuno, Yuta; Tachibana, Shigeru

    2014-02-01

    We propose an online method of detecting combustion instability based on the concept of dynamical system theory, including the characterization of the dynamic behavior of combustion instability. As an important case study relevant to combustion instability encountered in fundamental and practical combustion systems, we deal with the combustion dynamics close to lean blowout (LBO) in a premixed gas-turbine model combustor. The relatively regular pressure fluctuations generated by thermoacoustic oscillations transit to low-dimensional intermittent chaos owing to the intermittent appearance of burst with decreasing equivalence ratio. The translation error, which is characterized by quantifying the degree of parallelism of trajectories in the phase space, can be used as a control variable to prevent LBO.
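
    The translation error used as the control variable can be sketched generically (in the spirit of the Wayland test) on a delay-embedded scalar signal: nearly parallel neighboring trajectory segments give a small value, irregular bursting dynamics give a large one. The embedding parameters and the synthetic test signals below are arbitrary; this is not the authors' implementation or combustor data.

```python
import numpy as np

def delay_embed(x, dim=4, tau=5):
    """Time-delay embedding of a scalar series into `dim`-dimensional phase space."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def translation_error(x, dim=4, tau=5, n_ref=200, k=4, seed=0):
    """Wayland-style translation error: small values indicate nearly parallel
    (deterministic) trajectories, large values indicate irregular dynamics."""
    rng = np.random.default_rng(seed)
    emb = delay_embed(x, dim, tau)
    pts = emb[:-1]                       # states with a known successor
    flow = emb[1:] - emb[:-1]            # one-step displacement vectors
    errs = []
    for i in rng.choice(len(pts), size=n_ref, replace=False):
        d = np.linalg.norm(pts - pts[i], axis=1)
        nbrs = np.argsort(d)[: k + 1]    # the point itself plus k neighbours
        v = flow[nbrs]
        v_mean = v.mean(axis=0)
        errs.append(np.mean(np.sum((v - v_mean) ** 2, axis=1))
                    / np.sum(v_mean ** 2))
    return float(np.mean(errs))

t = np.arange(20000) * 0.01
regular = np.sin(2 * np.pi * 1.3 * t)                                  # tonal oscillation
noisy = regular + 0.8 * np.random.default_rng(1).normal(size=t.size)   # burst-like irregularity
print("translation error, regular oscillation:", round(translation_error(regular), 3))
print("translation error, noisy signal       :", round(translation_error(noisy), 3))
```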

  6. Suicide prevention by online support groups: an action theory-based model of emotional first aid.

    PubMed

    Gilat, Itzhak; Shahar, Golan

    2009-01-01

    In the last two decades, online support groups have become a valuable source of help for individuals in suicidal crisis. Their attractiveness is attributed to features that enhance help-seeking and self-disclosure such as availability, anonymity, and use of written communication. However, online support groups also suffer from limitations and potential risks as agents of suicide prevention. The Israeli Association for Emotional First Aid (ERAN) has developed a practical model that seeks to maximize the benefits and minimize the risks of online suicide prevention. The model applies the Action Theory concepts whereby individuals shape their own environment. The present paper presents the model, which is based on an online support group combined with personal chat and a telephonic help line. The online support group is moderated by paraprofessionals who function as both process regulators and support providers. The principles and practice of the model are described, the theoretical rationale is presented, and directions for future research are suggested.

  7. Grey situation group decision-making method based on prospect theory.

    PubMed

    Zhang, Na; Fang, Zhigeng; Liu, Xiaqing

    2014-01-01

    This paper puts forward a grey situation group decision-making method based on prospect theory, addressing grey situation group decision-making problems in which decisions are made by multiple decision experts who have risk preferences. The method takes the distances to the positive and negative ideal situations as reference points, defines positive and negative prospect value functions, and introduces the decision experts' risk preferences into grey situation decision-making so that the final decision is more in line with the experts' psychological behavior. Based on the TOPSIS method, the paper determines the weight of each decision expert, sets up a comprehensive prospect value matrix from the experts' evaluations, and finally determines the optimal situation. The effectiveness and feasibility of the method are then verified by means of a specific example.
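
    The core ingredient, a prospect-theoretic value function applied to gains (distance from the negative ideal situation) and losses (distance from the positive ideal situation) before weighting experts, can be sketched as below. The alpha, beta, lambda parameters, the distances, and the expert weights are illustrative assumptions, not values from the paper.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, convex and steeper
    (loss aversion, lambda > 1) for losses, relative to a reference point."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

# Hypothetical example: three situations scored by two experts. Each entry is
# (distance to the negative ideal -> gain, distance to the positive ideal -> loss).
scores = {
    "situation 1": [(0.6, -0.2), (0.5, -0.3)],
    "situation 2": [(0.4, -0.1), (0.7, -0.4)],
    "situation 3": [(0.8, -0.5), (0.3, -0.2)],
}
expert_weights = [0.55, 0.45]   # e.g. obtained from a TOPSIS-style weighting step

def comprehensive_prospect(pairs):
    return sum(w * (prospect_value(gain) + prospect_value(loss))
               for w, (gain, loss) in zip(expert_weights, pairs))

ranking = sorted(scores, key=lambda s: comprehensive_prospect(scores[s]), reverse=True)
for s in ranking:
    print(s, round(comprehensive_prospect(scores[s]), 3))
```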

  8. Grey Situation Group Decision-Making Method Based on Prospect Theory

    PubMed Central

    Zhang, Na; Fang, Zhigeng; Liu, Xiaqing

    2014-01-01

    This paper puts forward a grey situation group decision-making method based on prospect theory, addressing grey situation group decision-making problems in which decisions are made by multiple decision experts who have risk preferences. The method takes the distances to the positive and negative ideal situations as reference points, defines positive and negative prospect value functions, and introduces the decision experts' risk preferences into grey situation decision-making so that the final decision is more in line with the experts' psychological behavior. Based on the TOPSIS method, the paper determines the weight of each decision expert, sets up a comprehensive prospect value matrix from the experts' evaluations, and finally determines the optimal situation. The effectiveness and feasibility of the method are then verified by means of a specific example. PMID:25197706

  9. Grating lobes analysis based on blazed grating theory for liquid crystal optical-phased array

    NASA Astrophysics Data System (ADS)

    Chen, Jian; Cui, Guolong; Kong, Lingjiang; Xiao, Feng; Liu, Xin; Zhang, Xiaoguang

    2013-09-01

    The grating lobes of a liquid crystal optical-phased array (LCOPA) are studied based on blazed grating theory. Using the Fraunhofer propagation principle, analytical expressions for the far-field intensity distribution are derived, from which both the locations and the intensities of the grating lobes can be obtained. The derived analytical functions, which provide insight into the effects of single-slit diffraction and multislit interference on the grating lobes, are discussed. With the conventional microwave phased-array technique, the intensities of the grating lobes and the main lobe are almost the same; in contrast, the derived analytical functions demonstrate that the intensities of the grating lobes are less than that of the main lobe. Computer simulations and experiments show that the proposed method can correctly estimate both the locations and the intensities of the grating lobes of an LCOPA.

  10. A RE-AIM evaluation of theory-based physical activity interventions.

    PubMed

    Antikainen, Iina; Ellis, Rebecca

    2011-04-01

    Although physical activity interventions have been shown to effectively modify behavior, little research has examined the potential of these interventions for adoption in real-world settings. The purpose of this literature review was to evaluate the external validity of 57 theory-based physical activity interventions using the RE-AIM framework. The physical activity interventions included were more likely to report on issues of internal, rather than external validity and on individual, rather than organizational components of the RE-AIM framework, making the translation of many interventions into practice difficult. Furthermore, most studies included motivated, healthy participants, thus reducing the generalizability of the interventions to real-world settings that provide services to more diverse populations. To determine if a given intervention is feasible and effective in translational research, more information should be reported about the factors that affect external validity.

  11. Truncated Wigner theory of coherent Ising machines based on degenerate optical parametric oscillator network

    NASA Astrophysics Data System (ADS)

    Maruo, Daiki; Utsunomiya, Shoko; Yamamoto, Yoshihisa

    2016-08-01

    We present the quantum theory of coherent Ising machines based on networks of degenerate optical parametric oscillators (DOPOs). In a simple model consisting of two coupled DOPOs, both positive-P representation and truncated Wigner representation predict quantum correlation and inseparability between the two DOPOs in spite of the open-dissipative nature of the system. Here, we apply the truncated Wigner representation method to coherent Ising machines with thermal, vacuum, and squeezed reservoir fields. We find that the probability of finding the ground state of a one-dimensional Ising model increases substantially as a result of reducing excess thermal noise and squeezing the incident vacuum fluctuation on the out-coupling port.

  12. Predicting Substance Abuse Treatment Completion using a New Scale Based on the Theory of Planned Behavior

    PubMed Central

    Zemore, Sarah E.; Ajzen, Icek

    2013-01-01

    We examined whether a 9-item scale based on the theory of planned behavior (TPB) predicted substance abuse treatment completion. Data were collected at a public, outpatient program among clients initiating treatment (N=200). Baseline surveys included measures of treatment-related attitudes, norms, perceived control, and intention; discharge status was collected from program records. As expected, TPB attitude and control components independently predicted intention (model R-squared=.56), and intention was positively associated with treatment completion even including clinical and demographic covariates (model R-squared=.24). TPB components were generally associated with the alternative readiness scales as expected, and the TPB remained predictive at higher levels of coercion. Meanwhile, none of the standard measures of readiness (e.g., the URICA and TREAT) or treatment coercion were positively associated with treatment participation. Results suggest promise for application of the TPB to treatment completion and support use of the intention component as a screener, though some refinements are suggested. PMID:23953167

  13. AAA gunner model based on observer theory. [predicting a gunner's tracking response]

    NASA Technical Reports Server (NTRS)

    Kou, R. S.; Glass, B. C.; Day, C. N.; Vikmanis, M. M.

    1978-01-01

    The Luenberger observer theory is used to develop a predictive model of a gunner's tracking response in antiaircraft artillery systems. This model is composed of an observer, a feedback controller and a remnant element. An important feature of the model is that the structure is simple, hence a computer simulation requires only a short execution time. A parameter identification program based on the least squares curve fitting method and the Gauss Newton gradient algorithm is developed to determine the parameter values of the gunner model. Thus, a systematic procedure exists for identifying model parameters for a given antiaircraft tracking task. Model predictions of tracking errors are compared with human tracking data obtained from manned simulation experiments. Model predictions are in excellent agreement with the empirical data for several flyby and maneuvering target trajectories.
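
    A minimal sketch of the observer component only: a Luenberger observer for a generic discrete-time linear plant, corrected at each step by the innovation between the noisy measurement and the predicted output. The plant matrices, the observer gain, and the input signal are illustrative stand-ins, not the identified gunner model.

```python
import numpy as np

# Discrete-time plant x_{k+1} = A x_k + B u_k, measurement y_k = C x_k.
# (A toy second-order tracking dynamic; not the report's identified model.)
dt = 0.02
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
C = np.array([[1.0, 0.0]])
Lgain = np.array([[0.4], [1.5]])      # observer gain (chosen ad hoc; A - L C is stable)

rng = np.random.default_rng(0)
x = np.array([[0.0], [0.0]])          # true state (position, rate)
x_hat = np.array([[0.5], [0.0]])      # observer starts with a wrong estimate

for k in range(500):
    u = np.array([[np.sin(0.01 * k)]])            # commanded input (arbitrary)
    y = C @ x + rng.normal(0, 0.01)               # noisy measurement ("remnant")
    # Luenberger observer: predict with the model, correct with the innovation.
    x_hat = A @ x_hat + B @ u + Lgain @ (y - C @ x_hat)
    x = A @ x + B @ u                             # true plant update

print("final estimation error:", np.round((x - x_hat).ravel(), 4))
```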

  14. Evaluation of a planned behavior theory-based intervention programme to promote healthy eating.

    PubMed

    Tsorbatzoudis, Haralambos

    2005-10-01

    The objective of the study was to test the effectiveness of an intervention program based on the theoretical framework of the Theory of Planned Behavior, with the addition of attitude strength and role identity. The aim was to alter adolescents' healthy eating attitudes and behaviour. The sample comprised 335 high school students, who were divided into intervention and control groups. The intervention lasted 12 weeks and included posters and lectures promoting healthy eating. The measures included a questionnaire assessing the hypothesis and a food frequency questionnaire which measured eating habits. Analysis showed the intervention was effective in improving attitudes toward healthy eating and attitude strength, intention, perceived behavioral control, and healthy eating behaviour, but not effective in changing subjective norms and role identity. Results provide evidence that the intervention changed attitudes toward a behavior in a school setting.

  15. Viscoelastic wave propagation in the viscoelastic single walled carbon nanotubes based on nonlocal strain gradient theory

    NASA Astrophysics Data System (ADS)

    Tang, Yugang; Liu, Ying; Zhao, Dong

    2016-10-01

    In this paper, the viscoelastic wave propagation in an embedded viscoelastic single-walled carbon nanotube (SWCNT) is studied based on the nonlocal strain gradient theory. The characteristic equation for the viscoelastic wave in SWCNTs is derived. The emphasis is placed on the influence of the tube diameter on the viscoelastic wave dispersion. A blocking diameter is observed, above which the wave could not propagate in SWCNTs. The results show that the blocking diameter is greatly dependent on the damping coefficient, the nonlocal and the strain gradient length scale parameters, as well as the Winkler modulus of the surrounding elastic medium. These findings may provide a prospective application of SWCNTs in nanodevices and nanocomposites.

  16. Detection and control of combustion instability based on the concept of dynamical system theory.

    PubMed

    Gotoda, Hiroshi; Shinoda, Yuta; Kobayashi, Masaki; Okuno, Yuta; Tachibana, Shigeru

    2014-02-01

    We propose an online method of detecting combustion instability based on the concept of dynamical system theory, including the characterization of the dynamic behavior of combustion instability. As an important case study relevant to combustion instability encountered in fundamental and practical combustion systems, we deal with the combustion dynamics close to lean blowout (LBO) in a premixed gas-turbine model combustor. The relatively regular pressure fluctuations generated by thermoacoustic oscillations transit to low-dimensional intermittent chaos owing to the intermittent appearance of burst with decreasing equivalence ratio. The translation error, which is characterized by quantifying the degree of parallelism of trajectories in the phase space, can be used as a control variable to prevent LBO. PMID:25353548

  17. Nonlinear diffusion in two-dimensional ordered porous media based on a free volume theory

    NASA Astrophysics Data System (ADS)

    Godec, A.; Gaberscek, M.; Jamnik, J.; Merzel, F.

    2009-12-01

    A continuum nonlinear diffusion model is developed to describe molecular transport in ordered porous media. An existing generic van der Waals equation of state based free volume theory of binary diffusion coefficients is modified and introduced into the two-dimensional diffusion equation. The resulting diffusion equation is solved numerically with the alternating-direction fully implicit method under Neumann boundary conditions. Two types of pore structure symmetries are considered, hexagonal and cubic. The former is modeled as parallel channels, while in the case of the latter equal-sized channels are placed perpendicularly, thus creating an interconnected network. First, general features of transport in both systems are explored, followed by the analysis of the impact of molecular properties on diffusion inside and out of the porous matrix. The influence of pore size on the diffusion-controlled release kinetics is assessed, and the findings are used to comment on recent experimental studies of drug release profiles from ordered mesoporous silicates.
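
    The sketch below illustrates the same kind of concentration-dependent (free-volume-style) nonlinear diffusion with a simple explicit, flux-conservative finite-difference update instead of the paper's alternating-direction implicit scheme. The channel geometry, the ad hoc D(c), and the boundary handling are assumptions for illustration only.

```python
import numpy as np

# Grid: a single straight "pore channel" of the matrix, 2-D for simplicity.
nx, ny = 60, 20
dx = 1.0
D0, k = 1.0, 2.0      # ad hoc D(c) = D0*exp(-k*(1-c)): mobility drops as free volume shrinks

def D(c):
    return D0 * np.exp(-k * (1.0 - c))

c = np.zeros((ny, nx))
c[:, 0] = 1.0                         # loaded reservoir at the left boundary

dt = 0.2 * dx**2 / (4 * D0)           # conservative explicit stability limit
for _ in range(4000):
    Dc = D(c)
    # Flux-conservative explicit update with D evaluated on cell faces.
    Dx = 0.5 * (Dc[:, 1:] + Dc[:, :-1])
    Dy = 0.5 * (Dc[1:, :] + Dc[:-1, :])
    flux_x = Dx * np.diff(c, axis=1) / dx
    flux_y = Dy * np.diff(c, axis=0) / dx
    div = np.zeros_like(c)
    div[:, :-1] += flux_x / dx
    div[:, 1:]  -= flux_x / dx
    div[:-1, :] += flux_y / dx
    div[1:, :]  -= flux_y / dx
    c += dt * div
    c[:, 0] = 1.0                     # keep the reservoir saturated
    # zero-flux (Neumann) conditions elsewhere are implicit in the scheme

print("mean concentration at the outlet after the run:", round(float(c[:, -1].mean()), 4))
```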

  18. Design and control of the precise tracking bed based on complex electromechanical design theory

    NASA Astrophysics Data System (ADS)

    Ren, Changzhi; Liu, Zhao; Wu, Liao; Chen, Ken

    2010-05-01

    Precise tracking technology is widely used in astronomical instruments, satellite tracking and aeronautic test beds. The precise ultra-low-speed tracking drive system is a highly integrated electromechanical system, for which a complex electromechanical design method is adopted to improve the efficiency, reliability and quality of the system over the design and manufacturing cycle. The precise tracking bed is an ultra-exact, ultra-low-speed, high-precision, high-inertia instrument whose mechanisms and ultra-low-speed operating environment differ from those of general technology. This paper explores the design process based on complex electromechanical optimizing design theory; a non-PID control method with CMAC feedforward compensation is used in the servo system of the precise tracking bed, and some simulation results are discussed.

  19. Simulation of the Electrical Properties of ZnO-BASED Ceramic Varistors Using Continuum Theory

    NASA Astrophysics Data System (ADS)

    Fang, Chao; Zhou, Dongxiang

    2012-07-01

    A continuum field model describing the electrical characteristics of polycrystalline semiconductor ceramics is suggested. Within the continuum theory, a static differential equation for the electron level is established on the basis of the Poisson equation. The one-dimensional calculation is carried out using the Runge-Kutta method. The effect of grain size, temperature and donor concentration on the current-voltage characteristic and specific capacitance of the material is calculated quantitatively using ZnO ceramics as an example. The results show that the current-voltage characteristic divides into three regions: a linear region below the breakdown field, a nonlinear region near the breakdown field and an upturn region above the breakdown field. As the applied voltage increases, the grain boundary barrier and the grain boundary capacitance in the nonlinear zone drop drastically. The results are compared with experimental data. An interesting phenomenon is that the Schottky barrier has a small offset along the direction of the applied electric field.

  20. Realization of low-scattering metamaterial shell based on cylindrical wave expanding theory.

    PubMed

    Wu, Xiaoyu; Hu, Chenggang; Wang, Min; Pu, Mingbo; Luo, Xiangang

    2015-04-20

    In this paper, we demonstrate the design of a low-scattering metamaterial shell with strong backward scattering reduction and a wide bandwidth at microwave frequencies. Low echo is achieved through cylindrical wave expanding theory, and the shell contains only one metamaterial layer with simultaneously low permittivity and permeability. A cut-wire structure is selected to realize the low electromagnetic (EM) parameters with low loss near the edge of the resonance region. The full-model simulations show good agreement with theoretical calculations, and illustrate that a reduction of nearly -20 dB is achieved and the -10 dB bandwidth can reach up to 0.6 GHz. Compared with cloaks based on transformation electromagnetics, the design has the advantage of simpler EM parameter requirements and is much easier to implement when only the backward scattering field is of concern. PMID:25969080