Science.gov

Sample records for clinical process intelligence

  1. Intelligent radar data processing

    NASA Astrophysics Data System (ADS)

    Holzbaur, Ulrich D.

    The application of artificial intelligence principles to the processing of radar signals is considered theoretically. The main capabilities required are learning and adaptation in a changing environment, processing and modeling information (especially dynamics and uncertainty), and decision-making based on all available information (taking its reliability into account). For the application to combat-aircraft radar systems, the tasks include the combination of data from different types of sensors, reacting to electronic counter-countermeasures, evaluation of how much data should be acquired (energy and radiation management), control of the radar, tracking, and identification. Also discussed are related uses such as monitoring the avionics systems, supporting pilot decisions with respect to the radar system, and general applications in radar-system R&D.

  2. Intelligent OCR Processing.

    ERIC Educational Resources Information Center

    Sun, Wei; And Others

    1992-01-01

    Identifies types and distributions of errors in text produced by optical character recognition (OCR) and proposes a process using machine learning techniques to recognize and correct errors in OCR texts. Results of experiments indicating that this strategy can reduce human interaction required for error correction are reported. (25 references)…

  3. Speed of Information Processing and General Intelligence.

    ERIC Educational Resources Information Center

    Vernon, Philip A.

    1983-01-01

    This study investigated the relationship between measures of speed of cognitive information processing and intelligence test scores. Cognitive processing measures were significantly related to IQ scores. Reaction time tests measure cognitive operations basic to intelligence, and individual differences in intelligence are partly due to variability…

  4. Business Intelligence in Process Control

    NASA Astrophysics Data System (ADS)

    Kopčeková, Alena; Kopček, Michal; Tanuška, Pavol

    2013-12-01

    This paper discusses Business Intelligence technology, a strong tool not only for decision-making support but also one with considerable potential in other fields of application. The fundamental definitions needed to understand its basic principles and its role in company management are offered and explained. The article is logically divided into five main parts. The first part defines the technology and lists its main advantages. The second part presents an overview of the system architecture with a brief description of the separate building blocks, and shows the hierarchical nature of that architecture. The third part describes the technology life cycle, which consists of four steps mutually interconnected into a ring. The fourth part summarises the analytical methods incorporated in online analytical processing and data mining as used within business intelligence, together with the related data mining methodologies, and introduces some typical applications of these methods. The final part outlines a proposal for a knowledge discovery system for hierarchical process control. The focus of this paper is to provide a comprehensive view and to familiarize the reader with Business Intelligence technology and its utilisation.

  5. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems that support such timely decisions do not exist for Space Shuttle maintenance and servicing. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data and to make the necessary decisions to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.

  6. Instrumenting the Intelligence Analysis Process

    SciTech Connect

    Hampson, Ernest; Cowley, Paula J.

    2005-05-02

    The Advanced Research and Development Activity initiated the Novel Intelligence from Massive Data (NIMD) program to develop advanced analytic technologies and methodologies. In order to support this objective, researchers and developers need to understand what analysts do and how they do it. In the past, this knowledge generally was acquired through subjective feedback from analysts. NIMD established the innovative Glass Box Analysis (GBA) Project to instrument a live intelligence mission and unobtrusively capture and objectively study the analysis process. Instrumenting the analysis process requires tailor-made software hooks that grab data from a myriad of disparate application operations and feed it into a complex relational database and hierarchical file store to collect, store, retrieve, and distribute analytic data in a manner that maximizes researchers’ understanding. A key to success is determining the correct data to collect and aggregating low-level data into meaningful analytic events. This paper will examine how the GBA team solved some of these challenges, continues to address others, and supports a growing user community in establishing their own GBA environments and/or studying the data generated by GBA analysts working in the Glass Box.

  7. Exploring the Analytical Processes of Intelligence Analysts

    SciTech Connect

    Chin, George; Kuchar, Olga A.; Wolf, Katherine E.

    2009-04-04

    We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.

  8. The Federal Conference on Intelligent Processing Equipment

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Research and development projects involving intelligent processing equipment within the following U.S. agencies are addressed: Department of Agriculture, Department of Commerce, Department of Energy, Department of Defense, Environmental Protection Agency, Federal Emergency Management Agency, NASA, National Institutes of Health, and the National Science Foundation.

  9. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
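
    As a rough illustration of the knowledge-based alarm filtering described above, the sketch below suppresses alarms that a cause-and-effect rule attributes to an already-active root cause, so the operator sees the disturbance rather than the alarm flood. The alarm names and rules are hypothetical, not taken from the paper.

        # Minimal sketch of knowledge-based alarm filtering: consequential alarms are
        # suppressed when a rule links them to an already-active root-cause alarm.
        # Alarm names and rules are hypothetical illustrations.

        # Each rule maps a root-cause alarm to alarms it is known to trigger downstream.
        CAUSE_EFFECT_RULES = {
            "COOLANT_PUMP_TRIP": {"COOLANT_FLOW_LOW", "CORE_TEMP_HIGH"},
            "POWER_BUS_FAULT":   {"COOLANT_PUMP_TRIP", "SENSOR_COMM_LOSS"},
        }

        def filter_alarms(active_alarms):
            """Return (primary, suppressed): alarms explained by another active alarm
            are reported as suppressed so the operator sees root causes first."""
            active = set(active_alarms)
            suppressed = set()
            for cause, effects in CAUSE_EFFECT_RULES.items():
                if cause in active:
                    suppressed |= effects & active
            primary = active - suppressed
            return sorted(primary), sorted(suppressed)

        if __name__ == "__main__":
            primary, suppressed = filter_alarms(
                ["CORE_TEMP_HIGH", "COOLANT_FLOW_LOW", "COOLANT_PUMP_TRIP"])
            print("primary:", primary)        # ['COOLANT_PUMP_TRIP']
            print("suppressed:", suppressed)  # ['COOLANT_FLOW_LOW', 'CORE_TEMP_HIGH']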

  10. The Effect Of The Materials Based On Multiple Intelligence Theory Upon The Intelligence Groups' Learning Process

    NASA Astrophysics Data System (ADS)

    Oral, I.; Dogan, O.

    2007-04-01

    The aim of this study is to find out the effect of course materials based on Multiple Intelligence Theory upon the intelligence groups' learning process. In conclusion, the results showed that the materials prepared according to Multiple Intelligence Theory have a considerable effect on the students' learning process. This effect was particularly evident in the student groups with musical-rhythmic, verbal-linguistic, interpersonal-social, and naturalist intelligences.

  11. Intelligent processing equipment projects at DLA

    NASA Astrophysics Data System (ADS)

    Obrien, Donald F.

    1992-04-01

    The Defense Logistics Agency is successfully incorporating Intelligent Processing Equipment (IPE) into each of its Manufacturing Technology thrust areas. Several IPE applications are addressed in the manufacturing of two 'soldier support' items: combat rations and military apparel. In combat rations, in-line sensors for food processing are being developed or modified from other industries. In addition, many process controls are being automated to achieve better quality and to gain higher use (soldier) acceptance. IPE applications in military apparel include: in-process quality controls for identification of sewing defects, use of robots in the manufacture of shirt collars, and automated handling of garments for pressing.

  12. Intelligent Processing Equipment Projects at DLA

    NASA Technical Reports Server (NTRS)

    Obrien, Donald F.

    1992-01-01

    The Defense Logistics Agency is successfully incorporating Intelligent Processing Equipment (IPE) into each of its Manufacturing Technology thrust areas. Several IPE applications are addressed in the manufacturing of two 'soldier support' items: combat rations and military apparel. In combat rations, in-line sensors for food processing are being developed or modified from other industries. In addition, many process controls are being automated to achieve better quality and to gain higher use (soldier) acceptance. IPE applications in military apparel include: in-process quality controls for identification of sewing defects, use of robots in the manufacture of shirt collars, and automated handling of garments for pressing.

  13. PAT: an intelligent authoring tool for facilitating clinical trial design.

    PubMed

    Tagaris, Anastasios; Andronikou, Vassiliki; Karanastasis, Efstathios; Chondrogiannis, Efthymios; Tsirmpas, Charalambos; Varvarigou, Theodora; Koutsouris, Dimitris

    2014-01-01

    Although great investments are made from both private and public funds and a wealth of research findings is published, the research and development pipeline faces quite low productivity and tremendous delays. In this paper, we present a novel authoring tool which has been designed and developed for facilitating study design. Its underlying models are based on a thorough analysis of existing clinical trial protocols (CTPs) and eligibility criteria (EC) published in clinicaltrials.gov by domain experts. Moreover, its integration with intelligent decision support services and mechanisms linking the study design process with healthcare patient data, as well as its direct access to literature, designate it as a powerful tool offering great support to researchers during clinical trial design. PMID:25160332

  14. Intelligent processing for thick composites

    NASA Astrophysics Data System (ADS)

    Shin, Daniel Dong-Ok

    2000-10-01

    Manufacturing thick composite parts is associated with adverse curing conditions such as large in-plane temperature gradients and exotherms. The condition is further aggravated because the manufacturer's cycle and the existing cure control systems do not adequately counter such effects. In response, a forecast-based thermal control system was developed to provide better cure control for thick composites. An accurate cure kinetics model is crucial for correctly identifying the amount of heat generated in composite process simulation. A new technique for identifying cure parameters for Hercules AS4/3502 prepreg is presented, based on normalizing the DSC data. The cure kinetics for the proposed method is based on an autocatalytic model, which uses dynamic and isothermal DSC data to determine its parameters. Existing models were also used to determine kinetic parameters but proved inadequate because of the material's temperature-dependent final degree of cure. The model predictions obtained with the new technique showed good agreement with both isothermal and dynamic DSC data. The final degree of cure was also in good agreement with experimental data. A realistic cure simulation model including bleeder ply analysis and compaction was validated with Hercules AS4/3501-6 based laminates. The nonsymmetrical temperature distribution resulting from the presence of bleeder plies agreed well with the model prediction. Some of the discrepancies in the predicted compaction behavior were attributed to inaccurate viscosity and permeability models. The temperature prediction was quite good for the 3 cm laminate. The validated process simulation model, along with the cure kinetics model for AS4/3502 prepreg, was integrated into the thermal control system. The 3 cm Hercules AS4/3501-6 and AS4/3502 laminates were fabricated. The resulting cure cycles satisfied all imposed requirements by minimizing exotherms and temperature gradients. Although the duration of the cure cycles increased, such phenomena were…
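
    For readers unfamiliar with the autocatalytic cure-kinetics form mentioned above, the sketch below integrates the standard expression dα/dt = (k1 + k2·α^m)(1 − α)^n with Arrhenius rate constants for an isothermal hold. All parameter values are hypothetical placeholders, not the fitted AS4/3502 values from the thesis.

        import numpy as np
        from scipy.integrate import solve_ivp

        R = 8.314  # universal gas constant, J/(mol*K)

        def cure_rate(t, alpha, T, A1, E1, A2, E2, m, n):
            """Autocatalytic model: d(alpha)/dt = (k1 + k2*alpha^m) * (1 - alpha)^n."""
            k1 = A1 * np.exp(-E1 / (R * T))
            k2 = A2 * np.exp(-E2 / (R * T))
            a = np.clip(alpha[0], 0.0, 1.0)
            return [(k1 + k2 * a**m) * (1.0 - a)**n]

        # Hypothetical Arrhenius and reaction-order parameters, 2-hour hold at 450 K.
        params = (450.0, 1e5, 7e4, 5e6, 8e4, 0.5, 1.5)  # T, A1, E1, A2, E2, m, n
        sol = solve_ivp(cure_rate, (0.0, 7200.0), [0.0], args=params, max_step=10.0)
        print(f"predicted degree of cure after 2 h: {sol.y[0, -1]:.3f}")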

  15. Towards A Clinical Tool For Automatic Intelligibility Assessment

    PubMed Central

    Berisha, Visar; Utianski, Rene; Liss, Julie

    2014-01-01

    An important, yet under-explored, problem in speech processing is the automatic assessment of intelligibility for pathological speech. In practice, intelligibility assessment is often done through subjective tests administered by speech pathologists; however research has shown that these tests are inconsistent, costly, and exhibit poor reliability. Although some automatic methods for intelligibility assessment for telecommunications exist, research specific to pathological speech has been limited. Here, we propose an algorithm that captures important multi-scale perceptual cues shown to correlate well with intelligibility. Nonlinear classifiers are trained at each time scale and a final intelligibility decision is made using ensemble learning methods from machine learning. Preliminary results indicate a marked improvement in intelligibility assessment over published baseline results. PMID:25004985
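
    A minimal sketch of the multi-scale ensemble idea: one nonlinear classifier is trained per time scale and their probabilities are averaged into a final intelligibility decision. The time-scale names and features are placeholders (random data), so the reported accuracy is near chance; the paper's perceptually motivated cues are not reproduced here.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        n = 200
        scales = ("frame", "syllable", "phrase")           # hypothetical time scales
        X = {s: rng.normal(size=(n, 12)) for s in scales}  # placeholder per-scale features
        y = rng.integers(0, 2, size=n)                     # 1 = intelligible, 0 = not

        idx_train, idx_test = train_test_split(np.arange(n), random_state=0)

        # One nonlinear (RBF-kernel) classifier per time scale.
        models = {s: SVC(probability=True).fit(X[s][idx_train], y[idx_train]) for s in scales}

        # Ensemble decision: average the per-scale probabilities and threshold at 0.5.
        proba = np.mean([models[s].predict_proba(X[s][idx_test])[:, 1] for s in scales], axis=0)
        decision = (proba >= 0.5).astype(int)
        print("held-out accuracy (near chance here, features are random):",
              np.mean(decision == y[idx_test]))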

  16. Towards A Clinical Tool For Automatic Intelligibility Assessment.

    PubMed

    Berisha, Visar; Utianski, Rene; Liss, Julie

    2013-01-01

    An important, yet under-explored, problem in speech processing is the automatic assessment of intelligibility for pathological speech. In practice, intelligibility assessment is often done through subjective tests administered by speech pathologists; however research has shown that these tests are inconsistent, costly, and exhibit poor reliability. Although some automatic methods for intelligibility assessment for telecommunications exist, research specific to pathological speech has been limited. Here, we propose an algorithm that captures important multi-scale perceptual cues shown to correlate well with intelligibility. Nonlinear classifiers are trained at each time scale and a final intelligibility decision is made using ensemble learning methods from machine learning. Preliminary results indicate a marked improvement in intelligibility assessment over published baseline results. PMID:25004985

  17. Intelligent processing for metal matrix composites

    NASA Astrophysics Data System (ADS)

    Backman, D. G.; Russell, E. S.; Wei, D. Y.; Pang, Y.

    Intelligent processing of materials (IPM) is a powerful processing concept which requires integration of process knowledge, analytical models, process sensors, and expert system based control technology. An IPM system to manufacture metal matrix composites (MMC) using inductively coupled plasma deposition is under development. Process knowledge is contained in a reduced-order process simulator, consisting of thermal, fluid flow, solid mechanics, and material kinetics models. A working deposit thermal model has been developed, while the solid mechanics and material kinetics models are under development. Future directions for IPM development are discussed, including integration with related MMC processing operations, and establishment of a control system in which expert system based control is used to replicate operator decision-making.

  18. Intelligent systems for KSC ground processing

    NASA Technical Reports Server (NTRS)

    Heard, Astrid E.

    1992-01-01

    The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies including artificial intelligence could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools and launch team assistants. The deployed AI applications have proven effective, which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

  19. Information Processing in Cognition Process and New Artificial Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Zheng, Nanning; Xue, Jianru

    In this chapter, we discuss, in depth, visual information processing and a new artificial intelligent (AI) system that is based upon cognitive mechanisms. The relationship between a general model of intelligent systems and cognitive mechanisms is described, and in particular we explore visual information processing with selective attention. We also discuss a methodology for studying the new AI system and propose some important basic research issues that have emerged in the intersecting fields of cognitive science and information science. To this end, a new scheme for associative memory and a new architecture for an AI system with attractors of chaos are addressed.

  20. Clinical and Business Intelligence: Why It's Important to Your Pharmacy.

    PubMed

    Pinto, Brian; Fox, Brent I

    2016-07-01

    According to the Healthcare Information Management and Systems Society, "Clinical & Business Intelligence (C&BI) is the use and analysis of data captured in the healthcare setting to directly inform decision-making" (http://www.himss.org/library/clinical-business-intelligence). Some say that it is the right information given to the right person at the right time in the right way. No matter how you define it, the fact remains that timely access, synthesis, and visualization of clinical data have become key to how health professionals make patient care decisions and improve care delivery. PMID:27559195

  1. Transition from intelligence cycle to intelligence process: the network-centric intelligence in narrow seas

    NASA Astrophysics Data System (ADS)

    Büker, Engin

    2015-05-01

    Rapidly developing and changing defence technologies make it difficult to foresee the future environment and spectrum of warfare. When this change and development is considered specifically for naval operations, the likely battlefields and scenarios of the near and middle term (5-20 years) are clearer than for other force components. The Network Centric Naval Warfare concept, developed for floating, diving, and flying fleet platforms that serve many miles away from their own mainland, will keep its significance in the future. Accordingly, a Network Centric Intelligence structure fully integrated with command and control systems will gain relatively more importance. This study first examines the transition from the traditional intelligence cycle, still used in conventional warfare, to a Network Centric Intelligence production process. The last part examines the use of this new approach with UAVs, as an alternative to satellite-based command, control, and data-transfer systems, in joint operations in narrow seas, and proposes a model for employing the operational and strategic UAVs procured within the scope of the NATO AGS2 for this purpose.

  2. [Revolution of paradigm in clinical diagnosis--from the mechanization to the intelligent being].

    PubMed

    Furukawa, T

    1991-10-01

    Medical advancements during the 20th century symbolize the industrialization of medical technologies; that is, many clinical tests are carried out by highly advanced automated machines. The concept of intelligent processing in clinical diagnosis also seems to have become established in practice. However, this may be an illusion created by the term artificial intelligence (AI), which attracts the attention not only of computer science specialists but also of clinicians. The essential nature of AI, especially of expert consultation systems, is the same as that of existing theories such as Bayes' theorem, Boolean algebra, multivariate statistical analysis, and fuzzy set theory, i.e., the evaluation of a weighted sum of multiple parameters. The weak point of these theories is the lack of a time parameter. Therefore, models that use a time parameter, including physiological simulation, dynamic models, the Weibull model, and Markov processes, are important for realizing a revolution in clinical diagnosis from the standpoint of intelligent science and technology. PMID:1762180

  3. Intelligent card processing terminal of urban rail transit in Nanjing

    NASA Astrophysics Data System (ADS)

    Xia, Dechuan; Zhang, Xiaojun; Song, Yana; He, Tiejun

    2011-10-01

    In order to improve the compatibility, security, and expandability of the Automatic Fare Collection System in rail transit and to reduce maintenance costs, an intelligent card processing terminal is proposed in this paper. The operation flow and features of the intelligent card processing terminal are analyzed in detail, and its software and hardware structures and business processing flow are designed. Finally, the terminal's security mechanism is summarized. Application results show that the intelligent card processing terminal makes interconnection among lines easier, creates considerable economic and social benefits, and can be widely used.

  4. Processing Speed in Children with Clinical Disorders

    ERIC Educational Resources Information Center

    Calhoun, Susan L.; Mayes, Susan Dickerson

    2005-01-01

    The Processing Speed Index (PSI) was first introduced on the Wechsler Intelligence Scale, Third Edition (WISC-III; D. Wechsler, 1991), and little is known about its clinical significance. In a referred sample (N = 980), children with neurological disorders (ADHD, autism, bipolar disorder, and LD) had mean PSI and Freedom from Distractibility Index…

  5. An intelligent, onboard signal processing payload concept

    SciTech Connect

    Shriver, P. M.; Harikumar, J.; Briles, S. C.; Gokhale, M.

    2003-01-01

    Our approach to onboard processing will enable a quicker return and improved quality of processed data from small, remote-sensing satellites. We describe an intelligent payload concept which processes RF lightning signal data onboard the spacecraft in a power-aware manner. Presently, onboard processing is severely curtailed due to the conventional management of limited resources and power-unaware payload designs. Delays of days to weeks are commonly experienced before raw data is received, processed into a human-usable format, and finally transmitted to the end-user. We enable this resource-critical technology of onboard processing through the concept of Algorithm Power Modulation (APM). APM is a decision process used to execute a specific software algorithm, from a suite of possible algorithms, to make the best use of the available power. The suite of software algorithms chosen for our application is intended to reduce the probability of false alarms through postprocessing. Each algorithm however also has a cost in energy usage. A heuristic decision tree procedure is used which selects an algorithm based on the available power, time allocated, algorithm priority, and algorithm performance. We demonstrate our approach to power-aware onboard processing through a preliminary software simulation.
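
    The Algorithm Power Modulation decision can be pictured as choosing, from a catalogue of post-processing algorithms, the most valuable one that fits the current energy and time budgets. The sketch below shows one such heuristic; the catalogue entries and their costs are invented for illustration.

        from dataclasses import dataclass

        @dataclass
        class Algorithm:
            name: str
            energy_j: float     # energy required per run, joules
            runtime_s: float    # run time, seconds
            priority: int       # mission-assigned priority (higher is better)
            performance: float  # expected false-alarm reduction, 0..1

        # Hypothetical catalogue of onboard post-processing algorithms.
        CATALOGUE = [
            Algorithm("median_filter",       2.0,  1.0,  1, 0.40),
            Algorithm("matched_filter",      8.0,  4.0,  2, 0.70),
            Algorithm("full_classification", 20.0, 12.0, 3, 0.90),
        ]

        def select_algorithm(available_energy_j, time_budget_s):
            """Pick the highest-priority algorithm that fits both budgets, or None."""
            feasible = [a for a in CATALOGUE
                        if a.energy_j <= available_energy_j and a.runtime_s <= time_budget_s]
            if not feasible:
                return None  # skip post-processing; downlink the raw detection instead
            return max(feasible, key=lambda a: (a.priority, a.performance))

        chosen = select_algorithm(available_energy_j=10.0, time_budget_s=5.0)
        print("selected:", chosen.name if chosen else "none")  # -> matched_filter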

  6. The application of intelligent process control to space based systems

    NASA Technical Reports Server (NTRS)

    Wakefield, G. Steve

    1990-01-01

    The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligent process control system.

  7. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries such as resin and soap manufacturing have reaction systems that work with at least two chemicals. Mixing the chemicals, even at room temperature, can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable, and time varying. Proper control of this heat has to be accomplished in order to achieve a high-quality product. Uncontrolled or poorly controlled heat can yield an unusable product, and the process may damage materials and equipment and may even harm people. Heat from an exothermic reaction cannot be controlled using conventional methods such as PID control or identification-based control, since all conventional methods require at least an approximate mathematical model of the exothermic process, and modeling such a process is yet to be properly achieved. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system was constructed and used for designing and incorporating the proposed fuzzy-logic-based intelligent controller. Both a conventional and then an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in managing exothermic heat.
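
    A compact sketch of a fuzzy-logic temperature controller of the general kind described above: the temperature error is fuzzified with triangular sets and mapped to a cooling-valve command by weighted-average defuzzification. Membership breakpoints, rules, and valve levels are hypothetical.

        def tri(x, a, b, c):
            """Triangular membership function peaking at b, zero outside [a, c]."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def fuzzy_cooling_command(temp_error_K):
            """Map temperature error (measured minus setpoint) to a cooling-valve opening (%)."""
            # Fuzzify: negative (too cold), zero (on target), positive (too hot).
            mu = {
                "negative": tri(temp_error_K, -20.0, -10.0, 0.0),
                "zero":     tri(temp_error_K, -5.0, 0.0, 5.0),
                "positive": tri(temp_error_K, 0.0, 10.0, 20.0),
            }
            # Rule consequents: valve opening recommended by each rule.
            valve = {"negative": 0.0, "zero": 20.0, "positive": 90.0}
            total = sum(mu.values())
            if total == 0.0:
                return 90.0 if temp_error_K > 0 else 0.0  # saturate far outside the sets
            # Weighted-average (centroid-like) defuzzification.
            return sum(mu[k] * valve[k] for k in mu) / total

        for err in (-12.0, 0.0, 3.0, 15.0):
            print(f"error {err:+.0f} K -> cooling valve {fuzzy_cooling_command(err):.1f} %")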

  8. Artificial intelligence in the materials processing laboratory

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kaukler, William F.

    1990-01-01

    Materials science and engineering provides a vast arena for applications of artificial intelligence. Advanced materials research is an area in which challenging requirements confront the researcher, from the drawing board through production and into service. Advanced techniques result in the development of new materials for specialized applications. Hand-in-hand with these new materials are also requirements for state-of-the-art inspection methods to determine the integrity or fitness for service of structures fabricated from these materials. Two problems of current interest to the Materials Processing Laboratory at UAH are an expert system to assist in eddy current inspection of graphite epoxy components for aerospace and an expert system to assist in the design of superalloys for high temperature applications. Each project requires a different approach to reach the defined goals. Results to date are described for the eddy current analysis, but only the original concepts and approaches considered are given for the expert system to design superalloys.

  9. The intelligent control of an inert-gas atomization process

    NASA Astrophysics Data System (ADS)

    Osella, S. A.; Ridder, S. D.; Biancaniello, F. S.; Espina, P. I.

    1991-01-01

    Intelligent control is an attempt to specify the function of a controller in ways which mimic the decision-making capabilities of humans. Traditionally, issues relating to the emulation of human-like capabilities have fallen in the domain of artificial intelligence. Intelligent processing is a specific form of intelligent control in which the system to be controlled is a process rather than the more conventional mechanical or electrical system. The National Institute of Standards and Technology's program on intelligent processing of metal powders is a multi-disciplinary research initiative investigating the application of intelligent control technologies to improve the state of the art of metal powder manufacturing. This paper reviews the design of the institute's supersonic inert-gas metal-atomizer control system.

  10. A Research Program on Artificial Intelligence in Process Engineering.

    ERIC Educational Resources Information Center

    Stephanopoulos, George

    1986-01-01

    Discusses the use of artificial intelligence systems in process engineering. Describes a new program at the Massachusetts Institute of Technology which attempts to advance process engineering through technological advances in the areas of artificial intelligence and computers. Identifies the program's hardware facilities, software support,…

  11. Computer-aided diagnosis and artificial intelligence in clinical imaging.

    PubMed

    Shiraishi, Junji; Li, Qiang; Appelbaum, Daniel; Doi, Kunio

    2011-11-01

    Computer-aided diagnosis (CAD) is rapidly entering the radiology mainstream. It has already become a part of the routine clinical work for the detection of breast cancer with mammograms. The computer output is used as a "second opinion" in assisting radiologists' image interpretations. The computer algorithm generally consists of several steps that may include image processing, image feature analysis, and data classification via the use of tools such as artificial neural networks (ANN). In this article, we will explore these and other current processes that have come to be referred to as "artificial intelligence." One element of CAD, temporal subtraction, has been applied for enhancing interval changes and for suppressing unchanged structures (eg, normal structures) between 2 successive radiologic images. To reduce misregistration artifacts on the temporal subtraction images, a nonlinear image warping technique for matching the previous image to the current one has been developed. Development of the temporal subtraction method originated with chest radiographs, with the method subsequently being applied to chest computed tomography (CT) and nuclear medicine bone scans. The usefulness of the temporal subtraction method for bone scans was demonstrated by an observer study in which reading times and diagnostic accuracy improved significantly. An additional prospective clinical study verified that the temporal subtraction image could be used as a "second opinion" by radiologists with negligible detrimental effects. ANN was first used in 1990 for computerized differential diagnosis of interstitial lung diseases in CAD. Since then, ANN has been widely used in CAD schemes for the detection and diagnosis of various diseases in different imaging modalities, including the differential diagnosis of lung nodules and interstitial lung diseases in chest radiography, CT, and positron emission tomography/CT. It is likely that CAD will be integrated into picture archiving and…
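
    A much-simplified stand-in for the temporal subtraction technique: the previous image is registered to the current one (here with a rigid shift estimated by phase correlation rather than the nonlinear warping used clinically) and subtracted so that interval changes stand out. The images are synthetic and the lesion is simulated; scikit-image and SciPy are assumed available.

        import numpy as np
        from scipy.ndimage import shift as nd_shift
        from skimage.registration import phase_cross_correlation

        rng = np.random.default_rng(1)
        current = rng.normal(size=(128, 128))
        # Simulate a "previous" exam: same anatomy shifted a few pixels; then add a
        # lesion-like blob that is present only in the current image.
        previous = nd_shift(current, (3.5, -2.0), order=1)
        current[60:68, 60:68] += 5.0

        # Estimate the misregistration and warp the previous image onto the current one.
        offset, _, _ = phase_cross_correlation(current, previous, upsample_factor=10)
        previous_registered = nd_shift(previous, offset, order=1)

        # Temporal subtraction: unchanged structures cancel, interval change remains.
        difference = (current - previous_registered)[8:-8, 8:-8]  # ignore shifted borders
        row, col = np.unravel_index(np.abs(difference).argmax(), difference.shape)
        print("largest interval change near pixel:", (row + 8, col + 8))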

  12. Emotional Intelligence and the Career Choice Process.

    ERIC Educational Resources Information Center

    Emmerling, Robert J.; Cherniss, Cary

    2003-01-01

    Emotional intelligence as conceptualized by Mayer and Salovey consists of perceiving emotions, using emotions to facilitate thoughts, understanding emotions, and managing emotions to enhance personal growth. The Multifactor Emotional Intelligence Scale has proven a valid and reliable measure that can be used to explore the implications of…

  13. Intelligent Information: A National System for Monitoring Clinical Performance

    PubMed Central

    Bottle, Alex; Aylin, Paul

    2008-01-01

    Objective To use statistical process control charts to monitor in-hospital outcomes at the hospital level for a wide range of procedures and diagnoses. Data Sources Routine English hospital admissions data. Study Design Retrospective analysis using risk-adjusted log-likelihood cumulative sum (CUSUM) charts, comparing each hospital with the national average and its peers for in-hospital mortality, length of stay, and emergency readmission within 28 days. Data Collection Data were derived from the Department of Health administrative hospital admissions database, with monthly uploads from the clearing service. Principal Findings The tool is currently being used by nearly 100 hospitals and also a number of primary care trusts responsible for purchasing hospital care. It monitors around 80 percent of admissions and in-hospital deaths. Case-mix adjustment gives values for the area under the receiver operating characteristic curve between 0.60 and 0.86 for mortality, but the values were poorer for readmission. Conclusions CUSUMs are a promising management tool for managers and clinicians for driving improvement in hospital performance for a range of outcomes, and interactive presentation via a web-based front end has been well received by users. Our methods act as a focus for intelligently directed clinical audit with the real potential to improve outcomes, but wider availability and prospective monitoring are required to fully assess the method's utility. PMID:18300370
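
    For readers unfamiliar with risk-adjusted log-likelihood CUSUM charts, the sketch below accumulates, per admission, the log-likelihood ratio for a doubling of the odds of death against the case-mix-adjusted expectation, and signals when the sum crosses a threshold. The predicted risks, outcomes, and threshold are illustrative, not the tool's calibrated values.

        import math

        def risk_adjusted_cusum(predicted_risks, outcomes, odds_ratio=2.0, h=4.5):
            """Return the CUSUM trajectory and the index of the first signal (or None)."""
            c, trajectory, signal_at = 0.0, [], None
            for i, (p, y) in enumerate(zip(predicted_risks, outcomes)):
                denom = 1.0 - p + odds_ratio * p
                # Log-likelihood weight: positive for a death, negative for a survival.
                w = math.log(odds_ratio / denom) if y == 1 else math.log(1.0 / denom)
                c = max(0.0, c + w)
                trajectory.append(c)
                if signal_at is None and c > h:
                    signal_at = i
            return trajectory, signal_at

        # Example: 10 admissions with modest predicted risks; deaths late in the run.
        risks    = [0.05, 0.10, 0.08, 0.20, 0.05, 0.15, 0.10, 0.30, 0.12, 0.25]
        outcomes = [0,    0,    0,    0,    0,    0,    1,    1,    1,    1]
        trajectory, signal_at = risk_adjusted_cusum(risks, outcomes)
        print([round(v, 2) for v in trajectory], "signal at admission:", signal_at)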

  14. Intelligent Shimming for Deep Drawing Processes

    NASA Astrophysics Data System (ADS)

    Tommerup, Søren; Endelt, Benny; Danckert, Joachim; Nielsen, Karl Brian

    2011-08-01

    This paper demonstrates the use of an intelligent shimming system to compensate for changes in process output due to tool wear. A new tool concept with integrated hydraulic cavities used as actuators in a feedback control system is presented. By prescribing a hydraulic pressure in the individual cavities, the blank-holder force distribution can be controlled during the punch stroke. By means of a sequence of numerical simulations, abrasive wear is imposed on the deep drawing of a rectangular cup. The abrasive wear is modelled by changing the tool surface geometry using an algorithm based on the sliding energy density. As the tool surfaces change, the material draw-in is significantly altered when using conventional open-loop control of the blank-holder force. A feed-back controller is presented which is capable of reducing the draw-in difference to a certain degree. Further, a learning algorithm is introduced into the system, which is able to improve the response of the feed-back system significantly.

  15. MindTrial: An Intelligent System for Clinical Trials

    PubMed Central

    Lee, Yugyung; Dinakarpandian, Deendayal; Katakam, Nikhilesh; Owens, Dennis

    2010-01-01

    The recruitment of human subjects for clinical trials research is a critically important step in the discovery of new cures for diseases. However, the current recruitment methodologies are inherently inefficient. Considerable resources are expended in efforts to recruit adequate numbers of patient volunteers who meet the inclusion/exclusion criteria for clinical trials. Recruitment is particularly challenging for trials involving vulnerable, psychiatrically disordered groups. We have developed a prototype system, called MindTrial, that is based on an online model to enhance the efficiency and quality of recruitment of patients with psychiatric disorders for clinical research. The intelligent component of the MindTrial system can facilitate highly specific matches between clinical trial criteria and volunteers for self-enrollment of sufficient numbers of patient volunteers. We believe this system is particularly valuable in optimizing recruitment for clinical trial studies for development of new drugs. PMID:21347017

  16. The intelligent clinical laboratory as a tool to increase cancer care management productivity.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza

    2014-01-01

    Studies of the causes of cancer, early detection, prevention, or treatment need accurate, comprehensive, and timely cancer data. The clinical laboratory provides important cancer information needed by physicians, which influences clinical decisions regarding treatment, diagnosis, and patient monitoring. Poor communication between health care providers and clinical laboratory personnel can lead to medical errors and wrong decisions in providing cancer care. Because of the key impact of laboratory information on cancer diagnosis and treatment, the quality of the tests, the lab reports, and appropriate lab management are very important. A laboratory information management system (LIMS) can play an important role in diagnosis, fast and effective access to cancer data, reducing redundancy and costs, and facilitating the integration and collection of data from different types of instruments and systems. In spite of significant advantages, LIMS is limited by factors such as problems in adapting to new instruments that may change existing work processes. Applying intelligent software alongside existing information systems, in addition to removing these restrictions, has important benefits, including adding non-laboratory-generated information to the reports, facilitating decision making, and improving the quality and productivity of cancer care services. Laboratory systems must have the flexibility to change and the capability to develop and benefit from intelligent devices. Intelligent laboratory information management systems need to benefit from informatics tools and the latest technologies, such as open source software. The aim of this commentary is to survey the application, opportunities, and necessity of the intelligent clinical laboratory as a tool to increase cancer care management productivity. PMID:24761839

  17. Intelligent Processing Equipment Within the Environmental Protection Agency

    NASA Technical Reports Server (NTRS)

    Greathouse, Daniel G.; Nalesnik, Richard P.

    1992-01-01

    Protection of the environment and environmental remediation requires the cooperation, at all levels, of government and industry. Intelligent processing equipment, in addition to other artificial intelligence based tools, was used by the Environmental Protection Agency to provide personnel safety and improve the efficiency of those responsible for protection and remediation of the environment. These exploratory efforts demonstrate the feasibility and utility of expanding development and widespread use of these tools. A survey of current intelligent processing equipment applications in the Agency is presented and is followed by a brief discussion of possible uses in the future.

  18. INTELLIGENT PROCESSING EQUIPMENT WITHIN THE ENVIRONMENTAL PROTECTION AGENCY

    EPA Science Inventory

    Protection of the environment and environmental remediation requires the cooperation, at all levels, of government and industry. Intelligent processing equipment, in addition to other artificial intelligence based tools, has been used by the Environmental Protection Agency to prov...

  19. Intelligent process quality control system into supply chain

    NASA Astrophysics Data System (ADS)

    Wang, Shijie; Jiang, Xingyu; Wang, Yingchun

    2010-01-01

    Monitoring the dynamic and variable quality variation in a supply chain and diagnosing abnormal variation at the right moment is a difficult problem that an enterprise in a supply chain faces in process quality control. To cope with these challenges, an intelligent process quality control mode for the supply chain, integrating quality prevention, analysis, diagnosis, and adjustment, is put forward together with the corresponding functional modules and framework. This mode deals mainly with constructing and running an intelligent quality control system, drawing on the theory of similarity manufacturing, Statistical Process Control (SPC), and neural networks. Furthermore, some key enabling technologies are studied in detail, including on-line process quality analysis based on process similarity, process quality diagnosis based on an Elman network, and an expert system for process quality adjustment. This is the basis for realizing networked, intelligent, and automatic process quality control in the supply chain.
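
    As a small illustration of the SPC building block mentioned in this and the following record, the sketch below implements an x-bar control chart that flags subgroups whose mean drifts outside 3-sigma limits computed from an in-control reference period. The measurement data are simulated, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        subgroup_size, n_subgroups = 5, 30
        data = rng.normal(loc=10.0, scale=0.2, size=(n_subgroups, subgroup_size))
        data[25:] += 0.5  # simulate an abnormal process shift in the last subgroups

        # Control limits from the first 20 (assumed in-control) subgroups.
        reference = data[:20]
        center = reference.mean()
        sigma_xbar = reference.mean(axis=1).std(ddof=1)
        ucl, lcl = center + 3 * sigma_xbar, center - 3 * sigma_xbar

        means = data.mean(axis=1)
        out_of_control = np.where((means > ucl) | (means < lcl))[0]
        print(f"UCL={ucl:.3f}  LCL={lcl:.3f}  out-of-control subgroups: {out_of_control}")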

  20. Intelligent process quality control system into supply chain

    NASA Astrophysics Data System (ADS)

    Wang, Shijie; Jiang, Xingyu; Wang, Yingchun

    2009-12-01

    Monitoring the dynamic and variable quality variation in a supply chain and diagnosing abnormal variation at the right moment is a difficult problem that an enterprise in a supply chain faces in process quality control. To cope with these challenges, an intelligent process quality control mode for the supply chain, integrating quality prevention, analysis, diagnosis, and adjustment, is put forward together with the corresponding functional modules and framework. This mode deals mainly with constructing and running an intelligent quality control system, drawing on the theory of similarity manufacturing, Statistical Process Control (SPC), and neural networks. Furthermore, some key enabling technologies are studied in detail, including on-line process quality analysis based on process similarity, process quality diagnosis based on an Elman network, and an expert system for process quality adjustment. This is the basis for realizing networked, intelligent, and automatic process quality control in the supply chain.

  1. Got EQ?: Increasing Cultural and Clinical Competence through Emotional Intelligence

    ERIC Educational Resources Information Center

    Robertson, Shari A.

    2007-01-01

    Cultural intelligence has been described across three parameters of human behavior: cognitive intelligence, emotional intelligence (EQ), and physical intelligence. Each contributes a unique and important perspective to the ability of speech-language pathologists and audiologists to provide benefits to their clients regardless of cultural…

  2. Intelligent Testing: Integrating Psychological Theory and Clinical Practice

    ERIC Educational Resources Information Center

    Kaufman, James C., Ed.

    2009-01-01

    The field of intelligence testing has been revolutionized by Alan S. Kaufman. He developed the Wechsler Intelligence Scale for Children-Revised (WISC-R) with David Wechsler, and his best-selling book, Intelligent Testing with the WISC-R, introduced the phrase "intelligent testing." Kaufman, with his wife, Nadeen, then created his own series of…

  3. Relation between spiritual intelligence and clinical competency of nurses in Iran

    PubMed Central

    Karimi-Moonaghi, Hossein; Gazerani, Akram; Vaghee, Saeed; Gholami, Hassan; Salehmoghaddam, Amir Reza; Gharibnavaz, Raheleh

    2015-01-01

    Background: Clinical competency is one of the most important requirements in the nursing profession, and it is the basis on which nurses are assessed. To obtain an effective and improved form of clinical competency, several factors are observed and monitored by health educational systems. Among these factors, spiritual intelligence is considered one of the most significant in nurses’ success and efficacy. This study aimed to determine the status of spiritual intelligence and its relationship with clinical competency. Materials and Methods: This descriptive-correlational study was carried out on 250 nurses in Mashhad educational hospitals, selected by multi-stage sampling. Demographic, clinical competency, and spiritual intelligence questionnaires were used for data collection, and 212 questionnaires were analyzed. Results: About 53.3% of nurses obtained above-average scores in spiritual intelligence. Clinical competency was evaluated by both self-evaluation and head-nurse evaluation. Most nurses (53.8%) had a good level of clinical competency based on self-evaluation, 48.2% were at an average level based on head-nurse evaluation, and 53.3% were at an average level based on the overall score. A significant correlation was found between spiritual intelligence and clinical competency. Conclusions: This study found a significant positive correlation between nurses’ spiritual intelligence and their clinical competency. Because of the positive effects of spiritual intelligence on nurses’ clinical competency and quality of care, it is recommended to develop nurses’ spiritual intelligence during their education and through continuing medical education. PMID:26793250

  4. Intelligence.

    PubMed

    Deary, Ian J

    2012-01-01

    Individual differences in human intelligence are of interest to a wide range of psychologists and to many people outside the discipline. This overview of contributions to intelligence research covers the first decade of the twenty-first century. There is a survey of some of the major books that appeared since 2000, at different levels of expertise and from different points of view. Contributions to the phenotype of intelligence differences are discussed, as well as some contributions to causes and consequences of intelligence differences. The major causal issues covered concern the environment and genetics, and how intelligence differences are being mapped to brain differences. The major outcomes discussed are health, education, and socioeconomic status. Aging and intelligence are discussed, as are sex differences in intelligence and whether twins and singletons differ in intelligence. More generally, the degree to which intelligence has become a part of broader research in neuroscience, health, and social science is discussed. PMID:21943169

  5. A Map for Clinical Laboratories Management Indicators in the Intelligent Dashboard

    PubMed Central

    Azadmanjir, Zahra; Torabi, Mashallah; Safdari, Reza; Bayat, Maryam; Golmahi, Fatemeh

    2015-01-01

    Introduction: The management challenges of clinical laboratories are more complicated in educational hospitals. Managers can use business intelligence (BI) tools, such as information dashboards, that enable intelligent decision-making and problem solving with respect to increasing income, reducing spending, utilization management, and even improving quality. A critical phase of dashboard design is setting indicators and modeling the causal relations between them. This paper describes the process of creating an indicator map for a laboratory dashboard. Methods: The study is part of an action research effort begun in 2012 as an innovation initiative for implementing an intelligent laboratory dashboard. Laboratory management problems in educational hospitals were identified in brainstorming sessions. Then, with regard to these problems, key performance indicators (KPIs) were specified. Results: The indicator map was designed in three layers connected by causal relationships, such that issues measured in the subsequent layers affect issues measured in the first layers. Conclusion: The proposed indicator map can be the basis of performance monitoring. These indicators can be modified and improved during further iterations of the dashboard design process. PMID:26483593

  6. FDEMS Sensing for Automated Intelligent Processing of PMR-15

    NASA Technical Reports Server (NTRS)

    Kranbuehl, David E.; Hood, D. K.; Rogozinski, J.; Barksdale, R.; Loos, Alfred C.; McRae, Doug

    1993-01-01

    The purpose of this grant was to develop frequency dependent dielectric measurements, often called FDEMS (frequency dependent electromagnetic sensing), to monitor and intelligently control the cure process in PMR-15, a stoichiometric mixture of a nadic ester, dimethyl ester, and methylenedianiline in a monomer ratio.

  7. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA type hardware configurations that support improved simulation are investigated. Three direct execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  8. The Role of Intelligence Quotient and Emotional Intelligence in Cognitive Control Processes

    PubMed Central

    Checa, Purificación; Fernández-Berrocal, Pablo

    2015-01-01

    The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals' cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory control cognitive task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), but not self-reported of EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed. PMID:26648901

  9. The Role of Intelligence Quotient and Emotional Intelligence in Cognitive Control Processes.

    PubMed

    Checa, Purificación; Fernández-Berrocal, Pablo

    2015-01-01

    The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals' cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory control cognitive task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), but not self-reported of EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed. PMID:26648901

  10. An Or Processing Multiprocessor System For Artificial Intelligence

    NASA Astrophysics Data System (ADS)

    Fu, Hsin-Chia; Chiang, Cheng-Chin

    1989-03-01

    In this paper, an OR-Parallel Execution model based multiprocessor system is proposed. Our OR-parallel execution model addresses the following features: (1) Run-time Intelligent Backtracking, (2) Distributed process control and execution, (3) Minimization of data communication between processors, and (4) Minimization of parallel processing management overhead. Special hardware modules such as an Intelligent Backtracking Controller and a Forward Execution Controller are designed to further enhance these features at run time. A bus-connected multiprocessor system is designed to exercise the proposed OR-parallel execution model. Recent simulation results indicate that the OR-parallel execution model can be successfully used to conduct the parallel processing of most non-deterministic Prolog applications such as database systems, rule-based expert systems, natural language processing and theorem proving, etc.

  11. Intelligent Signal Processing for Detection System Optimization

    SciTech Connect

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above. In addition, two of six were detected at levels 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.
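
    A rough sketch of the wavelet-plus-neural-network idea in this and the following record: each detector-trace segment is decomposed with a discrete wavelet transform and the sub-band energies feed a small neural network that decides whether a compound peak is present. The synthetic signals, wavelet choice, and network size are illustrative assumptions, not the authors' configuration; PyWavelets and scikit-learn are assumed available.

        import numpy as np
        import pywt
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(3)
        t = np.linspace(0.0, 1.0, 256)

        def segment(has_peak):
            """Noisy detector trace, optionally with a small Gaussian compound peak."""
            x = rng.normal(scale=1.0, size=t.size)
            if has_peak:
                x += 2.0 * np.exp(-((t - 0.5) ** 2) / (2 * 0.02 ** 2))
            return x

        def wavelet_features(x):
            """Log energy of each wavelet sub-band (Daubechies-4, 4 decomposition levels)."""
            coeffs = pywt.wavedec(x, "db4", level=4)
            return np.log([np.sum(c ** 2) for c in coeffs])

        labels = rng.integers(0, 2, size=400)
        features = np.array([wavelet_features(segment(bool(y))) for y in labels])

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(features[:300], labels[:300])
        print("held-out accuracy:", clf.score(features[300:], labels[300:]))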

  12. Intelligent Signal Processing for Detection System Optimization

    SciTech Connect

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  13. Intelligent Agents for Science Data Processing

    NASA Technical Reports Server (NTRS)

    Golden, Keith

    2003-01-01

    In order to conduct research into global warming, the Earth Observing System Data and Information System (EOSDIS) generates a large and growing volume of data. This viewgraph presentation describes the architecture needed to manage the remote sensing data, and the numerical analysis used to process it.

  14. Speech intelligibility enhancement using hearing-aid array processing.

    PubMed

    Saunders, G H; Kates, J M

    1997-09-01

    Microphone arrays can improve speech recognition in noise for hearing-impaired listeners by suppressing interference arriving from directions other than that of the desired signal. In a previous paper [J. M. Kates and M. R. Weiss, J. Acoust. Soc. Am. 99, 3138-3148 (1996)], several array-processing techniques were evaluated in two rooms using the AI-weighted array gain as the performance metric. The array consisted of five omnidirectional microphones with uniform 2.5-cm spacing, oriented in the endfire direction. In this paper, the speech intelligibility for two of the array-processing techniques, delay-and-sum beamforming and superdirective processing, is evaluated for a group of hearing-impaired subjects. Speech intelligibility was measured using the speech reception threshold (SRT) for spondees and the speech intelligibility rating (SIR) for sentence materials. The array performance is compared with that of a single omnidirectional microphone and a single directional microphone having a cardioid response pattern. The SRT and SIR results show that the superdirective array processing was the most effective, followed by the cardioid microphone, the array using delay-and-sum beamforming, and the single omnidirectional microphone. The relative processing ratings do not appear to be strongly affected by the size of the room, and the SRT values determined using isolated spondees are similar to the SIR values produced from continuous discourse. PMID:9301060
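
    As a concrete illustration of the simpler of the two techniques evaluated above, the sketch below implements a basic delay-and-sum beamformer for a five-element endfire array with 2.5-cm spacing. The sample rate, sound speed, and toy signals are assumptions for illustration; this is not the processing used in the study.

```python
# Delay-and-sum beamforming sketch: each channel is advanced so that a
# wavefront arriving from the endfire (look) direction adds coherently.
import numpy as np

FS = 16_000        # sample rate in Hz (assumed)
C = 343.0          # speed of sound (m/s)
SPACING = 0.025    # element spacing (m), endfire orientation
N_MICS = 5

def delay_and_sum(mic_signals: np.ndarray) -> np.ndarray:
    """mic_signals: (N_MICS, n_samples); mic 0 is closest to the look direction."""
    n = mic_signals.shape[1]
    out = np.zeros(n)
    for m in range(N_MICS):
        # A wavefront from the endfire direction reaches mic m this much later.
        d = int(round(m * SPACING / C * FS))
        # Advance each channel by its delay so the desired signal lines up.
        out[: n - d] += mic_signals[m, d:]
    return out / N_MICS

# Toy example: a 1 kHz target from endfire plus independent noise at each mic.
rng = np.random.default_rng(0)
t = np.arange(0, 0.1, 1 / FS)
target = np.sin(2 * np.pi * 1000 * t)
mics = np.stack([
    np.roll(target, int(round(m * SPACING / C * FS)))
    + 0.5 * rng.standard_normal(t.size)
    for m in range(N_MICS)
])
enhanced = delay_and_sum(mics)
print("residual noise std after beamforming:", np.std((enhanced - target)[:1000]))
```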

  15. Emerging methods for the intelligent processing of materials

    NASA Astrophysics Data System (ADS)

    Garrett, P. H.; Jones, J. G.; Moore, D. C.; Malas, J. C.

    1993-10-01

    Emerging methods, procedures, and performance measures are presented for the design of intelligent materials processing systems that combine both comprehensive new process representations and correspondingly advanced process control systems. The description of these developments is presented in five parts. The first provides the partitioning of global processes into decoupled finite subprocesses for improved accommodation of process nonlinearities with accompanying simplification of control system complexity. The second is sensor/controller/actuator accountability to establish an on-line process variability baseline whose greatest sensitivity is attributable to process measurement limitations. Development three combines multiloop control with decoupled subprocesses for enhanced process disorder reduction and improved likelihood of achieving material parameters of interest. The fourth, closely associated with development three, provides accurate multiloop control compensation by identification of decoupled trapezoidal subprocess models. The fifth presents a process description language of qualitative subprocess influences for augmenting incompletely modeled processes and unmeasurable control elements by supervising the control space to minimize control conflicts and process variability.

  16. Sensor fusion for intelligent process control.

    SciTech Connect

    Connors, John J.; Hill, Kevin; Hanekamp, David; Haley, William F.; Gallagher, Robert J.; Gowin, Craig; Farrar, Arthur R.; Sheaffer, Donald A.; DeYoung, Mark A.; Bertram, Lee A.; Dodge, Craig; Binion, Bruce; Walsh, Peter M.; Houf, William G.; Desam, Padmabhushana R.; Tiwary, Rajiv; Stokes, Michael R.; Miller, Alan J.; Michael, Richard W.; Mayer, Raymond M.; Jiao, Yu; Smith, Philip J.; Arbab, Mehran; Hillaire, Robert G.

    2004-08-01

    An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. In spite of the fact that the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation

  17. Use of artificial intelligence in analytical systems for the clinical laboratory.

    PubMed

    Place, J F; Truchaud, A; Ozawa, K; Pardue, H; Schnipelsky, P

    1995-01-01

    The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation and regulation of the systems already being used in clinical laboratories. PMID:18924784

  18. A conceptual framework for intelligent real-time information processing

    NASA Technical Reports Server (NTRS)

    Schudy, Robert

    1987-01-01

    By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real-time artificial intelligence systems which provides a foundation for system organization, control and validation. The approach is based on the description of system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension which corresponds to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape, with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory-motor control, rule-based behavior, and satisficing. This approach was used in the design of a real-time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.

  19. Intelligent elevator management system using image processing

    NASA Astrophysics Data System (ADS)

    Narayanan, H. Sai; Karunamurthy, Vignesh; Kumar, R. Barath

    2015-03-01

    In the modern era, the increase in the number of shopping malls and industrial buildings has led to an exponential increase in the usage of elevator systems. Thus there is an increased need for an effective control system to manage the elevator system. This paper is aimed at introducing an effective method to control the movement of the elevators by considering various cases wherein the location of the person is found and the elevators are controlled based on various conditions such as load, proximity, etc. This method continuously monitors the weight limit of each elevator while also making use of image processing to determine the number of persons waiting for an elevator on the respective floors. The Canny edge detection technique is used to find the number of persons waiting for an elevator. Hence the algorithm takes a large number of cases into account and locates the correct elevator to service the respective persons waiting on different floors.
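
    A minimal sketch of the image-processing step described above is given below: Canny edge detection followed by contour counting with OpenCV. The thresholds, morphological kernel, and minimum contour area are assumed values that would need tuning per camera; the sketch is not the paper's implementation.

```python
# Illustrative sketch: estimate how many people are waiting on a floor by
# running Canny edge detection on a camera frame and counting large contours.
import cv2

def count_waiting(frame_path: str, min_area: float = 500.0) -> int:
    gray = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(frame_path)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)           # dual thresholds (assumed)
    # Close small gaps so each person yields one connected outline.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return sum(1 for c in contours if cv2.contourArea(c) >= min_area)

# Example usage with a hypothetical frame from floor 3's camera:
# print(count_waiting("floor3_frame.png"))
```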

  20. Intelligent laser soldering inspection and process control

    NASA Technical Reports Server (NTRS)

    Vanzetti, Riccardo

    1987-01-01

    Component assembly on printed circuitry keeps making giant strides toward denser packaging and smaller dimensions. From a single layer to multilayer, from through holes to surface-mounted components and tape-applied bonds, unrelenting progress results in new, difficult problems in assembling, soldering, inspecting and controlling the manufacturing process of the new electronics. Among the major problems are the variables introduced by human operators. The small dimensions and the tight assembly tolerances are now successfully met by machines which are faster and more precise than the human hand. The same is true for soldering. But visual inspection of the solder joints is now so severely limited by the ever shrinking area accessible to the human eye that the inspector's diagnosis can no longer be trusted. Approaches to correcting these problems are discussed.

  1. Artificial intelligence, expert systems, computer vision, and natural language processing

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  2. An intelligent allocation algorithm for parallel processing

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Homaifar, Abdollah; Ananthram, Kishan G.

    1988-01-01

    The problem of allocating nodes of a program graph to processors in a parallel processing architecture is considered. The algorithm is based on critical path analysis, some allocation heuristics, and the execution granularity of nodes in a program graph. These factors, and the structure of the interprocessor communication network, influence the allocation. To achieve realistic estimates of the execution durations of allocations, the algorithm considers the fact that nodes in a program graph have to communicate through varying numbers of tokens. Coarse and fine granularities have been implemented, with interprocessor token-communication durations varying from zero up to values comparable to the execution durations of individual nodes. The effect of communication network structures on allocation is demonstrated by performing allocations for crossbar (non-blocking) and star (blocking) networks. The algorithm assumes the availability of as many processors as it needs for the optimal allocation of any program graph. Hence, the focus of allocation has been on varying token-communication durations rather than varying the number of processors. The algorithm always utilizes as many processors as necessary for the optimal allocation of any program graph, depending upon granularity and characteristics of the interprocessor communication network.
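
    The critical-path ingredient of such an allocator can be illustrated with a short sketch: the code below computes the longest execution-time path through a toy program graph, charging a fixed token-communication cost per edge. The graph, durations, and cost are illustrative assumptions, not the allocation algorithm described in the paper.

```python
# Critical-path length of a program graph (DAG), processed in topological order.
from collections import defaultdict

def critical_path_length(durations, edges, comm_cost=0.0):
    """durations: {node: exec_time}; edges: (src, dst) precedence pairs.
    Returns the longest path length, adding a fixed inter-processor
    token-communication cost per edge."""
    preds, succs, indeg = defaultdict(list), defaultdict(list), defaultdict(int)
    for u, v in edges:
        preds[v].append(u)
        succs[u].append(v)
        indeg[v] += 1
    ready = [n for n in durations if indeg[n] == 0]
    finish = {}                    # earliest finish time per node
    while ready:
        n = ready.pop()
        start = max((finish[p] + comm_cost for p in preds[n]), default=0.0)
        finish[n] = start + durations[n]
        for s in succs[n]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return max(finish.values())

# Toy program graph: a -> b, a -> c, b -> d, c -> d
durations = {"a": 2.0, "b": 3.0, "c": 1.0, "d": 2.0}
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]
print(critical_path_length(durations, edges, comm_cost=0.5))  # 2 + 0.5 + 3 + 0.5 + 2 = 8.0
```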

  3. Working memory and intelligibility of hearing-aid processed speech

    PubMed Central

    Souza, Pamela E.; Arehart, Kathryn H.; Shen, Jing; Anderson, Melinda; Kates, James M.

    2015-01-01

    Previous work suggested that individuals with low working memory capacity may be at a disadvantage in adverse listening environments, including situations with background noise or substantial modification of the acoustic signal. This study explored the relationship between patient factors (including working memory capacity) and intelligibility and quality of modified speech for older individuals with sensorineural hearing loss. The modification was created using a combination of hearing aid processing [wide-dynamic range compression (WDRC) and frequency compression (FC)] applied to sentences in multitalker babble. The extent of signal modification was quantified via an envelope fidelity index. We also explored the contribution of components of working memory by including measures of processing speed and executive function. We hypothesized that listeners with low working memory capacity would perform more poorly than those with high working memory capacity across all situations, and would also be differentially affected by high amounts of signal modification. Results showed a significant effect of working memory capacity for speech intelligibility, and an interaction between working memory, amount of hearing loss and signal modification. Signal modification was the major predictor of quality ratings. These data add to the literature on hearing-aid processing and working memory by suggesting that the working memory-intelligibility effects may be related to aggregate signal fidelity, rather than to the specific signal manipulation. They also suggest that for individuals with low working memory capacity, sensorineural loss may be most appropriately addressed with WDRC and/or FC parameters that maintain the fidelity of the signal envelope. PMID:25999874

  4. Working memory and intelligibility of hearing-aid processed speech.

    PubMed

    Souza, Pamela E; Arehart, Kathryn H; Shen, Jing; Anderson, Melinda; Kates, James M

    2015-01-01

    Previous work suggested that individuals with low working memory capacity may be at a disadvantage in adverse listening environments, including situations with background noise or substantial modification of the acoustic signal. This study explored the relationship between patient factors (including working memory capacity) and intelligibility and quality of modified speech for older individuals with sensorineural hearing loss. The modification was created using a combination of hearing aid processing [wide-dynamic range compression (WDRC) and frequency compression (FC)] applied to sentences in multitalker babble. The extent of signal modification was quantified via an envelope fidelity index. We also explored the contribution of components of working memory by including measures of processing speed and executive function. We hypothesized that listeners with low working memory capacity would perform more poorly than those with high working memory capacity across all situations, and would also be differentially affected by high amounts of signal modification. Results showed a significant effect of working memory capacity for speech intelligibility, and an interaction between working memory, amount of hearing loss and signal modification. Signal modification was the major predictor of quality ratings. These data add to the literature on hearing-aid processing and working memory by suggesting that the working memory-intelligibility effects may be related to aggregate signal fidelity, rather than to the specific signal manipulation. They also suggest that for individuals with low working memory capacity, sensorineural loss may be most appropriately addressed with WDRC and/or FC parameters that maintain the fidelity of the signal envelope. PMID:25999874

  5. Intelligence.

    PubMed

    Sternberg, Robert J

    2012-09-01

    Intelligence is the ability to learn from past experience and, in general, to adapt to, shape, and select environments. Aspects of intelligence are measured by standardized tests of intelligence. Average raw (number-correct) scores on such tests vary across the life span and also across generations, as well as across ethnic and socioeconomic groups. Intelligence can be understood in part in terms of the biology of the brain-especially with regard to the functioning in the prefrontal cortex. Measured values correlate with brain size, at least within humans. The heritability coefficient (ratio of genetic to phenotypic variation) is between 0.4 and 0.8. But genes always express themselves through environment. Heritability varies as a function of a number of factors, including socioeconomic status and range of environments. Racial-group differences in measured intelligence have been reported, but race is a socially constructed rather than biological variable. As a result, these differences are difficult to interpret. Different cultures have different conceptions of the nature of intelligence, and also require different skills in order to express intelligence in the environment. WIREs Cogn Sci 2012 doi: 10.1002/wcs.1193 For further resources related to this article, please visit the WIREs website. PMID:26302705

  6. Processing of x-ray image in the intelligent setting system for fracture

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Zhang, Liyong; Liu, Sijiu; Yu, Zhiguo

    2006-11-01

    An intelligent setting system based on biomechanics and bone fracture therapy can accomplish minimally invasive, intelligent, and highly efficient fracture setting. X-ray images grabbed by a C-shape-arm X-ray machine supply the key data for intelligent setting; processing, analysis, and secure transmission of these images are the core of the system. According to the characteristics shown in the three-dimensional gray-level distribution and in the frequency spectrum of the image, histogram equalization in the space domain and homomorphic filtering in the frequency domain are proposed, respectively, to enhance contrast and sharpness. Building on the mined experience of orthopedics experts, setting of a femoral-neck fracture is decomposed into three discontinuous operations that are reflected in the X-ray images through nine points, six lines, two angles, and one distance, and that can be implemented by the mechanical manipulator and control device in the system. A master-slave reference frame is put forward to supply a stable reference standard for calculating parameters. An encryption method based on a chaotic dynamical system is proposed to ensure image information security during telemedicine-based intelligent fracture setting. Clinical experience showed that the system can help orthopedists complete fracture setting correctly and reliably.
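
    The two enhancement steps mentioned above can be sketched generically as follows; the Gaussian high-emphasis gains and cutoff are illustrative assumptions, and the code is a generic illustration rather than the system's actual processing.

```python
# Sketch: histogram equalization for contrast, then a simple homomorphic
# (log-domain high-emphasis) filter for sharpness.
import numpy as np
import cv2

def enhance_xray(img_u8: np.ndarray) -> np.ndarray:
    # 1) Contrast: histogram equalization in the spatial domain.
    eq = cv2.equalizeHist(img_u8)

    # 2) Sharpness: homomorphic filtering in the frequency domain.
    log_img = np.log1p(eq.astype(np.float64))
    spec = np.fft.fftshift(np.fft.fft2(log_img))
    rows, cols = eq.shape
    y, x = np.ogrid[:rows, :cols]
    d2 = (y - rows / 2) ** 2 + (x - cols / 2) ** 2
    # Gaussian high-emphasis filter: attenuate illumination (low frequencies),
    # boost reflectance/detail (high frequencies). Gains are illustrative.
    gamma_l, gamma_h, d0 = 0.5, 1.5, 30.0
    H = (gamma_h - gamma_l) * (1 - np.exp(-d2 / (2 * d0 ** 2))) + gamma_l
    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(H * spec)))
    out = np.expm1(filtered)
    return cv2.normalize(out, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
```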

  7. Developing an intelligence analysis process through social network analysis

    NASA Astrophysics Data System (ADS)

    Waskiewicz, Todd; LaMonica, Peter

    2008-04-01

    Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
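
    A toy sketch of this kind of network-measure screening is shown below using NetworkX; the entity graph, the choice of measures, and the top-k cutoff are illustrative assumptions and do not reproduce the ORA or TMODS computations.

```python
# Score entities in a toy relationship graph with degree and betweenness
# centrality, and surface the highest-scoring candidates for closer analysis.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
    ("D", "E"), ("D", "F"), ("E", "F"), ("F", "G"),
])

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

def top_k(scores, k=3):
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Nodes scoring high on betweenness act as brokers between groups and are
# candidate centers of gravity / critical vulnerabilities for further review.
print("highest degree:     ", top_k(degree))
print("highest betweenness:", top_k(betweenness))
```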

  8. Image processing of false identity documents for forensic intelligence.

    PubMed

    Talbot-Wright, Benjamin; Baechler, Simon; Morelato, Marie; Ribaux, Olivier; Roux, Claude

    2016-06-01

    Forensic intelligence has recently gathered increasing attention as a potential expansion of forensic science that may contribute in a wider policing and security context. Whilst the new avenue is certainly promising, relatively few attempts to incorporate models, methods and techniques into practical projects are reported. This work reports a practical application of a generalised and transversal framework for developing forensic intelligence processes referred to here as the Transversal model adapted from previous work. Visual features present in the images of four datasets of false identity documents were systematically profiled and compared using image processing for the detection of a series of modus operandi (M.O.) actions. The nature of these series and their relation to the notion of common source was evaluated with respect to alternative known information and inferences drawn regarding respective crime systems. 439 documents seized by police and border guard authorities across 10 jurisdictions in Switzerland with known and unknown source level links formed the datasets for this study. Training sets were developed based on both known source level data, and visually supported relationships. Performance was evaluated through the use of intra-variability and inter-variability scores drawn from over 48,000 comparisons. The optimised method exhibited significant sensitivity combined with strong specificity and demonstrates its ability to support forensic intelligence efforts. PMID:27081791

  9. Intelligent On-Board Processing in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Tanner, S.

    2005-12-01

    Most existing sensing systems are designed as passive, independent observers. They are rarely aware of the phenomena they observe, and are even less likely to be aware of what other sensors are observing within the same environment. Increasingly, intelligent processing of sensor data is taking place in real-time, using computing resources on-board the sensor or the platform itself. One can imagine a sensor network consisting of intelligent and autonomous space-borne, airborne, and ground-based sensors. These sensors will act independently of one another, yet each will be capable of both publishing and receiving sensor information, observations, and alerts among other sensors in the network. Furthermore, these sensors will be capable of acting upon this information, perhaps altering acquisition properties of their instruments, changing the location of their platform, or updating processing strategies for their own observations to provide responsive information or additional alerts. Such autonomous and intelligent sensor networking capabilities provide significant benefits for collections of heterogeneous sensors within any environment. They are crucial for multi-sensor observations and surveillance, where real-time communication with external components and users may be inhibited, and the environment may be hostile. In all environments, mission automation and communication capabilities among disparate sensors will enable quicker response to interesting, rare, or unexpected events. Additionally, an intelligent network of heterogeneous sensors provides the advantage that all of the sensors can benefit from the unique capabilities of each sensor in the network. The University of Alabama in Huntsville (UAH) is developing a unique approach to data processing, integration and mining through the use of the Adaptive On-Board Data Processing (AODP) framework. AODP is a key foundation technology for autonomous internetworking capabilities to support situational awareness by

  10. Intelligence

    PubMed Central

    Sternberg, Robert J.

    2012-01-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  11. Intelligence.

    PubMed

    Sternberg, Robert J

    2012-03-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain-especially with regard to the functioning in the prefrontal cortex-and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  12. Intelligent Processing Equipment Research Supported by the National Science Foundation

    NASA Technical Reports Server (NTRS)

    Rao, Suren B.

    1992-01-01

    The research in progress on processes, workstations, and systems has the goal of developing a high level of understanding of the issues involved. This will enable the incorporation of a level of intelligence that will allow the creation of autonomous manufacturing systems that operate in an optimum manner, under a wide range of conditions. The emphasis of the research has been on the development of highly productive and flexible techniques to address current and future problems in manufacturing and processing. Several of these projects have resulted in well-defined and established models that can now be implemented in the application arena in the next few years.

  13. Quantity Intelligent Reckoning for Packaged Granary Grain Based on Image Processing

    NASA Astrophysics Data System (ADS)

    Lin, Ying; Liu, Yong; Sun, Yueheng; Sun, Yanhong

    This paper presents a quantity intelligent reckoning approach for packaged granary grain based on image processing. The actual scene video was taken as the analysis object, and the dual-threshold Canny operator and the morphology processing method are used to extract the object grain bags’ characteristic outline-- the boundary of the counter-band of light. Then, a counting algorithm which integrates mode theory and variance analysis technology is presented for the quantity second-judgment. Experimental results show that by accurately extracting the characteristic outline and counting the number of the characteristic outline, the algorithm presents an effective method for grain quantity detection with high recognition precision and efficiency.

  14. Quantity Intelligent Reckoning for Packaged Granary Grain Based on Image Processing

    NASA Astrophysics Data System (ADS)

    Lin, Ying; Liu, Yong; Sun, Yueheng; Sun, Yanhong

    This paper presents a quantity intelligent reckoning approach for packaged granary grain based on image processing. The actual scene video was taken as the analysis object, and the dual-threshold Canny operator and the morphology processing method are used to extract the object grain bags' characteristic outline-- the boundary of the counter-band of light. Then, a counting algorithm which integrates mode theory and variance analysis technology is presented for the quantity second-judgment. Experimental results show that by accurately extracting the characteristic outline and counting the number of the characteristic outline, the algorithm presents an effective method for grain quantity detection with high recognition precision and efficiency.

  15. Intelligent processing equipment research supported by the National Science Foundation

    NASA Astrophysics Data System (ADS)

    Rao, Suren B.

    1992-04-01

    The research in progress on processes, workstations, and systems has the goal of developing a high level of understanding of the issues involved. This will enable the incorporation of a level of intelligence that will allow the creation of autonomous manufacturing systems that operate in an optimum manner, under a wide range of conditions. The emphasis of the research has been on the development of highly productive and flexible techniques to address current and future problems in manufacturing and processing. Several of these projects have resulted in well-defined and established models that can now be implemented in the application arena in the next few years.

  16. Ramp Technology and Intelligent Processing in Small Manufacturing

    NASA Technical Reports Server (NTRS)

    Rentz, Richard E.

    1992-01-01

    To address the issues of excessive inventories and increasing procurement lead times, the Navy is actively pursuing flexible computer integrated manufacturing (FCIM) technologies, integrated by communication networks to respond rapidly to its requirements for parts. The Rapid Acquisition of Manufactured Parts (RAMP) program, initiated in 1986, is an integral part of this effort. The RAMP program's goal is to reduce the current average production lead times experienced by the Navy's inventory control points by a factor of 90 percent. The manufacturing engineering component of the RAMP architecture utilizes an intelligent processing technology built around a knowledge-based shell provided by ICAD, Inc. Rules and data bases in the software simulate an expert manufacturing planner's knowledge of shop processes and equipment. This expert system can use Product Data Exchange using STEP (PDES) data to determine what features the required part has, what material is required to manufacture it, what machines and tools are needed, and how the part should be held (fixtured) for machining, among other factors. The program's rule base then indicates, for example, how to make each feature, in what order to make it, and to which machines on the shop floor the part should be routed for processing. This information becomes part of the shop work order. The process planning function under RAMP greatly reduces the time and effort required to complete a process plan. Since the PDES file that drives the intelligent processing is 100 percent complete and accurate to start with, the potential for costly errors is greatly diminished.

  17. Clinical-HINTS: integrated intelligent ICU patient monitoring and information management system.

    PubMed

    Kalogeropoulos, D; Carson, E R; Collinson, P O

    1997-01-01

    Clinical-HINTS (Health Intelligence System) is a horizontally integrated decision support system (DSS) designed to meet the requirements for intelligent real-time clinical information management in critical care medical environments and to lay the foundation for the development of the next generation of intelligent medical instrumentation. The system presented was developed to refine and complement the information yielded by clinical laboratory investigations, thereby benefiting the management of the intensive care unit (ICU) patient. More specifically, Clinical-HINTS was developed to provide computer-based assistance with the acquisition, organisation and display, storage and retrieval, communication and generation of real-time patient-specific clinical information in an ICU. Clinical-HINTS is an object-oriented system developed in C++ to run under Microsoft Windows as an embryo intelligent agent. Current generic reasoning skills include perception and reactive cognition of patient status but exclude therapeutic action. The system monitors the patient by communicating with the available sources of data and uses generic reasoning skills to generate intelligent alarms, or HINTS, on various levels of interpretation of an observed dysfunction, even in the presence of complex disorders. The system's communication and information management capabilities are used to acquire physiological data, and to store them along with their interpretations and any interventions for the dynamic recognition of interrelated pathophysiological states or clinical events. PMID:10179800

  18. Process Orchestration With Modular Software Applications On Intelligent Field Devices

    NASA Astrophysics Data System (ADS)

    Orfgen, Marius; Schmitt, Mathias

    2015-07-01

    The method developed by the DFKI-IFS for extending the functionality of intelligent field devices through the use of reloadable software applications (so-called Apps) is to be further augmented with a methodology and communication concept for process orchestration. The concept allows individual Apps from different manufacturers to decentrally share information. This way of communicating forms the basis for the dynamic orchestration of Apps to complete processes, in that it allows the actions of one App (e.g. detecting a component part with a sensor App) to trigger reactions in other Apps (e.g. triggering the processing of that component part). A holistic methodology and its implementation as a configuration tool allows one to model the information flow between Apps, as well as automatically introduce it into physical production hardware via available interfaces provided by the Field Device Middleware. Consequently, configuring industrial facilities is made simpler, resulting in shorter changeover and shutdown times.

  19. Ambient intelligence for monitoring and research in clinical neurophysiology and medicine: the MIMERICA* project and prototype.

    PubMed

    Pignolo, L; Riganello, F; Dolce, G; Sannita, W G

    2013-04-01

    Ambient Intelligence (AmI) provides extended but unobtrusive sensing and computing devices and ubiquitous networking for human/environment interaction. It is a new paradigm in information technology compliant with the international Integrating Healthcare Enterprise board (IHE) and eHealth HL7 technological standards in the functional integration of biomedical domotics and informatics in hospital and home care. AmI allows real-time automatic recording of biological/medical information and environmental data. It is extensively applicable to patient monitoring, medicine and neuroscience research, which require large biomedical data sets; for example, in the study of spontaneous or condition-dependent variability or chronobiology. In this respect, AmI is equivalent to a traditional laboratory for data collection and processing, with minimal dedicated equipment, staff, and costs; it benefits from the integration of artificial intelligence technology with traditional/innovative sensors to monitor clinical or functional parameters. A prototype AmI platform (MIMERICA*) has been implemented and is operated in a semi-intensive unit for the vegetative and minimally conscious states, to investigate the spontaneous or environment-related fluctuations of physiological parameters in these conditions. PMID:23545248

  20. Integrating artificial and human intelligence into tablet production process.

    PubMed

    Gams, Matjaž; Horvat, Matej; Ožek, Matej; Luštrek, Mitja; Gradišek, Anton

    2014-12-01

    We developed a new machine learning-based method in order to facilitate the manufacturing processes of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines the data, available from prior production runs, with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows production operator to inspect the impacts of various settings of process parameters within their proven acceptable range with the purpose of choosing the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with some other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data. PMID:24970587

  1. Artificial Intelligence and Expert Systems: Will They Change the Library? Papers Presented at the Annual Clinic on Library Applications of Data Processing (27th, Urbana, Illinois, March 25-27, 1990).

    ERIC Educational Resources Information Center

    Lancaster, F. W., Ed.; Smith, Linda C., Ed.

    Some of the 12 conference papers presented in this proceedings focus on the present and potential capabilities of artificial intelligence and expert systems as they relate to a wide range of library applications, including descriptive cataloging, technical services, collection development, subject indexing, reference services, database searching,…

  2. US Department of Energy's Efforts in Intelligent Processing Equipment

    NASA Technical Reports Server (NTRS)

    Peavy, Richard D.; Mcfarland, Janet C.

    1992-01-01

    The Department of Energy (DOE) uses intelligent processing equipment (IPE) technologies to conduct research and development and manufacturing for energy and nuclear weapons programs. This paper highlights several significant IPE efforts underway in DOE. IPE technologies are essential to the accomplishment of DOE's missions, because of the need for small lot production, precision, and accuracy in manufacturing, hazardous waste management, and protection of the environment and the safety and health of the workforce and public. Applications of IPE technologies include environmental remediation and waste handling, advanced manufacturing, and automation of tasks carried out in hazardous areas. DOE laboratories have several key programs that integrate robotics, sensor, and control technologies. These programs embody a considerable technical capability that also may be used to enhance U.S. industrial competitiveness. DOE encourages closer cooperation with U.S. industrial partners based on mutual benefits. This paper briefly describes technology transfer mechanisms available for industrial involvement.

  3. Parameter tuning of PVD process based on artificial intelligence technique

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Owing to its previous application to similar optimization problems, the genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from GA's optimization result is expected to produce the desirable zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results from the computational experiment were then compared with the actual results from the laboratory experiment. Based on the comparison, GA proved reliable for optimizing the parameter combination before the parameter tuning is applied to the RF magnetron sputtering machine. To verify the result of GA, the algorithm was also compared with other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that GA was reliable in solving this RF magnetron sputtering process parameter tuning problem and achieved better accuracy in the optimization based on the fitness evaluation.
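
    A minimal sketch of genetic-algorithm parameter tuning over the three process parameters named above is given below. The fitness function is a made-up surrogate for film quality, and the parameter bounds, population size, and operators are illustrative assumptions rather than the study's actual setup.

```python
# Simple real-valued GA: tournament selection, uniform crossover, Gaussian
# mutation, clipped to the (assumed) parameter bounds.
import numpy as np

rng = np.random.default_rng(1)
BOUNDS = np.array([[50.0, 300.0],     # RF power (W), assumed range
                   [10.0, 120.0],     # deposition time (min), assumed range
                   [25.0, 400.0]])    # substrate temperature (C), assumed range

def fitness(x):
    # Surrogate quality model: peaks at a made-up optimum combination.
    target = np.array([200.0, 60.0, 300.0])
    return -np.sum(((x - target) / (BOUNDS[:, 1] - BOUNDS[:, 0])) ** 2)

def run_ga(pop_size=30, generations=50, mutation_scale=0.1):
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 3))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # Tournament selection: keep the better of two random individuals.
        idx = rng.integers(pop_size, size=(pop_size, 2))
        parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]],
                               idx[:, 0], idx[:, 1])]
        # Uniform crossover followed by Gaussian mutation within bounds.
        mates = parents[rng.permutation(pop_size)]
        mask = rng.random(parents.shape) < 0.5
        children = np.where(mask, parents, mates)
        children += rng.normal(scale=mutation_scale * (BOUNDS[:, 1] - BOUNDS[:, 0]),
                               size=children.shape)
        pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])
    return pop[np.argmax([fitness(ind) for ind in pop])]

print("best parameter combination found:", run_ga())
```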

  4. Natural language processing in an intelligent writing strategy tutoring system.

    PubMed

    McNamara, Danielle S; Crossley, Scott A; Roscoe, Rod

    2013-06-01

    The Writing Pal is an intelligent tutoring system that provides writing strategy training. A large part of its artificial intelligence resides in the natural language processing algorithms to assess essay quality and guide feedback to students. Because writing is often highly nuanced and subjective, the development of these algorithms must consider a broad array of linguistic, rhetorical, and contextual features. This study assesses the potential for computational indices to predict human ratings of essay quality. Past studies have demonstrated that linguistic indices related to lexical diversity, word frequency, and syntactic complexity are significant predictors of human judgments of essay quality but that indices of cohesion are not. The present study extends prior work by including a larger data sample and an expanded set of indices to assess new lexical, syntactic, cohesion, rhetorical, and reading ease indices. Three models were assessed. The model reported by McNamara, Crossley, and McCarthy (Written Communication 27:57-86, 2010) including three indices of lexical diversity, word frequency, and syntactic complexity accounted for only 6% of the variance in the larger data set. A regression model including the full set of indices examined in prior studies of writing predicted 38% of the variance in human scores of essay quality with 91% adjacent accuracy (i.e., within 1 point). A regression model that also included new indices related to rhetoric and cohesion predicted 44% of the variance with 94% adjacent accuracy. The new indices increased accuracy but, more importantly, afford the means to provide more meaningful feedback in the context of a writing tutoring system. PMID:23055164
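
    The kind of evaluation reported above can be sketched briefly: regress human essay scores on computational text indices and report variance explained together with adjacent accuracy (predictions within one point). The indices, score scale, and data below are synthetic placeholders, not the Writing Pal feature set.

```python
# Fit a linear regression from text indices to human essay scores and report
# R^2 plus adjacent accuracy; all data here are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_essays, n_indices = 300, 8
X = rng.normal(size=(n_essays, n_indices))        # stand-in linguistic indices
true_w = rng.normal(size=n_indices)
scores = np.clip(np.round(3.5 + X @ true_w
                          + rng.normal(scale=1.0, size=n_essays)), 1, 6)

model = LinearRegression().fit(X, scores)
pred = model.predict(X)
r2 = model.score(X, scores)
adjacent = np.mean(np.abs(pred - scores) <= 1.0)   # within 1 point of the rater
print(f"R^2 = {r2:.2f}, adjacent accuracy = {adjacent:.2%}")
```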

  5. ISMAC: An Intelligent System for Customized Clinical Case Management and Analysis

    PubMed Central

    You, Mingyu; Chen, Chong; Li, Guo-Zheng; Yan, Shi-Xing; Sun, Sheng; Zeng, Xue-Qiang; Zhao, Qing-Ce; Xu, Liao-Yu; Huang, Su-Ying

    2015-01-01

    Clinical cases are primary and vital evidence for Traditional Chinese Medicine (TCM) clinical research. A great deal of medical knowledge is hidden in the clinical cases of highly experienced TCM practitioners. With a deep Chinese cultural background and years of clinical experience, an experienced TCM specialist usually has his or her unique clinical pattern and diagnostic approach. Preserving the large volume of clinical cases of experienced TCM practitioners, as well as exploring the inherent knowledge, is therefore an important but arduous task. The novel system ISMAC (Intelligent System for Management and Analysis of Clinical Cases in TCM) is designed and implemented for customized management and intelligent analysis of TCM clinical data. Customized templates with standard and expert-standard symptoms, diseases, syndromes, and Chinese Medicine Formulas (CMF) are constructed in ISMAC, according to the clinical diagnosis and treatment characteristics of each TCM specialist. With these templates, clinical cases are archived in order to maintain their original characteristics. Various data analysis and mining methods, grouped as Basic Analysis, Association Rule, Feature Reduction, Cluster, Pattern Classification, and Pattern Prediction, are implemented in the system. With a flexible dataset retrieval mechanism, ISMAC is a powerful and convenient system for clinical case analysis and clinical knowledge discovery. PMID:26495425

  6. The Relationship between Emotional Intelligence, Self-Efficacy, and Clinical Performance in Associate Degree Nursing Students

    ERIC Educational Resources Information Center

    Rice, Eileen W.

    2013-01-01

    The purpose of this study was to explore self-efficacy, an individual's beliefs about his or her ability to perform a series of tasks, and emotional intelligence, an individual's ability to perceive, use, understand, and manage emotions, as predictors for successful clinical performance in nursing students. The participants were 49 female and 7…

  7. The Nature of Social Intelligence: Processes and Outcomes.

    ERIC Educational Resources Information Center

    Ford, Martin E.

    Although many people have studied social intelligence and theorized about it over the past 60 years, no one has been able to provide a clear picture of its nature. Traditional methods have overemphasized the social-cognitive outcomes of human functioning instead of social-behavioral outcomes. Two approaches used to study social intelligence can be…

  8. Developmental Process Model for the Java Intelligent Tutoring System

    ERIC Educational Resources Information Center

    Sykes, Edward

    2007-01-01

    The Java Intelligent Tutoring System (JITS) was designed and developed to support the growing trend of Java programming around the world. JITS is an advanced web-based personalized tutoring system that is unique in several ways. Most programming Intelligent Tutoring Systems require the teacher to author problems with corresponding solutions. JITS,…

  9. Sensor Driven Intelligent Control System For Plasma Processing

    SciTech Connect

    Bell, G.; Campbell, V.B.

    1998-02-23

    This Cooperative Research and Development Agreement (CRADA) between Innovative Computing Technologies, Inc. (IC Tech) and Martin Marietta Energy Systems (MMES) was undertaken to contribute to improved process control for microelectronic device fabrication. Process data from an amorphous silicon thin film deposition experiment were acquired to validate the performance of an intelligent, adaptive, neurally-inspired control software module designed to provide closed-loop control of plasma processing machines used in the microelectronics industry. Data acquisition software was written using LabVIEW. The data were collected from an inductively coupled plasma (ICP) source, which was available for this project through LMES's RF/Microwave Technology Center. Experimental parameters measured were RF power, RF current and voltage on the antenna delivering power to the plasma, hydrogen and silane flow rates, chamber pressure, substrate temperature and H-alpha optical emission. Experimental results obtained were polycrystalline silicon deposition rate, crystallinity, crystallographic orientation and electrical conductivity. Owing to experimental delays resulting from hardware failures, it was not possible to assemble a complete data set for IC Tech use within the time and resource constraints of the CRADA. IC Tech was therefore not able to verify the performance of their existing models and control structures and validate model performance under this CRADA.

  10. Creating "Intelligent" Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, Noel; Taylor, Patrick

    2014-05-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric. Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and
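
    The weighting idea can be sketched with synthetic numbers: score each model by how well it reproduces an observed relationship between two physically linked variables, then weight its projection accordingly. The variables, slopes, and error-to-weight mapping below are illustrative assumptions, not CERES or CMIP5 values.

```python
# Process-based weighting sketch: skill = agreement of a model's OLR-vs-Ts
# regression slope with an "observed" slope; projections are then combined
# with skill-based weights. All numbers are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(42)
obs_slope = 2.0                                   # "observed" OLR-vs-Ts slope
n_models, n_samples = 5, 100

ts = rng.normal(288.0, 2.0, size=(n_models, n_samples))
model_slopes = np.array([1.8, 2.1, 3.0, 2.0, 1.2])
olr = model_slopes[:, None] * (ts - 288.0) + 240.0 + rng.normal(0, 1, ts.shape)
projection = np.array([2.5, 2.8, 4.0, 2.6, 1.5])  # per-model warming (K), synthetic

# Score each model by its process error; smaller error -> larger weight.
fitted = np.array([np.polyfit(ts[m], olr[m], 1)[0] for m in range(n_models)])
error = np.abs(fitted - obs_slope)
weights = np.exp(-error)                          # illustrative error-to-weight map
weights /= weights.sum()

print("equal-weight mean:    ", projection.mean())
print("process-weighted mean:", np.sum(weights * projection))
```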

  11. What is the relationship between emotional intelligence and dental student clinical performance?

    PubMed

    Victoroff, Kristin Zakariasen; Boyatzis, Richard E

    2013-04-01

    Emotional intelligence has emerged as a key factor in differentiating average from outstanding performers in managerial and leadership positions across multiple business settings, but relatively few studies have examined the role of emotional intelligence in the health care professions. The purpose of this study was to examine the relationship between emotional intelligence (EI) and dental student clinical performance. All third- and fourth-year students at a single U.S. dental school were invited to participate. Participation rate was 74 percent (100/136). Dental students' EI was assessed using the Emotional Competence Inventory-University version (ECI-U), a seventy-two-item, 360-degree questionnaire completed by both self and other raters. The ECI-U measured twenty-two EI competencies grouped into four clusters (Self-Awareness, Self-Management, Social Awareness, and Relationship Management). Clinical performance was assessed using the mean grade assigned by clinical preceptors. This grade represents an overall assessment of a student's clinical performance including diagnostic and treatment planning skills, time utilization, preparation and organization, fundamental knowledge, technical skills, self-evaluation, professionalism, and patient management. Additional variables were didactic grade point average (GPA) in Years 1 and 2, preclinical GPA in Years 1 and 2, Dental Admission Test academic average and Perceptual Ability Test scores, year of study, age, and gender. Multiple linear regression analyses were conducted. The Self-Management cluster of competencies (b=0.448, p<0.05) and preclinical GPA (b=0.317, p<0.01) were significantly correlated with mean clinical grade. The Self-Management competencies were emotional self-control, achievement orientation, initiative, trustworthiness, conscientiousness, adaptability, and optimism. In this sample, dental students' EI competencies related to Self-Management were significant predictors of mean clinical grade

  12. Racial Equality in Intelligence: Predictions from a Theory of Intelligence as Processing

    ERIC Educational Resources Information Center

    Fagan, Joseph F.; Holland, Cynthia R.

    2007-01-01

    African-Americans and Whites were asked to solve problems typical of those administered on standard tests of intelligence. Half of the problems were solvable on the basis of information generally available to either race and/or on the basis of information newly learned. Such knowledge did not vary with race. Other problems were only solvable on…

  13. Detection, information fusion, and temporal processing for intelligence in recognition

    SciTech Connect

    Casasent, D.

    1996-12-31

    The use of intelligence in vision recognition draws on many different techniques or tools. This presentation discusses several of these techniques for recognition. The recognition process is generally separated into several steps or stages when implemented in hardware, e.g. detection, segmentation and enhancement, and recognition. Several new distortion-invariant filters, biologically-inspired Gabor wavelet filter techniques, and morphological operations that have been found very useful for detection and clutter rejection are discussed. These are all shift-invariant operations that allow multiple object regions of interest in a scene to be located in parallel. We also discuss new algorithm fusion concepts by which the results from different detection algorithms are combined to reduce detection false alarms; these fusion methods utilize hierarchical processing and fuzzy logic concepts. We have found this to be most necessary, since no single detection algorithm is best for all cases. For the final recognition stage, we describe a new method of representing all distorted versions of different classes of objects and determining the object class and pose that most closely match those of a given input. Besides being efficient in terms of storage and on-line computations required, it overcomes many of the problems that other classifiers have in terms of the required training set size, poor generalization with many hidden-layer neurons, etc. It is also attractive in its ability to reject input regions as clutter (non-objects) and to learn new object descriptions. We also discuss its use in processing a temporal sequence of input images of the contents of each local region of interest. We note how this leads to robust results in which estimation errors in individual frames can be overcome. This seems very practical, since in many scenarios a decision need not be made after only one frame of data, since subsequent frames enter immediately in sequence.
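
    A generic sketch of a Gabor-filter detection front end with morphological clutter rejection, in the spirit of the techniques listed above, is shown below; the kernel sizes, filter parameters, and threshold are assumptions and the code is not the author's implementation.

```python
# Filter an image with a small bank of oriented Gabor kernels, threshold the
# combined response, and clean it up morphologically before ROI extraction.
import cv2
import numpy as np

def gabor_detection_map(gray: np.ndarray) -> np.ndarray:
    responses = []
    for theta in np.arange(0, np.pi, np.pi / 4):          # 4 orientations
        kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        responses.append(cv2.filter2D(gray.astype(np.float32), -1, kernel))
    combined = np.max(np.stack(responses), axis=0)
    # Keep high-energy responses, then use morphological opening to reject
    # isolated clutter pixels.
    mask = (combined > combined.mean() + 2 * combined.std()).astype(np.uint8)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```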

  14. Short-Term Memory and Auditory Processing Disorders: Concurrent Validity and Clinical Diagnostic Markers

    ERIC Educational Resources Information Center

    Maerlender, Arthur

    2010-01-01

    Auditory processing disorders (APDs) are of interest to educators and clinicians, as they impact school functioning. Little work has been completed to demonstrate how children with APDs perform on clinical tests. In a series of studies, standard clinical (psychometric) tests from the Wechsler Intelligence Scale for Children, Fourth Edition…

  15. Development of a user customizable imaging informatics-based intelligent workflow engine system to enhance rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent

    2014-03-01

    Clinical trials usually have a demand to collect, track and analyze multimedia data according to the workflow. Currently, the clinical trial data management requirements are normally addressed with custom-built systems. Challenges occur in the workflow design within different trials. The traditional pre-defined custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user customizable imaging informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials with intelligent workflow. The intelligent workflow engine provides flexibility in building and tailoring the workflow in various stages of clinical trials. By providing a solution to tailor and automate the workflow, the system will save time and reduce errors for clinical trials. Although our system is designed for clinical trials for rehabilitation, it may be extended to other imaging based clinical trials as well.

  16. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  17. Integrated Intelligent Industrial Process Sensing and Control: Applied to and Demonstrated on Cupola Furnaces

    SciTech Connect

    Mohamed Abdelrahman; roger Haggard; Wagdy Mahmoud; Kevin Moore; Denis Clark; Eric Larsen; Paul King

    2003-02-12

    The final goal of this project was the development of a system that is capable of controlling an industrial process effectively through the integration of information obtained through intelligent sensor fusion and intelligent control technologies. The industry of interest in this project was the metal casting industry as represented by cupola iron-melting furnaces. However, the developed technology is of generic type and hence applicable to several other industries. The system was divided into the following four major interacting components: (1) an object-oriented generic architecture to integrate the developed software and hardware components; (2) generic algorithms for intelligent signal analysis and sensor and model fusion; (3) development of a supervisory structure for integration of intelligent sensor fusion data into the controller; and (4) hardware implementation of intelligent signal analysis and fusion algorithms.

  18. International Federation of Clinical Chemistry. Use of artificial intelligence in analytical systems for the clinical laboratory. IFCC Committee on Analytical Systems.

    PubMed

    Place, J F; Truchaud, A; Ozawa, K; Pardue, H; Schnipelsky, P

    1994-12-16

    The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI) both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel-processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of this paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property and that there is a need for better documentation, evaluation and regulation of the systems already being used widely in clinical laboratories. PMID:7889593

  19. The application of neural networks with artificial intelligence technique in the modeling of industrial processes

    SciTech Connect

    Saini, K. K.; Saini, Sanju

    2008-10-07

    Neural networks are a relatively new artificial intelligence technique that emulates the behavior of biological neural systems in digital software or hardware. These networks can automatically 'learn' complex relationships among data. This feature makes the technique very useful in modeling processes for which mathematical modeling is difficult or impossible. The work described here outlines some examples of the application of neural networks as an artificial intelligence technique in the modeling of industrial processes.

  20. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model-based system for the design and validation of welding processes. Key capabilities identified include "right the first time" manufacturing, continuous improvement, and on-line quality assurance.

  1. Intelligence, Information Processing, and Specific Learning Disabilities: A Triarchic Synthesis.

    ERIC Educational Resources Information Center

    Kolligian, John, Jr.; Sternberg, Robert J.

    1987-01-01

    The article describes the triarchic theory of human intelligence, which is composed of three subtheories: componential, experiential, and contextual. Deficient cognitive strategies and inadequate knowledge in certain domains may result from the inability of the learning disabled to selectively encode, compare, and combine information, or from an…

  2. Artificial Intelligence in ADA: Pattern-Directed Processing. Final Report.

    ERIC Educational Resources Information Center

    Reeker, Larry H.; And Others

    To demonstrate to computer programmers that the programming language Ada provides superior facilities for use in artificial intelligence applications, the three papers included in this report investigate the capabilities that exist within Ada for "pattern-directed" programming. The first paper (Larry H. Reeker, Tulane University) is designed to…

  3. Boosting intelligence analysis process and situation awareness using the self-organizing map

    NASA Astrophysics Data System (ADS)

    Kärkkäinen, Anssi P.

    2009-05-01

    Situational awareness is critical on the modern battlefield. A large amount of intelligence information is collected to improve decision-making processes, but in many cases this huge information load can even slow down analysis and decision-making because of the lack of reasonable tools and methods to process information. To improve the decision-making process and situational awareness, much research has been done to analyze and visualize intelligence information automatically. Different data fusion and mining techniques are applied to produce an understandable situational picture. Automated processes are based on a data model which is used in information exchange between war operators. The data model requires formal message structures, which make information processing simpler in many cases. In this paper, generated formal intelligence message data is visualized and analyzed by using the self-organizing map (SOM). The SOM is a widely used neural network model, and it has shown its effectiveness in representing multi-dimensional data in two or three dimensional space. The results show that multidimensional intelligence data can be visualized and classified with this technique. The SOM can be used for monitoring intelligence message data (e.g., for error detection), message classification, and detection of correlations. Thus with the SOM it is possible to speed up the intelligence process and make better and faster decisions.
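
    A minimal SOM in plain NumPy, shown to illustrate how multidimensional message features can be projected onto a 2-D grid as described above. The grid size, decay schedules, and the random feature matrix are illustrative assumptions, not the paper's configuration.

```python
# Minimal self-organizing map sketch (synthetic "message" features).
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((500, 8))            # 500 messages x 8 numeric features (stand-in)
rows, cols, dim = 10, 10, data.shape[1]
weights = rng.random((rows, cols, dim))

def winner(x):
    # Best-matching unit: grid cell whose weight vector is closest to x.
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

for t in range(2000):
    x = data[rng.integers(len(data))]
    wi, wj = winner(x)
    lr = 0.5 * np.exp(-t / 1000)       # decaying learning rate
    sigma = 3.0 * np.exp(-t / 1000)    # decaying neighborhood radius
    for i in range(rows):
        for j in range(cols):
            h = np.exp(-((i - wi) ** 2 + (j - wj) ** 2) / (2 * sigma ** 2))
            weights[i, j] += lr * h * (x - weights[i, j])

# Each message can now be mapped to its best-matching unit for visualization.
print(winner(data[0]))
```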

  4. Life span decrements in fluid intelligence and processing speed predict mortality risk.

    PubMed

    Aichele, Stephen; Rabbitt, Patrick; Ghisletta, Paolo

    2015-09-01

    We examined life span changes in 5 domains of cognitive performance as predictive of mortality risk. Data came from the Manchester Longitudinal Study of Cognition, a 20-plus-year investigation of 6,203 individuals ages 42-97 years. Cognitive domains were general crystallized intelligence, general fluid intelligence, verbal memory, visuospatial memory, and processing speed. Life span decrements were evident across these domains, controlling for baseline performance at age 70 and adjusting for retest effects. Survival analyses stratified by sex and conducted independently by cognitive domain showed that lower baseline performance levels in all domains-and larger life span decrements in general fluid intelligence and processing speed-were predictive of increased mortality risk for both women and men. Critically, analyses of the combined predictive power of cognitive performance variables showed that baseline levels of processing speed (in women) and general fluid intelligence (in men), and decrements in processing speed (in women and in men) and general fluid intelligence (in women), accounted for most of the explained variation in mortality risk. In light of recent evidence from brain-imaging studies, we speculate that cognitive abilities closely linked to cerebral white matter integrity (such as processing speed and general fluid intelligence) may represent particularly sensitive markers of mortality risk. In addition, we presume that greater complexity in cognition-survival associations observed in women (in analyses incorporating all cognitive predictors) may be a consequence of longer and more variable cognitive declines in women relative to men. PMID:26098167
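
    The abstract above describes survival analyses with cognitive levels and decrements as predictors of mortality risk. The sketch below shows that general analysis shape with a Cox proportional-hazards model from the lifelines package; the column names and the synthetic data are assumptions, not the Manchester study's variables.

```python
# Illustrative Cox regression on synthetic "cognition and survival" data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "speed_baseline": rng.normal(0, 1, n),   # standardized baseline processing speed
    "speed_decline": rng.normal(0, 1, n),    # standardized life span decrement
})
hazard = np.exp(-0.3 * df["speed_baseline"] + 0.4 * df["speed_decline"])
df["survival_years"] = rng.exponential(10.0 / hazard)
df["died"] = (df["survival_years"] < 20).astype(int)
df.loc[df["died"] == 0, "survival_years"] = 20.0     # administrative censoring

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_years", event_col="died")
cph.print_summary()   # hazard ratios for baseline level and decline
```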

  5. Semantic querying of relational data for clinical intelligence: a semantic web services-based approach

    PubMed Central

    2013-01-01

    Background Clinical Intelligence, as a research and engineering discipline, is dedicated to the development of tools for data analysis for the purposes of clinical research, surveillance, and effective health care management. Self-service ad hoc querying of clinical data is one desirable type of functionality. Since most of the data are currently stored in relational or similar form, ad hoc querying is problematic as it requires specialised technical skills and the knowledge of particular data schemas. Results A possible solution is semantic querying where the user formulates queries in terms of domain ontologies that are much easier to navigate and comprehend than data schemas. In this article, we are exploring the possibility of using SADI Semantic Web services for semantic querying of clinical data. We have developed a prototype of a semantic querying infrastructure for the surveillance of, and research on, hospital-acquired infections. Conclusions Our results suggest that SADI can support ad-hoc, self-service, semantic queries of relational data in a Clinical Intelligence context. The use of SADI compares favourably with approaches based on declarative semantic mappings from data schemas to ontologies, such as query rewriting and RDFizing by materialisation, because it can easily cope with situations when (i) some computation is required to turn relational data into RDF or OWL, e.g., to implement temporal reasoning, or (ii) integration with external data sources is necessary. PMID:23497556

  6. Design and validation of an intelligent wheelchair towards a clinically-functional outcome

    PubMed Central

    2013-01-01

    Background Many people with mobility impairments, who require the use of powered wheelchairs, have difficulty completing basic maneuvering tasks during their activities of daily living (ADL). In order to provide assistance to this population, robotic and intelligent system technologies have been used to design an intelligent powered wheelchair (IPW). This paper provides a comprehensive overview of the design and validation of the IPW. Methods The main contributions of this work are three-fold. First, we present a software architecture for robot navigation and control in constrained spaces. Second, we describe a decision-theoretic approach for achieving robust speech-based control of the intelligent wheelchair. Third, we present an evaluation protocol motivated by a meaningful clinical outcome, in the form of the Robotic Wheelchair Skills Test (RWST). This allows us to perform a thorough characterization of the performance and safety of the system, involving 17 test subjects (8 non-PW users, 9 regular PW users), 32 complete RWST sessions, 25 total hours of testing, and 9 kilometers of total running distance. Results User tests with the RWST show that the navigation architecture reduced collisions by more than 60% compared to other recent intelligent wheelchair platforms. On the tasks of the RWST, we measured an average decrease of 4% in performance score and 3% in safety score (not statistically significant), compared to the scores obtained with conventional driving model. This analysis was performed with regular users that had over 6 years of wheelchair driving experience, compared to approximately one half-hour of training with the autonomous mode. Conclusions The platform tested in these experiments is among the most experimentally validated robotic wheelchairs in realistic contexts. The results establish that proficient powered wheelchair users can achieve the same level of performance with the intelligent command mode, as with the conventional command mode

  7. Effects of Noise and Speech Intelligibility on Listener Comprehension and Processing Time of Korean-Accented English

    ERIC Educational Resources Information Center

    Wilson, Erin O'Brien; Spaulding, Tammie J.

    2010-01-01

    Purpose: This study evaluated the effects of noise and speech intelligibility on the processing of speech produced from native English; high-intelligibility, Korean-accented English; and moderate-intelligibility, Korean-accented English speakers. Method: Both listener comprehension, determined by accuracy judgment on true/false sentences, and…

  8. Application of Intelligent Agents in a Notice and Takedown Process

    NASA Astrophysics Data System (ADS)

    De Rosa, Alessia; Bartolini, Franco; Piva, Alessandro

    2003-01-01

    The future development of networked multimedia services is conditioned by the achievement of efficient methods to protect data owners against non-authorised copying and redistribution of the material put on the network, to ensure that the Intellectual Property Rights (IPR) are well respected and the assets properly managed. A Notice and Takedown Procedure is considered, based on a self-regulatory regime, and as a possible implementation of this system, an Intelligent Agent-based platform is proposed.

  9. Development and Implementation of a Clinical and Business Intelligence System for the Florida Health Data Warehouse

    PubMed Central

    AlHazme, Raed H.; Rana, Arif M.; De Lucca, Michael

    2014-01-01

    Objective To develop and implement a Clinical and Business Intelligence (CBI) system for the Florida Health Data Warehouse (FHDW) in order to bridge the gap between Florida’s healthcare stakeholders and the health data archived in FHDW. Materials and Methods A gap analysis study has been conducted to evaluate the technological divide between the relevant users and FHDW health data, which is maintained by the Broward Regional Health Planning Council (BRHPC). The study revealed a gap between the health care data and the decision makers that utilize the FHDW data. To bridge the gap, a CBI system was proposed, developed and implemented by BRHPC as a viable solution to address this issue, using the System Development Life Cycle methodology. Results The CBI system was successfully implemented and yielded a number of positive outcomes. In addition to significantly shortening the time required to analyze the health data for decision-making processes, the solution also provided end-users with the ability to automatically track public health parameters. Discussion A large amount of data is collected and stored by various health care organizations at the local, state, and national levels. If utilized properly, such data can go a long way in optimizing health care services. CBI systems provide health care organizations with valuable insights for improving patient care, tracking trends for medical research, and for controlling costs. Conclusion The CBI system has been found quite effective in bridging the gap between Florida’s healthcare stakeholders and FHDW health data. Consequently, the solution has improved the planning and coordination of health care services for the state of Florida. PMID:25379128

  10. Professional competencies in health sciences education: from multiple intelligences to the clinic floor.

    PubMed

    Lane, India F

    2010-03-01

    Nontechnical competencies identified as essential to the health professional's success include ethical behavior, interpersonal, self-management, leadership, business, and thinking competencies. The literature regarding such diverse topics, and the literature regarding "professional success" is extensive and wide-ranging, crossing educational, psychological, business, medical and vocational fields of study. This review is designed to introduce ways of viewing nontechnical competence from the psychology of human capacity to current perspectives, initiatives and needs in practice. After an introduction to the tensions inherent in educating individuals for both biomedical competency and "bedside" or "cageside" manner, the paper presents a brief overview of the major lines of inquiry into intelligence theory and how theories of multiple intelligences can build a foundation for conceptualizing professional and life skills. The discussion then moves from broad concepts of intelligence to more specific workplace skill sets, with an emphasis on professional medical education. This section introduces the research on noncognitive variables in various disciplines, the growing emphasis on competency based education, and the SKA movement in veterinary education. The next section presents the evidence that nontechnical, noncognitive or humanistic skills influence achievement in academic settings, medical education and clinical performance, as well as the challenges faced when educational priorities must be made. PMID:19585247

  11. Research on intelligent detection and processing technology of laser pulse

    NASA Astrophysics Data System (ADS)

    Zhao, Haili; Jiang, Huilin

    2005-01-01

    Addressing the influence of atmospheric turbulence on laser pulse detection, this paper discusses the key factors that affect signal detection. On this basis, the article also examines two key techniques in detail, namely the floating threshold and AGC (Automatic Gain Control), with particular attention to the floating threshold technique. Based on this discussion of intelligent laser pulse detection technology, a low-noise laser pulse detection unit was designed, its performance was tested experimentally, and the correctness of the results was validated.

  12. Intelligent image processing for vegetation classification using multispectral LANDSAT data

    NASA Astrophysics Data System (ADS)

    Santos, Stewart R.; Flores, Jorge L.; Garcia-Torales, G.

    2015-09-01

    We propose an intelligent computational technique for the analysis of vegetation imagery acquired with a multispectral scanner (MSS) sensor. This work focuses on intelligent and adaptive artificial neural network (ANN) methodologies that allow segmentation and classification of spectral remote sensing (RS) signatures, in order to obtain a high resolution map, in which we can delimit the wooded areas and quantify the amount of combustible materials present in these areas. This could provide important information to prevent fires and deforestation of wooded areas. The spectral RS input data, acquired by the MSS sensor, are considered in a random propagation remotely sensed scene with unknown statistics for each Thematic Mapper (TM) band. Performing high-resolution reconstruction and combining these spectral values with neighboring-pixel information from each TM band, we can include contextual information in an ANN. The biggest challenge in conventional classifiers is how to reduce the number of components in the feature vector, while preserving the major information contained in the data, especially when the dimensionality of the feature space is high. Preliminary results show that the Adaptive Modified Neural Network method is a promising and effective spectral method for segmentation and classification in RS images acquired with the MSS sensor.
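
    As a rough illustration of classifying per-pixel spectral vectors with a small feed-forward ANN, the sketch below trains a multilayer perceptron on synthetic band values. The band count, the NDVI-like labeling rule, and the network size are assumptions for demonstration, not the paper's adaptive method or LANDSAT data.

```python
# Sketch: small ANN classifying synthetic multispectral pixel vectors.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.random((1000, 7))                         # 7 spectral bands per pixel (stand-in)
y = np.digitize(X[:, 3] - X[:, 2], [-0.1, 0.1])   # crude NDVI-like split into 3 classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```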

  13. Intelligent process monitoring of multilayer ceramic actuators using high temperature optical fiber displacement sensors

    SciTech Connect

    Gunther, M.F.; Claus, R.O.; Ritter, A.; Tran, T.A.; Greene, J.A.

    1994-12-31

    The Fiber and Electro-Optics Research Center (FEORC) has developed a sensing technique for the intelligent processing of multilayer ceramic actuator (MCA) elements manufactured by the AVX Corporation in Conway, SC. Presented are the results of the fiber optic strain sensor used to monitor the burnout of organic binders from a green actuator sample. The results establish the operation of the short gage length, low finesse Fabry-Perot interferometric strain sensor as a tool for intelligent processing of such ceramic actuator elements. Also presented are the method of sensor operation and post-processing results using the same sensor for tracking actuator performance and hysteresis.

  14. Cognitive Processes in Clinical Practice.

    ERIC Educational Resources Information Center

    Witkin, Stanley L.

    1982-01-01

    Explores the cognitive processes that can lead social workers to make erroneous judgements about clients, and inappropriate practice decisions. Similarities between the assessment and practice methods advocated underscore the notion of practice as a process of systematic exploration and problem solving. (Author/JAC)

  15. Competitive Intelligence.

    ERIC Educational Resources Information Center

    Bergeron, Pierrette; Hiller, Christine A.

    2002-01-01

    Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…

  16. A Comparison of Laboratory and Clinical Working Memory Tests and Their Prediction of Fluid Intelligence

    PubMed Central

    Shelton, Jill T.; Elliott, Emily M.; Hill, B. D.; Calamia, Matthew R.; Gouvier, Wm. Drew

    2010-01-01

    The working memory (WM) construct is conceptualized similarly across domains of psychology, yet the methods used to measure WM function vary widely. The present study examined the relationship between WM measures used in the laboratory and those used in applied settings. A large sample of undergraduates completed three laboratory-based WM measures (operation span, listening span, and n-back), as well as the WM subtests from the Wechsler Adult Intelligence Scale-III and the Wechsler Memory Scale-III. Performance on all of the WM subtests of the clinical batteries shared positive correlations with the lab measures; however, the Arithmetic and Spatial Span subtests shared lower correlations than the other WM tests. Factor analyses revealed that a factor comprising scores from the three lab WM measures and the clinical subtest, Letter-Number Sequencing (LNS), provided the best measurement of WM. Additionally, a latent variable approach was taken using fluid intelligence as a criterion construct to further discriminate between the WM tests. The results revealed that the lab measures, along with the LNS task, were the best predictors of fluid abilities. PMID:20161647
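
    The abstract above describes extracting a latent working-memory factor from several task scores and relating it to fluid intelligence. The sketch below shows one plausible, simplified version of that analysis on synthetic data; the task names, loadings, and the single-factor model are assumptions, not the study's factor solution.

```python
# Sketch: single-factor extraction from synthetic WM scores and correlation with gf.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 300
latent_wm = rng.normal(0, 1, n)                        # unobserved WM ability
df = pd.DataFrame({
    "ospan": latent_wm + rng.normal(0, 0.6, n),        # operation span
    "lspan": latent_wm + rng.normal(0, 0.6, n),        # listening span
    "nback": latent_wm + rng.normal(0, 0.8, n),
    "lns":   latent_wm + rng.normal(0, 0.7, n),        # letter-number sequencing
    "gf":    0.6 * latent_wm + rng.normal(0, 0.8, n),  # fluid-intelligence composite
})

wm_tasks = df[["ospan", "lspan", "nback", "lns"]]
wm_factor = FactorAnalysis(n_components=1).fit_transform(wm_tasks).ravel()
print("r(WM factor, gf) =", round(np.corrcoef(wm_factor, df["gf"])[0, 1], 2))
```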

  17. Cleveland Clinic intelligent mouthguard: a new technology to accurately measure head impact in athletes and soldiers

    NASA Astrophysics Data System (ADS)

    Bartsch, Adam; Samorezov, Sergey

    2013-05-01

    Nearly 2 million Traumatic Brain Injuries (TBI) occur in the U.S. each year, with societal costs approaching $60 billion. Including mild TBI and concussion, TBIs are prevalent in soldiers returning from Iraq and Afghanistan as well as in domestic athletes. Long-term risks of single and cumulative head impact dosage may present in the form of post-traumatic stress disorder (PTSD), depression, suicide, Chronic Traumatic Encephalopathy (CTE), dementia, Alzheimer's and Parkinson's diseases. Quantifying head impact dosage and understanding associated risk factors for the development of long-term sequelae is critical toward developing guidelines for TBI exposure and post-exposure management. The current knowledge gap between head impact exposure and clinical outcomes limits the understanding of underlying TBI mechanisms, including effective treatment protocols and prevention methods for soldiers and athletes. In order to begin addressing this knowledge gap, Cleveland Clinic is developing the "Intelligent Mouthguard" head impact dosimeter. Current testing indicates the Intelligent Mouthguard can quantify linear acceleration with 3% error and angular acceleration with 17% error during impacts ranging from 10 g to 174 g and 850 rad/s² to 10,000 rad/s², respectively. Correlation was high (R² > 0.99 and R² = 0.98, respectively). Near-term development will be geared towards quantifying head impact dosages in vitro and longitudinally in athletes, and towards testing new sensors for possible improved accuracy and reduced bias. Long-term, the IMG may be useful for soldiers when paired with neurocognitive clinical data quantifying resultant TBI functional deficits.

  18. Validity of the Luria-Nebraska Intellectual Processes Scale as a Measure of Adult Intelligence.

    ERIC Educational Resources Information Center

    Prifitera, Aurelio; Ryan, Joseph J.

    1981-01-01

    Investigated the validity of the Luria-Nebraska Intellectual Processes Scale (IPS) as a substitute for the Wechsler Adult Intelligence Scale (WAIS). IPS scores were correlated with the three WAIS IQs, and regression equations were computed to obtain estimated Verbal IQ, Performance IQ, and Full Scale IQ. (Author)

  19. Bibliographic Post-Processing with the TIS Intelligent Gateway: Analytical and Communication Capabilities.

    ERIC Educational Resources Information Center

    Burton, Hilary D.

    TIS (Technology Information System) is an intelligent gateway system capable of performing quantitative evaluation and analysis of bibliographic citations using a set of Process functions. Originally developed by Lawrence Livermore National Laboratory (LLNL) to analyze information retrieved from three major federal databases, DOE/RECON,…

  20. Processing Speed and Intelligence as Predictors of School Achievement: Mediation or Unique Contribution?

    ERIC Educational Resources Information Center

    Dodonova, Yulia A.; Dodonov, Yury S.

    2012-01-01

    The relationships between processing speed, intelligence, and school achievement were analyzed on a sample of 184 Russian 16-year-old students. Two speeded tasks required the discrimination of simple geometrical shapes and the recognition of the presented meaningless figures. Raven's Advanced Progressive Matrices and the verbal subtests of…

  1. Processing Speed, Intelligence, Creativity, and School Performance: Testing of Causal Hypotheses Using Structural Equation Models

    ERIC Educational Resources Information Center

    Rindermann, H.; Neubauer, A. C.

    2004-01-01

    According to the mental speed theory of intelligence, the speed of information processing constitutes an important basis for cognitive abilities. However, the question of how mental speed relates to real-world criteria such as school, academic, or job performance is still unanswered. The aim of the study is to test an indirect speed-factor model in…

  2. Clinical report writing: Process and perspective

    NASA Technical Reports Server (NTRS)

    Ewald, H. R.

    1981-01-01

    Clinical report writing in psychology and psychiatry is addressed. Audience/use analysis and the basic procedures of information gathering, diagnosis, and prognosis are described. Two interlinking processes are involved: the process of creation and the process of communication. Techniques for good report writing are presented.

  3. Intelligent sensor-model automated control of PMR-15 autoclave processing

    NASA Technical Reports Server (NTRS)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    1992-01-01

    An intelligent sensor model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent FM sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time-temperature and pressure.

  4. Intelligent sensor-model automated control of PMR-15 autoclave processing

    NASA Astrophysics Data System (ADS)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    An intelligent sensor model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent FM sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time-temperature and pressure.
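
    The two records above describe a loop in which in-situ sensor readings are compared against model-predicted optimum conditions to drive process decisions. The sketch below is an illustrative closed-loop caricature of that idea on simulated signals; it is not the QPAL shell, the Loos model, or FDEMS, and the target curve, thresholds, and setpoint adjustments are invented for demonstration.

```python
# Toy sensor-vs-model cure-control loop on simulated data.
import math

def model_target(minutes):
    # Hypothetical target degree of cure as a function of elapsed time.
    return 1.0 - math.exp(-minutes / 120.0)

def read_sensor(minutes, lag):
    # Simulated in-situ cure estimate that lags the target by `lag` minutes.
    return model_target(max(0.0, minutes - lag))

setpoint = 180.0                                  # degrees C, illustrative
for minutes in range(0, 240, 15):
    measured, target = read_sensor(minutes, lag=20), model_target(minutes)
    if measured < target - 0.02:
        setpoint += 2.0                           # cure lagging: raise temperature
    elif measured > target + 0.02:
        setpoint -= 2.0                           # cure leading: back off
    print(f"t={minutes:3d} min  cure={measured:.2f}  target={target:.2f}  setpoint={setpoint:.0f}")
```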

  5. Numerical magnitude processing deficits in children with mathematical difficulties are independent of intelligence.

    PubMed

    Brankaer, Carmen; Ghesquière, Pol; De Smedt, Bert

    2014-11-01

    Developmental dyscalculia (DD) is thought to arise from difficulties in the ability to process numerical magnitudes. Most research relied on IQ-discrepancy based definitions of DD and only included individuals with normal IQ, yet little is known about the role of intelligence in the association between numerical magnitude processing and mathematical difficulties (MD). The present study examined numerical magnitude processing in matched groups of 7- to 8-year-olds (n=42) who had either discrepant MD (poor math scores, average IQ), nondiscrepant MD (poor math scores, below-average IQ) or no MD. Both groups of children with MD showed similar impairments in numerical magnitudes processing compared to controls, suggesting that the association between numerical magnitude processing deficits and MD is independent of intelligence. PMID:25036314

  6. Space Shuttle processing - A case study in artificial intelligence

    NASA Technical Reports Server (NTRS)

    Mollikarimi, Cindy; Gargan, Robert; Zweben, Monte

    1991-01-01

    A scheduling system incorporating AI is described and applied to the automated processing of the Space Shuttle. The unique problem of addressing the temporal, resource, and orbiter-configuration requirements of shuttle processing is described with comparisons to traditional project management for manufacturing processes. The present scheduling system is developed to handle the late inputs and complex programs that characterize shuttle processing by incorporating fixed preemptive scheduling, constraint-based simulated annealing, and the characteristics of an 'anytime' algorithm. The Space-Shuttle processing environment is modeled with 500 activities broken down into 4000 subtasks and with 1600 temporal constraints, 8000 resource constraints, and 3900 state requirements. The algorithm is shown to scale to very large problems and maintain anytime characteristics suggesting that an automated scheduling process is achievable and potentially cost-effective.
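
    To make the idea of constraint-based simulated annealing concrete, the sketch below anneals start times for a toy set of activities with precedence constraints and a makespan-plus-penalty cost. The activities, penalty weight, and cooling schedule are invented and many orders of magnitude smaller than the shuttle-processing model described above.

```python
# Toy simulated-annealing scheduler with precedence constraints.
import math, random

random.seed(0)
durations = {"A": 4, "B": 3, "C": 5, "D": 2}
precedence = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]   # X must finish before Y starts

def cost(starts):
    makespan = max(starts[a] + durations[a] for a in durations)
    violations = sum(1 for x, y in precedence if starts[x] + durations[x] > starts[y])
    return makespan + 100 * violations            # heavy penalty on violated constraints

starts = {a: random.randint(0, 20) for a in durations}
temperature = 10.0
for _ in range(5000):
    candidate = dict(starts)
    a = random.choice(list(durations))
    candidate[a] = max(0, candidate[a] + random.randint(-2, 2))
    delta = cost(candidate) - cost(starts)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        starts = candidate                        # accept improving or occasionally worse moves
    temperature = max(0.1, temperature * 0.999)   # gradual cooling

print(starts, "cost:", cost(starts))
```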

  7. Rapid Intelligent Inspection Process Definition for dimensional measurement in advanced manufacturing

    SciTech Connect

    Brown, C.W.

    1993-03-01

    The Rapid Intelligent Inspection Process Definition (RIIPD) project is an industry-led effort to advance computer integrated manufacturing (CIM) systems for the creation and modification of inspection process definitions. The RIIPD project will define, design, develop, and demonstrate an automated tool (i.e., software) to generate inspection process plans and coordinate measuring machine (CMM) inspection programs, as well as produce support information for the dimensional measurement of piece parts. The goal of this project is to make the inspection and part verification function, specifically CMM measurements, a more effective production support tool by reducing inspection process definition flowtime, creating consistent and standard inspections, increasing confidence of measurement results, and capturing inspection expertise. This objective is accomplished through importing STEP geometry definitions, applying solid modeling, incorporating explicit tolerance representations, establishing dimensional inspection techniques, embedding artificial intelligence techniques, and adhering to the Dimensional Measuring Interface Standard (DMIS) national standard.

  8. Business Intelligence Applied to the ALMA Software Integration Process

    NASA Astrophysics Data System (ADS)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you might be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers problem reports relating to software bugs captured during the testing of the software, during the integration of the different components or, even worse, problems that occurred during production. Usually, little time is spent on analyzing them, but with some multidimensional processing you can extract valuable information from them, and it might help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data was processed. The main goal is to assess a software process and get insights from this information.

  9. Team B Intelligence Coups

    ERIC Educational Resources Information Center

    Mitchell, Gordon R.

    2006-01-01

    The 2003 Iraq prewar intelligence failure was not simply a case of the U.S. intelligence community providing flawed data to policy-makers. It also involved subversion of the competitive intelligence analysis process, where unofficial intelligence boutiques "stovepiped" misleading intelligence assessments directly to policy-makers and undercut…

  10. Influence of Family Processes, Motivation, and Beliefs about Intelligence on Creative Problem Solving of Scientifically Talented Individuals

    ERIC Educational Resources Information Center

    Cho, Seokhee; Lin, Chia-Yi

    2011-01-01

    Predictive relationships among perceived family processes, intrinsic and extrinsic motivation, incremental beliefs about intelligence, confidence in intelligence, and creative problem-solving practices in mathematics and science were examined. Participants were 733 scientifically talented Korean students in fourth through twelfth grades as well as…

  11. Differences in Cognitive Processes between Gifted, Intelligent, Creative, and Average Individuals While Solving Complex Problems: An EEG Study.

    ERIC Educational Resources Information Center

    Jausovec, Norbert

    2000-01-01

    Studied differences in cognitive processes related to creativity and intelligence using EEG coherence and power measures in the lower and upper alpha bands. Results of 2 experiments involving 49 and 48 right-handed student teachers suggest that creativity and intelligence are different abilities that also differ in the neurological activity…

  12. QuikForm: Intelligent deformation processing of structural alloys

    SciTech Connect

    Bourcier, R.J.; Wellman, G.W.

    1994-09-01

    There currently exists a critical need for tools to enhance the industrial competitiveness and agility of US industries involved in deformation processing of structural alloys. In response to this need, Sandia National Laboratories has embarked upon the QuikForm Initiative. The goal of this program is the development of computer-based tools to facilitate the design of deformation processing operations. The authors are currently focusing their efforts on the definition/development of a comprehensive system for the design of sheet metal stamping operations. The overall structure of the proposed QuikForm system is presented, and the focus of their thrust in each technical area is discussed.

  13. Intelligent process mapping through systematic improvement of heuristics

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.

    1992-01-01

    The present system for automatic learning/evaluation of novel heuristic methods applicable to the mapping of communication-process sets on a computer network has its basis in the testing of a population of competing heuristic methods within a fixed time-constraint. The TEACHER 4.1 prototype learning system, implemented for learning new postgame analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.

  14. An intelligent advisory system for pre-launch processing

    NASA Technical Reports Server (NTRS)

    Engrand, Peter A.; Mitchell, Tami

    1991-01-01

    The shuttle system of interest in this paper is the shuttle's data processing system (DPS). The DPS is composed of the following: (1) general purpose computers (GPC); (2) a multifunction CRT display system (MCDS); (3) mass memory units (MMU); and (4) a multiplexer/demultiplexer (MDM) and related software. In order to ensure the correct functioning of shuttle systems, some level of automatic error detection has been incorporated into all shuttle systems. For the DPS, error detection equipment has been incorporated into all of its subsystems. The automated diagnostic system, the MCDS diagnostic tool, which aids in more efficient processing of the DPS, is described.

  15. Assessment of Intelligent Processing Equipment in the National Aeronautics and Space Administration, 1991

    NASA Technical Reports Server (NTRS)

    Jones, C. S.

    1992-01-01

    Summarized here is an assessment of intelligent processing equipment (IPE) within NASA. An attempt is made to determine the state of IPE development and research in specific areas where NASA might contribute to the national capability. Mechanisms to transfer NASA technology to the U.S. private sector in this critical area are discussed. It was concluded that intelligent processing equipment is finding extensive use in the manufacture of space hardware, especially in the propulsion components of the shuttle. The major benefits are found in improved process consistency, which lowers cost as it reduces rework. Advanced feedback controls are under development and being implemented gradually into shuttle manufacturing. Implementation is much more extensive in new programs, such as in the advanced solid rocket motor and the Space Station Freedom.

  16. Assessment of intelligent processing equipment in the National Aeronautics and Space Administration, 1991

    NASA Astrophysics Data System (ADS)

    Jones, C. S.

    1992-04-01

    Summarized here is an assessment of intelligent processing equipment (IPE) within NASA. An attempt is made to determine the state of IPE development and research in specific areas where NASA might contribute to the national capability. Mechanisms to transfer NASA technology to the U.S. private sector in this critical area are discussed. It was concluded that intelligent processing equipment is finding extensive use in the manufacture of space hardware, especially in the propulsion components of the shuttle. The major benefits are found in improved process consistency, which lowers cost as it reduces rework. Advanced feedback controls are under development and being implemented gradually into shuttle manufacturing. Implementation is much more extensive in new programs, such as in the advanced solid rocket motor and the Space Station Freedom.

  17. Shared and service-oriented CNC machining system for intelligent manufacturing process

    NASA Astrophysics Data System (ADS)

    Li, Yao; Liu, Qiang; Tong, Ronglei; Cui, Xiaohong

    2015-11-01

    To improve efficiency, reduce cost, and ensure quality, research on CNC machining has focused on virtual machine tools, cloud manufacturing, and wireless manufacturing. However, the low level of information shared among different systems is a common disadvantage. In this paper, a machining database with a data evaluation module is set up to ensure data integrity and currency. An online monitoring system based on the Internet of Things and multiple sensors "feels" a variety of signal features to "perceive" the state of the CNC machining process. A high-efficiency, green machining-parameter optimization system "executes" service-oriented, intelligent, and green manufacturing. The intelligent CNC machining system is applied in production. The CNC machining database effectively shares and manages process data among different systems. The prediction accuracy of the online monitoring system reaches 98.8% by acquiring acceleration and noise signals in real time. The machining-parameter optimization system improves on the original processing parameters, and the calculation indicates that the optimized parameters not only improve production efficiency but also reduce carbon emissions. The application shows that the shared and service-oriented CNC machining system is reliable and effective. This research presents a shared and service-oriented CNC machining system for the intelligent manufacturing process.

  18. Information processing speed mediates the relationship between white matter and general intelligence in schizophrenia.

    PubMed

    Alloza, Clara; Cox, Simon R; Duff, Barbara; Semple, Scott I; Bastin, Mark E; Whalley, Heather C; Lawrie, Stephen M

    2016-08-30

    Several authors have proposed that schizophrenia is the result of impaired connectivity between specific brain regions rather than differences in local brain activity. White matter abnormalities have been suggested as the anatomical substrate for this dysconnectivity hypothesis. Information processing speed may act as a key cognitive resource facilitating higher order cognition by allowing multiple cognitive processes to be simultaneously available. However, there is a lack of established associations between these variables in schizophrenia. We hypothesised that the relationship between white matter and general intelligence would be mediated by processing speed. White matter water diffusion parameters were studied using Tract-based Spatial Statistics and computed within 46 regions-of-interest (ROI). Principal component analysis was conducted on these white matter ROI for fractional anisotropy (FA) and mean diffusivity, and on neurocognitive subtests to extract general factors of white matter structure (gFA, gMD), general intelligence (g) and processing speed (gspeed). There was a positive correlation between g and gFA (r = 0.67, p = 0.001) that was partially and significantly mediated by gspeed (56.22%; CI: 0.10-0.62). These findings suggest a plausible model of structure-function relations in schizophrenia, whereby white matter structure may provide a neuroanatomical substrate for general intelligence, which is partly supported by speed of information processing. PMID:27308721
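
    The mediation reported above (gspeed mediating the gFA–g association) can be illustrated with a simple product-of-coefficients analysis. The sketch below uses synthetic data whose structure mimics that hypothesis; it is not the paper's bootstrap procedure, and the effect sizes are invented.

```python
# Regression-based mediation sketch: gFA -> gspeed -> g, on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60
gFA = rng.normal(0, 1, n)                           # general white-matter FA factor
gspeed = 0.6 * gFA + rng.normal(0, 0.8, n)          # processing speed (mediator)
g = 0.3 * gFA + 0.5 * gspeed + rng.normal(0, 0.7, n)
df = pd.DataFrame({"gFA": gFA, "gspeed": gspeed, "g": g})

a = sm.OLS(df["gspeed"], sm.add_constant(df["gFA"])).fit().params["gFA"]       # path a
full = sm.OLS(df["g"], sm.add_constant(df[["gFA", "gspeed"]])).fit()
b = full.params["gspeed"]                                                      # path b
total = sm.OLS(df["g"], sm.add_constant(df["gFA"])).fit().params["gFA"]        # total effect
print("indirect effect a*b:", round(a * b, 3), " proportion mediated:", round(a * b / total, 3))
```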

  19. Artificial intelligence and signal processing for infrastructure assessment

    NASA Astrophysics Data System (ADS)

    Assaleh, Khaled; Shanableh, Tamer; Yehia, Sherif

    2015-04-01

    The Ground Penetrating Radar (GPR) is being recognized as an effective nondestructive evaluation technique to improve the inspection process. However, data interpretation and the complexity of the results impose some limitations on the practicality of using this technique. This is mainly due to the need for a trained, experienced person to interpret images obtained by the GPR system. In this paper, an algorithm to classify and assess the condition of infrastructure utilizing image processing and pattern recognition techniques is discussed. Features extracted from a dataset of images of defective and healthy slabs are used to train a computer-vision-based system, while another dataset is used to evaluate the proposed algorithm. Initial results show that the proposed algorithm is able to detect the existence of defects with about a 77% success rate.

  20. Intelligent Computational Systems. Opening Remarks: CFD Application Process Workshop

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    1994-01-01

    This discussion will include a short review of the challenges that must be overcome if computational physics technology is to have a larger impact on the design cycles of U.S. aerospace companies. Some of the potential solutions to these challenges may come from the information sciences fields. A few examples of potential computational physics/information sciences synergy will be presented, as motivation and inspiration for the Improving The CFD Applications Process Workshop.

  1. The process of deforestation in weak democracies and the role of Intelligence.

    PubMed

    Obydenkova, Anastassia; Nazarov, Zafar; Salahodjaev, Raufhon

    2016-07-01

    This article examines the interconnection between national intelligence, political institutions, and the mismanagement of public resources (deforestation). The paper examines the reasons for deforestation and investigates the factors accountable for it. The analysis builds on an author-compiled cross-national dataset covering 185 countries over a twenty-year period, from 1990 to 2010. We find, first, that a nation's intelligence significantly reduces the level of deforestation in a state. Moreover, national IQ seems to play an offsetting role in natural resource conservation (forest management) in countries with weak democratic institutions. The analysis also discovered a U-shaped relationship between democracy and deforestation. Intelligence sheds more light on this interconnection and explains the results. Our results are robust to various sample selection strategies and model specifications. The main implication of our study is that intelligence not only shapes formal rules and informal regulations such as social trust, norms and traditions, but also has the ability to reverse the paradoxical process known as the "resource curse." The study contributes to a better understanding of the reasons for deforestation and sheds light on the debated impact of political regime on forest management. PMID:27148671

  2. Suppressive mechanisms in visual motion processing: From perception to intelligence.

    PubMed

    Tadin, Duje

    2015-10-01

    Perception operates on an immense amount of incoming information that greatly exceeds the brain's processing capacity. Because of this fundamental limitation, the ability to suppress irrelevant information is a key determinant of perceptual efficiency. Here, I will review a series of studies investigating suppressive mechanisms in visual motion processing, namely perceptual suppression of large, background-like motions. These spatial suppression mechanisms are adaptive, operating only when sensory inputs are sufficiently robust to guarantee visibility. Converging correlational and causal evidence links these behavioral results with inhibitory center-surround mechanisms, namely those in cortical area MT. Spatial suppression is abnormally weak in several special populations, including the elderly and individuals with schizophrenia-a deficit that is evidenced by better-than-normal direction discriminations of large moving stimuli. Theoretical work shows that this abnormal weakening of spatial suppression should result in motion segregation deficits, but direct behavioral support of this hypothesis is lacking. Finally, I will argue that the ability to suppress information is a fundamental neural process that applies not only to perception but also to cognition in general. Supporting this argument, I will discuss recent research that shows individual differences in spatial suppression of motion signals strongly predict individual variations in IQ scores. PMID:26299386

  3. Structural and incremental validity of the Wechsler Adult Intelligence Scale-Fourth Edition with a clinical sample.

    PubMed

    Nelson, Jason M; Canivez, Gary L; Watkins, Marley W

    2013-06-01

    Structural and incremental validity of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV; Wechsler, 2008a) was examined with a sample of 300 individuals referred for evaluation at a university-based clinic. Confirmatory factor analysis indicated that the WAIS-IV structure was best represented by 4 first-order factors as well as a general intelligence factor in a direct hierarchical model. The general intelligence factor accounted for the most common and total variance among the subtests. Incremental validity analyses indicated that the Full Scale IQ (FSIQ) generally accounted for medium to large portions of academic achievement variance. For all measures of academic achievement, the first-order factors combined accounted for significant achievement variance beyond that accounted for by the FSIQ, but individual factor index scores contributed trivial amounts of achievement variance. Implications for interpreting WAIS-IV results are discussed. PMID:23544395

  4. Outcome Prediction in Clinical Treatment Processes.

    PubMed

    Huang, Zhengxing; Dong, Wei; Ji, Lei; Duan, Huilong

    2016-01-01

    Clinical outcome prediction, which has strong implications for health service delivery in clinical treatment processes (CTPs), is important for both patients and healthcare providers. Prior studies typically use a priori knowledge, such as demographics or patient physical factors, to estimate clinical outcomes at early stages of CTPs (e.g., admission). They lack the ability to deal with the temporal evolution of CTPs. In addition, most of the existing studies employ data mining or machine learning methods to generate a prediction model for a specific type of clinical outcome; however, a mathematical model that predicts multiple clinical outcomes simultaneously has not yet been established. In this study, a hybrid approach is proposed to provide a continuous predictive monitoring service on multiple clinical outcomes. More specifically, a probabilistic topic model is applied to discover underlying treatment patterns of CTPs from electronic medical records. Then, the learned treatment patterns, as low-dimensional features of CTPs, are exploited for clinical outcome prediction across various stages of CTPs based on multi-label classification. The proposal is evaluated to predict three typical classes of clinical outcomes, i.e., length of stay, readmission time, and the type of discharge, using 3,492 patient medical records from the unstable angina CTP, extracted from a Chinese hospital. The stable model was characterized by 84.9% accuracy and 6.4% Hamming loss with 3 latent treatment patterns discovered from data, which outperforms the benchmark multi-label classification algorithms for clinical outcome prediction. Our study indicates the proposed approach can potentially improve the quality of clinical outcome prediction, and assist physicians in understanding patient conditions, treatment interventions, and clinical outcomes in an integrated view. PMID:26573645
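
    The two-stage idea above (topic model over treatment events, then multi-label outcome prediction on the topic features) can be sketched with off-the-shelf components. In the example below the event counts and outcome labels are random stand-ins, so the fit is structural only; the exact topic model, features, and classifiers in the study may differ.

```python
# Hedged two-stage sketch: LDA treatment patterns -> multi-label outcome classifier.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.multioutput import MultiOutputClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
counts = rng.poisson(1.0, size=(300, 50))      # 300 stays x 50 treatment-event types (synthetic)
labels = rng.integers(0, 2, size=(300, 3))     # e.g., long stay / readmission / discharge type (random)

topics = LatentDirichletAllocation(n_components=3, random_state=0)
features = topics.fit_transform(counts)        # low-dimensional "treatment pattern" proportions

clf = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(features, labels)
print(clf.predict(features[:5]))               # one binary prediction per outcome per stay
```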

  5. Network-Capable Application Process and Wireless Intelligent Sensors for ISHM

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Morris, Jon; Turowski, Mark; Wang, Ray

    2011-01-01

    Intelligent sensor technology and systems are increasingly becoming attractive means to serve as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard for adding plug-and-play capabilities for wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS) in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. This concept of virtual TEDS extends the benefits of the standardized TEDS to legacy sensors and applications where the embedded memory is not available. An HTML-based user interface provides a visual tool to interface with those distributed sensors that a TEDS is associated with, to automate the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) can achieve support for intelligent process in Integrated Systems Health Management (ISHM) for the purpose of monitoring, detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, mitigation to maintain operability, and integrated awareness of system health by the operator. It can also support local data collection and storage. This
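
    As a rough illustration of the kind of fields a TEDS record carries, the sketch below defines a plain Python structure for discussion purposes. It is not the IEEE 1451 binary TEDS format, and the field names and values are invented placeholders.

```python
# Illustrative stand-in for TEDS-like sensor metadata (not the IEEE 1451 encoding).
from dataclasses import dataclass

@dataclass
class TransducerSheet:
    manufacturer_id: int
    model_number: int
    serial_number: int
    measurement_unit: str     # e.g., "kPa"
    min_range: float
    max_range: float
    calibration_date: str     # ISO 8601 date string

sensor = TransducerSheet(17, 4021, 98231, "kPa", 0.0, 700.0, "2011-01-15")
print(sensor)                 # metadata a measurement system would read at plug-in
```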

  6. The use of artificial intelligence techniques to improve the multiple payload integration process

    NASA Technical Reports Server (NTRS)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component of successful mission payload integration, then, is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  7. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    NASA Astrophysics Data System (ADS)

    Stoitsis, John; Valavanis, Ioannis; Mougiakakou, Stavroula G.; Golemati, Spyretta; Nikita, Alexandra; Nikita, Konstantina S.

    2006-12-01

    Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.
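
    The fuzzy c-means step used for the plaque example can be sketched in a few lines of NumPy; the random feature matrix below merely stands in for the ANOVA-selected texture and motion features.

      # Minimal fuzzy c-means sketch (two clusters) on stand-in feature vectors.
      import numpy as np

      def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, eps=1e-9, seed=0):
          rng = np.random.default_rng(seed)
          U = rng.random((len(X), c))
          U /= U.sum(axis=1, keepdims=True)              # fuzzy memberships sum to 1 per sample
          for _ in range(n_iter):
              Um = U ** m
              centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
              U = 1.0 / (d ** (2 / (m - 1)))             # standard FCM membership update
              U /= U.sum(axis=1, keepdims=True)
          return centers, U

      X = np.vstack([np.random.default_rng(1).normal(0, 1, (50, 4)),
                     np.random.default_rng(2).normal(3, 1, (50, 4))])
      centers, U = fuzzy_c_means(X)
      labels = U.argmax(axis=1)                          # hard assignment for evaluation
      print(centers.round(2), labels[:5])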

  8. Risk Intelligence: Making Profit from Uncertainty in Data Processing System

    PubMed Central

    Liao, Xiangke; Liu, Xiaodong

    2014-01-01

    In extreme scale data processing systems, fault tolerance is an essential and indispensable part. Proactive fault tolerance schemes (such as the speculative execution in the MapReduce framework) are introduced to dramatically improve the response time of job executions when failure becomes the norm rather than the exception. Efficient proactive fault tolerance schemes require precise knowledge of task executions, which has been an open challenge for decades. To address the issue, in this paper we design and implement RiskI, a profile-based prediction algorithm in conjunction with a risk-aware task assignment algorithm, to accelerate task executions, taking the uncertain nature of tasks into account. Our design demonstrates that this inherent uncertainty brings not only great challenges but also new opportunities; with a careful design, we can benefit from such uncertainties. We implement the idea in Hadoop 0.21.0 and the experimental results show that, compared with the traditional LATE algorithm, the response time can be improved by 46% with the same system throughput. PMID:24883392

  9. Risk intelligence: making profit from uncertainty in data processing system.

    PubMed

    Zheng, Si; Liao, Xiangke; Liu, Xiaodong

    2014-01-01

    In extreme scale data processing systems, fault tolerance is an essential and indispensable part. Proactive fault tolerance schemes (such as the speculative execution in the MapReduce framework) are introduced to dramatically improve the response time of job executions when failure becomes the norm rather than the exception. Efficient proactive fault tolerance schemes require precise knowledge of task executions, which has been an open challenge for decades. To address the issue, in this paper we design and implement RiskI, a profile-based prediction algorithm in conjunction with a risk-aware task assignment algorithm, to accelerate task executions, taking the uncertain nature of tasks into account. Our design demonstrates that this inherent uncertainty brings not only great challenges but also new opportunities; with a careful design, we can benefit from such uncertainties. We implement the idea in Hadoop 0.21.0 and the experimental results show that, compared with the traditional LATE algorithm, the response time can be improved by 46% with the same system throughput. PMID:24883392
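
    As a toy illustration of the two ideas combined here, profile-based runtime prediction and risk-aware task assignment, and not the paper's RiskI implementation, the following sketch predicts a task's runtime from similar historical profiles and penalizes uncertain predictions when choosing a node.

      # Toy illustration only (not RiskI): nearest-neighbor runtime prediction plus
      # a risk-aware assignment that penalizes uncertain estimates.
      import numpy as np

      rng = np.random.default_rng(0)
      history_X = rng.random((200, 3))                                  # past task profiles (e.g. input size, locality, load)
      history_t = 10 + 50 * history_X[:, 0] + rng.normal(0, 2, 200)     # observed runtimes

      def predict_runtime(profile, k=5):
          d = np.linalg.norm(history_X - profile, axis=1)
          nearest = np.argsort(d)[:k]
          t = history_t[nearest]
          return t.mean(), t.std()                                      # point estimate plus uncertainty

      def assign(profile, node_backlogs, risk_weight=1.0):
          mean, std = predict_runtime(profile)
          # risk-aware: minimize backlog + expected runtime + a penalty for uncertainty
          scores = [b + mean + risk_weight * std for b in node_backlogs]
          return int(np.argmin(scores))

      print(assign(rng.random(3), node_backlogs=[12.0, 3.5, 7.0]))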

  10. Intelligent editing for post-processing of ROI segmentation

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo

    2015-03-01

    Segmentation of regions of interest (ROIs), such as suspect lesions, is a preliminary but vital step for computer-aided breast cancer diagnosis, but the task is quite challenging due to image quality and the complicated phenomena usually involved with the ROIs. On the one hand, it is possible for physicians and clinicians to extract more information from imaging; on the other hand, efficient, robust, and accurate segmentation of such anatomical lesions remains a difficult and open task for researchers and technical development. As a compromise between automatic methods, which are usually highly application dependent, and manual approaches, which are too time consuming, live wire, which provides full user control during segmentation while minimizing user interaction, is a promising option for assisting in breast lesion segmentation in ultrasound (US) images. This work proposes a live-wire-based adjustment method to further extend its potential in computer-aided diagnosis (CAD) applications. It allows for local boundary adjustment of a given segmentation, based on the live-wire paradigm, and can be attached as a post-processing step to the live wire method or other segmentation approaches.
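
    Under simple assumptions, the live-wire idea can be sketched as a shortest-path search between two seed points, where travelling along strong image gradients is cheap; the cost design and the 4-connected neighborhood below are illustrative, not the paper's exact paradigm.

      # Live-wire sketch under assumed cost design: Dijkstra between two seeds on a pixel graph.
      import heapq
      import numpy as np

      def live_wire(cost, start, end):
          h, w = cost.shape
          dist = np.full((h, w), np.inf)
          prev = {}
          dist[start] = 0.0
          pq = [(0.0, start)]
          while pq:
              d, (r, c) = heapq.heappop(pq)
              if (r, c) == end:
                  break
              if d > dist[r, c]:
                  continue
              for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:   # 4-connected neighborhood
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                      dist[nr, nc] = d + cost[nr, nc]
                      prev[(nr, nc)] = (r, c)
                      heapq.heappush(pq, (dist[nr, nc], (nr, nc)))
          path, node = [], end
          while node != start:
              path.append(node)
              node = prev[node]
          return path[::-1]

      img = np.zeros((40, 40)); img[:, 20:] = 1.0                 # synthetic two-region image
      gy, gx = np.gradient(img)
      cost = 1.0 / (1.0 + np.hypot(gx, gy))                       # cheap to travel along strong edges
      print(live_wire(cost, (0, 20), (39, 20))[:5])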

  11. Instruction: Does It Mean Creating Intelligence?

    ERIC Educational Resources Information Center

    Brethower, Dale

    1990-01-01

    Argues that the mission of the university is to create intelligence. Defines intelligence, discusses research on cognitive processes of learning, and discusses obstacles to using the demonstrate-label-coach-mastery strategy emphasizing the value of the clinical approach used to teach seven specific skills. Presents a classroom illustration of this…

  12. Extended Logic Intelligent Processing System for a Sensor Fusion Processor Hardware

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Thomas, Tyson; Li, Wei-Te; Daud, Taher; Fabunmi, James

    2000-01-01

    The paper presents the hardware implementation and initial tests of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) is described, which combines rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor signals in compact, low-power VLSI. The ELIPS concept is being developed to demonstrate interceptor functionality, which particularly underlines the high-speed and low-power requirements. Hardware programmability allows the processor to reconfigure into different machines, taking the most efficient hardware implementation during each phase of information processing. Processing speeds on the order of microseconds have been demonstrated using our test hardware.

  13. Academic Due Process in Clinical Pharmacy Education.

    ERIC Educational Resources Information Center

    Abood, Richard R.; Iovacchini, Eric V.

    1979-01-01

    The historical evolution of academic due process, its current concept as revealed in the Supreme Court ruling in Horowitz vs Board of Curators of the University of Missouri, and the application of that judicial opinion to clinical clerkship programs in pharmacy are discussed. Guidelines to protect faculty and administration are offered. (JMD)

  14. The Therapeutic Process in Clinical Social Work.

    ERIC Educational Resources Information Center

    Siporin, Max

    1983-01-01

    Suggests that current outmoded and inadequate conceptions of the therapeutic process are a major obstacle to the advancement of clinical social work practice. Presents an integrative ecosystem model that expresses the distinctive social work concern with person, situation, and helping relationship, in their reciprocal psychodynamic and…

  15. Plant intelligence

    NASA Astrophysics Data System (ADS)

    Trewavas, Anthony

    2005-09-01

    Intelligent behavior is a complex adaptive phenomenon that has evolved to enable organisms to deal with variable environmental circumstances. Maximizing fitness requires skill in foraging for necessary resources (food) in competitive circumstances and is probably the activity in which intelligent behavior is most easily seen. Biologists suggest that intelligence encompasses the characteristics of detailed sensory perception, information processing, learning, memory, choice, optimisation of resource sequestration with minimal outlay, self-recognition, and foresight by predictive modeling. All these properties are concerned with a capacity for problem solving in recurrent and novel situations. Here I review the evidence that individual plant species exhibit all of these intelligent behavioral capabilities but do so through phenotypic plasticity, not movement. Furthermore it is in the competitive foraging for resources that most of these intelligent attributes have been detected. Plants should therefore be regarded as prototypical intelligent organisms, a concept that has considerable consequences for investigations of whole plant communication, computation and signal transduction.

  16. Assessing Speech Intelligibility in Children with Hearing Loss: Toward Revitalizing a Valuable Clinical Tool

    ERIC Educational Resources Information Center

    Ertmer, David J.

    2011-01-01

    Background: Newborn hearing screening, early intervention programs, and advancements in cochlear implant and hearing aid technology have greatly increased opportunities for children with hearing loss to become intelligible talkers. Optimizing speech intelligibility requires that progress be monitored closely. Although direct assessment of…

  17. Research on application of intelligent computation based LUCC model in urbanization process

    NASA Astrophysics Data System (ADS)

    Chen, Zemin

    2007-06-01

    Global change study is an interdisciplinary and comprehensive research activity conducted with international cooperation; it arose in the 1980s and is among the broadest research programmes in scope. The interaction between land use and cover change, as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as one of its frontier and hot topics. It is necessary to study land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in changes to the quantity structure and spatial structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes research on land use/cover change in the urbanization process using the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automata model of complexity science and multi-agent theory; and expands the Markov model, traditional CA model and agent model, introducing complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent computation-based LUCC model for analog research on land use and cover change in urbanization, together with case studies. The concrete contents are as follows: 1. Complexity of LUCC research in urbanization process. Analyze urbanization process in combination with the contents
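
    A minimal cellular-automaton sketch of the kind of urban-growth dynamics such LUCC models build on is given below; the transition rule and parameters are assumptions for illustration, not the calibrated model described here.

      # Minimal urban-growth CA sketch (illustrative rule and parameters, not the paper's model):
      # a cell converts to urban with probability driven by its urban neighborhood and a suitability surface.
      import numpy as np

      rng = np.random.default_rng(0)
      size, steps = 60, 20
      urban = np.zeros((size, size), dtype=bool)
      urban[size // 2, size // 2] = True                     # seed settlement
      suitability = rng.random((size, size))                 # stand-in for slope, access, planning constraints

      for _ in range(steps):
          # count urban neighbors in a 3x3 Moore neighborhood via shifted sums
          padded = np.pad(urban, 1)
          neigh = sum(padded[1 + dr:1 + dr + size, 1 + dc:1 + dc + size]
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)) - urban
          p_convert = 0.05 * neigh * suitability             # more urban neighbors -> higher conversion chance
          urban |= rng.random((size, size)) < p_convert

      print("urban cells after simulation:", int(urban.sum()))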

  18. Towards an intelligent system for clinical guidance on wheelchair tilt and recline usage.

    PubMed

    Fu, Jicheng; Wiechmann, Paul; Jan, Yih-Kuen; Jones, Maria

    2012-01-01

    We propose to construct an intelligent system for clinical guidance on how to effectively use power wheelchair tilt and recline functions. The motivations fall into the following two aspects. (1) People with spinal cord injury (SCI) are vulnerable to pressure ulcers. SCI can lead to structural and functional changes below the injury level that may predispose individuals to tissue breakdown. As a result, pressure ulcers can significantly affect the quality of life, including pain, infection, altered body image, and even mortality. (2) Clinically, wheelchair power seat function, i.e., tilt and recline, is recommended for relieving sitting-induced pressures. The goal is to increase skin blood flow for the ischemic soft tissues to avoid irreversible damage. Due to variations in the level and completeness of SCI, the effectiveness of using wheelchair tilt and recline to reduce pressure ulcer risks has considerable room for improvement. Our previous study indicated that the blood flow of people with SCI may respond very differently to wheelchair tilt and recline settings. In this study, we propose to use the artificial neural network (ANN) to predict how wheelchair power seat functions affect blood flow response to seating pressure. This is regression learning because the predicted outputs are numerical values. Besides the challenging nature of regression learning, ANN may suffer from the overfitting problem which, when occurring, leads to poor predictive quality (i.e., cannot generalize). We propose using the particle swarm optimization (PSO) algorithm to train ANN to mitigate the impact of overfitting so that ANN can make correct predictions on both existing and new data. Experimental results show that the proposed approach is promising to improve ANN's predictive quality for new data. PMID:23366964
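
    A hedged sketch of the described approach, a small feed-forward network whose weights are trained by particle swarm optimization for a regression target, is shown below; the data, network size, and PSO constants are illustrative assumptions, not the study's configuration.

      # Hedged sketch: global-best PSO optimizing the weights of a tiny regression network.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.random((120, 4))                         # stand-in seating-condition features (tilt, recline, duration, pressure)
      y = np.sin(X @ np.array([1.0, 2.0, -1.0, 0.5]))  # stand-in blood-flow response

      n_in, n_hidden = 4, 6
      dim = n_in * n_hidden + n_hidden + n_hidden + 1  # W1, b1, W2, b2 flattened

      def forward(w, X):
          i = 0
          W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
          b1 = w[i:i + n_hidden]; i += n_hidden
          W2 = w[i:i + n_hidden]; i += n_hidden
          b2 = w[i]
          return np.tanh(X @ W1 + b1) @ W2 + b2

      def mse(w):
          return np.mean((forward(w, X) - y) ** 2)

      n_particles, iters, w_in, c1, c2 = 30, 200, 0.7, 1.5, 1.5
      pos = rng.normal(0, 1, (n_particles, dim))
      vel = np.zeros((n_particles, dim))
      pbest = pos.copy(); pbest_val = np.array([mse(p) for p in pos])
      gbest = pbest[pbest_val.argmin()].copy()

      for _ in range(iters):
          r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
          vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos += vel
          vals = np.array([mse(p) for p in pos])
          improved = vals < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
          gbest = pbest[pbest_val.argmin()].copy()

      print("training MSE of PSO-trained network:", round(mse(gbest), 4))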

  19. Artificial Intelligence.

    ERIC Educational Resources Information Center

    Wash, Darrel Patrick

    1989-01-01

    Making a machine seem intelligent is not easy. As a consequence, demand has been rising for computer professionals skilled in artificial intelligence and is likely to continue to go up. These workers develop expert systems and solve the mysteries of machine vision, natural language processing, and neural networks. (Editor)

  20. Artificial Intelligence.

    ERIC Educational Resources Information Center

    Smith, Linda C.; And Others

    1988-01-01

    A series of articles focuses on artificial intelligence research and development to enhance information systems and services. Topics discussed include knowledge base designs, expert system development tools, natural language processing, expert systems for reference services, and the role that artificial intelligence concepts should have in…

  1. Intelligent processing equipment developments within the Navy's Manufacturing Technology Centers of Excellence

    NASA Astrophysics Data System (ADS)

    Nanzetta, Philip

    1992-04-01

    The U.S. Navy has had an active Manufacturing Technology (MANTECH) Program aimed at developing advanced production processes and equipment since the late-1960's. During the past decade, however, the resources of the MANTECH program were concentrated in Centers of Excellence. Today, the Navy sponsors four manufacturing technology Centers of Excellence: the Automated Manufacturing Research Facility (AMRF); the Electronics Manufacturing Productivity Facility (EMPF); the National Center for Excellence in Metalworking Technology (NCEMT); and the Center of Excellence for Composites Manufacturing Technology (CECMT). This paper briefly describes each of the centers and summarizes typical Intelligent Equipment Processing (IEP) projects that were undertaken.

  2. Intelligent Processing Equipment Developments Within the Navy's Manufacturing Technology Centers of Excellence

    NASA Technical Reports Server (NTRS)

    Nanzetta, Philip

    1992-01-01

    The U.S. Navy has had an active Manufacturing Technology (MANTECH) Program aimed at developing advanced production processes and equipment since the late-1960's. During the past decade, however, the resources of the MANTECH program were concentrated in Centers of Excellence. Today, the Navy sponsors four manufacturing technology Centers of Excellence: the Automated Manufacturing Research Facility (AMRF); the Electronics Manufacturing Productivity Facility (EMPF); the National Center for Excellence in Metalworking Technology (NCEMT); and the Center of Excellence for Composites Manufacturing Technology (CECMT). This paper briefly describes each of the centers and summarizes typical Intelligent Equipment Processing (IEP) projects that were undertaken.

  3. An intelligent factory-wide optimal operation system for continuous production process

    NASA Astrophysics Data System (ADS)

    Ding, Jinliang; Chai, Tianyou; Wang, Hongfeng; Wang, Junwei; Zheng, Xiuping

    2016-03-01

    In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; furthermore, this system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into a framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.

  4. Are Intelligence and Creativity Really so Different?: Fluid Intelligence, Executive Processes, and Strategy Use in Divergent Thinking

    ERIC Educational Resources Information Center

    Nusbaum, Emily C.; Silvia, Paul J.

    2011-01-01

    Contemporary creativity research views intelligence and creativity as essentially unrelated abilities, and many studies have found only modest correlations between them. The present research, based on improved approaches to creativity assessment and latent variable modeling, proposes that fluid and executive cognition is in fact central to…

  5. The communication process in clinical settings.

    PubMed

    Mathews, J J

    1983-01-01

    The communication of information in clinical settings is fraught with problems despite avowed common aims of practitioners and patients. Some reasons for the problematic nature of clinical communication are incongruent frames of reference about what information ought to be shared, sociolinguistic differences and social distance between practitioners and patients. Communication between doctors and nurses is also problematic, largely due to differences in ideology between the professions about what ought to be communicated to patients about their illness and who is ratified to give such information. Recent social changes, such as the Patient Bill of Rights and informed consent which assure access to information, and new conceptualizations of the nurse's role, warrant continued study of the communication process especially in regard to what constitutes appropriate and acceptable information about a patient's illness and who ought to give such information to patients. The purpose of this paper is to outline characteristics of communication in clinical settings and to provide a literature review of patient and practitioner interaction studies in order to reflect on why information exchange is problematic in clinical settings. A framework for presentation of the problems employs principles from interaction and role theory to investigate clinical communication from three viewpoints: (1) the level of shared knowledge between participants; (2) the effect of status, role and ideology on transactions; and (3) the regulation of communication imposed by features of the institution. PMID:6359453

  6. Integration of computer-aided design and manufacturing through artificial-intelligence-based process planning

    SciTech Connect

    Arunthavanathan, V.

    1988-01-01

    The research effort reported in this thesis is directed towards the integration of design, process planning, and manufacturing. The principal notion used is system integration through information integration. The main outcome of this research effort is an artificial-intelligence-based computer-aided generative process planning system that uses a feature-based symbolic geometry as its input. The feature-based symbolic data structure is used as the common data between design, process planning, and manufacturing. As commercial computer-aided design systems do not generate a feature-based database, special interfaces are designed and used. As part of the solution strategy, a module to analyze the symbolic geometry from a global perspective is developed. This module imitates a human process planner and derives some overall assertions. The enhanced geometry data is then used by a rule-based expert system to develop the process plan.

  7. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining

    PubMed Central

    Salehi, Mojtaba

    2010-01-01

    Optimization of process planning is considered as the key technology for computer-aided process planning which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of sequence of the operations of the part, and optimization of machine selection, cutting tool and TAD for each operation using the intelligent search and genetic algorithm simultaneously. PMID:21845020
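
    A toy genetic-algorithm sketch for operation sequencing, not the paper's algorithm, is given below; it minimizes machine changes along the sequence while heavily penalizing violations of invented precedence constraints.

      # Toy GA for operation sequencing (invented operations, machines, and precedence constraints).
      import random

      random.seed(0)
      machines = {0: "M1", 1: "M1", 2: "M2", 3: "M2", 4: "M3", 5: "M1"}   # operation -> machine
      precedence = [(0, 2), (1, 3), (2, 4), (3, 5)]                        # (a, b): a must precede b

      def cost(seq):
          pos = {op: i for i, op in enumerate(seq)}
          changes = sum(machines[a] != machines[b] for a, b in zip(seq, seq[1:]))
          violations = sum(pos[a] > pos[b] for a, b in precedence)
          return changes + 100 * violations          # hard penalty keeps infeasible plans out

      def order_crossover(p1, p2):
          a, b = sorted(random.sample(range(len(p1)), 2))
          child = [None] * len(p1)
          child[a:b] = p1[a:b]
          rest = [op for op in p2 if op not in child]
          for i in range(len(child)):
              if child[i] is None:
                  child[i] = rest.pop(0)
          return child

      def mutate(seq, rate=0.2):
          seq = seq[:]
          if random.random() < rate:
              i, j = random.sample(range(len(seq)), 2)
              seq[i], seq[j] = seq[j], seq[i]
          return seq

      pop = [random.sample(range(6), 6) for _ in range(40)]
      for _ in range(200):
          pop.sort(key=cost)
          parents = pop[:20]                          # truncation selection
          pop = parents + [mutate(order_crossover(random.choice(parents), random.choice(parents)))
                           for _ in range(20)]
      best = min(pop, key=cost)
      print("best sequence:", best, "cost:", cost(best))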

  8. Function-based design process for an intelligent ground vehicle vision system

    NASA Astrophysics Data System (ADS)

    Nagel, Robert L.; Perry, Kenneth L.; Stone, Robert B.; McAdams, Daniel A.

    2010-10-01

    An engineering design framework for an autonomous ground vehicle vision system is discussed. We present both the conceptual and physical design by following the design process, development and testing of an intelligent ground vehicle vision system constructed for the 2008 Intelligent Ground Vehicle Competition. During conceptual design, the requirements for the vision system are explored via functional and process analysis considering the flows into the vehicle and the transformations of those flows. The conceptual design phase concludes with a vision system design that is modular in both hardware and software and is based on a laser range finder and camera for visual perception. During physical design, prototypes are developed and tested independently, following the modular interfaces identified during conceptual design. Prototype models, once functional, are implemented into the final design. The final vision system design uses a ray-casting algorithm to process camera and laser range finder data and identify potential paths. The ray-casting algorithm is a single thread of the robot's multithreaded application. Other threads control motion, provide feedback, and process sensory data. Once integrated, both hardware and software testing are performed on the robot. We discuss the robot's performance and the lessons learned.
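
    In the spirit of the ray-casting step described, the sketch below casts rays over an assumed occupancy grid and picks the heading with the most clearance; the grid, pose, and parameters are illustrative, not the competition vehicle's implementation.

      # Illustrative ray-casting over an assumed occupancy grid to pick the clearest heading.
      import numpy as np

      grid = np.zeros((100, 100), dtype=bool)      # True = obstacle
      grid[40:60, 55:60] = True                    # a wall segment ahead of the robot
      robot = np.array([50.0, 10.0])               # (row, col) pose

      def clearance(heading_deg, max_range=80.0, step=0.5):
          direction = np.array([np.sin(np.radians(heading_deg)), np.cos(np.radians(heading_deg))])
          r = 0.0
          while r < max_range:
              cell = (robot + r * direction).astype(int)
              if not (0 <= cell[0] < grid.shape[0] and 0 <= cell[1] < grid.shape[1]):
                  return r                          # ray left the mapped area
              if grid[cell[0], cell[1]]:
                  return r                          # hit an obstacle
              r += step
          return max_range

      headings = np.arange(-60, 61, 5)              # candidate headings relative to straight ahead
      best = max(headings, key=clearance)
      print("clearest heading (deg):", best)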

  9. The Relationship between the Kaufman Brief Intelligence Test and the Wechsler Intelligence Scale for Children-III in a Clinical Sample.

    ERIC Educational Resources Information Center

    Javorsky, James

    1993-01-01

    This study found a significant relationship between the Kaufman Brief Intelligence Test (K-BIT) and the Wechsler Intelligence Scale for Children-III (WISC-III) in 63 youth at a psychiatric hospital. A multiple regression equation was derived to provide an estimate of the WISC-III Full Scale Intelligence Quotient using the composites of the K-BIT.…

  10. Age-related decline in cognitive control: the role of fluid intelligence and processing speed

    PubMed Central

    2014-01-01

    Background Research on cognitive control suggests an age-related decline in proactive control abilities whereas reactive control seems to remain intact. However, the reason of the differential age effect on cognitive control efficiency is still unclear. This study investigated the potential influence of fluid intelligence and processing speed on the selective age-related decline in proactive control. Eighty young and 80 healthy older adults were included in this study. The participants were submitted to a working memory recognition paradigm, assessing proactive and reactive cognitive control by manipulating the interference level across items. Results Repeated measures ANOVAs and hierarchical linear regressions indicated that the ability to appropriately use cognitive control processes during aging seems to be at least partially affected by the amount of available cognitive resources (assessed by fluid intelligence and processing speed abilities). Conclusions This study highlights the potential role of cognitive resources on the selective age-related decline in proactive control, suggesting the importance of a more exhaustive approach considering the confounding variables during cognitive control assessment. PMID:24401034

  11. Development of inherently safe and environmentally acceptable intelligent processing technologies for HTS materials

    SciTech Connect

    Peterson, E.J.; Wangen, L.E.; Ott, K.C.; Muenchausen, R.E.; Parkinson, W.J. )

    1989-01-01

    The development of new processing technologies for the production, fabrication, and application of advanced materials proceeds through several complementary dimensions. The advanced materials dimension includes basic research on materials synthesis, composition, and properties; materials processing research; engineering characterization and materials applications; and product and process engineering. The health and environmental dimension includes identification of potential health and environmental constraints; characterization of candidate processes for waste and effluent quality; process optimization for both economic and environmental benefit; and development of control strategies to deal with health and environmental problems that cannot be solved through process modification. The intelligent processing dimension includes application of available sensors and the development of new diagnostics for real-time process measurements; development of control strategies and expert systems to use these process measurements for real-time process control; and development of capabilities to optimize working processes in real-time for both product quality and environmental acceptability. This paper discusses these issues in the context of the Laboratory's efforts to develop technologies based on the processing of the new high-temperature superconducting ceramic oxides.

  12. The prediction of breast cancer biopsy outcomes using two CAD approaches that both emphasize an intelligible decision process

    SciTech Connect

    Elter, M.; Schulz-Wendtland, R.; Wittenberg, T.

    2007-11-15

    Mammography is the most effective method for breast cancer screening available today. However, the low positive predictive value of breast biopsy resulting from mammogram interpretation leads to approximately 70% unnecessary biopsies with benign outcomes. To reduce the high number of unnecessary breast biopsies, several computer-aided diagnosis (CAD) systems have been proposed in the last several years. These systems help physicians in their decision to perform a breast biopsy on a suspicious lesion seen in a mammogram or to perform a short term follow-up examination instead. We present two novel CAD approaches that both emphasize an intelligible decision process to predict breast biopsy outcomes from BI-RADS findings. An intelligible reasoning process is an important requirement for the acceptance of CAD systems by physicians. The first approach induces a global model based on decision-tree learning. The second approach is based on case-based reasoning and applies an entropic similarity measure. We have evaluated the performance of both CAD approaches on two large publicly available mammography reference databases using receiver operating characteristic (ROC) analysis, bootstrap sampling, and the ANOVA statistical significance test. Both approaches outperform the diagnosis decisions of the physicians. Hence, both systems have the potential to reduce the number of unnecessary breast biopsies in clinical practice. A comparison of the performance of the proposed decision tree and CBR approaches with a state of the art approach based on artificial neural networks (ANN) shows that the CBR approach performs slightly better than the ANN approach, which in turn results in slightly better performance than the decision-tree approach. The differences are statistically significant (p value <0.001). On 2100 masses extracted from the DDSM database, the CBR approach, for example, resulted in an area under the ROC curve of A(z)=0.89 ± 0.01, the decision-tree approach in A(z)=0
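
    As a hedged illustration of the first (decision-tree) approach, the sketch below learns an interpretable tree over BI-RADS-style findings; the synthetic feature encoding and labels are placeholders, not the reference databases used in the study.

      # Hedged illustration: an interpretable decision tree over BI-RADS-style findings (synthetic data).
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(0)
      # columns: BI-RADS assessment (1-5), patient age, mass shape (1-4), margin (1-5), density (1-4)
      X = np.column_stack([rng.integers(1, 6, 600), rng.integers(30, 85, 600),
                           rng.integers(1, 5, 600), rng.integers(1, 6, 600), rng.integers(1, 5, 600)])
      y = (X[:, 0] >= 4).astype(int)               # toy "malignant" rule so the tree has signal to find

      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
      print(export_text(tree, feature_names=["birads", "age", "shape", "margin", "density"]))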

  13. Research on application of intelligent computation based LUCC model in urbanization process

    NASA Astrophysics Data System (ADS)

    Chen, Zemin

    2007-06-01

    Global change study is an interdisciplinary and comprehensive research activity conducted with international cooperation; it arose in the 1980s and is among the broadest research programmes in scope. The interaction between land use and cover change, as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as one of its frontier and hot topics. It is necessary to study land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in changes to the quantity structure and spatial structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes research on land use/cover change in the urbanization process using the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automata model of complexity science and multi-agent theory; and expands the Markov model, traditional CA model and agent model, introducing complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent computation-based LUCC model for analog research on land use and cover change in urbanization, together with case studies. The concrete contents are as follows: 1. Complexity of LUCC research in urbanization process. Analyze urbanization process in combination with the contents

  14. Emotionally Intelligent Leadership: An Integrative, Process-Oriented Theory of Student Leadership

    ERIC Educational Resources Information Center

    Allen, Scott J.; Shankman, Marcy Levy; Miguel, Rosanna F.

    2012-01-01

    Emotionally intelligent leadership (EIL) theory combines relevant models, theories, and research in the areas of emotional intelligence (EI) and leadership. With an intentional focus on context, self and others, emotionally intelligent leaders facilitate the attainment of desired outcomes. The 21 capacities described by the theory equip…

  15. Image restoration in multisensor missile seeker environments for design of intelligent integrated processing architectures

    NASA Astrophysics Data System (ADS)

    Sundareshan, Malur K.; Pang, Ho-Yuen; Amphay, Sengvieng A.; Sundstrom, Bryce M.

    1997-10-01

    Two major factors that could limit successful implementations of image restoration and superresolution algorithms in missile seeker applications are, (i) lack of accurate knowledge of sensor point spread function (PSF) parameters, and (ii) noise-induced artifacts in the restoration process. The robustness properties of a recently developed blind iterative Maximum Likelihood (ML) restoration algorithm to inaccuracies in sensor PSF are established in this paper. Two modifications to this algorithm that successfully equip it to suppress artifacts resulting from the presence of high frequency noise components are outlined. Performance evaluation studies with 1D and 2D signals are included to demonstrate that these algorithms have superresolution capabilities while possessing also attractive robustness and artifact suppression properties. The algorithms developed here hence contribute to efficient designs of intelligent integrated processing architectures for smart weapon applications.
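
    The abstract does not spell out the blind ML restoration algorithm, so the sketch below shows a standard, non-blind Richardson-Lucy maximum-likelihood deconvolution in 1-D as a related illustration of the kind of iterative restoration being made robust.

      # Related illustration (not the paper's blind algorithm): 1-D Richardson-Lucy ML deconvolution.
      # Each iteration multiplies the estimate by a back-projected ratio of observed to predicted data.
      import numpy as np

      def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
          estimate = np.full_like(observed, observed.mean())
          psf_mirror = psf[::-1]
          for _ in range(n_iter):
              predicted = np.convolve(estimate, psf, mode="same")
              ratio = observed / (predicted + eps)
              estimate *= np.convolve(ratio, psf_mirror, mode="same")
          return estimate

      x = np.zeros(64); x[20] = 1.0; x[40] = 0.6                 # two point sources
      psf = np.exp(-0.5 * (np.arange(-7, 8) / 2.0) ** 2); psf /= psf.sum()
      blurred = np.convolve(x, psf, mode="same")
      restored = richardson_lucy(blurred, psf)
      print("recovered peak positions:", np.argsort(restored)[-2:])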

  16. Hybrid intelligent control of substrate feeding for industrial fed-batch chlortetracycline fermentation process.

    PubMed

    Jin, Huaiping; Chen, Xiangguang; Yang, Jianwen; Wu, Lei; Wang, Li

    2014-11-01

    The lack of accurate process models and reliable online sensors for substrate measurements poses significant challenges for controlling substrate feeding accurately, automatically and optimally in fed-batch fermentation industries. It is still a common practice to regulate the feeding rate based upon manual operations. To address this issue, a hybrid intelligent control method is proposed to enable automatic substrate feeding. The resulting control system consists of three modules: a presetting module for providing initial set-points; a predictive module for estimating substrate concentration online based on a new time interval-varying soft sensing algorithm; and a feedback compensator using expert rules. The effectiveness of the proposed approach is demonstrated through its successful applications to the industrial fed-batch chlortetracycline fermentation process. PMID:25245525

  17. Application of artificial intelligence to melter control: Realtime process advisor for the scale melter facility

    SciTech Connect

    Edwards, Jr, R E

    1988-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Plant (SRP) is currently under construction and when completed will process high-level radioactive waste into a borosilicate glass wasteform. This facility will consist of numerous batch chemical processing steps as well as the continuous operation of a joule-heated melter and its off-gas treatment system. A realtime process advisor system based on Artificial Intelligence (AI) techniques has been developed and is currently in use at the semiworks facility, which is operating a 2/3 scale of the DWPF joule-heated melter. The melter advisor system interfaces with the existing data collection and control system and monitors current operations of this facility. The advisor then provides advice to operators and engineers when it identifies process problems. The current system is capable of identifying process problems such as feed system pluggages and thermocouple failures and providing recommended actions. The system also provides facilities normally associated with distributed control systems. These include the ability to display process flowsheets, monitor alarm conditions, and check the status of process interlocks. 7 figs.

  18. Use of conditional rule structure to automate clinical decision support: a comparison of artificial intelligence and deterministic programming techniques.

    PubMed

    Friedman, R H; Frank, A D

    1983-08-01

    A rule-based computer system was developed to perform clinical decision-making support within a medical information system, oncology practice, and clinical research. This rule-based system, which has been programmed using deterministic rules, possesses features of generalizability, modularity of structure, convenience in rule acquisition, explainability, and utility for patient care and teaching, features which have been identified as advantages of artificial intelligence (AI) rule-based systems. Formal rules are primarily represented as conditional statements; common conditions and actions are stored in system dictionaries so that they can be recalled at any time to form new decision rules. Important similarities and differences exist in the structure of this system and clinical computer systems utilizing artificial intelligence (AI) production rule techniques. The non-AI rule-based system possesses advantages in cost and ease of implementation. The degree to which significant medical decision problems can be solved by this technique remains uncertain, as does whether the more complex AI methodologies will be required. PMID:6352165
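
    A minimal sketch of the conditional-rule structure described, with conditions and actions stored in dictionaries so new rules can be assembled from existing pieces, is shown below; the rule content is invented for illustration, not the system's clinical logic.

      # Minimal conditional-rule sketch: conditions and actions kept in dictionaries,
      # rules assembled from named pieces. Rule content is invented, not clinical guidance.
      conditions = {
          "low_wbc": lambda p: p["wbc"] < 3.0,
          "on_protocol": lambda p: p["protocol"] == "chemo-A",
      }
      actions = {
          "hold_dose": lambda p: f"Hold next dose for patient {p['id']}",
          "notify_md": lambda p: f"Notify physician for patient {p['id']}",
      }
      # rules are just named combinations of stored conditions and actions
      rules = [
          {"if": ["low_wbc", "on_protocol"], "then": ["hold_dose", "notify_md"]},
      ]

      def evaluate(patient):
          advice = []
          for rule in rules:
              if all(conditions[c](patient) for c in rule["if"]):
                  advice += [actions[a](patient) for a in rule["then"]]
          return advice

      print(evaluate({"id": "P-001", "wbc": 2.1, "protocol": "chemo-A"}))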

  19. (Actino)Bacterial "intelligence": using comparative genomics to unravel the information processing capacities of microbes.

    PubMed

    Pinto, Daniela; Mascher, Thorsten

    2016-08-01

    Bacterial genomes encode numerous and often sophisticated signaling devices to perceive changes in their environment and mount appropriate adaptive responses. With their help, microbes are able to orchestrate specific decision-making processes that alter the cellular behavior, but also integrate and communicate information. Moreover and beyond, some signal transducing systems also enable bacteria to remember and learn from previous stimuli to anticipate environmental changes. As recently suggested, all of these aspects indicate that bacteria do, in fact, exhibit cognition remarkably reminiscent of what we refer to as intelligent behavior, at least when compared to higher eukaryotes. In this essay, comprehensive data derived from comparative genomics analyses of microbial signal transduction systems are used to probe the concept of cognition in bacterial cells. Using a recent comprehensive analysis of over 100 actinobacterial genomes as a test case, we illustrate the different layers of the capacities of bacteria that result in cognitive and behavioral complexity as well as some form of 'bacterial intelligence'. We try to raise awareness of approaching bacteria as cognitive organisms and believe that this view would enrich and open new paths in the experimental study of bacterial signal transducing systems. PMID:26852121

  20. Emotional Intelligence: The Sine Qua Non for a Clinical Leadership Toolbox

    ERIC Educational Resources Information Center

    Rao, Paul R.

    2006-01-01

    Over the past decade, it has become increasingly clear that although IQ and technical skills are important, emotional intelligence is the Sine Qua Non of leadership. According to Goleman [Goleman, D. (1998). What makes a leader? "Harvard Business Review," 93-102] "effective leaders are alike in one crucial way: they all have a high degree of…

  1. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception: An Event-Related Potential Study.

    PubMed

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan; Shi, Jiannong

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50-130 ms), the high IQ group showed more negative vMMN amplitudes than the average IQ group in the happy condition. For the late vMMN (320-450 ms), the high IQ group had greater vMMN responses than the average IQ group over frontal and occipito-temporal areas in the fearful condition, and the average IQ group evoked larger vMMN amplitudes than the high IQ group over occipito-temporal areas in the happy condition. The present study elucidated the close relationships between fluid intelligence and pre-attentive change detection on social-emotional information. PMID:26375031

  2. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception: An Event-Related Potential Study

    PubMed Central

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan; Shi, Jiannong

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50–130 ms), the high IQ group showed more negative vMMN amplitudes than the average IQ group in the happy condition. For the late vMMN (320–450 ms), the high IQ group had greater vMMN responses than the average IQ group over frontal and occipito-temporal areas in the fearful condition, and the average IQ group evoked larger vMMN amplitudes than the high IQ group over occipito-temporal areas in the happy condition. The present study elucidated the close relationships between fluid intelligence and pre-attentive change detection on social-emotional information. PMID:26375031

  3. Efficient computer architecture for the realization of realtime linear systems with intelligent processing of large sparse matrices

    SciTech Connect

    Chae, S.H.

    1988-01-01

    This dissertation describes an intelligent rule-based iterative parallel algorithm for solving randomly distributed large sparse linear systems of equations, and also the efficient parallel processing computer architecture for the implementation of the algorithm. Implemented with the Jacobi iterative method, the intelligent rule-based algorithm reduces the parallel execution time by reducing the individual inner product operation time. A static dataflow architecture is proposed for implementing the intelligent rule-based iterative parallel algorithm. The dataflow computer architecture has the capability to support the parallelism exploited in the algorithm and to execute the algorithm asynchronously. The proposed computer architecture consists of a main processor, several control processors, scalar slave processors, and pipelined slave processors. The control processors share with the main processor the heavy burden of allocating operation packets and of synchronization for parallel processing.
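
    The numerical kernel being parallelized here is the Jacobi iteration; the serial sketch below shows the per-row update on a random sparse system made diagonally dominant (an assumption introduced so the iteration converges), not the dissertation's dataflow implementation.

      # Serial Jacobi sketch on a random sparse, diagonally dominant system (dominance is an
      # added assumption so the iteration converges); shows the per-row update the
      # dissertation distributes across processors.
      import numpy as np
      import scipy.sparse as sp

      n = 200
      A = sp.random(n, n, density=0.02, random_state=0, format="csr")
      row_sums = np.asarray(abs(A).sum(axis=1)).ravel()
      A = A + sp.diags(row_sums + 1.0)               # dominant diagonal guarantees Jacobi convergence
      b = np.random.default_rng(0).random(n)

      x = np.zeros(n)
      D = A.diagonal()
      R = A - sp.diags(D)                            # off-diagonal part
      for _ in range(100):
          x = (b - R @ x) / D                        # x_i <- (b_i - sum_{j != i} a_ij x_j) / a_ii
      print("residual norm:", float(np.linalg.norm(A @ x - b)))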

  4. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  5. Intelligent Processing Equipment Research and Development Programs of the Department of Commerce

    NASA Technical Reports Server (NTRS)

    Simpson, J. A.

    1992-01-01

    The intelligent processing equipment (IPE) research and development (R&D) programs of the Department of Commerce are carried out within the National Institute of Standards and Technology (NIST). This institute has had work in support of industrial productivity as part of its mission since its founding in 1901. With the advent of factory automation these efforts have increasingly turned to R&D in IPE. The Manufacturing Engineering Laboratory (MEL) of NIST devotes a major fraction of its efforts to this end while other elements within the organization, notably the Material Science and Engineering Laboratory, have smaller but significant programs. An inventory of all such programs at NIST and a representative selection of projects that at least demonstrate the scope of the efforts are presented.

  6. Effects of social exclusion on cognitive processes: anticipated aloneness reduces intelligent thought.

    PubMed

    Baumeister, Roy F; Twenge, Jean M; Nuss, Christopher K

    2002-10-01

    Three studies examined the effects of randomly assigned messages of social exclusion. In all 3 studies, significant and large decrements in intelligent thought (including IQ and Graduate Record Examination test performance) were found among people told they were likely to end up alone in life. The decline in cognitive performance was found in complex cognitive tasks such as effortful logic and reasoning; simple information processing remained intact despite the social exclusion. The effects were specific to social exclusion, as participants who received predictions of future nonsocial misfortunes (accidents and injuries) performed well on the cognitive tests. The cognitive impairments appeared to involve reductions in both speed (effort) and accuracy. The effect was not mediated by mood. PMID:12374437

  7. Service with a smile: do emotional intelligence, gender, and autonomy moderate the emotional labor process?

    PubMed

    Johnson, Hazel-Anne M; Spector, Paul E

    2007-10-01

    This survey study of 176 participants from eight customer service organizations investigated how individual factors moderate the impact of emotional labor strategies on employee well-being. Hierarchical regression analyses indicated that gender and autonomy were significant moderators of the relationships between emotional labor strategies and the personal outcomes of emotional exhaustion, affective well-being, and job satisfaction. Females were more likely to experience negative consequences when engaging in surface acting. Autonomy served to alleviate negative outcomes for individuals who used emotional labor strategies often. Contrary to our hypotheses, emotional intelligence did not moderate the relationship between the emotional labor strategies and personal outcomes. Results demonstrated how the emotional labor process can influence employee well-being. PMID:17953492

  8. The Relationship between Emotional Intelligence and Cool and Hot Cognitive Processes: A Systematic Review.

    PubMed

    Gutiérrez-Cobo, María José; Cabello, Rosario; Fernández-Berrocal, Pablo

    2016-01-01

    Although emotion and cognition were considered to be separate aspects of the psyche in the past, researchers today have demonstrated the existence of an interplay between the two processes. Emotional intelligence (EI), or the ability to perceive, use, understand, and regulate emotions, is a relatively young concept that attempts to connect both emotion and cognition. While EI has been demonstrated to be positively related to well-being, mental and physical health, and non-aggressive behaviors, little is known about its underlying cognitive processes. The aim of the present study was to systematically review available evidence about the relationship between EI and cognitive processes as measured through "cool" (i.e., not emotionally laden) and "hot" (i.e., emotionally laden) laboratory tasks. We searched Scopus and Medline to find relevant articles in Spanish and English, and divided the studies following two variables: cognitive processes (hot vs. cool) and EI instruments used (performance-based ability test, self-report ability test, and self-report mixed test). We identified 26 eligible studies. The results provide a fair amount of evidence that performance-based ability EI (but not self-report EI tests) is positively related with efficiency in hot cognitive tasks. EI, however, does not appear to be related with cool cognitive tasks: neither through self-reporting nor through performance-based ability instruments. These findings suggest that performance-based ability EI could improve individuals' emotional information processing abilities. PMID:27303277

  9. The Relationship between Emotional Intelligence and Cool and Hot Cognitive Processes: A Systematic Review

    PubMed Central

    Gutiérrez-Cobo, María José; Cabello, Rosario; Fernández-Berrocal, Pablo

    2016-01-01

    Although emotion and cognition were considered to be separate aspects of the psyche in the past, researchers today have demonstrated the existence of an interplay between the two processes. Emotional intelligence (EI), or the ability to perceive, use, understand, and regulate emotions, is a relatively young concept that attempts to connect both emotion and cognition. While EI has been demonstrated to be positively related to well-being, mental and physical health, and non-aggressive behaviors, little is known about its underlying cognitive processes. The aim of the present study was to systematically review available evidence about the relationship between EI and cognitive processes as measured through “cool” (i.e., not emotionally laden) and “hot” (i.e., emotionally laden) laboratory tasks. We searched Scopus and Medline to find relevant articles in Spanish and English, and divided the studies following two variables: cognitive processes (hot vs. cool) and EI instruments used (performance-based ability test, self-report ability test, and self-report mixed test). We identified 26 eligible studies. The results provide a fair amount of evidence that performance-based ability EI (but not self-report EI tests) is positively related with efficiency in hot cognitive tasks. EI, however, does not appear to be related with cool cognitive tasks: neither through self-reporting nor through performance-based ability instruments. These findings suggest that performance-based ability EI could improve individuals’ emotional information processing abilities. PMID:27303277

  10. INTELLIGENT MONITORING SYSTEM WITH HIGH TEMPERATURE DISTRIBUTED FIBEROPTIC SENSOR FOR POWER PLANT COMBUSTION PROCESSES

    SciTech Connect

    Kwang Y. Lee; Stuart S. Yin; Andre Boheman

    2003-12-26

    The objective of the proposed work is to develop an intelligent distributed fiber optical sensor system for real-time monitoring of high temperature in a boiler furnace in power plants. Of particular interest is the estimation of spatial and temporal distributions of high temperatures within a boiler furnace, which will be essential in assessing and controlling the mechanisms that form and remove pollutants at the source, such as NOx. The basic approach in developing the proposed sensor system is three fold: (1) development of a high temperature distributed fiber optical sensor capable of measuring temperatures greater than 2000 °C with spatial resolution of less than 1 cm; (2) development of distributed parameter system (DPS) models to map the three-dimensional (3D) temperature distribution for the furnace; and (3) development of an intelligent monitoring system for real-time monitoring of the 3D boiler temperature distribution. Under Task 1, the efforts focused on developing an innovative high temperature distributed fiber optic sensor by fabricating in-fiber gratings in single crystal sapphire fibers. So far, our major accomplishments include successfully growing alumina cladding layers on single crystal sapphire fibers, successfully fabricating in-fiber gratings in single crystal sapphire fibers, and successfully developing a high temperature distributed fiber optic sensor. Under Task 2, the emphasis has been on putting into place a computational capability for simulation of combustors. A PC workstation was acquired with dual Xeon processors and sufficient memory to support 3-D calculations. An existing license for Fluent software was expanded to include two PC processes; the existing license was for a Unix workstation. Under Task 3, intelligent state estimation theory is being developed which will map the set of 1D measurement data (located judiciously within a 3D environment) into a 3D temperature profile. This theory presents a semigroup

  11. Prodiag--a hybrid artificial intelligence based reactor diagnostic system for process faults

    SciTech Connect

    Reifman, J.; Wei, T.Y.C.; Vitela, J.E.; Applequist, C. A.; Chasensky, T.M.

    1996-03-01

    Commonwealth Research Corporation (CRC) and Argonne National Laboratory (ANL) are collaborating on a DOE-sponsored Cooperative Research and Development Agreement (CRADA) project to perform feasibility studies on a novel approach to Artificial Intelligence (AI) based diagnostics for component faults in nuclear power plants. Investigations are being performed in the construction of a first-principles physics-based plant level process diagnostic expert system (ES) and the identification of component-level fault patterns through operating component characteristics using artificial neural networks (ANNs). The purpose of the proof-of-concept project is to develop a computer-based system using this AI approach to assist process plant operators during off-normal plant conditions. The proposed computer-based system will use thermal hydraulic (T-H) signals complemented by other non-T-H signals available in the data stream to provide the process operator with the component which most likely caused the observed process disturbance. To demonstrate the scale-up feasibility of the proposed diagnostic system, it is being developed for use with the Chemical Volume Control System (CVCS) of a nuclear power plant. A full-scope operator training simulator representing the Commonwealth Edison Braidwood nuclear power plant is being used both as the source of development data and as the means to evaluate the advantages of the proposed diagnostic system. This is an ongoing multi-year project and this paper presents the results to date of the CRADA phase.

  12. Artificial intelligence environment for the analysis and classification of errors in discrete sequential processes

    SciTech Connect

    Ahuja, S.B.

    1985-01-01

    The study evolved over two phases. First, an existing artificial intelligence technique, heuristic state space search, was used to successfully address and resolve significant issues that have prevented automated error classification in the past. A general method was devised for constructing heuristic functions to guide the search process, which successfully avoided the combinatorial explosion normally associated with search paradigms. A prototype error classifier, SLIPS/I, was tested and evaluated using both real-world data from a databank of speech errors and artificially generated random errors. It showed that heuristic state space search is a viable paradigm for conducting domain-independent error classification within practical limits of memory space and processing time. The second phase considered sequential error classification as a diagnostic process in which a set of disorders (elementary errors) is said to be a classification of an observed set of manifestations (local differences between an intended sequence and the errorful sequence) if it provides a regular cover for them. Using a model of abductive logic based on the set covering theory, this new perspective of error classification as a diagnostic process models human diagnostic reasoning in classifying complex errors. A high level, non-procedural error specification language (ESL) was also designed.
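
    The second phase treats classification as set covering: a candidate set of elementary errors explains the observed local differences if, taken together, it covers them. The sketch below is a minimal, hypothetical illustration of that idea; the error types, manifestations, and brute-force search over minimal covers are invented for the example and are not the SLIPS/I algorithm.

    ```python
    from itertools import combinations

    # Hypothetical mapping from elementary error types to the local differences
    # (manifestations) each one can produce; names are illustrative only.
    causes = {
        "substitution": {"wrong_symbol"},
        "omission": {"missing_symbol", "shorter_sequence"},
        "transposition": {"wrong_symbol", "order_change"},
    }

    def minimal_covers(observed, causes):
        """Return all smallest sets of elementary errors that cover the observations."""
        names = list(causes)
        for size in range(1, len(names) + 1):
            found = [set(combo) for combo in combinations(names, size)
                     if observed <= set().union(*(causes[c] for c in combo))]
            if found:
                return found   # every minimal cover is a candidate classification
        return []

    observed = {"wrong_symbol", "order_change"}
    print(minimal_covers(observed, causes))   # [{'transposition'}]
    ```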

  13. Validation of the Luria-Nebraska Intellectual Processes Scale as a Measure of Intelligence in Male Alcoholics.

    ERIC Educational Resources Information Center

    Kivlahan, Daniel R.; And Others

    1985-01-01

    Investigated the Luria-Nebraska Intellectual Processes Scale (IPS) as a predictor of Wechsler Adult Intelligence Scale (WAIS) IQs among alcoholic inpatients. Strong correlations were found between IPS and WAIS Verbal IQ and Full Scale IQ; however, the correlation with Performance IQ was only -.41. (NRB)

  14. Gender Differences in the Relationship between Emotional Intelligence and Right Hemisphere Lateralization for Facial Processing

    ERIC Educational Resources Information Center

    Castro-Schilo, Laura; Kee, Daniel W.

    2010-01-01

    The present study examined relationships between emotional intelligence, measured by the Mayer-Salovey-Caruso Emotional Intelligence Test, and right hemisphere dominance for a free vision chimeric face test. A sample of 122 ethnically diverse college students participated and completed online versions of the forenamed tests. A hierarchical…

  15. Problem-Based Learning Pedagogies: Psychological Processes and Enhancement of Intelligences

    ERIC Educational Resources Information Center

    Tan, Oon-Seng

    2007-01-01

    Education in this 21st century is concerned with developing intelligences. Problem solving in real-world contexts involves multiple ways of knowing and learning. Intelligence in the real world involves not only learning how to do things effectively but also more importantly the ability to deal with novelty and growing our capacity to adapt, select…

  16. The Role of Emotional Intelligence in the Career Commitment and Decision-Making Process.

    ERIC Educational Resources Information Center

    Brown, Chris; George-Curran, Roberta; Smith, Marian L.

    2003-01-01

    Measures of emotional intelligence, vocational exploration, and career decision-making self-efficacy (CDMSE) were completed by 288 college students. Emotional intelligence was positively related to CDMSE. Utilization of feelings and self-control factors were inversely related to vocational exploration and commitment. Gender was not a moderator of…

  17. Confirmatory factor analysis of the Wechsler Intelligence Scale for Children--Third Edition in an Australian clinical sample.

    PubMed

    Cockshott, Felicity C; Marsh, Nigel V; Hine, Donald W

    2006-09-01

    A confirmatory factor analysis was conducted on the Wechsler Intelligence Scale for Children-Third Edition (WISC-III; D. Wechsler, 1991) with a sample of 579 Australian children referred for assessment because of academic difficulties in the classroom. The children were administered the WISC-III as part of the initial eligibility determination process for funding of special education services. The children were aged between 6 years and 16 years 7 months. One-, two-, three-, and four-factor models were tested. The four-factor model proposed in the WISC-III manual fit the data significantly better than all other models tested. PMID:16953739

  18. Open source clinical portals: a model for healthcare information systems to support care processes and feed clinical research. An Italian case of design, development, reuse, and exploitation.

    PubMed

    Locatelli, Paolo; Baj, Emanuele; Restifo, Nicola; Origgi, Gianni; Bragagia, Silvia

    2011-01-01

    Open source remains a largely unexploited opportunity for healthcare organizations and technology providers to meet a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter will present the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. This also represents a rare case of open source technology and reuse in the healthcare sector, as the system's porting is now taking place at Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes. PMID:21431608

  19. An Intelligent Clinical Decision Support System for Patient-Specific Predictions to Improve Cervical Intraepithelial Neoplasia Detection

    PubMed Central

    Bountris, Panagiotis; Haritou, Maria; Pouliakis, Abraham; Margari, Niki; Kyrgiou, Maria; Spathis, Aris; Pappas, Asimakis; Panayiotides, Ioannis; Paraskevaidis, Evangelos A.; Karakitsos, Petros; Koutsouris, Dimitrios-Dionyssios

    2014-01-01

    Nowadays, there are molecular biology techniques providing information related to cervical cancer and its cause: the human Papillomavirus (HPV), including DNA microarrays identifying HPV subtypes, mRNA techniques such as nucleic acid based amplification or flow cytometry identifying E6/E7 oncogenes, and immunocytochemistry techniques such as overexpression of p16. Each one of these techniques has its own performance, limitations and advantages, thus a combinatorial approach via computational intelligence methods could exploit the benefits of each method and produce more accurate results. In this article we propose a clinical decision support system (CDSS), composed by artificial neural networks, intelligently combining the results of classic and ancillary techniques for diagnostic accuracy improvement. We evaluated this method on 740 cases with complete series of cytological assessment, molecular tests, and colposcopy examination. The CDSS demonstrated high sensitivity (89.4%), high specificity (97.1%), high positive predictive value (89.4%), and high negative predictive value (97.1%), for detecting cervical intraepithelial neoplasia grade 2 or worse (CIN2+). In comparison to the tests involved in this study and their combinations, the CDSS produced the most balanced results in terms of sensitivity, specificity, PPV, and NPV. The proposed system may reduce the referral rate for colposcopy and guide personalised management and therapeutic interventions. PMID:24812614
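
    The reported sensitivity, specificity, PPV, and NPV are standard confusion-matrix statistics. As a reminder of how such figures are derived, the sketch below computes them from binary true and predicted CIN2+ labels; the label vectors are illustrative and are not the study data.

    ```python
    def diagnostic_metrics(y_true, y_pred):
        """Sensitivity, specificity, PPV and NPV from binary labels (1 = CIN2+)."""
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
        tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # Illustrative labels only, not the study data.
    y_true = [1, 1, 1, 0, 0, 0, 0, 1]
    y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
    print(diagnostic_metrics(y_true, y_pred))
    ```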

  20. Sarpi, A Solution For Artificial Intelligence In Image Processing At Intermediate Level

    NASA Astrophysics Data System (ADS)

    Tagliarino, Patricia; Rogala, Jean-Pierre

    1989-03-01

    Our reflection on the elements of existing Image Processing systems (currently Image Processing, Symbol interpretation level, control mode, level of extracted features) and the corresponding use of Artificial Intelligence leads us to the definition of the SARPI system. This system performs the extraction of features of intermediate level. In the present first step of implementation, we limit ourselves to line segments. They are associated with a descriptor including several parameters: position, angle, length, cross contrast, ... and precision on all of these parameters. SARPI applies to single or multiple feature detection: it finds the requested feature(s) and produces its (their) total or partial (as requested) description. SARPI takes as input the set of requested parameters and available values of some feature parameters (typically: qualitative measure of contrast). Its main part is a control module automatically generating an Image Processing sequence to solve the problem (extraction of requested feature parameters). Rules allow the problem to be divided into elementary ones with respect to the kind of input parameters. They allow the selection of an elementary function set according to the requested feature parameters and the known parameters; in this way, if the known information is insufficient, the control module selects and executes elementary functions that look for the missing information. Each of these elementary functions is pre-associated with Image Procedures and heuristics that select the appropriate procedures according to the values of the input parameters. The parameters of the image processes are controlled automatically by the precision on the requested feature parameters. In particular, the sampling steps of the parameters ρ and θ of the Hough transform are calculated from the requested precision of the feature parameters. The selected Image Processing operations are applied on a region of the image that is calculated from the approximated position of the
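
    The abstract notes that the ρ and θ sampling steps of the Hough transform are derived from the requested precision on the line parameters. The sketch below shows a standard ρ-θ accumulator with the sampling steps passed in as the requested precisions; the edge points, image diagonal, and precision values are illustrative assumptions and the code is not taken from SARPI.

    ```python
    import numpy as np

    def hough_lines(edge_points, img_diag, rho_step, theta_step):
        """Accumulate votes in a rho-theta Hough space for the given edge points."""
        thetas = np.arange(0.0, np.pi, theta_step)
        rhos = np.arange(-img_diag, img_diag, rho_step)
        acc = np.zeros((len(rhos), len(thetas)), dtype=int)
        for x, y in edge_points:
            for j, theta in enumerate(thetas):
                rho = x * np.cos(theta) + y * np.sin(theta)
                i = int(round((rho + img_diag) / rho_step))
                if 0 <= i < len(rhos):
                    acc[i, j] += 1
        return acc, rhos, thetas

    # Requested precision on the line parameters drives the sampling steps
    # (values are illustrative, not taken from the SARPI paper).
    rho_precision, theta_precision = 1.0, np.deg2rad(1.0)
    points = [(10, 10), (20, 20), (30, 30)]          # points on a 45-degree line
    acc, rhos, thetas = hough_lines(points, img_diag=100.0,
                                    rho_step=rho_precision, theta_step=theta_precision)
    i, j = np.unravel_index(acc.argmax(), acc.shape)
    print("rho =", rhos[i], "theta (deg) =", np.degrees(thetas[j]))
    ```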

  1. Intelligent information system: for automation of airborne early warning crew decision processes

    NASA Astrophysics Data System (ADS)

    Chin, Hubert H.

    1991-03-01

    This paper describes an automation of AEW crew decision processes implemented in an intelligent information system for an advanced AEW aircraft platform. The system utilizes the existing AEW aircraft database and knowledge base such that the database can provide sufficient data to solve the sizable AEW problems. A database management system is recommended for managing the large amount of data. In order to expand a conventional expert system so that it has the capacity to solve the sizable problems, a cooperative model is required to coordinate with five expert systems in the cooperative decision process. The proposed model partitions the traditional knowledge base into a set of disjoint portions which cover the needs of and are shared by the expert systems. Internal communications take place on common shared portions. A cooperative algorithm is required for updating synchronization and concurrent control. The purpose of this paper is to present a cooperative model for enhancing standard rule-based expert systems to make cooperative decisions and to superimpose the global knowledge base and database in a more natural fashion. The tools being used for developing the prototype are the Ada programming language and the ORACLE relational database management system.

  2. Using swarm intelligence to boost the root cause analysis process and enhance patient safety.

    PubMed

    2016-03-01

    In an effort to strengthen patient safety, leadership at the University of Kentucky HealthCare (UKHC) decided to replace its traditional approach to root cause analysis (RCA) with a process based on swarm intelligence, a concept borrowed from other industries. Under this process, when a problem or error is identified, staff quickly hold a swarm--a meeting in which all those involved in the incident or problem quickly evaluate why the issue occurred and identify potential solutions for implementation. A pillar of the swarm concept is a mandate that there be no punishments or finger-pointing during the swarms. The idea is to encourage staff to be forthcoming to achieve effective solutions. Typically, swarms last for one hour and result in action plans designed to correct problems or deficiencies within a specific period of time. The ED was one of the first areas where UKHC applied swarms. For example, hospital administrators note that the approach has been used to address issues involving patient flow, triage protocols, assessments, overcrowding, and boarding. After seven years, incident reporting at UKHC has increased by 52%, and the health system has experienced a 37% decrease in the observed-to-expected mortality ratio. PMID:26979047

  3. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    NASA Astrophysics Data System (ADS)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded due to the complicated wave dispersion phenomena, the existence of multiple wave modes, the high susceptibility to diverse interferences, the bulky sampled data and the difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed; this was actualized in a signal processing package (SPP). The ISPPR technique comprehensively functions as signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that the elastic wave propagation-based damage assessment could be dramatically streamlined by introduction of the ISPPR technique.
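
    The ISPPR approach chains wavelet-based filtration, compression, and feature extraction with neural-network pattern recognition. The sketch below illustrates only the wavelet half with a hand-written single-level Haar transform: it soft-thresholds the detail coefficients and returns a small feature vector of the kind that could feed a classifier. The signal, threshold, and feature choices are illustrative assumptions, not the published SPP implementation.

    ```python
    import numpy as np

    def haar_decompose(signal):
        """One level of the Haar wavelet transform: approximation and detail."""
        signal = np.asarray(signal, dtype=float)
        if len(signal) % 2:                      # pad to an even length
            signal = np.append(signal, signal[-1])
        pairs = signal.reshape(-1, 2)
        approx = pairs.sum(axis=1) / np.sqrt(2.0)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
        return approx, detail

    def wave_features(signal, threshold=0.1):
        """Compress by soft-thresholding details and return a small feature vector."""
        approx, detail = haar_decompose(signal)
        detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
        return np.array([approx.max(), approx.std(),
                         np.abs(detail).sum(), (detail != 0).mean()])

    # Illustrative Lamb-wave-like burst with noise (not real sensor data).
    t = np.linspace(0, 1, 256)
    signal = np.sin(40 * np.pi * t) * np.exp(-8 * (t - 0.3) ** 2) + 0.05 * np.random.randn(256)
    print(wave_features(signal))
    ```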

  4. Artificial Intelligence Based Selection of Optimal Cutting Tool and Process Parameters for Effective Turning and Milling Operations

    NASA Astrophysics Data System (ADS)

    Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta

    2016-06-01

    With the increased trend in automation of modern manufacturing industry, the human intervention in routine, repetitive and data specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce the human intervention in selection of optimal cutting tool and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of appropriate cutting tool and parameters in metal cutting is carried out by experienced technician/cutting tool expert based on his knowledge base or extensive search from huge cutting tool database. The present proposed approach replaces the existing practice of physical search for tools from the databooks/tool catalogues with intelligent knowledge-based selection system. This system employs artificial intelligence based techniques such as artificial neural networks, fuzzy logic and genetic algorithm for decision making and optimization. This intelligence based optimal tool selection strategy is developed using Mathworks Matlab Version 7.11.0 and implemented. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail, the methodology and strategies employed for selection of appropriate cutting tool and optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.
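
    The selection system ranks candidate tools against multi-objective criteria (material removal rate, tool life, and tool cost). As a hedged illustration of that trade-off only, the sketch below uses a plain weighted-sum ranking over a toy tool table; the tool names, figures, and weights are invented, and the weighted sum stands in for the neural-network, fuzzy, and genetic-algorithm machinery described in the paper.

    ```python
    # Hypothetical tool records: (name, material removal rate in cm^3/min,
    # tool life in minutes, cost in currency units). Values are illustrative.
    tools = [
        ("carbide_insert_A", 45.0, 30.0, 12.0),
        ("coated_carbide_B", 38.0, 55.0, 18.0),
        ("ceramic_C",        60.0, 20.0, 25.0),
    ]

    def rank_tools(tools, weights=(0.4, 0.4, 0.2)):
        """Weighted score: maximise MRR and tool life, minimise cost (normalised 0-1)."""
        mrr = [t[1] for t in tools]
        life = [t[2] for t in tools]
        cost = [t[3] for t in tools]

        def norm(values, invert=False):
            lo, hi = min(values), max(values)
            scaled = [(v - lo) / (hi - lo) for v in values]
            return [1.0 - s for s in scaled] if invert else scaled

        scores = [
            weights[0] * m + weights[1] * l + weights[2] * c
            for m, l, c in zip(norm(mrr), norm(life), norm(cost, invert=True))
        ]
        return sorted(zip([t[0] for t in tools], scores), key=lambda x: -x[1])

    for name, score in rank_tools(tools):
        print(f"{name}: {score:.2f}")
    ```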

  5. Intelligent robots and computer vision

    SciTech Connect

    Casasent, D.P.

    1985-01-01

    This book presents the papers given at a conference which examined artificial intelligence and image processing in relation to robotics. Topics considered at the conference included feature extraction and pattern recognition for computer vision, image processing for intelligent robotics, robot sensors, image understanding and artificial intelligence, optical processing techniques in robotic applications, robot languages and programming, processor architectures for computer vision, mobile robots, multisensor fusion, three-dimensional modeling and recognition, intelligent robots applications, and intelligent robot systems.

  6. Mindfulness as a transtheoretical clinical process.

    PubMed

    Dunn, Rose; Callahan, Jennifer L; Swift, Joshua K

    2013-09-01

    The use of mindfulness in psychotherapy has garnered the attention of both researchers and therapists over recent years. Based on established research, use of mindfulness with clients is recommended to improve awareness during sessions, reduce ruminative thinking patterns, and increase self-compassion regardless of theoretical orientation. In this article, de-identified clinical material is used to illustrate both informal and formal mindfulness training in session. Further, we provide illustrations of presession and within-session therapist mindfulness, recommending that therapists develop their own mindfulness practice, as research has demonstrated that it is related to important clinical skills including attentiveness, nonjudgment, and improved client perceptions. PMID:24000842

  7. Modeling of Steam Distillation Mechanism during Steam Injection Process Using Artificial Intelligence

    PubMed Central

    Ahadi, Arash; Kharrat, Riyaz

    2014-01-01

    Steam distillation, as one of the important mechanisms, has a great role in oil recovery in thermal methods, so it is important to simulate this process experimentally and theoretically. In this work, the simulation of steam distillation is performed on sixteen sets of crude oil data found in the literature. Artificial intelligence (AI) tools such as the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS) are used in this study as effective methods to simulate the distillate recoveries of these sets of data. Thirteen sets of data were used to train the models and three sets were used to test the models. The developed models are highly compatible with respect to input oil properties and can predict the distillate yield with minimum entry. To show the performance of the proposed models, simulation of steam distillation is also done using a modified Peng-Robinson equation of state. Comparison of the distillates calculated by the ANFIS and neural network models and by the equation-of-state-based method indicates that the errors of the ANFIS model for the training and test data sets are lower than those of the other methods. PMID:24883365

  8. Modeling of steam distillation mechanism during steam injection process using artificial intelligence.

    PubMed

    Daryasafar, Amin; Ahadi, Arash; Kharrat, Riyaz

    2014-01-01

    Steam distillation, as one of the important mechanisms, has a great role in oil recovery in thermal methods, so it is important to simulate this process experimentally and theoretically. In this work, the simulation of steam distillation is performed on sixteen sets of crude oil data found in the literature. Artificial intelligence (AI) tools such as the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS) are used in this study as effective methods to simulate the distillate recoveries of these sets of data. Thirteen sets of data were used to train the models and three sets were used to test the models. The developed models are highly compatible with respect to input oil properties and can predict the distillate yield with minimum entry. To show the performance of the proposed models, simulation of steam distillation is also done using a modified Peng-Robinson equation of state. Comparison of the distillates calculated by the ANFIS and neural network models and by the equation-of-state-based method indicates that the errors of the ANFIS model for the training and test data sets are lower than those of the other methods. PMID:24883365
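
    The models are trained on thirteen crude-oil data sets and tested on the remaining three. A minimal sketch of the ANN half of that workflow is shown below using scikit-learn's MLPRegressor; the oil properties and distillate yields are synthetic placeholders, not the data sets used in the study.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Placeholder oil properties (e.g. API gravity, viscosity, boiling point) and
    # distillate recovery fractions; 16 synthetic samples stand in for the
    # 16 crude-oil data sets used in the paper.
    X = rng.uniform([10, 1, 150], [40, 50, 450], size=(16, 3))
    y = 0.02 * X[:, 0] - 0.004 * X[:, 1] - 0.0005 * X[:, 2] + 0.5 + 0.01 * rng.standard_normal(16)

    X_train, X_test = X[:13], X[13:]     # 13 sets for training, 3 for testing
    y_train, y_test = y[:13], y[13:]

    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(X_train, y_train)

    pred = model.predict(X_test)
    print("mean absolute error on held-out sets:", np.mean(np.abs(pred - y_test)))
    ```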

  9. Two hybrid Artificial Intelligence approaches for modeling rainfall-runoff process

    NASA Astrophysics Data System (ADS)

    Nourani, Vahid; Kisi, Özgür; Komasi, Mehdi

    2011-05-01

    The need for accurate modeling of the rainfall-runoff process has grown rapidly in the past decades. However, considering the high stochastic property of the process, many models are still being developed in order to define such a complex phenomenon. Recently, Artificial Intelligence (AI) techniques such as the Artificial Neural Network (ANN) and the Adaptive Neural-Fuzzy Inference System (ANFIS) have been extensively used by hydrologists for rainfall-runoff modeling as well as for other fields of hydrology. In this paper, two hybrid AI-based models which are reliable in capturing the periodicity features of the process are introduced for watershed rainfall-runoff modeling. In the first model, the SARIMAX (Seasonal Auto Regressive Integrated Moving Average with exogenous input)-ANN model, an ANN is used to find the non-linear relationship among the residuals of the fitted linear SARIMAX model. In the second model, the wavelet-ANFIS model, wavelet transform is linked to the ANFIS concept and the main time series of two variables (rainfall and runoff) are decomposed into some multi-frequency time series by wavelet transform. Afterwards, these time series are imposed as input data to the ANFIS to predict the runoff discharge one time step ahead. The obtained results of the models' applications for the rainfall-runoff modeling of two watersheds (located in Azerbaijan, Iran) show that, although the proposed models can predict both short- and long-term runoff discharges by considering seasonality effects, the second model is relatively more appropriate because it uses the multi-scale time series of rainfall and runoff data in the ANFIS input layer.
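
    In the first hybrid model, a linear seasonal model is fitted and an ANN then learns the non-linear structure remaining in its residuals. The sketch below is a compact, hypothetical version of that two-stage idea: ordinary least squares on current and lagged rainfall stands in for the full SARIMAX fit, and a small ANN models the residuals. The rainfall/runoff series and lag structure are synthetic.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    n = 400
    rainfall = np.clip(rng.gamma(2.0, 2.0, n), 0, None)
    runoff = 0.4 * rainfall + 0.3 * np.roll(rainfall, 1) + 0.5 * np.sin(np.arange(n) / 12) ** 2
    runoff += 0.1 * rng.standard_normal(n)

    # Stage 1: linear model of runoff from current and lagged rainfall.
    X_lin = np.column_stack([np.ones(n - 1), rainfall[1:], rainfall[:-1]])
    y = runoff[1:]
    coef, *_ = np.linalg.lstsq(X_lin, y, rcond=None)
    residuals = y - X_lin @ coef

    # Stage 2: an ANN learns the non-linear/seasonal structure left in the residuals.
    X_res = np.column_stack([rainfall[1:], np.sin(np.arange(1, n) / 12), np.cos(np.arange(1, n) / 12)])
    ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    ann.fit(X_res[:-50], residuals[:-50])

    # Hybrid prediction on the last 50 steps = linear part + ANN residual correction.
    hybrid = X_lin[-50:] @ coef + ann.predict(X_res[-50:])
    print("hybrid RMSE:", np.sqrt(np.mean((hybrid - y[-50:]) ** 2)))
    ```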

  10. Are Randomized Controlled Trials the (G)old Standard? From Clinical Intelligence to Prescriptive Analytics.

    PubMed

    Van Poucke, Sven; Thomeer, Michiel; Heath, John; Vukicevic, Milan

    2016-01-01

    Despite the accelerating pace of scientific discovery, the current clinical research enterprise does not sufficiently address pressing clinical questions. Given the constraints on clinical trials, for a majority of clinical questions, the only relevant data available to aid in decision making are based on observation and experience. Our purpose here is 3-fold. First, we describe the classic context of medical research guided by Popper's scientific epistemology of "falsificationism." Second, we discuss challenges and shortcomings of randomized controlled trials and present the potential of observational studies based on big data. Third, we cover several obstacles related to the use of observational (retrospective) data in clinical studies. We conclude that randomized controlled trials are not at risk for extinction, but innovations in statistics, machine learning, and big data analytics may generate a completely new ecosystem for exploration and validation. PMID:27383622

  11. Are Randomized Controlled Trials the (G)old Standard? From Clinical Intelligence to Prescriptive Analytics

    PubMed Central

    2016-01-01

    Despite the accelerating pace of scientific discovery, the current clinical research enterprise does not sufficiently address pressing clinical questions. Given the constraints on clinical trials, for a majority of clinical questions, the only relevant data available to aid in decision making are based on observation and experience. Our purpose here is 3-fold. First, we describe the classic context of medical research guided by Popper’s scientific epistemology of “falsificationism.” Second, we discuss challenges and shortcomings of randomized controlled trials and present the potential of observational studies based on big data. Third, we cover several obstacles related to the use of observational (retrospective) data in clinical studies. We conclude that randomized controlled trials are not at risk for extinction, but innovations in statistics, machine learning, and big data analytics may generate a completely new ecosystem for exploration and validation. PMID:27383622

  12. The Intelligent Ventilator (INVENT) project: the role of mathematical models in translating physiological knowledge into clinical practice.

    PubMed

    Rees, Stephen E

    2011-12-01

    This dissertation has addressed the broad hypothesis as to whether building mathematical models is useful as a tool for translating physiological knowledge into clinical practice. In doing so it describes work on the INtelligent VENTilator project (INVENT), the goal of which is to build, evaluate and integrate into clinical practice, a model-based decision support system for control of mechanical ventilation. The dissertation describes the mathematical models included in INVENT, i.e. a model of pulmonary gas exchange focusing on oxygen transport, and a model of the acid-base status of blood, interstitial fluid and tissues. These models have been validated, and applied in two other systems: ALPE, a system for measuring pulmonary gas exchange, and ARTY, a system for arterialisation of the acid-base and oxygen status of peripheral venous blood. The major contributions of this work are as follows. A mathematical model has been developed which can describe pulmonary gas exchange more accurately than current clinical techniques. This model is parsimonious in that it can describe pulmonary gas exchange from measurements easily available in the clinic, along with a readily automatable variation in F(I)O(2). This technique and model have been developed into a research and commercial tool (ALPE), and evaluated both in the clinical setting and when compared to the reference multiple inert gas elimination technique (MIGET). Mathematical models have been developed of the acid-base chemistry of blood, interstitial fluid and tissues, with these models formulated using a mass-action mass-balance approach. The model of blood has been validated against literature data describing the addition and removal of CO(2), strong acid or base, and haemoglobin; and the effects of oxygenation or deoxygenation. The model has also been validated in new studies, and shown to simulate accurately and precisely the mixing of blood samples at different PCO(2) and PO(2) levels. This model of acid

  13. Analysis of cognitive theories in artificial intelligence and psychology in relation to the qualitative process of emotion

    SciTech Connect

    Semrau, P.

    1987-01-01

    The purpose of this study was to analyze selected cognitive theories in the areas of artificial intelligence (A.I.) and psychology to determine the role of emotions in the cognitive or intellectual processes. Understanding the relationship of emotions to processes of intelligence has implications for constructing theories of aesthetic response and A.I. systems in art. Psychological theories were examined that demonstrated the changing nature of the research in emotion related to cognition. The basic techniques in A.I. were reviewed and the A.I. research was analyzed to determine the process of cognition and the role of emotion. The A.I. research emphasized the digital, quantifiable character of the computer and associated cognitive models and programs. In conclusion, the cognitive-emotive research in psychology and the cognitive research in A.I. emphasized quantification methods over analog and qualitative characteristics required for a holistic explanation of cognition. Further A.I. research needs to examine the qualitative aspects of values, attitudes, and beliefs on influencing the creative thinking processes. Inclusion of research related to qualitative problem solving in art provides a more comprehensive base of study for examining the area of intelligence in computers.

  14. Intelligence: Real or artificial?

    PubMed Central

    Schlinger, Henry D.

    1992-01-01

    Throughout the history of the artificial intelligence movement, researchers have strived to create computers that could simulate general human intelligence. This paper argues that workers in artificial intelligence have failed to achieve this goal because they adopted the wrong model of human behavior and intelligence, namely a cognitive essentialist model with origins in the traditional philosophies of natural intelligence. An analysis of the word “intelligence” suggests that it originally referred to behavior-environment relations and not to inferred internal structures and processes. It is concluded that if workers in artificial intelligence are to succeed in their general goal, then they must design machines that are adaptive, that is, that can learn. Thus, artificial intelligence researchers must discard their essentialist model of natural intelligence and adopt a selectionist model instead. Such a strategic change should lead them to the science of behavior analysis. PMID:22477051

  15. The effect of narrow-band digital processing and bit error rate on the intelligibility of ICAO spelling alphabet words

    NASA Astrophysics Data System (ADS)

    Schmidt-Nielsen, Astrid

    1987-08-01

    The recognition of ICAO spelling alphabet words (ALFA, BRAVO, CHARLIE, etc.) is compared with diagnostic rhyme test (DRT) scores for the same conditions. The voice conditions include unprocessed speech; speech processed through the DOD standard linear-predictive-coding algorithm operating at 2400 bit/s with random error rates of 0, 2, 5, 8, and 12 percent; and speech processed through an 800-bit/s pattern-matching algorithm. The results suggest that, with distinctive vocabularies, word intelligibility can be expected to remain high even when DRT scores fall into the poor range. However, once the DRT scores fall below 75 percent, the intelligibility can be expected to fall off rapidly; at DRT scores below 50, the recognition of a distinctive vocabulary should also fall below 50 percent.

  16. Applying Statistical Process Control to Clinical Data: An Illustration.

    ERIC Educational Resources Information Center

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
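
    In its simplest form, the control-chart logic described here flags observations that fall outside the baseline mean plus or minus three standard deviations. The sketch below applies that rule to an illustrative baseline and intervention series; the counts and phase lengths are invented and the article's charting conventions may differ.

    ```python
    import statistics

    # Baseline phase: weekly counts of a target behaviour before the treatment change.
    baseline = [12, 15, 11, 14, 13, 16, 12, 14]
    # Intervention phase: counts observed after the treatment change.
    intervention = [10, 9, 7, 6, 8, 5]

    mean = statistics.mean(baseline)
    sd = statistics.pstdev(baseline)
    ucl, lcl = mean + 3 * sd, mean - 3 * sd          # upper / lower control limits

    print(f"baseline mean {mean:.1f}, control limits [{lcl:.1f}, {ucl:.1f}]")
    for week, count in enumerate(intervention, start=1):
        flag = "signal" if count < lcl or count > ucl else "common-cause variation"
        print(f"intervention week {week}: {count} -> {flag}")
    ```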

  17. Reasoning Processes Used by Paramedics to Solve Clinical Problems

    ERIC Educational Resources Information Center

    Alexander, Melissa

    2009-01-01

    The purpose of this exploratory qualitative study was to determine the reasoning processes used by paramedics to solve clinical problems. Existing research documents concern over the accuracy of paramedics' clinical decision-making, but no research was found that examines the cognitive processes by which paramedics make either faulty or accurate…

  18. Intelligent processing of materials; Proceedings of the Symposium, Fall Meeting of the Minerals, Metals, and Materials Society, Indianapolis, IN, Oct. 2-5, 1989

    NASA Astrophysics Data System (ADS)

    Wadley, Haydn N. G.; Eckhart, W. E., Jr.

    Current research on strategies of controlling product properties during processing is examined in reviews and reports. Problems discussed include predictive modeling, advanced sensing, and intelligent control. Particular attention is given to design and manufacturing of advanced materials and structures, intelligent control of carbon-carbon pyrolysis, computer simulation of crystal growth, modeling of phase change phenomena in boundary fitted coordinates, applications of optimization modeling techniques to materials processing, the effect of carbonization kinetics on in-process mechanical properties, nondestructive characterization and strength of solid-solid bond, and acoustic emission and ultrasonic sensing. Consideration is also given to collective learning systems for automatic control, a multicomponent knowledge base for spray casting process control, optimal control of microstructure during near-net shape processing, intelligent control of arc welding, and an intelligent control architecture for carbonization science.

  19. Intelligence supportability in future systems

    NASA Astrophysics Data System (ADS)

    Gold, Brian; Watson, Mariah; Vayette, Corey; Fiduk, Francis

    2010-08-01

    Advanced weaponry is providing an exponential increase in intelligence data collection capabilities, and the Intelligence Community (IC) is not properly positioned for the influx of intelligence supportability requirements the defense acquisition community is developing for it. The Air Force Materiel Command (AFMC) has initiated the Intelligence Supportability Analysis (ISA) process to allow the IC to triage programs for intelligence sensitivities as well as begin preparations within the IC for the transition of future programs to operational status. The ISA process is accomplished through system decomposition, allowing analysts to identify intelligence requirements and deficiencies. Early collaboration and engagement by program managers and intelligence analysts is crucial to the success of intelligence-sensitive programs through the utilization of a repeatable analytical framework for evaluating and making cognizant trade-offs between cost, schedule and performance. Addressing intelligence supportability early in the acquisition process will also influence system design and provide the necessary lead time for the intelligence community to react to and resource new requirements.

  20. [Process of perversion. Methodological and clinical study].

    PubMed

    Marchais, P

    1975-07-01

    Studies in classical psychiatry and psychoanalysis have reduced perversions to pathological phenomena, progressively lessening the moral criterion. Applying a comprehensive method to the study of acquired perversions leads one to consider various levels of observation. Properties common to each of these levels then emerge, making it possible to isolate a general process of perversion. This process, which retains undeniable links with mental pathology, must nevertheless be differentiated from it, as it cannot necessarily be assimilated or reduced to the latter. Moreover, it entails notable consequences at the social and cultural level. PMID:1233902

  1. [Definition and stabilization of processes II. Clinical Processes in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, Jose Ramón; Diz, Manuel Ramón; Martín, Carlos; López, M C

    2015-01-01

    New models in clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we have developed one of the most frequent clinical processes in our speciality, the process based on DRG 311, or transurethral procedures without complications. Throughout, we describe its components: the stabilization form, the clinical trajectory, the cost calculation, and finally the process flowchart. PMID:25688534

  2. Clinical Application of "The Change Process".

    ERIC Educational Resources Information Center

    Thompson, Mary L.

    A change process described in the work of Yochelson and Samenow was adapted to students committed as delinquents to a state correctional facility. Their criminal profile accurately described the majority of the offenders. While minor problems continued, their frequency was reduced by as much as nine times. Serious incidents occurred only after the…

  3. Design and validation of an intelligent patient monitoring and alarm system based on a fuzzy logic process model.

    PubMed

    Becker, K; Thull, B; Käsmacher-Leidinger, H; Stemmer, J; Rau, G; Kalff, G; Zimmermann, H J

    1997-09-01

    The process of patient care performed by an anaesthesiologist during high invasive surgery requires fundamental knowledge of the physiologic processes and a long standing experience in patient management to cope with the inter-individual variability of the patients. Biomedical engineering research improves the patient monitoring task by providing technical devices to measure a large number of a patient's vital parameters. These measurements improve the safety of the patient during the surgical procedure, because pathological states can be recognised earlier, but may also lead to an increased cognitive load of the physician. In order to reduce cognitive strain and to support intra-operative monitoring for the anaesthesiologist an intelligent patient monitoring and alarm system has been proposed and implemented which evaluates a patient's haemodynamic state on the basis of a current vital parameter constellation with a knowledge-based approach. In this paper general design aspects and evaluation of the intelligent patient monitoring and alarm system in the operating theatre are described. The validation of the inference engine of the intelligent patient monitoring and alarm system was performed in two steps. Firstly, the knowledge base was validated with real patient data which was acquired online in the operating theatre. Secondly, a research prototype of the whole system was implemented in the operating theatre. In the first step, the anaesthetists were asked to enter a state variable evaluation before a drug application or any other intervention on the patient into a recording system. These state variable evaluations were compared to those generated by the intelligent alarm system on the same vital parameter constellations. Altogether 641 state variable evaluations were entered by six different physicians. In total, the sensitivity of alarm recognition is 99.3%, the specificity is 66% and the predictability is 45%. The second step was performed using a research
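
    The inference engine maps a vital-parameter constellation to a haemodynamic state evaluation through fuzzy rules. The toy sketch below illustrates that style of reasoning with triangular membership functions and max-min rule evaluation; the state names, thresholds, and rules are assumptions made for the example and are not the published knowledge base.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def evaluate_haemodynamic_state(heart_rate, mean_arterial_pressure):
        """Degree of membership of the current constellation in each state."""
        hr_high = tri(heart_rate, 90, 130, 180)
        hr_normal = tri(heart_rate, 50, 75, 100)
        map_low = tri(mean_arterial_pressure, 40, 55, 70)
        map_normal = tri(mean_arterial_pressure, 65, 85, 105)
        # Max-min rule evaluation: rule strength = min of its antecedents,
        # state membership = max over the rules concluding that state.
        return {
            "stable":       min(hr_normal, map_normal),
            "hypovolaemic": min(hr_high, map_low),
        }

    print(evaluate_haemodynamic_state(heart_rate=120, mean_arterial_pressure=58))
    ```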

  4. The Influence of Cochlear Mechanical Dysfunction, Temporal Processing Deficits, and Age on the Intelligibility of Audible Speech in Noise for Hearing-Impaired Listeners.

    PubMed

    Johannesen, Peter T; Pérez-González, Patricia; Kalluri, Sridhar; Blanco, José L; Lopez-Poveda, Enrique A

    2016-01-01

    The aim of this study was to assess the relative importance of cochlear mechanical dysfunction, temporal processing deficits, and age on the ability of hearing-impaired listeners to understand speech in noisy backgrounds. Sixty-eight listeners took part in the study. They were provided with linear, frequency-specific amplification to compensate for their audiometric losses, and intelligibility was assessed for speech-shaped noise (SSN) and a time-reversed two-talker masker (R2TM). Behavioral estimates of cochlear gain loss and residual compression were available from a previous study and were used as indicators of cochlear mechanical dysfunction. Temporal processing abilities were assessed using frequency modulation detection thresholds. Age, audiometric thresholds, and the difference between audiometric threshold and cochlear gain loss were also included in the analyses. Stepwise multiple linear regression models were used to assess the relative importance of the various factors for intelligibility. Results showed that (a) cochlear gain loss was unrelated to intelligibility, (b) residual cochlear compression was related to intelligibility in SSN but not in a R2TM, (c) temporal processing was strongly related to intelligibility in a R2TM and much less so in SSN, and (d) age per se impaired intelligibility. In summary, all factors affected intelligibility, but their relative importance varied across maskers. PMID:27604779
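
    The relative importance of the factors is assessed with stepwise multiple linear regression. The sketch below shows a minimal forward-selection loop that adds, at each step, the predictor giving the largest gain in R²; the predictor names and synthetic data are placeholders for the study's measures, and the stopping rule is simplified.

    ```python
    import numpy as np

    def forward_stepwise(X, y, names, max_terms=3):
        """Greedy forward selection: add the predictor that most increases R^2."""
        selected, remaining = [], list(range(X.shape[1]))
        for _ in range(max_terms):
            best = None
            for j in remaining:
                cols = selected + [j]
                A = np.column_stack([np.ones(len(y)), X[:, cols]])
                coef, *_ = np.linalg.lstsq(A, y, rcond=None)
                resid = y - A @ coef
                r2 = 1 - resid.var() / y.var()
                if best is None or r2 > best[1]:
                    best = (j, r2)
            selected.append(best[0])
            remaining.remove(best[0])
            print(f"added {names[best[0]]}: cumulative R^2 = {best[1]:.2f}")
        return selected

    rng = np.random.default_rng(2)
    names = ["age", "fm_detection_threshold", "cochlear_gain_loss", "residual_compression"]
    X = rng.standard_normal((60, 4))
    y = 0.8 * X[:, 1] + 0.4 * X[:, 0] + 0.2 * rng.standard_normal(60)   # synthetic SRTs
    forward_stepwise(X, y, names)
    ```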

  5. An Intelligent Computerized Stretch Reflex Measurement System For Clinical And Investigative Neurology

    NASA Astrophysics Data System (ADS)

    Flanagan, P. M.; Chutkow, J. G.; Riggs, M. T.; Cristiano, V. D.

    1987-05-01

    We describe the design of a reliable, user-friendly preprototype system for quantifying the tendon stretch reflexes in humans and large mammals. A hand-held, instrumented reflex gun, the impactor of which contains a single force sensor, interfaces with a computer. The resulting test system can deliver sequences of reproducible stimuli at graded intensities and adjustable durations to a muscle's tendon ("tendon taps"), measure the impacting force of each tap, and record the subsequent reflex muscle contraction from the same tendon -- all automatically. The parameters of the reflex muscle contraction include latency; mechanical threshold; and peak time, peak magnitude, and settling time. The results of clinical tests presented in this paper illustrate the system's potential usefulness in detecting neurologic dysfunction affecting the tendon stretch reflexes, in documenting the course of neurologic illnesses and their response to therapy, and in clinical and laboratory neurologic research.

  6. Become a Star: Teaching the Process of Design and Implementation of an Intelligent System

    ERIC Educational Resources Information Center

    Venables, Anne; Tan, Grace

    2005-01-01

    Teaching future knowledge engineers, the necessary skills for designing and implementing intelligent software solutions required by business, industry and research today, is a very tall order. These skills are not easily taught in traditional undergraduate computer science lectures; nor are the practical experiences easily reinforced in laboratory…

  7. Performance on Temporal Information Processing as an Index of General Intelligence

    ERIC Educational Resources Information Center

    Rammsayer, Thomas H.; Brandler, Susanne

    2007-01-01

    The relation between general intelligence (psychometric "g") and temporal resolution capacity of the central nervous system was examined by assessing performance on eight different temporal tasks in a sample of 100 participants. Correlational and principal component analyses suggested a unitary timing mechanism, referred to as temporal "g".…

  8. Analyzing Learner Language: Towards a Flexible Natural Language Processing Architecture for Intelligent Language Tutors

    ERIC Educational Resources Information Center

    Amaral, Luiz; Meurers, Detmar; Ziai, Ramon

    2011-01-01

    Intelligent language tutoring systems (ILTS) typically analyze learner input to diagnose learner language properties and provide individualized feedback. Despite a long history of ILTS research, such systems are virtually absent from real-life foreign language teaching (FLT). Taking a step toward more closely linking ILTS research to real-life…

  9. Artificial intelligence in process control: Knowledge base for the shuttle ECS model

    NASA Technical Reports Server (NTRS)

    Stiffler, A. Kent

    1989-01-01

    The general operation of KATE, an artificial intelligence controller, is outlined. A shuttle environmental control system (ECS) demonstration system for KATE is explained. The knowledge base model for this system is derived. An experimental test procedure is given to verify parameters in the model.

  10. Animated-simulation modeling facilitates clinical-process costing.

    PubMed

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making. PMID:11552586
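
    The practical difference highlighted here is that a simulation returns a cost range rather than a single point estimate. The sketch below is a hypothetical Monte Carlo version of that idea for a two-step clinical process; the task-time distributions and unit costs are invented for the example and do not model any particular tool.

    ```python
    import random
    import statistics

    def simulate_episode():
        """Cost of one simulated patient episode (times in minutes, rates per minute)."""
        triage_minutes = random.triangular(5, 20, 10)        # low, high, mode
        treatment_minutes = random.triangular(20, 90, 40)
        nurse_rate, physician_rate = 0.8, 2.5                 # illustrative unit costs
        return triage_minutes * nurse_rate + treatment_minutes * physician_rate

    costs = [simulate_episode() for _ in range(10_000)]
    costs.sort()
    print(f"median cost: {statistics.median(costs):.0f}")
    print(f"90% range: {costs[500]:.0f} - {costs[9499]:.0f}")   # 5th to 95th percentile
    ```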

  11. An international observational study suggests that artificial intelligence for clinical decision support optimizes anemia management in hemodialysis patients.

    PubMed

    Barbieri, Carlo; Molina, Manuel; Ponce, Pedro; Tothova, Monika; Cattinelli, Isabella; Ion Titapiccolo, Jasmine; Mari, Flavio; Amato, Claudia; Leipold, Frank; Wehmeyer, Wolfgang; Stuard, Stefano; Stopper, Andrea; Canaud, Bernard

    2016-08-01

    Managing anemia in hemodialysis patients can be challenging because of competing therapeutic targets and individual variability. Because therapy recommendations provided by a decision support system can benefit both patients and doctors, we evaluated the impact of an artificial intelligence decision support system, the Anemia Control Model (ACM), on anemia outcomes. Based on patient profiles, the ACM was built to recommend suitable erythropoietic-stimulating agent doses. Our retrospective study consisted of a 12-month control phase (standard anemia care), followed by a 12-month observation phase (ACM-guided care) encompassing 752 patients undergoing hemodialysis therapy in 3 NephroCare clinics located in separate countries. The percentage of hemoglobin values on target, the median darbepoetin dose, and individual hemoglobin fluctuation (estimated from the intrapatient hemoglobin standard deviation) were deemed primary outcomes. In the observation phase, median darbepoetin consumption significantly decreased from 0.63 to 0.46 μg/kg/month, whereas on-target hemoglobin values significantly increased from 70.6% to 76.6%, reaching 83.2% when the ACM suggestions were implemented. Moreover, ACM introduction led to a significant decrease in hemoglobin fluctuation (intrapatient standard deviation decreased from 0.95 g/dl to 0.83 g/dl). Thus, ACM support helped improve anemia outcomes of hemodialysis patients, minimizing erythropoietic-stimulating agent use with the potential to reduce the cost of treatment. PMID:27262365
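
    The primary outcomes reported (share of haemoglobin values on target and intrapatient haemoglobin standard deviation) can be computed directly from per-patient measurement series. The sketch below shows that computation on invented values; the 10-12 g/dl target band is an assumption made for the example, not a figure taken from the study.

    ```python
    import statistics

    # Illustrative monthly haemoglobin values (g/dl) per patient; the 10-12 g/dl
    # target band is an assumption for the example, not taken from the study.
    patients = {
        "patient_A": [10.4, 11.1, 11.8, 10.9, 11.3, 10.7],
        "patient_B": [9.2, 10.1, 12.6, 11.9, 9.8, 10.5],
    }
    target_low, target_high = 10.0, 12.0

    all_values = [v for series in patients.values() for v in series]
    on_target = sum(target_low <= v <= target_high for v in all_values) / len(all_values)
    print(f"haemoglobin values on target: {100 * on_target:.1f}%")

    for name, series in patients.items():
        fluctuation = statistics.stdev(series)     # intrapatient standard deviation
        print(f"{name}: intrapatient SD = {fluctuation:.2f} g/dl")
    ```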

  12. Real time intelligent process control system for thin film solar cell manufacturing

    SciTech Connect

    George Atanasoff

    2010-10-29

    significant equipment refurbishing needed for installation of multiple separate ellipsometric systems, and development of customized software to control all of them simultaneously. The proposed optical monitoring system comprises AccuStrata’s fiber optics sensors installed inside the thin film deposition equipment, a hardware module of different components (beyond the scope of this project) and our software program with iterative predicting capability able to control material bandgap and surface roughness as films are deposited. Our miniature fiber optics monitoring sensors are installed inside the vacuum chamber compartments in very close proximity where the independent layers are deposited (an option patented by us in 2003). The optical monitoring system measures two of the most important parameters of the photovoltaic thin films during deposition on a moving solar panel - material bandgap and surface roughness. In this program each sensor array consists of two fiber optics sensors monitoring two independent areas of the panel under deposition. Based on the monitored parameters and their change in time and from position to position on the panel, the system is able to provide to the equipment operator immediate information about the thin films as they are deposited. This DoE Supply Chain program is considered the first step towards the development of intelligent optical control system capable of dynamically adjusting the manufacturing process “on-the-fly” in order to achieve better performance. The proposed system will improve the thin film solar cell manufacturing by improving the quality of the individual solar cells and will allow for the manufacturing of more consistent and uniform products resulting in higher solar conversion efficiency and manufacturing yield. It will have a significant impact on the multibillion-dollar thin film solar market. We estimate that the financial impact of these improvements if adopted by only 10% of the industry ($7.7 Billion) would

  13. Development of an "intelligent grinding wheel" for in-process monitoring of ceramic grinding. Semi-annual report No. 2, March 1, 1997--August 31, 1997

    SciTech Connect

    Malkin, S.; Gao, R.; Guo, C.; Varghese, B.; Pathare, S.

    1997-09-29

    The overall objective of this project is to develop sensor-integrated "intelligent" diamond wheels for grinding ceramics. Such wheels will be "smart" enough to monitor and supervise both the wheel preparation and grinding processes without the need to instrument the machine tool. Intelligent wheels will utilize reusable cores integrated with two types of sensors: acoustic emission (AE) and dynamic force transducers. Signals from the sensors will be transmitted from a rotating wheel to a receiver by telemetry. Intelligent wheels will be "trained" to recognize distinct characteristics associated with truing, dressing and grinding.

  14. Comparing Binaural Pre-processing Strategies III: Speech Intelligibility of Normal-Hearing and Hearing-Impaired Listeners.

    PubMed

    Völker, Christoph; Warzybok, Anna; Ernst, Stephan M A

    2015-01-01

    A comprehensive evaluation of eight signal pre-processing strategies, including directional microphones, coherence filters, single-channel noise reduction, binaural beamformers, and their combinations, was undertaken with normal-hearing (NH) and hearing-impaired (HI) listeners. Speech reception thresholds (SRTs) were measured in three noise scenarios (multitalker babble, cafeteria noise, and single competing talker). Predictions of three common instrumental measures were compared with the general perceptual benefit caused by the algorithms. The individual SRTs measured without pre-processing and individual benefits were objectively estimated using the binaural speech intelligibility model. Ten listeners with NH and 12 HI listeners participated. The participants varied in age and pure-tone threshold levels. Although HI listeners required a better signal-to-noise ratio to obtain 50% intelligibility than listeners with NH, no differences in SRT benefit from the different algorithms were found between the two groups. With the exception of single-channel noise reduction, all algorithms showed an improvement in SRT of between 2.1 dB (in cafeteria noise) and 4.8 dB (in single competing talker condition). Model predictions with binaural speech intelligibility model explained 83% of the measured variance of the individual SRTs in the no pre-processing condition. Regarding the benefit from the algorithms, the instrumental measures were not able to predict the perceptual data in all tested noise conditions. The comparable benefit observed for both groups suggests a possible application of noise reduction schemes for listeners with different hearing status. Although the model can predict the individual SRTs without pre-processing, further development is necessary to predict the benefits obtained from the algorithms at an individual level. PMID:26721922

  15. A study of clinical and information management processes in the surgical pre-assessment clinic

    PubMed Central

    2014-01-01

    Background Establishing day-case surgery as the preferred hospital admission route for all eligible patients requires adequate preoperative assessment of patients in order to quickly distinguish those who will require minimum assessment and are suitable for day-case admission from those who will require more extensive management and will need to be admitted as inpatients. Methods As part of a study to elucidate clinical and information management processes within the patient surgical pathway in NHS Scotland, we conducted a total of 10 in-depth semi-structured interviews during 4 visits to the Dumfries & Galloway Royal Infirmary surgical pre-assessment clinic. We modelled clinical processes using process-mapping techniques and analysed interview data using qualitative methods. We used Normalisation Process Theory as a conceptual framework to interpret the factors which were identified as facilitating or hindering information elucidation tasks and communication within the multi-disciplinary team. Results The pre-assessment clinic of Dumfries & Galloway Royal Infirmary was opened in 2008 in response to clinical and workflow issues which had been identified with former patient management practices in the surgical pathway. The preoperative clinic now operates under well established processes and protocols. The use of a computerised system for managing preoperative documentation substantially transformed clinical practices and facilitates communication and information-sharing among the multi-disciplinary team. Conclusion Successful deployment and normalisation of innovative clinical and information management processes was possible because both local and national strategic priorities were synergistic and the system was developed collaboratively by the POA staff and the health-board IT team, resulting in a highly contextualised operationalisation of clinical and information management processes. Further concerted efforts from a range of stakeholders are required to fully

  16. Neuropsychological Profiles in Individuals at Clinical High Risk for Psychosis: Relationship to Psychosis and Intelligence

    PubMed Central

    Woodberry, Kristen A.; Seidman, Larry J.; Giuliano, Anthony J.; Verdi, Mary B.; Cook, William L.; McFarlane, William R.

    2010-01-01

    Background Characterizing neuropsychological (NP) functioning of individuals at clinical high risk (CHR) for psychosis may be useful for prediction of psychosis and understanding functional outcome. The degree to which NP impairments are associated with general cognitive ability and/or later emergence of full psychosis in CHR samples requires study with well-matched controls. Methods We assessed NP functioning across eight cognitive domains in a sample of 73 CHR youth, 13 of whom developed psychotic-level symptoms after baseline assessment, and 34 healthy comparison (HC) subjects. Groups were matched on age, sex, ethnicity, handedness, subject and parent grade attainment, and median family income, and were comparable on WRAT-3 Reading, an estimate of premorbid IQ. Profile analysis was used to examine group differences and the role of IQ in profile shape. Results The CHR sample demonstrated a significant difference in overall magnitude of NP impairment but only a small and nearly significant difference in profile shape, primarily due to a large impairment in olfactory identification. Individuals who subsequently developed psychotic-level symptoms demonstrated large impairments in verbal IQ, verbal memory and olfactory identification comparable in magnitude to first episode samples. Conclusions CHR status may be associated with moderate generalized cognitive impairments marked by some degree of selective impairment in olfaction and verbal memory. Impairments were greatest in those who later developed psychotic symptoms. Future study of olfaction in CHR samples may enhance early detection and specification of neurodevelopmental mechanisms of risk. PMID:20692125

  17. Synthesis and processing of intelligent cost-effective structures: a final review of the ARPA SPICES program

    NASA Astrophysics Data System (ADS)

    Jacobs, Jack H.

    1996-05-01

    The Synthesis and Processing of Intelligent Cost Effective Structures (SPICES) program comprises a consortium of industrial, academic and government labs working to develop cost-effective material processing and synthesis technologies that enable new products using active vibration suppression and control devices to be brought to market. Since smart structures involve the integration of multiple engineering disciplines, the objective of the consortium has been to establish cost-effective design processes across this multi-organizational team so that each member can incorporate the new technology into its respective product lines. Over the twenty-four-month program, many improvements in sensors, actuators, modeling, manufacturing/integration and controls have been realized. The paper outlines the four phases of development in the program and the impact some of the key technologies will have on the smart structure development process in the future.

  18. Thinking Processes Used by Nurses in Clinical Decision Making.

    ERIC Educational Resources Information Center

    Higuchi, Kathryn A. Smith; Donald, Janet G.

    2002-01-01

    Interviews with eight medical and surgical nurses and audits of patient charts investigated clinical decision-making processes. Predominant thinking processes were description of facts, selection of information, inference, syntheses, and verification, with differences between medical and surgical specialties. Exemplars of thinking processes…

  19. Adaptive healthcare processes for personalized emergency clinical pathways.

    PubMed

    Poulymenopoulou, M; Papakonstantinou, D; Malamateniou, F; Vassilacopoulos, G

    2014-01-01

    Pre-hospital and in-hospital emergency healthcare delivery involves a variety of activities and people that must be coordinated to create an effective emergency care plan. Emergency care provided by emergency healthcare professionals can be improved by personalized emergency clinical pathways, which are instances of relevant emergency clinical guidelines tailored to the needs of the emergency case as well as to ambulance and hospital resource availability, while also enabling better resource use. Business Process Management Systems (BPMSs), in conjunction with semantic technologies, can support personalized emergency clinical pathways by incorporating clinical guideline logic into emergency healthcare processes at run-time, according to emergency care context information (current emergency case and resource information). On these grounds, a framework is proposed that uses an ontology to model knowledge on emergency case medical history, healthcare resource availability, relevant clinical guidelines and process logic; inference over this knowledge yields the process model most suitable for the case, in line with relevant clinical guidelines. PMID:25160219
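    The following is a minimal sketch of this idea, not the authors' framework: a pathway is instantiated from a guideline template and adapted to case context and resource availability. All guideline names, fields, thresholds and rules below are hypothetical.

      from dataclasses import dataclass

      @dataclass
      class EmergencyCase:
          suspected_condition: str   # e.g. "stroke", "acute_mi"
          minutes_since_onset: int

      # Toy "ontology": guideline templates keyed by condition, each a task list.
      GUIDELINES = {
          "stroke":   ["triage", "ct_scan", "thrombolysis_assessment", "admit_stroke_unit"],
          "acute_mi": ["triage", "ecg", "pci_assessment", "admit_cardiology"],
      }

      def personalize_pathway(case: EmergencyCase, resources: dict) -> list:
          """Instantiate a pathway for the case, adapting it to available resources."""
          tasks = list(GUIDELINES[case.suspected_condition])
          # Context-sensitive adaptation: reroute when a required resource is missing.
          if case.suspected_condition == "stroke" and not resources.get("ct_available", False):
              tasks[tasks.index("ct_scan")] = "transfer_to_ct_capable_hospital"
          if case.suspected_condition == "stroke" and case.minutes_since_onset > 270:
              tasks.remove("thrombolysis_assessment")   # outside the (illustrative) window
          return tasks

      print(personalize_pathway(EmergencyCase("stroke", 120), {"ct_available": True}))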

  20. A clinical study of the effects of lead poisoning on the intelligence and neurobehavioral abilities of children

    PubMed Central

    2013-01-01

    Background Lead is a heavy metal, an important environmental toxicant and a nerve poison that can disrupt many functions of the nervous system. Lead poisoning is a medical condition caused by increased levels of lead in the body. Lead interferes with a variety of body processes and is toxic to many organs and tissues, including the central nervous system. Because it interferes with the development of the nervous system, it is particularly toxic to children, causing potentially permanent neural and cognitive impairments. In this study, we investigated the relationship between lead poisoning and the intellectual and neurobehavioral capabilities of children. Methods The background characteristics of the research subjects were collected by questionnaire survey. Blood lead levels were measured by differential potentiometric stripping analysis (DPSA). Intelligence was assessed using the Gesell Developmental Scale, and the Achenbach Child Behavior Checklist (CBCL) was used to evaluate each child’s behavior. Results Blood lead levels were significantly negatively correlated with the developmental quotients of adaptive behavior, gross motor performance, fine motor performance, language development, and individual social behavior (P < 0.01). Compared with healthy children, more children with lead poisoning showed abnormal behaviors, especially social withdrawal, depression, atypical body movements, aggression and destructiveness. Conclusion Lead poisoning has adverse effects on the behavior and mental development of 2–4-year-old children, underscoring the need for positive and effective precautionary measures. PMID:23414525
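    The kind of correlation analysis reported above can be illustrated with the short sketch below. It is not the study's analysis code; the blood lead values and developmental quotients are fabricated solely to show the computation of a (negative) rank correlation.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(1)
      blood_lead = rng.uniform(20, 250, size=120)                    # ug/L, hypothetical
      dq_language = 100 - 0.08 * blood_lead + rng.normal(0, 8, 120)  # toy negative trend

      rho, p_value = spearmanr(blood_lead, dq_language)
      print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")  # expect rho < 0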

  1. The Problem of Defining Intelligence.

    ERIC Educational Resources Information Center

    Lubar, David

    1981-01-01

    The major philosophical issues surrounding the concept of intelligence are reviewed with respect to the problems surrounding the process of defining and developing artificial intelligence (AI) in computers. Various current definitions and problems with these definitions are presented. (MP)

  2. Processes for quality improvements in radiation oncology clinical trials.

    PubMed

    FitzGerald, T J; Urie, Marcia; Ulin, Kenneth; Laurie, Fran; Yorty, Jeffrey; Hanusik, Richard; Kessel, Sandy; Jodoin, Maryann Bishop; Osagie, Gani; Cicchetti, M Giulia; Pieters, Richard; McCarten, Kathleen; Rosen, Nancy

    2008-01-01

    Quality assurance in radiotherapy (RT) has been an integral aspect of cooperative group clinical trials since 1970. In early clinical trials, data acquisition was nonuniform and inconsistent and computational models for radiation dose calculation varied significantly. Process improvements developed for data acquisition, credentialing, and data management have provided the necessary infrastructure for uniform data. With continued improvement in the technology and delivery of RT, evaluation processes for target definition, RT planning, and execution undergo constant review. As we move to multimodality image-based definitions of target volumes for protocols, future clinical trials will require near real-time image analysis and feedback to field investigators. The ability of quality assurance centers to meet these real-time challenges with robust electronic interaction platforms for imaging acquisition, review, archiving, and quantitative review of volumetric RT plans will be the primary challenge for future successful clinical trials. PMID:18406943

  3. What can Natural Language Processing do for Clinical Decision Support?

    PubMed Central

    Demner-Fushman, Dina; Chapman, Wendy W.; McDonald, Clement J.

    2009-01-01

    Computerized Clinical Decision Support (CDS) aims to aid decision making of health care providers and the public by providing easily accessible health-related information at the point and time it is needed. Natural Language Processing (NLP) is instrumental in using free-text information to drive CDS, representing clinical knowledge and CDS interventions in standardized formats, and leveraging clinical narrative. The early innovative NLP research of clinical narrative was followed by a period of stable research conducted at the major clinical centers and a shift of mainstream interest to biomedical NLP. This review primarily focuses on the recently renewed interest in development of fundamental NLP methods and advances in the NLP systems for CDS. The current solutions to challenges posed by distinct sublanguages, intended user groups, and support goals are discussed. PMID:19683066

  4. Process of research investigations in artificial intelligence-an unified view

    SciTech Connect

    Baldwin, D.; Yadav, S.B.

    1995-05-01

    A number of research communities recognize Artificial Intelligence (AI) as a valid reference discipline. However, several papers have criticized AI's research methodologies. This paper attempts to clarify and improve the methods used in AI. Definitions are proposed for terms such as AI theory, principles, hypotheses, and observations. Next, a unified view of AI research methodology is proposed. This methodology contains a long term dimension based upon the scientific method and an individual project dimension. The individual project dimension identifies four strategies: Hypothetical/deductive, hermeneutical/inductive, case-based, and historical analysis. The strategies differ according to how prototyping is used in an experiment. 78 refs.

  5. Full Intelligent Cancer Classification of Thermal Breast Images to Assist Physician in Clinical Diagnostic Applications

    PubMed Central

    Lashkari, AmirEhsan; Pak, Fatemeh; Firouzmand, Mohammad

    2016-01-01

    Breast cancer is the most common type of cancer among women. The key to treating breast cancer is early detection: according to many pathological studies, more than 75%–80% of all abnormalities are still benign at early stages, so in recent years many studies and extensive research have been devoted to detecting breast cancer earlier and with higher precision and accuracy. Infra-red breast thermography is an imaging technique based on recording temperature distribution patterns of breast tissue. Compared with mammography, thermography is more suitable because it is noninvasive, non-contact, passive and free of ionizing radiation. In this paper, a fully automatic, high-accuracy technique for classifying suspicious areas in thermogram images, aimed at assisting physicians in the early detection of breast cancer, is presented. The proposed algorithm consists of four main steps: pre-processing & segmentation, feature extraction, feature selection and classification. In the first step, the region of interest (ROI) is determined and image quality is improved fully automatically; thresholding and edge detection techniques separate the right and left breasts, suspected areas are then segmented, and the image matrix is normalised to account for the uniqueness of each person's body temperature. At the feature extraction stage, 23 features, including statistical, morphological, frequency-domain, histogram and Gray Level Co-occurrence Matrix (GLCM) based features, are extracted from the segmented right and left breasts obtained in step 1. To obtain the best features, feature selection methods such as minimum Redundancy Maximum Relevance (mRMR), Sequential Forward Selection (SFS), Sequential Backward Selection (SBS), Sequential Floating Forward Selection (SFFS), Sequential Floating Backward Selection (SFBS) and a Genetic Algorithm (GA) are used in step 3. Finally to classify and TH labeling procedures
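    Two of the named steps, GLCM texture features and sequential forward feature selection feeding a classifier, can be sketched with scikit-image and scikit-learn as below. This is not the authors' code: the synthetic "thermogram" patches, thresholds, feature set and labels are illustrative assumptions only, and an SVM stands in for the unspecified classifier.

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops
      from sklearn.feature_selection import SequentialFeatureSelector
      from sklearn.svm import SVC

      def glcm_features(patch):
          """A few GLCM-based texture features for an 8-bit grayscale patch."""
          glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                              symmetric=True, normed=True)
          props = ("contrast", "homogeneity", "energy", "correlation")
          return [graycoprops(glcm, p)[0, 0] for p in props]

      # Fabricated patches: label 1 = slightly warmer, noisier region.
      rng = np.random.default_rng(2)
      X, y = [], []
      for label in (0, 1):
          for _ in range(40):
              base = 120 + 25 * label
              patch = np.clip(rng.normal(base, 10 + 10 * label, (32, 32)), 0, 255).astype(np.uint8)
              X.append(glcm_features(patch) + [patch.mean(), patch.std()])
              y.append(label)
      X, y = np.array(X), np.array(y)

      # Sequential forward selection (a stand-in for the SFS step), then an SVM classifier.
      sfs = SequentialFeatureSelector(SVC(), n_features_to_select=3, direction="forward")
      X_sel = sfs.fit_transform(X, y)
      print("selected feature indices:", np.where(sfs.get_support())[0])
      print("training accuracy:", SVC().fit(X_sel, y).score(X_sel, y))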

  6. Full Intelligent Cancer Classification of Thermal Breast Images to Assist Physician in Clinical Diagnostic Applications.

    PubMed

    Lashkari, AmirEhsan; Pak, Fatemeh; Firouzmand, Mohammad

    2016-01-01

    Breast cancer is the most common type of cancer among women. The key to treating breast cancer is early detection: according to many pathological studies, more than 75%–80% of all abnormalities are still benign at early stages, so in recent years many studies and extensive research have been devoted to detecting breast cancer earlier and with higher precision and accuracy. Infra-red breast thermography is an imaging technique based on recording temperature distribution patterns of breast tissue. Compared with mammography, thermography is more suitable because it is noninvasive, non-contact, passive and free of ionizing radiation. In this paper, a fully automatic, high-accuracy technique for classifying suspicious areas in thermogram images, aimed at assisting physicians in the early detection of breast cancer, is presented. The proposed algorithm consists of four main steps: pre-processing & segmentation, feature extraction, feature selection and classification. In the first step, the region of interest (ROI) is determined and image quality is improved fully automatically; thresholding and edge detection techniques separate the right and left breasts, suspected areas are then segmented, and the image matrix is normalised to account for the uniqueness of each person's body temperature. At the feature extraction stage, 23 features, including statistical, morphological, frequency-domain, histogram and Gray Level Co-occurrence Matrix (GLCM) based features, are extracted from the segmented right and left breasts obtained in step 1. To obtain the best features, feature selection methods such as minimum Redundancy Maximum Relevance (mRMR), Sequential Forward Selection (SFS), Sequential Backward Selection (SBS), Sequential Floating Forward Selection (SFFS), Sequential Floating Backward Selection (SFBS) and a Genetic Algorithm (GA) are used in step 3. Finally to classify and TH labeling procedures

  7. The role of accent imitation in sensorimotor integration during processing of intelligible speech.

    PubMed

    Adank, Patti; Rueschemeyer, Shirley-Ann; Bekkering, Harold

    2013-01-01

    Recent theories on how listeners maintain perceptual invariance despite variation in the speech signal allocate a prominent role to imitation mechanisms. Notably, these simulation accounts propose that motor mechanisms support perception of ambiguous or noisy signals. Indeed, imitation of ambiguous signals, e.g., accented speech, has been found to aid effective speech comprehension. Here, we explored the possibility that imitation in speech benefits perception by increasing activation in speech perception and production areas. Participants rated the intelligibility of sentences spoken in an unfamiliar accent of Dutch in a functional Magnetic Resonance Imaging experiment. Next, participants in one group repeated the sentences in their own accent, while a second group vocally imitated the accent. Finally, both groups rated the intelligibility of accented sentences in a post-test. The neuroimaging results showed an interaction between type of training and pre- and post-test sessions in left Inferior Frontal Gyrus, Supplementary Motor Area, and left Superior Temporal Sulcus. Although alternative explanations such as task engagement and fatigue need to be considered as well, the results suggest that imitation may aid effective speech comprehension by supporting sensorimotor integration. PMID:24109447

  8. E-facts: business process management in clinical data repositories.

    PubMed

    Wattanasin, Nich; Peng, Zhaoping; Raine, Christine; Mitchell, Mariah; Wang, Charles; Murphy, Shawn N

    2008-01-01

    The Partners Healthcare Research Patient Data Registry (RPDR) is a centralized data repository that gathers clinical data from various hospital systems. The RPDR allows clinical investigators to obtain aggregate numbers of patients with user-defined characteristics such as diagnoses, procedures, medications, and laboratory values. They may then obtain patient identifiers and electronic medical records with prior IRB approval. Moreover, the accurate identification and efficient population of worthwhile, quantifiable facts from doctors' reports into the RPDR is a significant process. As part of our ongoing e-Fact project, this work describes a new business process management technology that helps coordinate and simplify this procedure. PMID:18999043
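    To illustrate the kind of aggregate cohort query such a repository supports (counts of patients matching diagnosis and laboratory criteria, with no identifiers returned), the sketch below uses an in-memory SQLite database. The schema, codes and thresholds are invented for the example and are not the RPDR schema.

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE diagnoses (patient_id INTEGER, icd_code TEXT);
      CREATE TABLE labs      (patient_id INTEGER, test TEXT, value REAL);
      INSERT INTO diagnoses VALUES (1,'E11'), (2,'E11'), (3,'I10');
      INSERT INTO labs      VALUES (1,'HbA1c', 8.2), (2,'HbA1c', 6.1), (3,'HbA1c', 7.5);
      """)

      # Aggregate count only -- no identifiers -- as in the pre-IRB query step.
      (count,) = con.execute("""
          SELECT COUNT(DISTINCT d.patient_id)
          FROM diagnoses d JOIN labs l ON l.patient_id = d.patient_id
          WHERE d.icd_code = 'E11' AND l.test = 'HbA1c' AND l.value > 7.0
      """).fetchone()
      print("patients with diabetes diagnosis and HbA1c > 7.0:", count)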

  9. Intelligent Tutor

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA also seeks to advance American education by employing the technology utilization process to develop a computerized, artificial intelligence-based Intelligent Tutoring System (ITS) to help high school and college physics students. The tutoring system is designed for use with the lecture and laboratory portions of a typical physics instructional program. Its importance lies in its ability to observe continually as a student develops problem solutions and to intervene when appropriate with assistance specifically directed at the student's difficulty and tailored to his skill level and learning style. ITS originated as a project of the Johnson Space Center (JSC). It is being developed by JSC's Software Technology Branch in cooperation with Dr. R. Bowen Loftin at the University of Houston-Downtown. The program is jointly sponsored by NASA and ACOT (Apple Classrooms of Tomorrow). Other organizations providing support include the Texas Higher Education Coordinating Board, the National Research Council, Pennzoil Products Company and the George R. Brown Foundation. The Physics I class of Clear Creek High School, League City, Texas is providing the classroom environment for test and evaluation of the system. The ITS is a spinoff of a product developed earlier to integrate artificial intelligence into training/tutoring systems for NASA astronauts, flight controllers and engineers.

  10. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    PubMed

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors caused by communication failures are a major problem during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize an individual patient's raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) the continuum of care, multimedia abstract generation of intensive care data (MAGIC)-an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) the isolated point in time, "Inference Engine"-a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, MAGIC, the patient's intraoperative course was reviewed in the intensive care unit before patient arrival and then judged against the actual physician briefing and against briefings in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information and twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results. PMID:20012610
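    In the spirit of the medical-logic-module approach described above, the toy sketch below reduces raw data elements to a coarse, summarized clinical status using explicit rules. The variable names and thresholds are invented for illustration and are not taken from the Inference Engine.

      def summarize_status(obs):
          """Map raw observations to a coarse illness-severity label via ordered rules."""
          rules = [
              (lambda o: o["mean_arterial_pressure"] < 60 or o["lactate"] > 4.0, "critical"),
              (lambda o: o["vasopressor_dose"] > 0 or o["lactate"] > 2.0,        "guarded"),
          ]
          for condition, label in rules:
              if condition(obs):
                  return label
          return "stable"

      print(summarize_status({"mean_arterial_pressure": 55, "lactate": 3.1, "vasopressor_dose": 0.0}))
      print(summarize_status({"mean_arterial_pressure": 75, "lactate": 1.2, "vasopressor_dose": 0.0}))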