A child with a difficult airway: what do I do next?
Engelhardt, Thomas; Weiss, Markus
2012-06-01
Difficulties in pediatric airway management are common and continue to result in significant morbidity and mortality. This review reports on current concepts in approaching a child with a difficult airway. Routine airway management in healthy children with normal airways is simple in experienced hands. Mask ventilation (oxygenation) is always possible and tracheal intubation normally simple. However, transient hypoxia is common in these children, usually due to unexpected anatomical and functional airway problems or failure to ventilate during rapid sequence induction. Anatomical airway problems (upper airway collapse and adenoid hypertrophy) and functional airway problems (laryngospasm, bronchospasm, insufficient depth of anesthesia and muscle rigidity, gastric hyperinflation, and alveolar collapse) require urgent recognition and treatment algorithms due to insufficient oxygen reserves. Early muscle paralysis and epinephrine administration aid resolution of these functional airway obstructions. Children with an 'impaired' normal (foreign body, allergy, and inflammation) or an expected difficult (scars, tumors, and congenital) airway require careful planning and expertise. Training in the recognition and management of these different situations as well as a suitably equipped anesthesia workstation and trained personnel are essential. The healthy child with an unexpected airway problem requires clear strategies. The 'impaired' normal pediatric airway may be handled by anesthetists experienced with children, whereas the expected difficult pediatric airway requires dedicated pediatric anesthesia specialist care and should only be managed in specialized centers.
Explicit solution techniques for impact with contact constraints
NASA Technical Reports Server (NTRS)
McCarty, Robert E.
1993-01-01
Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.
ERIC Educational Resources Information Center
Kolata, Gina
1985-01-01
To determine how hard it is for computers to solve problems, researchers have classified groups of problems (polynomial hierarchy) according to how much time they seem to require for their solutions. A difficult and complex proof is offered which shows that a combinatorial approach (using Boolean circuits) may resolve the problem. (JN)
Auditory Processing Disorder (For Parents)
... or other speech-language difficulties? Are verbal (word) math problems difficult for your child? Is your child ... inferences from conversations, understanding riddles, or comprehending verbal math problems — require heightened auditory processing and language levels. ...
Procedural versus Content-Related Hints for Word Problem Solving: An Exploratory Study
ERIC Educational Resources Information Center
Kock, W. D.; Harskamp, E. G.
2016-01-01
For primary school students, mathematical word problems are often more difficult to solve than straightforward number problems. Word problems require reading and analysis skills, and in order to explain their situational contexts, the proper mathematical knowledge and number operations have to be selected. To improve students' ability in solving…
Home Economics Reading Skills: Problems and Selected References.
ERIC Educational Resources Information Center
Cranney, A. Garr; And Others
Home economics presents at least eight problems to secondary school reading teachers. These problems include poor readers, difficult reading material, lack of reading materials, teachers' lack of training in reading instruction, scarce information about home economics for reading teachers, diversity of the home economics field (requiring a wide…
Cognitive Development, Genetics Problem Solving, and Genetics Instruction: A Critical Review.
ERIC Educational Resources Information Center
Smith, Mike U.; Sims, O. Suthern, Jr.
1992-01-01
Review of literature concerning problem solving in genetics and Piagetian stage theory. Authors conclude the research suggests that formal-operational thought is not strictly required for the solution of the majority of classical genetics problems; however, some genetic concepts are difficult for concrete operational students to understand.…
Using Predictor-Corrector Methods in Numerical Solutions to Mathematical Problems of Motion
ERIC Educational Resources Information Center
Lewis, Jerome
2005-01-01
In this paper, the author looks at some classic problems in mathematics that involve motion in the plane. Many case problems like these are difficult and beyond the mathematical skills of most undergraduates, but computational approaches often require less insight into the subtleties of the problems and can be used to obtain reliable solutions.…
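The computational approach the abstract describes can be illustrated with Heun's method, a simple one-step predictor-corrector scheme (an Euler predictor followed by a trapezoidal corrector), applied to drag-free projectile motion in the plane. This is an illustrative sketch of the general technique, not the paper's specific formulation; the step size and parameters are arbitrary choices.

```python
import math

def heun_step(f, t, y, dt):
    """One predictor-corrector step: Euler predictor, trapezoidal corrector."""
    y_pred = [yi + dt * fi for yi, fi in zip(y, f(t, y))]           # predictor
    f_avg = [(a + b) / 2 for a, b in zip(f(t, y), f(t + dt, y_pred))]
    return [yi + dt * fi for yi, fi in zip(y, f_avg)]               # corrector

def projectile_range(v0, angle_deg, g=9.81, dt=1e-3):
    """Integrate planar motion until the projectile lands; return range."""
    th = math.radians(angle_deg)
    state = [0.0, 0.0, v0 * math.cos(th), v0 * math.sin(th)]  # x, y, vx, vy
    f = lambda t, s: [s[2], s[3], 0.0, -g]                    # drag-free motion
    t = 0.0
    while True:
        new = heun_step(f, t, state, dt)
        if new[1] < 0.0:   # crossed the ground: interpolate landing x
            frac = state[1] / (state[1] - new[1])
            return state[0] + frac * (new[0] - state[0])
        state, t = new, t + dt
```

For this drag-free case the numerical range can be checked against the closed form v0² sin(2θ)/g, which is the sense in which such methods "obtain reliable solutions" without deep analytical insight; adding a drag term to `f` needs no other change.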
Cognitive Load Mediates the Effect of Emotion on Analytical Thinking.
Trémolière, Bastien; Gagnon, Marie-Ève; Blanchette, Isabelle
2016-11-01
Although the detrimental effect of emotion on reasoning has been evidenced many times, the cognitive mechanism underlying this effect remains unclear. In the present paper, we explore the cognitive load hypothesis as a potential explanation. In an experiment, participants solved syllogistic reasoning problems with either neutral or emotional contents. Participants were also presented with a secondary task, whose difficult version requires the mobilization of cognitive resources to be solved correctly. Participants performed overall worse and took longer on emotional problems than on neutral problems. Performance on the secondary task, in the difficult version, was poorer when participants were reasoning about emotional, compared to neutral, contents, consistent with the idea that processing emotion requires more cognitive resources. Taken together, the findings afford evidence that the deleterious effect of emotion on reasoning is mediated by cognitive load.
A hybrid symbolic/finite-element algorithm for solving nonlinear optimal control problems
NASA Technical Reports Server (NTRS)
Bless, Robert R.; Hodges, Dewey H.
1991-01-01
The general code described is capable of solving difficult nonlinear optimal control problems by using finite elements and a symbolic manipulator. Quick and accurate solutions are obtained with a minimum of user interaction. Since no user programming is required for most problems, there are tremendous savings to be gained in terms of time and money.
An Extension of Holographic Moiré to Micromechanics
NASA Astrophysics Data System (ADS)
Sciammarella, C. A.; Sciammarella, F. M.
The electronic Holographic Moiré is an ideal tool for micromechanics studies. It does not require a modification of the surface by the introduction of a reference grating. This is of particular advantage when dealing with materials such as solid propellant grains, whose chemical nature and surface finish make the application of a reference grating very difficult. Traditional electronic Holographic Moiré presents some difficult problems when large magnifications are needed and large rigid body motion takes place. This paper presents developments that solve these problems and extend the application of the technique to micromechanics.
Efficient Credit Assignment through Evaluation Function Decomposition
NASA Technical Reports Server (NTRS)
Agogino, Adrian; Tumer, Kagan; Miikkulainen, Risto
2005-01-01
Evolutionary methods are powerful tools in discovering solutions for difficult continuous tasks. When such a solution is encoded over multiple genes, a genetic algorithm faces the difficult credit assignment problem of evaluating how a single gene in a chromosome contributes to the full solution. Typically a single evaluation function is used for the entire chromosome, implicitly giving each gene in the chromosome the same evaluation. This method is inefficient because a gene will get credit for the contribution of all the other genes as well. Accurately measuring the fitness of individual genes in such a large search space requires many trials. This paper instead proposes turning this single complex search problem into a multi-agent search problem, where each agent has the simpler task of discovering a suitable gene. Gene-specific evaluation functions can then be created that have better theoretical properties than a single evaluation function over all genes. This method is tested in the difficult double-pole balancing problem, showing that agents using gene-specific evaluation functions can create a successful control policy in 20 percent fewer trials than the best existing genetic algorithms. The method is extended to more distributed problems, achieving 95 percent performance gains over traditional methods in the multi-rover domain.
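One common way to build a gene-specific evaluation is a difference evaluation: credit each gene with the change in the global score when that gene is replaced by a fixed default value. The toy objective below is a hypothetical stand-in, not the paper's actual fitness function, and is only meant to show why the per-gene signal is cleaner than handing every gene the shared global score.

```python
def global_eval(genes):
    """Toy team objective: reward proximity to a target, penalize crowding."""
    target = 0.5
    closeness = sum(1.0 - abs(g - target) for g in genes)
    clash = sum(
        1
        for i in range(len(genes))
        for j in range(len(genes))
        if i != j and abs(genes[i] - genes[j]) < 0.05
    )
    return closeness - 0.1 * clash

def difference_eval(genes, i, default=0.0):
    """Credit for gene i: global score minus the score with gene i
    replaced by a fixed default value (modeling its 'absence')."""
    counterfactual = genes[:i] + [default] + genes[i + 1:]
    return global_eval(genes) - global_eval(counterfactual)
```

Under the shared global score, both genes in `[0.5, 0.9]` would receive the same evaluation; the difference evaluation instead gives the on-target gene at index 0 strictly more credit than the off-target gene at index 1, which is the property that reduces the number of trials needed to rank genes.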
Competences of Mathematical Modelling of High School Students
ERIC Educational Resources Information Center
Sekerak, Josef
2010-01-01
Thanks to technological progress the world becomes more and more complicated. People stand in front of new and difficult problems that need to be solved. These are problems, the solutions of which are not universal, and cannot be learned. Many solutions require specific data that cannot be learned, as new data is part of the ongoing generation of…
Transboundary environmental assessment: lessons from OTAG. The Ozone Transport Assessment Group.
Farrell, Alexander E; Keating, Terry J
2002-06-15
The nature and role of assessments in creating policy for transboundary environmental problems is discussed. Transboundary environmental problems are particularly difficult to deal with because they typically require cooperation among independent political jurisdictions (e.g., states or nations) which face differing costs and benefits and which often have different technical capabilities and different interests. In particular, transboundary pollution issues generally involve the problem of an upstream source and a downstream receptor on opposite sides of a relevant political boundary, making it difficult for the jurisdiction containing the receptor to obtain relief from the pollution problem. The Ozone Transport Assessment Group (OTAG) addressed such a transboundary problem: the long-range transport of tropospheric ozone (i.e., photochemical smog) across the eastern United States. The evolution of the science and policy that led to OTAG, the OTAG process, and its outcomes are presented. Lessons that are available to be learned from the OTAG experience, particularly for addressing similar transboundary problems such as regional haze, are discussed.
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA)
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Cohen, Gerald C.; Meissner, Charles W.
1990-01-01
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA) is a two-phase program which was initiated by NASA in the early 80s. The first phase, IAPSA 1, studied different architectural approaches to the problem of integrating engine control systems with airframe control systems in an advanced tactical fighter. One of the conclusions of IAPSA 1 was that the technology to construct a suitable system was available, yet the ability to create these complex computer architectures has outpaced the ability to analyze the resulting system's performance. With this in mind, the second phase of IAPSA approached the same problem with the added constraint that the system be designed for validation. The intent of the design for validation requirement is that validation requirements should be shown to be achievable early in the design process. IAPSA 2 has demonstrated that despite diligent efforts, integrated systems can retain characteristics which are difficult to model and, therefore, difficult to validate.
Hospital renovation projects: phased construction requires planning at its best.
Cox, J C
1986-01-01
Building a new hospital facility is a difficult task, but adding onto and renovating an existing structure while normal activity continues is even more difficult. Project planners, designers, contractors, and hospital managers must carefully program the joint effort of construction and hospital operation. Several factors in the construction process and potential problems for hospital operations are described to help hospital managers better anticipate difficulties before plans are finalized and construction commences.
Questioning the Role of Requirements Engineering in the Causes of Safety-Critical Software Failures
NASA Technical Reports Server (NTRS)
Johnson, C. W.; Holloway, C. M.
2006-01-01
Many software failures stem from inadequate requirements engineering. This view has been supported both by detailed accident investigations and by a number of empirical studies; however, such investigations can be misleading. It is often difficult to distinguish between failures in requirements engineering and problems elsewhere in the software development lifecycle. Further pitfalls arise from the assumption that inadequate requirements engineering is a cause of all software related accidents for which the system fails to meet its requirements. This paper identifies some of the problems that have arisen from an undue focus on the role of requirements engineering in the causes of major accidents. The intention is to provoke further debate within the emerging field of forensic software engineering.
Monclus, Enric; Garcés, Antonio; Artés, David; Mabrock, Maged
2008-07-01
For a predicted difficult airway, oral intubation techniques are well established in pediatric anesthesia, but nasotracheal intubation remains a problem. There are many reports concerning this, but the risk of bleeding, added to the lack of cooperation, makes this procedure difficult and hazardous. We describe a modification of the nasal intubation technique in two stages: first an oral intubation, and then exchanging the oral for a nasal tube, in the case of a 13-year-old boy affected by an advanced stage of cherubism. Oral intubation using a laryngeal mask technique has already been reported, but problems appear during the exchange procedure, and even more so when direct laryngoscopy is impossible. Fiberscopic control of the exchange and the introduction of a Cook Exchange Catheter into the trachea through the oral tube before withdrawal permit oxygenation of the patient and act as a guide for oral tube reintroduction if required.
Big data processing in the cloud - Challenges and platforms
NASA Astrophysics Data System (ADS)
Zhelev, Svetoslav; Rozeva, Anna
2017-12-01
Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge in both the problem domain and in the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide for dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed: the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and the most difficult to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
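The Lambda architecture the paper discusses pairs an immutable, periodically rebuilt batch view with a mutable speed layer covering only the events since the last batch run, merging the two at query time. A minimal sketch for event counting follows; the class and method names are illustrative, not drawn from any particular framework.

```python
from collections import Counter

class LambdaCounter:
    """Minimal Lambda-pattern sketch for event counts: a batch view
    rebuilt from the master log plus a speed layer for recent events,
    merged at query time."""

    def __init__(self):
        self.batch_view = Counter()   # rebuilt periodically from the master log
        self.speed_view = Counter()   # incremented as events stream in

    def rebuild_batch(self, master_log):
        self.batch_view = Counter(master_log)
        self.speed_view.clear()       # speed layer only covers post-batch events

    def ingest(self, event):
        self.speed_view[event] += 1

    def query(self, key):
        return self.batch_view[key] + self.speed_view[key]
```

The Kappa architecture removes the batch layer entirely and recomputes views by replaying the stream, which trades the operational cost of maintaining two code paths for the cost of full-log reprocessing.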
NASA Astrophysics Data System (ADS)
Guven, Bulent; Aydin-Guc, Funda; Medine Ozmen, Zeynep
2016-08-01
The purpose of this study was to determine the relationship between the problems teachers preferred in mathematics lessons and student achievement on different types of problems. In accordance with this purpose, nine mathematics teachers were interviewed, and corresponding problems were prepared and administered to 225 eighth-grade students. The findings indicate that problem types are dependent on teacher preferences. It was found that curriculum-dependent and routine problems were dominant among teacher preferences. Students are more successful with problems containing missing data and with visual problems that do not require the use of different strategies. They have lower success on long problems, problems that contain irrelevant data, problems that require the use of different strategies, and difficult problem types. It was found that the problem types at which students were successful and those which teachers preferred were related. These results convey information about the problems used in the learning environment and the effect of problem-solving experiences on students' success.
Tuberculous otitis media: two case reports and literature review.
Awan, Mohammad Sohail; Salahuddin, Iftikhar
2002-11-01
Tuberculous otitis media can be difficult to diagnose because it can easily be confused with other acute or chronic middle ear conditions. Compounding this problem is the fact that physicians are generally unfamiliar with the typical features of tuberculous otitis media. Finally, a definitive diagnosis can be difficult because it requires special culture and pathologic studies. To increase awareness of this condition, we describe two cases of tuberculous otitis media and review the literature.
Forming YBa2Cu3O7-x Superconductors On Copper Substrates
NASA Technical Reports Server (NTRS)
Mackenzie, J. Devin; Young, Stanley G.
1991-01-01
Experimental process forms layer of high-critical-temperature ceramic superconductor YBa2Cu3O7-x on surface of copper substrate. Offers possible solution to problem of finishing ceramic superconductors to required final sizes and shapes (difficult problem because these materials are brittle and cannot be machined or bent). Further research necessary to evaluate superconducting qualities of surface layers and optimize process.
The Difficult Gallbladder: A Safe Approach to a Dangerous Problem.
Santos, B Fernando; Brunt, L Michael; Pucci, Michael J
2017-06-01
Laparoscopic cholecystectomy is a common surgical procedure, and remains the gold standard for the management of benign gallbladder and biliary disease. While this procedure can be technically straightforward, it can also represent one of the most challenging operations facing surgeons. This dichotomy of a routine operation performed so commonly that poses such a hidden risk of severe complications, such as bile duct injury, must keep surgeons steadfast in the pursuit of safety. The "difficult gallbladder" requires strict adherence to the Culture of Safety in Cholecystectomy, which promotes safety first and assists surgeons in managing or avoiding difficult operative situations. This review will discuss the management of the difficult gallbladder and propose the use of subtotal fenestrating cholecystectomy as a definitive option during this dangerous situation.
Predicting drug interactions in addiction treatment.
Lucas, Catherine J; Patel, Joanne; Martin, Jennifer H
2017-08-01
It is not uncommon to be treating people with addiction who also have significant other health problems, including heart, renal or liver failure, diabetes and vascular disease. These conditions require regular medications to be taken. This can be a problem for people living with addiction, whose difficult social circumstances can affect compliance, among other issues. Our perspective provides a summary of general pharmacological factors affecting medicine taking in people with addiction problems, to provide a guide for hospital doctors in this setting. © 2017 Royal Australasian College of Physicians.
Policy Capacity Meets Politics: Comment on "Health Reform Requires Policy Capacity".
Fafard, Patrick
2015-07-22
It is difficult to disagree with the general argument that successful health reform requires a significant degree of policy capacity or that all players in the policy game need to move beyond self-interested advocacy. However, an overly broad definition of policy capacity is a problem. More important perhaps, health reform inevitably requires not just policy capacity but political leadership and compromise. © 2015 by Kerman University of Medical Sciences.
NASA Technical Reports Server (NTRS)
Clarke, John-Paul
2004-01-01
MEANS, the MIT Extensible Air Network Simulation, was created in February of 2001, and has been developed with support from NASA Ames since August of 2001. MEANS is a simulation tool which is designed to maximize fidelity without requiring data of such a low level as to preclude easy examination of alternative scenarios. To this end, MEANS is structured in a modular fashion to allow more detailed components to be brought in when desired, and left out when they would only be an impediment. Traditionally, one of the difficulties with high-fidelity models is that they require a level of detail in their data that is difficult to obtain. For analysis of past scenarios, the required data may not have been collected, or may be considered proprietary and thus difficult for independent researchers to obtain. For hypothetical scenarios, generation of the data is sufficiently difficult to be a task in and of itself. Often, simulations designed by a researcher will model exactly one element of the problem well and in detail, while assuming away other parts of the problem which are not of interest or for which data is not available. While these models are useful for working with the task at hand, they are very often not applicable to future problems. The MEANS simulation attempts to address these problems by using a modular design which provides components of varying fidelity for each aspect of the simulation. This allows for the most accurate model for which data is available to be used. It also provides for easy analysis of sensitivity to data accuracy. This can be particularly useful in the case where accurate data is available for some subset of the situations that are to be considered. Furthermore, the ability to use the same model while examining effects on different parts of a system reduces the time spent learning the simulation, and provides for easier comparisons between changes to different parts of the system.
Smith, Steven R
2012-09-01
The insanity defense presents many difficult questions for the legal system. It attracts attention beyond its practical significance (it is seldom used successfully) because it goes to the heart of the concept of legal responsibility. "Not guilty by reason of insanity" generally requires that as a result of mental illness the defendant was unable to distinguish right from wrong at the time of the crime. The many difficult and complex questions presented by the insanity defense have led some in the legal community to hope that neuroscience might help resolve some of these problems, but that hope is not likely to be realized.
NASA Astrophysics Data System (ADS)
Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati
2017-09-01
One of the important skills required of any student learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This will enable them to arrive at a conclusion and make a significant contribution and decision for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems relating to topics on hypothesis testing which required them to solve the problems using the confidence interval, traditional, and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems and it is listed as one of the difficult concepts for students to grasp. The objectives of this study are to explore students’ perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students’ perceived and actual ability were measured based on the instruments developed from the respective topics. Rasch measurement tools such as the Wright map and item measures for fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is developed based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches even though their actual performance showed otherwise. Item measures for fit statistics also showed that the maximum estimated measures were found on two problems. These measures indicate that none of the students attempted these problems correctly, due to reasons which include their lack of understanding of confidence intervals and probability values.
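The three approaches named in the abstract — traditional (critical value), p-value, and confidence interval — are mathematically equivalent decision rules for the same two-sided test, which is worth seeing side by side. A minimal one-sample z-test sketch follows (it assumes a known population sigma for simplicity; it is not the study's instrument):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def z_test(sample_mean, mu0, sigma, n, alpha=0.05):
    """Two-sided one-sample z-test, decided three equivalent ways."""
    se = sigma / math.sqrt(n)
    z = (sample_mean - mu0) / se
    z_crit = 1.959963985  # 97.5th percentile of the standard normal
    p_value = 2.0 * (1.0 - normal_cdf(abs(z)))
    ci = (sample_mean - z_crit * se, sample_mean + z_crit * se)
    return {
        "traditional": abs(z) > z_crit,                      # |z| vs critical value
        "p_value": p_value < alpha,                          # p vs alpha
        "confidence_interval": not (ci[0] <= mu0 <= ci[1]),  # mu0 outside the CI
        "z": z, "p": p_value, "ci": ci,
    }
```

For any one sample the three booleans always agree; students who treat them as three unrelated procedures, rather than three views of the same rejection region, are the ones the abstract finds struggling with confidence intervals and probability values.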
ERIC Educational Resources Information Center
Sapiro, Maurice
1983-01-01
Clay sculpture is difficult to produce because of the requirements of kiln firing. The problems can be overcome by modeling the original manikin head and making a plaster mold, pressing molding slabs of clay into the plaster mold to form the hollow clay armature, and sculpting on the armature. (IS)
The Art of Artificial Intelligence. 1. Themes and Case Studies of Knowledge Engineering
1977-08-01
in scientific and medical inference illuminate the art of knowledge engineering and its parent science, Artificial Intelligence. ... The knowledge engineer practices the art of bringing the principles and tools of AI research to bear on difficult application problems requiring
Using literature to help physician-learners understand and manage "difficult" patients.
Shapiro, J; Lie, D
2000-07-01
Despite significant clinical and research efforts aimed at recognizing and managing "difficult" patients, such patients remain a frustrating experience for many clinicians. This is especially true for primary care residents, who are required to see a significant volume of patients with diverse and complex problems, but who may not have adequate training and life experience to enable them to deal with problematic doctor-patient situations. Literature--short stories, poems, and patient narratives--is a little-explored educational tool to help residents in understanding and working with difficult patients. In this report, the authors examine the mechanics of using literature to teach about difficult patients, including structuring the learning environment, establishing learning objectives, identifying teaching resources and appropriate pedagogic methods, and incorporating creative writing assignments. They also present an illustrative progression of a typical literature-based teaching session about a difficult patient.
Crisis management during anaesthesia: difficult intubation.
Paix, A D; Williamson, J A; Runciman, W B
2005-06-01
Anaesthetists may unexpectedly experience difficulty with intubation, which may be associated with difficulty in ventilating the patient. If not well managed, there may be serious consequences for the patient. A simple structured approach to this problem was developed to assist the anaesthetist in this difficult situation. To examine the role of a specific sub-algorithm for the management of difficult intubation, the potential performance of a structured approach developed by review of the literature and analysis of each of the relevant incidents among the first 4000 reported to the Australian Incident Monitoring Study (AIMS) was compared with the actual management as reported by the anaesthetists involved. There were 147 reports of difficult intubation capable of analysis among the first 4000 incidents reported to AIMS. The difficulty was unexpected in 52% of cases; major physiological changes occurred in 37% of these cases. Saturation fell below 90% in 22% of cases, oesophageal intubation was reported in 19%, and an emergency transtracheal airway was required in 4% of cases. Obesity and limited neck mobility and mouth opening were the most common anatomical contributing factors. The data confirm previously reported failures to predict difficult intubation with existing preoperative clinical tests and suggest an ongoing need to teach a pre-learned strategy to deal with difficult intubation and any associated problem with ventilation. An easy-to-follow structured approach to these problems is outlined. It is recommended that skilled assistance be obtained (preferably another anaesthetist) when difficulty is expected or the patient's cardiorespiratory reserve is low. Patients should be assessed postoperatively to exclude any sequelae and to inform them of the difficulties encountered. These should be clearly documented and appropriate steps taken to warn future anaesthetists.
Riva, Giuseppe; Graffigna, Guendalina; Baitieri, Maddalena; Amato, Alessandra; Bonanomi, Maria Grazia; Valentini, Paolo; Castelli, Guido
2014-01-01
The quest for an active and healthy ageing can be considered a "wicked problem." It is a social and cultural problem, which is difficult to solve because of incomplete, changing, and contradictory requirements. These problems are tough to manage because of their social complexity. They are a group of linked problems embedded in the structure of the communities in which they occur. First, they require the knowledge of the social and cultural context in which they occur. They can be solved only by understanding of what people do and why they do it. Second, they require a multidisciplinary approach. Wicked problems can have different solutions, so it is critical to capture the full range of possibilities and interpretations. Thus, we suggest that Università Cattolica del Sacro Cuore (UCSC) is well suited for accepting and managing this challenge because of its applied research orientation, multidisciplinary approach, and integrated vision. After presenting the research activity of UCSC, we describe a possible "systems thinking" strategy to consider the complexity and interdependence of active ageing and healthy living.
Bringing Them in: The Experiences of Imported and Overseas-Qualified Teachers
ERIC Educational Resources Information Center
Sharplin, Elaine
2009-01-01
This qualitative multiple-site case study explores the experiences of imported and overseas-qualified teachers appointed to fill "difficult-to-staff" Western Australian rural schools. In a climate of global teacher shortages, investigation of the strategies adopted to solve this problem requires empirical examination. The study of six…
Phosphorus (P) remediation is an extremely difficult and costly environmental problem and could cost $44.5 billion for treatment using conventional water treatment plants to meet EPA requirements. Phosphorus runoffs can lead to dead zones due to eutrophication and also ca...
Developing Materials for Deliberative Forums
ERIC Educational Resources Information Center
Rourke, Brad
2014-01-01
When citizens deliberate together about important issues, they can reach decisions and take action together on problems that confront them. Deliberation does not require a certain kind of guide, or framework, or language, or facilitator, but, because it can be difficult to face such choices, supporting materials can make it easier. In Developing…
Riding the Rapids of Classroom-Based Research
ERIC Educational Resources Information Center
Lonergan, Robyn; Cumming, Therese M.
2017-01-01
Conducting classroom-based research can be difficult, often fraught with challenges, analogous to riding a canoe down the rapids. The dynamics of classroom-based research often require flexibility on the parts of both the researcher and school personnel. Classroom-based research is viewed here through a framework of problem-based methodology as…
ERIC Educational Resources Information Center
Thompson, Gail
2007-01-01
Numerous researchers have devoted their careers to school reform. At the same time, many politicians have gotten elected by promising to fix failing schools. Although a lot of time, energy, and money have been invested in tackling this problem, the problem persists: Too many schools in the United States are failing to prepare too many students for…
The study on knowledge transferring incentive for information system requirement development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yang
2015-03-10
Information system requirements development is a process of sharing and transferring users' knowledge. Tacit requirements are a major problem in this process because they are difficult to encode, express, and communicate, and eliciting them demands knowledge fusion and cooperative effort. Against this background, this paper seeks the dynamic evolution rule governing the effort of software developers and users by building an evolutionary game model under an incentive system, and closes with an in-depth discussion.
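The evolutionary game described in this abstract can be illustrated with textbook replicator dynamics. The paper's actual payoffs and incentive scheme are not given here, so every number below (costs, rewards, the `incentive` bonus, step size, starting fractions) is an invented assumption; the sketch only shows the qualitative behavior that a sufficient developer incentive shifts both populations toward high effort:

```python
def replicator_step(x, y, payoff, dt=0.1):
    """One discrete replicator update for the first population.

    x: fraction of that population exerting effort
    y: fraction of the other population exerting effort
    payoff(a, b): payoff to a player choosing effort level a (0 or 1)
                  against an opponent choosing effort level b
    """
    f_effort = y * payoff(1, 1) + (1 - y) * payoff(1, 0)
    f_shirk = y * payoff(0, 1) + (1 - y) * payoff(0, 0)
    f_avg = x * f_effort + (1 - x) * f_shirk
    return x + dt * x * (f_effort - f_avg)


def simulate(incentive, steps=2000):
    """Co-evolve developer (x) and user (y) effort fractions.

    Illustrative payoffs: effort costs 1 and pays 2 only if the other
    side also exerts effort; developers additionally receive `incentive`
    whenever they exert effort.
    """
    dev = lambda a, b: a * (2 * b - 1 + incentive)
    usr = lambda a, b: a * (2 * b - 1)
    x = y = 0.2  # initial effort fractions
    for _ in range(steps):
        x, y = replicator_step(x, y, dev), replicator_step(y, x, usr)
    return x, y
```

With no incentive, both populations drift toward shirking from a low starting point; with a sufficiently large developer incentive, developer effort grows first and then pulls user effort along.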
Evaluation of Genetic Algorithm Concepts using Model Problems. Part 1; Single-Objective Optimization
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Pulliam, Thomas H.
2003-01-01
A genetic-algorithm-based optimization approach is described and evaluated using a simple hill-climbing model problem. The model problem utilized herein allows for the broad specification of a large number of search spaces including spaces with an arbitrary number of genes or decision variables and an arbitrary number of hills or modes. In the present study, only single objective problems are considered. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all problems attempted. The most difficult problems - those with large hyper-volumes and multi-mode search spaces containing a large number of genes - require a large number of function evaluations for GA convergence, but they always converge.
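The report's GA implementation is not reproduced in this abstract, so the following is a minimal illustrative sketch of the ingredients it evaluates (tournament selection, one-point crossover, bit-flip mutation) applied to an invented multi-modal "hills" fitness; the function names and all parameter values are assumptions, not the report's settings:

```python
import math
import random

def ga_maximize(fitness, n_bits=16, pop_size=40, generations=120,
                p_mut=0.02, seed=1):
    """Minimal generational GA: tournament selection, one-point crossover,
    bit-flip mutation, with the best-so-far individual retained."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            nxt.append([b ^ (rng.random() < p_mut) for b in child])  # mutation
        pop = nxt
        best = max(pop + [best], key=fitness)
    return best

def hills(bits):
    """Multi-modal model fitness: five peaks of increasing height on [0, 1]."""
    x = int("".join(map(str, bits)), 2) / (2 ** len(bits) - 1)
    return math.sin(5 * math.pi * x) ** 2 * x   # global optimum near x = 0.9
```

Because the fitness has several local modes, pure hill-climbing can stall on a lower peak; the population plus mutation gives the GA a chance to find the tallest one, at the cost of many function evaluations.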
ERIC Educational Resources Information Center
Reif, Frederick
2008-01-01
Many students find it difficult to learn the kinds of knowledge and thinking required by college or high school courses in mathematics, science, or other complex domains. Thus they often emerge with significant misconceptions, fragmented knowledge, and inadequate problem-solving skills. Most instructors or textbook authors approach their teaching…
Linguistic Skills Involved in Learning to Spell: An Australian Study
ERIC Educational Resources Information Center
Daffern, Tessa
2017-01-01
Being able to accurately spell in Standard English requires efficient coordination of multiple knowledge sources. Therefore, spelling is a word-formation problem-solving process that can be difficult to learn. The present study uses Triple Word Form Theory as a conceptual framework to analyse Standard English spelling performance levels of…
Communication Problems in Requirements Engineering: A Field Study
NASA Technical Reports Server (NTRS)
Al-Rawas, Amer; Easterbrook, Steve
1996-01-01
The requirements engineering phase of software development projects is characterized by the intensity and importance of communication activities. During this phase, the various stakeholders must be able to communicate their requirements to the analysts, and the analysts need to be able to communicate the specifications they generate back to the stakeholders for validation. This paper describes a field investigation into the problems of communication between disparate communities involved in the requirements specification activities. The results of this study are discussed in terms of their relation to three major communication barriers: (1) ineffectiveness of the current communication channels; (2) restrictions on expressiveness imposed by notations; and (3) social and organizational barriers. The results confirm that organizational and social issues have great influence on the effectiveness of communication. They also show that in general, end-users find the notations used by software practitioners to model their requirements difficult to understand and validate.
MLA Certification: Its Present Problems and Future Development *
Proctor, Vilma
1967-01-01
The certification program is reviewed and questions as to the validity of the existing three grades of certification are raised. Large institutions which employ many medical librarians do not officially recognize the certificates. The three types of certificates should be replaced with one certificate or diploma. Problems of meaningful examinations, lack of training facilities, and revision of the curriculum require attention. A program is proposed for the improvement of education for medical librarianship. Public identification of medical librarianship is presently difficult. PMID:6016375
Vehicle coordinated transportation dispatching model base on multiple crisis locations
NASA Astrophysics Data System (ADS)
Tian, Ran; Li, Shanwei; Yang, Guoying
2018-05-01
Unconventional emergencies often trigger multiple disasters, and the requirements at different disaster sites often differ, making it difficult for a single emergency resource center to satisfy them all at the same time. Coordinating the emergency resources stored at multiple resource centers and delivering them to the various disaster sites therefore requires coordinated transportation by emergency vehicles. Addressing this emergency logistics coordination and scheduling problem, and based on the relevant constraints of emergency logistics transportation, this paper establishes an emergency resource scheduling model for multiple disaster sites.
Conscientious refusals and reason-giving.
Marsh, Jason
2014-07-01
Some philosophers have argued for what I call the reason-giving requirement for conscientious refusal in reproductive healthcare. According to this requirement, healthcare practitioners who conscientiously object to administering standard forms of treatment must have arguments to back up their conscience, arguments that are purely public in character. I argue that such a requirement, though attractive in some ways, faces an overlooked epistemic problem: it is either too easy or too difficult to satisfy in standard cases. I close by briefly considering whether a version of the reason-giving requirement can be salvaged despite this important difficulty. © 2013 John Wiley & Sons Ltd.
Planning and Scheduling for Fleets of Earth Observing Satellites
NASA Technical Reports Server (NTRS)
Frank, Jeremy; Jonsson, Ari; Morris, Robert; Smith, David E.; Norvig, Peter (Technical Monitor)
2001-01-01
We address the problem of scheduling observations for a collection of earth observing satellites. This scheduling task is a difficult optimization problem, potentially involving many satellites, hundreds of requests, constraints on when and how to service each request, and resources such as instruments, recording devices, transmitters, and ground stations. High-fidelity models are required to ensure the validity of schedules; at the same time, the size and complexity of the problem makes it unlikely that systematic optimization search methods will be able to solve them in a reasonable time. This paper presents a constraint-based approach to solving the Earth Observing Satellites (EOS) scheduling problem, and proposes a stochastic heuristic search method for solving it.
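The paper's constraint-based model is far richer than anything shown here, but the flavor of a stochastic heuristic search for an oversubscribed scheduling problem can be sketched as a priority-biased randomized greedy over a handful of invented single-instrument requests (all names, priorities, and time windows below are hypothetical):

```python
import random

# Hypothetical requests: (name, priority, window_start, window_end) competing
# for one shared instrument; overlapping observations cannot both be taken.
REQUESTS = [
    ("fire-watch",  5, 0, 3),
    ("ocean-color", 3, 2, 5),
    ("ice-sheet",   4, 4, 8),
    ("volcano",     2, 1, 4),
    ("crop-survey", 1, 6, 9),
]

def stochastic_schedule(requests, trials=200, seed=0):
    """Repeat: perturb the priority order randomly, greedily insert
    non-overlapping observations, and keep the highest-value schedule."""
    rng = random.Random(seed)
    best, best_value = [], -1
    for _ in range(trials):
        order = sorted(requests, key=lambda r: r[1] * rng.random(),
                       reverse=True)
        schedule, value = [], 0
        for name, prio, start, end in order:
            # half-open time windows: insert only if compatible with all
            # already-scheduled observations
            if all(end <= s or start >= e for _, _, s, e in schedule):
                schedule.append((name, prio, start, end))
                value += prio
        if value > best_value:
            best, best_value = schedule, value
    return best, best_value
```

On this toy instance the best feasible schedule takes fire-watch and ice-sheet for a total priority of 9; restarting the greedy pass from many randomized orders is what lets the search escape a single bad ordering without systematic optimization.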
The Effect of Process Writing Activities on the Writing Skills of Prospective Turkish Teachers
ERIC Educational Resources Information Center
Dilidüzgün, Sükran
2013-01-01
Problem statement: Writing an essay is a most difficult creative work and consequently requires detailed instruction. There are in fact two types of instruction that contribute to the development of writing skills: Reading activities analysing texts in content and schematic structure to find out how they are composed and process writing…
A Difficult, Confusing, Painful Problem That Requires Many Voices, Many Perceptions.
ERIC Educational Resources Information Center
Scheurich, James Joseph
1993-01-01
Responds to an article by Christine E. Sleeter (1993) and another by W.B. Allen (1993). Refutes criticisms that the author actually promotes racism in his call for increased white discourse on white racism and denies the individuality of races other than whites. However, personal individuality does not cancel the effects of inequality of power,…
ERIC Educational Resources Information Center
Hwang, Yoon-Suk; Klieve, Helen; Kearney, Patrick; Saggers, Beth
2015-01-01
Provision of an individually responsive education requires a comprehensive understanding of the inner worlds of learners, such as their feelings and thoughts. However, this is difficult to achieve when learners, such as those with Autism Spectrum Disorders (ASD) and cognitive difficulties, have problems with communication. To address this issue,…
Changing Schools from the inside out: Small Wins in Hard Times. Third Edition
ERIC Educational Resources Information Center
Larson, Robert
2011-01-01
At any time, public schools labor under great economic, political, and social pressures that make it difficult to create large-scale, "whole school" change. But current top-down mandates require that schools close achievement gaps while teaching more problem solving, inquiry, and research skills--with fewer resources. Failure to meet test-based…
An Efficient MCMC Algorithm to Sample Binary Matrices with Fixed Marginals
ERIC Educational Resources Information Center
Verhelst, Norman D.
2008-01-01
Uniform sampling of binary matrices with fixed margins is known as a difficult problem. Two classes of algorithms to sample from a distribution not too different from the uniform are studied in the literature: importance sampling and Markov chain Monte Carlo (MCMC). Existing MCMC algorithms converge slowly, require a long burn-in period and yield…
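For context, the standard MCMC for this problem is the checkerboard-swap chain: swapping the diagonal of a 2x2 submatrix of the form [[1,0],[0,1]] or [[0,1],[1,0]] leaves every row and column sum intact. The sketch below is this slow basic chain, not the article's algorithm:

```python
import random

def mcmc_swap(matrix, steps=10000, seed=0):
    """Run the basic checkerboard-swap chain on a 0/1 matrix.

    Each step picks two rows and two columns; if the induced 2x2
    submatrix is a checkerboard, swapping it preserves all row and
    column margins, so the chain stays inside the target sample space.
    """
    rng = random.Random(seed)
    m = [row[:] for row in matrix]        # do not mutate the input
    n_rows, n_cols = len(m), len(m[0])
    for _ in range(steps):
        r1, r2 = rng.sample(range(n_rows), 2)
        c1, c2 = rng.sample(range(n_cols), 2)
        checkerboard = (m[r1][c1] == m[r2][c2] and
                        m[r1][c2] == m[r2][c1] and
                        m[r1][c1] != m[r1][c2])
        if checkerboard:
            m[r1][c1], m[r1][c2] = m[r1][c2], m[r1][c1]
            m[r2][c1], m[r2][c2] = m[r2][c2], m[r2][c1]
    return m
```

The slow convergence and long burn-in mentioned in the abstract come from how rarely a randomly chosen 2x2 submatrix is actually a checkerboard in large sparse matrices.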
USDA-ARS?s Scientific Manuscript database
Identification and differentiation of anthocyanins and non-anthocyanin compounds in natural products can be very difficult by mass spectrometry. Using a ultra-violet/visible detector can be helpful, but not fool-proof, and it requires an additional detector. To solve the problem, a fast and reliab...
ERIC Educational Resources Information Center
Munday, Jenni; Smith, Wyverne
2010-01-01
Pre-service teacher degree programs are increasingly crowded with subjects covering the wide gamut of knowledge a teacher requires. Ensuring musical knowledge and language for classroom teaching poses a difficult problem for teacher educators. This article examines the challenges of including in the pre-service classroom teaching program a music…
ERIC Educational Resources Information Center
Feng, Li; Sass, Tim R.
2015-01-01
Staffing problems are pervasive in certain subject areas, such as secondary math and science and special education, where the combination of training requirements and relatively high alternative wages makes it difficult to attract and retain high-quality teachers. This project evaluated the impacts of the Florida Critical Teacher Shortage Program…
Validity and Reliability of the DeMoulin Self-Concept Developmental Scale for Turkish Preschoolers
ERIC Educational Resources Information Center
Turasli, Nalan Kuru
2014-01-01
Problem Statement: "Self-concept" is a primary issue of emotional and social development. Though the most important stage in the formation of self-concept is childhood, measuring the development of the self in the preschool period is quite difficult, for the tools used to measure children's self-concept either require the child's knowing…
Schools at Work: Targeting Proficiency with Theory to Practice
ERIC Educational Resources Information Center
White-Hood, Marian
2006-01-01
Profound problems in public schools require solutions that are often difficult to implement. Although we, as a society, see the future embodied in the students, our promise to educate them is often not reflected in our practices. A lack of will is evident. The following article explores the notion of "schools at work" and provides…
NASA Technical Reports Server (NTRS)
Goodrich, Charles H.; Kurien, James; Clancy, Daniel (Technical Monitor)
2001-01-01
We present some diagnosis and control problems that are difficult to solve with discrete or purely qualitative techniques. We analyze the nature of the problems, classify them and explain why they are frequently encountered in systems with closed loop control. This paper illustrates the problem with several examples drawn from industrial and aerospace applications and presents detailed information on one important application: In-Situ Resource Utilization (ISRU) on Mars. The model for an ISRU plant is analyzed showing where qualitative techniques are inadequate to identify certain failure modes and to maintain control of the system in degraded environments. We show why the solution to the problem will result in significantly more robust and reliable control systems. Finally, we illustrate requirements for a solution to the problem by means of examples.
NASA Technical Reports Server (NTRS)
Johnson, C. R., Jr.; Balas, M. J.
1980-01-01
A novel interconnection of distributed parameter system (DPS) identification and adaptive filtering is presented, which culminates in a common statement of coupled autoregressive, moving-average expansion or parallel infinite impulse response configuration adaptive parameterization. The common restricted complexity filter objectives are seen as similar to the reduced-order requirements of the DPS expansion description. The interconnection presents the possibility of an exchange of problem formulations and solution approaches not yet easily addressed in the common finite dimensional lumped-parameter system context. It is concluded that the shared problems raised are nevertheless many and difficult.
Exploiting replication in distributed systems
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Joseph, T. A.
1989-01-01
Techniques are examined for replicating data and execution in directly distributed systems: systems in which multiple processes interact directly with one another while continuously respecting constraints on their joint behavior. Directly distributed systems are often required to solve difficult problems, ranging from management of replicated data to dynamic reconfiguration in response to failures. It is shown that these problems reduce to more primitive, order-based consistency problems, which can be solved using primitives such as the reliable broadcast protocols. Moreover, given a system that implements reliable broadcast primitives, a flexible set of high-level tools can be provided for building a wide variety of directly distributed application programs.
Beyond rules: The next generation of expert systems
NASA Technical Reports Server (NTRS)
Ferguson, Jay C.; Wagner, Robert E.
1987-01-01
The PARAGON Representation, Management, and Manipulation system is introduced. The concepts of knowledge representation, knowledge management, and knowledge manipulation are combined in a comprehensive system for solving real world problems requiring high levels of expertise in a real time environment. In most applications the complexity of the problem and the representation used to describe the domain knowledge tend to obscure the information from which solutions are derived. This inhibits the acquisition and verification/validation of domain knowledge, places severe constraints on the ability to extend and maintain a knowledge base, and makes generic problem-solving strategies difficult to develop. A unique hybrid system was developed to overcome these traditional limitations.
Printed circuit boards: a review on the perspective of sustainability.
Canal Marques, André; Cabrera, José-María; Malfatti, Célia de Fraga
2013-12-15
Modern life increasingly requires newer equipment and more technology. In addition, the fact that society is highly consumerist makes the amount of discarded equipment as well as the amount of waste from the manufacture of new products increase at an alarming rate. Printed circuit boards, which form the basis of the electronics industry, are technological waste that is difficult to dispose of and whose recycling is complex and expensive due to the diversity of materials and components and their difficult separation. Currently, printed circuit boards face a solder-fixing problem: the industry is migrating from traditional Pb-Sn alloys to lead-free alloys without a definite choice. This replacement is an attempt to minimize the problem of Pb toxicity, but it does not change the problem of separation of the components for later reuse and/or recycling and leads to other problems, such as temperature rise, delamination, flaws, risks of mechanical shocks and the formation of "whiskers". This article presents a literature review on printed circuit boards, showing their structure and materials, the environmental problem related to the boards, some of the different alternatives for recycling, and some solutions that are being studied to reduce and/or replace the solder, in order to minimize the impact of solder on the printed circuit boards. Copyright © 2013 Elsevier Ltd. All rights reserved.
Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation
NASA Technical Reports Server (NTRS)
Ross, James C.
2016-01-01
Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven to be difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.
Side-branch technique for difficult guidewire placement in coronary bifurcation lesion.
He, Xingwei; Gao, Bo; Liu, Yujian; Li, Zhuxi; Zeng, Hesong
2016-01-01
Despite tremendous advances in technology and skills, percutaneous coronary intervention (PCI) of bifurcation lesion (BL) remains a particular challenge for the interventionalist. During bifurcation PCI, safe guidewire placement in the main branch (MB) and the side branch (SB) is the first step for successful procedure. However, in certain cases, the complex pattern of vessel anatomy and the mix of plaque distribution may make target vessel wiring highly challenging. Therefore, specific techniques are required for solving this problem. Hereby, we describe a new use of side-branch technique for difficult guidewire placement in BL. Copyright © 2015 Elsevier Inc. All rights reserved.
Adams, Valerie Margaret; Bagshaw, Dale; Wendt, Sarah; Zannettino, Lana
2014-01-01
Financial abuse by a family member is the most common form of abuse experienced by older Australians, and early intervention is required. National online surveys of 228 chief executive officers and 214 aged care service providers found that, while they were well placed to recognize financial abuse, it was often difficult to intervene successfully. Problems providers encountered included difficulties in detecting abuse, the need for consent before they could take action, the risk that the abusive family member would withdraw the client from the service, and a lack of resources to deal with the complexities inherent in situations of financial abuse.
ERIC Educational Resources Information Center
Cooper, Melanie M.; Klymkowsky, Michael W.
2013-01-01
Helping students understand "chemical energy" is notoriously difficult. Many hold inconsistent ideas about what energy is, how and why it changes during the course of a chemical reaction, and how these changes are related to bond energies and reaction dynamics. There are (at least) three major sources for this problem: 1) the way biologists talk…
ERIC Educational Resources Information Center
Chen, Xianling; Chen, Buyuan; Li, Xiaofan; Song, Qingxiao; Chen, Yuanzhong
2017-01-01
Hematology is difficult for students to learn. A beneficial education method for hematology clerkship training is required to help students develop clinical skills. Foreign medical students often encounter communication issues in China. To address this issue, Chinese post-graduates from our institute are willing to assist with educating foreign…
Prioritizing parts from cutting bills when gang-ripping first
R. Edward Thomas
1996-01-01
Computer optimization of gang-rip-first processing is a difficult problem when working with specific cutting bills. Interactions among board grade and size, arbor setup, and part sizes and quantities greatly complicate the decision making process. Cutting the wrong parts at any moment will mean that more board footage will be required to meet the bill. Using the ROugh...
The NASA firefighter's breathing system program
NASA Technical Reports Server (NTRS)
Mclaughlan, P. B.; Carson, M. A.
1974-01-01
The research is reported in the development of a firefighter's breathing system (FBS) to satisfy the operational requirements of fire departments while remaining within their cost constraints. System definition for the FBS is discussed, and the program status is reported. It is concluded that the most difficult problem in the FBS Program is the achievement of widespread fire department acceptance of the system.
ERIC Educational Resources Information Center
Rajabi, Shima; Azizifar, Akbar; Gowhary, Habib
2015-01-01
Learning a foreign language requires students to acquire both grammatical knowledge and socio-pragmatic rules of a language. Pragmatic competence as one of the most difficult aspects of language provides several challenges to L2 learners in the process of learning a foreign language. To overcome this problem, EFL teachers should find the most…
Virtual Laboratories to Achieve Higher-Order Learning in Fluid Mechanics
NASA Astrophysics Data System (ADS)
Ward, A. S.; Gooseff, M. N.; Toto, R.
2009-12-01
Bloom’s higher-order cognitive skills (analysis, evaluation, and synthesis) are recognized as necessary in engineering education, yet these are difficult to achieve in traditional lecture formats. Laboratory components supplement traditional lectures in an effort to emphasize active learning and provide higher-order challenges, but these laboratories are often subject to the constraints of (a) increasing student enrollment, (b) limited funding for operational, maintenance, and instructional expenses and (c) increasing demands on undergraduate student credit requirements. Here, we present results from a pilot project implementing virtual (or online) laboratory experiences as an alternative to a traditional laboratory experience in Fluid Mechanics, a required third year course. Students and faculty were surveyed to identify the topics that were most difficult, and virtual laboratory and design components developed to supplement lecture material. Each laboratory includes a traditional lab component, requiring student analysis and evaluation. The lab concludes with a design exercise, which imposes additional problem constraints and allows students to apply their laboratory observations to a real-world situation.
Lunar Habitat Optimization Using Genetic Algorithms
NASA Technical Reports Server (NTRS)
SanScoucie, M. P.; Hull, P. V.; Tinker, M. L.; Dozier, G. V.
2007-01-01
Long-duration surface missions to the Moon and Mars will require bases to accommodate habitats for the astronauts. Transporting the materials and equipment required to build the necessary habitats is costly and difficult. The materials chosen for the habitat walls play a direct role in protection against hazards such as meteoroid impacts and radiation. Choosing the best materials, their configuration, and the amount required is extremely difficult due to the immense size of the design region. Clearly, an optimization method is warranted for habitat wall design. Standard optimization techniques are not suitable for problems with such large search spaces; therefore, a habitat wall design tool utilizing genetic algorithms (GAs) has been developed. GAs use a "survival of the fittest" philosophy where the most fit individuals are more likely to survive and reproduce. This habitat design optimization tool is a multiobjective formulation of up-mass, heat loss, structural analysis, meteoroid impact protection, and radiation protection. This Technical Publication presents the research and development of this tool as well as a technique for finding the optimal GA search parameters.
Meyer, Annika; Gross, Neil; Teng, Marita
2018-01-01
Head and neck surgeons are commonly faced with surgical patients who have underlying medical problems requiring antithrombotic therapy. It is difficult to achieve a balance between minimizing the risk of thromboembolism and hemorrhage in the perioperative period. Data from randomized, controlled trials are limited, and procedure-specific bleed rates are also difficult to pinpoint. The decision is made more difficult when patients with moderate-to-high risk for thromboembolic events undergo procedures that are high risk for bleeding. This is true for many head and neck oncologic surgeries. Furthermore, although elective procedures may be delayed for optimization of antithrombotic medication, emergent procedures cannot. Head and neck surgery often represents the most challenging of all these circumstances, given the potential risk of airway compromise from bleeding after head and neck surgery. © 2017 Wiley Periodicals, Inc.
Patients with difficult intubation may need referral to sleep clinics.
Chung, Frances; Yegneswaran, Balaji; Herrera, Francisco; Shenderey, Alex; Shapiro, Colin M
2008-09-01
Upper airway abnormalities carry the risk of obstructive sleep apnea (OSA) and difficult tracheal intubations. Both conditions contribute to significant clinical problems and have increased perioperative morbidity and mortality. We hypothesized that patients who presented with difficult intubation would have a very high prevalence of OSA and that those with unexpected difficult intubation may require referral to sleep clinics for polysomnography (PSG). Patients classified as a grade 4 Cormack and Lehane on direct laryngoscopic view, and who required more than two attempts for successful endotracheal intubation, were referred to the study by consultant anesthesiologists at four hospitals. Apnea-hypopnea index (AHI) data and postoperative events were collected. Patients with AHI >5/h were considered positive for OSA. Clinical and PSG variables were compared using t-tests and the chi-squared test. Over a 20-mo period, 84 patients with a difficult intubation were referred into the study. Thirty-three patients agreed to participate. Sixty-six percent (22 of 33) had OSA (AHI >5/h). Of the 22 OSA patients, 10 patients (45%) had mild OSA (AHI 5-15/h), 6 (27%) had moderate OSA (AHI 15-30/h), and 6 (27%) had severe OSA (AHI >30/h). Of the 33 patients, 11 patients (33%) were recommended for continuous positive airway pressure treatment. Between the OSA group and the non-OSA group, there were significant differences in gender, neck size, and the quality of sleep, but there were no significant differences in age and body mass index. Sixty-six percent of patients with unexpected difficult intubation who consented to undergo a sleep study were diagnosed with OSA by PSG. Patients with difficult intubation are at high risk for OSA and should be screened for signs and symptoms of sleep apnea. Screening for OSA should be considered by referral to a sleep clinic for PSG.
Customer and household matching: resolving entity identity in data warehouses
NASA Astrophysics Data System (ADS)
Berndt, Donald J.; Satterfield, Ronald K.
2000-04-01
The data preparation and cleansing tasks necessary to ensure high quality data are among the most difficult challenges faced in data warehousing and data mining projects. The extraction of source data, transformation into new forms, and loading into a data warehouse environment are all time consuming tasks that can be supported by methodologies and tools. This paper focuses on the problem of record linkage or entity matching, tasks that can be very important in providing high quality data. Merging two or more large databases into a single integrated system is a difficult problem in many industries, especially in the wake of acquisitions. For example, managing customer lists can be challenging when duplicate entries, data entry problems, and changing information conspire to make data quality an elusive target. Common tasks with regard to customer lists include customer matching to reduce duplicate entries and household matching to group customers. These often O(n²) problems can consume significant resources, both in computing infrastructure and human oversight, and the goal of high accuracy in the final integrated database can be difficult to assure. This paper distinguishes between attribute corruption and entity corruption, discussing the various impacts on quality. A metajoin operator is proposed and used to organize past and current entity matching techniques. Finally, a logistic regression approach to implementing the metajoin operator is discussed and illustrated with an example. The metajoin can be used to determine whether two records match, don't match, or require further evaluation by human experts. Properly implemented, the metajoin operator could allow the integration of individual databases with greater accuracy and lower cost.
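The metajoin is described only abstractly above, so the sketch below invents everything concrete: a crude token-overlap similarity, made-up logistic weights (in practice these would be fit to labeled record pairs), and two thresholds that route each pair to match, non-match, or human review:

```python
import math

def name_similarity(a, b):
    """Crude token-overlap (Jaccard) similarity in [0, 1]; illustrative only."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

# Hypothetical logistic-regression weights; real values would be estimated
# from labeled matching / non-matching record pairs.
WEIGHTS = {"bias": -4.0, "name": 6.0, "zip": 3.0}

def match_probability(rec_a, rec_b):
    """Logistic score combining field-level evidence into P(match)."""
    score = (WEIGHTS["bias"]
             + WEIGHTS["name"] * name_similarity(rec_a["name"], rec_b["name"])
             + WEIGHTS["zip"] * (rec_a["zip"] == rec_b["zip"]))
    return 1.0 / (1.0 + math.exp(-score))

def metajoin(rec_a, rec_b, lo=0.3, hi=0.9):
    """Three-way decision: match, non-match, or route to a human expert."""
    p = match_probability(rec_a, rec_b)
    if p >= hi:
        return "match"
    if p <= lo:
        return "non-match"
    return "review"
```

The middle band between the two thresholds is exactly the "require further evaluation by human experts" outcome the abstract describes; widening it trades human effort for accuracy.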
Review of heat transfer problems associated with magnetically-confined fusion reactor concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, M.A.; Werner, R.W.; Carlson, G.A.
1976-04-01
Conceptual design studies of possible fusion reactor configurations have revealed a host of interesting and sometimes extremely difficult heat transfer problems. The general requirements imposed on the coolant system for heat removal of the thermonuclear power from the reactor are discussed. In particular, the constraints imposed by the fusion plasma, neutronics, structure and magnetic field environment are described with emphasis on those aspects which are unusual or unique to fusion reactors. Then the particular heat transfer characteristics of various possible coolants including lithium, flibe, boiling alkali metals, and helium are discussed in the context of these general fusion reactor requirements. Some specific areas where further experimental and/or theoretical work is necessary are listed for each coolant along with references to the pertinent research already accomplished. Specialized heat transfer problems of the plasma injection and removal systems are also described. Finally, the challenging heat transfer problems associated with the superconducting magnets are reviewed, and once again some of the key unsolved heat transfer problems are enumerated.
Combinatorial algorithms for design of DNA arrays.
Hannenhalli, Sridhar; Hubell, Earl; Lipshutz, Robert; Pevzner, Pavel A
2002-01-01
Optimal design of DNA arrays requires the development of algorithms with two-fold goals: reducing the effects caused by unintended illumination (the border length minimization problem) and reducing the complexity of masks (the mask decomposition problem). We describe algorithms that reduce the number of rectangles in mask decomposition by 20-30% as compared to a standard array design, under the assumption that the arrangement of oligonucleotides on the array is fixed. This algorithm produces a provably optimal solution for all studied real instances of array design. We also address the difficult problem of finding an arrangement that minimizes the border length, and introduce a new idea, threading, that significantly reduces the border length as compared to standard designs.
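The border length objective can be made concrete for the synchronous-synthesis model, where the border cost between two neighboring probes is commonly taken to be twice their Hamming distance. The sketch below scores a fixed arrangement; the grid and the exact cost model are illustrative assumptions, not the paper's algorithm:

```python
def border_length(grid):
    """Total border length of a probe arrangement, assuming a synchronous
    synthesis model where the border cost of two neighboring probes is
    twice their Hamming distance. `grid` is a 2-D list of equal-length
    probe strings; illustrative only."""
    def hamming(p, q):
        return sum(x != y for x, y in zip(p, q))
    total = 0
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:  # right neighbor
                total += 2 * hamming(grid[r][c], grid[r][c + 1])
            if r + 1 < rows:  # bottom neighbor
                total += 2 * hamming(grid[r][c], grid[r + 1][c])
    return total
```

An arrangement heuristic such as the threading idea mentioned above would permute the probes on the grid to drive this score down.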
Mack, Jennifer W; Ilowite, Maya; Taddei, Sarah
2017-02-15
Previous work on difficult relationships between patients and physicians has largely focused on the adult primary care setting and has typically held patients responsible for challenges. Little is known about experiences in pediatrics and more serious illness; therefore, we examined difficult relationships between parents and physicians of children with cancer. This was a cross-sectional, semistructured interview study of parents and physicians of children with cancer at the Dana-Farber Cancer Institute and Boston Children's Hospital (Boston, Mass) in longitudinal primary oncology relationships in which the parent, physician, or both considered the relationship difficult. Interviews were audiotaped, transcribed, and subjected to a content analysis. Dyadic parent and physician interviews were performed for 29 relationships. Twenty were experienced as difficult by both parents and physicians; 1 was experienced as difficult by the parent only; and 8 were experienced as difficult by the physician only. Parent experiences of difficult relationships were characterized by an impaired therapeutic alliance with physicians; physicians experienced difficult relationships as demanding. Core underlying issues included problems of connection and understanding (n = 8), confrontational parental advocacy (n = 16), mental health issues (n = 2), and structural challenges to care (n = 3). Although problems of connection and understanding often improved over time, problems of confrontational advocacy tended to solidify. Parents and physicians both experienced difficult relationships as highly distressing. Although prior conceptions of difficult relationships have held patients responsible for challenges, this study has found that difficult relationships follow several patterns. Some challenges, such as problems of connection and understanding, offer an opportunity for healing. 
However, confrontational advocacy appears especially refractory to repair; special consideration of these relationships and avenues for repairing them are needed. Cancer 2017;123:675-681. © 2016 American Cancer Society.
ERIC Educational Resources Information Center
Institute for Educational Leadership, Washington, DC.
State and local education systems must abandon the century-old model of the principal as a middle manager directly responsible for every aspect of school operations and performance. Intense job stress, excessive time requirements, difficulty satisfying parents and community, social problems that make it difficult to focus on instructional issues,…
Highly Refractory Porous Ceramics,
1979-03-14
some cases is not desirable. A uniform distribution of temperatures in the melting chamber... the use of thermally insulating materials, i.e. stability of... materials with micropores that are inaccessible to penetration by slags and other melts, which appears, however, to be a very difficult problem... requirements for refractoriness, chemical stability, etc. In accordance with GOST 5040-68, maximum temperatures were established for the melting chamber
Bordelon, B M; Hobday, K A; Hunter, J G
1992-01-01
An unsolved problem of laparoscopic cholecystectomy is the optimal method of removing the gallbladder with thick walls and a large stone burden. Proposed solutions include fascial dilatation, stone crushing, and ultrasonic, high-speed rotary, or laser lithotripsy. Our observation was that extension of the fascial incision to remove the impacted gallbladder was time efficient and did not increase postoperative pain. We reviewed the narcotic requirements of 107 consecutive patients undergoing laparoscopic cholecystectomy. Fifty-two patients required extension of the umbilical incision, and 55 patients did not have their fascial incision enlarged. Parenteral meperidine use was 39.5 +/- 63.6 mg in the patients requiring fascial incision extension and 66.3 +/- 79.2 mg in those not requiring fascial incision extension (mean +/- standard deviation). Oral narcotic requirements were 1.1 +/- 1.5 doses vs 1.3 +/- 1.7 doses in patients with and without incision extension, respectively. The wide range of narcotic use in both groups makes these apparent differences not statistically significant. We conclude that protracted attempts at stone crushing or expensive stone fragmentation devices are unnecessary for the extraction of a difficult gallbladder during laparoscopic cholecystectomy.
NASA Technical Reports Server (NTRS)
Hargrove, A.
1982-01-01
Optimal digital control of nonlinear multivariable constrained systems was studied. The optimal controller in the form of an algorithm was improved and refined by reducing running time and storage requirements. A particularly difficult system of nine nonlinear state variable equations was chosen as a test problem for analyzing and improving the controller. Lengthy analysis, modeling, computing and optimization were accomplished. A remote interactive teletype terminal was installed. Analysis requiring computer usage of short duration was accomplished using Tuskegee's VAX 11/750 system.
[Primary hyperaldosteronism: problems of diagnostic approaches].
Widimský, Jiří
2015-05-01
Primary hyperaldosteronism (PH) is a common cause of endocrine/secondary hypertension with autonomous aldosterone overproduction by the adrenal cortex. PH is typically characterized by hypertension, hypokalemia, a high plasma aldosterone/renin ratio, high aldosterone, suppressed renin, and nonsuppressibility of aldosterone during confirmatory tests. Diagnosis of PH can be difficult since hypokalemia is found in only 50 % of cases and measurement of the parameters of the renin-angiotensin-aldosterone system can be influenced by several factors. Morphological diagnosis requires adrenal venous sampling in the majority of cases. Early diagnostic and therapeutic measures are very important due to the high prevalence of PH and the potential for cure. Patients with suspicion of PH should be investigated in experienced hypertension centers due to the relatively difficult laboratory and morphological diagnostic approaches.
The use of telemetry in testing in high performance racing engines
NASA Astrophysics Data System (ADS)
Hauser, E.
Telemetry measurement data were recorded in a mobile application under difficult environmental conditions. All relevant racing car and engine parameters were measured: pressure, stress, temperature, acceleration, ignition, number of revolutions, control of electronic injection, and flow measurements on the car body. The difficult measuring conditions due to high-voltage ignition, mechanical loads, and vibrations impose special requirements on a telemetry system built into racing cars. It has to be compact, flexible, light, and mechanically robust, and has to fulfil special shielding conditions. The measured data are transferred to a stationary measurement car via a radio link, involving RF communication problems. The measured data are directly displayed and evaluated in the measurement car.
KAMEDO report no. 87: bomb attack in Finnish shopping center, 2002.
Deverell, Edward; Ortenwall, Per; Almgren, Ola; Riddez, Louis
2007-01-01
The detonation of a bomb in a shopping center in Vantaa, Finland, took place on 11 October 2002. Seven people died as a result and > 160 people required medical attention. Because the rescue teams were inadequately trained to respond to terrorist attacks, the event was handled according to protocol. A number of problems arose, including: people from different rescue agencies were difficult to distinguish from each other; there was inadequate communication between the incident site and the main hospital; relatives of victims were not kept informed; and psychiatric problems in the wake of the disaster were not addressed sufficiently.
Caronia, Francesco Paolo; Fiorelli, Alfonso; Arrigo, Ettore; Santini, Mario; Castorina, Sergio
2016-05-01
Herein, we report a catastrophic condition: the almost complete rupture of the trachea, associated with an esophageal lesion, following an urgent surgical tracheostomy performed for unexpected difficult intubation. The extent of the lesions required surgical management. We decided against a resection and end-to-end anastomosis, preferring instead to perform a direct suture of the lesion due to the presence of local and systemic infection. Then, the diagnosis of a tracheal fistula led us to perform a direct suture of the defect, which was covered with muscle flaps. Currently, the patient is alive without problems. Emergency situations such as unexpected difficult intubation increase the morbidity and mortality rates of tracheostomy, even in expert hands. Sometimes these events are unpredictable. Mastery of a number of advanced airway techniques should be sought when dealing with unexpected difficult intubations, and written consent covering such concerns should be obtained from the patient.
Parallel Preconditioning for CFD Problems on the CM-5
NASA Technical Reports Server (NTRS)
Simon, Horst D.; Kremenetsky, Mark D.; Richardson, John; Lasinski, T. A. (Technical Monitor)
1994-01-01
To date, preconditioning methods on massively parallel systems have faced a major difficulty. The preconditioning methods most successful at accelerating the convergence of the iterative solver, such as incomplete LU factorizations, are notoriously difficult to implement on parallel machines for two reasons: (1) the actual computation of the preconditioner is not very floating-point intensive, but requires a large amount of unstructured communication; and (2) the application of the preconditioning matrix in the iteration phase (i.e., triangular solves) is difficult to parallelize because of the recursive nature of the computation. Here we present a new approach to preconditioning for very large, sparse, unsymmetric, linear systems, which avoids both difficulties. We explicitly compute an approximate inverse to our original matrix. This new preconditioning matrix can be applied most efficiently for iterative methods on massively parallel machines, since the preconditioning phase involves only a matrix-vector multiplication, possibly with a dense matrix. Furthermore, the actual computation of the preconditioning matrix has natural parallelism. For a problem of size n, the preconditioning matrix can be computed by solving n independent small least squares problems. The algorithm and its implementation on the Connection Machine CM-5 are discussed in detail and supported by extensive timings obtained from real problem data.
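The column-wise least-squares construction can be sketched directly: each column m_j of the approximate inverse M minimizes ||A m_j - e_j||_2 over a prescribed sparsity pattern, so the n columns are independent small problems. The plain-Python sketch below uses normal equations and is illustrative only, not the CM-5 implementation:

```python
def solve(B, y):
    """Gauss-Jordan elimination with partial pivoting for a small dense system."""
    n = len(B)
    M = [row[:] + [y[i]] for i, row in enumerate(B)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def approximate_inverse(A, pattern):
    """Column j of the preconditioner M minimizes ||A m_j - e_j||_2 over a
    fixed sparsity pattern (the allowed row indices of column j), solved via
    normal equations. Each column is an independent small least-squares
    problem; this independence is the natural parallelism noted in the
    abstract. Illustrative sketch only."""
    n = len(A)
    M = [[0.0] * n for _ in range(n)]
    for j in range(n):
        idx = pattern[j]
        sub = [[A[r][c] for c in idx] for r in range(n)]   # n x k submatrix
        G = [[sum(sub[r][a] * sub[r][b] for r in range(n))
              for b in range(len(idx))] for a in range(len(idx))]  # Gram matrix
        rhs = [sub[j][a] for a in range(len(idx))]         # A^T e_j restricted
        m = solve(G, rhs)
        for a, r in enumerate(idx):
            M[r][j] = m[a]
    return M
```

Applying M in the iteration phase is then a single (sparse) matrix-vector product, with no recursive triangular solve.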
Sequential decision making in computational sustainability via adaptive submodularity
Krause, Andreas; Golovin, Daniel; Converse, Sarah J.
2015-01-01
Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are generally notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Secondly, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
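The myopic (greedy) policies the article analyzes can be illustrated in the non-adaptive special case with a coverage objective, a classic submodular function: repeatedly pick the action with the largest marginal gain. The item names and coverage sets below are invented for illustration:

```python
def greedy_select(items, cover, k):
    """Greedy maximization of a coverage function (a classic submodular
    objective): repeatedly pick the item with the largest marginal gain.
    For (adaptive) submodular objectives, such myopic policies are provably
    near-optimal. `cover` maps an item (e.g., a candidate conservation site)
    to the set of targets it protects; names are illustrative."""
    chosen, covered = [], set()
    for _ in range(k):
        best = max(items, key=lambda i: len(cover[i] - covered))
        if not (cover[best] - covered):
            break  # no remaining marginal gain
        chosen.append(best)
        covered |= cover[best]
        items = [i for i in items if i != best]
    return chosen, covered
```

In the adaptive setting, the same one-step-lookahead structure is kept, but the marginal gain is an expectation conditioned on the observations made so far.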
Laser inactivation of pathogenic viruses in water
NASA Astrophysics Data System (ADS)
Grishkanich, Alexander; Zhevlakov, Alexander; Kascheev, Sergey; Sidorov, Igor; Ruzankina, Julia; Yakovlev, Alexey; Mak, Andrey
2016-03-01
Currently there is a situation that makes it difficult to provide the population with quality drinking water for the sanitary-hygienic requirements. One of the urgent problems is the need for water disinfection. Since the emergence of microorganisms that are pathogens transmitted through water such as typhoid, cholera, etc. requires constant cleansing of waters against pathogenic bacteria. In the water treatment process is destroyed up to 98% of germs, but among the remaining can be pathogenic viruses, the destruction of which requires special handling. As a result, the conducted research the following methods have been proposed for combating harmful microorganisms: sterilization of water by laser radiation and using a UV lamp.
The Difficult Patron in the Academic Library: Problem Issues or Problem Patrons?
ERIC Educational Resources Information Center
Simmonds, Patience L.; Ingold, Jane L.
2002-01-01
Identifies difficult patron issues in academic libraries from the librarians' perspectives and offers solutions to try to prevent them from becoming problems. Topics include labeling academic library users; eliminating sources of conflict between faculty and library staff; and conflicts between students and library staff. (Author/LRW)
Evaluation of seismic spatial interaction effects through an impact testing program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, B.D.; Driesen, G.E.
The consequences of non-seismically qualified objects falling and striking essential, seismically qualified objects are an analytically difficult problem to assess. Analytical solutions to impact problems are conservative and only available for simple situations. In a nuclear facility, the numerous "sources" and "targets" requiring evaluation often have complex geometric configurations, which makes calculations and computer modeling difficult. Few industry or regulatory rules are available for this specialized assessment. A drop test program was recently conducted to "calibrate" the judgment of seismic qualification engineers who perform interaction evaluations and to further develop seismic interaction criteria. Impact tests on varying combinations of sources and targets were performed by dropping the sources from various heights onto targets that were connected to instruments. This paper summarizes the scope, test configurations, and some results of the drop test program. Force and acceleration time history data and general observations are presented on the ruggedness of various targets when subjected to impacts from different types of sources.
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor.
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-03-23
Recent developments in intelligence surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background cause detection to be a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works.
Cross-Correlation-Based Structural System Identification Using Unmanned Aerial Vehicles
Yoon, Hyungchul; Hoskere, Vedhus; Park, Jong-Woong; Spencer, Billie F.
2017-01-01
Computer vision techniques have been employed to characterize dynamic properties of structures, as well as to capture structural motion for system identification purposes. All of these methods leverage image-processing techniques using a stationary camera. This requirement makes finding an effective location for camera installation difficult, because civil infrastructure (e.g., bridges, buildings, etc.) is often difficult to access, being constructed over rivers, roads, or other obstacles. This paper seeks to use video from Unmanned Aerial Vehicles (UAVs) to address this problem. As opposed to the traditional way of using stationary cameras, the use of UAVs brings the issue of the camera itself moving; thus, the displacements of the structure obtained by processing UAV video are relative to the UAV camera. Some efforts have been reported to compensate for the camera motion, but they require certain assumptions that may be difficult to satisfy. This paper proposes a new method for structural system identification using the UAV video directly. Several challenges are addressed, including: (1) estimation of an appropriate scale factor; and (2) compensation for the rolling shutter effect. Experimental validation is carried out to validate the proposed approach. The experimental results demonstrate the efficacy and significant potential of the proposed approach. PMID:28891985
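The cross-correlation matching underlying such vision-based measurement can be illustrated in one dimension: the displacement estimate is the integer lag that maximizes the normalized cross-correlation between a reference signal and a shifted one. This is a generic stand-in for the technique's core step, not the paper's UAV pipeline:

```python
def best_shift(ref, sig, max_shift):
    """Estimate the shift of `sig` relative to `ref` by maximizing the
    normalized cross-correlation over integer lags; a 1-D stand-in for the
    image-patch matching used in vision-based displacement measurement."""
    def score(lag):
        pairs = [(ref[i], sig[i + lag]) for i in range(len(ref))
                 if 0 <= i + lag < len(sig)]
        n = len(pairs)
        mr = sum(r for r, _ in pairs) / n
        ms = sum(s for _, s in pairs) / n
        num = sum((r - mr) * (s - ms) for r, s in pairs)
        den = (sum((r - mr) ** 2 for r, _ in pairs)
               * sum((s - ms) ** 2 for _, s in pairs)) ** 0.5
        return num / den if den else 0.0
    return max(range(-max_shift, max_shift + 1), key=score)
```

In the UAV setting, the estimated pixel shift must still be converted to physical displacement via the scale factor, and corrected for the camera's own motion, which is the hard part the paper addresses.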
Model-Based Optimal Experimental Design for Complex Physical Systems
2015-12-03
magnitude reduction in estimator error required to make solving the exact optimal design problem tractable. Instead of using a naive...for designing a sequence of experiments uses suboptimal approaches: batch design that has no feedback, or greedy (myopic) design that optimally...Equation 1 is difficult to solve directly, but can be expressed in an equivalent form using the principle of dynamic programming
Transient heat conduction in a heat fin
NASA Astrophysics Data System (ADS)
Brody, Jed; Brown, Max
2017-08-01
We immerse the bottom of a rod in ice water and record the time-dependent temperatures at positions along the length of the rod. Though the experiment is simple, a surprisingly difficult problem in heat conduction must be solved to obtain a theoretical fit to the measured data. The required equipment is very inexpensive and could be assigned as a homework exercise or a hands-on component of an online course.
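A minimal numerical counterpart to the experiment is an explicit finite-difference solution of the fin equation with a fixed base temperature and an insulated tip. All parameter values below are illustrative, not fitted to the apparatus described:

```python
def fin_temperatures(n=20, steps=2000, alpha=0.25, h=0.01, dt=1.0,
                     T0=20.0, T_bath=0.0, T_air=20.0):
    """Explicit finite-difference sketch of the fin equation
    dT/dt = alpha * d2T/dx2 - h * (T - T_air), with the base held in the
    ice bath and an insulated tip. Parameter values are illustrative and
    chosen to satisfy the explicit-scheme stability limit
    (alpha * dt / dx^2 <= 1/2 with dx = 1)."""
    T = [T0] * n
    T[0] = T_bath                     # base immersed in ice water
    for _ in range(steps):
        new = T[:]
        for i in range(1, n - 1):
            new[i] = (T[i] + alpha * dt * (T[i - 1] - 2 * T[i] + T[i + 1])
                      - h * dt * (T[i] - T_air))
        new[n - 1] = new[n - 2]       # insulated tip: zero gradient
        T = new
    return T
```

The lateral heat-exchange term h(T - T_air) is what distinguishes the fin from a plain conduction rod, and it is the reason the analytical fit to the measured data is harder than the experiment suggests.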
Evaluation of Genetic Algorithm Concepts Using Model Problems. Part 2; Multi-Objective Optimization
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Pulliam, Thomas H.
2003-01-01
A genetic algorithm approach suitable for solving multi-objective optimization problems is described and evaluated using a series of simple model problems. Several new features, including a binning selection algorithm and a gene-space transformation procedure, are included. The genetic algorithm is suitable for finding Pareto optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all optimization problems attempted. The binning algorithm generally provides Pareto front quality enhancements and moderate convergence efficiency improvements for most of the model problems. The gene-space transformation procedure provides a large convergence efficiency enhancement for problems with non-convoluted Pareto fronts and a degradation in efficiency for problems with convoluted Pareto fronts. The most difficult problems, multi-mode search spaces with a large number of genes and convoluted Pareto fronts, require a large number of function evaluations for GA convergence, but always converge.
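The notion of Pareto optimality the algorithm targets can be made concrete with a dominance test and a non-dominated filter (shown here for minimization; this is a generic sketch, not the binning selection algorithm itself):

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` under minimization:
    no worse in every objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Non-dominated subset of a population's objective vectors; the set a
    multi-objective GA tries to approximate. Illustrative sketch."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

A multi-objective GA evaluates its population against tests like these each generation, keeping and refining the non-dominated set.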
Why Do Disadvantaged Filipino Children Find Word Problems in English Difficult?
ERIC Educational Resources Information Center
Bautista, Debbie; Mulligan, Joanne
2010-01-01
Young Filipino students are expected to solve mathematical word problems in English, a language that many encounter only in schools. Using individual interviews of 17 Filipino children, we investigated why word problems in English are difficult and the extent to which the language interferes with performance. Results indicate that children could…
Unterrainer, J M; Kaller, C P; Halsband, U; Rahm, B
2006-08-01
Playing chess requires problem-solving capacities in order to search through the chess problem space in an effective manner. Chess should thus require planning abilities for calculating many moves ahead. Therefore, we asked whether chess players are better problem solvers than non-chess players in a complex planning task. We compared planning performance between chess players (N=25) and non-chess players (N=25) using a standard psychometric planning task, the Tower of London (ToL) test. We also assessed fluid intelligence (Raven Test), as well as verbal and visuospatial working memory. As expected, chess players showed better planning performance than non-chess players, an effect most strongly expressed in difficult problems. On the other hand, they showed longer planning and movement execution times, especially for incorrectly solved trials. No differences in fluid intelligence or verbal/visuospatial working memory were found between the two groups. These findings indicate that better performance in chess players is associated with disproportionately longer solution times, although it remains to be investigated whether motivational or strategic differences account for this result.
Cultural-based particle swarm for dynamic optimisation problems
NASA Astrophysics Data System (ADS)
Daneshyari, Moayed; Yen, Gary G.
2012-07-01
Many practical optimisation problems involve uncertainties, among which a significant number belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes through time. In this study, we propose cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted, incorporating the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment, assists the response to change through diversity-based repulsion among particles and migration among swarms in the population space, and also helps in selecting the leading particles at three different levels: personal, swarm and global. Comparison of the proposed heuristic over several difficult dynamic benchmark problems demonstrates better or equal performance with respect to most of the other selected state-of-the-art dynamic PSO heuristics.
NASA Astrophysics Data System (ADS)
Strack, O. D. L.
2018-02-01
We present equations for new limitless analytic line elements. These elements possess a virtually unlimited number of degrees of freedom. We apply these new limitless analytic elements to head-specified boundaries and to problems with inhomogeneities in hydraulic conductivity. Applications of these new analytic elements to practical problems involving head-specified boundaries require the solution of a very large number of equations. To make the new elements useful in practice, an efficient iterative scheme is required. We present an improved version of the scheme presented by Bandilla et al. (2007), based on the application of Cauchy integrals. The limitless analytic elements are useful when modeling strings of elements, rivers for example, where local conditions are difficult to model, e.g., when a well is close to a river. The solution of such problems is facilitated by increasing the order of the elements to obtain a good solution. This makes it unnecessary to resort to dividing the element in question into many smaller elements to obtain a satisfactory solution.
[The first and foremost tasks of the medical service].
Chizh, I M
1997-07-01
In connection with the present situation in the Russian Federation, the reinforcement of the army and fleet with healthy personnel, the scarcity of the call-up quota, and its poor quality are the main problems of the Armed Forces at the state level. A uniform, comprehensive program of medico-social support for citizens preparing for military service is necessary. The current situation is made difficult by many infectious diseases, so the role and place of the military medical service are growing. In recent years, the structure of the quota served by military doctors and a number of other parameters have changed greatly, requiring a revision of some priorities. The problem of staffing the Armed Forces with medical service officers remains pressing; its solution requires full admission to the military medical faculty, as well as the admission of officers under contract and the call-up of reserve officers. The article also formulates the main lessons learned by the medical service during combat actions in the Republic of Chechnya.
NASA Astrophysics Data System (ADS)
Umbarkar, A. J.; Balande, U. T.; Seth, P. D.
2017-06-01
The fields of nature-inspired computing and optimization techniques have evolved to solve difficult optimization problems in diverse areas of engineering, science and technology. The firefly attraction process is mimicked in the algorithm for solving optimization problems. In the Firefly Algorithm (FA), fireflies are ranked using a sorting algorithm; the original FA uses bubble sort. In this paper, quick sort replaces bubble sort to decrease the time complexity of FA. The dataset used is the unconstrained benchmark functions from CEC 2005 [22]. FA using bubble sort and FA using quick sort are compared with respect to best, worst, and mean values, standard deviation, number of comparisons, and execution time. The experimental results show that FA using quick sort requires fewer comparisons but more execution time. Increasing the number of fireflies helps convergence to the optimal solution, whereas, when the dimension is varied, the algorithm performs better at lower dimensions than at higher ones.
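The role of the sorting step can be illustrated by counting comparisons directly: bubble sort always performs n(n-1)/2 comparisons, while quicksort typically needs far fewer on shuffled data. A toy sketch of that comparison count, not the FA implementation from the paper:

```python
def bubble_sort(a):
    """Bubble sort with a comparison counter; always n(n-1)/2 comparisons."""
    a, comps = a[:], 0
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            comps += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comps

def quick_sort(a):
    """Quicksort with a comparison counter; O(n log n) comparisons on average."""
    comps = 0
    def qs(a):
        nonlocal comps
        if len(a) <= 1:
            return a
        pivot, rest = a[0], a[1:]
        comps += len(rest)  # one comparison per element against the pivot
        return (qs([x for x in rest if x < pivot]) + [pivot]
                + qs([x for x in rest if x >= pivot]))
    return qs(a), comps
```

The paper's observation that quick sort nevertheless costs more wall-clock time in their FA is an empirical result about their implementation, not something this sketch reproduces.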
NASA Technical Reports Server (NTRS)
Vajingortin, L. D.; Roisman, W. P.
1991-01-01
The problem of ensuring the required quality of products and/or technological processes often becomes more difficult because there is no general theory for determining the optimal sets of values of the primary factors, i.e., of the output parameters of the parts and units comprising an object, that ensure the correspondence of the object's parameters to the quality requirements. This is the main reason for the amount of time taken to finish complex, vital articles. To create this theory, one has to overcome a number of difficulties and solve the following tasks: the creation of reliable and stable mathematical models showing the influence of the primary factors on the output parameters; finding a new technique of assigning tolerances for primary factors with regard to economic, technological, and other criteria, the technique being based on the solution of the main problem; and the well-reasoned assignment of nominal values for primary factors, which serve as the basis for creating tolerances. Each of the above listed tasks is of independent importance. An attempt is made to give solutions for this problem. The above problem of quality assurance, in its mathematically formalized aspect, is called the multiple inverse problem.
Kochanska, Grazyna; Kim, Sanghag
2012-01-01
Background Research has shown that interactions between young children’s temperament and the quality of care they receive predict the emergence of positive and negative socioemotional developmental outcomes. This multi-method study addresses such interactions, using observed and mother-rated measures of difficult temperament, children’s committed, self-regulated compliance and externalizing problems, and mothers’ responsiveness in a low-income sample. Methods In 186 30-month-old children, difficult temperament was observed in the laboratory (as poor effortful control and high anger proneness), and rated by mothers. Mothers’ responsiveness was observed in lengthy naturalistic interactions at 30 and 33 months. At 40 months, children’s committed compliance and externalizing behavior problems were assessed using observations and several well-established maternal report instruments. Results Parallel significant interactions between child difficult temperament and maternal responsiveness were found across both observed and mother-rated measures of temperament. For difficult children, responsiveness had a significant effect such that those children were more compliant and had fewer externalizing problems when they received responsive care, but were less compliant and had more behavior problems when they received unresponsive care. For children with easy temperaments, maternal responsiveness and developmental outcomes were unrelated. All significant interactions reflected the diathesis-stress model. There was no evidence of differential susceptibility, perhaps due to the pervasive stress present in the ecology of the studied families. Conclusions Those findings add to the growing body of evidence that for temperamentally difficult children, unresponsive parenting exacerbates risks for behavior problems, but responsive parenting can effectively buffer risks conferred by temperament. PMID:23057713
A Research Methodology for Studying What Makes Some Problems Difficult to Solve
ERIC Educational Resources Information Center
Gulacar, Ozcan; Fynewever, Herb
2010-01-01
We present a quantitative model for predicting the level of difficulty subjects will experience with specific problems. The model explicitly accounts for the number of subproblems a problem can be broken into and the difficultly of each subproblem. Although the model builds on previously published models, it is uniquely suited for blending with…
A cross-disciplinary introduction to quantum annealing-based algorithms
NASA Astrophysics Data System (ADS)
Venegas-Andraca, Salvador E.; Cruz-Santos, William; McGeoch, Catherine; Lanzagorta, Marco
2018-04-01
A central goal in quantum computing is the development of quantum hardware and quantum algorithms in order to analyse challenging scientific and engineering problems. Research in quantum computation involves contributions from both physics and computer science; hence this article presents a concise introduction to basic concepts from both fields that are used in annealing-based quantum computation, an alternative to the more familiar quantum gate model. We introduce some concepts from computer science required to define difficult computational problems and to realise the potential relevance of quantum algorithms to find novel solutions to those problems. We introduce the structure of quantum annealing-based algorithms as well as two examples of such algorithms for solving instances of the max-SAT and Minimum Multicut problems. An overview of the quantum annealing systems manufactured by D-Wave Systems is also presented.
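To make the computer-science side concrete, the following Python sketch treats a tiny max-SAT instance as an energy-minimization problem and finds its ground state by exhaustive enumeration, the classical baseline that an annealer is meant to approximate on larger instances. The instance and names are illustrative, not from the article:

```python
from itertools import product

# Each clause is a list of literals; positive int i means x_i, negative means NOT x_i.
clauses = [[1, 2], [-1, 2], [1, -2]]  # a toy max-SAT instance

def energy(assignment, clauses):
    """Count unsatisfied clauses -- the quantity an annealer would minimize."""
    unsat = 0
    for clause in clauses:
        satisfied = any(
            bool(assignment[abs(lit) - 1]) if lit > 0 else not assignment[abs(lit) - 1]
            for lit in clause
        )
        unsat += 0 if satisfied else 1
    return unsat

def ground_state(clauses, n_vars):
    """Exhaustively enumerate all 2^n assignments (what annealing approximates)."""
    return min(product([0, 1], repeat=n_vars), key=lambda a: energy(a, clauses))

best = ground_state(clauses, 2)
# best == (1, 1): setting x1 = x2 = 1 satisfies all three clauses (energy 0)
```

The exponential enumeration here is exactly what makes such problems "difficult" in the article's sense; an annealing-based algorithm searches the same energy landscape without enumerating it.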
A comparison of approaches for finding minimum identifying codes on graphs
NASA Astrophysics Data System (ADS)
Horan, Victoria; Adachi, Steve; Bak, Stanley
2016-05-01
In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard, and their computational complexity makes this research approach difficult using standard brute force on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored: a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and satisfiability modulo theories (SMT) with corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
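As an illustration of why brute force breaks down, the following Python finds a minimum identifying code by exhaustive search; it is feasible only for tiny graphs such as the 4-vertex path used here (the graph and names are illustrative, not from the paper):

```python
from itertools import combinations

def closed_neighborhood(adj, v):
    return frozenset(adj[v]) | {v}

def min_identifying_code(adj):
    """Brute-force the smallest identifying code of a graph.

    A code C identifies the graph if every vertex's closed neighborhood
    intersected with C is non-empty and distinct from every other vertex's.
    The search is exponential in |V| -- exactly why the paper turns to
    parallel, annealing, and SMT approaches for larger instances.
    """
    vertices = sorted(adj)
    for size in range(1, len(vertices) + 1):
        for code in combinations(vertices, size):
            cs = set(code)
            sigs = [frozenset(closed_neighborhood(adj, v) & cs) for v in vertices]
            if all(sigs) and len(set(sigs)) == len(sigs):
                return cs
    return None

# Path graph 0-1-2-3: the minimum identifying code needs 3 of the 4 vertices.
path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
code = min_identifying_code(path4)
```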
Managing the Risks of Climate Change and Terrorism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosa, Eugene; Dietz, Tom; Moss, Richard H.
2012-04-07
Society has difficult decisions to make about how best to allocate its resources to ensure future sustainability. Risk assessment can be a valuable tool: it has long been used to support decisions to address environmental problems. But in a time when the risks to sustainability range from climate change to terrorism, applying risk assessment to sustainability will require careful rethinking. For new threats, we will need a new approach to risk assessment.
NASA Technical Reports Server (NTRS)
Lewis, Clayton; Wilde, Nick
1989-01-01
Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.
[Community coordination of dental care needs in a home medical care support ward and at home].
Sumi, Yasunori; Ozawa, Nobuyoshi; Miura, Hiroko; Miura, Hisayuki; Toba, Kenji
2011-01-01
The purpose of this study was to ascertain the current statuses and problems of dental home care patients by surveying the oral care status and needs of patients in the home medical care support ward at the National Center for Geriatrics and Gerontology. Patients that required continuous oral management even after discharge from the hospital were referred to local dental clinics to receive home dental care. We investigated the suitability and problems associated with such care, and identified the dental care needs of home patients and the status of local care coordination, including those in hospitals. The subjects were 82 patients. We ascertained their general condition and oral status, and also investigated the problems associated with patients judged to need specialized oral care by a dentist during oral treatment. Patients who required continuous specialized oral care after discharge from hospital were referred to dental clinics that could provide regular care, and the problems at the time of referral were identified. Dry mouth was reported by many patients. A large number of patients also needed specialized dental treatment such as the removal of dental calculus or tooth extraction. Problems were seen in oral function, with 38 of the patients (46%) unable to gargle and 23 (28%) unable to hold their mouths open. About half of the patients also had dementia, and communication with these patients was difficult. Of the 43 patients who were judged to need continuing oral care after discharge from hospital, their referral to a dental clinic for regular care was successful for 22 (51%) patients and unsuccessful for 21 (49%) patients. The reasons for unsuccessful referrals included the fact that the family, patient, nurse, or caregiver did not understand the need for specialized oral care. The present results suggest the need for specialized oral treatment in home medical care. 
These findings also suggest that coordinating seamless dental care among primary physicians, intermediates, and transferring care after hospital discharge to regular dentists is difficult.
Thermal control on the lunar surface
NASA Technical Reports Server (NTRS)
Walker, Sherry T.; Alexander, Reginald A.; Tucker, Stephen P.
1995-01-01
For a mission to the Moon which lasts more than a few days, thermal control is a challenging problem because of the Moon's wide temperature swings and long day and night periods. During the lunar day it is difficult to reject heat at temperatures low enough to be comfortable for either humans or electronic components, while excessive heat loss can damage unprotected equipment at night. Fluid systems can readily be designed to operate at either the hot or cold temperature extreme, but it is more difficult to accommodate both extremes within the same system. Special consideration should be given to sensitive systems, such as optics and humans, and to systems that generate large amounts of waste heat, such as lunar bases or manufacturing facilities. Passive thermal control systems such as covers, shades, and optical coatings can be used to mitigate the temperature swings experienced by components. For more precise thermal control, active systems such as heaters or heat pumps are needed, although they consume more power than passive systems.
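The day-side difficulty can be illustrated with a simple radiative balance: a sunlit surface reaches an equilibrium temperature set by its absorptivity-to-emissivity ratio, which is what optical coatings manipulate. The numbers below are textbook approximations, not mission values:

```python
# Radiative equilibrium sketch for a sunlit lunar surface element.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR = 1371.0     # approximate solar flux at 1 AU, W/m^2

def equilibrium_temp(absorptivity, emissivity, flux=SOLAR):
    """T such that absorbed flux equals re-radiated flux: a*S = e*sigma*T^4."""
    return (absorptivity * flux / (emissivity * SIGMA)) ** 0.25

t_subsolar = equilibrium_temp(1.0, 1.0)   # ~394 K for a black surface at the subsolar point
t_coated = equilibrium_temp(0.2, 0.9)     # a white-paint-like coating runs much cooler
```

Rejecting waste heat to an environment near 394 K while keeping equipment comfortable is the core day-side problem; a low-absorptivity, high-emissivity coating lowers the equilibrium temperature substantially.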
Multispectral optical telescope alignment testing for a cryogenic space environment
NASA Astrophysics Data System (ADS)
Newswander, Trent; Hooser, Preston; Champagne, James
2016-09-01
Multispectral space telescopes with visible to long-wave infrared spectral bands present difficult alignment challenges. The visible channels require precision in alignment and stability to provide good image quality at short wavelengths. This is most often accomplished by choosing near-zero-thermal-expansion glass or ceramic mirrors metered with carbon fiber reinforced polymer (CFRP) structures designed to have a matching thermal expansion. The IR channels are less sensitive to alignment, but they often require cryogenic cooling for improved sensitivity with the reduced radiometric background. Finding efficient solutions to this difficult problem of maintaining good visible image quality at cryogenic temperatures has been explored through the building and testing of a telescope simulator: an on-axis set of optics with a ZERODUR® mirror and CFRP metering. Testing has been completed to accurately measure telescope optical element alignment and mirror figure changes in a cryogenic space-simulated environment. Measured alignment error and mirror figure error test results are reported with a discussion of their impact on system optical performance.
Revisiting software specification and design for large astronomy projects
NASA Astrophysics Data System (ADS)
Wiant, Scott; Berukoff, Steven
2016-07-01
The separation of science and engineering in the delivery of software systems overlooks the true nature of the problem being solved and the organization that will solve it. Use of a systems engineering approach to managing the requirements flow between these two groups as between a customer and contractor has been used with varying degrees of success by well-known entities such as the U.S. Department of Defense. However, treating science as the customer and engineering as the contractor fosters unfavorable consequences that can be avoided and opportunities that are missed. For example, the "problem" being solved is only partially specified through the requirements generation process since it focuses on detailed specification guiding the parties to a technical solution. Equally important is the portion of the problem that will be solved through the definition of processes and staff interacting through them. This interchange between people and processes is often underrepresented and underappreciated. By concentrating on the full problem and collaborating on a strategy for its solution a science-implementing organization can realize the benefits of driving towards common goals (not just requirements) and a cohesive solution to the entire problem. The initial phase of any project when well executed is often the most difficult yet most critical and thus it is essential to employ a methodology that reinforces collaboration and leverages the full suite of capabilities within the team. This paper describes an integrated approach to specifying the needs induced by a problem and the design of its solution.
NASA Astrophysics Data System (ADS)
Buchner, Johannes
2011-12-01
Scheduling, the task of producing a time table for resources and tasks, is well known to become more difficult as more resources are involved (an NP-hard problem). This is about to become an issue in radio astronomy as observatories consisting of hundreds to thousands of telescopes are planned and operated. The Square Kilometre Array (SKA), which Australia and New Zealand bid to host, is aiming for scales where current approaches -- in construction, operation, but also scheduling -- are insufficient. Although manual scheduling is common today, the problem is becoming complicated by the demand for (1) independent sub-arrays doing simultaneous observations, which requires the scheduler to plan parallel observations, and (2) dynamic re-scheduling on changed conditions. Both of these requirements apply to the SKA, especially in the construction phase. We review the scheduling approaches taken in the astronomy literature, as well as investigate techniques from human schedulers and today's observatories. The scheduling problem is specified in general for scientific observations and in particular for radio telescope arrays. Also taken into account is the fact that the observatory may be oversubscribed, requiring the scheduling problem to be integrated with a planning process. We solve this long-term scheduling problem using a time-based encoding that works in the very general case of observation scheduling. This research then compares algorithms from various approaches, including fast heuristics from CPU scheduling, Linear Integer Programming, Genetic algorithms, and Branch-and-Bound enumeration schemes. Measures include not only goodness of the solution, but also scalability and re-scheduling capabilities. In conclusion, we have identified a fast and good scheduling approach that allows (re-)scheduling difficult and changing problems by combining heuristics with a Genetic algorithm using block-wise mutation operations.
We are able to explain and eradicate two problems in the literature: the inability of a GA to properly improve schedules, and the generation of schedules with frequent interruptions. Finally, we demonstrate the scheduling framework for several operating telescopes: (1) dynamic re-scheduling with the AUT Warkworth 12m telescope, (2) scheduling for the Australian Mopra 22m telescope, and (3) scheduling for the Allen Telescope Array. Furthermore, we discuss the applicability of the presented scheduling framework to the Atacama Large Millimeter/submillimeter Array (ALMA, in construction) and the SKA. In particular, during the development phase of the SKA, this dynamic, scalable scheduling framework can accommodate changing conditions.
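A minimal sketch of the combined approach the thesis describes -- a Genetic Algorithm with block-wise mutation over a permutation encoding, decoded greedily into a schedule -- might look like the following Python; the observation durations, decoder, and GA parameters are all illustrative assumptions, not the thesis implementation:

```python
import random

def decode(order, durations, horizon):
    """Greedy decoder: place observations in chromosome order until the horizon fills."""
    t, schedule = 0, []
    for i in order:
        if t + durations[i] <= horizon:
            schedule.append(i)
            t += durations[i]
    return schedule, t

def block_mutate(order, rng):
    """Block-wise mutation: reverse a random contiguous block of the permutation."""
    a, b = sorted(rng.sample(range(len(order)), 2))
    return order[:a] + order[a:b + 1][::-1] + order[b + 1:]

def evolve(durations, horizon, pop_size=20, generations=100, seed=0):
    rng = random.Random(seed)
    n = len(durations)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    fitness = lambda o: decode(o, durations, horizon)[1]  # total scheduled time
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]                   # elitist selection
        pop = survivors + [block_mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

durations = [3, 7, 2, 5, 4, 6]          # hypothetical observation lengths
best = evolve(durations, horizon=20)
schedule, used = decode(best, durations, 20)
```

The greedy decoder plays the role of the fast heuristic; the GA searches over orderings it cannot fix on its own, which is the division of labor the thesis argues for.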
ERIC Educational Resources Information Center
Taber, Mary R.
2013-01-01
Mathematics can be a difficult topic both to teach and to learn. Word problems specifically can be difficult for students with disabilities because they have to conceptualize what the problem is asking for, and they must perform the correct operation accurately. Current trends in mathematics instruction stem from the National Council of Teachers…
Optical techniques to feed and control GaAs MMIC modules for phased array antenna applications
NASA Technical Reports Server (NTRS)
Bhasin, K. B.; Anzic, G.; Kunath, R. R.; Connolly, D. J.
1986-01-01
A complex signal distribution system is required to feed and control GaAs monolithic microwave integrated circuits (MMICs) for phased array antenna applications above 20 GHz. Each MMIC module will require one or more RF lines, one or more bias voltage lines, and digital lines to provide a minimum of 10 bits of combined phase and gain control information. In a closely spaced array, the routing of these multiple lines presents difficult topology problems as well as a high probability of signal interference. To overcome GaAs MMIC phased array signal distribution problems, optical fibers interconnected to monolithically integrated optical components with GaAs MMIC array elements are proposed as a solution. System architecture considerations using optical fibers are described. The analog and digital optical links to respectively feed and control MMIC elements are analyzed. It is concluded that a fiber optic network will reduce weight and complexity, and increase reliability and performance, but higher power will be required.
Exact parallel algorithms for some members of the traveling salesman problem family
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pekny, J.F.
1989-01-01
The traveling salesman problem and its many generalizations comprise one of the best known combinatorial optimization problem families. Most members of the family are NP-complete problems, so that exact algorithms require an unpredictable and sometimes large computational effort. Parallel computers offer hope for providing the power required to meet these demands. A major barrier to applying parallel computers is the lack of parallel algorithms. The contributions presented in this thesis center around new exact parallel algorithms for the asymmetric traveling salesman problem (ATSP), prize collecting traveling salesman problem (PCTSP), and resource constrained traveling salesman problem (RCTSP). The RCTSP is a particularly difficult member of the family since finding a feasible solution is an NP-complete problem. An exact sequential algorithm is also presented for the directed hamiltonian cycle problem (DHCP). The DHCP algorithm is superior to current heuristic approaches and represents the first exact method applicable to large graphs. Computational results presented for each of the algorithms demonstrate the effectiveness of combining efficient algorithms with parallel computing methods. Performance statistics are reported for randomly generated ATSPs with 7,500 cities, PCTSPs with 200 cities, RCTSPs with 200 cities, DHCPs with 3,500 vertices, and assignment problems of size 10,000. Sequential results were collected on a Sun 4/260 engineering workstation, while parallel results were collected using a 14 and 100 processor BBN Butterfly Plus computer. The computational results represent the largest instances ever solved to optimality on any type of computer.
Sensitivity and bias under conditions of equal and unequal academic task difficulty.
Reed, Derek D; Martens, Brian K
2008-01-01
We conducted an experimental analysis of children's relative problem-completion rates across two workstations under conditions of equal (Experiment 1) and unequal (Experiment 2) problem difficulty. Results were described using the generalized matching equation and were evaluated for degree of schedule versus stimulus control. Experiment 1 involved a symmetrical choice arrangement in which the children could earn points exchangeable for rewards contingent on correct math problem completion. Points were delivered according to signaled variable-interval schedules at each workstation. For 2 children, relative rates of problem completion appeared to have been controlled by the schedule requirements in effect and matched relative rates of reinforcement, with sensitivity values near 1 and bias values near 0. Experiment 2 involved increasing the difficulty of math problems at one of the workstations. Sensitivity values for all 3 participants were near 1, but a substantial increase in bias toward the easier math problems was observed. This bias was possibly associated with responding at the more difficult workstation coming under stimulus control rather than schedule control.
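The reported sensitivity and bias values come from fitting the generalized matching equation, log(B1/B2) = a·log(R1/R2) + log b, where a is sensitivity and log b is bias. The Python below sketches that log-log least-squares fit on synthetic (not study) data in which matching is perfect:

```python
import math

def fit_matching(behavior_ratios, reinforcement_ratios):
    """Least-squares fit of the generalized matching equation:
        log(B1/B2) = a * log(R1/R2) + log(b)
    Returns sensitivity a and bias log(b)."""
    xs = [math.log10(r) for r in reinforcement_ratios]
    ys = [math.log10(b) for b in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    log_b = my - a * mx
    return a, log_b

# Synthetic data with perfect matching (a = 1) and no bias (log b = 0):
r_ratios = [0.25, 0.5, 1.0, 2.0, 4.0]
b_ratios = [0.25, 0.5, 1.0, 2.0, 4.0]
a, log_b = fit_matching(b_ratios, r_ratios)
# a ≈ 1.0 and log_b ≈ 0.0, the pattern Experiment 1 reports
```

Experiment 2's bias toward the easier problems would show up here as a nonzero log_b even while a stays near 1.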
Management of common behaviour and mental health problems.
El-Radhi, A Sahib
Behavioural problems are usually influenced by both biological and environmental factors. Disruptive behavioural problems such as temper tantrums or attention deficit hyperactivity disorder are displayed during the first years of childhood. Breath-holding attacks are relatively common and are an important problem. Although the attacks are not serious and the prognosis is usually good, parents often fear that their child may die during an attack. Parents therefore require explanation and reassurance from health professionals. Conduct disorders (often referred to as antisocial behaviours), such as aggression to others or theft, are more serious as they tend to be repetitive and persistent behaviours where the basic rights of others are violated. Emotional problems, such as anxiety, depression and post-traumatic stress disorder, tend to occur in later childhood, and are often unrecognised because young children often find it difficult to express their emotions, or their difficulties may go unnoticed by the child's parents. This article briefly discusses the most common behavioural problems, including autism, that affect children of all ages.
High-frequency CAD-based scattering model: SERMAT
NASA Astrophysics Data System (ADS)
Goupil, D.; Boutillier, M.
1991-09-01
Specifications for an industrial radar cross section (RCS) calculation code are given: it must be able to exchange data with many computer aided design (CAD) systems, it must be fast, and it must have powerful graphic tools. Classical physical optics (PO) and equivalent currents (EC) techniques have proven their efficiency on simple objects for a long time. Difficult geometric problems occur when objects with very complex shapes have to be computed. Only a specific geometric code can solve these problems. We have established that, once these problems have been solved: (1) PO and EC give good results on complex objects of large size compared to wavelength; and (2) the implementation of these objects in a software package (SERMAT) allows fast and sufficiently precise domain RCS calculations to meet industry requirements in the domain of stealth.
Gioacchini, Matteo; Bottoni, Manuela; Grassetti, Luca; Scalise, Alessandro
2015-01-01
Summary: Lower-pole shaping of the breast is sometimes a difficult challenge when performing vertical mammoplasty. The problems mostly encountered are too large breast bases, persistent dog ears, which require long incision, and poor breast projection. We report a modification of the technique that we use in breast reduction so as to better shape the lower pole and to reduce revision surgery. PMID:26034653
Porosity Estimation By Artificial Neural Networks Inversion . Application to Algerian South Field
NASA Astrophysics Data System (ADS)
Eladj, Said; Aliouane, Leila; Ouadfeul, Sid-Ali
2017-04-01
One of the main current challenges for geophysicists is the discovery and study of stratigraphic traps; this is a difficult task and requires very fine analysis of the seismic data. Seismic data inversion allows obtaining lithological and stratigraphic information for reservoir characterization. However, when solving the inverse problem we encounter difficult problems such as non-existence and non-uniqueness of the solution, in addition to instability of the processing algorithm. Therefore, uncertainties in the data and the non-linearity of the relationship between the data and the parameters must be taken seriously. In this case, artificial intelligence techniques such as Artificial Neural Networks (ANN) are used to resolve this ambiguity; this can be done by integrating different physical properties data, which requires supervised learning methods. In this work, we invert the acoustic impedance 3D seismic cube using the colored inversion method; then, introducing the acoustic impedance volume resulting from the first step as an input to the model-based inversion method allows calculating the porosity volume using a Multilayer Perceptron Artificial Neural Network. Application to an Algerian South hydrocarbon field clearly demonstrates the power of the proposed processing technique to predict porosity from seismic data; the results obtained can be used for reserves estimation, permeability prediction, recovery factor, and reservoir monitoring. Keywords: Artificial Neural Networks, inversion, non-uniqueness, non-linearity, 3D porosity volume, reservoir characterization.
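As a heavily simplified sketch of the supervised-learning step, the following pure-Python one-hidden-layer perceptron is trained by gradient descent on a toy impedance-to-porosity mapping; the data, architecture, and hyperparameters are illustrative assumptions, not the study's network:

```python
import math
import random

def train_tiny_mlp(data, hidden=4, epochs=300, lr=0.1, seed=0):
    """One-hidden-layer perceptron trained by per-sample gradient descent
    (a toy stand-in for the paper's MLP)."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]  # input -> hidden weights
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]  # hidden -> output weights
    b2 = 0.0

    def forward(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return h, sum(w2[j] * h[j] for j in range(hidden)) + b2

    def loss():
        return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

    initial = loss()
    for _ in range(epochs):
        for x, y in data:
            h, out = forward(x)
            err = out - y
            for j in range(hidden):
                grad_h = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * h[j]
                b1[j] -= lr * grad_h
                w1[j] -= lr * grad_h * x
            b2 -= lr * err
    return initial, loss()

# Toy mapping: higher acoustic impedance -> lower porosity (illustrative only).
data = [(z / 10.0, 0.4 - 0.03 * z) for z in range(10)]
before, after = train_tiny_mlp(data)
```

In the study's workflow this regression step maps inverted acoustic impedance to porosity; the toy above only shows the mechanics of the fit.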
Côté, L.; Clavet, D.; St-Hilaire, S.; Vaillancourt, C.; Blondeau, F.; Martineau, B.
1999-01-01
PROBLEM ADDRESSED: In addition to clinical instruction, residents need "people" skills that will enable them to deal with all sorts of patients in difficult clinical situations. We planned a series of 12 seminars to teach these skills to first-year residents. OBJECTIVES OF PROGRAM: To ask relevant questions typical of the patient-centred approach; with empathy and respect, to encourage patients to express their emotions; to become more aware of one's own emotions and reactions in one's work as a physician; to negotiate with patients, taking into account both the patient's agenda and one's own. MAIN COMPONENTS OF PROGRAM: Clinical problems drawn from a list of situations likely to involve difficult contact with patients were used to achieve program objectives. Various teaching methods (discussion, brief presentation, practical demonstration, role play) were used during the four stages of skills development: information, demonstration, practice, and feedback. Various tools were used to test the program. CONCLUSION: Proper planning requires ongoing exploration of objectives, content, teaching methods, and evaluation. This discussion of the teaching principles applied in planning our seminars might inspire others to develop similar programs. PMID:10349069
Resource-aware taxon selection for maximizing phylogenetic diversity.
Pardi, Fabio; Goldman, Nick
2007-06-01
Phylogenetic diversity (PD) is a useful metric for selecting taxa in a range of biological applications, for example, bioconservation and genomics, where the selection is usually constrained by the limited availability of resources. We formalize taxon selection as a conceptually simple optimization problem, aiming to maximize PD subject to resource constraints. This allows us to take into account the different amounts of resources required by the different taxa. Although this is a computationally difficult problem, we present a dynamic programming algorithm that solves it in pseudo-polynomial time. Our algorithm can also solve many instances of the Noah's Ark Problem, a more realistic formulation of taxon selection for biodiversity conservation that allows for taxon-specific extinction risks. These instances extend the set of problems for which solutions are available beyond previously known greedy-tractable cases. Finally, we discuss the relevance of our results to real-life scenarios.
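The pseudo-polynomial dynamic program can be sketched as a 0/1-knapsack-style recursion under the simplifying assumption that each taxon contributes an independent diversity value; on a real phylogeny, PD contributions are shared along branches, so this only illustrates the budgeted-selection idea, not the paper's tree algorithm:

```python
def max_diversity(costs, values, budget):
    """Pseudo-polynomial DP: maximize total diversity value subject to a
    resource budget (0/1 knapsack over taxa)."""
    n = len(costs)
    best = [0.0] * (budget + 1)          # best[b] = max value within budget b
    choice = [[False] * (budget + 1) for _ in range(n)]
    for i in range(n):
        for b in range(budget, costs[i] - 1, -1):  # reverse: each taxon used once
            if best[b - costs[i]] + values[i] > best[b]:
                best[b] = best[b - costs[i]] + values[i]
                choice[i][b] = True
    # Recover the selected taxa by walking the choice table backwards.
    selected, b = [], budget
    for i in range(n - 1, -1, -1):
        if choice[i][b]:
            selected.append(i)
            b -= costs[i]
    return best[budget], sorted(selected)

costs = [4, 3, 2, 5]            # conservation cost per taxon (illustrative)
values = [7.0, 4.0, 3.0, 6.0]   # diversity contribution per taxon (illustrative)
total, taxa = max_diversity(costs, values, budget=7)
# Within budget 7, taxa 0 and 1 (cost 4+3) give the best total value, 11.0
```

The running time is O(n · budget), pseudo-polynomial because it scales with the numeric budget rather than its bit length, which matches the complexity class the paper reports.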
Data parallel sorting for particle simulation
NASA Technical Reports Server (NTRS)
Dagum, Leonardo
1992-01-01
Sorting on a parallel architecture is a communications-intensive event which can incur a high penalty in applications where it is required. In the case of particle simulation, only integer sorting is necessary, and sequential implementations easily attain the minimum performance bound of O(N) for N particles. Parallel implementations, however, have to cope with the parallel sorting problem which, in addition to incurring a heavy communications cost, can make the minimum performance bound difficult to attain. This paper demonstrates how the sorting problem in a particle simulation can be reduced to a merging problem, and describes an efficient data parallel algorithm to solve this merging problem in a particle simulation. The new algorithm is shown to be optimal under conditions usual for particle simulation, and its fieldwise implementation on the Connection Machine is analyzed in detail. The new algorithm is about four times faster than a fieldwise implementation of radix sort on the Connection Machine.
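The O(N) sequential bound for integer sorting can be illustrated with a counting sort over particle cell indices; this is a generic sketch of the sequential case, not the paper's Connection Machine algorithm:

```python
def counting_sort_particles(cell_ids, n_cells):
    """O(N) integer sort of particles by cell index -- the sequential bound
    the paper cites.  Returns particle indices grouped by cell, the order a
    particle simulation needs for per-cell interactions."""
    counts = [0] * n_cells
    for c in cell_ids:
        counts[c] += 1
    starts = [0] * n_cells            # prefix sums give each cell's start offset
    for c in range(1, n_cells):
        starts[c] = starts[c - 1] + counts[c - 1]
    order = [0] * len(cell_ids)
    offsets = list(starts)
    for particle, c in enumerate(cell_ids):
        order[offsets[c]] = particle  # stable placement, preserving input order
        offsets[c] += 1
    return order

cells = [3, 1, 0, 3, 2, 1, 0]         # cell index of each of 7 particles
order = counting_sort_particles(cells, 4)
# order lists particle indices so their cell ids appear in non-decreasing order
```

Parallelizing this is the hard part: the counting and scatter phases become communication-heavy, which is what motivates the paper's reduction to a merging problem.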
To Cooperate or Not to Cooperate: Why Behavioural Mechanisms Matter
2016-01-01
Mutualistic cooperation often requires multiple individuals to behave in a coordinated fashion. Hence, while the evolutionary stability of mutualistic cooperation poses no particular theoretical difficulty, its evolutionary emergence faces a chicken and egg problem: an individual cannot benefit from cooperating unless other individuals already do so. Here, we use evolutionary robotic simulations to study the consequences of this problem for the evolution of cooperation. In contrast with standard game-theoretic results, we find that the transition from solitary to cooperative strategies is very unlikely, whether interacting individuals are genetically related (cooperation evolves in 20% of all simulations) or unrelated (only 3% of all simulations). We also observe that successful cooperation between individuals requires the evolution of a specific and rather complex behaviour. This behavioural complexity creates a large fitness valley between solitary and cooperative strategies, making the evolutionary transition difficult. These results reveal the need for research on biological mechanisms which may facilitate this transition. PMID:27148874
The Complexity of Folding Self-Folding Origami
NASA Astrophysics Data System (ADS)
Stern, Menachem; Pinson, Matthew B.; Murugan, Arvind
2017-10-01
Why is it difficult to refold a previously folded sheet of paper? We show that even crease patterns with only one designed folding motion inevitably contain an exponential number of "distractor" folding branches accessible from a bifurcation at the flat state. Consequently, refolding a sheet requires finding the ground state in a glassy energy landscape with an exponential number of other attractors of higher energy, much like in models of protein folding (Levinthal's paradox) and other NP-hard satisfiability (SAT) problems. As in these problems, we find that refolding a sheet requires actuation at multiple carefully chosen creases. We show that seeding successful folding in this way can be understood in terms of subpatterns that fold when cut out ("folding islands"). Besides providing guidelines for the placement of active hinges in origami applications, our results point to fundamental limits on the programmability of energy landscapes in sheets.
Design, construction, and testing of a high altitude research glider
NASA Astrophysics Data System (ADS)
Parker, Trevor Llewellyn
Micro aerial vehicle development and atmospheric flight on Mars are areas that require research in very low Reynolds number flight. Facilities for studying these problems are not widely available. The upper atmosphere of the Earth, approximately 100,000 feet AGL, is readily available and closely resembles the atmosphere on Mars, in both temperature and density. This low density also allows normal size test geometry with a very low Reynolds number. This solves a problem in micro aerial vehicle development; it can be very difficult to manufacture instrumented test apparatus in the small sizes required for conventional testing. This thesis documents the design, construction, and testing of a glider designed to be released from a weather balloon at 100,000 feet AGL and operate in this environment, collecting airfoil and aircraft performance data. The challenges of designing a vehicle to operate in a low Reynolds number, low temperature environment are addressed.
Zhang, H H; Gao, S; Chen, W; Shi, L; D'Souza, W D; Meyer, R R
2013-03-21
An important element of radiation treatment planning for cancer therapy is the selection of beam angles (out of all possible coplanar and non-coplanar angles in relation to the patient) in order to maximize the delivery of radiation to the tumor site and minimize radiation damage to nearby organs-at-risk. This category of combinatorial optimization problem is particularly difficult because direct evaluation of the quality of treatment corresponding to any proposed selection of beams requires the solution of a large-scale dose optimization problem involving many thousands of variables that represent doses delivered to volume elements (voxels) in the patient. However, if the quality of angle sets can be accurately estimated without expensive computation, a large number of angle sets can be considered, increasing the likelihood of identifying a very high quality set. Using a computationally efficient surrogate beam set evaluation procedure based on single-beam data extracted from plans employing equally-spaced beams (eplans), we have developed a global search metaheuristic process based on the nested partitions framework for this combinatorial optimization problem. The surrogate scoring mechanism allows us to assess thousands of beam set samples within a clinically acceptable time frame. Tests on difficult clinical cases demonstrate that the beam sets obtained via our method are of superior quality.
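As a rough illustration of the surrogate idea (with hypothetical scores, not the authors' eplan-derived data), approximating a beam set's quality as the sum of precomputed single-beam scores makes it cheap to evaluate thousands of candidate sets:

```python
import random

def surrogate_score(beam_set, single_beam_score):
    # Surrogate: plan quality approximated as the sum of precomputed
    # single-beam scores, standing in for the full dose optimization.
    return sum(single_beam_score[b] for b in beam_set)

def sample_beam_sets(angles, k, n_samples, single_beam_score, rng):
    # Evaluate many random k-beam sets with the cheap surrogate and
    # keep the best one seen.
    best, best_score = None, float("-inf")
    for _ in range(n_samples):
        candidate = tuple(sorted(rng.sample(angles, k)))
        score = surrogate_score(candidate, single_beam_score)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

rng = random.Random(0)
angles = list(range(0, 360, 10))            # candidate coplanar gantry angles
scores = {a: rng.random() for a in angles}  # hypothetical single-beam quality
best_set, best = sample_beam_sets(angles, 5, 2000, scores, rng)
```

The paper's nested-partitions search is more structured than this blind sampling, but the speed advantage comes from the same place: the surrogate replaces a large-scale dose optimization per candidate.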
Skinner, D; Hesseling, A C; Francis, C; Mandalakas, A M
2013-09-21
Isoniazid preventive therapy (IPT) offers children protection against tuberculosis (TB), but it has been difficult to implement, particularly in developing countries. The aim of this study was to understand what encourages or inhibits children from adhering to IPT. In-depth interviews were conducted with two parents of children adherent to IPT and two staff members from three primary health care clinics in high TB prevalence communities. Themes explored were knowledge and attitudes towards IPT, problems in accessing and adhering to treatment, and community responses. Parents administering treatment valued it positively, realised their children's risk of TB, and were positive about the clinic. Nurses acknowledged that resistance to treatment remained, with some parents not wanting to acknowledge risk or to make the effort for their children; there was also considerable misinformation about IPT. Clinic nurses acknowledged problems of staff shortages, lengthy waiting times and conflict between staff and community members. Adherence was affected by social problems, stigma about TB and its link to the human immunodeficiency virus, and the extended treatment period. Parents who maintained adherence to the IPT regimen showed that it was possible even in very difficult circumstances. Further effort is required to improve some of the clinic services, correct misinformation, reduce stigma and provide support to parents.
Difficult incidents and tutor interventions in problem-based learning tutorials.
Kindler, Pawel; Grant, Christopher; Kulla, Steven; Poole, Gary; Godolphin, William
2009-09-01
Tutors report difficult incidents and distressing conflicts that adversely affect learning in their problem-based learning (PBL) groups. Faculty development (training) and peer support should help them to manage this. Yet our understanding of these problems and how to deal with them often seems inadequate to help tutors. The aim of this study was to categorise difficult incidents and the interventions that skilled tutors used in response, and to determine the effectiveness of those responses. Thirty experienced and highly rated tutors in our Year 1 and 2 medical curriculum took part in semi-structured interviews to: identify and describe difficult incidents; describe how they responded, and assess the success of each response. Recorded and transcribed data were analysed thematically to develop typologies of difficult incidents and interventions and compare reported success or failure. The 94 reported difficult incidents belonged to the broad categories 'individual student' or 'group dynamics'. Tutors described 142 interventions in response to these difficult incidents, categorised as: (i) tutor intervenes during tutorial; (ii) tutor gives feedback outside tutorial, or (iii) student or group intervenes. Incidents in the 'individual student' category were addressed relatively unsuccessfully (effective < 50% of the time) by response (i), but with moderate success by response (ii) and successfully (> 75% of the time) by response (iii). None of the interventions worked well when used in response to problems related to 'group dynamics'. Overall, 59% of the difficult incidents were dealt with successfully. Dysfunctional PBL groups can be highly challenging, even for experienced and skilled tutors. Within-tutorial feedback, the treatment that tutors are most frequently advised to apply, was often not effective. Our study suggests that the collective responsibility of the group, rather than of the tutor, to deal with these difficulties should be emphasised.
A Novel Partnership Disrupts the Norm in Early Childhood Education and Pediatric Health Care.
Dowd, M Denise; Lantos, John D
2017-09-01
Children living in poverty in the United States in 2016 face a devastating combination of psychological problems. Their neighborhoods are often violent. They have no place to get healthy food. It is not safe to play outside, even on playgrounds. The children who grow up in this environment, not surprisingly, have many adverse childhood experiences (ACEs). ACEs cause toxic stress. Toxic stress leads to long-term physical and psychological problems. For many pediatricians, children's hospitals, civic leaders, and public health officials, it is difficult to know how to intervene. While the science on causation is indisputable, there are fewer data about treatment. We know that intervention should start early, but the types of interventions that are being proposed require extensive collaboration between social services, health care, and education. Such collaborations require a new sort of cooperation among professionals in disciplines that have not traditionally worked closely together. But they need to start. No one group will be able to solve this problem. This issue of Current Problems in Pediatric and Adolescent Health Care essentially provides a case study of one community's attempt to develop such a collaboration. Copyright © 2017 Mosby, Inc. All rights reserved.
Hartmann, Klaas; Steel, Mike
2006-08-01
The Noah's Ark Problem (NAP) is a comprehensive cost-effectiveness methodology for biodiversity conservation that was introduced by Weitzman (1998) and utilizes the phylogenetic tree containing the taxa of interest to assess biodiversity. Given a set of taxa, each of which has a particular survival probability that can be increased at some cost, the NAP seeks to allocate limited funds to conserving these taxa so that the future expected biodiversity is maximized. Finding optimal solutions using this framework is a computationally difficult problem to which a simple and efficient "greedy" algorithm has been proposed in the literature and applied to conservation problems. We show that, although algorithms of this type cannot produce optimal solutions for the general NAP, there are two restricted scenarios of the NAP for which a greedy algorithm is guaranteed to produce optimal solutions. The first scenario requires the taxa to have equal conservation cost; the second scenario requires an ultrametric tree. The NAP assumes a linear relationship between the funding allocated to conservation of a taxon and the increased survival probability of that taxon. This relationship is briefly investigated and one variation is suggested that can also be solved using a greedy algorithm.
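A toy version of the equal-cost scenario can show what the greedy algorithm does. This sketch assumes a star phylogeny (every taxon on its own branch), which is a deliberate simplification: on general trees the marginal gains depend on edges shared between taxa, which is exactly where unrestricted greedy approaches can fail. Expected diversity is sum_i branch[i] * survival[i], and each unit of budget raises one taxon's survival probability to 1 (the linear-cost assumption):

```python
def greedy_nap(branch, survival, budget):
    # Greedy: fully protect the taxa with the largest marginal gain,
    # branch[i] * (1 - survival[i]).
    order = sorted(range(len(branch)),
                   key=lambda i: branch[i] * (1 - survival[i]),
                   reverse=True)
    protected = set(order[:budget])
    expected = sum(branch[i] * (1.0 if i in protected else survival[i])
                   for i in range(len(branch)))
    return protected, expected

# Three taxa with branch lengths 3, 2, 1 and survival probabilities
# 0.5, 0.1, 0.9; budget for protecting one taxon.
protected, expected = greedy_nap([3.0, 2.0, 1.0], [0.5, 0.1, 0.9], budget=1)
```

Here the greedy choice is the second taxon (gain 2.0 * 0.9 = 1.8), even though the first has the longest branch.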
Traumatic injury to the portal vein.
Mattox, K L; Espada, R; Beall, A R
1975-01-01
Traumatic injuries to the upper abdominal vasculature pose difficult management problems related to both exposure and associated injuries. Among those injuries that are more difficult to manage are those involving the portal vein. While occurring rarely, portal vein injuries require specific therapeutic considerations. Between January, 1968, and July, 1974, over 2000 patients were treated operatively for abdominal trauma at the Ben Taub General Hospital. Among these patients, 22 had injury to the portal vein. Seventeen portal vein injuries were secondary to gunshot wounds, 3 to stab wounds, and 2 to blunt trauma. Associated injuries to the inferior vena cava, pancreas, liver and bile ducts were common. Three patients had associated abdominal aortic injuries, two with acute aorto-caval fistulae. Nine patients died from failure to control hemorrhage. Eleven were long-term survivors, including two who required pancreatico-duodenectomy as well as portal venorrhaphy. Late complications were rare. The operative approach to patients with traumatic injuries to multiple organs in the upper abdomen, including the portal vein, requires aggressive management and predetermined sequential methods of repair. In spite of innumerable associated injuries, portal vein injuries can be successfully managed in a significant number of patients using generally available surgical techniques and several adjunctive maneuvers. PMID:1130870
Modeling visual problem solving as analogical reasoning.
Lovett, Andrew; Forbus, Kenneth
2017-01-01
We present a computational model of visual problem solving, designed to solve problems from the Raven's Progressive Matrices intelligence test. The model builds on the claim that analogical reasoning lies at the heart of visual problem solving, and intelligence more broadly. Images are compared via structure mapping, aligning the common relational structure in 2 images to identify commonalities and differences. These commonalities or differences can themselves be reified and used as the input for future comparisons. When images fail to align, the model dynamically rerepresents them to facilitate the comparison. In our analysis, we find that the model matches adult human performance on the Standard Progressive Matrices test, and that problems which are difficult for the model are also difficult for people. Furthermore, we show that model operations involving abstraction and rerepresentation are particularly difficult for people, suggesting that these operations may be critical for performing visual problem solving, and reasoning more generally, at the highest level. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Physics-based approach to color image enhancement in poor visibility conditions.
Tan, K K; Oakley, J P
2001-10-01
Degradation of images by the atmosphere is a familiar problem. For example, when terrain is imaged from a forward-looking airborne camera, the atmosphere degradation causes a loss in both contrast and color information. Enhancement of such images is a difficult task because of the complexity in restoring both the luminance and the chrominance while maintaining good color fidelity. One particular problem is the fact that the level of contrast loss depends strongly on wavelength. A novel method is presented for the enhancement of color images. This method is based on the underlying physics of the degradation process, and the parameters required for enhancement are estimated from the image itself.
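A common physics-based degradation model for imaging through the atmosphere is observed = scene * t + airlight * (1 - t), where the transmission t depends on wavelength, so each color channel loses a different amount of contrast. This sketch inverts that model with t and the airlight assumed known; it is illustrative and not necessarily the authors' exact formulation, and the paper's contribution is precisely that the required parameters are estimated from the image itself:

```python
def restore(observed, airlight, transmission):
    # Invert observed = scene * t + airlight * (1 - t), channel by channel.
    return [(o - airlight * (1 - t)) / t
            for o, t in zip(observed, transmission)]

scene = [0.2, 0.5, 0.8]   # hypothetical RGB pixel of the true terrain
t = [0.6, 0.7, 0.9]       # per-channel transmission (blue degraded most)
A = 1.0                   # airlight radiance
observed = [s * ti + A * (1 - ti) for s, ti in zip(scene, t)]
recovered = restore(observed, A, t)
```

With exact parameters the inversion recovers the scene values; in practice, errors in the estimated t and airlight propagate into color fidelity, which is why the wavelength dependence matters.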
Cycle life machine for AX-5 space suit
NASA Technical Reports Server (NTRS)
Schenberger, Deborah S.
1990-01-01
In order to accurately test the AX-5 space suit, a complex series of motions needed to be performed which provided a unique opportunity for mechanism design. The cycle life machine design showed how 3-D computer images can enhance mechanical design as well as help in visualizing mechanisms before manufacturing them. In the early stages of the design, potential problems in the motion of the joint and in the four bar linkage system were resolved using CAD. Since these problems would have been very difficult and tedious to solve on a drawing board, they would probably not have been addressed prior to fabrication, thus limiting the final design or requiring design modification after fabrication.
A Comparative Study of Interferometric Regridding Algorithms
NASA Technical Reports Server (NTRS)
Hensley, Scott; Safaeinili, Ali
1999-01-01
The paper discusses regridding options: (1) The problem of interpolating data that are not sampled on a uniform grid, are noisy, and contain gaps is a difficult one. (2) Several interpolation algorithms have been implemented: (a) Nearest neighbor: fast and easy, but shows some artifacts in shaded relief images. (b) Simplicial interpolator: uses the plane going through the three points containing the point where interpolation is required; reasonably fast and accurate. (c) Convolutional: uses a windowed Gaussian approximating the optimal prolate spheroidal weighting function for a specified bandwidth. (d) First- or second-order surface fitting: uses the height data centered in a box about a given point and does a weighted least-squares surface fit.
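Option (a) is simple enough to sketch: each node of the uniform output grid takes the value of the closest scattered sample (a minimal illustration, not the paper's implementation, which would use spatial indexing rather than a linear scan):

```python
import math

def nearest_neighbor_grid(samples, grid_x, grid_y):
    # samples: iterable of (x, y, value) triples, irregularly spaced and
    # possibly gappy; returns one row of values per grid_y coordinate.
    out = []
    for gy in grid_y:
        row = []
        for gx in grid_x:
            # Pick the sample value at minimum Euclidean distance.
            _, v = min((math.hypot(x - gx, y - gy), v) for x, y, v in samples)
            row.append(v)
        out.append(row)
    return out

grid = nearest_neighbor_grid([(0, 0, 1.0), (1, 1, 2.0)], [0, 1], [0, 1])
```

The blocky artifacts mentioned in the abstract come from the fact that this produces piecewise-constant output with discontinuities at cell boundaries, which shaded-relief rendering makes very visible.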
Dynamic Control of Plans with Temporal Uncertainty
NASA Technical Reports Server (NTRS)
Morris, Paul; Muscettola, Nicola; Vidal, Thierry
2001-01-01
Certain planning systems that deal with quantitative time constraints have used an underlying Simple Temporal Problem solver to ensure temporal consistency of plans. However, many applications involve processes of uncertain duration whose timing cannot be controlled by the execution agent. These cases require more complex notions of temporal feasibility. In previous work, various "controllability" properties such as Weak, Strong, and Dynamic Controllability have been defined. The most interesting and useful Controllability property, the Dynamic one, has ironically proved to be the most difficult to analyze. In this paper, we resolve the complexity issue for Dynamic Controllability. Unexpectedly, the problem turns out to be tractable. We also show how to efficiently execute networks whose status has been verified.
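The underlying Simple Temporal Problem machinery can be sketched briefly: each constraint lo <= t_j - t_i <= hi becomes a pair of weighted edges, and the network is consistent iff its distance graph has no negative cycle. This checks plain consistency only; controllability analysis builds on the same graph but must additionally handle durations the agent cannot control. (Illustrative sketch using Floyd-Warshall, not the paper's algorithm.)

```python
def stn_consistent(n, constraints):
    # constraints: list of (i, j, lo, hi) meaning lo <= t_j - t_i <= hi
    INF = float("inf")
    d = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, lo, hi in constraints:
        d[i][j] = min(d[i][j], hi)    # edge i -> j: t_j - t_i <= hi
        d[j][i] = min(d[j][i], -lo)   # edge j -> i: t_i - t_j <= -lo
    for k in range(n):                # Floyd-Warshall all-pairs closure
        for a in range(n):
            for b in range(n):
                if d[a][k] + d[k][b] < d[a][b]:
                    d[a][b] = d[a][k] + d[k][b]
    # A negative self-distance signals a negative cycle: inconsistent.
    return all(d[i][i] >= 0 for i in range(n))
```

For example, requiring t1 - t0 = 2, t2 - t1 = 2, and t2 - t0 <= 1 forces a negative cycle, so the network is inconsistent.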
Sustainer electric propulsion system application for spacecraft attitude control
NASA Astrophysics Data System (ADS)
Obukhov, V. A.; Pokryshkin, A. I.; Popov, G. A.; Yashina, N. V.
2010-07-01
Application of an electric propulsion system (EPS) requires equipping the spacecraft (SC) with large solar panels (SP) to supply power to the electric thrusters. This makes the problem of controlling an EPS-equipped SC at the insertion stage more difficult than for a SC equipped with chemical engines: in addition to the attitude control required by the mission, the SP must be kept oriented toward the Sun to generate electric power sufficient for the operation of the service systems, the purpose-oriented equipment, and the EPS. The theoretical study of this control problem is most interesting for a non-coplanar transfer from a high elliptic orbit (HEO) to geostationary orbit (GSO).
Major Thought Restructuring: The Roles of Different Prefrontal Cortical Regions.
Seyed-Allaei, Shima; Avanaki, Zahra Nasiri; Bahrami, Bahador; Shallice, Tim
2017-07-01
An important question for understanding the neural basis of problem solving is whether the regions of human prefrontal cortices play qualitatively different roles in the major cognitive restructuring required to solve difficult problems. However, investigating this question using neuroimaging faces a major dilemma: either the problems do not require major cognitive restructuring, or if they do, the restructuring typically happens once, rendering repeated measurements of the critical mental process impossible. To circumvent these problems, young adult participants were challenged with a one-dimensional Subtraction (or Nim) problem [Bouton, C. L. Nim, a game with a complete mathematical theory. The Annals of Mathematics, 3, 35-39, 1901] that can be tackled using two possible strategies. One, often used initially, is effortful, slow, and error-prone, whereas the abstract solution, once achieved, is easier, quicker, and more accurate. Behaviorally, success was strongly correlated with sex. Using voxel-based morphometry analysis controlling for sex, we found that participants who found the more abstract strategy (i.e., Solvers) had more gray matter volume in the anterior medial, ventrolateral prefrontal, and parietal cortices compared with those who never switched from the initial effortful strategy (i.e., Explorers). Removing the sex covariate showed higher gray matter volume in Solvers (vs. Explorers) in the right ventrolateral prefrontal and left parietal cortex.
Holmberg, Leif
2007-11-01
A health-care organization simultaneously belongs to two different institutional value patterns: a professional and an administrative value pattern. At the administrative level, medical problem-solving processes are generally perceived as the efficient application of familiar chains of activities to well-defined problems; and a low task uncertainty is therefore assumed at the work-floor level. This assumption is further reinforced through clinical pathways and other administrative guidelines. However, studies have shown that in clinical practice such administrative guidelines are often considered inadequate and difficult to implement mainly because physicians generally perceive task uncertainty to be high and that the guidelines do not cover the scope of encountered deviations. The current administrative level guidelines impose uniform structural features that meet the requirement for low task uncertainty. Within these structural constraints, physicians must organize medical problem-solving processes to meet any task uncertainty that may be encountered. Medical problem-solving processes with low task uncertainty need to be organized independently of processes with high task uncertainty. Each process must be evaluated according to different performance standards and needs to have autonomous administrative guideline models. Although clinical pathways seem appropriate when there is low task uncertainty, other kinds of guidelines are required when the task uncertainty is high.
Patient safety and the problem of many hands
Dixon-Woods, Mary; Pronovost, Peter
2016-01-01
Healthcare worldwide is faced with a crisis of patient safety: every day, everywhere, patients are injured during the course of their care. Notwithstanding occasional successes in relation to specific harms, safety as a system characteristic has remained elusive. We propose that one neglected reason why the safety problem has proved so stubborn is that healthcare suffers from a pathology known in the public administration literature as the problem of many hands. It is a problem that arises in contexts where multiple actors – organisations, individuals, groups – each contribute to effects seen at system level, but it remains difficult to hold any single actor responsible for these effects. Efforts by individual actors, including local quality improvement projects, may have the paradoxical effect of undermining system safety. Many challenges cannot be resolved by individual organisations, since they require whole-sector coordination and action. We call for recognition of the problem of many hands and for attention to be given to how it might most optimally be addressed in a healthcare context. PMID:26912578
On Complex Water Conflicts: Role of Enabling Conditions for Pragmatic Resolution
NASA Astrophysics Data System (ADS)
Islam, S.; Choudhury, E.
2016-12-01
Many of our current and emerging water problems are interconnected and cross boundaries, domains, scales, and sectors. These boundary-crossing water problems are neither static nor linear; they are often interconnected nonlinearly with other problems through feedbacks. The solution space for these complex problems - involving interdependent variables, processes, actors, and institutions - cannot be pre-stated. We need to recognize the disconnects among values, interests, and tools, as well as among problems, policies, and politics. Scientific and technological solutions are desired for efficiency and reliability, but they need to be politically feasible and actionable. Governing and managing complex water problems require difficult tradeoffs in exploring and sharing benefits and burdens through carefully crafted negotiation processes. The crafting of such negotiation processes, we argue, constitutes a pragmatic approach to negotiation - one that is based on the identification of enabling conditions, as opposed to mechanistic causal explanations, and is rooted in contextual conditions to specify and ensure the principles of equity and sustainability. We use two case studies to demonstrate the efficacy of the proposed principled pragmatic approach to addressing complex water problems.
Graph pyramids as models of human problem solving
NASA Astrophysics Data System (ADS)
Pizlo, Zygmunt; Li, Zheng
2004-05-01
Prior theories have assumed that human problem solving involves estimating distances among states and performing search through the problem space. The role of mental representation in those theories was minimal. Results of our recent experiments suggest that humans are able to solve some difficult problems quickly and accurately. Specifically, in solving these problems humans do not seem to rely on distances or on search. It is quite clear that producing good solutions without performing search requires a very effective mental representation. In this paper we concentrate on studying the nature of this representation. Our theory takes the form of a graph pyramid. To verify the psychological plausibility of this theory we tested subjects in a Euclidean Traveling Salesman Problem in the presence of obstacles. The role of the number and size of obstacles was tested for problems with 6-50 cities. We analyzed the effect of experimental conditions on solution time per city and on solution error. The main result is that time per city is systematically affected only by the size of obstacles, but not by their number, or by the number of cities.
Evidence-based ergonomics: a model and conceptual structure proposal.
Silveira, Dierci Marcio
2012-01-01
In Human Factors and Ergonomics Science (HFES), it is difficult to identify the best approach to the workplace and systems design problems that need to be solved, and the question "How to solve the human factors and ergonomics problems that are identified?" has also been advocated as transdisciplinary and multidisciplinary. The proposition of this study is to combine the theoretical approach of Sustainability Science, the taxonomy of the Human Factors and Ergonomics (HFE) discipline, and the framework of Evidence-Based Medicine, in an attempt to apply them to Human Factors and Ergonomics. Applications of ontologies are known in the fields of medical research and computer science. By scrutinizing the key requirements for the structuring of HFES knowledge, a reference model was designed. First, the important requirements for HFES concept structuring, as regarded by Meister, were identified. Second, an evidence-based ergonomics framework was developed as a reference model composed of six levels based on these requirements. Third, a mapping tool using linguistic resources was devised to translate human work, systems environments, and the complexities inherent in their hierarchical relationships, in order to support future development at Level 2 of the reference model and to meet the two major challenges for HFES: identifying what problems should be addressed in HFE as an autonomous science itself, and proposing solutions by integrating concepts and methods applied in HFES for those problems.
Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.
DiMaio, Frank
2017-01-01
Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.
The Lucky Seventh in the Bulge: A Case Study for the Airland Battle
1985-05-15
decidedly proactive and adherence to it requires commanders and soldiers to get inside of the enemy force and preclude them from acting as they would like...Struggle for Europe: World War II in Western Europe (New York: Harper and Row, 1952); * Charles Whiting, Death of a Division (New York: Stein and Day...Hoge, in the absence of any clear delineation of responsibility. The problem of who was in command made things difficult, but Hasbrouck acted as he saw
Enterocutaneous fistulas: an overview.
Whelan, J F; Ivatury, R R
2011-06-01
Enterocutaneous fistulas remain a difficult management problem. The basis of management centers on the prevention and treatment of sepsis, control of fistula effluent, and fluid and nutritional support. Early surgery should be limited to abscess drainage and proximal defunctioning stoma formation. Definitive procedures for a persistent fistula are indicated in the late postoperative period, with resection of the fistula segment and reanastomosis of healthy bowel. Even more complex are the enteroatmospheric fistulas in the open abdomen. These enteric fistulas require the highest level of multidisciplinary approach for optimal outcomes.
[Ethical questions related to nutrition and hydration: basic aspects].
Collazo Chao, E; Girela, E
2011-01-01
Conditions that pose ethical problems related to nutrition and hydration are very common nowadays, particularly in hospitals among terminally ill patients and other patients who require nutrition and hydration. In this article we intend to analyze some circumstances, according to widely accepted ethical values, in order to outline a clear action model to help clinicians make such difficult decisions. The problematic situations analyzed include whether hydration and nutrition should be considered basic care or therapeutic measures, and the ethical aspects of enteral versus parenteral nutrition.
Methods for Multiplex Template Sampling in Digital PCR Assays
Petriv, Oleh I.; Heyries, Kevin A.; VanInsberghe, Michael; Walker, David; Hansen, Carl L.
2014-01-01
The efficient use of digital PCR (dPCR) for precision copy number analysis requires high concentrations of target molecules that may be difficult or impossible to obtain from clinical samples. To solve this problem we present a strategy, called Multiplex Template Sampling (MTS), that effectively increases template concentrations by detecting multiple regions of fragmented target molecules. Three alternative assay approaches are presented for implementing MTS analysis of chromosome 21, providing a 10-fold concentration enhancement while preserving assay precision. PMID:24854517
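The arithmetic behind the enhancement can be sketched with the standard digital PCR Poisson model (illustrative numbers, not the authors' data or protocol): partition occupancy gives the template concentration, and if each fragmented target presents m independently detectable regions, the countable template concentration rises roughly m-fold.

```python
import math

def copies_per_partition(fraction_positive):
    # Standard dPCR estimate: with targets distributed randomly over
    # partitions, the mean copy number per partition is -ln(1 - f),
    # where f is the fraction of positive partitions.
    return -math.log(1.0 - fraction_positive)

targets = 5.0             # hypothetical targets per partition volume
regions_per_target = 10   # e.g. ten assayed regions per fragmented target
effective = targets * regions_per_target   # ~10-fold enhancement
```

This is why detecting multiple regions helps precisely when the clinical sample is too dilute for single-locus counting to reach the occupancy range where the Poisson estimate is precise.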
Fabrication Of Metal Chloride Cathodes By Sintering
NASA Technical Reports Server (NTRS)
Bugga, Ratnakumar V.; Di Stefano, Salvador; Bankston, C. Perry
1992-01-01
Transition-metal chloride cathodes for use in high-temperature rechargeable sodium batteries prepared by sintering transition-metal powders mixed with sodium chloride. Need for difficult and dangerous chlorination process eliminated. Proportions of transition metal and sodium chloride in mixture adjusted to suit specific requirements. Cathodes integral to sodium/metal-chloride batteries, which have advantages over sodium/sulfur batteries including energy densities, increased safety, reduced material and thermal-management problems, and ease of operation and assembly. Being evaluated for supplying electrical power during peak demand and electric vehicles.
Collisional breakup in a quantum system of three charged particles
Rescigno; Baertschy; Isaacs; McCurdy
1999-12-24
Since the invention of quantum mechanics, even the simplest example of the collisional breakup of a system of charged particles, e(-) + H --> H(+) + e(-) + e(-) (where e(-) is an electron and H is hydrogen), has resisted solution and is now one of the last unsolved fundamental problems in atomic physics. A complete solution requires calculation of the energies and directions for a final state in which all three particles are moving away from each other. Even with supercomputers, the correct mathematical description of this state has proved difficult to apply. A framework for solving ionization problems in many areas of chemistry and physics is finally provided by a mathematical transformation of the Schrodinger equation that makes the final state tractable, providing the key to a numerical solution of this problem that reveals its full dynamics.
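The transformation in question is widely identified in the literature as exterior complex scaling (an attribution from related work by these authors, not stated in this abstract): beyond a radius $R_0$ the radial coordinate is rotated into the complex plane,

```latex
r \;\mapsto\;
\begin{cases}
r, & r \le R_0,\\[2pt]
R_0 + (r - R_0)\,e^{i\eta}, & r > R_0,
\end{cases}
```

so that outgoing waves $e^{ikr}$ decay exponentially for $r > R_0$. This removes the need to impose the intractable asymptotic boundary condition on the three-body final state and makes the problem solvable on a finite numerical grid.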
Optimal routing of IP packets to multi-homed servers
NASA Astrophysics Data System (ADS)
Swartz, K. L.
1992-08-01
Multi-homing, or direct attachment to multiple networks, offers both performance and availability benefits for important servers on busy networks. Exploiting these benefits to their fullest requires a modicum of routing knowledge in the clients. Careful policy control must also be reflected in the routing used within the network to make best use of specialized and often scarce resources. While relatively straightforward in theory, this problem becomes much more difficult to solve in a real network containing often intractable implementations from a variety of vendors. This paper presents an analysis of the problem and proposes a useful solution for a typical campus network. Application of this solution at the Stanford Linear Accelerator Center is studied and the problems and pitfalls encountered are discussed, as are the workarounds used to make the system work in the real world.
Urban Mathematics Teacher Retention
ERIC Educational Resources Information Center
Hamdan, Kamal
2010-01-01
Mathematics teachers are both more difficult to attract and more difficult to retain than social sciences teachers. This fact is not unique to the United States; it is reported as a problem in Europe as well (Howson, 2002). In the United States, however, the problem is particularly acute. Because of the chronic teacher shortages and…
Baccalaureate Student Perceptions of Challenging Family Problems: Building Bridges to Acceptance
ERIC Educational Resources Information Center
Floyd, Melissa; Gruber, Kenneth J.
2011-01-01
This study explored the attitudes of 147 undergraduate social work majors to working with difficult families. Students indicated which problems (from a list of 42, including hot topics such as homosexuality, transgender issues, abortion, and substance abuse) they believed they would find most difficult to work with and provided information…
NASA Astrophysics Data System (ADS)
Xing, Xi; Rey-de-Castro, Roberto; Rabitz, Herschel
2014-12-01
Optimally shaped femtosecond laser pulses can often be effectively identified in adaptive feedback quantum control experiments, but elucidating the underlying control mechanism can be a difficult task requiring significant additional analysis. We introduce landscape Hessian analysis (LHA) as a practical experimental tool to aid in elucidating control mechanism insights. This technique is applied to the dissociative ionization of CH2BrI using shaped fs laser pulses for optimization of the absolute yields of ionic fragments as well as their ratios for the competing processes of breaking the C-Br and C-I bonds. The experimental results suggest that these nominally complex problems can be reduced to a low-dimensional control space with insights into the control mechanisms. While the optimal yield for some fragments is dominated by a non-resonant intensity-driven process, the optimal generation of other fragments may be explained by a non-resonant process coupled to few-level resonant dynamics. Theoretical analysis and modeling are consistent with the experimental observations.
Groin injuries in sport: treatment strategies.
Lynch, S A; Renström, P A
1999-08-01
Groin pain in athletes is a common problem that can result in significant amounts of missed playing time. Many of the problems are related to the musculoskeletal system, but care must be taken not to overlook other more serious and potentially life threatening medical causes of pelvic and groin pain. Stress fractures of the bones of the pelvis occur, particularly after a sudden increase in the intensity of training. Most of these stress fractures will heal with rest, but femoral neck stress fractures can potentially lead to more serious problems, and require closer evaluation and sometimes surgical treatment. Avulsion fractures of the apophyses occur through the relatively weaker growth plate in adolescents. Most of these will heal with a graduated physical therapy programme and do not need surgery. Osteitis pubis is characterised by sclerosis and bony changes about the pubic symphysis. This is a self-limiting disease that can take several months to resolve. Corticosteroid injection can sometimes hasten the rehabilitation process. Sports hernias can cause prolonged groin pain, and provide a difficult diagnostic dilemma. In athletes with prolonged groin pain, with increased pain during valsalva manoeuvres and tenderness along the posterior inguinal wall and external canal, an insidious sports hernia should be considered. In cases of true sports hernia, treatment is by surgical reinforcement of the inguinal wall. Nerve compression can occur to the nerves supplying the groin. In cases that do not respond to desensitisation measures, neurolysis can relieve the pain. Adductor strains are common problems in kicking sports such as soccer. The majority of these are incomplete muscle tendon tears that occur just adjacent to the musculotendinous junction. Most of these will respond to a graduated stretching and strengthening programme, but they can sometimes take a long time to completely heal.
Patience is the key to obtain complete healing, because a return to sports too early can lead to chronic pain, which becomes increasingly difficult to treat. Management of groin injuries can be challenging, and diagnosis can be difficult because of the degree of overlap of symptoms between the different problems. By careful history and clinical examination, with judicious use of special tests and good team work, a correct diagnosis can be obtained.
Asahi, Yoshinao; Omichi, Shiro; Ono, Takahiro
2015-09-01
Many stroke patients may have oral problems and systemic diseases, but clinical information on treatment provided to stroke patients for dental problems during inpatient rehabilitation is rare. The objective of this study was to research stroke inpatients' requirements for dental treatment and the accompanying risks. We included 165 stroke patients undergoing inpatient rehabilitation at Morinomiya Hospital during the year 2010 and researched the causes of stroke and the patients' orodental status, underlying diseases, antithrombotic drugs prescribed and special considerations or difficulties in the treatment. Cerebral infarction was the most common cause of stroke. Many patients had hypertension, heart disease or diabetes mellitus, and 54.5% had been prescribed antithrombotic drugs. Dentists diagnosed untreated dental cavities in 57.0% of patients. Approximately 30% did not use dentures despite needing them. In total, 142 patients underwent dental treatment including periodontal treatment, prosthetic treatment and tooth extraction under management of circulation and haemostasis, such as monitoring vital signs and using surgical splints in cases of difficult extraction. The current study revealed a high requirement for dental treatment among stroke patients and demonstrated the effectiveness of performing dental treatment during inpatient rehabilitation of these patients. © 2014 John Wiley & Sons A/S and The Gerodontology Society. Published by John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Hill, George B.; Sweeney, Joseph B.
2015-01-01
Reaction workup can be a complex problem for those facing novel synthesis of difficult compounds for the first time. Systematic thinking about workup problem solving should be inculcated by the time students reach mid-graduate level. A structured approach is proposed, building decision tree flowcharts to analyze challenges, and an exemplar flowchart is presented…
Distraction during learning with hypermedia: difficult tasks help to keep task goals on track
Scheiter, Katharina; Gerjets, Peter; Heise, Elke
2014-01-01
In educational hypermedia environments, students are often confronted with potential sources of distraction arising from additional information that, albeit interesting, is unrelated to their current task goal. The paper investigates the conditions under which distraction occurs and hampers performance. Based on theories of volitional action control it was hypothesized that interesting information, especially if related to a pending goal, would interfere with task performance only when working on easy, but not on difficult tasks. In Experiment 1, 66 students learned about probability theory using worked examples and solved corresponding test problems, whose task difficulty was manipulated. As a second factor, the presence of interesting information unrelated to the primary task was varied. Results showed that students solved more easy than difficult probability problems correctly. However, the presence of interesting, but task-irrelevant information did not interfere with performance. In Experiment 2, 68 students again engaged in example-based learning and problem solving in the presence of task-irrelevant information. Problem-solving difficulty was varied as a first factor. Additionally, the presence of a pending goal related to the task-irrelevant information was manipulated. As expected, problem-solving performance declined when a pending goal was present during working on easy problems, whereas no interference was observed for difficult problems. Moreover, the presence of a pending goal reduced the time on task-relevant information and increased the time on task-irrelevant information while working on easy tasks. However, as revealed by mediation analyses these changes in overt information processing behavior did not explain the decline in problem-solving performance. As an alternative explanation it is suggested that goal conflicts resulting from pending goals claim cognitive resources, which are then no longer available for learning and problem solving. 
PMID:24723907
Budko, A A; Gribovskaia, G A; Zhuravlev, D A
2014-05-01
Issues of cooperation between the military medical service and civil healthcare in delivering medical aid to patients in the rear of the country are considered in the article. The rear was the final stage of echelon-based care and the main medical reserve force for the front and army areas. The wide hospital network in the rear consisted mainly of evacuation hospitals of the People's Commissariat of Healthcare of the USSR. Cooperation between the military medical service and civil healthcare facilities was required. Sometimes the necessary cooperation failed, which made joint management of the evacuation hospitals difficult. But despite these problems, the main task, returning the maximum number of wounded soldiers to active duty, was accomplished during the Great Patriotic War.
Jeżewska, Maria; Grubman-Nowak, Marta; Leszczyñska, Irena; Jaremin, Bogdan
2012-01-01
The work of marine fishermen is considered one of the most dangerous and life-threatening professions all over the world. There are some common features of the fishing occupation, such as exposure to cold, wind and rough seas, substantial physical effort, frequent injuries during work, unpredictability and abruptness of threats, equipment failure, everyday psychological stress, and constant economic pressure. At the same time, the specificity and variety of hazards, which depend significantly on geographical, climatic and cultural factors, make the problems and solutions substantially different across sectors of fishing. The present article is a review of the problems of Polish coastal fishermen, referring to some local particularities within this extremely difficult profession requiring special predispositions.
Kruger, Erwin A.; Pires, Marilyn; Ngann, Yvette; Sterling, Michelle; Rubayi, Salah
2013-01-01
Pressure ulcers in spinal cord injury represent a challenging problem for patients, their caregivers, and their physicians. They often lead to recurrent hospitalizations, multiple surgeries, and potentially devastating complications. They present a significant cost to the healthcare system, they require a multidisciplinary team approach to manage well, and outcomes directly depend on patients' education, prevention, and compliance with conservative and surgical protocols. With so many factors involved in the successful treatment of pressure ulcers, an update on their comprehensive management in spinal cord injury is warranted. Current concepts of local wound care, surgical options, as well as future trends from the latest wound healing research are reviewed to aid medical professionals in treating patients with this difficult problem. PMID:24090179
Laser ablation of iron-rich black films from exposed granite surfaces
NASA Astrophysics Data System (ADS)
Delgado Rodrigues, J.; Costa, D.; Mascalchi, M.; Osticioli, I.; Siano, S.
2014-10-01
Here, we investigated the potential of laser removal of iron-rich dark films from weathered granite substrates, which represents a very difficult conservation problem because of the polymineralic nature of the stone and of its complex deterioration mechanisms. As often occurs, biotite was the most critical component because of its high optical absorption, low melting temperature, and pronounced cleavage, which required careful control of the photothermal and photomechanical effects to optimize the selective ablation of the unwanted dark film. Nd:YAG lasers with different pulse durations and wavelengths were tested, and optimal irradiation conditions were determined through thorough analytical characterisations. Besides addressing a specific conservation problem, the present work provides information of general relevance for laser uncovering of encrusted granite.
More reliable protein NMR peak assignment via improved 2-interval scheduling.
Chen, Zhi-Zhong; Lin, Guohui; Rizzi, Romeo; Wen, Jianjun; Xu, Dong; Xu, Ying; Jiang, Tao
2005-03-01
Protein NMR peak assignment refers to the process of assigning a group of "spin systems" obtained experimentally to a protein sequence of amino acids. The automation of this process is still an unsolved and challenging problem in NMR protein structure determination. Recently, protein NMR peak assignment has been formulated as an interval scheduling problem (ISP), where a protein sequence P of amino acids is viewed as a discrete time interval I (the amino acids on P one-to-one correspond to the time units of I), each subset S of spin systems that are known to originate from consecutive amino acids of P is viewed as a "job" j(s), the preference of assigning S to a subsequence P′ of consecutive amino acids on P is viewed as the profit of executing job j(s) in the subinterval of I corresponding to P′, and the goal is to maximize the total profit of executing the jobs (on a single machine) during I. The interval scheduling problem is MAX SNP-hard in general; but in the real practice of protein NMR peak assignment, each job j(s) usually requires at most 10 consecutive time units, and typically the jobs that require one or two consecutive time units are the most difficult to assign/schedule. In order to solve these most difficult assignments, we present an efficient 13/7-approximation algorithm for the special case of the interval scheduling problem where each job takes one or two consecutive time units. Combining this algorithm with a greedy filtering strategy for handling long jobs (i.e., jobs that need more than two consecutive time units), we obtain a new efficient heuristic for protein NMR peak assignment. Our experimental study shows that the new heuristic produces the best peak assignment in most of the cases, compared with the NMR peak assignment algorithms in the recent literature. The above algorithm is also the first approximation algorithm for a nontrivial case of the well-known interval scheduling problem that breaks the ratio 2 barrier.
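For intuition only, the scheduling formulation above can be exercised with a naive greedy baseline: repeatedly take the most profitable remaining (job, position) pair that conflicts with nothing already placed. This is NOT the paper's 13/7-approximation algorithm, and the job/placement encoding is invented for the example:

```python
def greedy_schedule(jobs, horizon):
    """Naive greedy baseline for the interval scheduling problem.

    `jobs` maps a job name to (length, {start: profit}), with length 1
    or 2 time units; `horizon` is the number of time units in the
    interval I.  Returns (assignment, total profit).
    """
    candidates = []
    for name, (length, placements) in jobs.items():
        for start, profit in placements.items():
            candidates.append((profit, name, start, length))
    candidates.sort(reverse=True)  # most profitable placements first

    used_units = [False] * horizon
    scheduled, total = {}, 0
    for profit, name, start, length in candidates:
        if name in scheduled:
            continue  # each job may be scheduled at most once
        units = range(start, start + length)
        if start + length <= horizon and not any(used_units[u] for u in units):
            for u in units:
                used_units[u] = True
            scheduled[name] = start
            total += profit
    return scheduled, total

# Toy instance: two length-2 jobs and one length-1 job on 4 time units.
jobs = {
    "A": (2, {0: 10, 2: 6}),
    "B": (2, {0: 8, 2: 8}),
    "C": (1, {1: 3, 3: 3}),
}
schedule, profit = greedy_schedule(jobs, horizon=4)
```

On this instance the greedy places job A at position 0 and job B at position 2 for a total profit of 18; in general such a greedy can be a factor of 2 off the optimum, which is exactly the barrier the paper's algorithm breaks for short jobs.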
NASA Astrophysics Data System (ADS)
Tapilouw, Marisa Christina; Firman, Harry; Redjeki, Sri; Chandra, Didi Teguh
2017-05-01
Teacher training is one form of continuous professional development. Before organizing teacher training (material, time frame), a survey of teachers' needs has to be done. Science teachers' perceptions of science learning in the classroom, the most difficult learning model, and difficulties with lesson plans would be good input for a teacher training program. This survey was conducted in June 2016; 23 science teachers filled in the questionnaire. The core questions concerned training participation, the most difficult science subject matter, the most difficult learning model, difficulties in making lesson plans, and knowledge of integrated science and problem-based learning. Mostly, experienced teachers participated in training once a year. Science training is very important to enhance professional competency and to improve the way of teaching. Which subject matter is difficult depends on the teacher's educational background. Most respondents found the physics subject matter in classes VIII and IX difficult to teach because of its many formulas and abstract concepts. Respondents also reported difficulties in making lesson plans, in terms of choosing the right learning model for some subject matter. Based on the results, inquiry, cooperative learning, and practical work are frequently used in science classes. Integrated science is understood as a mix of Biology, Physics and Chemistry concepts. On the other hand, respondents argued that problem-based learning was difficult, especially in finding contextual problems. All the questionnaire results can be used as input for a teacher training program in order to enhance teachers' competency. Difficult concepts, integrated science, teaching plans, and problem-based learning can be covered in teacher training.
Clustering Multivariate Time Series Using Hidden Markov Models
Ghassempour, Shima; Girosi, Federico; Maeder, Anthony
2014-01-01
In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers. PMID:24662996
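The final step of the pipeline above, clustering from a distance matrix, can be sketched independently of the HMM machinery. Assuming the pairwise HMM distances have already been computed, a minimal single-linkage agglomerative pass (illustrative only, not the authors' implementation) looks like this:

```python
def single_linkage_clusters(dist, n_clusters):
    """Agglomerative single-linkage clustering from a distance matrix.

    `dist` is a symmetric list of lists of pairwise distances (here,
    distances between fitted HMMs).  Items start as singleton clusters
    and the two closest clusters are repeatedly merged until only
    n_clusters remain.
    """
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(dist[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters[b]
        del clusters[b]
    return [sorted(c) for c in clusters]

# Toy HMM-distance matrix with two obvious groups: {0, 1} and {2, 3}.
D = [
    [0.0, 0.1, 5.0, 6.0],
    [0.1, 0.0, 5.5, 6.5],
    [5.0, 5.5, 0.0, 0.2],
    [6.0, 6.5, 0.2, 0.0],
]
groups = single_linkage_clusters(D, 2)
```

Any distance-matrix method (hierarchical, PAM, spectral) slots in here; the paper's point is that once trajectories are mapped to HMMs, the categorical-variable difficulty has been absorbed into the distance definition.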
The BCI competition. III: Validating alternative approaches to actual BCI problems.
Blankertz, Benjamin; Müller, Klaus-Robert; Krusienski, Dean J; Schalk, Gerwin; Wolpaw, Jonathan R; Schlögl, Alois; Pfurtscheller, Gert; Millán, José del R; Schröder, Michael; Birbaumer, Niels
2006-06-01
A brain-computer interface (BCI) is a system that allows its users to control external devices with brain activity. Although the proof-of-concept was given decades ago, the reliable translation of user intent into device control commands is still a major challenge. Success requires the effective interaction of two adaptive controllers: the user's brain, which produces brain activity that encodes intent, and the BCI system, which translates that activity into device control commands. In order to facilitate this interaction, many laboratories are exploring a variety of signal analysis techniques to improve the adaptation of the BCI system to the user. In the literature, many machine learning and pattern classification algorithms have been reported to give impressive results when applied to BCI data in offline analyses. However, it is more difficult to evaluate their relative value for actual online use. BCI data competitions have been organized to provide objective formal evaluations of alternative methods. Prompted by the great interest in the first two BCI Competitions, we organized the third BCI Competition to address several of the most difficult and important analysis problems in BCI research. The paper describes the data sets that were provided to the competitors and gives an overview of the results.
Nitsche Extended Finite Element Methods for Earthquake Simulation
NASA Astrophysics Data System (ADS)
Coon, Ethan T.
Modeling earthquakes and geologically short-time-scale events on fault networks is a difficult problem with important implications for human safety and design. These problems demonstrate a rich physical behavior, in which distributed loading localizes both spatially and temporally into earthquakes on fault systems. This localization is governed by two aspects: friction and fault geometry. Computationally, these problems provide a stern challenge for modelers --- static and dynamic equations must be solved on domains with discontinuities on complex fault systems, and frictional boundary conditions must be applied on these discontinuities. The most difficult aspect of modeling physics on complicated domains is the mesh. Most numerical methods involve meshing the geometry; nodes are placed on the discontinuities, and edges are chosen to coincide with faults. The resulting mesh is highly unstructured, making the derivation of finite difference discretizations difficult. Therefore, most models use the finite element method. Standard finite element methods place requirements on the mesh for the sake of stability, accuracy, and efficiency. The formation of a mesh which both conforms to fault geometry and satisfies these requirements is an open problem, especially for three dimensional, physically realistic fault geometries. In addition, if the fault system evolves over the course of a dynamic simulation (i.e. in the case of growing cracks or breaking new faults), the geometry must be re-meshed at each time step. This can be expensive computationally. The fault-conforming approach is undesirable when complicated meshes are required, and impossible to implement when the geometry is evolving. Therefore, meshless and hybrid finite element methods that handle discontinuities without placing them on element boundaries are a desirable and natural way to discretize these problems.
Several such methods are being actively developed for use in engineering mechanics involving crack propagation and material failure. While some theory and application of these methods exist, implementations for the simulation of networks of many cracks have not yet been considered. For my thesis, I implement and extend one such method, the eXtended Finite Element Method (XFEM), for use in static and dynamic models of fault networks. Once this machinery is developed, it is applied to open questions regarding the behavior of networks of faults, including questions of distributed deformation in fault systems and ensembles of magnitude, location, and frequency in repeat ruptures. The theory of XFEM is augmented to allow for solution of problems with alternating regimes of static solves for elastic stress conditions and short, dynamic earthquakes on networks of faults. This is accomplished using Nitsche's approach for implementing boundary conditions. Finally, an optimization problem is developed to determine tractions along the fault, enabling the calculation of frictional constraints and the rupture front. This method is verified via a series of static, quasistatic, and dynamic problems. Armed with this technique, we look at several problems regarding geometry within the earthquake cycle in which geometry is crucial. We first look at quasistatic simulations on a community fault model of Southern California, and model slip distribution across that system. We find the distribution of deformation across faults compares reasonably well with slip rates across the region, as constrained by geologic data. We find geometry can provide constraints for friction, and consider the minimization of shear strain across the zone as a function of friction and plate loading direction, and infer bounds on fault strength in the region. Then we consider the repeated rupture problem, modeling the full earthquake cycle over the course of many events on several fault geometries. 
In this work, we look at distributions of events, studying the effect of geometry on statistical metrics of event ensembles. Finally, this thesis is a proof of concept for the XFEM on earthquake cycle models on fault systems. We identify strengths and weaknesses of the method, and identify places for future improvement. We discuss the feasibility of the method's use in three dimensions, and find the method to be a strong candidate for future crustal deformation simulations.
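For reference, Nitsche's approach mentioned in the abstract can be sketched for a scalar Poisson model problem with Dirichlet data g on a boundary Gamma. This is the standard textbook symmetric form, stated here as background rather than the thesis's elasticity/friction formulation: find u such that, for all test functions v,

```latex
\int_\Omega \nabla u \cdot \nabla v \, dx
- \int_\Gamma (\partial_n u)\, v \, ds
- \int_\Gamma (u - g)\, (\partial_n v) \, ds
+ \frac{\gamma}{h} \int_\Gamma (u - g)\, v \, ds
= \int_\Omega f\, v \, dx ,
```

where h is the local mesh size and gamma a sufficiently large penalty parameter. The same idea extends to interface conditions on faults, which is what allows boundary conditions to be imposed weakly without requiring the mesh to conform to the discontinuity.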
Educational Reform as a Dynamic System of Problems and Solutions: Towards an Analytic Instrument
ERIC Educational Resources Information Center
Luttenberg, Johan; Carpay, Thérèse; Veugelers, Wiel
2013-01-01
Large-scale educational reforms are difficult to realize and often fail. In the literature, the course of reform and problems associated with this are frequently discussed. The explanations and recommendations then provided are so diverse that it is difficult to gain a comprehensive overview of what factors are at play and how to take them into…
ERIC Educational Resources Information Center
Albert, Lawrence S.
If being a competent small group problem solver is difficult, it is even more difficult to impart those competencies to others. Unlike athletic coaches who are near their players during the real game, teachers of small group communication are not typically present for on-the-spot coaching when their students are doing their problem solving. That…
NASA Astrophysics Data System (ADS)
Suflita, Joseph M.; Duncan, Kathleen E.
The anaerobic biodegradation of petroleum hydrocarbons is important for the intrinsic remediation of spilt fuels (Gieg and Suflita, 2005), for the conversion of hydrocarbons to clean burning natural gas (Gieg et al., 2008; Jones et al., 2008) and for the fundamental cycling of carbon on the planet (Caldwell et al., 2008). However, the same process has also been implicated in a host of difficult problems including reservoir souring (Jack and Westlake, 1995), oil viscosity alteration (Head et al., 2003), compromised equipment performance and microbiologically influenced corrosion (Duncan et al., 2009). Herein, we will focus on the role of anaerobic microbial communities in catalysing biocorrosion activities in oilfield facilities. Biocorrosion is a costly problem that remains relatively poorly understood. Understanding of the underlying mechanisms requires reliable information on the carbon and energy sources supporting biofilm microorganisms capable of catalysing such activities.
Problems of modern urban drainage in developing countries.
Silveira, A L L
2002-01-01
Socio-economic factors in developing countries make it more difficult to solve problems of urban drainage than in countries that are more advanced. Factors inhibiting the adoption of modern solutions include: (1) in matters of urban drainage, 19th-century sanitary philosophy still dominates; (2) both legal and clandestine land settlement limits the space that modern solutions require; (3) contamination of storm runoff by foul sewage, sediment and garbage prevents adoption of developed-country practices; (4) climatic and socio-economic factors favour the growth of epidemics where runoff is retained for flood-avoidance and to increase infiltration; (5) lack of a technological basis for adequate drainage management and design; (6) lack of the interaction between community and city administration that is needed to obtain modern solutions to urban drainage problems. Awareness of these difficulties is fundamental to the search for modern and viable solutions appropriate for developing countries.
Virtual morality: emotion and action in a simulated three-dimensional "trolley problem".
Navarrete, C David; McDonald, Melissa M; Mott, Michael L; Asher, Benjamin
2012-04-01
Experimentally investigating the relationship between moral judgment and action is difficult when the action of interest entails harming others. We adopt a new approach to this problem by placing subjects in an immersive, virtual reality environment that simulates the classic "trolley problem." In this moral dilemma, the majority of research participants behaved as "moral utilitarians," either (a) acting to cause the death of one individual in order to save the lives of five others, or (b) abstaining from action, when that action would have caused five deaths versus one. Confirming the emotional distinction between moral actions and omissions, autonomic arousal was greater when the utilitarian outcome required action, and increased arousal was associated with a decreased likelihood of utilitarian-biased behavior. This pattern of results held across individuals of different gender, age, and race. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
Examining problem solving in physics-intensive Ph.D. research
NASA Astrophysics Data System (ADS)
Leak, Anne E.; Rothwell, Susan L.; Olivera, Javier; Zwickl, Benjamin; Vosburg, Jarrett; Martin, Kelly Norris
2017-12-01
Problem-solving strategies learned by physics undergraduates should prepare them for real-world contexts as they transition from students to professionals. Yet, graduate students in physics-intensive research face problems that go beyond problem sets they experienced as undergraduates and are solved by different strategies than are typically learned in undergraduate coursework. This paper expands the notion of problem solving by characterizing the breadth of problems and problem-solving processes carried out by graduate students in physics-intensive research. We conducted semi-structured interviews with ten graduate students to determine the routine, difficult, and important problems they engage in and problem-solving strategies they found useful in their research. A qualitative typological analysis resulted in the creation of a three-dimensional framework: context, activity, and feature (that made the problem challenging). Problem contexts extended beyond theory and mathematics to include interactions with lab equipment, data, software, and people. Important and difficult contexts blended social and technical skills. Routine problem activities were typically well defined (e.g., troubleshooting), while difficult and important ones were more open ended and had multiple solution paths (e.g., evaluating options). In addition to broadening our understanding of problems faced by graduate students, our findings explore problem-solving strategies (e.g., breaking down problems, evaluating options, using test cases or approximations) and characteristics of successful problem solvers (e.g., initiative, persistence, and motivation). Our research provides evidence of the influence that problems students are exposed to have on the strategies they use and learn. Using this evidence, we have developed a preliminary framework for exploring problems from the solver's perspective. This framework will be examined and refined in future work. 
Understanding problems graduate students face and the strategies they use has implications for improving how we approach problem solving in undergraduate physics and physics education research.
Applied extreme-value statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kinnison, R.R.
1983-05-01
The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognise all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles, and to provide the necessary statistical background so that the non-statistician scientist can recognize an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.
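As a concrete example of the "simple extreme value problems" a non-statistician might handle, a method-of-moments fit of the Gumbel (type I extreme value) distribution takes only a few lines. This is a generic sketch with invented function names, not material from the text itself:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_moment_fit(mean, std):
    """Method-of-moments fit of a Gumbel distribution.

    For Gumbel(mu, beta): mean = mu + gamma*beta and
    std = beta * pi / sqrt(6), so both parameters follow directly
    from the sample mean and standard deviation of annual maxima.
    """
    beta = std * math.sqrt(6.0) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, T):
    """Value exceeded on average once every T observations,
    i.e. the (1 - 1/T) quantile of the fitted Gumbel."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Example: annual-maximum concentrations with mean 10 and std 2
mu, beta = gumbel_moment_fit(10.0, 2.0)
x100 = return_level(mu, beta, 100)  # 100-observation return level
```

Maximum-likelihood fitting is preferred when sample sizes allow, but the moment fit above is the kind of back-of-the-envelope calculation the text aims to make accessible.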
Artificial Neural Network Based Mission Planning Mechanism for Spacecraft
NASA Astrophysics Data System (ADS)
Li, Zhaoyu; Xu, Rui; Cui, Pingyuan; Zhu, Shengying
2018-04-01
The ability to plan and react quickly in dynamic space environments is central to intelligent spacecraft behavior. Many planners have been used for space and robotic applications, but it is difficult to encode the domain knowledge and directly use existing techniques such as heuristics to improve the performance of the application systems. Therefore, regarding planning as an advanced control problem, this paper first proposes an autonomous mission planning and action selection mechanism based on a multilayer perceptron neural network, used to select actions during the planning process and improve efficiency. To demonstrate the availability and effectiveness of the approach, we use autonomous mission planning problems for a spacecraft, a sophisticated system with complex subsystems and constraints, as an example. Simulation results show that artificial neural networks (ANNs) are usable for planning problems. Compared with the existing planning method in EUROPA, the mechanism using ANNs is more efficient and guarantees stable performance. The proposed mechanism is therefore well suited to spacecraft planning problems that require real-time response and stability.
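The abstract does not detail the network architecture or state encoding, so the following is only a generic sketch of action selection with a multilayer perceptron: one forward pass scores each candidate action and the highest-scoring one is selected. The layer sizes and the (untrained, random) weights are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: 6 state features (power margin, memory, attitude error,
# ...) and 4 candidate actions; the paper's actual encoding is not specified.
n_state, n_hidden, n_actions = 6, 16, 4
W1 = rng.normal(scale=0.5, size=(n_state, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_actions))
b2 = np.zeros(n_actions)

def select_action(state):
    """One forward pass through the perceptron; returns an action index."""
    h = np.tanh(state @ W1 + b1)   # hidden layer
    scores = h @ W2 + b2           # one score per candidate action
    return int(np.argmax(scores))

state = rng.normal(size=n_state)
a = select_action(state)
```

In a real system the weights would be trained so that the scores approximate the heuristic value of each action in the given state, replacing hand-coded heuristics in the planner.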
Promoting the Multidimensional Character of Scientific Reasoning.
Bradshaw, William S; Nelson, Jennifer; Adams, Byron J; Bell, John D
2017-04-01
This study reports part of a long-term program to help students improve scientific reasoning using higher-order cognitive tasks set in the discipline of cell biology. This skill was assessed using problems requiring the construction of valid conclusions drawn from authentic research data. We report here efforts to confirm the hypothesis that data interpretation is a complex, multifaceted exercise. Confirmation was obtained using a statistical treatment showing that various such problems rank students differently; each contains a unique set of cognitive challenges. Additional analyses of performance results have allowed us to demonstrate that individuals differ in their capacity to navigate five independent generic elements that constitute successful data interpretation: biological context, connection to course concepts, experimental protocols, data inference, and integration of isolated experimental observations into a coherent model. We offer these aspects of scientific thinking as a "data analysis skills inventory," along with usable sample problems that illustrate each element. Additionally, we show that this kind of reasoning is rigorous in that it is difficult for most novice students, who are unable to intuitively implement strategies for improving these skills. Instructors armed with knowledge of the specific challenges presented by different types of problems can provide specific helpful feedback during formative practice. The use of this instructional model is most likely to require changes in traditional classroom instruction.
Cerium Oxide Nanoparticle Nose-Only Inhalation Exposures ...
There is a critical need to assess the health effects associated with exposure to commercially produced NPs across the size ranges reflective of those detected in the industrial sectors that are generating, as well as incorporating, NPs into products. Generation of stable and low concentrations of size-fractionated nanoscale aerosols in nose-only chambers can be difficult, and when the aerosol agglomerates during generation, the problems are significantly increased. One problem is that many nanoscale aerosol generators have higher aerosol output and/or airflow than can be accommodated by a nose-only inhalation chamber, requiring much of the generated aerosol to be diverted to exhaust. Another problem is that mixing vessels used to modulate the fluctuating output from aerosol generators can cause substantial wall losses, consuming much of the generated aerosol. Other available aerosol generation systems can produce nanoscale aerosols from nanoparticles (NPs); however, these NPs are generated in real time and do not approximate the physical and chemical characteristics of the commercially produced NPs to which workers and the public are exposed. An understanding of the health effects associated with exposure to commercial NP production, which is more morphologically and size heterogeneous, is required for risk assessment. To overcome these problems, a low-consumption dry-particulate nanoscale aerosol generator was developed to deliver stable concentrations in the range of 10–5000 µg
Motion Planning of Two Stacker Cranes in a Large-Scale Automated Storage/Retrieval System
NASA Astrophysics Data System (ADS)
Kung, Yiheng; Kobayashi, Yoshimasa; Higashi, Toshimitsu; Ota, Jun
We propose a method for reducing the computational time of motion planning for stacker cranes. Most automated storage/retrieval systems (AS/RSs) are equipped with only one stacker crane. However, greater work efficiency is increasingly required in warehouses, for example through the use of two stacker cranes. In this paper, a warehouse with two stacker cranes working simultaneously is proposed. Unlike in warehouses with only one crane, trajectory planning with two cranes is very difficult: because the two cranes work together, a proper trajectory must be planned to avoid collision. However, verifying collisions is complicated and requires a considerable amount of computational time. As transport work in AS/RSs occurs randomly, motion planning cannot be conducted in advance, and planning an appropriate trajectory within a restricted duration is a difficult task. We therefore address the problem of motion planning requiring extensive calculation time. As a solution, we propose a "free-step" to simplify the procedure of collision verification and reduce the computational time. In addition, we propose a method to reschedule the order of collision verification in order to find an appropriate trajectory in less time. With the proposed methods, we reduce the calculation time to less than 1/300 of that achieved in former research.
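As a baseline for the kind of collision verification the abstract describes as expensive, a naive time-sampled check for two cranes sharing one rail might look like the sketch below. The motion model, speeds, and clearance are invented for illustration; the paper's "free-step" method is precisely about avoiding this exhaustive sampling.

```python
def position(start, goal, speed, t):
    """Constant-speed move from start toward goal along the rail, then dwell."""
    d = goal - start
    travel = abs(d) / speed
    if t >= travel:
        return goal
    return start + d * (t / travel)

def collides(plan1, plan2, clearance=2.0, dt=0.05, horizon=30.0):
    """Naively sample both cranes' 1-D positions over time and check that a
    minimum separation is maintained. Cost grows with horizon / dt."""
    t = 0.0
    while t <= horizon:
        p1 = position(*plan1, t)
        p2 = position(*plan2, t)
        if abs(p1 - p2) < clearance:
            return True
        t += dt
    return False

# Crane 1 moves 0 -> 10, crane 2 moves 40 -> 20: they never come close.
collision = collides((0.0, 10.0, 1.0), (40.0, 20.0, 1.0))
```

Plans whose paths cross (for example 0 -> 30 against 40 -> 10 at equal speeds) are rejected by the same check, which is why verification order and early termination matter for total planning time.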
Van Doorslaer, Koenraad; Chen, Dan; Chapman, Sandra; Khan, Jameela
2017-01-01
Human papillomavirus (HPV) genomes are replicated and maintained as extrachromosomal plasmids during persistent infection. The viral E2 proteins are thought to promote stable maintenance replication by tethering the viral DNA to host chromatin. However, this has been very difficult to prove genetically, as the E2 protein is involved in transcriptional regulation and initiation of replication in addition to its assumed role in genome maintenance. This makes mutational analysis of viral trans factors and cis elements in the background of the viral genome problematic and difficult to interpret. To circumvent this problem, we have developed a complementation assay in which the complete wild-type HPV18 genome is transfected into primary human keratinocytes along with subgenomic or mutated replicons that contain the minimal replication origin. The wild-type genome provides the E1 and E2 proteins in trans, allowing us to determine additional cis elements that are required for long-term replication and partitioning of the replicon. We found that, in addition to the core replication origin (and the three E2 binding sites located therein), additional sequences from the transcriptional enhancer portion of the URR (upstream regulatory region) are required in cis for long-term genome replication. PMID:29162712
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Vega, F F; Cantu-Paz, E; Lopez, J I
The population size of genetic algorithms (GAs) affects the quality of the solutions and the time required to find them. While progress has been made in estimating the population sizes required to reach a desired solution quality for certain problems, in practice the sizing of populations is still usually performed by trial and error. These trials might find a population that is large enough to reach a satisfactory solution, but there may still be opportunities to optimize the computational cost by reducing the size of the population. This paper presents a technique called plague that periodically removes a number of individuals from the population as the GA executes. Recently, the usefulness of the plague has been demonstrated for genetic programming. The objective of this paper is to extend the study of plagues to genetic algorithms. We experiment with deceptive trap functions, a tunably difficult problem for GAs, and the experiments show that plagues can save computational time while maintaining solution quality and reliability.
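The plague operator can be sketched in a few lines: a standard generational GA whose population is periodically culled of its worst members, so later generations cost fewer fitness evaluations. The sketch below substitutes a simple OneMax fitness for the paper's deceptive trap functions, and the population size, plague interval, and plague size are illustrative.

```python
import random

random.seed(0)
GENOME, POP, PLAGUE_EVERY, PLAGUE_SIZE = 30, 60, 5, 4

def fitness(ind):
    # OneMax stand-in for the paper's deceptive trap functions.
    return sum(ind)

pop = [[random.randint(0, 1) for _ in range(GENOME)] for _ in range(POP)]
for gen in range(60):
    # Standard generational step: tournament selection, one-point crossover,
    # occasional bit-flip mutation.
    new = []
    for _ in range(len(pop)):
        p1 = max(random.sample(pop, 3), key=fitness)
        p2 = max(random.sample(pop, 3), key=fitness)
        cut = random.randrange(1, GENOME)
        child = p1[:cut] + p2[cut:]
        if random.random() < 0.2:
            i = random.randrange(GENOME)
            child[i] ^= 1
        new.append(child)
    pop = new
    # Plague: every few generations, remove the worst individuals outright,
    # shrinking the population (with a small floor to avoid extinction).
    if gen % PLAGUE_EVERY == 0 and len(pop) > PLAGUE_SIZE + 2:
        pop.sort(key=fitness, reverse=True)
        pop = pop[:-PLAGUE_SIZE]

best = max(pop, key=fitness)
```

The key trade-off the paper studies is visible even here: the population ends much smaller than it started, yet selection pressure keeps solution quality high.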
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Brennan T; Jager, Yetta; March, Patrick
Reservoir releases are typically operated to maximize the efficiency of hydropower production and the value of hydropower produced. In practice, ecological considerations are limited to those required by law. We first describe reservoir optimization methods that include mandated constraints on environmental and other water uses. Next, we describe research to formulate and solve reservoir optimization problems involving both energy and environmental water needs as objectives. Evaluating ecological objectives is a challenge in these problems for several reasons. First, it is difficult to predict how biological populations will respond to flow release patterns. This problem can be circumvented by using ecological models. Second, most optimization methods require complex ecological responses to flow to be quantified by a single metric, preferably a currency that can also represent hydropower benefits. Ecological valuation of instream flows can make optimization methods that require a single currency for the effects of flow on energy and river ecology possible. Third, holistic reservoir optimization problems are unlikely to be structured such that simple solution methods can be used, necessitating the use of flexible numerical methods. One strong advantage of optimal control is the ability to plan for the effects of climate change. We present ideas for developing holistic methods to the point where they can be used for real-time operation of reservoirs. We suggest that developing ecologically sound optimization tools should be a priority for hydropower in light of the increasing value placed on sustaining both the ecological and energy benefits of riverine ecosystems long into the future.
Artificial evolution by viability rather than competition.
Maesani, Andrea; Fernando, Pradeep Ruben; Floreano, Dario
2014-01-01
Evolutionary algorithms are widespread heuristic methods inspired by natural evolution to solve difficult problems for which analytical approaches are not suitable. In many domains, experimenters are not only interested in discovering optimal solutions, but also in finding the largest number of different solutions satisfying minimal requirements. However, the formulation of an effective performance measure describing these requirements, also known as a fitness function, represents a major challenge. The difficulty of combining and weighting multiple problem objectives and constraints of possibly varying nature and scale into a single fitness function often leads to unsatisfactory solutions. Furthermore, selective reproduction of the fittest solutions, which is inspired by competition-based selection in nature, leads to loss of diversity within the evolving population and premature convergence of the algorithm, hindering the discovery of many different solutions. Here we present an alternative abstraction of artificial evolution, which does not require the formulation of a composite fitness function. Inspired by viability theory in dynamical systems, natural evolution and ethology, the proposed method puts emphasis on the elimination of individuals that do not meet a set of changing criteria, which are defined on the problem objectives and constraints. Experimental results show that the proposed method maintains higher diversity in the evolving population and generates more unique solutions when compared to classical competition-based evolutionary algorithms. Our findings suggest that incorporating viability principles into evolutionary algorithms can significantly improve the applicability and effectiveness of evolutionary methods to numerous complex problems of science and engineering, ranging from protein structure prediction to aircraft wing design.
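A minimal sketch of viability-based selection on an invented toy problem: individuals are 2-D points, two criteria must both stay below a viability threshold that tightens each generation, and no composite fitness function is ever formed. Violators are simply eliminated and survivors repopulate by mutation; all thresholds and rates here are illustrative assumptions.

```python
import random

random.seed(2)

# Illustrative, mutually compatible criteria (both can approach zero).
def objectives(x, y):
    return (abs(x - 1.0), abs(y - 2.0))

POP = 200
pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(POP)]
threshold = 12.0
for gen in range(40):
    # Viability selection: eliminate violators instead of ranking by fitness.
    pop = [(x, y) for (x, y) in pop
           if all(o <= threshold for o in objectives(x, y))]
    if not pop:            # guard: criteria tightened faster than evolution
        break
    # Refill by mutating random survivors (keeps diversity among the viable).
    while len(pop) < POP:
        x, y = random.choice(pop)
        pop.append((x + random.gauss(0, 0.3), y + random.gauss(0, 0.3)))
    threshold *= 0.93      # gradually tighten the viability boundary
```

Because every viable individual is equally acceptable, the final population spreads over the whole viable region rather than collapsing onto a single optimum, which is the diversity advantage the paper reports.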
Facade renovation - replacement and restoration of the panels in a monument protected object
NASA Astrophysics Data System (ADS)
Novotný, Michal
2017-12-01
The article deals with the problems of facade reconstruction and the associated problem of replacing or repairing the panels. In conventional buildings this is a routine operation, but in monument-protected objects it is problematic. In the case of a common building, it is possible to choose any modern panels and simply replace them, but for historical objects the requirements and rules of monument protection must be followed. In practice, this usually means that modern panels cannot be used alone, and at best a combination of old and modern technologies is possible. Another possible solution is renovation, i.e., repairing the existing panels to their original state, with due regard, of course, to the functionality of such panels. The implementation of such repairs must always be based on a technical and historical survey of the condition of the object, and the repairs must be professionally designed. During the subsequent corrections, particular attention must be paid to observing the technological procedures, rules and instructions, particularly in terms of monument protection. The functionality of the completed works or elements, with regard to the quality of the environment within the building, is also not negligible. A common problem is the lack of control of technical and functional requirements. Underestimating these problems then leads to difficult repairs. The article points to the mistakes and problems of one such construction project on a historically protected chateau building.
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
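The nested procedure that the abstract calls computationally expensive (solving the lower-level problem to optimality for every upper-level candidate) can be illustrated on a toy single-objective bilevel problem. The functions and grids below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Toy bilevel problem, solved by the naive nested procedure:
#   upper level:  min_x  F(x, y*(x)) = (x - 1)**2 + (y*(x) - 2)**2
#   lower level:  y*(x) = argmin_y  (y - x)**2     (so y*(x) = x)
def lower_level(x):
    # Solve the inner problem to optimality by scanning a fine grid;
    # this stands in for a full inner optimizer and is re-run per candidate.
    ys = np.linspace(-5, 5, 2001)
    return ys[np.argmin((ys - x) ** 2)]

xs = np.linspace(-5, 5, 2001)
F = [(x - 1) ** 2 + (lower_level(x) - 2) ** 2 for x in xs]
x_best = xs[int(np.argmin(F))]
y_best = lower_level(x_best)
```

Here F(x) reduces to (x - 1)^2 + (x - 2)^2, minimized at x = y = 1.5; the point of the example is the cost structure: one full inner optimization per outer candidate, which is exactly what evolutionary bilevel methods try to avoid.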
Holmberg, D L
1979-05-01
Pyothorax is a serious disease process which requires both medical and surgical intervention. Late recognition, management problems, and likely recurrence make successful treatment difficult and often frustrating. Aims of therapy should be to avoid undue stress to the patient, to relieve respiratory distress by thoracocentesis, to eliminate infectious agents with antimicrobials, to remove pleural exudate, and to provide supportive care. Close monitoring of the patient is necessary to prevent iatrogenic complications such as pneumothorax, hemothorax, hypothermia, or hypoproteinemia. Exploratory thoracotomy for removal of granulomatous material and fibroelastic pleural "peels" is occasionally necessary to resolve compressive cardiopulmonary lesions.
NASA Technical Reports Server (NTRS)
Centrella, Joan; Baker, John G.; Kelly, Bernard J.; vanMeter, James R.
2010-01-01
Black-hole mergers take place in regions of very strong and dynamical gravitational fields, and are among the strongest sources of gravitational radiation. Probing these mergers requires solving the full set of Einstein's equations of general relativity numerically. For more than 40 years, progress towards this goal has been very slow, as numerical relativists encountered a host of difficult problems. Recently, several breakthroughs have led to dramatic progress, enabling stable and accurate calculations of black-hole mergers. This article presents an overview of this field, including impacts on astrophysics and applications in gravitational wave data analysis.
An adaptive grid algorithm for one-dimensional nonlinear equations
NASA Technical Reports Server (NTRS)
Gutierrez, William E.; Hills, Richard G.
1990-01-01
Richards' equation, which models the flow of liquid through unsaturated porous media, is highly nonlinear and difficult to solve. Steep gradients in the field variables require the use of fine grids and small time step sizes. The numerical instabilities caused by the nonlinearities often require the use of iterative methods such as Picard or Newton iteration. These difficulties result in large CPU requirements in solving Richards' equation. With this in mind, adaptive and multigrid methods are investigated for use with nonlinear equations such as Richards' equation. Attention is focused on one-dimensional transient problems. To investigate the use of multigrid and adaptive grid methods, a series of problems is studied. First, a multigrid program is developed and used to solve an ordinary differential equation, demonstrating the efficiency with which low and high frequency errors are smoothed out. The multigrid algorithm and an adaptive grid algorithm are then used to solve one-dimensional transient partial differential equations, such as the diffusive and convective-diffusion equations. The performance of these programs is compared to that of the Gauss-Seidel and tridiagonal methods. The adaptive and multigrid schemes outperformed the Gauss-Seidel algorithm, but were not as fast as the tridiagonal method. The adaptive grid scheme solved the problems slightly faster than the multigrid method. To solve nonlinear problems, Picard iterations are introduced into the adaptive grid and tridiagonal methods. Burgers' equation is used as a test problem for the two algorithms. Both methods obtain solutions of comparable accuracy for similar time increments. For Burgers' equation, the adaptive grid method finds the solution approximately three times faster than the tridiagonal method. Finally, both schemes are used to solve the water content formulation of Richards' equation.
For this problem, the adaptive grid method obtains a more accurate solution in fewer work units and less computation time than required by the tridiagonal method. The performance of the adaptive grid method tends to degrade as the solution process proceeds in time, but still remains faster than the tridiagonal scheme.
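The combination of Picard iteration with a tridiagonal (Thomas) solver can be sketched on a simpler model problem with a known closed-form solution. The conductivity law below, k(u) = 1 + u, is an illustrative stand-in for the moisture-dependent conductivity in Richards' equation, not the actual constitutive relation; with boundary values u(0) = 0 and u(1) = 1 the exact solution is u(x) = -1 + sqrt(1 + 3x).

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal, d: rhs)."""
    n = len(b)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Picard iteration on the model nonlinear diffusion problem
#   d/dx( k(u) du/dx ) = 0,  u(0) = 0, u(1) = 1,  k(u) = 1 + u.
# Each pass freezes k at the previous iterate, leaving a linear tridiagonal
# system, exactly the structure exploited in the paper's tridiagonal method.
N = 101
u = np.linspace(0.0, 1.0, N)          # initial guess: linear profile
for _ in range(50):
    k = 1.0 + u
    kw = 0.5 * (k[:-1] + k[1:])       # interface conductivities, length N-1
    a = np.zeros(N); b = np.zeros(N); c = np.zeros(N); d = np.zeros(N)
    b[0] = b[-1] = 1.0; d[0] = 0.0; d[-1] = 1.0     # Dirichlet BCs
    a[1:-1] = kw[:-1]
    c[1:-1] = kw[1:]
    b[1:-1] = -(kw[:-1] + kw[1:])
    u_new = thomas(a, b, c, d)
    if np.max(np.abs(u_new - u)) < 1e-10:           # Picard convergence test
        u = u_new
        break
    u = u_new
```

The Picard loop converges linearly for this mild nonlinearity; Newton iteration would converge faster per step but requires assembling a Jacobian.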
A Rare Cause of Hypothalamic Obesity, Rohhad Syndrome: 2 Cases.
Şiraz, Ülkü Gül; Okdemir, Deniz; Direk, Gül; Akın, Leyla; Hatipoğlu, Nihal; Kendırcı, Mustafa; Kurtoğlu, Selim
2018-03-19
Rapid-onset obesity with hypoventilation, hypothalamic dysfunction and autonomic dysregulation (ROHHAD) syndrome is a rare disease that is difficult to diagnose and to distinguish from genetic obesity syndromes. The underlying causes of the disease have not been fully explained. Hypothalamic dysfunction causes endocrine problems, respiratory dysfunction and autonomic alterations. Only around 80 patients have been reported, likely owing to lack of recognition. We present two female patients suspected of ROHHAD due to weight gain since early childhood. The presenting symptoms, respiratory and circulatory dysfunction, hypothalamic hypernatremia, and hypothalamo-pituitary hormonal disorders such as central hypothyroidism, hyperprolactinemia and central early puberty, completely match the criteria of ROHHAD syndrome. ROHHAD syndrome should be considered in the differential diagnosis because it is difficult to distinguish from causes of monogenic obesity. Early identification of the disease reduces the morbidity of the syndrome, and patients require regular follow-up with a multidisciplinary approach.
Human factors issues in performing life science experiments in a 0-G environment
NASA Technical Reports Server (NTRS)
Gonzalez, Wayne
1989-01-01
An overview of the environmental conditions within the Spacelab and the planned Space Station Freedom is presented. How this environment causes specific human factors problems and the nature of design solutions are described. The impact of these problems and solutions on the performance of life science activities onboard Spacelab (SL) and Space Station Freedom (SSF) is discussed. The first area highlighted is contamination. The permanence of SSF, in contrast to the two-week missions of SL, has significant impacts on crew and specimen protection requirements and, thus, resource utilization. These requirements, in turn, impose restrictions on working volumes, scheduling, training, and the scope of experimental procedures. A second area is microgravity, which means that all specimens, materials, and apparatus must be restrained and carefully controlled. Because so much of the scientific activity must occur within restricted enclosures (gloveboxes), the provisions for restraint and control are made more complex. The third topic is crewmember biomechanics and the problems of movement and task performance in microgravity. In addition to the need to stabilize the body for the performance of tasks, performance of very sensitive tasks such as dissection is difficult. The issue of space sickness and adaptation is considered in this context.
Application of decision science to resilience management in Jamaica Bay
Eaton, Mitchell; Fuller, Angela K.; Johnson, Fred A.; Hare, M. P.; Stedman, Richard C.; Sanderson, E.W.; Solecki, W. D.; Waldman, J.R.; Paris, A. S.
2016-01-01
This book highlights the growing interest in management interventions designed to enhance the resilience of the Jamaica Bay socio-ecological system. Effective management, whether the focus is on managing biological processes or human behavior or (most likely) both, requires decision makers to anticipate how the managed system will respond to interventions (i.e., via predictions or projections). In systems characterized by many interacting components and high uncertainty, making probabilistic predictions is often difficult and requires careful thinking not only about system dynamics, but also about how management objectives are specified and the analytic method used to select the preferred action(s). Developing a clear statement of the problem(s) and articulation of management objectives is often best achieved by including input from managers, scientists and other stakeholders affected by the decision through a process of joint problem framing (Marcot and others 2012; Keeney and others 1990). Using a deliberate, coherent and transparent framework for deciding among management alternatives to best meet these objectives then ensures a greater likelihood for successful intervention. Decision science provides the theoretical and practical basis for developing this framework and applying decision analysis methods for making complex decisions under uncertainty and risk.
TEMPERAMENT, FAMILY ENVIRONMENT, AND BEHAVIOR PROBLEMS IN CHILDREN WITH NEW-ONSET SEIZURES
Baum, Katherine T.; Byars, Anna W.; deGrauw, Ton J.; Johnson, Cynthia S.; Perkins, Susan M.; Dunn, David W.; Bates, John E.; Austin, Joan K.
2007-01-01
Children with epilepsy, even those with new-onset seizures, exhibit relatively high rates of behavior problems. The purpose of this study was to explore the relationships among early temperament, family adaptive resources, and behavior problems in children with new-onset seizures. Our major goal was to test whether family adaptive resources moderated the relationship between early temperament dimensions and current behavior problems in 287 children with new-onset seizures. Two of the three temperament dimensions (difficultness and resistance to control) were positively correlated with total, internalizing, and externalizing behavior problems (all p < 0.0001). The third temperament dimension, unadaptability, was positively correlated with total and internalizing problems (p < 0.01). Family adaptive resources moderated the relationships between temperament and internalizing and externalizing behavior problems at school. Children with a difficult early temperament who live in a family environment with low family mastery are at the greatest risk for behavior problems. PMID:17267291
VLSI-based video event triggering for image data compression
NASA Astrophysics Data System (ADS)
Williams, Glenn L.
1994-02-01
Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.
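In software, the long-term/short-term change detection described above might be sketched as follows: a slowly updated baseline image captures the DC-like state, and a frame is flagged when its AC-like deviation from that baseline exceeds a threshold. The update rate, threshold, and image sizes are invented, and the real system implements this in parallel VLSI hardware rather than in Python.

```python
import numpy as np

rng = np.random.default_rng(4)

def make_trigger(slow=0.02, threshold=8.0):
    """Return a per-frame trigger: True when short-term change is large."""
    baseline = None

    def step(frame):
        nonlocal baseline
        if baseline is None:                 # warm up on the first frame
            baseline = frame.astype(float)
            return False
        # Short-term (AC-like) change: mean deviation from the baseline.
        ac_change = float(np.abs(frame - baseline).mean())
        # Long-term (DC-like) state: slow exponential update of the baseline.
        baseline = (1 - slow) * baseline + slow * frame
        return ac_change > threshold

    return step

trigger = make_trigger()
quiet = [rng.normal(100, 2, (64, 64)) for _ in range(20)]    # static scene
event = rng.normal(160, 2, (64, 64))                         # sudden change
fired = [trigger(f) for f in quiet] + [trigger(event)]
```

Only frames around the event would be archived, which is the data-compression payoff: pre-trigger and post-trigger frames are kept from a circular buffer while the quiet stream is discarded.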
Conservative zonal schemes for patched grids in 2 and 3 dimensions
NASA Technical Reports Server (NTRS)
Hessenius, Kristin A.
1987-01-01
The computation of flow over complex geometries, such as realistic aircraft configurations, poses difficult grid generation problems for computational aerodynamicists. The creation of a traditional, single-module grid of acceptable quality about an entire configuration may be impossible even with the most sophisticated of grid generation techniques. A zonal approach, wherein the flow field is partitioned into several regions within which grids are independently generated, is a practical alternative for treating complicated geometries. This technique not only alleviates the problems of discretizing a complex region, but also facilitates a block processing approach to computation thereby circumventing computer memory limitations. The use of such a zonal scheme, however, requires the development of an interfacing procedure that ensures a stable, accurate, and conservative calculation for the transfer of information across the zonal borders.
ERIC Educational Resources Information Center
D'Andrea, Michael; Daniels, Judy
2007-01-01
The authors describe social justice advocacy interventions to initiate difficult discussions at the university where they are employed. They emphasize the need to foster difficult dialogues about the problem of institutional racism among students, faculty members, and administrators where they work. The Privileged Identity Exploration (PIE) model…
Location-allocation models and new solution methodologies in telecommunication networks
NASA Astrophysics Data System (ADS)
Dinu, S.; Ciucur, V.
2016-08-01
When designing a telecommunications network topology, three types of interdependent decisions are combined: location, allocation and routing, which are expressed by the following design considerations: how many interconnection devices (consolidation points/concentrators) should be used and where should they be located; how to allocate terminal nodes to concentrators; and how should the voice, video or data traffic be routed and what transmission links (capacitated or not) should be built into the network. Including these three components of the decision in a single model generates a problem whose complexity makes it difficult to solve. A first method to address the overall problem is the sequential one, whereby the first step deals with the location-allocation problem and, based on this solution, the subsequent sub-problem (routing the network traffic) is solved. The location and allocation problem in a telecommunications network, called the capacitated concentrator location-allocation (CCLA) problem, is based on one of the general location models on a network in which the clients/demand nodes are the terminals and the facilities are the concentrators. As in a location model, each client node has a traffic demand that must be served, and the facilities can serve these demands within their capacity limit. In this study, the CCLA problem is modeled as a single-source capacitated location-allocation model whose optimization objective is to determine the minimum network cost, consisting of fixed costs for establishing the locations of concentrators, costs for operating concentrators, and costs for allocating terminals to concentrators. The problem is known to be a difficult combinatorial optimization problem for which powerful algorithms are required. Our approach proposes a Fuzzy Genetic Algorithm combined with a local search procedure to calculate the optimal values of the location and allocation variables.
To confirm the efficiency of the proposed algorithm with respect to the quality of solutions, test problems of significant size were considered: up to 100 terminal nodes and 50 concentrators on a 100 × 100 square grid. The performance of this hybrid intelligent algorithm was evaluated by measuring the quality of its solutions with respect to the following statistics: the standard deviation and the ratio of the best solution obtained.
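For a sense of scale, a naive greedy baseline for the single-source capacitated assignment subproblem can be sketched as follows. Coordinates, capacities, and the fixed cost are invented; the paper's fuzzy genetic algorithm with local search searches over concentrator openings and assignments jointly rather than greedily.

```python
import math
import random

random.seed(3)

# Toy single-source capacitated concentrator instance.
n_term, n_conc, capacity, fixed_cost = 30, 5, 8, 50.0
terms = [(random.random() * 100, random.random() * 100) for _ in range(n_term)]
concs = [(random.random() * 100, random.random() * 100) for _ in range(n_conc)]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Greedy: assign each terminal to the nearest concentrator with spare capacity
# (single-source: every terminal is served by exactly one concentrator).
load = [0] * n_conc
assign = []
for t in terms:
    candidates = [j for j in range(n_conc) if load[j] < capacity]
    j = min(candidates, key=lambda j: dist(t, concs[j]))
    load[j] += 1
    assign.append(j)

opened = {j for j in assign}
total = fixed_cost * len(opened) + sum(dist(terms[i], concs[assign[i]])
                                       for i in range(n_term))
```

The greedy cost gives a feasible upper bound; metaheuristics like the paper's hybrid improve on it by trading off allocation distances against the fixed cost of opening fewer concentrators.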
Binning and filtering: the six-color solution
NASA Astrophysics Data System (ADS)
Ashdown, Ian; Robinson, Shane; Salsbury, Marc
2006-08-01
The use of LED backlighting for LCD displays requires careful binning of red, green, and blue LEDs by dominant wavelength to maintain the color gamuts specified by the NTSC, SMPTE, and EBU/ITU standards. This problem also occurs to a lesser extent with RGB and RGBA assemblies for solid-state lighting, where color gamut consistency is required for color-changing luminaires. In this paper, we propose a "six-color solution," based on Grassmann's laws, that does not require color binning, but nevertheless guarantees a fixed color gamut that subsumes the color gamuts of carefully binned RGB assemblies. A further advantage of this solution is that it solves the problem of peak wavelength shifts with varying junction temperatures. The color gamut can thus remain fixed over the full range of LED intensities and ambient temperatures. A related problem occurs with integrated circuit (IC) colorimeters used for optical feedback with LED backlighting and RGB(A) solid-state lighting, wherein it can be difficult to distinguish between peak wavelength shifts and changes in LED intensity. We apply our six-color solution to the design of a novel colorimeter for LEDs that independently measures changes in peak wavelength and intensity. The design is compatible with current manufacturing techniques for tristimulus colorimeter ICs. Together, the six-color solution for LEDs and colorimeters enables less expensive LED backlighting and solid-state lighting systems with improved color stability.
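Grassmann's additivity, on which the six-color solution rests, can be illustrated with a toy additive-mixing computation. The tristimulus values below are invented for illustration, not measured LED data:

```python
# Illustrative CIE tristimulus values (X, Y, Z) for three hypothetical LED
# primaries; real values depend on the measured spectra of the actual devices.
leds = {
    "red":   (0.49, 0.26, 0.00),
    "green": (0.18, 0.60, 0.08),
    "blue":  (0.14, 0.07, 0.75),
}

def mix(weights):
    """Grassmann's laws: the tristimulus values of a mixture are the sums of
    the components' tristimulus values, scaled by their drive levels."""
    X = sum(w * leds[name][0] for name, w in weights.items())
    Y = sum(w * leds[name][1] for name, w in weights.items())
    Z = sum(w * leds[name][2] for name, w in weights.items())
    s = X + Y + Z
    return X / s, Y / s        # CIE 1931 chromaticity coordinates (x, y)

x, y = mix({"red": 1.0, "green": 1.0, "blue": 1.0})   # x ~ 0.315, y ~ 0.362
```

Because the mixture's chromaticity follows linearly from the components, a controller can hold a target gamut fixed by adjusting drive levels rather than by binning devices.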
Whalen, Madeleine; Maliszewski, Barbara; Sheinfeld, Rebecca; Gardner, Heather; Baptiste, Diana
2018-04-25
Difficult venous access is a common problem in health care, especially in the emergency setting, which relies on quick diagnostics to differentiate patient acuities and administer critical medications. The creation of a dedicated team to address difficult venous access (DVA) is a possible solution to the problem of delayed venous access, yet no studies have been published on implementing such a team in the emergency department. This was a quasi-experimental study in an urban emergency department. Researchers performed chart audits of staff-identified patients with DVA to gather baseline data. A DVA team was subsequently implemented 16 hours a day, 7 days a week. Data were recorded on patients referred to the team and included time, number of IV attempts, and patient characteristics. Baseline data were collected on 53 patients, and postintervention data included 135 patients. The implementation of a DVA team decreased the mean lab order-to-lab completion time by 115 minutes (P < 0.0001). Decreases in the number of attempts were not statistically significant. Patients requiring increased numbers of IV attempts also shared many common characteristics, including a history of multiple attempts, poor skin quality, and IV drug use. The use of a dedicated team for DVA reduces the lag time from physician orders to actionable diagnostics or administration of medication. A dedicated DVA technician is a concrete solution to threats to patient safety, as well as ED crowding, and has the potential to affect both patient- and department-level care. Copyright © 2018 Elsevier Inc. All rights reserved.
Attitudes and beliefs of emergency department staff regarding alcohol-related presentations.
Indig, Devon; Copeland, Jan; Conigrave, Katherine M; Rotenko, Irene
2009-01-01
This study examined emergency department (ED) staff attitudes and beliefs about alcohol-related ED presentations in order to recommend improved detection and brief intervention strategies. The survey was conducted at two inner-Sydney hospital EDs in 2006 to explore ED clinical staff's attitudes, current practice and barriers for managing alcohol-related ED presentations. The sample included N=78 ED staff (54% nurses, 46% doctors), representing a 30% response rate. Management of alcohol-related problems was not routine among ED staff, with only 5% usually formally screening for alcohol problems, only 16% usually conducting brief interventions, and only 27% usually providing a referral to specialist treatment services. Over 85% of ED staff indicated that lack of patient motivation made providing alcohol interventions very difficult. Significant predictors of good self-reported practice among ED staff for patients with alcohol problems included: being a doctor, being confident and having a sense of responsibility towards managing patients with alcohol-related problems. This study reported that many staff lack the confidence or sense of clinical responsibility to fully and appropriately manage ED patients with alcohol-related problems. ED staff appear to require additional training, resources and support to enhance their management of patients with alcohol-related problems.
Enhanced Communication Network Solution for Positive Train Control Implementation
NASA Technical Reports Server (NTRS)
Fatehi, M. T.; Simon, J.; Chang, W.; Chow, E. T.; Burleigh, S. C.
2011-01-01
The commuter and freight railroad industry is required to implement Positive Train Control (PTC) by 2015 (2012 for Metrolink), a challenging network communications problem. This paper discusses technologies developed by the National Aeronautics and Space Administration (NASA) to overcome comparable communication challenges encountered in deep space mission operations. PTC will be based on a new cellular wireless packet Internet Protocol (IP) network. However, ensuring reliability in such a network is difficult due to the "dead zones" and transient disruptions we commonly experience when we lose calls in commercial cellular networks. These disruptions make it difficult to meet PTC's stringent reliability (99.999%) and safety requirements, deployment deadlines, and budget. This paper proposes innovative solutions based on space-proven technologies that would help meet these challenges: (1) Delay Tolerant Networking (DTN) technology, designed for use in resource-constrained, embedded systems and currently in use on the International Space Station, enables reliable communication over networks in which timely data acknowledgments might not be possible due to transient link outages. (2) Policy-Based Management (PBM) provides dynamic management capabilities, allowing vital data to be exchanged selectively (with priority) by utilizing alternative communication resources. The resulting network may help railroads implement PTC faster, cheaper, and more reliably.
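The store-and-forward idea behind DTN can be sketched in a few lines. This toy node (names and structure are illustrative, not the actual DTN software flown on the ISS) simply holds bundles during a link outage instead of dropping them:

```python
# A toy store-and-forward sketch of the DTN idea: data "bundles" are retained
# when the link is down and forwarded when connectivity returns, so transient
# "dead zones" cause delay rather than data loss.
class DtnNode:
    def __init__(self):
        self.stored = []                  # bundles awaiting a contact

    def send(self, bundle, link_up):
        """Deliver immediately if possible; otherwise persist the bundle."""
        if link_up:
            return [bundle]               # delivered now
        self.stored.append(bundle)        # held across the outage
        return []

    def on_link_restored(self):
        """Forward everything held during the outage."""
        out, self.stored = self.stored, []
        return out
```

In a real deployment the stored bundles would survive reboots and be forwarded hop by hop with acknowledgments; the point here is only that reliability does not depend on an end-to-end path existing at send time.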
Prediction of anthropometric accommodation in aircraft cockpits
NASA Astrophysics Data System (ADS)
Zehner, Gregory Franklin
Designing aircraft cockpits to accommodate the wide range of body sizes existing in the U.S. population has always been a difficult problem for Crewstation Engineers. The approach taken in the design of military aircraft has been to restrict the range of body sizes allowed into flight training, and then to develop standards and specifications to ensure that the majority of the pilots are accommodated. Accommodation in this instance is defined as the ability to: (1) Adequately see, reach, and actuate controls; (2) Have external visual fields so that the pilot can see to land, clear for other aircraft, and perform a wide variety of missions (ground support/attack or air to air combat); and (3) Finally, if problems arise, the pilot has to be able to escape safely. Each of these areas is directly affected by the body size of the pilot. Unfortunately, accommodation problems persist and may get worse. Currently the USAF is considering relaxing body size entrance requirements so that smaller and larger people could become pilots. This will make existing accommodation problems much worse. This dissertation describes a methodology for correcting this problem and demonstrates the method by predicting pilot fit and performance in the USAF T-38A aircraft based on anthropometric data. The methods described can be applied to a variety of design applications where fitting the human operator into a system is a major concern. A systematic approach is described which includes: defining the user population, setting functional requirements that operators must be able to perform, testing the ability of the user population to perform the functional requirements, and developing predictive equations for selecting future users of the system. Also described is a process for the development of new anthropometric design criteria and cockpit design methods that assure body size accommodation is improved in the future.
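A predictive accommodation equation of the kind described can be sketched as a linear score checked against a functional requirement. The measurements, coefficients, and threshold below are hypothetical placeholders, not the dissertation's fitted T-38A equations:

```python
# A minimal sketch of a body-size accommodation screen: a linear predictive
# equation over anthropometric measurements, thresholded at a functional
# requirement. Coefficients and the threshold are invented for illustration;
# real equations would be fit to fit-test data from the user population.
def reach_score(sitting_height_cm, arm_reach_cm):
    """Hypothetical linear combination of anthropometric predictors."""
    return 0.6 * arm_reach_cm - 0.2 * sitting_height_cm

def accommodated(sitting_height_cm, arm_reach_cm, threshold=30.0):
    """True if the predicted score meets the functional reach requirement."""
    return reach_score(sitting_height_cm, arm_reach_cm) >= threshold
```

The methodology in the dissertation builds such equations per functional requirement (vision, reach, escape clearance) and uses them to predict fit for future pilot candidates.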
Joint Geophysical Inversion With Multi-Objective Global Optimization Methods
NASA Astrophysics Data System (ADS)
Lelievre, P. G.; Bijani, R.; Farquharson, C. G.
2015-12-01
Pareto multi-objective global optimization (PMOGO) methods generate a suite of solutions that minimize multiple objectives (e.g. data misfits and regularization terms) in a Pareto-optimal sense. Providing a suite of models, as opposed to a single model that minimizes a weighted sum of objectives, allows a more complete assessment of the possibilities and avoids the often difficult choice of how to weight each objective. We are applying PMOGO methods to three classes of inverse problems. The first class are standard mesh-based problems where the physical property values in each cell are treated as continuous variables. The second class of problems are also mesh-based but cells can only take discrete physical property values corresponding to known or assumed rock units. In the third class we consider a fundamentally different type of inversion in which a model comprises wireframe surfaces representing contacts between rock units; the physical properties of each rock unit remain fixed while the inversion controls the position of the contact surfaces via control nodes. This third class of problem is essentially a geometry inversion, which can be used to recover the unknown geometry of a target body or to investigate the viability of a proposed Earth model. Joint inversion is greatly simplified for the latter two problem classes because no additional mathematical coupling measure is required in the objective function. PMOGO methods can solve numerically complicated problems that could not be solved with standard descent-based local minimization methods. This includes the latter two classes of problems mentioned above. There are significant increases in the computational requirements when PMOGO methods are used but these can be ameliorated using parallelization and problem dimension reduction strategies.
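The Pareto-optimal notion underlying PMOGO can be made concrete with a non-dominated filter over candidate objective vectors, e.g. (data misfit, regularization) pairs. Minimization is assumed and the example points are illustrative:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of candidate objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Four candidate models scored on (data misfit, regularization term):
models = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (3.0, 3.0)]
front = pareto_front(models)   # (3.0, 3.0) is dominated by (2.0, 3.0)
```

A PMOGO method returns such a front rather than a single weighted-sum minimizer, letting the analyst inspect the misfit-regularization trade-off directly.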
Holden, Richard J; Rivera-Rodriguez, A Joy; Faye, Héléne; Scanlon, Matthew C; Karsh, Ben-Tzion
2013-08-01
The most common change facing nurses today is new technology, particularly bar coded medication administration technology (BCMA). However, there is a dearth of knowledge on how BCMA alters nursing work. This study investigated how BCMA technology affected nursing work, particularly nurses' operational problem-solving behavior. Cognitive systems engineering observations and interviews were conducted after the implementation of BCMA in three nursing units of a freestanding pediatric hospital. Problem-solving behavior, associated problems, and goals, were specifically defined and extracted from observed episodes of care. Three broad themes regarding BCMA's impact on problem solving were identified. First, BCMA allowed nurses to invent new problem-solving behavior to deal with pre-existing problems. Second, BCMA made it difficult or impossible to apply some problem-solving behaviors that were commonly used pre-BCMA, often requiring nurses to use potentially risky workarounds to achieve their goals. Third, BCMA created new problems that nurses were either able to solve using familiar or novel problem-solving behaviors, or unable to solve effectively. Results from this study shed light on hidden hazards and suggest three critical design needs: (1) ecologically valid design; (2) anticipatory control; and (3) basic usability. Principled studies of the actual nature of clinicians' work, including problem solving, are necessary to uncover hidden hazards and to inform health information technology design and redesign.
Putting the Aero Back into Aeroelasticity
NASA Technical Reports Server (NTRS)
Bousman, William G.
2000-01-01
The lack of progress in understanding the physics of rotorcraft loads and vibration over the last 30 years is addressed in this paper. As befits this extraordinarily difficult problem, the reasons for the lack of progress are complicated and difficult to ascertain. It is proposed here that the difficulty lies within at least three areas: 1) a loss of perspective as to what are the key factors in rotor loads and vibration, 2) the overlooking of serious unsolved problems in the field, and 3) cultural barriers that impede progress. Some criteria are suggested for future research to provide a more concentrated focus on the problem.
[Are the flight security measures good for the patients? The "sickurity" problem].
Felkai, Péter
2010-10-10
As security requirements at airports have stiffened, prevention of air-travel-related illnesses has become more difficult. The backlash effects of restrictions (e.g., fluid and movement restrictions) can trigger or even aggravate pathophysiological processes. The most advanced security check method, the full-body scan, may, besides raising ethical and moral considerations, induce as yet unknown pathological processes. A similar problem is faced by the traveller who becomes ill or injured during a trip. In this case, repatriation is often required, which is usually accomplished by commercial airlines. If the patient must be transported on a stretcher, this is also available on regular flights, but he or she must then be accompanied by a medical professional. This solution raises even more security problems: not only the sick person and the medical team, but also their medical equipment and medicines, have to be checked. Due to the lack of standardized regulations, security staff handle the problem with various approaches, ranging from empathy to refusal. For these reasons, a clear and exact regulation is needed, which must be based upon medical expert opinion and should address not only flight security but the patient's security as well. Such a regulation could end the defencelessness of patients and their accompanying medical personnel against local authorities and security services. The same is true for handicapped persons. The author suggests solutions for this problem, balancing flight security and the patient's "sickurity".
Duarte, Belmiro P.M.; Wong, Weng Kee; Atkinson, Anthony C.
2016-01-01
T-optimum designs for model discrimination are notoriously difficult to find because of the computational difficulty involved in solving an optimization problem that involves two layers of optimization. Only a handful of analytical T-optimal designs are available for the simplest problems; the rest in the literature are found using specialized numerical procedures for a specific problem. We propose a potentially more systematic and general way for finding T-optimal designs using a Semi-Infinite Programming (SIP) approach. The strategy requires that we first reformulate the original minimax or maximin optimization problem into an equivalent semi-infinite program and solve it using an exchange-based method where lower and upper bounds produced by solving the outer and the inner programs, are iterated to convergence. A global Nonlinear Programming (NLP) solver is used to handle the subproblems, thus finding the optimal design and the least favorable parametric configuration that minimizes the residual sum of squares from the alternative or test models. We also use a nonlinear program to check the global optimality of the SIP-generated design and automate the construction of globally optimal designs. The algorithm is successfully used to produce results that coincide with several T-optimal designs reported in the literature for various types of model discrimination problems with normally distributed errors. However, our method is more general, merely requiring that the parameters of the model be estimated by a numerical optimization. PMID:27330230
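The inner layer of the two-layer optimization can be sketched for a fixed design: minimize the weighted lack-of-fit sum of squares over the rival model's parameters. A coarse grid search stands in for the global NLP solver used in the paper, and both models and the design are invented for illustration:

```python
# Inner problem of T-optimal design, sketched: for a fixed design (support
# points with weights), find the rival model's parameters minimizing the
# weighted residual sum of squares against the assumed true model.

def true_model(x):
    return 1.0 + 2.0 * x + 0.5 * x ** 2       # "true" model, assumed known

def rival_model(x, a, b):
    return a + b * x                           # rival (test) model

design = [(-1.0, 1 / 3), (0.0, 1 / 3), (1.0, 1 / 3)]   # (point, weight) pairs

def t_criterion(design):
    """Min over rival parameters of the weighted lack-of-fit sum of squares;
    a coarse grid search replaces the global NLP solver of the paper."""
    best = float("inf")
    grid = [i / 10 for i in range(-50, 51)]
    for a in grid:
        for b in grid:
            rss = sum(w * (true_model(x) - rival_model(x, a, b)) ** 2
                      for x, w in design)
            best = min(best, rss)
    return best
```

The outer layer of the SIP formulation would then adjust the design points and weights to maximize this criterion, i.e. to make the best rival fit as poor as possible.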
VOP memory management in MPEG-4
NASA Astrophysics Data System (ADS)
Vaithianathan, Karthikeyan; Panchanathan, Sethuraman
2001-03-01
MPEG-4 is a multimedia standard that requires Video Object Planes (VOPs). Generation of VOPs for an arbitrary video sequence is still a challenging problem that largely remains unsolved. Nevertheless, if the problem is treated under certain constraints, solutions for specific application domains can be found. MPEG-4 applications in mobile devices are one such domain, where the opposing goals of low power and high throughput must both be met. Efficient memory management plays a major role in reducing power consumption. Specifically, efficient memory management for VOPs is difficult because the lifetimes of these objects vary and may overlap. Varying object lifetimes require dynamic memory management, where memory fragmentation is a key problem that needs to be addressed. In general, memory management systems address this problem through a combination of strategy, policy, and mechanism. For MPEG-4-based mobile devices that lack instruction processors, a hardware-based memory management solution is necessary. In MPEG-4-based mobile devices that have a RISC processor, using a real-time operating system (RTOS) for this memory management task is not expected to be efficient, because the strategies and policies used by the RTOS are often tuned for handling memory segments of smaller sizes than the object sizes. Hence, a memory management scheme specifically tuned for VOPs is important. In this paper, different strategies, policies, and mechanisms for memory management are considered, and an efficient combination is proposed for VOP memory management, along with a hardware architecture that can handle the proposed combination.
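The fragmentation problem that motivates a VOP-specific allocator can be illustrated with a toy first-fit free-list allocator with coalescing. The strategy and policy here are generic illustrations, not the combination proposed in the paper:

```python
# Toy dynamic allocator for VOP buffers: first-fit allocation from a free
# list, with coalescing of adjacent free blocks on release. It demonstrates
# external fragmentation: enough total free space, but no single fit.
class VopPool:
    def __init__(self, size):
        self.free = [(0, size)]                 # (offset, length) free blocks

    def alloc(self, n):
        for i, (off, length) in enumerate(self.free):
            if length >= n:                     # first fit
                if length == n:
                    del self.free[i]
                else:
                    self.free[i] = (off + n, length - n)
                return off
        return None                             # fragmentation: no block fits

    def release(self, off, n):
        self.free.append((off, n))
        self.free.sort()
        merged = []
        for off, length in self.free:           # coalesce adjacent blocks
            if merged and merged[-1][0] + merged[-1][1] == off:
                merged[-1] = (merged[-1][0], merged[-1][1] + length)
            else:
                merged.append((off, length))
        self.free = merged

pool = VopPool(100)
a = pool.alloc(40)        # offset 0
b = pool.alloc(30)        # offset 40
pool.release(0, 40)
stuck = pool.alloc(50)    # None: 70 units free in total, but split 40 + 30
```

A VOP-tuned scheme would exploit knowledge of object sizes and overlapping lifetimes to choose where and when to place buffers, avoiding exactly this situation.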
Nurse Scheduling by Cooperative GA with Effective Mutation Operator
NASA Astrophysics Data System (ADS)
Ohki, Makoto
In this paper, we propose effective mutation operators for a Cooperative Genetic Algorithm (CGA) applied to a practical Nurse Scheduling Problem (NSP). Nurse scheduling is a very difficult task, because the NSP is a complex combinatorial optimization problem in which many requirements must be considered. In real hospitals, the schedule changes frequently. Changes to the shift schedule yield various problems, for example a fall in the nursing level. We describe a technique for reoptimizing the nurse schedule in response to a change. The conventional CGA has a superior ability for local search by means of its crossover operator, but often stagnates in an unfavorable situation because its ability for global search is inferior. When the optimization stagnates for many generation cycles, the search point (the population, in this case) is likely caught in a wide local minimum area. To escape such a local minimum area, a small change in the population is required. Based on this consideration, we propose a mutation operator activated depending on the optimization speed. When the optimization stagnates, in other words when the optimization speed decreases, the mutation yields small changes in the population. The population is then able to escape from a local minimum area by means of the mutation. However, this mutation operator requires two well-defined parameters, which means the user has to choose their values carefully. To solve this problem, we propose a periodic mutation operator defined by only one parameter. This simplified mutation operator is effective over a wide range of the parameter value.
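The stagnation-triggered and periodic mutation rates described can be sketched as follows. The thresholds, rates, and period are illustrative placeholders, not the paper's tuned values:

```python
# Two mutation-rate schedules for a GA, sketched with invented constants.
def adaptive_rate(recent_improvement, base=0.001, boost=0.05):
    """Speed-dependent variant (two parameters): keep the rate low while the
    fitness is still improving, boost it when the search stalls."""
    return base if recent_improvement > 1e-6 else boost

def periodic_rate(generation, period=50, boost=0.05):
    """Periodic variant (one parameter): apply the boosted rate once per
    period, regardless of measured optimization speed."""
    return boost if generation % period == 0 else 0.0
```

The periodic variant trades sensitivity to the actual stagnation for ease of tuning, which mirrors the paper's motivation for reducing two parameters to one.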
Computational complexity of ecological and evolutionary spatial dynamics
Ibsen-Jensen, Rasmus; Chatterjee, Krishnendu; Nowak, Martin A.
2015-01-01
There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. There are classes of harder problems for which the fastest possible algorithm requires exponential time. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. Precise complexity results can be notoriously difficult. The famous question whether polynomial time equals nondeterministic polynomial time (i.e., P = NP) is one of the hardest open problems in computer science and all of mathematics. Here, we consider simple processes of ecological and evolutionary spatial dynamics. The basic question is: What is the probability that a new invader (or a new mutant) will take over a resident population? We derive precise complexity results for a variety of scenarios. We therefore show that some fundamental questions in this area cannot be answered by simple equations (assuming that P is not equal to NP). PMID:26644569
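As a non-spatial baseline for the invasion question posed above, the fixation probability of a single mutant with relative fitness r in the well-mixed Moran process has a closed form (the spatial variants studied in the paper are precisely the cases where no such simple equation is expected):

```python
def fixation_probability(r, n):
    """Probability that a single mutant with relative fitness r takes over a
    well-mixed resident population of size n (standard Moran process)."""
    if r == 1.0:
        return 1.0 / n                        # neutral mutant
    return (1 - 1 / r) / (1 - 1 / r ** n)

p_neutral = fixation_probability(1.0, 100)    # 0.01
p_fit = fixation_probability(2.0, 10)         # ~ 0.5005
```

An advantageous mutant (r > 1) fixes with probability approaching 1 - 1/r in large populations, while a neutral one fixes with probability 1/n.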
Promoting the Multidimensional Character of Scientific Reasoning †
Bradshaw, William S.; Nelson, Jennifer; Adams, Byron J.; Bell, John D.
2017-01-01
This study reports part of a long-term program to help students improve scientific reasoning using higher-order cognitive tasks set in the discipline of cell biology. This skill was assessed using problems requiring the construction of valid conclusions drawn from authentic research data. We report here efforts to confirm the hypothesis that data interpretation is a complex, multifaceted exercise. Confirmation was obtained using a statistical treatment showing that various such problems rank students differently—each contains a unique set of cognitive challenges. Additional analyses of performance results have allowed us to demonstrate that individuals differ in their capacity to navigate five independent generic elements that constitute successful data interpretation: biological context, connection to course concepts, experimental protocols, data inference, and integration of isolated experimental observations into a coherent model. We offer these aspects of scientific thinking as a “data analysis skills inventory,” along with usable sample problems that illustrate each element. Additionally, we show that this kind of reasoning is rigorous in that it is difficult for most novice students, who are unable to intuitively implement strategies for improving these skills. Instructors armed with knowledge of the specific challenges presented by different types of problems can provide specific helpful feedback during formative practice. The use of this instructional model is most likely to require changes in traditional classroom instruction. PMID:28512524
NASA Astrophysics Data System (ADS)
Magee, Daniel J.; Niemeyer, Kyle E.
2018-03-01
The expedient design of precision components in aerospace and other high-tech industries requires simulations of physical phenomena often described by partial differential equations (PDEs) without exact solutions. Modern design problems require simulations with a level of resolution that is difficult to achieve in reasonable amounts of time, even in effectively parallelized solvers. Though the scale of the problem relative to available computing power is the greatest impediment to accelerating these applications, significant performance gains can be achieved through careful attention to the details of memory communication and access. The swept time-space decomposition rule reduces communication between sub-domains by exhausting the domain of influence before communicating boundary values. Here we present a GPU implementation of the swept rule, which modifies the algorithm for improved performance on this processing architecture by prioritizing use of private (shared) memory, avoiding interblock communication, and overwriting unnecessary values. It shows significant improvement in the execution time of finite-difference solvers for one-dimensional unsteady PDEs, producing speedups of 2-9× for a range of problem sizes compared with simple GPU versions, and 7-300× compared with parallel CPU versions. However, for a more sophisticated one-dimensional system of equations discretized with a second-order finite-volume scheme, the swept rule performs 1.2-1.9× worse than a standard implementation for all problem sizes.
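The kind of explicit finite-difference update the swept rule accelerates can be sketched as a single FTCS step for the 1D heat equation (a generic example, not one of the paper's solvers). The swept rule's contribution is to advance such interior updates several steps per sub-domain, exhausting the domain of influence, before exchanging boundary values:

```python
# One explicit FTCS (forward-time, central-space) step for the 1D heat
# equation with fixed (Dirichlet) endpoints. Each interior point depends only
# on its immediate neighbours, which is what lets a sub-domain of width w be
# advanced roughly w/2 steps before needing any neighbour data.
def ftcs_step(u, alpha=0.25):
    """Advance one time step; alpha = k*dt/dx^2 must be <= 0.5 for stability."""
    return [u[0]] + [
        u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

u = ftcs_step([0.0, 0.0, 1.0, 0.0, 0.0])   # heat spike diffuses outward
```

A standard decomposed solver would exchange halo values after every such step; the swept rule instead computes the full space-time triangle reachable from locally available data, cutting the number of communication events.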
Can Joined-Up Data Lead to Joined-Up Thinking? The Western Australian Developmental Pathways Project
Stanley, Fiona; Glauert, Rebecca; McKenzie, Anne; O'Donnell, Melissa
2011-01-01
Modern societies are challenged by “wicked problems” – by definition, those that are difficult to define, multi-causal and hard to treat. Problems such as low birth weight, obesity, mental ill health, teenage pregnancy, educational difficulties and juvenile crime fit this category. Given the complex nature of these problems, they require the best data in order to measure them, guide policy frameworks and evaluate whether the steps taken to address them are actually making a difference. What such problems really require are joined-up approaches to enable effective solutions. In this paper, we describe a unique initiative to encourage a more preventive, whole-of-government approach to these problems – the Developmental Pathways Project, which has enabled the linkage of a large number of de-identified administrative databases in order to explore the pathways into and out of the negative outcomes affecting our children and youth. This project has not only enabled the linkage of agency data, but also of agency personnel, in order to improve and promote cross-agency research, policy and preventive solutions. Through the use of these linkages we are attempting to shift the paradigm to encourage agencies to appreciate that these “wicked problems” demand a preventive approach, as well as the provision of effective services for those already affected. PMID:24933374
a Virtual Hub Brokering Approach for Integration of Historical and Modern Maps
NASA Astrophysics Data System (ADS)
Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.
2016-06-01
Geospatial data are more and more widespread today. Many different institutions, such as geographical institutes, public administrations, collaborative communities (e.g., OSM), and web companies, nowadays make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing, and web publication of historical maps have spread increasingly in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit wide utilization of the available geo-data. This paper describes actions performed to ensure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and historical GIS implementation, for Lombardy and for the city of Parma respectively, interoperability is made possible in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component (the virtual hub) based on a brokering framework, copes with the problems listed above and enables interoperability between different data sources.
Ellis, Sarah Lh
2018-05-01
Practical relevance: Crucial to successful treatment of problem behaviour and optimising the welfare of the individual cat is determining which underpinning emotion(s) are involved in the presentation of the behaviour. Feline emotions are not feelings per se, but motivational-emotional systems that are responsible for instinctual emotional arousal. Often different interventions are required to alleviate different negative emotional motivations. Clinical challenges: Identifying different emotional motivations and the arousal level associated with them solely from observations of behaviour and body language is a difficult task because, as with any species, the behavioural repertoire of the domestic cat is finite and the same behaviour may occur with the activation of different emotional systems. In addition, cats, like people, may experience more than one emotion at the same time or switch quickly between emotional motivations, and this further complicates identification. The behavioural assessment of pain is also notoriously difficult in cats. Evidence base: This review draws on the published literature where available and, where there is a paucity of research, on hypotheses derived from observations of professionals in the field. Global importance: Being able to recognise and assess feline emotional motivations in order to address problem behaviours and improve welfare is important for all veterinarians who see cats.
[Quality assurance in airway management: education and training for difficult airway management].
Kaminoh, Yoshiroh
2006-01-01
Respiratory problems are among the main causes of death or severe brain damage in the perioperative period. The three major factors in respiratory problems are esophageal intubation, inadequate ventilation, and the difficult airway. The widespread use of the pulse oximeter and capnograph has reduced the incidence of esophageal intubation and inadequate ventilation, but the difficult airway still accounts for a large portion of adverse events during anesthesia. The "Practice guidelines for management of the difficult airway" were proposed by the American Society of Anesthesiologists (ASA) in 1992 and 2002. Improvement of knowledge, technical skills, and cognitive skills is necessary for education and training in difficult airway management. "The practical seminar of difficult airway management (DAM practical seminar)" has been cosponsored by the Japanese Association of Medical Simulation (JAMS) at the 51st and 52nd annual meetings of the Japanese Society of Anesthesiologists and the 24th annual meeting of the Japanese Society for Clinical Anesthesia. The DAM practical seminar is composed of a lecture session on the ASA difficult airway algorithm, a hands-on training session for technical skills, and a scenario-based training session for cognitive skills. Ninety-six Japanese anesthesiologists have completed the DAM practical seminar in one year. "The DAM instructor course" should be prepared immediately so that the seminar can be organized more frequently.
[Presurgical orthodontics for facial asymmetry].
Labarrère, H
2003-03-01
As with the treatment of all facial deformities, orthodontic pre-surgical preparation for facial asymmetry should aim at correcting severe occlusal discrepancies, not solely on the basis of a narrow occlusal analysis but also in a way that will not disturb the proposed surgical protocol. In addition, facial asymmetries require specific adjustments that are difficult to derive and to apply because of the inherently atypical morphological orientation of both the alveolar and basal bony support. Three treated cases illustrate different solutions to problems posed by pathological torque: this torque must be considered with respect to proposed surgical changes, within the framework of their limitations and their possible contra-indications.
Apparatus for electroplating particles of small dimension
Yu, C.M.; Illige, J.D.
1980-09-19
The thickness, uniformity, and surface smoothness requirements for surface coatings of glass microspheres for use as targets for laser fusion research are critical. Because of their minute size, the microspheres are difficult to manipulate and control in electroplating systems. The electroplating apparatus of the present invention addresses these problems by providing a cathode cell having a cell chamber, a cathode, and an anode electrically isolated from each other and connected to an electrical power source. During the plating process, the cathode is controllably vibrated, together with pulsing of the solution, to maintain the particles in random free motion so as to attain the desired properties.
Optical polarization: plenty of room for surprise
NASA Astrophysics Data System (ADS)
Chenault, David
2012-10-01
Solutions to difficult problems in complex environments frequently require methodical approaches and detailed research plans. Fortunately, there is still room for serendipity in disciplines that are relatively mature and well understood. As a case in point, optical sensors that exploit polarized light have matured greatly and the discipline of optical polarization has grown substantially in scope and applications over the last ten years. In spite of this increased understanding, polarization signatures are frequently not well understood. A good example is polarization in the animal kingdom. The potential for polarimetric monitoring of moose populations, and other applications, will be discussed.
Computational Psychometrics for Modeling System Dynamics during Stressful Disasters.
Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe
2017-01-01
Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.
Xenon lighting adjusted to plant requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koefferlein, M.; Doehring, T.; Payer, H.D.
1994-12-31
The high luminous flux and spectral properties of xenon lamps would make them an ideal luminaire for plant lighting were it not for the excess IR radiation, which poses several problems for such an application: the required filter systems reduce the irradiance in spectral regions of particular importance for plant development. Most of the economic drawbacks of xenon lamps are related to the difficult handling of that excess IR energy. Furthermore, the temporal variation of the xenon output, which depends on the oscillations of the applied AC voltage, has to be considered for plant development. However, xenon lamps outperform other lighting systems with respect to spectral stability, immediate response, and maximum luminance. Therefore, despite considerable competition from other lighting techniques, xenon lamps provide a very useful tool for special purposes. In plant lighting, however, they seem to play a less important role, as other lamp and lighting developments can meet these particular requirements at lower cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haghighat, A.; Sjoden, G.E.; Wagner, J.C.
In the past 10 yr, the Penn State Transport Theory Group (PSTTG) has concentrated its efforts on developing accurate and efficient particle transport codes to address increasing needs for efficient and accurate simulation of nuclear systems. The PSTTG's efforts have primarily focused on shielding applications that are generally treated using multigroup, multidimensional, discrete ordinates (S{sub n}) deterministic and/or statistical Monte Carlo methods. The difficulty with the existing public codes is that they require significant (impractical) computation time for simulation of complex three-dimensional (3-D) problems. For the S{sub n} codes, the large memory requirements are handled through the use of scratch files (i.e., read-from and write-to-disk) that significantly increase the necessary execution time. Further, the lack of flexible features and/or utilities for preparing input and processing output makes these codes difficult to use. The Monte Carlo method becomes impractical because variance reduction (VR) methods have to be used, and normally determination of the necessary parameters for the VR methods is very difficult and time consuming for a complex 3-D problem. For the deterministic method, the authors have developed the 3-D parallel PENTRAN (Parallel Environment Neutral-particle TRANsport) code system that, in addition to a parallel 3-D S{sub n} solver, includes pre- and postprocessing utilities. PENTRAN provides for full phase-space decomposition, memory partitioning, and parallel input/output to provide the capability of solving large problems in a relatively short time. Besides having a modular parallel structure, PENTRAN has several unique new formulations and features that are necessary for achieving high parallel performance. For the Monte Carlo method, the major difficulty currently facing most users is the selection of an effective VR method and its associated parameters.
For complex problems, generally, this process is very time consuming and may be complicated due to the possibility of biasing the results. In an attempt to eliminate this problem, the authors have developed the A{sup 3}MCNP (automated adjoint accelerated MCNP) code that automatically prepares parameters for source and transport biasing within a weight-window VR approach based on the S{sub n} adjoint function. A{sup 3}MCNP prepares the necessary input files for performing multigroup, 3-D adjoint S{sub n} calculations using TORT.
Drilling Regolith: Why Is It So Difficult?
NASA Astrophysics Data System (ADS)
Schmitt, H. H.
2017-10-01
The Apollo rotary percussive drill system penetrated the lunar regolith with reasonable efficiency; however, extraction of the drill core stem proved to be very difficult on all three missions. Retractable drill stem flutes may solve this problem.
Everyday beliefs about sources of advice for the parents of difficult children.
Sonuga-Barke, E J; Thompson, M; Balding, J
1993-01-01
Parents were asked which sources of advice families with difficult children should seek. The results suggested a similar hierarchy of agencies for both boys and girls with emotional and behavioural problems. Teachers, doctors, child psychiatrists and health visitors all received strong positive ratings, books about children with problems received moderate positive ratings, religious leaders received the strongest negative ratings and grandparents and friends received neutral ratings. Implications for service provision are discussed.
ERIC Educational Resources Information Center
Silberman, Mel
Written for parents, this book discusses four steps for dealing with children's difficult behavior. The book is divided into two parts. Part 1, "The Building Blocks," discusses baseline perspectives parents need to establish in order to effectively deal with difficult behavior. Topics covered include: (1) parents' dual roles as caregivers and…
IESIP - AN IMPROVED EXPLORATORY SEARCH TECHNIQUE FOR PURE INTEGER LINEAR PROGRAMMING PROBLEMS
NASA Technical Reports Server (NTRS)
Fogle, F. R.
1994-01-01
IESIP, an Improved Exploratory Search Technique for Pure Integer Linear Programming Problems, addresses the problem of optimizing an objective function of one or more variables subject to a set of confining functions or constraints by a method called discrete optimization or integer programming. Integer programming is based on a specific form of the general linear programming problem in which all variables in the objective function and all variables in the constraints are integers. While more difficult, integer programming is required for accuracy when modeling systems with small numbers of components such as the distribution of goods, machine scheduling, and production scheduling. IESIP establishes a new methodology for solving pure integer programming problems by utilizing a modified version of the univariate exploratory move developed by Robert Hooke and T.A. Jeeves. IESIP also takes some of its technique from the greedy procedure and the idea of unit neighborhoods. A rounding scheme uses the continuous solution found by traditional methods (simplex or other suitable technique) and creates a feasible integer starting point. The Hooke and Jeeves exploratory search is modified to accommodate integers and constraints and is then employed to determine an optimal integer solution from the feasible starting solution. The user-friendly IESIP allows for rapid solution of problems up to 10 variables in size (limited by DOS allocation). Sample problems compare IESIP solutions with the traditional branch-and-bound approach. IESIP is written in Borland's TURBO Pascal for IBM PC series computers and compatibles running DOS. Source code and an executable are provided. The main memory requirement for execution is 25K. This program is available on a 5.25 inch 360K MS DOS format diskette. IESIP was developed in 1990. IBM is a trademark of International Business Machines. TURBO Pascal is registered by Borland International.
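The modified exploratory move at the heart of this approach can be sketched in a few lines. This is a minimal illustration of an integer Hooke and Jeeves style search, not the IESIP source (which is in TURBO Pascal); the toy objective, the constraint, and the unit step size are invented for the example:

```python
def exploratory_move(f, x, feasible, step=1):
    """One integer Hooke-Jeeves exploratory move (minimization).

    Tries +/- step on each variable in turn, keeping any change that
    improves the objective while staying feasible."""
    x = list(x)
    best = f(x)
    for i in range(len(x)):
        for delta in (step, -step):
            trial = list(x)
            trial[i] += delta
            if feasible(trial) and f(trial) < best:
                x, best = trial, f(trial)
                break
    return x, best

def integer_search(f, x0, feasible, max_iter=100):
    """Repeat exploratory moves from a feasible integer start until no improvement."""
    x, best = list(x0), f(x0)
    for _ in range(max_iter):
        x_new, b_new = exploratory_move(f, x, feasible)
        if b_new >= best:
            return x, best
        x, best = x_new, b_new
    return x, best

# Toy problem: minimize (x-3)^2 + (y-5)^2 subject to x + y <= 7, x, y >= 0.
f = lambda v: (v[0] - 3) ** 2 + (v[1] - 5) ** 2
feasible = lambda v: v[0] + v[1] <= 7 and min(v) >= 0
print(integer_search(f, [0, 0], feasible))
```

In the full method the starting point would come from rounding a continuous simplex solution rather than being supplied by hand.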
Multipass Target Search in Natural Environments
Otte, Michael W.; Sofge, Donald; Gupta, Satyandra K.
2017-01-01
Consider a disaster scenario where search and rescue workers must search difficult-to-access buildings during an earthquake or flood. Often, finding survivors a few hours sooner results in a dramatic increase in saved lives, suggesting the use of drones for expedient rescue operations. Entropy can be used to quantify the generation and resolution of uncertainty. When searching for targets, maximizing the mutual information of future sensor observations will minimize expected target location uncertainty by minimizing the entropy of the future estimate. Motion planning for multi-target autonomous search requires planning over an area with an imperfect sensor and may require multiple passes, which is hindered by the submodularity property of mutual information. Further, mission duration constraints must be handled appropriately: the planner must consider the vehicle's dynamics to generate feasible trajectories and must plan trajectories spanning the entire mission duration, something most information-gathering algorithms are incapable of doing. If unanticipated changes occur in an uncertain environment, new plans must be generated quickly. In addition, planning multipass trajectories requires evaluating path-dependent rewards, requiring planning in the space of all previously selected actions, compounding the problem. We present an anytime algorithm for autonomous multipass target search in natural environments. The algorithm is capable of generating long-duration, dynamically feasible multipass coverage plans that maximize mutual information, using a variety of techniques such as ϵ-admissible heuristics to speed up the search. To the authors' knowledge this is the first attempt at efficiently solving multipass target search problems of such long duration.
The proposed algorithm is based on best-first branch and bound and is benchmarked against state-of-the-art algorithms adapted to the problem in natural environments, gathering the most information in the given search time. PMID:29099087
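The entropy and mutual-information criterion described above can be illustrated on a toy grid belief. This is a hedged sketch rather than the paper's planner: the sensor model (detection and false-alarm probabilities) and the four-cell prior are invented, and only a single myopic observation is scored rather than a full multipass trajectory:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def expected_posterior_entropy(belief, cell, p_detect=0.9, p_false=0.05):
    """Expected entropy of the target-location belief after observing one cell
    with an imperfect sensor (detection prob p_detect, false-alarm prob p_false)."""
    exp_h = 0.0
    for like_in, like_out in ((p_detect, p_false), (1 - p_detect, 1 - p_false)):
        # Unnormalized posterior for this observation outcome at `cell`.
        post = [b * (like_in if i == cell else like_out) for i, b in enumerate(belief)]
        norm = sum(post)  # probability of this observation outcome
        if norm > 0:
            exp_h += norm * entropy([q / norm for q in post])
    return exp_h

belief = [0.5, 0.3, 0.1, 0.1]          # invented prior over four cells
h0 = entropy(belief)
# Mutual information of observing cell c = prior entropy - expected posterior entropy.
gains = [h0 - expected_posterior_entropy(belief, c) for c in range(len(belief))]
best = max(range(len(belief)), key=gains.__getitem__)
print(best)  # index of the most informative cell to sense next
```

Submodularity shows up here as diminishing gains: once a cell has been observed, re-observing it yields less additional information, which is what complicates multipass planning.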
NASA Astrophysics Data System (ADS)
Aittokoski, Timo; Miettinen, Kaisa
2008-07-01
Solving real-life engineering problems can be difficult because they often have multiple conflicting objectives, the objective functions involved are highly nonlinear and they contain multiple local minima. Furthermore, function values are often produced via a time-consuming simulation process. These facts suggest the need for an automated optimization tool that is efficient (in terms of number of objective function evaluations) and capable of solving global and multiobjective optimization problems. In this article, the requirements on a general simulation-based optimization system are discussed and such a system is applied to optimize the performance of a two-stroke combustion engine. In the example of a simulation-based optimization problem, the dimensions and shape of the exhaust pipe of a two-stroke engine are altered, and values of three conflicting objective functions are optimized. These values are derived from power output characteristics of the engine. The optimization approach involves interactive multiobjective optimization and provides a convenient tool to balance between conflicting objectives and to find good solutions.
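A basic building block of any such multiobjective tool is identifying the nondominated (Pareto-optimal) designs among the evaluated candidates, which is the set the user balances between. A minimal sketch, not the article's interactive method; the design vectors are invented and all objectives are treated as minimized:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the nondominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical engine designs scored on three conflicting objectives, all minimized.
designs = [(3, 1, 4), (2, 2, 5), (4, 1, 4), (1, 3, 3), (3, 1, 5)]
print(pareto_front(designs))
```

In a simulation-based setting each objective vector would cost one expensive engine simulation, which is why the number of evaluations is the efficiency measure.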
NASA Astrophysics Data System (ADS)
Ushijima, T.; Yeh, W.
2013-12-01
An optimal experimental design algorithm is developed to select locations for a network of observation wells that provides the maximum information about unknown hydraulic conductivity in a confined, anisotropic aquifer. The design employs a maximal information criterion that chooses, among competing designs, the design that maximizes the sum of squared sensitivities while conforming to specified design constraints. Because the formulated problem is non-convex and contains integer variables (necessitating a combinatorial search), for a realistically scaled model the problem may be difficult, if not impossible, to solve through traditional mathematical programming techniques. Genetic Algorithms (GAs) are designed to search out the global optimum; however, because a GA requires a large number of calls to a groundwater model, the formulated optimization problem may still be infeasible to solve. To overcome this, Proper Orthogonal Decomposition (POD) is applied to the groundwater model to reduce its dimension. The information matrix in the full model space can then be searched without solving the full model.
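The maximal information criterion itself is simple to state in code: score a candidate well network by its sum of squared sensitivities and search the k-well subsets combinatorially. The sketch below uses an invented sensitivity matrix and exhaustive enumeration, which is exactly what becomes infeasible at realistic scale and motivates the GA and the POD model reduction:

```python
from itertools import combinations

# Hypothetical sensitivities: sens[w][p] = d(head at well w) / d(conductivity p).
sens = [
    [0.90, 0.10, 0.00],
    [0.20, 0.80, 0.10],
    [0.10, 0.10, 0.70],
    [0.50, 0.50, 0.40],
    [0.05, 0.05, 0.05],
]

def criterion(wells):
    """Sum of squared sensitivities over the selected wells (to be maximized)."""
    return sum(s * s for w in wells for s in sens[w])

def best_network(k):
    """Exhaustive combinatorial search over all k-well designs."""
    return max(combinations(range(len(sens)), k), key=criterion)

print(best_network(2))
```

In the real problem each sensitivity row requires groundwater-model solves, so the POD-reduced model stands in for the full model when filling this matrix.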
Serang, Oliver; MacCoss, Michael J.; Noble, William Stafford
2010-01-01
The problem of identifying proteins from a shotgun proteomics experiment has not been definitively solved. Identifying the proteins in a sample requires ranking them, ideally with interpretable scores. In particular, “degenerate” peptides, which map to multiple proteins, have made such a ranking difficult to compute. The problem of computing posterior probabilities for the proteins, which can be interpreted as confidence in a protein’s presence, has been especially daunting. Previous approaches have either ignored the peptide degeneracy problem completely, addressed it by computing a heuristic set of proteins or heuristic posterior probabilities, or by estimating the posterior probabilities with sampling methods. We present a probabilistic model for protein identification in tandem mass spectrometry that recognizes peptide degeneracy. We then introduce graph-transforming algorithms that facilitate efficient computation of protein probabilities, even for large data sets. We evaluate our identification procedure on five different well-characterized data sets and demonstrate our ability to efficiently compute high-quality protein posteriors. PMID:20712337
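The effect of degenerate peptides on protein posteriors can be illustrated by brute-force enumeration over protein presence sets. This is a toy model in a similar spirit (noisy-or peptide emission, with invented prior, emission, and noise parameters), not the paper's model or its graph-transforming algorithms, which exist precisely because this 2^n enumeration does not scale:

```python
from itertools import product

def protein_posteriors(proteins, pep_map, observed, prior=0.5, emit=0.9, noise=0.05):
    """Brute-force P(protein present | peptide observations) under a noisy-or
    model: peptide j is seen with prob 1-(1-emit)^k given k present proteins
    that contain it, or with prob `noise` if none of its parents are present."""
    post = {p: 0.0 for p in proteins}
    total = 0.0
    for bits in product((0, 1), repeat=len(proteins)):
        present = {p for p, b in zip(proteins, bits) if b}
        like = 1.0
        for pep, parents in pep_map.items():
            k = sum(1 for p in parents if p in present)
            p_seen = 1 - (1 - emit) ** k if k else noise
            like *= p_seen if pep in observed else 1 - p_seen
        n_on = sum(bits)
        weight = like * prior ** n_on * (1 - prior) ** (len(proteins) - n_on)
        total += weight
        for p in present:
            post[p] += weight
    return {p: v / total for p, v in post.items()}

# Degenerate peptide "e2" maps to both A and B; only A's unique peptide is seen.
pep_map = {"e1": ["A"], "e2": ["A", "B"], "e3": ["B"]}
post = protein_posteriors(["A", "B"], pep_map, observed={"e1", "e2"})
print(post)
```

Because the shared peptide "e2" is already explained by A, and B's unique peptide "e3" is absent, B receives a low posterior: the degenerate evidence is apportioned rather than double-counted.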
Perspectives on the geographic stability and mobility of people in cities
Hanson, Susan
2005-01-01
A class of questions in the human environment sciences focuses on the relationship between individual or household behavior and local geographic context. Central to these questions is the nature of people's geographic mobility as well as the duration of their locational stability at varying spatial and temporal scales. The problem for researchers is that the processes of mobility/stability are temporally and spatially dynamic and therefore difficult to measure. Whereas time and space are continuous, analysts must select levels of aggregation for both length of time in place and spatial scale of place that fit with the problem in question. Previous work has emphasized mobility and suppressed stability as an analytic category. I focus here on stability and show how analyzing individuals' stability requires also analyzing their mobility. Through an empirical example centered on the relationship between entrepreneurship and place, I demonstrate how a spotlight on stability illuminates a resolution to the measurement problem by highlighting the interdependence between the time and space dimensions of stability/mobility. PMID:16230616
Improved Quasi-Newton method via PSB update for solving systems of nonlinear equations
NASA Astrophysics Data System (ADS)
Mamat, Mustafa; Dauda, M. K.; Waziri, M. Y.; Ahmad, Fadhilah; Mohamad, Fatma Susilawati
2016-10-01
The Newton method has some shortcomings, which include computation of the Jacobian matrix, which may be difficult or even impossible to compute, and solving the Newton system at every iteration. Also, a common setback with some quasi-Newton methods is that they need to compute and store an n × n matrix at each iteration, which is computationally costly for large-scale problems. To overcome such drawbacks, an improved method for solving systems of nonlinear equations via the PSB (Powell-Symmetric-Broyden) update is proposed. In the proposed method, the approximate Jacobian inverse Hk of PSB is updated, improving its efficiency and requiring low memory storage, which is the main aim of this paper. Preliminary numerical results show that the proposed method is practically efficient when applied to some benchmark problems.
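The standard PSB rank-two update, which keeps the Jacobian approximation symmetric while satisfying the secant condition B_{k+1}s = y, can be sketched as follows. This is the generic textbook formulation updating B directly (the paper works with the approximate inverse Hk instead); the test system is invented:

```python
import numpy as np

def psb_solve(F, x0, tol=1e-10, max_iter=50):
    """Quasi-Newton iteration for F(x)=0 using the Powell-Symmetric-Broyden
    (PSB) update of the Jacobian approximation B (no analytic Jacobian)."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))                   # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)      # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        r = y - B @ s                    # residual of the secant condition
        ss = s @ s
        # PSB: symmetric rank-two correction giving B_new @ s = y exactly.
        B = B + (np.outer(r, s) + np.outer(s, r)) / ss \
              - (s @ r) / ss ** 2 * np.outer(s, s)
        x, Fx = x_new, F_new
    return x

# Toy system chosen so the identity is a reasonable starting approximation:
# x + 0.1*y^2 = 1,  y + 0.1*x^2 = 1.
F = lambda v: np.array([v[0] + 0.1 * v[1] ** 2 - 1.0,
                        v[1] + 0.1 * v[0] ** 2 - 1.0])
print(psb_solve(F, [0.0, 0.0]))  # converges to approximately [0.916 0.916]
```

Each update costs only outer products, and by construction B stays symmetric, which Broyden's plain update does not guarantee.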
Evaluating models of climate and forest vegetation
NASA Technical Reports Server (NTRS)
Clark, James S.
1992-01-01
Understanding how the biosphere may respond to increasing trace gas concentrations in the atmosphere requires models that contain vegetation responses to regional climate. Most of the processes ecologists study in forests, including trophic interactions, nutrient cycling, and disturbance regimes, and vital components of the world economy, such as forest products and agriculture, will be influenced in potentially unexpected ways by changing climate. These vegetation changes affect climate in the following ways: changing C, N, and S pools; trace gases; albedo; and water balance. The complexity of the indirect interactions among variables that depend on climate, together with the range of different space/time scales that best describe these processes, make the problems of modeling and prediction enormously difficult. These problems of predicting vegetation response to climate warming and potential ways of testing model predictions are the subjects of this chapter.
Genetic Parallel Programming: design and implementation.
Cheang, Sin Man; Leung, Kwong Sak; Lee, Kin Hong
2006-01-01
This paper presents a novel Genetic Parallel Programming (GPP) paradigm for evolving parallel programs running on a Multi-Arithmetic-Logic-Unit (Multi-ALU) Processor (MAP). The MAP is a Multiple Instruction-streams, Multiple Data-streams (MIMD), general-purpose register machine that can be implemented on modern Very Large-Scale Integrated Circuits (VLSIs) in order to evaluate genetic programs at high speed. For human programmers, writing parallel programs is more difficult than writing sequential programs. However, experimental results show that GPP evolves parallel programs with less computational effort than that of their sequential counterparts. It creates a new approach to evolving a feasible problem solution in parallel program form and then serializes it into a sequential program if required. The effectiveness and efficiency of GPP are investigated using a suite of 14 well-studied benchmark problems. Experimental results show that GPP speeds up evolution substantially.
An echelle spectrograph for middle ultraviolet solar spectroscopy from rockets.
Tousey, R; Purcell, J D; Garrett, D L
1967-03-01
An echelle grating spectrograph is ideal for use in a rocket when high resolution is required because it occupies a minimum of space. The instrument described covers the range 4000-2000 A with a resolution of 0.03 A. It was designed to fit into the solar biaxial pointing-control section of an Aerobee-150 rocket. The characteristics of the spectrograph are illustrated with laboratory spectra of iron and carbon arc sources and with solar spectra obtained during rocket flights in 1961 and 1964. Problems encountered in analyzing the spectra are discussed. The most difficult design problem was the elimination of stray light when the instrument is used with the sun. Of the several methods investigated, the most effective was a predispersing system in the form of a zero-dispersion double monochromator. This was made compact by folding the beam four times.
The pollution of the marine environment by plastic debris: a review.
Derraik, José G B
2002-09-01
The deleterious effects of plastic debris on the marine environment were reviewed by bringing together most of the literature published so far on the topic. A large number of marine species are known to be harmed and/or killed by plastic debris, which could jeopardize their survival, especially since many are already endangered by other forms of anthropogenic activity. Marine animals are mostly affected through entanglement in and ingestion of plastic litter. Other less known threats include the use of plastic debris by "invader" species and the absorption of polychlorinated biphenyls from ingested plastics. Less conspicuous forms, such as plastic pellets and "scrubbers", are also hazardous. To address the problem of plastic debris in the oceans is a difficult task, and a variety of approaches are urgently required. Some of the ways to mitigate the problem are discussed.
Performance Analysis and Portability of the PLUM Load Balancing System
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Biswas, Rupak; Gabow, Harold N.
1998-01-01
The ability to dynamically adapt an unstructured mesh is a powerful tool for solving computational problems with evolving physical features; however, an efficient parallel implementation is rather difficult. To address this problem, we have developed PLUM, an automatic portable framework for performing adaptive numerical computations in a message-passing environment. PLUM requires that all data be globally redistributed after each mesh adaption to achieve load balance. We present an algorithm for minimizing this remapping overhead by guaranteeing an optimal processor reassignment. We also show that the data redistribution cost can be significantly reduced by applying our heuristic processor reassignment algorithm to the default mapping of the parallel partitioner. Portability is examined by comparing performance on a SP2, an Origin2000, and a T3E. Results show that PLUM can be successfully ported to different platforms without any code modifications.
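The idea behind the heuristic processor reassignment can be sketched as a greedy matching between new partitions and the processors that already hold their data. This is an illustrative toy with invented element lists, counting overlap in elements rather than actual data volume; it is not the PLUM implementation or its optimal algorithm:

```python
def remap_partitions(old_owner, new_part, n_procs):
    """Greedy processor reassignment: relabel new partitions so that the amount
    of data already resident on each processor is (heuristically) maximized,
    reducing redistribution traffic after a mesh adaption.

    old_owner[i] = processor currently holding element i
    new_part[i]  = partition id assigned to element i by the partitioner
    Returns a map: partition id -> processor id."""
    # overlap[p][q] = number of elements of new partition p already on processor q
    overlap = [[0] * n_procs for _ in range(n_procs)]
    for elem in range(len(new_part)):
        overlap[new_part[elem]][old_owner[elem]] += 1
    # Consider (count, partition, processor) pairs from largest overlap down.
    pairs = sorted(((overlap[p][q], p, q)
                    for p in range(n_procs) for q in range(n_procs)), reverse=True)
    assign, used = {}, set()
    for count, p, q in pairs:
        if p not in assign and q not in used:
            assign[p] = q
            used.add(q)
    return assign

old_owner = [0, 0, 1, 1, 2, 2]
new_part = [1, 1, 2, 2, 0, 0]     # the partitioner permuted the labels
print(remap_partitions(old_owner, new_part, 3))
```

Here every new partition maps back to the processor that already holds all of its elements, so the relabeling eliminates data movement entirely; an optimal reassignment would solve the same matching exactly.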
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duchaineau, M.; Wolinsky, M.; Sigeti, D.E.
Terrain visualization is a difficult problem for applications requiring accurate images of large datasets at high frame rates, such as flight simulation and ground-based aircraft testing using synthetic sensor stimulation. On current graphics hardware, the problem is to maintain dynamic, view-dependent triangle meshes and texture maps that produce good images at the required frame rate. We present an algorithm for constructing triangle meshes that optimizes flexible view-dependent error metrics, produces guaranteed error bounds, achieves specified triangle counts directly, and uses frame-to-frame coherence to operate at high frame rates for thousands of triangles per frame. Our method, dubbed Real-time Optimally Adapting Meshes (ROAM), uses two priority queues to drive split and merge operations that maintain continuous triangulations built from pre-processed bintree triangles. We introduce two additional performance optimizations: incremental triangle stripping and priority-computation deferral lists. ROAM execution time is proportionate to the number of triangle changes per frame, which is typically a few percent of the output mesh size, hence ROAM performance is insensitive to the resolution and extent of the input terrain. Dynamic terrain and simple vertex morphing are supported.
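The split-queue half of such a scheme can be sketched with a priority queue: always split the triangle with the greatest error bound until the triangle budget is reached. This toy omits most of what makes ROAM work (the merge queue, forced splits that keep the triangulation crack-free, and frame-to-frame reuse) and simply assumes each child inherits half its parent's error:

```python
import heapq

def refine(root_errors, budget):
    """Greedy split-queue refinement: repeatedly split the triangle with the
    largest error bound until `budget` triangles exist, then report the
    largest error remaining in the mesh."""
    # Min-heap of (-error, id) tuples, so the largest error pops first.
    heap = [(-e, i) for i, e in enumerate(root_errors)]
    heapq.heapify(heap)
    next_id = len(root_errors)
    count = len(root_errors)
    while count < budget:
        err, tri = heapq.heappop(heap)      # triangle with largest error
        for _ in range(2):                  # bintree split: two children,
            heapq.heappush(heap, (err / 2.0, next_id))  # each with half the error
            next_id += 1
        count += 1                          # one triangle replaced by two
    return max(-e for e, _ in heap)         # largest remaining error bound

# Four root triangles with given error bounds, refined to a 10-triangle budget.
print(refine([8.0, 1.0, 4.0, 2.0], 10))
```

Because splits always take the current worst triangle, the resulting mesh is greedily optimal for the given budget under this error model, which is the sense in which the queue "directly achieves specified triangle counts."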
Toward Scalable Trustworthy Computing Using the Human-Physiology-Immunity Metaphor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hively, Lee M; Sheldon, Frederick T
The cybersecurity landscape consists of an ad hoc patchwork of solutions. Optimal cybersecurity is difficult for various reasons: complexity, immense data and processing requirements, resource-agnostic cloud computing, practical time-space-energy constraints, inherent flaws in 'Maginot Line' defenses, and the growing number and sophistication of cyberattacks. This article defines the high-priority problems and examines the potential solution space. In that space, achieving scalable trustworthy computing and communications is possible through real-time knowledge-based decisions about cyber trust. This vision is based on the human-physiology-immunity metaphor and the human brain's ability to extract knowledge from data and information. The article outlines future steps toward scalable trustworthy systems, which will require a long-term commitment to solve the well-known challenges.
Passive magnetic bearing system
Post, Richard F.
2014-09-02
An axial stabilizer for the rotor of a magnetic bearing provides external control of stiffness through the switching in of external inductances. External control also allows the stabilizer to become part of a passive/active magnetic bearing system that requires no external source of power and no position sensor. Stabilizers for displacements transverse to the axis of rotation are provided that require only a single cylindrical Halbach array in their operation, and thus are especially suited for use in high-rotation-speed applications, such as flywheel energy storage systems. The elimination of the need for an inner cylindrical array solves the difficult mechanical problem of supplying support against centrifugal forces for the magnets of that array. Compensation is provided for the temperature variation of the strength of the magnetic fields of the permanent magnets in the levitating magnet arrays.
NASA Technical Reports Server (NTRS)
Lippisch, Espenlaub
1922-01-01
Anyone endeavoring to solve the problem of soaring flight is confronted not only by structural difficulties, but also by the often far more difficult aerodynamic problem of flight properties and efficiency, which can only be determined by experimenting with the finished glider.
Dietary habits and behaviors associated with nonalcoholic fatty liver disease
Yasutake, Kenichiro; Kohjima, Motoyuki; Kotoh, Kazuhiro; Nakashima, Manabu; Nakamuta, Makoto; Enjoji, Munechika
2014-01-01
Nonalcoholic fatty liver disease (NAFLD) is one of the most frequent causes of health problems in Western (industrialized) countries. Moreover, the incidence of infantile NAFLD is increasing, with some of these patients progressing to nonalcoholic steatohepatitis. These trends depend on dietary habits and life-style. In particular, overeating and its associated obesity affect the development of NAFLD. Nutritional problems in patients with NAFLD include excess intake of energy, carbohydrates, and lipids, and shortages of polyunsaturated fatty acids, vitamins, and minerals. Although nutritional therapeutic approaches are required for prophylaxis and treatment of NAFLD, continuous nutrition therapy is difficult for many patients because of their dietary habits and lifestyle, and because the motivation for treatment differs among patients. Thus, it is necessary to assess the nutritional background and to identify nutritional problems in each patient with NAFLD. When assessing dietary habits, it is important to individually evaluate those that are consumed excessively or insufficiently, as well as inappropriate eating behaviors. Successful nutrition therapy requires patient education, based on assessments of individual nutrients, and continuing the treatment. In this article, we update knowledge about NAFLD, review the important aspects of nutritional assessment targeting treatment success, and present some concrete nutritional care plans which can be applied generally. PMID:24587653
Hatt, Sarah R; Leske, David A; Wernimont, Suzanne M; Birch, Eileen E; Holmes, Jonathan M
2017-03-01
A rating scale is a critical component of patient-reported outcome instrument design, but the optimal rating scale format for pediatric use has not been investigated. We compared rating scale performance when administering potential questionnaire items to children with eye disorders and their parents. Three commonly used rating scales were evaluated: frequency (never, sometimes, often, always), severity (not at all, a little, some, a lot), and difficulty (not difficult, a little difficult, difficult, very difficult). Ten patient-derived items were formatted for each rating scale, and rating scale testing order was randomized. Both child and parent were asked to comment on any problems with, or a preference for, a particular scale. Any confusion about options or inability to answer was recorded. Twenty-one children, aged 5-17 years, with strabismus, amblyopia, or refractive error were recruited, each with one of their parents. Of the first 10 children, 4 (40%) had problems using the difficulty scale, compared with 1 (10%) using frequency, and none using severity. The difficulty scale was modified, replacing the word "difficult" with "hard." Eleven additional children (plus parents) then completed all 3 questionnaires. No children had problems using any scale. Four (36%) parents had problems using the difficulty ("hard") scale and 1 (9%) with frequency. Regarding preference, 6 (55%) of 11 children and 5 (50%) of 10 parents preferred using the frequency scale. Children and parents found the frequency scale and question format to be the most easily understood. Children and parents also expressed preference for the frequency scale, compared with the difficulty and severity scales. We recommend frequency rating scales for patient-reported outcome measures in pediatric populations.
ERIC Educational Resources Information Center
Kelly, Ronald R.
2003-01-01
Presents "Project Solve," a web-based problem-solving instruction and guided practice for mathematical word problems. Discusses implications for college students for whom reading and comprehension of mathematical word problem solving are difficult, especially learning disabled students. (Author/KHR)
ERIC Educational Resources Information Center
Quinn, Diane M.; Spencer, Steven J.
2001-01-01
Investigated whether stereotype threat would depress college women's math performance. In one test, men outperformed women when solving word problems, though women performed equally when problems were converted into numerical equivalents. In another test, participants solved difficult problems in high or reduced stereotype threat conditions. Women…
The Role of Expository Writing in Mathematical Problem Solving
ERIC Educational Resources Information Center
Craig, Tracy S.
2016-01-01
Mathematical problem-solving is notoriously difficult to teach in a standard university mathematics classroom. The project on which this article reports aimed to investigate the effect of the writing of explanatory strategies in the context of mathematical problem solving on problem-solving behaviour. This article serves to describe the…
Using KIE To Help Students Develop Shared Criteria for House Designs.
ERIC Educational Resources Information Center
Cuthbert, Alex; Hoadley, Christopher M.
How can students develop shared criteria for problems that have no "right" answer? Ill-structured problems of this sort are called design problems. Like portfolio projects, these problems are difficult to evaluate for both teachers and students. This investigation contrasts two methods for developing shared criteria for project…
ERIC Educational Resources Information Center
Waters, W. G., II
1973-01-01
Analyzes the urban transport problems in comparison with those involved in a journey to the Moon. Indicates that the problem of enabling man to travel through the inner space of conurbations may prove to be more difficult than the transport problem of space travel. (CC)
"But You Look So Good!": Managing Specific Issues
... and resources about handling bladder or bowel problems. Self-esteem “I think the most difficult thing to cope ... feel. It’s understandable then that MS can impact self-esteem and confidence. “It’s difficult to feel powerful, competent, ...
A componential view of children's difficulties in learning fractions.
Gabriel, Florence; Coché, Frédéric; Szucs, Dénes; Carette, Vincent; Rey, Bernard; Content, Alain
2013-01-01
Fractions are well known to be difficult to learn. Various hypotheses have been proposed in order to explain those difficulties: fractions can denote different concepts; their understanding requires a conceptual reorganization with regard to natural numbers; and using fractions involves the articulation of conceptual knowledge with complex manipulation of procedures. In order to encompass the major aspects of knowledge about fractions, we propose to distinguish between conceptual and procedural knowledge. We designed a test aimed at assessing the main components of fraction knowledge. The test was carried out by fourth-, fifth- and sixth-graders from the French Community of Belgium. The results showed large differences between categories. Pupils seemed to master the part-whole concept, whereas numbers and operations posed problems. Moreover, pupils seemed to apply procedures they do not fully understand. Our results offer further directions to explain why fractions are amongst the most difficult mathematical topics in primary education. This study offers a number of recommendations on how to teach fractions.
A Multidisciplinary Approach to Mixer-Ejector Analysis and Design
NASA Technical Reports Server (NTRS)
Hendricks, Eric, S.; Seidel, Jonathan, A.
2012-01-01
The design of an engine for a civil supersonic aircraft presents a difficult multidisciplinary problem to propulsion system engineers. There are numerous competing requirements for the engine, such as being efficient during cruise while remaining quiet enough at takeoff to meet airport noise regulations. The use of mixer-ejector nozzles presents one possible solution to this challenge. However, designing a mixer-ejector that successfully addresses both of these concerns is a difficult proposition. Presented in this paper is an integrated multidisciplinary approach to the analysis and design of these systems. A process that uses several low-fidelity tools to evaluate both the performance and acoustics of mixer-ejector nozzles is described. This process is further expanded to include system-level modeling of engines and aircraft to determine the effects on mission performance and noise near airports. The overall process is developed in the OpenMDAO framework currently being developed by NASA. Sample results are given for a notional mixer-ejector design, demonstrating the capabilities of the method.
Pycortex: an interactive surface visualizer for fMRI
Gao, James S.; Huth, Alexander G.; Lescroart, Mark D.; Gallant, Jack L.
2015-01-01
Surface visualizations of fMRI provide a comprehensive view of cortical activity. However, surface visualizations are difficult to generate and most common visualization techniques rely on unnecessary interpolation which limits the fidelity of the resulting maps. Furthermore, it is difficult to understand the relationship between flattened cortical surfaces and the underlying 3D anatomy using tools available currently. To address these problems we have developed pycortex, a Python toolbox for interactive surface mapping and visualization. Pycortex exploits the power of modern graphics cards to sample volumetric data on a per-pixel basis, allowing dense and accurate mapping of the voxel grid across the surface. Anatomical and functional information can be projected onto the cortical surface. The surface can be inflated and flattened interactively, aiding interpretation of the correspondence between the anatomical surface and the flattened cortical sheet. The output of pycortex can be viewed using WebGL, a technology compatible with modern web browsers. This allows complex fMRI surface maps to be distributed broadly online without requiring installation of complex software. PMID:26483666
Using adaptive-mesh refinement in SCFT simulations of surfactant adsorption
NASA Astrophysics Data System (ADS)
Sides, Scott; Kumar, Rajeev; Jamroz, Ben; Crockett, Robert; Pletzer, Alex
2013-03-01
Adsorption of surfactants at interfaces is relevant to many applications such as detergents, adhesives, emulsions, and ferrofluids. Atomistic simulations of interface adsorption are challenging due to the difficulty of modeling the wide range of length scales in these problems: a thin interface region in equilibrium with a large bulk region that serves as a reservoir for the adsorbed species. Self-consistent field theory (SCFT) has been extremely useful for studying the morphologies of dense block copolymer melts. Field-theoretic simulations such as these are able to access large length and time scales that are difficult or impossible for particle-based simulations such as molecular dynamics. However, even SCFT methods can be difficult to apply to systems in which small spatial regions require finer resolution than the rest of the simulation grid (e.g., interface adsorption and confinement). We will present results on interface adsorption simulations using PolySwift++, an object-oriented polymer SCFT simulation code, aided by the Tech-X Chompst library, which enables block-structured AMR calculations with PETSc.
The human influence on seabird nesting success: Conservation implications
Anderson, D.W.; Keith, J.O.
1980-01-01
Based on studies of brown pelicans Pelecanus occidentalis californicus and Heermann's gulls Larus heermanni, disturbances by recreationists, educational groups, local fishermen and scientists alike can be seriously disruptive and damaging to breeding seabirds in the Gulf of California and off the west coast of Baja California. Similar instances have been identified throughout the world; the problem is not difficult to document, but it is difficult to eliminate. The increasing human-seabird contacts on islands in the Gulf of California and along the west coast of Baja California raise serious questions and immediate concern about the continued preservation of nesting colonies of marine birds in those areas. Conservation measures must consider the extreme sensitivity of many seabirds to the inter- and intraspecific behavioural imbalances created by human disturbances. In some cases, total exclusion of humans may be required; in others, limited access might be possible under closely managed conditions at certain times of the year. A symbiotic relationship between seabird conservation, legitimate research and tourism should be the desired goal.
Beyond Coordination: Joint Planning and Program Execution. The IHPRPT Materials Working Group
NASA Technical Reports Server (NTRS)
Stropki, Michael A.; Cleyrat, Danial A.; Clinton, Raymond G., Jr.; Rogacki, John R. (Technical Monitor)
2000-01-01
"Partnership is more than just coordination," stated then-Commander of the Air Force Research Laboratory (AFRL), Major General Dick Paul (USAF-Ret), at this year's National Space and Missile Materials Symposium. His comment referred to the example of the joint planning and program execution provided by the Integrated High Payoff Rocket Propulsion Technology (IHPRPT) Materials Working Group (IMWG). Most people agree that fiscal pressures imposed by shrinking budgets have made it extremely difficult to build upon our existing technical capabilities. In times of sufficient budgets, building advanced systems poses no major difficulties. However, with today's budgets, realizing enhanced capabilities and developing advanced systems often comes at an unaffordable cost. Overcoming this problem represents both a challenge and an opportunity to develop new business practices that allow us to develop advanced technologies within the restrictions imposed by current funding levels. Coordination of technology developments between different government agencies and organizations is a valuable tool for technology transfer. However, rarely do the newly developed technologies have direct applicability to other ongoing programs. Technology requirements are typically determined up-front during the program planning stage so that schedule risk can be minimized. The problem with this process is that the costs associated with the technology development are often borne by a single program. Additionally, the potential exists for duplication of technical effort. Changing this paradigm is a difficult process but one that can be extremely worthwhile should the right opportunity arise. The IMWG is one such example where NASA, the DoD, and industry have developed joint requirements that are intended to satisfy multiple program needs. More than mere coordination, the organizations comprising the group come together as partners, sharing information and resources, proceeding from a joint roadmap.
A method for aircraft concept exploration using multicriteria interactive genetic algorithms
NASA Astrophysics Data System (ADS)
Buonanno, Michael Alexander
2005-08-01
The problem of aircraft concept selection has become increasingly difficult in recent years due to changes in the primary evaluation criteria of concepts. In the past, performance was often the primary discriminator, whereas modern programs have placed increased emphasis on factors such as environmental impact, economics, supportability, aesthetics, and other metrics. The revolutionary nature of the vehicles required to simultaneously meet these conflicting requirements has prompted a shift from design using historical data regression techniques for metric prediction to the use of sophisticated physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to select a sub-optimum baseline vehicle. Some extremely important concept decisions, such as the type of control surface arrangement to use, are frequently made without sufficient understanding of their impact on the important system metrics due to a lack of historical guidance, computational resources, or analysis tools. This thesis discusses the difficulties associated with revolutionary system design, and introduces several new techniques designed to remedy them. First, an interactive design method has been developed that allows the designer to provide feedback to a numerical optimization algorithm during runtime, thereby preventing the optimizer from exploiting weaknesses in the analytical model. 
This method can be used to account for subjective criteria, or as a crude measure of un-modeled quantitative criteria. Other contributions of the work include a modified Structured Genetic Algorithm that enables the efficient search of large combinatorial design hierarchies and an improved multi-objective optimization procedure that can effectively optimize several objectives simultaneously. A new conceptual design method has been created by drawing upon each of these new capabilities and aspects of more traditional design methods. The ability of this new technique to assist in the design of revolutionary vehicles has been demonstrated using a problem of contemporary interest: the concept exploration of a supersonic business jet. This problem was found to be a good demonstration case because of its novelty and unique requirements, and the results of this proof of concept exercise indicate that the new method is effective at providing additional insight into the relationship between a vehicle's requirements and its favorable attributes.
Solving geosteering inverse problems by stochastic Hybrid Monte Carlo method
Shen, Qiuyang; Wu, Xuqing; Chen, Jiefu; ...
2017-11-20
Inverse problems arise in almost all fields of science in which real-world parameters are extracted from a set of measured data. Geosteering inversion plays an essential role in accurately predicting oncoming strata and in reliably guiding on-the-fly adjustments of the borehole position to reach one or more geological targets. This inverse problem is not easy to solve: it requires finding an optimal solution in a large solution space, especially when the problem is non-linear and non-convex. Nowadays, a new generation of logging-while-drilling (LWD) tools has emerged on the market. These so-called azimuthal resistivity LWD tools have azimuthal sensitivity and a large depth of investigation. Hence, the associated inverse problems become much more difficult, since the earth model to be inverted has more detailed structure. Conventional deterministic methods are incapable of solving such a complicated inverse problem, as they suffer from local minimum traps. Alternatively, stochastic optimizations are in general better at finding globally optimal solutions and at handling uncertainty quantification. In this article, we investigate a Hybrid Monte Carlo (HMC) based statistical inversion approach and suggest that HMC-based inference is more efficient in dealing with the increased complexity and uncertainty faced by geosteering problems.
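To make the HMC machinery concrete, the sketch below is a generic, minimal one-dimensional Hybrid (Hamiltonian) Monte Carlo sampler, not the authors' geosteering code: the standard-normal "posterior", the step size, and the trajectory length are all illustrative assumptions. The key ingredients it shows are the leapfrog integration of Hamiltonian dynamics and the Metropolis acceptance test on the total energy.

```python
import math
import random

def hmc_sample(logp_grad, logp, x0, n_samples=2000, eps=0.1, n_leapfrog=20, seed=0):
    """Minimal 1-D Hybrid (Hamiltonian) Monte Carlo sampler.

    logp:      log target density (up to an additive constant)
    logp_grad: its derivative
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)              # draw auxiliary momentum
        x_new, p_new = x, p
        # Leapfrog integration: half momentum step, full position steps, half step
        p_new += 0.5 * eps * logp_grad(x_new)
        for step in range(n_leapfrog):
            x_new += eps * p_new
            if step != n_leapfrog - 1:
                p_new += eps * logp_grad(x_new)
        p_new += 0.5 * eps * logp_grad(x_new)
        # Metropolis acceptance on the change in total energy H = -logp + p^2/2
        h_old = -logp(x) + 0.5 * p * p
        h_new = -logp(x_new) + 0.5 * p_new * p_new
        if math.log(rng.random() + 1e-300) < h_old - h_new:
            x = x_new
        samples.append(x)
    return samples

# Toy "inversion": the posterior is a standard normal
logp = lambda x: -0.5 * x * x
grad = lambda x: -x
draws = hmc_sample(grad, logp, x0=3.0)
mean = sum(draws) / len(draws)
```

A real geosteering inversion would replace `logp` with the (expensive) data-misfit likelihood of the layered earth model; the structure of the sampler is unchanged.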
Constraint-based Temporal Reasoning with Preferences
NASA Technical Reports Server (NTRS)
Khatib, Lina; Morris, Paul; Morris, Robert; Rossi, Francesca; Sperduti, Alessandro; Venable, K. Brent
2005-01-01
Often we need to work in scenarios where events happen over time and preferences are associated with event distances and durations. Soft temporal constraints allow one to describe in a natural way problems arising in such scenarios. In general, solving soft temporal problems requires exponential time in the worst case, but there are interesting subclasses of problems which are polynomially solvable. In this paper we identify one such subclass and give tractability results. Moreover, we describe two solvers for this class of soft temporal problems, and we show some experimental results. The random generator used to build the problems on which the tests are performed is also described. We also compare the two solvers, highlighting the tradeoff between performance and robustness. Sometimes, however, local temporal preferences are difficult to set, and it may be easier instead to associate preferences with some complete solutions of the problem. To model everything in a uniform way via local preferences only, and also to take advantage of existing constraint solvers which exploit only local preferences, we show that machine learning techniques can be useful in this respect. In particular, we present a learning module based on a gradient descent technique which induces local temporal preferences from global ones. We also show the behavior of the learning module on randomly generated examples.
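The idea of inducing local preferences from global ones by gradient descent can be illustrated with a minimal sketch. This is not the authors' learning module: the linear model (a solution's global rating as a weighted sum of local features), the learning rate, and the toy data are all assumptions made for illustration.

```python
def learn_local_prefs(solutions, global_prefs, lr=0.05, epochs=500):
    """Induce local (per-feature) preference weights from global solution
    ratings by stochastic gradient descent on squared error.

    Model assumption: global preference of a solution x is approximately
    the weighted sum w . x of its local features."""
    w = [0.0] * len(solutions[0])
    for _ in range(epochs):
        for x, target in zip(solutions, global_prefs):
            err = sum(wi * xi for wi, xi in zip(w, x)) - target
            # gradient of 0.5 * err^2 with respect to w is err * x
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

# Three complete solutions described by two local features, with global
# ratings generated by the (hidden) local weights [1.0, 2.0]:
w = learn_local_prefs([[1, 0], [0, 1], [1, 1]], [1.0, 2.0, 3.0])
print([round(wi, 2) for wi in w])  # [1.0, 2.0]
```

Because the system is consistent and the features are bounded, the cyclic gradient updates contract toward the generating weights, recovering the local preferences exactly in this toy case.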
Bicriteria Network Optimization Problem using Priority-based Genetic Algorithm
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Lin, Lin; Cheng, Runwei
Network optimization is an increasingly important and fundamental issue in fields such as engineering, computer science, operations research, transportation, telecommunication, decision support systems, manufacturing, and airline scheduling. In many applications, however, there are several criteria associated with traversing each edge of a network. For example, cost and flow measures are both important in networks. As a result, there has been recent interest in solving the Bicriteria Network Optimization Problem, which is known to be NP-hard. The efficient set of paths may be very large, possibly exponential in size, so the computational effort required to solve the problem can increase exponentially with problem size in the worst case. In this paper, we propose a genetic algorithm (GA) approach that uses a priority-based chromosome for solving the bicriteria network optimization problem, covering both the maximum flow (MXF) model and the minimum cost flow (MCF) model. The objective is to find the set of Pareto-optimal solutions that give the maximum possible flow at minimum cost. The approach also incorporates an Adaptive Weight Approach (AWA) that utilizes useful information from the current population to readjust the weights and obtain search pressure toward a positive ideal point. Computer simulations on several difficult-to-solve network design problems show the effectiveness of the proposed method.
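A common form of the priority-based encoding decodes a chromosome of node priorities into a source-sink path by repeatedly moving to the unvisited neighbour with the highest priority. The sketch below is an illustrative decoder on a toy network, not the paper's implementation; the network and priority values are assumptions.

```python
def decode_path(priority, adj, source, sink):
    """Decode a node-priority chromosome into a source-sink path.

    At each step, move to the unvisited neighbour with the highest
    priority; a dead end means the chromosome decodes to no path."""
    path, node, visited = [source], source, {source}
    while node != sink:
        candidates = [v for v in adj[node] if v not in visited]
        if not candidates:
            return None
        node = max(candidates, key=lambda v: priority[v])
        visited.add(node)
        path.append(node)
    return path

# Toy directed network: 1 -> {2, 3}, 2 -> {3, 4}, 3 -> {4}
adj = {1: [2, 3], 2: [3, 4], 3: [4], 4: []}
# Priorities favour node 3 over node 2, so decoding yields the path 1-3-4
print(decode_path({1: 0, 2: 1, 3: 2, 4: 3}, adj, 1, 4))  # [1, 3, 4]
```

In a GA, crossover and mutation operate on the priority vector, and every offspring still decodes to a (possibly different) valid path, which is the main attraction of this encoding.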
NASA Astrophysics Data System (ADS)
Hasanah, N.; Hayashi, Y.; Hirashima, T.
2017-02-01
Arithmetic word problems remain one of the most difficult areas of teaching mathematics. Learning by problem posing has been suggested as an effective way to improve students' understanding. However, this practice is difficult to implement in a typical classroom because of the extra time needed to assess and give feedback on students' posed problems. To address this issue, we have developed a tablet PC software application named Monsakun for learning by posing arithmetic word problems based on the Triplet Structure Model. It uses the mechanism of sentence integration, an efficient implementation of problem posing that enables agent assessment of posed problems. The learning environment has been used in actual Japanese elementary school classrooms, and its effectiveness has been confirmed in previous studies. In this study, ten Indonesian elementary school students living in Japan participated in a problem-posing learning session using Monsakun in the Indonesian language. We analyzed their learning activities and show that the students were able to interact with the structure of simple word problems using this learning environment. The results of the data analysis and a questionnaire suggest that Monsakun provides an interactive and fun environment for learning by problem posing for Indonesian elementary school students.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Hendrickson, Bruce
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance; experimental research, development, and engineering programs; and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support them. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, support studies of advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (with sufficient resolution, dimensionality, and scientific detail) and for quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base.
Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
NASA Astrophysics Data System (ADS)
Ushijima, Timothy T.; Yeh, William W.-G.
2013-10-01
An optimal experimental design algorithm is developed to select locations for a network of observation wells that provide maximum information about unknown groundwater pumping in a confined, anisotropic aquifer. The design uses a maximal information criterion that chooses, among competing designs, the design that maximizes the sum of squared sensitivities while conforming to specified design constraints. The formulated optimization problem is non-convex and contains integer variables necessitating a combinatorial search. Given a realistic large-scale model, the size of the combinatorial search required can make the problem difficult, if not impossible, to solve using traditional mathematical programming techniques. Genetic algorithms (GAs) can be used to perform the global search; however, because a GA requires a large number of calls to a groundwater model, the formulated optimization problem still may be infeasible to solve. As a result, proper orthogonal decomposition (POD) is applied to the groundwater model to reduce its dimensionality. Then, the information matrix in the full model space can be searched without solving the full model. Results from a small-scale test case show identical optimal solutions among the GA, integer programming, and exhaustive search methods. This demonstrates the GA's ability to determine the optimal solution. In addition, the results show that a GA with POD model reduction is several orders of magnitude faster in finding the optimal solution than a GA using the full model. The proposed experimental design algorithm is applied to a realistic, two-dimensional, large-scale groundwater problem. The GA converged to a solution for this large-scale problem.
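The maximal information criterion and the combinatorial search it induces can be sketched very compactly. The toy sensitivity matrix below is invented for illustration, and the exhaustive search shown is the small-scale baseline the abstract compares against; the paper's GA and POD reduction replace it at realistic scale.

```python
from itertools import combinations

def design_score(S, wells):
    """Maximal information criterion: sum of squared sensitivities over
    the chosen observation wells.  S[w][p] is the sensitivity of the head
    at candidate well w to unknown pumping parameter p."""
    return sum(S[w][p] ** 2 for w in wells for p in range(len(S[0])))

def best_design(S, k):
    """Exhaustive combinatorial search over all k-well designs."""
    return max(combinations(range(len(S)), k),
               key=lambda wells: design_score(S, wells))

# 4 candidate wells, 2 unknown pumping parameters (made-up sensitivities)
S = [[0.1, 0.2],
     [0.9, 0.1],
     [0.3, 0.8],
     [0.2, 0.2]]
print(best_design(S, 2))  # (1, 2): these two wells carry the most information
```

The cost of `best_design` grows as C(n, k), which is exactly why a GA (with a reduced-order model supplying the sensitivities) becomes necessary for realistic numbers of candidate wells.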
A Knowledge-Based and Model-Driven Requirements Engineering Approach to Conceptual Satellite Design
NASA Astrophysics Data System (ADS)
Dos Santos, Walter A.; Leonor, Bruno B. F.; Stephany, Stephan
Satellite systems are becoming even more complex, making technical issues a significant cost driver. The increasing complexity of these systems makes requirements engineering activities both more important and difficult. Additionally, today's competitive pressures and other market forces drive manufacturing companies to improve the efficiency with which they design and manufacture space products and systems. This imposes a heavy burden on systems-of-systems engineering skills and particularly on requirements engineering which is an important phase in a system's life cycle. When this is poorly performed, various problems may occur, such as failures, cost overruns and delays. One solution is to underpin the preliminary conceptual satellite design with computer-based information reuse and integration to deal with the interdisciplinary nature of this problem domain. This can be attained by taking a model-driven engineering approach (MDE), in which models are the main artifacts during system development. MDE is an emergent approach that tries to address system complexity by the intense use of models. This work outlines the use of SysML (Systems Modeling Language) and a novel knowledge-based software tool, named SatBudgets, to deal with these and other challenges confronted during the conceptual phase of a university satellite system, called ITASAT, currently being developed by INPE and some Brazilian universities.
Condition Number Regularized Covariance Estimation*
Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala
2012-01-01
Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called “large p small n” setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumption on either the covariance matrix or its inverse is imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required. PMID:23730197
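The core mechanism of condition-number regularization operates on the eigenvalues of the sample covariance. The sketch below is a simplified one-sided variant, not the paper's estimator: the actual maximum likelihood solution chooses the truncation level by optimizing the likelihood, whereas this sketch simply fixes the floor at the largest eigenvalue divided by the target condition number.

```python
def clip_eigenvalues(eigvals, kappa_max):
    """Shrink sample-covariance eigenvalues so that the condition number
    (largest / smallest eigenvalue) is at most kappa_max.

    Simplifying assumption: keep the top eigenvalue fixed and raise the
    small ones to lam_max / kappa_max.  Reconstructing the regularized
    covariance then just re-applies the original eigenvectors."""
    lam_max = max(eigvals)
    floor = lam_max / kappa_max
    return [max(lam, floor) for lam in eigvals]

# Ill-conditioned spectrum typical of "large p small n" sample covariances
lams = clip_eigenvalues([9.0, 1.0, 0.05, 1e-6], kappa_max=10.0)
print(lams)                    # [9.0, 1.0, 0.9, 0.9]
print(max(lams) / min(lams))   # 10.0: well-conditioned by construction
```

The resulting estimator is invertible and well-conditioned by construction, which is the property the "large p small n" applications need.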
The Efficacy of Using Diagrams When Solving Probability Word Problems in College
ERIC Educational Resources Information Center
Beitzel, Brian D.; Staley, Richard K.
2015-01-01
Previous experiments have shown a deleterious effect of visual representations on college students' ability to solve total- and joint-probability word problems. The present experiments used conditional-probability problems, known to be more difficult than total- and joint-probability problems. The diagram group was instructed in how to use tree…
The Development, Implementation, and Evaluation of a Problem Solving Heuristic
ERIC Educational Resources Information Center
Lorenzo, Mercedes
2005-01-01
Problem-solving is one of the main goals in science teaching and is something many students find difficult. This research reports on the development, implementation and evaluation of a problem-solving heuristic. This heuristic intends to help students to understand the steps involved in problem solving (metacognitive tool), and to provide them…
The Difficult Patron Situation: A Window of Opportunity To Improve Library Service.
ERIC Educational Resources Information Center
Sarkodie-Mensah, Kwasi
2000-01-01
Discusses the problem library patron from various fronts: historical, personality traits, importance of complaints, nature and types of problem patrons and their behavior, technology and the newly-bred problem patron, strategies for dealing with problem patrons, and ensuring that library administrators and other supervisors understand the need to…
Multidimensional Functional Behaviour Assessment within a Problem Analysis Framework.
ERIC Educational Resources Information Center
Ryba, Ken; Annan, Jean
This paper presents a new approach to contextualized problem analysis developed for use with multimodal Functional Behaviour Assessment (FBA) at Massey University in Auckland, New Zealand. The aim of problem analysis is to simplify complex problems that are difficult to understand. It accomplishes this by providing a high order framework that can…
LaGasse, Linda L.; Conradt, Elisabeth; Karalunas, Sarah L.; Dansereau, Lynne M.; Butner, Jonathan E.; Shankaran, Seetha; Bada, Henrietta; Bauer, Charles R.; Whitaker, Toni M.; Lester, Barry M.
2016-01-01
Developmental psychopathologists face the difficult task of identifying the environmental conditions that may contribute to early childhood behavior problems. Highly stressed caregivers can exacerbate behavior problems, while children with behavior problems may make parenting more difficult and increase caregiver stress. What remains unknown is: (1) how these transactions originate, (2) whether they persist over time to contribute to the development of problem behavior, and (3) what role resilience factors, such as child executive functioning, may play in mitigating the development of problem behavior. In the present study, transactional relations between caregiving stress, executive functioning, and behavior problems were examined in a sample of 1,388 children with prenatal drug exposures at three developmental time points: early childhood (birth-age 5), middle childhood (ages 6 to 9), and early adolescence (ages 10 to 13). Transactional relations differed between caregiving stress and internalizing versus externalizing behavior. Targeting executive functioning in evidence-based interventions for children with prenatal substance exposure who present with internalizing problems and treating caregiving psychopathology, depression, and parenting stress in early childhood may be particularly important for children presenting with internalizing behavior. PMID:27427803
A Cascade Optimization Strategy for Solution of Difficult Multidisciplinary Design Problems
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.; Berke, Laszlo
1996-01-01
A research project to comparatively evaluate 10 nonlinear optimization algorithms was recently completed. A conclusion was that no single optimizer could successfully solve all 40 problems in the test bed, even though most optimizers successfully solved at least one-third of the problems. We realized that improved search directions and step lengths, available in the 10 optimizers compared, were not likely to alleviate the convergence difficulties. For the solution of those difficult problems we have devised an alternative approach called cascade optimization strategy. The cascade strategy uses several optimizers, one followed by another in a specified sequence, to solve a problem. A pseudorandom scheme perturbs design variables between the optimizers. The cascade strategy has been tested successfully in the design of supersonic and subsonic aircraft configurations and air-breathing engines for high-speed civil transport applications. These problems could not be successfully solved by an individual optimizer. The cascade optimization strategy, however, generated feasible optimum solutions for both aircraft and engine problems. This paper presents the cascade strategy and solutions to a number of these problems.
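The cascade idea, several optimizers run in sequence with a pseudorandom perturbation of the design variables between stages, can be sketched as follows. This is an illustrative reconstruction using generic SciPy optimizers on a textbook test function, not the NASA optimizers or the aircraft and engine problems from the report:

```python
import numpy as np
from scipy.optimize import minimize

def cascade_minimize(f, x0, methods=("Nelder-Mead", "Powell", "BFGS"),
                     perturb=0.05, seed=0):
    """Run several optimizers one after another in a specified sequence,
    perturbing the design variables pseudorandomly between stages."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for m in methods:
        x = minimize(f, x, method=m).x
        x = x + perturb * rng.standard_normal(x.shape)  # jog out of stall points
    return minimize(f, x, method=methods[-1]).x         # final polish, no perturbation

# Rosenbrock function: a standard difficult test problem.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x_star = cascade_minimize(rosen, [-1.2, 1.0])
```

The perturbation step is what distinguishes the cascade from simply restarting one optimizer: a stage that has stalled at a poor point hands a slightly jogged design to the next optimizer, which searches from there.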
Shapiro, Johanna; Rakhra, Pavandeep; Wong, Adrianne
2016-10-01
Physicians have long had patients whom they have labeled "difficult", but little is known about how medical students perceive difficult encounters with patients. In this study, we analyzed 134 third year medical students' reflective essays written over an 18-month period about difficult student-patient encounters. We used a qualitative computerized software program, Atlas.ti to analyze students' observations and reflections. Main findings include that students described patients who were angry and upset; noncompliant with treatment plans; discussed "nonmedical" problems; fearful, worried, withdrawn, or "disinterested" in their health. Students often described themselves as anxious, uncertain, confused, and frustrated. Nevertheless, they saw themselves behaving in empathic and patient-centered ways while also taking refuge in "standard" behaviors not necessarily appropriate to the circumstances. Students rarely mentioned receiving guidance from attendings regarding how to manage these challenging interactions. These third-year medical students recognized the importance of behaving empathically in difficult situations and often did so. However, they often felt overwhelmed and frustrated, resorting to more reductive behaviors that did not match the needs of the patient. Students need more guidance from attending physicians in order to approach difficult interactions with specific problem-solving skills while maintaining an empathic, patient-centered context.
Bringing the Unidata IDV to the Cloud
NASA Astrophysics Data System (ADS)
Fisher, W. I.; Oxelson Ganter, J.
2015-12-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given its inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who depend upon it, and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of cloud computing: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little to no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.
Application of the Water Needs Index: Can Tho City, Mekong Delta, Vietnam
NASA Astrophysics Data System (ADS)
Moglia, Magnus; Neumann, Luis E.; Alexander, Kim S.; Nguyen, Minh N.; Sharma, Ashok K.; Cook, Stephen; Trung, Nguyen H.; Tuan, Dinh D. A.
2012-10-01
Provision of urban water supplies to rapidly growing cities of South East Asia is difficult because of increasing demand for limited water supplies, periodic droughts, and depletion and contamination of surface and groundwater. In such adverse environments, effective policy and planning processes are required to secure adequate water supplies. Developing a Water Needs Index reveals key elements of the complex urban water supply by means of a participatory approach for rapid and interdisciplinary assessment. The index uses deliberative interactions with stakeholders to create opportunities for mutual understanding, confirmation of constructs and capacity building of all involved. In Can Tho City, located at the heart of the Mekong delta in Vietnam, a Water Needs Index has been developed with local stakeholders. The functional attributes of the Water Needs Index at this urban scale have been critically appraised. Systemic water issues, supply problems, health issues and inadequate, poorly functioning infrastructure requiring attention from local authorities have been identified. Entrenched social and economic inequities in access to water and sanitation, as well as polluting environmental management practices, have caused widespread problems for urban populations. The framework provides a common language based on systems thinking, increased cross-sectoral communication, as well as increased recognition of problem issues; this ought to lead to improved urban water management. Importantly, the case study shows that the approach can help to overcome biases of local planners based on their limited experience (information black spots), to allow them to address problems experienced in all areas of the city.
NASA Astrophysics Data System (ADS)
Kalvin, Alan D.
2002-06-01
The importance of using perceptual colormaps for visualizing numerical data is well established in the fields of scientific visualization, computer graphics and color science and related areas of research. In practice, however, the use of perceptual colormaps tends to be the exception rather than the rule. In general it is difficult for end-users to find suitable colormaps. In addition, even when such colormaps are available, the inherent variability in color reproduction among computer displays makes it very difficult for the users to verify that these colormaps do indeed preserve their perceptual characteristics when used on different displays. Generally, verification requires display profiling (evaluating the display's color reproduction characteristics), using a colorimeter or a similar type of measuring device. With the growth of the Internet, and the resulting proliferation of remote, client-based displays, the profiling problem has become even more difficult, and in many cases, impossible. We present a method for enumerating and generating perceptual colormaps in such a way that ensures that the perceptual characteristics of the colormaps are maintained over a wide range of different displays. This method constructs colormaps that are guaranteed to be 'perceptually correct' for a given display by using whatever partial profile information of the display is available. We use the term 'graduated profiling' to describe this method of partial profiling.
SYNTHESIS REPORT ON FIVE DENSE, NONAQUEOUS-PHASE LIQUID (DNAPL) REMEDIATION PROJECTS
Dense non-aqueous phase liquid (DNAPL) poses a difficult problem for subsurface remediation because it serves as a continuing source to dissolved phase ground water contamination and is difficult to remove from interstitial pore space or bedrock fractures in the subsurface. Numer...
On Improving Efficiency of Differential Evolution for Aerodynamic Shape Optimization Applications
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.
2004-01-01
Differential Evolution (DE) is a simple and robust evolutionary strategy that has proven effective in determining the global optimum for several difficult optimization problems. Although DE offers several advantages over traditional optimization approaches, its use in applications such as aerodynamic shape optimization, where the objective function evaluations are computationally expensive, is limited by the large number of function evaluations often required. In this paper various approaches for improving the efficiency of DE are reviewed and discussed. Several approaches that have proven effective for other evolutionary algorithms are modified and implemented in a DE-based aerodynamic shape optimization method that uses a Navier-Stokes solver for the objective function evaluations. Parallelization techniques on distributed computers are used to reduce turnaround times. Results are presented for standard test optimization problems and for the inverse design of a turbine airfoil. The efficiency improvements achieved by the different approaches are evaluated and compared.
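The core DE strategy referenced above, mutation by scaled difference vectors, binomial crossover, and greedy selection, can be sketched in a few lines. This is the generic DE/rand/1/bin scheme on a toy objective, not the paper's Navier-Stokes-based method; population size and control parameters are conventional defaults:

```python
import numpy as np

def differential_evolution(f, bounds, pop=20, gens=200, F=0.8, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin: mutate with a scaled difference vector,
    apply binomial crossover, keep the trial point if it is no worse."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    d = len(lo)
    X = rng.uniform(lo, hi, size=(pop, d))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(gens):
        for i in range(pop):
            # Three distinct population members, none equal to i.
            a, b, c = X[rng.choice([j for j in range(pop) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True        # guarantee at least one mutant gene
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft <= fit[i]:                     # greedy selection
                X[i], fit[i] = trial, ft
    return X[fit.argmin()], fit.min()

sphere = lambda x: float(np.sum(x**2))
x_best, f_best = differential_evolution(sphere, [(-5, 5)] * 3)
print(f_best < 1e-4)
```

The inner loop costs one objective evaluation per population member per generation, which is exactly why the paper's concern about expensive CFD evaluations matters.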
[Venous thromboembolism: an urgent call for action].
Páramo, José A; Lecumberri, Ramón
2009-10-17
Thousands of individuals suffer from deep vein thrombosis (DVT) all over the world, and many will die from its main complication, pulmonary embolism (PE). An important problem is that the diagnosis is easy to overlook because the signs and symptoms are often difficult to recognize. Why do DVT and PE remain such a serious problem, particularly given the availability of effective strategies for preventing and treating them? The answer lies primarily in the failure to consistently use evidence-based interventions in high-risk individuals and in the lack of adherence to the different prophylactic interventions. In order to impact the incidence and burden of DVT/PE and increase public awareness, implementation of electronic alerts and evidence-based approaches, and scientific translational research are required. The commitment of all levels of governments as well as public and private institutions will be crucial to reduce the incidence of DVT, a leading cause of death.
Weather forecasting expert system study
NASA Technical Reports Server (NTRS)
1985-01-01
Weather forecasting is critical to both the Space Transportation System (STS) ground operations and the launch/landing activities at NASA Kennedy Space Center (KSC). The current launch frequency places significant demands on the USAF weather forecasters at the Cape Canaveral Forecasting Facility (CCFF), who currently provide the weather forecasting for all STS operations. As launch frequency increases, KSC's weather forecasting problems will be greatly magnified. The single most important problem is the shortage of highly skilled forecasting personnel. The development of forecasting expertise is difficult and requires several years of experience. Frequent personnel changes within the forecasting staff jeopardize the accumulation and retention of experience-based weather forecasting expertise. The primary purpose of this project was to assess the feasibility of using Artificial Intelligence (AI) techniques to ameliorate this shortage of experts by capturing and incorporating the forecasting knowledge of current expert forecasters into a Weather Forecasting Expert System (WFES) which would then be made available to less experienced duty forecasters.
NASA Astrophysics Data System (ADS)
Wilson, H. F.
2013-12-01
First-principles atomistic simulation is a vital tool for understanding the properties of materials at the high-pressure high-temperature conditions prevalent in giant planet interiors, but properties such as solubility and phase boundaries are dependent on entropy, a quantity not directly accessible in simulation. Determining entropic properties from atomistic simulations is a difficult problem typically requiring a time-consuming integration over molecular dynamics trajectories. Here I will describe recent advances in first-principles thermodynamic calculations which substantially increase the simplicity and efficiency of thermodynamic integration and make entropic properties more readily accessible. I will also describe the use of first-principles thermodynamic calculations for understanding problems including core solubility in gas giants and superionic phase changes in ice giants, as well as future prospects for combining first-principles thermodynamics with planetary-scale models to help us understand the origin and consequences of compositional inhomogeneity in giant planet interiors.
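Thermodynamic integration, the time-consuming entropy route this abstract refers to, can be demonstrated on a toy system. The sketch below computes the free-energy difference between two 1-D harmonic wells by integrating the ensemble average of dU/dλ over a switching parameter and compares it to the known analytic answer; the harmonic potentials, sample counts, and λ grid are illustrative assumptions, not the first-principles machinery of the talk:

```python
import numpy as np

beta, k0, k1 = 1.0, 1.0, 4.0          # inverse temperature and spring constants
rng = np.random.default_rng(1)

def mean_dU_dlam(lam, samples=200_000):
    """<dU/dlam> for U_lam(x) = ((1-lam)*k0 + lam*k1) * x**2 / 2, estimated
    by sampling the exact (Gaussian) Boltzmann distribution at this lam."""
    k_lam = (1 - lam) * k0 + lam * k1
    x = rng.normal(0.0, 1.0 / np.sqrt(beta * k_lam), samples)
    return np.mean((k1 - k0) * x**2 / 2.0)

# dF = integral over lam in [0, 1] of <dU/dlam>, via the trapezoid rule.
lams = np.linspace(0.0, 1.0, 41)
y = np.array([mean_dU_dlam(l) for l in lams])
dF = np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(lams))

exact = np.log(k1 / k0) / (2.0 * beta)   # analytic free-energy gap for harmonic wells
print(abs(dF - exact) < 0.02)
```

In a real materials calculation each λ point would require a full molecular dynamics trajectory rather than direct Gaussian sampling, which is precisely the cost the abstract's new technique aims to reduce.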
Resolving Rapid Variation in Energy for Particle Transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haut, Terry Scot; Ahrens, Cory Douglas; Jonko, Alexandra
2016-08-23
Resolving the rapid variation in energy in neutron and thermal radiation transport is needed for the predictive simulation capability in high-energy density physics applications. Energy variation is difficult to resolve due to rapid variations in cross sections and opacities caused by quantized energy levels in the nuclei and electron clouds. In recent work, we have developed a new technique to simultaneously capture slow and rapid variations in the opacities and the solution using homogenization theory, which is similar to multiband (MB) and to the finite-element with discontiguous support (FEDS) method, but does not require closure information. We demonstrated the accuracy and efficiency of the method for a variety of problems. We are researching how to extend the method to problems with multiple materials and the same material but with different temperatures and densities. In this highlight, we briefly describe homogenization theory and some results.
Experimental Replication of an Aeroengine Combustion Instability
NASA Technical Reports Server (NTRS)
Cohen, J. M.; Hibshman, J. R.; Proscia, W.; Rosfjord, T. J.; Wake, B. E.; McVey, J. B.; Lovett, J.; Ondas, M.; DeLaat, J.; Breisacher, K.
2000-01-01
Combustion instabilities in gas turbine engines are most frequently encountered during the late phases of engine development, at which point they are difficult and expensive to fix. The ability to replicate an engine-traceable combustion instability in a laboratory-scale experiment offers the opportunity to economically diagnose the problem (to determine the root cause), and to investigate solutions to the problem, such as active control. The development and validation of active combustion instability control requires that the causal dynamic processes be reproduced in experimental test facilities which can be used as a test bed for control system evaluation. This paper discusses the process through which a laboratory-scale experiment was designed to replicate an instability observed in a developmental engine. The scaling process used physically-based analyses to preserve the relevant geometric, acoustic and thermo-fluid features. The process increases the probability that results achieved in the single-nozzle experiment will be scalable to the engine.
NASA Astrophysics Data System (ADS)
Liu, Bingsheng; Fu, Meiqing; Zhang, Shuibo; Xue, Bin; Zhou, Qi; Zhang, Shiruo
2018-01-01
The Choquet integral (CI) operator is an effective approach for handling interdependence among decision attributes in complex decision-making problems. However, the fuzzy measures of attributes and attribute sets required by the CI are difficult to obtain directly, which limits its application. This paper proposes a new method for determining fuzzy measures of attributes by extending Marichal's concept of entropy for fuzzy measures. To represent the assessment information well, an interval-valued 2-tuple linguistic context is utilised. Then, we propose a Choquet integral operator in an interval-valued 2-tuple linguistic environment, which can effectively handle the correlation between attributes. In addition, we apply these methods to solve multi-attribute group decision-making problems. The feasibility and validity of the proposed operator are demonstrated by comparisons with other models in an illustrative example.
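The discrete Choquet integral at the heart of the operator described above can be sketched directly. This sketch works with crisp numeric scores rather than the paper's interval-valued 2-tuple linguistic setting, and the toy fuzzy measure is an assumption for illustration:

```python
import numpy as np

def choquet_integral(scores, mu):
    """Discrete Choquet integral of attribute scores with respect to a
    fuzzy measure mu: a dict from frozensets of attribute indices to
    measure values, with mu[frozenset()] = 0 and mu[full set] = 1."""
    order = np.argsort(scores)                       # sort scores ascending
    x = np.asarray(scores, dtype=float)[order]
    total, prev = 0.0, 0.0
    for k in range(len(x)):
        A = frozenset(int(i) for i in order[k:])     # attributes scoring >= x[k]
        total += (x[k] - prev) * mu[A]
        prev = x[k]
    return total

# Toy fuzzy measure over two interacting attributes {0, 1}; note that
# mu({0}) + mu({1}) != mu({0, 1}), i.e. the attributes are not independent.
mu = {frozenset(): 0.0, frozenset({0}): 0.3,
      frozenset({1}): 0.5, frozenset({0, 1}): 1.0}
print(round(choquet_integral([0.6, 0.8], mu), 3))    # 0.6*1.0 + 0.2*0.5 = 0.7
```

When the measure happens to be additive, the Choquet integral collapses to an ordinary weighted mean; the non-additive case is exactly what lets it model correlated attributes.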
Harmonisation of microbial sampling and testing methods for distillate fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, G.C.; Hill, E.C.
1995-05-01
Increased incidence of microbial infection in distillate fuels has led to a demand for organisations such as the Institute of Petroleum to propose standards for microbiological quality, based on numbers of viable microbial colony forming units. Variations in quality requirements and in the spoilage significance of contaminating microbes, plus a tendency for temporal and spatial changes in the distribution of microbes, make such standards difficult to implement. The problem is compounded by a diversity in the procedures employed for sampling and testing for microbial contamination and in the interpretation of the data obtained. The following paper reviews these problems and describes the efforts of the Institute of Petroleum Microbiology Fuels Group to address these issues, and in particular to bring about harmonisation of sampling and testing methods. The benefits and drawbacks of available test methods, both laboratory based and on-site, are discussed.
Kunnari, Sari; Savinainen-Makkonen, Tuula; Leonard, Laurence B.; Mäkinen, Leena; Tolonen, Anna-Kaisa
2015-01-01
Children with specific language impairment (SLI) have difficulty expressing subject-verb agreement. However, in many languages, tense is fused with agreement, making it difficult to attribute the problem to agreement in particular. In Finnish, negative markers are function words that agree with the subject in person and number but do not express tense, providing an opportunity to assess the status of agreement in a more straightforward way. Fifteen Finnish-speaking preschoolers with SLI, 15 age controls, and 15 younger controls responded to items requiring negative markers in first person singular and plural, and third person singular and plural. The children with SLI were less accurate than both typically developing groups. However, their problems were limited to particular person-number combinations. Furthermore, the children with SLI appeared to have difficulty selecting the form of the lexical verb that should accompany the negative marker, suggesting that agreement was not the sole difficulty. PMID:24588468
A comparison of experimental and calculated thin-shell leading-edge buckling due to thermal stresses
NASA Technical Reports Server (NTRS)
Jenkins, Jerald M.
1988-01-01
High-temperature thin-shell leading-edge buckling test data are analyzed using NASA structural analysis (NASTRAN) as a finite element tool for predicting thermal buckling characteristics. Buckling points are predicted for several combinations of edge boundary conditions. The problem of relating the appropriate plate area to the edge stress distribution and the stress gradient is addressed in terms of analysis assumptions. Local plasticity was found to occur on the specimen analyzed, and this tended to simplify the basic problem since it effectively equalized the stress gradient from loaded edge to loaded edge. The initial loading was found to be difficult to select for the buckling analysis because of the transient nature of thermal stress. Multiple initial model loadings are likely required for complicated thermal stress time histories before a pertinent finite element buckling analysis can be achieved. The basic mode shapes determined from experimentation were correctly identified from computation.
Dual Key Speech Encryption Algorithm Based Underdetermined BSS
Zhao, Huan; Chen, Zuo; Zhang, Xixiang
2014-01-01
When the number of mixed signals is less than that of the source signals, underdetermined blind source separation (BSS) is a particularly difficult problem. Given the large volume of data in speech communications and the demand for real-time communication, we utilize the intractability of the underdetermined BSS problem to present a dual key speech encryption method. The original speech is mixed with dual key signals which consist of random key signals (one-time pad) generated from a secret seed and chaotic signals generated from a chaotic system. In the decryption process, approximate calculation is used to recover the original speech signals. The proposed algorithm for speech signal encryption can resist traditional attacks against the encryption system, and owing to approximate calculation, decryption becomes faster and more accurate. It is demonstrated that the proposed method has a high level of security and can recover the original signals quickly and efficiently while maintaining excellent audio quality. PMID:24955430
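The encryption idea, mixing the speech with key signals so that an eavesdropper faces an underdetermined BSS problem while the key-holding receiver does not, can be sketched as follows. The mixing matrix, logistic-map chaotic key, and exact least-squares recovery below are illustrative assumptions (the paper itself uses approximate calculation for decryption):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
speech = np.sin(2 * np.pi * 5 * np.linspace(0, 1, n))   # stand-in for a speech frame

# Dual keys: a one-time pad from a secret seed, plus a chaotic signal.
pad = rng.standard_normal(n)
chaos = np.empty(n); chaos[0] = 0.3
for t in range(1, n):
    chaos[t] = 3.99 * chaos[t-1] * (1 - chaos[t-1])     # logistic map, chaotic regime

# Underdetermined mixing: 2 observed channels from 3 sources, so an
# eavesdropper without the keys faces an underdetermined BSS problem.
A = np.array([[0.7, 0.9, 0.4],
              [0.2, 0.5, 0.8]])
S = np.vstack([speech, pad, chaos - chaos.mean()])
cipher = A @ S                                          # transmitted mixtures

# The receiver knows A and can regenerate both key signals, so recovery
# reduces to an over-determined least-squares solve for the speech alone.
resid = cipher - A[:, 1:] @ S[1:]                       # subtract key contributions
speech_hat, *_ = np.linalg.lstsq(A[:, :1], resid, rcond=None)
print(np.allclose(speech_hat.ravel(), speech, atol=1e-8))
```

The asymmetry is the point: with the keys known, the system is no longer underdetermined, while without them the eavesdropper must separate three sources from two mixtures.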
Bourdieu at the bedside: briefing parents in a pediatric hospital.
LeGrow, Karen; Hodnett, Ellen; Stremler, Robyn; McKeever, Patricia; Cohen, Eyal
2014-12-01
The philosophy of family-centered care (FCC) promotes partnerships between families and staff to plan, deliver, and evaluate services for children and has been officially adopted by a majority of pediatric hospitals throughout North America. However, studies indicated that many parents have continued to be dissatisfied with their decision-making roles in their child's care. This is particularly salient for parents of children with chronic ongoing complex health problems. These children are dependent upon medical technology and require frequent hospitalizations during which parents must contribute to difficult decisions regarding their child's care. Given this clinical issue, an alternative theoretical perspective was explored to redress this problem. Pierre Bourdieu's theoretical concepts of field, capital, and habitus were used to analyze the hierarchical relationships in pediatric acute care hospitals and to design a briefing intervention aimed at improving parents' satisfaction with decision making in that health care setting. © 2014 John Wiley & Sons Ltd.
Chiral encoding may provide a simple solution to the origin of life
NASA Astrophysics Data System (ADS)
Brewer, Ashley; Davis, Anthony P.
2014-07-01
The route by which the complex and specific molecules of life arose from the 'prebiotic soup' remains an unsolved problem. Evolution provides a large part of the answer, but this requires molecules that can carry information (that is, exist in many variants) and can replicate themselves. The process is commonplace in living organisms, but not so easy to achieve with simple chemical systems. It is especially difficult to contemplate in the chemical chaos of the prebiotic world. Although popular in many quarters, the notion that RNA was the first self-replicator carries many difficulties. Here, we present an alternative view, suggesting that there may be undiscovered self-replication mechanisms possible in much simpler systems. In particular, we highlight the possibility of information coding through stereochemical configurations of substituents in organic polymers. We also show that this coding system leads naturally to enantiopurity, solving the apparent problem of biological homochirality.
Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources.
Gao, Xiang; Acar, Levent
2016-07-04
This paper addresses the problem of mapping odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To solve both of the above challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors, and combines sensor data from different positions. Initially, a multi-sensor integration method, together with the path of airflow, was used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of an odor source location simulation and a real experiment are presented.
Ohba, Seigo; Nakatani, Yuya; Kawasaki, Takako; Tajima, Nobutaka; Tobita, Takayoshi; Yoshida, Noriaki; Sawase, Takashi; Asahina, Izumi
2015-08-01
Increasing numbers of older patients are seeking orthognathic surgery to treat jaw deformity. However, orthodontic and orthognathic surgical treatment is difficult in cases without occlusal vertical stop. A 55-year-old man presented with Class III malocclusion and mandibular protrusion including esthetic problems and posterior bite collapse. He underwent dental implant treatment to reconstruct an occlusal vertical stop before orthognathic surgery. His occlusal function and esthetic problems improved after surgery, and his skeletal and occlusal stability has been maintained for 6 years. Dental implant placement at appropriate positions could help to determine the position of the proximal segment at orthognathic surgery and could shorten the time required to restore esthetic and occlusal function. This case demonstrates how skeletal and dental stability can be maintained long after surgery in a patient with jaw deformity and posterior bite collapse.
[Software for illustrating a cost-quality balance carried out by clinical laboratory practice].
Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi
2010-09-01
We have no proper reference indicating the quality of clinical laboratory practice, one that clearly illustrates that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline to evaluate good laboratory practice. According to the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example, how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to consider the characteristics of a clinical laboratory. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. Aiming to stimulate these discussions, the operative software will be put on the Society's homepage for trial.
Missed cases of multiple forms of child abuse and neglect.
Koc, Feyza; Oral, Resmiye; Butteris, Regina
2014-01-01
Child abuse and neglect is a public health problem and usually associated with family dysfunction due to multiple psychosocial, individual, and environmental factors. The diagnosis of child abuse may be difficult and require a high index of suspicion on the part of the practitioners encountering the child and the family. System-related factors may also enable abuse or prevent the early recognition of abuse. Child abuse and neglect that goes undiagnosed may give rise to chronic abuse and increased morbidity-mortality. In this report, we present two siblings who missed early diagnosis and we emphasize the importance of systems issues to allow early recognition of child abuse and neglect.
Computational Psychometrics for Modeling System Dynamics during Stressful Disasters
Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe
2017-01-01
Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future. PMID:28861026
The economics of vein disease.
Sales, Clifford M; Podnos, Joan; Levison, Jonathan
2007-09-01
The management of cosmetic vein problems requires a very different approach than that for most other vascular disorders seen in a vascular surgery practice. This article focuses on the business aspects of a cosmetic vein practice, with particular attention to the uniqueness of these issues. Managing patient expectations is critical to the success of a cosmetic vein practice. Maneuvering within the insurance system can be difficult and frustrating for both the patient and the practice. Practices should use cost-accounting principles to evaluate the success of their vein work. Vein surgery, especially if performed within the office, can undergo an accurate break-even analysis to determine its profitability.
Apparatus for electroplating particles of small dimension
Yu, Conrad M.; Illige, John D.
1982-01-01
The thickness, uniformity, and surface smoothness requirements for surface coatings of glass microspheres for use as targets for laser fusion research are critical. Because of their minute size, the microspheres are difficult to manipulate and control in electroplating systems. The electroplating apparatus (10) of the present invention addresses these problems by providing a cathode cell (20) having a cell chamber (22), a cathode (23) and an anode (26) electrically isolated from each other and connected to an electrical power source (24). During the plating process, the cathode (23) is controllably vibrated along with solution pulse to maintain the particles in random free motion so as to attain the desired properties.
A model for the submarine depthkeeping team
NASA Technical Reports Server (NTRS)
Ware, J. R.; Best, J. F.; Bozzi, P. J.; Kleinman, D. W.
1981-01-01
The most difficult task the depthkeeping team faces occurs during periscope-depth operations, during which it may be required to hold a submarine several hundred feet long within a foot of ordered depth and within one-half degree of ordered pitch. The difficulty is compounded by the facts that wave-generated forces are extremely high, depth and pitch signals are very noisy, and submarine speed is so low that the overall dynamics are slow. A mathematical simulation of the depthkeeping team based on optimal control models is described. A solution of the optimal team control problem with an output control restriction (limited display to each controller) is presented.
Problem of Mistakes in Databases, Processing and Interpretation of Observations of the Sun. I.
NASA Astrophysics Data System (ADS)
Lozitska, N. I.
Unnoticed mistakes and misprints can occur in observational databases at any stage of observation, preparation, and processing. Detecting such errors is complicated by the fact that the work of the observer, the database compiler, and the researcher is divided. Data acquisition from a spacecraft requires more researchers than ground-based observation does. As a result, the probability of errors increases. Keeping track of errors at each stage is very difficult, so we use cross-comparison of data from different sources. We revealed some misprints in the typographic and digital results of sunspot group area measurements.
Detection and Reconstruction of Circular RNAs from Transcriptomic Data.
Zheng, Yi; Zhao, Fangqing
2018-01-01
Recent studies have shown that circular RNAs (circRNAs) are a novel class of abundant, stable, and ubiquitous noncoding RNA molecules in eukaryotic organisms. Comprehensive detection and reconstruction of circRNAs from high-throughput transcriptome data is an initial step in studying their biogenesis and function. Several tools have been developed to address this issue, but they require many steps and are difficult to use. To solve this problem, we provide a protocol for researchers to detect and reconstruct circRNAs by employing CIRI2, CIRI-AS, and CIRI-full. This protocol not only simplifies the use of the above tools but also integrates their results.
Is It Desirable to Be Able to Do the Undesirable? Moral Bioenhancement and the Little Alex Problem.
Hauskeller, Michael
2017-07-01
It has been argued that moral bioenhancement is desirable even if it would make it impossible for us to do anything other than what is morally required. Others find this apparent loss of freedom deplorable. However, it is difficult to see how a world in which there is no moral evil can plausibly be regarded as worse than a world in which people are not only free to do evil, but also where they actually do it, which would commit us to the seemingly paradoxical view that, under certain circumstances, the bad can be better than the good. Notwithstanding, this view is defended here.
Engel, Nora; van Lente, Harro
2014-07-01
Partnerships between public and private healthcare providers are often seen as an important way to improve health care in resource-constrained settings. Despite the reconfirmed policy support for including private providers into public tuberculosis control in India, the public-private mix (PPM) activities continue to face apprehension at local implementation sites. This article investigates the causes for those difficulties by examining PPM initiatives as cases of organisational innovation. It examines findings from semi-structured interviews, observations and document analyses in India around three different PPM models and the attempts of innovating and scaling up. The results reveal that in PPM initiatives underlying problem definitions and different control practices, including supervision, standardisation and culture, continue to clash and ultimately hinder the scaling up of PPM. Successful PPM initiatives require organisational control practices which are rooted in different professions to be bridged. This entails difficult balancing acts between innovation and control. The innovators handle those differently, based on their own ideas of the problem that PPM should address and their own control practices. We offer new perspectives on why collaboration is so difficult and show a possible way to mitigate the established apprehensions between professions in order to make organisational innovations, such as PPM, sustainable and scalable. © 2013 The Authors Sociology of Health & Illness © 2013 Foundation for the Sociology of Health & Illness/John Wiley & Sons Ltd.
Entropic criterion for model selection
NASA Astrophysics Data System (ADS)
Tseng, Chih-Yuan
2006-10-01
Model or variable selection is usually achieved by ranking models in increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as a selection criterion. Yet this raises two questions: why use this criterion, and are there any others? Moreover, conventional approaches require a reference prior, which is usually difficult to obtain. Following the logic of inductive inference proposed by Caticha [Relative entropy and inductive inference, in: G. Erickson, Y. Zhai (Eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, AIP Conference Proceedings, vol. 707, 2004 (available from arXiv.org/abs/physics/0311093)], we show relative entropy to be a unique criterion that requires no prior information and can be applied to different fields. We examine this criterion on a physical problem, simple fluids, and the results are promising.
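The ranking idea behind such a criterion can be sketched numerically. The following minimal illustration (not from the paper; all distributions and numbers are invented) scores hypothetical candidate model distributions against an empirical distribution by relative entropy and prefers the one with the smallest divergence.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Relative entropy D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Empirical distribution of observed data (hypothetical example).
data = np.array([0.1, 0.4, 0.3, 0.2])

# Candidate model distributions over the same four outcomes.
models = {
    "uniform": np.array([0.25, 0.25, 0.25, 0.25]),
    "peaked":  np.array([0.05, 0.45, 0.35, 0.15]),
}

# Rank models by increasing relative entropy: smaller is preferred.
ranking = sorted(models, key=lambda name: kl_divergence(data, models[name]))
print(ranking[0])  # the model closest to the data in the KL sense
```

In this toy instance the "peaked" candidate tracks the data more closely than the uniform one, so it ranks first.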
Habitat Design Optimization and Analysis
NASA Technical Reports Server (NTRS)
SanSoucie, Michael P.; Hull, Patrick V.; Tinker, Michael L.
2006-01-01
Long-duration surface missions to the Moon and Mars will require habitats for the astronauts. The materials chosen for the habitat walls play a direct role in the protection against the harsh environments found on the surface. Choosing the best materials, their configuration, and the amount required is extremely difficult due to the immense size of the design region. Advanced optimization techniques are necessary for habitat wall design. Standard optimization techniques are not suitable for problems with such large search spaces; therefore, a habitat design optimization tool utilizing genetic algorithms has been developed. Genetic algorithms use a "survival of the fittest" philosophy, where the most fit individuals are more likely to survive and reproduce. This habitat design optimization tool is a multi-objective formulation of structural analysis, heat loss, radiation protection, and meteoroid protection. This paper presents the research and development of this tool.
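As a rough illustration of the "survival of the fittest" mechanism described above, the sketch below evolves a binary selection of wall layers under a mass budget. The layer data, fitness function, and GA settings are invented for illustration only and bear no relation to the actual NASA multi-objective tool.

```python
import random

random.seed(1)

# Hypothetical material layers: (protection score, mass).
LAYERS = [(5, 4), (3, 1), (8, 9), (2, 1), (6, 5), (4, 2)]
MASS_BUDGET = 10

def fitness(chromosome):
    """Reward protection, but invalidate designs over the mass budget."""
    protection = sum(p for gene, (p, _) in zip(chromosome, LAYERS) if gene)
    mass = sum(m for gene, (_, m) in zip(chromosome, LAYERS) if gene)
    return protection if mass <= MASS_BUDGET else -1

def evolve(pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in LAYERS] for _ in range(pop_size)]
    for _ in range(generations):
        # "Survival of the fittest": only the fitter half reproduces.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(LAYERS))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:                # occasional mutation
                i = random.randrange(len(LAYERS))
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

A real habitat-wall problem would use a multi-objective fitness (structure, heat loss, radiation, meteoroids) over a vastly larger design space, which is what motivates the GA approach in the first place.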
Simple solution to the medical instrumentation software problem
NASA Astrophysics Data System (ADS)
Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.
1995-04-01
Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to 'Good Manufacturing Practices' (GMP). Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of and compliance with a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionality, from embedded real-time systems to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.
Metacognition: Student Reflections on Problem Solving
ERIC Educational Resources Information Center
Wismath, Shelly; Orr, Doug; Good, Brandon
2014-01-01
Twenty-first century teaching and learning focus on the fundamental skills of critical thinking and problem solving, creativity and innovation, and collaboration and communication. Metacognition is a crucial aspect of both problem solving and critical thinking, but it is often difficult to get students to engage in authentic metacognitive…
Beyond Utility Targeting: Toward Axiological Air Operations
2000-01-01
encounter the leader-sociopath, bereft of values, quite willing to live underground in hiding and insensitive to the absence of human comforts...that is a mere one thousand value-analysis problems to begin solving. A more difficult problem to solve is the problem of the leader-sociopath
Kapornai, Krisztina; Gentzler, Amy L; Tepper, Ping; Kiss, Eniko; Mayer, László; Tamás, Zsuzsanna; Kovacs, Maria; Vetró, Agnes
2007-06-01
We investigate the relations of early atypical characteristics (perinatal problems, developmental delay, and difficult temperament) to the onset age (as well as the severity) of the first major depressive disorder (MDD) and the first internalizing disorder in a clinical sample of depressed children in Hungary. Participants were 371 children (ages 7-14) with MDD, and their biological mothers, recruited through multiple clinical sites. Diagnoses (via DSM-IV criteria) and onset dates of disorders were finalized by "best estimate" psychiatrists and based on multiple information sources. Mothers provided developmental data in a structured interview. Difficult temperament predicted earlier onset of MDD and of the first internalizing disorder, but its effect was ameliorated if the family was intact during early childhood. Further, the importance of difficult temperament decreased as a function of time. Perinatal problems and developmental delay did not impact onset ages of the disorders, and none of the early childhood characteristics was associated with MDD episode severity. Children with MDD may have the added disadvantage of earlier onset if they had a difficult temperament in infancy. Because early temperament mirrors physiological reactivity and regulatory capacity, it can affect various areas of functioning related to psychopathology. Early caregiver stability may attenuate some adverse effects of difficult infant temperament.
An algorithm for the optimal collection of wet waste.
Laureri, Federica; Minciardi, Riccardo; Robba, Michela
2016-02-01
This work describes an approach for planning the collection of wet waste (food waste and similar) at a metropolitan scale. Several modeling features distinguish this waste collection problem from others; for instance, there may be significant differences in the values of the parameters (such as weight and volume) characterizing the various collection points. As in classical waste collection planning, wet waste collection leads to difficult combinatorial problems, where determining an optimal solution may require a very large computational effort for problem instances of high dimensionality. For this reason, a heuristic procedure for the optimal planning of wet waste collection is developed here and applied to problem instances drawn from a real case study. The performance of this procedure is evaluated by comparison with a general-purpose mathematical programming software package, as well as with the very simple decision rules commonly used in practice. The case study covers the historical center of the Municipality of Genoa. Copyright © 2015 Elsevier Ltd. All rights reserved.
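A simple constructive heuristic of the kind commonly used as a baseline for such capacitated routing problems can be sketched as follows. The collection points, capacity, and greedy nearest-neighbour rule are illustrative assumptions, not the procedure developed in the paper.

```python
import math

# Hypothetical collection points: (x, y, weight in kg).
POINTS = [(2, 1, 40), (5, 4, 70), (1, 6, 30), (6, 1, 60), (4, 7, 50)]
DEPOT = (0, 0)
CAPACITY = 120  # vehicle capacity in kg

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def plan_routes(points, capacity):
    """Greedy nearest-neighbour routing: visit the closest unserved point
    that still fits in the vehicle; start a new trip when nothing fits."""
    unserved = list(points)
    routes = []
    while unserved:
        route, load, pos = [], 0, DEPOT
        while True:
            feasible = [p for p in unserved if load + p[2] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda p: dist(pos, (p[0], p[1])))
            route.append(nxt)
            load += nxt[2]
            pos = (nxt[0], nxt[1])
            unserved.remove(nxt)
        routes.append(route)
    return routes

routes = plan_routes(POINTS, CAPACITY)
print(routes)
```

Heuristics of this family give feasible, though generally suboptimal, plans quickly; the paper's point is precisely that such plans can then be compared against exact mathematical programming solutions.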
Monitoring Affect States during Effortful Problem Solving Activities
ERIC Educational Resources Information Center
D'Mello, Sidney K.; Lehman, Blair; Person, Natalie
2010-01-01
We explored the affective states that students experienced during effortful problem solving activities. We conducted a study where 41 students solved difficult analytical reasoning problems from the Law School Admission Test. Students viewed videos of their faces and screen captures and judged their emotions from a set of 14 states (basic…
A Fiducial Approach to Extremes and Multiple Comparisons
ERIC Educational Resources Information Center
Wandler, Damian V.
2010-01-01
Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem is dealing with the generalized Pareto distribution. The generalized Pareto…
Students' and Teachers' Conceptual Metaphors for Mathematical Problem Solving
ERIC Educational Resources Information Center
Yee, Sean P.
2017-01-01
Metaphors are regularly used by mathematics teachers to relate difficult or complex concepts in classrooms. A complex topic of concern in mathematics education, and most STEM-based education classes, is problem solving. This study identified how students and teachers contextualize mathematical problem solving through their choice of metaphors.…
ERIC Educational Resources Information Center
Danielsen, Reidar
Skilled labor has always been difficult to recruit, and in a tight labor market unskilled, low-paying jobs with low status are also difficult to fill. Recruitment from outside seems necessary to satisfy demands, but migration creates at least as many problems as it solves. The consumption of theoretical training through the university level (a…
Professional Support for Families in Difficult Life Situations
ERIC Educational Resources Information Center
Zakirova, Venera G.; Gaysina, Guzel I.; Raykova, Elena
2016-01-01
The relevance of the problem stated in the article is determined by the presence of a significant number of families in difficult life situations who need professional support and socio-psychological assistance. The article aims to substantiate the effectiveness of the structural-functional model of professional support for families in difficult…
ERIC Educational Resources Information Center
Worth-Baker, Marcia
2000-01-01
Describes one seventh grade teacher's experiences with a student from a problem home who was known for his difficult behavior, noting the student's deep interest in the Trojan War and its circumstances, describing his death at age 16, and concluding that it is important to notice and cherish all students, even the difficult ones. (SM)
Crooks, Noelle M.; Alibali, Martha W.
2013-01-01
This study investigated whether activating elements of prior knowledge can influence how problem solvers encode and solve simple mathematical equivalence problems (e.g., 3 + 4 + 5 = 3 + __). Past work has shown that such problems are difficult for elementary school students (McNeil and Alibali, 2000). One possible reason is that children's experiences in math classes may encourage them to think about equations in ways that are ultimately detrimental. Specifically, children learn a set of patterns that are potentially problematic (McNeil and Alibali, 2005a): the perceptual pattern that all equations follow an “operations = answer” format, the conceptual pattern that the equal sign means “calculate the total”, and the procedural pattern that the correct way to solve an equation is to perform all of the given operations on all of the given numbers. Upon viewing an equivalence problem, knowledge of these patterns may be reactivated, leading to incorrect problem solving. We hypothesized that these patterns may negatively affect problem solving by influencing what people encode about a problem. To test this hypothesis in children would require strengthening their misconceptions, and this could be detrimental to their mathematical development. Therefore, we tested this hypothesis in undergraduate participants. Participants completed either control tasks or tasks that activated their knowledge of the three patterns, and were then asked to reconstruct and solve a set of equivalence problems. Participants in the knowledge activation condition encoded the problems less well than control participants. They also made more errors in solving the problems, and their errors resembled the errors children make when solving equivalence problems. Moreover, encoding performance mediated the effect of knowledge activation on equivalence problem solving. Thus, one way in which experience may affect equivalence problem solving is by influencing what students encode about the equations. 
PMID:24324454
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yun; Zhang, Yin
2016-06-08
The mass sensing superiority of a micro/nanomechanical resonator sensor over conventional mass spectrometry has been, or at least is being, firmly established. Because the sensing mechanism of a mechanical resonator sensor is the shift of its resonant frequencies, linking the shifts of resonant frequencies to the material properties of an analyte formulates an inverse problem. Besides the analyte/adsorbate mass, many other factors, such as position and axial force, can also cause shifts of the resonant frequencies. The in-situ measurement of the adsorbate position and axial force is extremely difficult, if not impossible, especially when an adsorbate is as small as a molecule or an atom; extra instruments are also required. In this study, an inverse problem of using three resonant frequencies to determine the mass, position, and axial force is formulated and solved. The accuracy of the inverse-problem-solving method is demonstrated, and how the method can be used in a real application of a nanomechanical resonator is also discussed. Solving the inverse problem helps the development and application of mechanical resonator sensors in two ways: reducing the need for extra experimental equipment and achieving better mass sensing by considering more factors.
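To illustrate the three-frequencies-to-three-unknowns idea, here is a toy inverse problem solved by brute-force grid search. The perturbation model for the frequency shifts (string-like mode shapes, a linear axial-force factor, and all coefficients) is an assumed simplification for illustration, not the beam model used in the paper.

```python
import numpy as np

# Toy perturbation model for a doubly clamped, string-like resonator:
#   f_n = f0_n * (1 + ALPHA*F) * (1 - (m/M) * sin(n*pi*x)^2)
# where m is the adsorbate mass, x its position (0..1), F the axial force.
F0 = np.array([1.0, 2.0, 3.0])  # unloaded frequencies of modes 1..3 (assumed)
ALPHA, M = 0.05, 1.0            # assumed force sensitivity and resonator mass

def frequencies(m, x, F):
    n = np.array([1.0, 2.0, 3.0])
    return F0 * (1 + ALPHA * F) * (1 - (m / M) * np.sin(n * np.pi * x) ** 2)

# "Measured" frequencies generated from hidden true parameters.
true_m, true_x, true_F = 0.02, 0.3, 0.5
measured = frequencies(true_m, true_x, true_F)

# Inverse problem: three frequencies -> three unknowns.  Positions x and
# 1-x are indistinguishable in this model, so the grid covers x <= 0.5.
ms = np.linspace(0.0, 0.05, 51)
xs = np.linspace(0.05, 0.5, 46)
Fs = np.linspace(0.0, 1.0, 51)
mg, xg, Fg = np.meshgrid(ms, xs, Fs, indexing="ij")
resid = sum(
    (F0[k] * (1 + ALPHA * Fg) * (1 - (mg / M) * np.sin((k + 1) * np.pi * xg) ** 2)
     - measured[k]) ** 2
    for k in range(3)
)
i, j, k = np.unravel_index(np.argmin(resid), resid.shape)
print(ms[i], xs[j], Fs[k])  # recovered mass, position, axial force
```

The grid search is deliberately crude; the point is that three measured frequencies suffice, in principle, to pin down all three unknowns at once.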
Insight and analysis problem solving in microbes to machines.
Clark, Kevin B
2015-11-01
A key feature for obtaining solutions to difficult problems, insight is oftentimes vaguely regarded as a special discontinuous intellectual process and/or a cognitive restructuring of problem representation or goal approach. However, this nearly century-old state of the art devised by the Gestalt tradition to explain the non-analytical or non-trial-and-error, goal-seeking aptitude of primate mentality tends to neglect problem-solving capabilities of lower animal phyla, Kingdoms other than Animalia, and advancing smart computational technologies built from biological, artificial, and composite media. Attempting to provide an inclusive, precise definition of insight, two major criteria of insight, discontinuous processing and problem restructuring, are here reframed using terminology and statistical mechanical properties of computational complexity classes. Discontinuous processing becomes abrupt state transitions in algorithmic/heuristic outcomes or in types of algorithms/heuristics executed by agents using classical and/or quantum computational models. And problem restructuring becomes combinatorial reorganization of resources, problem-type substitution, and/or exchange of computational models. With insight bounded by computational complexity, humans, ciliated protozoa, and complex technological networks, for example, show insight when restructuring time requirements, combinatorial complexity, and problem type to solve polynomial and nondeterministic polynomial decision problems. Similar effects are expected from other problem types, supporting the idea that insight might be an epiphenomenon of analytical problem solving and consequently of a larger information processing framework. Thus, this computational complexity definition of insight improves the power, external and internal validity, and reliability of operational parameters with which to classify, investigate, and produce the phenomenon for computational agents ranging from microbes to man-made devices.
Copyright © 2015 Elsevier Ltd. All rights reserved.
Restricted mouth opening and its definitive management: A literature review.
Kumar, Bhushan; Fernandes, Aquaviva; Sandhu, Prabhdeep Kaur
2018-01-01
This review was intended to discuss the various modifications suggested in the literature for prosthetic steps and surgical corrective procedures in nonresponding or complicated cases during rehabilitation of patients with restricted mouth opening. Medline, PubMed, and Google were searched electronically for articles using the keywords: microstomia and treatment options for restricted mouth opening. The articles on prosthodontic rehabilitation in microstomia were segregated, and from these, the various modifications in the prosthetic steps were reviewed. Oral hygiene maintenance is difficult for the patient, either due to limited access or due to an associated lack of manual dexterity, so dental decay and periodontal problems are more extensive in such patients; hence, tooth loss is a common finding. All prosthetic procedures require wide mouth opening to carry out the various steps, from tray placement during impression making to final prosthesis insertion, especially for removable prostheses. The prosthetic modifications given by various authors are included in this review for each step of prosthodontic management. A total of eight stock tray designs, 12 custom tray designs, and 17 removable prosthesis designs are discussed, along with fixed (either tooth-supported or implant-supported) and maxillofacial prostheses. However, some patients also require surgical intervention for the correction of microstomia, whether for function or for esthetics, before prosthetic rehabilitation; these procedures are also enumerated here. Among all prosthetic restorative options, the removable prosthesis is the most difficult for the dentist to fabricate, as conventional methods are either very difficult or impossible to apply. To obtain a more accurate final prosthesis, these steps need to be modified according to the case at hand. Several available modifications that can help when managing these patients are discussed.
NMESys: An expert system for network fault detection
NASA Technical Reports Server (NTRS)
Nelson, Peter C.; Warpinski, Janet
1991-01-01
The problem of network management is becoming an increasingly difficult and challenging task. It is very common today to find heterogeneous networks consisting of many different types of computers, operating systems, and protocols. The complexity of implementing a network with this many components is difficult enough, while the maintenance of such a network is an even larger problem. A prototype network management expert system, NMESys, was implemented in the C Language Integrated Production System (CLIPS). NMESys concentrates on solving some of the critical problems encountered in managing a large network. The major goal of NMESys is to provide a network operator with an expert system tool to quickly and accurately detect hard failures and potential failures, and to minimize or eliminate user downtime in a large network.
Formal methods technology transfer: Some lessons learned
NASA Technical Reports Server (NTRS)
Hamilton, David
1992-01-01
IBM has a long history in the application of formal methods to software development and verification. There have been many successes in the development of methods, tools and training to support formal methods. And formal methods have been very successful on several projects. However, the use of formal methods has not been as widespread as hoped. This presentation summarizes several approaches that have been taken to encourage more widespread use of formal methods, and discusses the results so far. The basic problem is one of technology transfer, which is a very difficult problem. It is even more difficult for formal methods. General problems of technology transfer, especially the transfer of formal methods technology, are also discussed. Finally, some prospects for the future are mentioned.
The analysis method of the DRAM cell pattern hotspot
NASA Astrophysics Data System (ADS)
Lee, Kyusun; Lee, Kweonjae; Chang, Jinman; Kim, Taeheon; Han, Daehan; Hong, Aeran; Kim, Yonghyeon; Kang, Jinyoung; Choi, Bumjin; Lee, Joosung; Lee, Jooyoung; Hong, Hyeongsun; Lee, Kyupil; Jin, Gyoyoung
2015-03-01
It is increasingly difficult to determine the degree of completion of the patterning and its distribution in DRAM cell patterns. In researching DRAM device cell patterns, there are currently three major problems. First, due to etch loading, it is difficult to predict potential defects. Second, due to under-layer topology, it is impossible to demonstrate the influence of a hotspot. Finally, it is extremely difficult to predict the final ACI pattern by photo simulation, because the current patterning process uses double patterning technology, which means the photo pattern is completely different from the final etch pattern. Therefore, if a hotspot occurs on a wafer, it is very difficult to find. CD-SEM is the most common pattern measurement tool at semiconductor fabrication sites; it is primarily used to accurately measure small regions of the wafer pattern, so there is no possibility of finding places where unpredictable defects occur. Even though current defect detectors can measure a wide area, when every chip has the same pattern issue the detector cannot detect critical hotspots: because the defect-detecting algorithm of a bright-field machine is based on image processing, if the same problem occurs on both the reference and the inspected chip, the machine cannot identify it. Moreover, this instrument cannot distinguish distribution differences of about 1 nm~3 nm, so a defect detector cannot handle data for potential weak points far below the target CD. In order to solve those problems, another method is needed. In this paper, we introduce an analysis method for DRAM cell pattern hotspots.
Motion planning with complete knowledge using a colored SOM.
Vleugels, J; Kok, J N; Overmars, M
1997-01-01
The motion planning problem requires that a collision-free path be determined for a robot moving amidst a fixed set of obstacles. Most neural network approaches to this problem are for the situation in which only local knowledge about the configuration space is available. The main goal of the paper is to show that neural networks are also suitable tools in situations with complete knowledge of the configuration space. In this paper we present an approach that combines a neural network and deterministic techniques. We define a colored version of Kohonen's self-organizing map that consists of two different classes of nodes. The network is presented with random configurations of the robot and, from this information, it constructs a road map of possible motions in the work space. The map is a growing network, and different nodes are used to approximate boundaries of obstacles and the Voronoi diagram of the obstacles, respectively. In a second phase, the positions of the two kinds of nodes are combined to obtain the road map. In this way a number of typical problems with small obstacles and passages are avoided, and the required number of nodes for a given accuracy is within reasonable limits. This road map is searched to find a motion connecting the given source and goal configurations of the robot. The algorithm is simple and general; the only specific computation that is required is a check for intersection of two polygons. We implemented the algorithm for planar robots allowing both translation and rotation and experiments show that compared to conventional techniques it performs well, even for difficult motion planning scenes.
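The abstract notes that the only specific geometric computation required is a check for the intersection of two polygons. A standard way to implement such a check, using orientation-based segment crossing plus a ray-casting containment test, is sketched below; this is a generic textbook implementation, not the authors' code, and it ignores degenerate collinear-overlap cases.

```python
def orient(p, q, r):
    """Sign of the cross product (q-p) x (r-p): >0 left turn, <0 right, 0 collinear."""
    v = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (v > 0) - (v < 0)

def segments_cross(a, b, c, d):
    """Proper intersection test for segments ab and cd."""
    return (orient(a, b, c) != orient(a, b, d)
            and orient(c, d, a) != orient(c, d, b))

def point_in_polygon(pt, poly):
    """Ray-casting parity test: count edge crossings of a rightward ray."""
    inside = False
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > pt[1]) != (y2 > pt[1]):
            x_cross = x1 + (pt[1] - y1) * (x2 - x1) / (y2 - y1)
            if pt[0] < x_cross:
                inside = not inside
    return inside

def polygons_intersect(P, Q):
    """True if polygons P and Q overlap: an edge crossing, or one inside the other."""
    n, m = len(P), len(Q)
    for i in range(n):
        for j in range(m):
            if segments_cross(P[i], P[(i + 1) % n], Q[j], Q[(j + 1) % m]):
                return True
    return point_in_polygon(P[0], Q) or point_in_polygon(Q[0], P)

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
tri_far = [(10, 10), (12, 10), (11, 12)]
tri_in = [(1, 1), (2, 1), (1.5, 2)]
print(polygons_intersect(square, tri_far))  # False
print(polygons_intersect(square, tri_in))   # True (containment, no edge crossing)
```

In the road-map setting such a test would be called with the robot's swept footprint against each obstacle polygon to decide whether a candidate motion is collision-free.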
Steel Sheet Piles - Applications and Elementary Design Issues
NASA Astrophysics Data System (ADS)
Sobala, Dariusz; Rybak, Jarosław
2017-10-01
High-intensity housing development in town centres means that many complex issues related to earthworks and foundations must be resolved. Project owners are required to ensure a respective number of parking bays, which in turn demands 2-3 storeys of underground car parks. This is especially difficult to fulfil in the dense buildings of old-town areas where, apart from engineering problems, very stringent requirements of heritage conservator supervision are also raised. The problems of ensuring the stability of excavation sidewalls must be dealt with alongside analysis of the foundations of neighbouring structures, and their possible strengthening, at the stages of installing the excavation protection walls, progressing the excavations, and constructing the basement storeys. A separate problem is the necessity of constructing underground storeys below the level of the local groundwater. This requires long-term lowering of the water table inside the excavation while limiting, as far as possible, the intervention in the hydrological regime beyond the project in progress. In river valleys, such "hoarding off" of the excavation and cutting off of groundwater leads to temporary or permanent disturbances of groundwater run-off and local swelling. The traditional way to protect a vertical fault and simultaneously cut off groundwater inflow is the application of steel sheet piling, which enables the construction of monolithic reinforced concrete structures for the underground storeys, ensuring both their tightness and a high rigidity of the foundation. Depending on the situation, steel sheet piling can be retrieved or stay in place. This study deals with selected aspects of the engineering design and fabrication of sheet piling for deep excavations and the underground parts of buildings.
Parameter Optimization for Turbulent Reacting Flows Using Adjoints
NASA Astrophysics Data System (ADS)
Lapointe, Caelan; Hamlington, Peter E.
2017-11-01
The formulation of a new adjoint solver for topology optimization of turbulent reacting flows is presented. This solver provides novel configurations (e.g., geometries and operating conditions) based on desired system outcomes (i.e., objective functions) for complex reacting flow problems of practical interest. For many such problems, it would be desirable to know optimal values of design parameters (e.g., physical dimensions, fuel-oxidizer ratios, and inflow-outflow conditions) prior to real-world manufacture and testing, which can be expensive, time-consuming, and dangerous. However, computational optimization of these problems is made difficult by the complexity of most reacting flows, necessitating the use of gradient-based optimization techniques in order to explore a wide design space at manageable computational cost. The adjoint method is an attractive way to obtain the required gradients, because the cost of the method is determined by the dimension of the objective function rather than the size of the design space. Here, the formulation of a novel solver is outlined that enables gradient-based parameter optimization of turbulent reacting flows using the discrete adjoint method. Initial results and an outlook for future research directions are provided.
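The cost argument for the adjoint method can be made concrete on a toy problem. In the sketch below (a hypothetical linear model, not the reacting-flow solver described in the abstract), the objective J = cᵀu is constrained by a steady-state system A u = b(p); a single adjoint solve Aᵀλ = c then yields the gradient with respect to every parameter at once, independent of the number of parameters:

```python
import numpy as np

# Toy steady-state constraint A u = b(p) with objective J = c^T u.
# The adjoint solve costs one linear solve regardless of how many
# parameters p there are -- the property the abstract highlights.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
c = np.array([1.0, 2.0])

def b(p):
    # hypothetical affine dependence of the source term on parameters p
    return np.array([p[0] + 2.0 * p[1], 3.0 * p[0]])

def grad_J(p):
    lam = np.linalg.solve(A.T, c)               # one adjoint solve: A^T lam = dJ/du
    dbdp = np.array([[1.0, 2.0], [3.0, 0.0]])   # db/dp for the toy b(p) above
    return dbdp.T @ lam                         # dJ/dp = (db/dp)^T lam
```

Computing the same gradient by finite differences would instead require one forward solve per parameter, which is exactly what becomes prohibitive in a large design space.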
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Linda
The objective of the proposal was to develop graduate student training in materials and engineering research relevant to the development of particle accelerators. Many components used in today's accelerators or storage rings are at the limit of performance. The path forward in many cases requires the development of new materials or fabrication techniques, or a novel engineering approach. Often, accelerator-based laboratories find it difficult to get top-level engineers or materials experts with the motivation to work on these problems. The three years of funding provided by this grant were used to support development of accelerator components through a multidisciplinary approach that cut across the disciplinary boundaries of accelerator physics, materials science, and surface chemistry. The following results were achieved: (1) significant scientific results on fabrication of novel photocathodes, (2) application of surface science and superconducting materials expertise to accelerator problems through faculty involvement, (3) development of instrumentation for fabrication and characterization of materials for accelerator components, (4) student involvement with problems at the interface of material science and accelerator physics.
An Automatic Medium to High Fidelity Low-Thrust Global Trajectory Toolchain; EMTG-GMAT
NASA Technical Reports Server (NTRS)
Beeson, Ryne T.; Englander, Jacob A.; Hughes, Steven P.; Schadegg, Maximillian
2015-01-01
Solving the global optimization, low-thrust, multiple-flyby interplanetary trajectory problem with high-fidelity dynamical models requires an unreasonable amount of computational resources. A better approach, and one that is demonstrated in this paper, is a multi-step process whereby the aforementioned problem is solved at a lower fidelity and this solution is used as an initial guess for a higher-fidelity solver. The framework presented in this work uses two tools developed by NASA Goddard Space Flight Center: the Evolutionary Mission Trajectory Generator (EMTG) and the General Mission Analysis Tool (GMAT). EMTG is a medium to medium-high fidelity low-thrust interplanetary global optimization solver, which now has the capability to automatically generate GMAT script files for seeding a high-fidelity solution using GMAT's local optimization capabilities. A discussion of the dynamical models as well as thruster and power modeling for both EMTG and GMAT is given in this paper. Current capabilities are demonstrated with examples that highlight the toolchain's ability to efficiently solve the difficult low-thrust global optimization problem with little human intervention.
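The multi-step strategy described here, solve coarsely and then seed a higher-fidelity local optimizer with the result, can be illustrated on a toy multimodal objective (everything in this sketch is hypothetical; EMTG and GMAT are not involved):

```python
import math
import random

def objective(x):
    # multimodal toy objective standing in for a low-fidelity mission cost
    return (x - 2.0) ** 2 + 2.0 * math.sin(5.0 * x)

def global_seed(n=200, lo=-5.0, hi=5.0, rng=None):
    # cheap, coarse global search: best of n random samples
    rng = rng or random.Random(0)
    xs = [rng.uniform(lo, hi) for _ in range(n)]
    return min(xs, key=objective)

def refine(x, step=0.1, iters=200):
    # simple derivative-free local descent standing in for the
    # "high-fidelity" local optimization stage
    for _ in range(iters):
        best = min((x - step, x, x + step), key=objective)
        if best == x:
            step *= 0.5  # shrink the step near a minimum
        x = best
    return x

x0 = global_seed()        # coarse solution
x_star = refine(x0)       # polished solution seeded by the coarse one
```

The local stage alone would stall in whichever basin it started in; the coarse global stage is what selects a good basin for it.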
Trajectory optimization for lunar soft landing with complex constraints
NASA Astrophysics Data System (ADS)
Chu, Huiping; Ma, Lin; Wang, Kexin; Shao, Zhijiang; Song, Zhengyu
2017-11-01
A unified trajectory optimization framework with initialization strategies is proposed in this paper for lunar soft landing in various missions with specific requirements. The two main missions of interest are an Apollo-like landing from low lunar orbit and Vertical Takeoff Vertical Landing (a promising mobility method) on the lunar surface. The trajectory optimization is characterized by difficulties arising from discontinuous thrust, multi-phase connections, jumps in attitude angle, and obstacle avoidance. Here, an R-function is applied to deal with the discontinuities of thrust, checkpoint constraints are introduced to connect multiple landing phases, the attitude angular rate is constrained to avoid radical changes, and safeguards are imposed to avoid collision with obstacles. The resulting dynamic optimization problems generally involve complex constraints. A unified framework based on the Gauss Pseudospectral Method (GPM) and a Nonlinear Programming (NLP) solver is designed to solve the problems efficiently. Advanced initialization strategies are developed to enhance both convergence and computational efficiency. Numerical results demonstrate the adaptability of the framework for various landing missions and its successful solution of difficult dynamic problems.
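The abstract does not give the form of the R-function used to handle discontinuous thrust, but the general idea, replacing a thrust discontinuity with a smooth, differentiable surrogate that a gradient-based NLP solver can handle, can be sketched with a steep sigmoid (an illustrative stand-in, not the paper's formulation):

```python
import math

def thrust_discontinuous(t, t_switch=5.0, T=1.0):
    # bang-off thrust profile: full thrust before t_switch, zero after
    return T if t < t_switch else 0.0

def thrust_smoothed(t, t_switch=5.0, T=1.0, k=50.0):
    # smooth surrogate: a steep sigmoid approximates the step while
    # keeping the profile differentiable; k controls the sharpness
    return T / (1.0 + math.exp(k * (t - t_switch)))
```

In practice the sharpness parameter is often increased over successive solves, so early iterations see a well-conditioned smooth problem and later iterations approach the true discontinuous profile.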
NASA Astrophysics Data System (ADS)
Xu, Jun
Topic 1. An Optimization-Based Approach for Facility Energy Management with Uncertainties. Effective energy management for facilities is becoming increasingly important in view of the rising energy costs, the government mandate on the reduction of energy consumption, and human comfort requirements. This part of the dissertation presents a daily energy management formulation and the corresponding solution methodology for HVAC systems. The problem is to minimize the energy and demand costs through the control of HVAC units while satisfying human comfort, system dynamics, load limit constraints, and other requirements. The problem is difficult in view of the fact that the system is nonlinear, time-varying, building-dependent, and uncertain, and that the direct control of a large number of HVAC components is difficult. In this work, HVAC setpoints are the control variables, developed on top of a Direct Digital Control (DDC) system. A method that combines Lagrangian relaxation, neural networks, stochastic dynamic programming, and heuristics is developed to predict the system dynamics and uncontrollable load, and to optimize the setpoints. Numerical testing and prototype implementation results show that our method can effectively reduce total costs, manage uncertainties, and shed load, and is computationally efficient; furthermore, it is significantly better than existing methods. Topic 2. Power Portfolio Optimization in Deregulated Electricity Markets with Risk Management. In a deregulated electric power system, multiple markets of different time scales exist with various power supply instruments. A load serving entity (LSE) has multiple choices from these instruments to meet its load obligations.
In view of the large amount of power involved, the complex market structure, the risks in such volatile markets, the stringent constraints to be satisfied, and the long time horizon, the power portfolio optimization problem is of critical importance, and difficult, for an LSE seeking to serve the load, maximize its profit, and manage risks. In this topic, a mid-term power portfolio optimization problem with risk management is presented. Key instruments are considered, risk terms based on semi-variances of spot market transactions are introduced, and penalties on load obligation violations are added to the objective function to improve algorithm convergence and constraint satisfaction. To overcome the inseparability of the resulting problem, a surrogate optimization framework is developed, enabling a decomposition and coordination approach. Numerical testing results show that our method effectively provides decisions for the various instruments to maximize profit and manage risks, and is computationally efficient.
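The risk terms above are based on semi-variances of spot market transactions. Unlike ordinary variance, a semi-variance penalizes only downside deviations from a target; a minimal sketch (the function name and target convention are hypothetical):

```python
def semi_variance(transactions, target=0.0):
    """Downside semi-variance: mean squared shortfall below a target.
    Only outcomes worse than the target contribute to the risk term."""
    shortfalls = [min(0.0, x - target) for x in transactions]
    return sum(s * s for s in shortfalls) / len(transactions)
```

Using semi-variance rather than variance means favorable spot-market surprises are not penalized as "risk", which matches the asymmetric exposure of an LSE.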
Standardised Benchmarking in the Quest for Orthologs
Altenhoff, Adrian M.; Boeckmann, Brigitte; Capella-Gutierrez, Salvador; Dalquen, Daniel A.; DeLuca, Todd; Forslund, Kristoffer; Huerta-Cepas, Jaime; Linard, Benjamin; Pereira, Cécile; Pryszcz, Leszek P.; Schreiber, Fabian; Sousa da Silva, Alan; Szklarczyk, Damian; Train, Clément-Marie; Bork, Peer; Lecompte, Odile; von Mering, Christian; Xenarios, Ioannis; Sjölander, Kimmen; Juhl Jensen, Lars; Martin, Maria J.; Muffato, Matthieu; Gabaldón, Toni; Lewis, Suzanna E.; Thomas, Paul D.; Sonnhammer, Erik; Dessimoz, Christophe
2016-01-01
The identification of evolutionarily related genes across different species—orthologs in particular—forms the backbone of many comparative, evolutionary, and functional genomic analyses. Achieving high accuracy in orthology inference is thus essential. Yet the true evolutionary history of genes, required to ascertain orthology, is generally unknown. Furthermore, orthologs are used for very different applications across different phyla, with different requirements in terms of the precision-recall trade-off. As a result, assessing the performance of orthology inference methods remains difficult for both users and method developers. Here, we present a community effort to establish standards in orthology benchmarking and facilitate orthology benchmarking through an automated web-based service (http://orthology.benchmarkservice.org). Using this new service, we characterise the performance of 15 well-established orthology inference methods and resources on a battery of 20 different benchmarks. Standardised benchmarking provides a way for users to identify the most effective methods for the problem at hand, sets a minimal requirement for new tools and resources, and guides the development of more accurate orthology inference methods. PMID:27043882
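The precision-recall trade-off mentioned above can be computed directly once inferred ortholog pairs are compared against a reference set; a minimal sketch, treating pairs as unordered (names are hypothetical, not the benchmark service's API):

```python
def precision_recall(predicted, truth):
    """Precision and recall for inferred ortholog pairs against a reference
    set. Pairs are stored as frozensets so (a, b) and (b, a) compare equal."""
    pred = {frozenset(p) for p in predicted}
    true = {frozenset(p) for p in truth}
    tp = len(pred & true)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(true) if true else 0.0
    return precision, recall
```

A method tuned for high precision (few false orthologs) will typically sacrifice recall, and vice versa, which is why different applications rank the same methods differently.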
Deformation mechanisms in a coal mine roadway in extremely swelling soft rock.
Li, Qinghai; Shi, Weiping; Yang, Renshu
2016-01-01
The problem of roadway support in swelling soft rock was one of the challenging problems during mining. For most geological conditions, combinations of two or more supporting approaches can meet the requirements of most roadways; in extremely swelling soft rock, however, even combined approaches cannot control large deformations. The purpose of this work was to probe the roadway deformation mechanisms in extremely swelling soft rock. Based on the main return airway of a coal mine, deformation monitoring and geomechanical analysis were conducted, and a mechanical model of the plastic zone was analysed. Results indicated that this soft rock had a very high swelling potential. When the ground stress acted alone, the support strength needed in situ was not very large, and combined supporting approaches could meet this requirement; once this potential was released, however, the roadway underwent permanent deformation. When the loose zone reached 3 m within the surrounding rock, the remote stress p∞ and the supporting stress P showed a linear relationship: the greater the swelling stress, the more difficult roadway support became. In this extremely swelling soft rock, a better way to control roadway deformation was therefore to control the release of the surrounding rock's swelling potential.
Leake, S.A.; Lilly, M.R.
1995-01-01
The Fairbanks, Alaska, area has many contaminated sites in a shallow alluvial aquifer. A ground-water flow model is being developed using the MODFLOW finite-difference ground-water flow model program with the River Package. The modeled area is discretized in the horizontal dimensions into 118 rows and 158 columns of approximately 150-meter square cells. The fine grid spacing has the advantage of providing needed detail at the contaminated sites and surface-water features that bound the aquifer. However, the fine spacing of cells adds difficulty to simulating interaction between the aquifer and the large, braided Tanana River. In particular, the assignment of a river head is difficult if cells are much smaller than the river width. This was solved by developing a procedure for interpolating and extrapolating river head using a river distance function. Another problem is that future transient simulations would require excessive numbers of input records using the current version of the River Package. The proposed solution to this problem is to modify the River Package to linearly interpolate river head for time steps within each stress period, thereby reducing the number of stress periods required.
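The interpolation procedure described, assigning each river cell a head by its distance along the channel rather than by straight-line position on the model grid, can be sketched as follows (the distances and heads are made-up illustrative values, not data from the Fairbanks model):

```python
import numpy as np

# Hypothetical head measurements at known distances along the river channel.
known_dist = np.array([0.0, 2500.0, 7000.0, 12000.0])  # m along channel
known_head = np.array([135.2, 133.8, 131.1, 128.4])    # m elevation

def river_head(cell_dist):
    """Head at a river cell, linearly interpolated by along-channel
    distance; beyond the last measurement np.interp holds the end value."""
    return np.interp(cell_dist, known_dist, known_head)
```

The same one-dimensional interpolation in time (head per time step within a stress period) is what the proposed River Package modification would provide, replacing one input record per time step with two per stress period.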
The Challenge of Wireless Reliability and Coexistence.
Berger, H Stephen
2016-09-01
Wireless communication plays an increasingly important role in healthcare delivery. This further heightens the importance of wireless reliability, but quantifying wireless reliability is a complex and difficult challenge. Understanding the risks that accompany the many benefits of wireless communication should be a component of overall risk management. The emerging trend of using sensors and other device-to-device communications, as part of the emerging Internet of Things concept, is evident in healthcare delivery. The trend increases both the importance and complexity of this challenge. As with most system problems, finding a solution requires breaking down the problem into manageable steps. Understanding the operational reliability of a new wireless device and its supporting system requires developing solid, quantified answers to three questions: 1) How well can this new device and its system operate in a spectral environment where many other wireless devices are also operating? 2) What is the spectral environment in which this device and its system are expected to operate? Are the risks and reliability in its operating environment acceptable? 3) How might the new device and its system affect other devices and systems already in use? When operated under an insightful risk management process, wireless technology can be safely implemented, resulting in improved delivery of care.
[Thoughts about patient education: the experience of diabetes].
Grimaldi, André; Simon, Dominique; Sachon, Claude
2009-12-01
Patient education is not simply information or teaching or coaching. It is learning that is both practical and specialized, intended to help patients acquire therapeutic skills and to support them in changing their self-care practices to attain personalized objectives. It is therapeutic. An effective strategy to overcome health problems requires that patients not avoid the problem by denying the disease. Health prevention behavior requires that the patient be simultaneously confident in the prescribed treatments and able to project into the future. It is more difficult for asymptomatic patients to have a mental representation of the disease and thus be able to modify their lifestyle. Self-measurement of blood glucose can create anxiety and make the risk of complications more tangible, but it is beneficial only if it induces action or reassurance. Changing behavior is possible only to the extent that it does not challenge the patient's own well-being. It may be unreasonable but it is also rational to refuse what the patient perceives as a threat to his or her own identity. Thus, physicians caring for patients with a chronic disease must be skilled in three different fields: biomedicine, pedagogy, and psychology.
Environmental strategies: A case study of systematic evaluation
NASA Astrophysics Data System (ADS)
Sherman, Douglas J.; Garès, Paul A.
1982-09-01
A major problem facing environmental managers is the necessity to evaluate management alternatives effectively. Traditional environmental assessments have emphasized the use of economic analyses. These approaches are often deficient because of the difficulty of assigning dollar values to environmental systems and to social amenities. A more flexible decision-making model has been developed to analyze management options for coping with beach erosion problems at the Sandy Hook Unit of Gateway National Recreation Area in New Jersey. The model comprises decision-making variables formulated from a combination of environmental and management criteria, and it has an accept-reject format in which the management options are analyzed in terms of the variables. Through logical ordering of the insertion of the variables into the model, stepwise elimination of alternatives is possible. A hierarchy of variables is determined by estimating the work required to complete an assessment of the alternatives for each variable. The assessment requiring the least work is performed first, so that the more difficult evaluations are limited to fewer alternatives. The application of this approach is illustrated with a case study in which beach protection alternatives were evaluated for the United States National Park Service.
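The least-work-first ordering of assessments can be expressed as a small screening routine: sort the accept-reject criteria by estimated assessment cost and apply each in turn, so that the expensive evaluations see only the survivors. A minimal sketch with a hypothetical interface:

```python
def screen_alternatives(alternatives, criteria):
    """Accept-reject screening. `criteria` is a list of (cost, passes)
    pairs; the cheapest assessment is applied first so the costlier
    evaluations run on as few surviving alternatives as possible."""
    remaining = list(alternatives)
    for cost, passes in sorted(criteria, key=lambda c: c[0]):
        remaining = [a for a in remaining if passes(a)]
        if not remaining:
            break  # everything rejected; no need to assess further
    return remaining
```

The ordering changes only the work performed, not the final accept set, since every surviving alternative must pass every criterion either way.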
Hendriks, Anna-Marie; Jansen, Maria W J; Gubbels, Jessica S; De Vries, Nanne K; Paulussen, Theo; Kremers, Stef P J
2013-04-18
Childhood obesity is a 'wicked' public health problem that is best tackled by an integrated approach, which is enabled by integrated public health policies. The development and implementation of such policies have in practice proven to be difficult, however, and studying why this is the case requires a tool that may assist local policy-makers and those assisting them. A comprehensive framework that can help to identify options for improvement and to systematically develop solutions may be used to support local policy-makers. We propose the 'Behavior Change Ball' as a tool to study the development and implementation of integrated public health policies within local government. Based on the tenets of the 'Behavior Change Wheel' by Michie and colleagues (2011), the proposed conceptual framework distinguishes organizational behaviors of local policy-makers at the strategic, tactical and operational levels, as well as the determinants (motivation, capability, opportunity) required for these behaviors, and interventions and policy categories that can influence them. To illustrate the difficulty of achieving sustained integrated approaches, we use the metaphor of a ball in our framework: the mountainous landscapes surrounding the ball reflect the system's resistance to change (by making it difficult for the ball to roll). We apply this framework to the problem of childhood obesity prevention. The added value provided by the framework lies in its comprehensiveness, theoretical basis, diagnostic and heuristic nature and face validity. Since integrated public health policies have not been widely developed and implemented in practice, organizational behaviors relevant to the development of these policies remain to be investigated. A conceptual framework that can assist in systematically studying the policy process may facilitate this. 
Our Behavior Change Ball adds significant value to existing public health policy frameworks by incorporating multiple theoretical perspectives, specifying a set of organizational behaviors and linking the analysis of these behaviors to interventions and policies. We would encourage examination by others of our framework as a tool to explain and guide the development of integrated policies for the prevention of wicked public health problems.
High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications
NASA Technical Reports Server (NTRS)
Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad
2012-01-01
NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and, upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making simulation ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphic Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction was carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other through contact, friction, and cohesional forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU (central processing unit)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time.
The figure shows an example of this capability where the Brazil Nut problem is simulated: as the container full of granular material is vibrated, the large ball slowly moves upwards. This capability was expanded to account for anchors of different shapes and penetration velocities, interacting with granular soils.
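The contact problem posed as a linear complementarity problem can be illustrated at small scale with projected Gauss-Seidel, the same family of per-constraint sweep that DVI-based engines parallelize on the GPU (this dense sketch is illustrative only; Chrono::Engine's actual solver differs):

```python
import numpy as np

def pgs_lcp(M, q, iters=200):
    """Projected Gauss-Seidel for the LCP:
        w = M z + q,  z >= 0,  w >= 0,  z^T w = 0.
    In a contact setting z plays the role of contact impulses and the
    nonnegativity constraint encodes 'bodies push but never pull'."""
    n = len(q)
    z = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # residual of row i with z[i] removed, then project onto z_i >= 0
            r = q[i] + M[i] @ z - M[i, i] * z[i]
            z[i] = max(0.0, -r / M[i, i])
    return z
```

Each row update touches only one constraint, which is why the sweep maps naturally onto massively parallel hardware when the constraints are only loosely coupled.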
ERIC Educational Resources Information Center
Cooper, Melanie M.; Cox, Charles T., Jr.; Nammouz, Minory; Case, Edward; Stevens, Ronald
2008-01-01
Improving students' problem-solving skills is a major goal for most science educators. While a large body of research on problem solving exists, assessment of meaningful problem solving is very difficult, particularly for courses with large numbers of students in which one-on-one interactions are not feasible. We have used a suite of software…
ERIC Educational Resources Information Center
Danek, Amory H.; Wiley, Jennifer; Öllinger, Michael
2016-01-01
Insightful problem solving is a vital part of human thinking, yet very difficult to grasp. Traditionally, insight has been investigated by using a set of established "insight tasks," assuming that insight has taken place if these problems are solved. Instead of assuming that insight takes place during every solution of the 9 Dot, 8 Coin,…
Chen, Xianling; Chen, Buyuan; Li, Xiaofan; Song, Qingxiao; Chen, Yuanzhong
2017-03-04
Hematology is difficult for students to learn, and a beneficial education method for hematology clerkship training is required to help students develop clinical skills. Foreign medical students often encounter communication issues in China. To address this issue, Chinese post-graduates from our institute are willing to assist with educating foreign students. We therefore propose a mixed team-based learning (MTBL) method, which might overcome communication problems in hematology clerkships. Twenty-two foreign medical students attended a 2-week hematology clerkship in Fujian Medical University Union Hospital. Twenty-one foreign African medical students were assigned randomly into two groups. Fourteen were assigned to the MTBL group; each MTBL team included two foreign African medical students and one Chinese post-graduate. Seven were assigned to the lecture-based learning (LBL) group, which had a foreign medical classmate from Hong Kong or Chinese intern volunteers to serve as translators. The practice test scores of the MTBL group were significantly higher than those of the LBL group (p < 0.05). The MTBL group showed increased motivation to prepare before class, an engaged classroom atmosphere, and an improved understanding of difficult topics. Interestingly, the Chinese post-graduates also benefited from this setting, as they found that the interaction improved their communication in the English language. The mixed team-based learning method overcomes communication problems in hematology clerkships; foreign medical students and Chinese post-graduates alike can benefit from MTBL. © 2016 The International Union of Biochemistry and Molecular Biology, 45(2):93-96, 2017.
Solving Constraint Satisfaction Problems with Networks of Spiking Neurons
Jonke, Zeno; Habenschuss, Stefan; Maass, Wolfgang
2016-01-01
Networks of neurons in the brain apply, unlike processors in our current generation of computer hardware, an event-based processing strategy, where short pulses (spikes) are emitted sparsely by neurons to signal the occurrence of an event at a particular point in time. Such spike-based computations promise to be substantially more power-efficient than traditional clocked processing schemes. However, it turns out to be surprisingly difficult to design networks of spiking neurons that can solve difficult computational problems on the level of single spikes, rather than rates of spikes. We present here a new method for designing networks of spiking neurons via an energy function. Furthermore, we show how the energy function of a network of stochastically firing neurons can be shaped in a transparent manner by composing the network out of simple stereotypical network motifs. We show that this design approach enables networks of spiking neurons to produce approximate solutions to difficult (NP-hard) constraint satisfaction problems from the domains of planning/optimization and verification/logical inference. The resulting networks employ noise as a computational resource. Nevertheless, the timing of spikes plays an essential role in their computations. Furthermore, for the Traveling Salesman Problem, networks of spiking neurons carry out a more efficient stochastic search for good solutions than stochastic artificial neural networks (Boltzmann machines) and Gibbs sampling. PMID:27065785
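The kind of noise-driven stochastic search the paper benchmarks against on the Traveling Salesman Problem can be sketched, in conventional (non-spiking) form, as simulated annealing over 2-opt moves (illustrative only; this is not the spiking-network method of the paper):

```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def anneal_tsp(dist, iters=20000, t0=1.0, rng=None):
    """Noise-driven stochastic search for a short tour: random 2-opt
    reversals, accepted with a Boltzmann probability that cools over time."""
    rng = rng or random.Random(0)
    n = len(dist)
    tour = list(range(n))
    best, best_len = tour[:], tour_length(tour, dist)
    cur_len = best_len
    for k in range(iters):
        t = t0 * (1.0 - k / iters) + 1e-9            # cooling schedule
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt move
        cand_len = tour_length(cand, dist)
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
    return best, best_len
```

As in the spiking networks, the noise is not a nuisance but the resource that lets the search escape poor local optima.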
Fecal contamination of waters used for recreation, drinking water, and aquaculture is an environmental problem and poses significant human health risks. The problem is often difficult to correct because the source of the contamination cannot be determined with certainty. Run-of...
Evolving neural networks for strategic decision-making problems.
Kohl, Nate; Miikkulainen, Risto
2009-04-01
Evolution of neural networks, or neuroevolution, has been a successful approach to many low-level control problems such as pole balancing, vehicle control, and collision warning. However, certain types of problems-such as those involving strategic decision-making-have remained difficult for neuroevolution to solve. This paper evaluates the hypothesis that such problems are difficult because they are fractured: The correct action varies discontinuously as the agent moves from state to state. A method for measuring fracture using the concept of function variation is proposed and, based on this concept, two methods for dealing with fracture are examined: neurons with local receptive fields, and refinement based on a cascaded network architecture. Experiments in several benchmark domains are performed to evaluate how different levels of fracture affect the performance of neuroevolution methods, demonstrating that these two modifications improve performance significantly. These results form a promising starting point for expanding neuroevolution to strategic tasks.
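The paper's fracture measure is based on the concept of function variation: how discontinuously the correct action changes as the agent moves between neighbouring states. A toy version for a 1-D grid of states (a hypothetical simplification, not the paper's exact measure):

```python
def fracture(actions_on_grid):
    """Total-variation-style fracture measure on a 1-D grid of states:
    counts discontinuities in the correct action between neighbouring
    states. 0 means a smooth policy; higher values mean more fracture."""
    return sum(1 for a, b in zip(actions_on_grid, actions_on_grid[1:])
               if a != b)
```

A pole-balancing-style task would score low on such a measure (nearby states call for nearby actions), while a strategic task whose best move flips repeatedly across the state space would score high.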
NASA Astrophysics Data System (ADS)
Nakanishi, Mayumi; Hoshino, Satoshi; Hashimoto, Shizuka; Kuki, Yasuaki
The problems of marginal hamlets, in which more than half of the population is over 65 and community-based life is difficult, are getting worse. To contribute to effective policy making, we conducted a questionnaire survey of members of the National Liaison Council of 'Suigen no Sato', which was constituted to share information about the problems of marginal hamlets and effective countermeasures. Our study clarified, first, that most respondents faced common problems such as a lack of job opportunities and animal damage to farms, and second, that although most respondents recognized the effectiveness of selecting target communities in policy implementation, it is difficult for municipal governments to establish such an ordinance if councilors and residents outside the target areas would not agree to it. Finally, we pointed out the roles of national and prefectural governments in helping municipal governments cope effectively with such entangled situations.
The bright side of being blue: Depression as an adaptation for analyzing complex problems
Andrews, Paul W.; Thomson, J. Anderson
2009-01-01
Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression. PMID:19618990
Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meisner, Robert; McCoy, Michel; Archer, Bill
2013-09-11
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP.
Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.
The programs and context of medical education in Argentina.
Centeno, Angel M
2006-12-01
There are 29 medical schools in Argentina (this number has increased rapidly in the last decade) offering a 6-year curriculum that usually consists of 3 years of basic science, 2 years of clinical sciences, and one internship year. Annually, 5,000 physicians graduate from these programs. Admission requirements vary depending on each university's policy. Some do not have entry requirements; others require a course, usually on the basics of mathematics, biology, chemistry or physics, and some introduction to social and humanistic studies. Each year, there are approximately 12,000 first-year medical students attending the 29 schools, and dropout rates during the first years are high because of vocational problems or inability to adapt to university life. Some schools have massive classes (over 2,000 students), which makes it difficult for the schools to improve their teaching. The number of full-time faculty members is low, and some of them have appointments at more than one medical school. Residency programs offer an insufficient number of places, and fewer than 50% of the graduates can obtain a residency position because of strict admission requirements. Coordination between the Ministry of Health, representing the health care system, and the Ministry of Education, representing the medical education system, needs to be improved. Despite the problems of medical education in Argentina, the movement to improve the education of health care workers is growing. The author offers two recommendations to help accomplish this goal.
Advances in spectroscopic methods for quantifying soil carbon
Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean
2012-01-01
The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, be applied to large landscape areas using remote sensing imagery, and be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for determination of soil carbon.
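The localized calibration step the chapter discusses is, at its core, a regression from spectral measurements to laboratory reference values. The following is a hedged illustration only: the chapter's methods are typically partial least squares, whereas this sketch uses ridge regression, and the data, band count, and penalty are invented for the example.

```python
import numpy as np

def fit_calibration(spectra, carbon, alpha=1e-3):
    """Ridge regression mapping absorbance spectra (n_samples x n_bands)
    to reference soil-carbon values. PLS is more common in practice, but
    the core idea -- a localized linear calibration -- is the same."""
    X = np.hstack([spectra, np.ones((spectra.shape[0], 1))])  # add intercept column
    w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ carbon)
    return w

def predict(spectra, w):
    X = np.hstack([spectra, np.ones((spectra.shape[0], 1))])
    return X @ w

# Synthetic data standing in for scanned soils and their combustion reference values
rng = np.random.default_rng(0)
spectra = rng.random((40, 10))                       # 40 samples, 10 spectral bands
true_w = rng.random(10)
carbon = spectra @ true_w + 0.01 * rng.standard_normal(40)
w = fit_calibration(spectra, carbon)
pred = predict(spectra, w)
```

In practice the calibration set must span the soils and conditions of the target area, which is exactly the localized-calibration burden the abstract describes.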
Take a Multidisciplinary, Team-based Approach on Elder Abuse.
2016-07-01
While EDs are well positioned to identify incidents of elder abuse, providers often miss the opportunity. Experts say providers find only one in every 24 cases, and that the pendulum must swing toward over-detection. Investigators acknowledge elder abuse is difficult to confirm, given that disease processes can explain some of the signs. Further, older adults are often reluctant to report abuse because they fear they will be removed from their homes or separated from their caregivers. Given the complexity involved with addressing the issue, investigators recommend EDs establish a multidisciplinary approach to the problem. Providing great care to a victim of elder abuse requires time and setting up a circumstance whereby one can actually communicate with the patient reliably and alone. While most states require providers to report suspected cases of elder abuse to Adult Protective Services, there is little evidence this requirement has incentivized more reports in the same way a similar requirement has prompted providers to report cases of suspected child abuse. Investigators advise ED leaders to train and empower every member of their team to identify potential signs of elder abuse.
THE APPLICATION OF MULTIVIEW METHODS FOR HIGH-PRECISION ASTROMETRIC SPACE VLBI AT LOW FREQUENCIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodson, R.; Rioja, M.; Imai, H.
2013-06-15
High-precision astrometric space very long baseline interferometry (S-VLBI) at the low end of the conventional frequency range, i.e., 20 cm, is a requirement for a number of high-priority science goals. These are headlined by obtaining trigonometric parallax distances to pulsars in pulsar-black hole pairs and OH masers anywhere in the Milky Way and the Magellanic Clouds. We propose a solution for the most difficult technical problems in S-VLBI by the MultiView approach where multiple sources, separated by several degrees on the sky, are observed simultaneously. We simulated a number of challenging S-VLBI configurations, with orbit errors up to 8 m in size and with ionospheric atmospheres consistent with poor conditions. In these simulations we performed MultiView analysis to achieve the required science goals. This approach removes the need for beam switching requiring a Control Moment Gyro, and the space and ground infrastructure required for high-quality orbit reconstruction of a space-based radio telescope. This will dramatically reduce the complexity of S-VLBI missions which implement the phase-referencing technique.
Fast and Efficient Discrimination of Traveling Salesperson Problem Stimulus Difficulty
ERIC Educational Resources Information Center
Dry, Matthew J.; Fontaine, Elizabeth L.
2014-01-01
The Traveling Salesperson Problem (TSP) is a computationally difficult combinatorial optimization problem. In spite of its relative difficulty, human solvers are able to generate close-to-optimal solutions in a close-to-linear time frame, and it has been suggested that this is due to the visual system's inherent sensitivity to certain geometric…
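For contrast with the close-to-optimal human performance the abstract describes, a classic greedy baseline runs in roughly quadratic time and typically produces noticeably longer tours on larger instances. This sketch is illustrative and is not part of the study; the example points are invented.

```python
import math

def nearest_neighbor_tour(points):
    """Greedy nearest-neighbor TSP heuristic: from the current city,
    always visit the closest unvisited city next."""
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(points, tour):
    # Sum of edge lengths, including the closing edge back to the start
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# Four corners of a unit square: the optimal tour has length 4.0
square = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = nearest_neighbor_tour(square)
```

On the trivial square instance the heuristic happens to find the optimum; the interesting comparison in the literature is on larger point sets, where human solutions remain close to optimal while greedy tours degrade.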
The Problem Solving Studio: An Apprenticeship Environment for Aspiring Engineers
ERIC Educational Resources Information Center
Le Doux, Joseph M.; Waller, Alisha A.
2016-01-01
This paper describes the problem-solving studio (PSS) learning environment. PSS was designed to teach students how to solve difficult analytical engineering problems without resorting to rote memorization of algorithms, while at the same time developing their deep conceptual understanding of the course topics. There are several key features of…
ERIC Educational Resources Information Center
Walker, Decker F.
This paper addresses the reasons that it is difficult to find good educational software and proposes measures for coping with this problem. The fundamental problem is a shortage of educational software that can be used as a major part of the teaching of academic subjects in elementary and secondary schools--a shortage that is both the effect and…
Why Inquiry Is Inherently Difficult...and Some Ways to Make It Easier
ERIC Educational Resources Information Center
Meyer, Daniel Z.; Avery, Leanne M.
2010-01-01
In this article, the authors offer a framework that identifies two critical problems in designing inquiry-based instruction and suggests three models for developing instruction that overcomes those problems. The Protocol Model overcomes the Getting on Board Problem by providing students an initial experience through clearly delineated steps with a…
How to Arrive at Good Research Questions?
ERIC Educational Resources Information Center
Gafoor, K. Abdul
2008-01-01
Identifying an area of research, choosing a topic, deciding on a problem, and formulating it into a researchable question are very difficult stages in the whole research process, at least for beginners. Few books on research methodology elaborate the various processes involved in problem selection and clarification. Viewing research and problem selection as…
Integrating Worked Examples into Problem Posing in a Web-Based Learning Environment
ERIC Educational Resources Information Center
Hsiao, Ju-Yuan; Hung, Chun-Ling; Lan, Yu-Feng; Jeng, Yoau-Chau
2013-01-01
Most students lack experience with problem posing and perceive it as difficult. The study hypothesized that worked examples may have benefits for supporting students' problem-posing activities. A quasi-experiment was conducted in the context of a business mathematics course to examine the effects of integrating worked examples into…
Graff, Mario; Poli, Riccardo; Flores, Juan J
2013-01-01
Modeling the behavior of algorithms is the realm of evolutionary algorithm theory. From a practitioner's point of view, theory must provide some guidelines regarding which algorithm/parameters to use in order to solve a particular problem. Unfortunately, most theoretical models of evolutionary algorithms are difficult to apply to realistic situations. However, in recent work (Graff and Poli, 2008, 2010), where we developed a method to practically estimate the performance of evolutionary program-induction algorithms (EPAs), we started addressing this issue. The method was quite general; however, it suffered from some limitations: it required the identification of a set of reference problems, it required hand picking a distance measure in each particular domain, and the resulting models were opaque, typically being linear combinations of 100 features or more. In this paper, we propose a significant improvement of this technique that overcomes the three limitations of our previous method. We achieve this through the use of a novel set of features for assessing problem difficulty for EPAs which are very general, essentially based on the notion of finite difference. To show the capabilities of our technique and to compare it with our previous performance models, we create models for the same two important classes of problems (symbolic regression on rational functions and Boolean function induction) used in our previous work. We model a variety of EPAs. The comparison showed that for the majority of the algorithms and problem classes, the new method produced much simpler and more accurate models than before. To further illustrate the practicality of the technique and its generality (beyond EPAs), we have also used it to predict the performance of both autoregressive models and EPAs on the problem of wind speed forecasting, obtaining simpler and more accurate models that outperform in all cases our previous performance models.
ERIC Educational Resources Information Center
Kerr, Deirdre
2014-01-01
Educational video games provide an opportunity for students to interact with and explore complex representations of academic content and allow for the examination of problem-solving strategies and mistakes that can be difficult to capture in more traditional environments. However, data from such games are notoriously difficult to analyze. This…
Overcoming Misconceptions in Neurophysiology Learning: An Approach Using Color-Coded Animations
ERIC Educational Resources Information Center
Guy, Richard
2012-01-01
Anyone who has taught neurophysiology would be aware of recurring concepts that students find difficult to understand. However, a greater problem is the development of misconceptions that may be difficult to change. For example, one common misconception is that action potentials pass directly across chemical synapses. Difficulties may be…
ERIC Educational Resources Information Center
Auduc, Jean-Louis
This book outlines challenges involved in ensuring that teacher training becomes the gateway to implementation of appropriate strategies for students to achieve and for managing the problems of authority, discipline, and aggressive behavior. The six chapters examine: (1) "Teaching in Schools or Classes Considered Difficult: A Contemporary…
Reflectivity of crack sealant.
DOT National Transportation Integrated Search
2002-01-01
Crack sealing is used in road maintenance but presents a problem when crack seal material visually pops out on the roadway, making it difficult to see lane stripes. This problem will increase as New Mexico increases its use of crack sealants. This su...
PLUM: Parallel Load Balancing for Unstructured Adaptive Meshes. Degree awarded by Colorado Univ.
NASA Technical Reports Server (NTRS)
Oliker, Leonid
1998-01-01
Dynamic mesh adaption on unstructured grids is a powerful tool for computing large-scale problems that require grid modifications to efficiently resolve solution features. By locally refining and coarsening the mesh to capture physical phenomena of interest, such procedures make standard computational methods more cost effective. Unfortunately, an efficient parallel implementation of these adaptive methods is rather difficult to achieve, primarily due to the load imbalance created by the dynamically-changing nonuniform grid. This requires significant communication at runtime, leading to idle processors and adversely affecting the total execution time. Nonetheless, it is generally thought that unstructured adaptive-grid techniques will constitute a significant fraction of future high-performance supercomputing. Various dynamic load balancing methods have been reported to date; however, most of them either lack a global view of loads across processors or do not apply their techniques to realistic large-scale applications.
Muhammad Sarfraz, Rai; Bashir, Sajid; Mahmood, Asif; Ahsan, Haseeb; Riaz, Humayun; Raza, Hina; Rashid, Zermina; Atif Raza, Syed; Asad Abrar, Muhammad; Abbas, Khawar; Yasmeen, Tahira
2017-03-01
Solubility is concerned with the ability of solute and solvent to form a homogeneous mixture. If the solubility of a drug is low, then it is usually difficult to achieve the desired therapeutic level of the drug. Most newly developed entities have solubility problems and encounter difficulty in dissolution. The basic aim of solubility enhancement is to achieve the desired therapeutic level of drug to produce the required pharmacological response. Different techniques are being used to enhance the solubility of water-insoluble drugs. These techniques include particle size reduction, spray drying, kneading method, solvent evaporation method, salt formation, microemulsions, co-solvency, hydrosols, prodrug approach, supercritical fluid process, hydrogel microparticles, etc. Selection of a solubility-improving method depends on drug properties, site of absorption, and required dosage form characteristics. A variety of polymers are also used to enhance the solubility of these drugs, such as polyethylene glycol 300, polyvinyl pyrrolidone, chitosan, β-cyclodextrins, etc.
SpineCreator: a Graphical User Interface for the Creation of Layered Neural Models.
Cope, A J; Richmond, P; James, S S; Gurney, K; Allerton, D J
2017-01-01
There is a growing requirement in computational neuroscience for tools that permit collaborative model building, model sharing, combining existing models into a larger system (multi-scale model integration), and are able to simulate models using a variety of simulation engines and hardware platforms. Layered XML model specification formats solve many of these problems; however, they are difficult to write and visualise without tools. Here we describe a new graphical software tool, SpineCreator, which facilitates the creation and visualisation of layered models of point spiking neurons or rate coded neurons without the need for programming. We demonstrate the tool through the reproduction and visualisation of published models and show simulation results using code generation interfaced directly into SpineCreator. As a unique application for the graphical creation of neural networks, SpineCreator represents an important step forward for neuronal modelling.
Propeller Flaps: A Review of Indications, Technique, and Results
D'Arpa, Salvatore; Toia, Francesca; Pirrello, Roberto; Moschella, Francesco; Cordova, Adriana
2014-01-01
In the last years, propeller flaps have become an appealing option for coverage of a large range of defects. Besides having a more reliable vascular pedicle than traditional flap, propeller flaps allow for great freedom in design and for wide mobilization that extend the possibility of reconstructing difficult wounds with local tissues and minimal donor-site morbidity. They also allow one-stage reconstruction of defects that usually require multiple procedures. Harvesting of a propeller flap requires accurate patient selection, preoperative planning, and dissection technique. Complication rate can be kept low, provided that potential problems are prevented, promptly recognized, and adequately treated. This paper reviews current knowledge on propeller flaps. Definition, classification, and indications in the different body regions are discussed based on a review of the literature and on the authors' experience. Details about surgical technique are provided, together with tips to avoid and manage complications. PMID:24971367
A case for the sentence in reading comprehension.
Scott, Cheryl M
2009-04-01
This article addresses sentence comprehension as a requirement of reading comprehension within the framework of the narrow view of reading that was advocated in the prologue to this forum. The focus is on the comprehension requirements of complex sentences, which are characteristic of school texts. Topics included in this discussion are (a) evidence linking sentence comprehension and syntax with reading, (b) syntactic properties of sentences that make them difficult to understand, (c) clinical applications for the assessment of sentence comprehension as it relates to reading, and (d) evidence and methods for addressing sentence complexity in treatment. Sentence complexity can create comprehension problems for struggling readers. The contribution of sentence comprehension to successful reading has been overlooked in models that emphasize domain-general comprehension strategies at the text level. The author calls for the evaluation of sentence comprehension within the context of content domains where complex sentences are found.
Forming Mandrels for X-Ray Mirror Substrates
NASA Technical Reports Server (NTRS)
Blake, Peter N.; Saha, Timo; Zhang, Will; O'Dell, Stephen; Kester, Thomas; Jones, William
2011-01-01
Future x-ray astronomical missions, like the International X-ray Observatory (IXO), will likely require replicated mirrors to reduce both mass and production costs. Accurately figured and measured mandrels - upon which the mirror substrates are thermally formed - are essential to enable these missions. The challenge of making these mandrels within reasonable costs and schedule has led the Goddard and Marshall Space Flight Centers to develop in-house processes and to encourage small businesses to attack parts of the problem. Both Goddard and Marshall have developed full-aperture polishing processes and metrologies that yield high-precision axial traces of the finished mandrels. Outside technologists have been addressing challenges presented by subaperture CNC machining processes: particularly difficult is the challenge of reducing mid-spatial frequency errors below 2 nm rms. The end-product of this approach is a realistic plan for the economically feasible production of mandrels that meet program requirements in both figure and quantity.
Component model reduction via the projection and assembly method
NASA Technical Reports Server (NTRS)
Bernard, Douglas E.
1989-01-01
The problem of acquiring a simple but sufficiently accurate model of a dynamic system is made more difficult when the dynamic system of interest is a multibody system comprised of several components. A low order system model may be created by reducing the order of the component models and making use of various available multibody dynamics programs to assemble them into a system model. The difficulty is in choosing the reduced order component models to meet system level requirements. The projection and assembly method, proposed originally by Eke, solves this difficulty by forming the full order system model, performing model reduction at the system level using system level requirements, and then projecting the desired modes onto the components for component level model reduction. The projection and assembly method is analyzed to show the conditions under which the desired modes are captured exactly, to the numerical precision of the algorithm.
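The system-level reduction step described above can be illustrated with plain modal truncation: assemble the full mass and stiffness matrices, keep the lowest-frequency modes, and project. This is a minimal sketch of that general idea, not Eke's algorithm; the 3-DOF spring-mass example matrices are invented.

```python
import numpy as np

def modal_reduction(M, K, n_keep):
    """Reduce a structural model M x'' + K x = f by keeping the n_keep
    lowest-frequency modes of the full system. M is assumed symmetric
    positive definite so a Cholesky factorization exists."""
    # Generalized symmetric eigenproblem K phi = w^2 M phi, solved by
    # transforming to a standard problem with the Cholesky factor of M.
    L = np.linalg.cholesky(M)
    Linv = np.linalg.inv(L)
    A = Linv @ K @ Linv.T
    w2, V = np.linalg.eigh(A)          # eigenvalues in ascending order
    Phi = Linv.T @ V[:, :n_keep]       # retained mass-normalized mode shapes
    M_r = Phi.T @ M @ Phi              # reduced mass matrix (identity)
    K_r = Phi.T @ K @ Phi              # reduced stiffness (diagonal of w^2)
    return Phi, M_r, K_r, np.sqrt(w2[:n_keep])

# 3-DOF spring-mass chain with unit masses, reduced to its 2 lowest modes
M = np.eye(3)
K = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]])
Phi, M_r, K_r, freqs = modal_reduction(M, K, 2)
```

The projection-and-assembly method goes further by pushing these system-level modes back down onto the individual components, but the projection operation itself is the same kind of congruence transform shown here.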
Visualizing Parallel Computer System Performance
NASA Technical Reports Server (NTRS)
Malony, Allen D.; Reed, Daniel A.
1988-01-01
Parallel computer systems are among the most complex of man's creations, making satisfactory performance characterization difficult. Despite this complexity, there are strong, indeed, almost irresistible, incentives to quantify parallel system performance using a single metric. The fallacy lies in succumbing to such temptations. A complete performance characterization requires not only an analysis of the system's constituent levels, it also requires both static and dynamic characterizations. Static or average behavior analysis may mask transients that dramatically alter system performance. Although the human visual system is remarkably adept at interpreting and identifying anomalies in false color data, the importance of dynamic, visual scientific data presentation has only recently been recognized. Large, complex parallel systems pose equally vexing performance interpretation problems. Data from hardware and software performance monitors must be presented in ways that emphasize important events while eliding irrelevant details. Design approaches and tools for performance visualization are the subject of this paper.
Mukhamadiyarov, Rinat A; Sevostyanova, Victoria V; Shishkova, Daria K; Nokhrin, Andrey V; Sidorova, Olga D; Kutikhin, Anton G
2016-06-01
Broad use of graft replacement requires a detailed investigation of the host-graft interaction, including both histological examination and electron microscopy. High-quality sectioning of host tissue containing a graft is complicated; in addition, it is difficult to examine the same tissue area by both of the mentioned microscopy techniques. To solve these problems, we developed a new technique of epoxy resin embedding with further grinding, polishing, and staining. Graft-containing tissues prepared by grinding and polishing preserved their structure; in contrast, sectioning frequently required the explantation of the graft and led to tissue disintegration. Moreover, stained samples prepared by grinding and polishing may then be assessed by both light microscopy and backscattered scanning electron microscopy. Therefore, grinding and polishing outperform sectioning when applied to tissues with a graft. Copyright © 2016 Elsevier Ltd. All rights reserved.
Geopan AT@S: a Brokering Based Gateway to Georeferenced Historical Maps for Risk Analysis
NASA Astrophysics Data System (ADS)
Previtali, M.
2017-08-01
The importance of ancient and historical maps is nowadays recognized in many applications (e.g., urban planning, landscape valorisation and preservation, land changes identification, etc.). In the last years a great effort has been made by different institutions, such as Geographical Institutes, Public Administrations, and collaborative communities, for digitizing and publishing online collections of historical maps. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required. In addition, problems of interconnection between different data sources and their restricted interoperability may arise. This paper describes a new brokering-based gateway developed to assure interoperability between data, in particular georeferenced historical maps and geographic data, gathered from different data providers, with various features and referring to different historical periods. The developed approach is exemplified by a new application named GeoPAN Atl@s that is aimed at linking land changes in the Northern Italy area with risk analysis (local seismicity amplification and flooding risk) by using multi-temporal data sources and historic maps.
Avoiding Errors in the Management of Pediatric Polytrauma Patients.
Chin, Kenneth; Abzug, Joshua; Bae, Donald S; Horn, Bernard D; Herman, Martin; Eberson, Craig P
2016-01-01
Management of pediatric polytrauma patients is one of the most difficult challenges for orthopaedic surgeons. Multisystem injuries frequently include complex orthopaedic surgical problems that require intervention. The physiology and anatomy of children and adolescent trauma patients differ from the physiology and anatomy of an adult trauma patient, which alters the types of injuries sustained and the ideal methods for management. Errors of pediatric polytrauma care are included in two broad categories: missed injuries and inadequate fracture treatment. Diagnoses may be missed most frequently because of a surgeon's inability to reliably assess patients who have traumatic brain injuries and painful distracting injuries. Cervical spine injuries are particularly difficult to identify in a child with polytrauma and may have devastating consequences. In children who have multiple injuries, the stabilization of long bone fractures with pediatric fixation techniques, such as elastic nails and other implants, allows for easier care and more rapid mobilization compared with cast treatments. Adolescent polytrauma patients who are approaching skeletal maturity, however, are ideally treated as adults to avoid complications, such as loss of fixation, and to speed rehabilitation.
Prolonged weaning: from the intensive care unit to home.
Navalesi, P; Frigerio, P; Patzlaff, A; Häußermann, S; Henseke, P; Kubitschek, M
2014-01-01
Weaning is the process of withdrawing mechanical ventilation which starts with the first spontaneous breathing trial (SBT). Based on the degree of difficulty and duration, weaning is classified as simple, difficult and prolonged. Prolonged weaning, which includes patients who fail 3 SBTs or are still on mechanical ventilation 7 days after the first SBT, affects a relatively small fraction of mechanically ventilated ICU patients, but these patients require disproportionate resources. There are several potential causes which can lead to prolonged weaning. It is nonetheless important to understand the problem from the point of view of each individual patient in order to adopt appropriate treatment and define a precise prognosis. An otherwise stable patient who remains on mechanical ventilation will be considered for transfer to a specialized weaning unit (SWU). Though there is no precise definition, SWUs can be considered highly specialized and protected environments for patients requiring mechanical ventilation despite resolution of the acute disorder. Proper staffing, well defined short-term and long-term goals, and attention to psychological and social problems represent key determinants of SWU success. Some patients cannot be weaned, either partly or entirely, and may require long-term home mechanical ventilation. In these cases the logistics relating to caregivers and the equipment must be carefully considered and addressed. Copyright © 2014 Sociedade Portuguesa de Pneumologia. Published by Elsevier España. All rights reserved.
Autonomous Intersection Management
2009-12-01
is irrelevant. Fortunately, researchers are attacking this problem with many techniques. In 2004, Honda introduced an intelligent night vision system...or less a solved problem. The problem itself is not too difficult: there are no pedestrians or cyclists and vehicles travel in the same direction at...organized according to the following subgoals, each of which is a contribution of the thesis. 1. Problem definition First, this thesis contributes a
The resolvent of singular integral equations. [of kernel functions in mixed boundary value problems
NASA Technical Reports Server (NTRS)
Williams, M. H.
1977-01-01
The investigation reported is concerned with the construction of the resolvent for any given kernel function. In problems with ill-behaved inhomogeneous terms as, for instance, in the aerodynamic problem of flow over a flapped airfoil, direct numerical methods become very difficult. A description is presented of a solution method by resolvent which can be employed in such problems.
Anaesthesia and intensive care management of face transplantation.
Sedaghati-nia, A; Gilton, A; Liger, C; Binhas, M; Cook, F; Ait-Mammar, B; Scherrer, E; Hivelin, M; Lantieri, L; Marty, J; Plaud, B
2013-10-01
The face-grafting techniques are innovative and highly complex, requiring well-defined organization of all the teams involved. Subsequent to the first report in France in 2005, there have been 17 facial allograft transplantations performed worldwide. We describe anaesthesia and postoperative management, and the problems encountered, during the course of seven facial composite tissue grafts performed between 2007 and 2011 in our hospital. The reasons for transplantation were ballistic trauma in four patients, extensive neurofibromatosis in two patients, and severe burns in one patient. Anaesthesia for this long procedure involves advanced planning for airway management, vascular access, technique of anaesthesia, and fluid management. Preparation and grafting phases were highly haemorrhagic (>one blood volume), requiring massive transfusion. Median (range) volumes given for packed red cell (PRC) and fresh-frozen plasma (FFP) were 64.2 ml kg(-1) (35.5-227.5) and 46.2 ml kg(-1) (6.3-173.7), respectively. Blood loss quantification was difficult because of diffuse bleeding to the drapes. The management of patients with neurofibromatosis or burns involving the whole face was more difficult and haemorrhagic than the patients with lower face transplantation. Average surgical duration was 19.1 h (15-28 h). Postoperative severe graft oedema was present in most patients. Most patients encountered complications in ICU, such as renal insufficiency, acute respiratory distress syndrome, and jugular thrombosis. Opportunistic bacterial infections were a feature during the postoperative period in these highly immunosuppressed patients.
Large-scale wind turbine structures
NASA Technical Reports Server (NTRS)
Spera, David A.
1988-01-01
The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, the modeling of structural dynamic response, and the design of innovative structural solutions. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.
Direct Monte Carlo simulation of chemical reaction systems: Simple bimolecular reactions
NASA Astrophysics Data System (ADS)
Piersall, Shannon D.; Anderson, James B.
1991-07-01
In applications to several simple reaction systems we have explored a ``direct simulation'' method for predicting and understanding the behavior of gas phase chemical reaction systems. This Monte Carlo method, originated by Bird, has been found remarkably successful in treating a number of difficult problems in rarefied gas dynamics. Extension to chemical reactions offers a powerful tool for treating reaction systems with nonthermal distributions, with coupled gas-dynamic and reaction effects, with emission and absorption of radiation, and with many other effects difficult to treat in any other way. The usual differential equations of chemical kinetics are eliminated. For a bimolecular reaction of the type A+B→C+D with a rate sufficiently low to allow a continued thermal equilibrium of reactants we find that direct simulation reproduces the expected second order kinetics. Simulations for a range of temperatures yield the activation energies expected for the reaction models specified. For faster reactions under conditions leading to a depletion of energetic reactant species, the expected slowing of reaction rates and departures from equilibrium distributions are observed. The minimum sample sizes required for adequate simulations are as low as 1000 molecules for these cases. The calculations are found to be simple and straightforward for the homogeneous systems considered. Although computation requirements may be excessively high for very slow reactions, they are reasonably low for fast reactions, for which nonequilibrium effects are most important.
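The second-order kinetics described above can be illustrated with a toy stochastic simulation of A+B→C+D in a well-mixed volume. This is a greatly simplified sketch, not Bird's full DSMC algorithm (which also tracks particle positions, velocities, and collision pairs); all function names and parameter values here are illustrative.

```python
import random

def simulate_bimolecular(n_a, n_b, k, dt, steps, seed=0):
    """Toy direct Monte Carlo simulation of A + B -> C + D in a
    well-mixed volume. Each A molecule reacts within a time step with
    probability k * n_b * dt; no rate equations are integrated.
    (Bird's full DSMC also tracks particle velocities and collisions.)"""
    rng = random.Random(seed)
    history = [(n_a, n_b)]
    for _ in range(steps):
        events = 0
        for _ in range(n_a):
            if rng.random() < k * n_b * dt:
                events += 1
        events = min(events, n_a, n_b)  # cannot react more pairs than exist
        n_a -= events
        n_b -= events
        history.append((n_a, n_b))
    return history
```

Because A and B are consumed pairwise, the difference in their counts is invariant, and the per-step reaction probability is proportional to both populations, which is exactly the mass-action second-order behavior the abstract reports direct simulation recovering.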
Treatment of Eczema: Corticosteroids and Beyond.
Chong, Melanie; Fonacier, Luz
2016-12-01
Atopic dermatitis (AD) is a chronic inflammatory skin condition that requires a manifold approach to therapy. The goal of therapy is to restore the function of the epidermal barrier and to reduce skin inflammation. This can be achieved with skin moisturization and topical anti-inflammatory agents, such as topical corticosteroids and calcineurin inhibitors. Furthermore, proactive therapy with twice weekly use of both topical corticosteroids and calcineurin inhibitors in previously affected areas has been found to reduce the time to the next eczematous flare. Adjunctive treatment options include wet wrap therapy, antihistamines, and vitamin D supplementation. Bacterial colonization, in particular Staphylococcus aureus, can contribute to eczematous flares and overt infection. Use of systemic antibiotics in infected lesions is warranted; however, empiric antibiotic use in uninfected lesions is controversial. Local antiseptic measures (i.e., bleach baths) and topical antimicrobial therapies can be considered in patients with high bacterial colonization. Difficult-to-treat AD is a complex clinical problem that may require re-evaluation of the initial diagnosis of AD, especially if the onset of disease occurs in adulthood. It may also necessitate evaluation for contact, food, and inhaled allergens that may exacerbate the underlying AD. There are a host of systemic therapies that have been successful in patients with difficult-to-treat AD; however, these agents are limited by their side effect profiles. Lastly, with further insight into the pathophysiology of AD, new biological agents have been investigated with promising results.
The molecular matching problem
NASA Technical Reports Server (NTRS)
Kincaid, Rex K.
1993-01-01
Molecular chemistry contains many difficult optimization problems that have begun to attract the attention of optimizers in the Operations Research community. Problems including protein folding, molecular conformation, molecular similarity, and molecular matching have been addressed. Minimum energy conformations for simple molecular structures such as water clusters, Lennard-Jones microclusters, and short polypeptides have dominated the literature to date. However, a variety of interesting problems exist and we focus here on a molecular structure matching (MSM) problem.
An in-flight simulation of approach and landing of a STOL transport with adverse ground effect
NASA Technical Reports Server (NTRS)
Ellis, D. R.
1976-01-01
The results of an in-flight simulation program undertaken to study the problems of landing a representative STOL transport in the presence of adverse ground effects are presented. Landings were performed with variations in ground effect magnitude, ground effect lag, and thrust response. Other variations covered the effects of augmented lift response, SAS-failures, turbulence, segmented approach, and flare warning. The basic STOL airplane required coordinated use of both stick and throttle for consistently acceptable landings, and the presence of adverse ground effects made the task significantly more difficult. Ground effect lag and good engine response gave noticeable improvement, as did augmented lift response.
Tips and techniques for engaging and managing the reluctant, resistant or hostile young person.
McCutcheon, Louise K; Chanen, Andrew M; Fraser, Richard J; Drew, Lorelle; Brewer, Warrick
2007-10-01
Creating a collaborative doctor-patient relationship is the bedrock upon which effective treatments are delivered. The interaction between normal developmental changes and psychopathology can present particular challenges to clinicians attempting to assess and treat young people. Assuming an attitude in which young people are seen to be doing their best, rather than being deliberately difficult or manipulative, can help clinicians avoid a controlling or punitive relationship and can facilitate collaborative problem solving. Stigma, denial and avoidance, ambivalence, hopelessness and coercion are potential threats to engagement and must be addressed specifically. Challenging patients, such as the reluctant, resistant, aggressive, self-harming or intoxicated patient, require specific management strategies that can be learned.
Donovan, Carl; Harwood, John; King, Stephanie; Booth, Cormac; Caneco, Bruno; Walker, Cameron
2016-01-01
There are many developments for offshore renewable energy around the United Kingdom whose installation typically produces large amounts of far-reaching noise, potentially disturbing many marine mammals. The potential to affect the favorable conservation status of many species means extensive environmental impact assessment requirements for the licensing of such installation activities. Quantification of such complex risk problems is difficult and much of the key information is not readily available. Expert elicitation methods can be employed in such pressing cases. We describe the methodology used in an expert elicitation study conducted in the United Kingdom for combining expert opinions based on statistical distributions and copula-like methods.
A Generalized Technique in Numerical Integration
NASA Astrophysics Data System (ADS)
Safouhi, Hassan
2018-02-01
Integration by parts is one of the most popular techniques in the analysis of integrals and is one of the simplest methods to generate asymptotic expansions of integral representations. The product of the technique is usually a divergent series formed from evaluating boundary terms; however, sometimes the remaining integral is also evaluated. Due to the successive differentiation and anti-differentiation required to form the series or the remaining integral, the technique is difficult to apply to all but the simplest problems. In this contribution, we explore a generalized and formalized integration by parts to create equivalent representations of some challenging integrals. As demonstrative archetypes, we examine Bessel integrals, Fresnel integrals and Airy functions.
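The mechanism the abstract describes, where repeated integration by parts peels off boundary terms to build a typically divergent asymptotic series plus a remainder integral, can be sketched on the classic exponential integral. This is standard textbook material, not an example taken from the paper itself:

```latex
E_1(x) = \int_x^\infty \frac{e^{-t}}{t}\,dt
       = \frac{e^{-x}}{x} - \int_x^\infty \frac{e^{-t}}{t^2}\,dt
% iterating n times yields the boundary-term series plus a remainder:
E_1(x) = e^{-x}\sum_{k=0}^{n-1} \frac{(-1)^k\,k!}{x^{k+1}}
       + (-1)^n\, n! \int_x^\infty \frac{e^{-t}}{t^{n+1}}\,dt
```

The series diverges for any fixed x as n grows (the k! dominates), yet truncating it gives excellent approximations for large x — precisely the situation in which the successive differentiations become the practical bottleneck the authors aim to formalize away.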
Visualization of the variability of 3D statistical shape models by animation.
Lamecker, Hans; Seebass, Martin; Lange, Thomas; Hege, Hans-Christian; Deuflhard, Peter
2004-01-01
Models of the 3D shape of anatomical objects and knowledge about their statistical variability are of great benefit in many computer-assisted medical applications such as image analysis and therapy or surgery planning. Statistical models of shape have successfully been applied to automate the task of image segmentation. The generation of 3D statistical shape models requires the identification of corresponding points on two shapes. This remains a difficult problem, especially for shapes of complicated topology. In order to interpret and validate variations encoded in a statistical shape model, visual inspection is of great importance. This work describes the generation and interpretation of statistical shape models of the liver and the pelvic bone.
Demonstration Of Ultra HI-FI (UHF) Methods
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.
2004-01-01
Computational aero-acoustics (CAA) requires efficient, high-resolution simulation tools. Most current techniques utilize finite-difference approaches because high order accuracy is considered too difficult or expensive to achieve with finite volume or finite element methods. However, a novel finite volume approach (Ultra HI-FI or UHF) which utilizes Hermite fluxes is presented which can achieve both arbitrary accuracy and fidelity in space and time. The technique can be applied to unstructured grids with some loss of fidelity or with multi-block structured grids for maximum efficiency and resolution. In either paradigm, it is possible to resolve ultra-short waves (less than 2 PPW). This is demonstrated here by solving the 4th CAA workshop Category 1 Problem 1.
Minimal Increase Network Coding for Dynamic Networks.
Zhang, Guoyin; Fan, Xu; Wu, Yanxia
2016-01-01
Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) designed for static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects the blocks to be encoded on the basis of the relationship between those nonzero elements, which controls changes in the degrees of the blocks; the encoding time in a dynamic network is thereby shortened. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery.
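The role the nonzero elements of an encoding vector play in network coding can be illustrated with a toy GF(2) (XOR) encoder; practical RLNC usually works over GF(2^8), and the function names and block sizes here are illustrative, not part of the MINC algorithm itself:

```python
def encode_block(blocks, coeffs):
    """XOR-combine the source blocks selected by the nonzero entries of
    the encoding vector `coeffs` (GF(2) coding; practical RLNC usually
    works over GF(2^8) with multiply-accumulate instead of plain XOR)."""
    assert len(blocks) == len(coeffs)
    out = bytes(len(blocks[0]))
    for blk, c in zip(blocks, coeffs):
        if c:  # only nonzero coefficients contribute to the coded block
            out = bytes(x ^ y for x, y in zip(out, blk))
    return out

def degree(coeffs):
    """Degree of a coded block: how many source blocks are mixed into it."""
    return sum(1 for c in coeffs if c)
```

The cost of producing a coded block scales with its degree, which is why an algorithm that keeps the increase in block degrees minimal also shortens encoding time.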
Ochi, Manami; Fujiwara, Takeo
2016-08-01
Research in parental social support has chiefly examined received social support. Studies have suggested that provided social support may also be protective for child mental health problems. We aim to investigate the association between parental social interaction (both received and provided social support) and offspring behavior problems. We analyzed the data of 982 households, including 1538 children aged 4 to 16 years, from the Japanese Study of Stratification, Health, Income, and Neighborhood (J-SHINE) survey conducted over 2010-2011. We used a 5-point Likert scale to assess social interaction including parental emotional and instrumental support received from and provided to the spouse, other co-residing family members, non-co-residing family members or relatives, neighbors, and friends. Behavior problems in offspring were assessed using parental responses to the Strengths and Difficulties Questionnaire. Associations between parental social interaction and behavior problems were analyzed using ordered logistic regression. We found that higher maternal social interaction is significantly associated with lower odds of both difficult and prosocial behavior problems, while the same associations were not found for paternal social interaction. Further, maternal provided social support showed an independent negative association with prosocial behavior problems in offspring, even when adjusted for received maternal social support and paternal social interaction. This study showed that maternal social interaction, but not paternal social interaction, might have a protective effect on offspring behavior problems. Further study is required to investigate the effect of the intervention to increase social participation among mothers whose children have behavior problems.
Biomedical Applications of NASA Science and Technology
NASA Technical Reports Server (NTRS)
Brown, James N., Jr.
1968-01-01
During the period 15 September 1968 to 14 December 1968, the NASA supported Biomedical Application Team at the Research Triangle Institute has identified 6 new problems, performed significant activities on 15 of the active problems identified previously, performed 5 computer searches of the NASA aerospace literature, and maintained one current awareness search. As a partial result of these activities, one technology transfer was accomplished. As a part of continuing problem review, 13 problems were classified inactive. Activities during the quarter involved all phases of team activity with respect to biomedical problems. As has been observed in preceding years, it has been exceedingly difficult to arrange meetings with medical investigators during the fourth quarter of the calendar year. This is a result of a combination of factors. Teaching requirements, submission of grant applications and holidays are the most significant factors involved. As a result, the numbers of new problems identified and of transfers and potential transfers are relatively low during this quarter. Most of our activities have thus been directed toward obtaining information related to problems already identified. Consequently, during the next quarter we will follow up on these activities with the expectation that transfers will be accomplished on a number of them. In addition, the normal availability of researchers to the team is expected to be restored during this quarter, permitting an increase in new problem identification activities as well as follow-up with other researchers on old problems. Another activity scheduled for the next quarter is consultation with several interested biomedical equipment manufacturers to explore means of effective interaction between the Biomedical Application Team and these companies.
Whale, Katie; Cramer, Helen; Joinson, Carol
2018-05-01
To explore the impact of the secondary school environment on young people with continence problems. In-depth qualitative semi-structured interviews. We interviewed 20 young people aged 11-19 years (11 female and nine male) with continence problems (daytime wetting, bedwetting, and/or soiling). Interviews were conducted by Skype (n = 11) and telephone (n = 9). Transcripts were analysed using inductive thematic analysis. We generated five main themes: (1) Boundaries of disclosure: friends and teachers; (2) Social consequences of avoidance and deceit; (3) Strict and oblivious gatekeepers; (4) Intimate actions in public spaces; and (5) Interrupted learning. Disclosure of continence problems at school to both friends and teachers was rare, due to the perceived stigma and fears of bullying and social isolation. The lack of disclosure to teachers and other school staff, such as pastoral care staff, creates challenges in how best to support these young people. Young people with continence problems require unrestricted access to private and adequate toilet facilities during the school day. There is a need for inclusive toilet access policies and improved toilet standards in schools. Addressing the challenges faced by young people with continence problems at school could help to remove the barriers to successful self-management of their symptoms. It is particularly concerning that young people with continence problems are at higher risk of academic underachievement. Increased support at school is needed to enable young people with continence problems to achieve their academic potential. Statement of Contribution. What is already known on this subject? Continence problems are among the most common paediatric health problems. Self-management of continence problems requires a structured schedule of fluid intake and bladder emptying. Inadequate toilet facilities and restricted access make it difficult for young people to manage their incontinence. What does this study add? Improvement is needed in teacher understanding of the needs of young people with continence problems. Young people are reluctant to disclose continence problems due to perceived stigma and fear of social isolation. Young people with continence problems may be at increased risk of academic underachievement. © 2017 The Authors. British Journal of Health Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
Wan, Haisu; Li, Yongwen; Fan, Yu; Meng, Fanrong; Chen, Chen; Zhou, Qinghua
2012-01-15
Site-directed mutagenesis has become routine in molecular biology. However, many mutants can still be very difficult to create. Complicated chimerical mutations, tandem repeats, inverted sequences, GC-rich regions, and/or heavy secondary structures can cause inefficient or incorrect binding of the mutagenic primer to the target sequence and affect the subsequent amplification. In theory, these problems can be avoided by introducing the mutations into the target sequence using mutagenic fragments and so removing the need for primer-template annealing. The cassette mutagenesis uses the mutagenic fragment in its protocol; however, in most cases it needs to perform two rounds of mutagenic primer-based mutagenesis to introduce suitable restriction enzyme sites into templates and is not suitable for routine mutagenesis. Here we describe a highly efficient method in which the template except the region to be mutated is amplified by polymerase chain reaction (PCR) and the type IIs restriction enzyme-digested PCR product is directly ligated with the mutagenic fragment. Our method requires no assistance of mutagenic primers. We have used this method to create various types of difficult-to-make mutants with mutagenic frequencies of nearly 100%. Our protocol has many advantages over the prevalent QuikChange method and is a valuable tool for studies on gene structure and function. Copyright © 2011 Elsevier Inc. All rights reserved.
Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Matzen, M. Keith
2014-09-16
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.
Bandyopadhyay, Mridula
2011-11-25
The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people's social and cultural lives. I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
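The p-chart construction the authors rely on can be sketched in a few lines. The 3-sigma binomial limits below are the standard attribute-control-chart formula; the function names and example counts are illustrative, not the authors' data:

```python
import math

def p_chart_limits(defects, totals):
    """Centre line and 3-sigma control limits for a p-chart.
    defects[i] = adverse events out of totals[i] cases in period i."""
    p_bar = sum(defects) / sum(totals)  # overall proportion (centre line)
    limits = []
    for n in totals:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        lcl = max(0.0, p_bar - 3 * sigma)
        ucl = min(1.0, p_bar + 3 * sigma)
        limits.append((lcl, ucl))
    return p_bar, limits

def out_of_control(defects, totals):
    """Indices of periods whose observed proportion breaches the limits,
    i.e. periods showing more than common-cause variation."""
    _, limits = p_chart_limits(defects, totals)
    return [i for i, (d, n) in enumerate(zip(defects, totals))
            if not limits[i][0] <= d / n <= limits[i][1]]
```

Points inside the limits reflect the predictable normal variation of a stable process; a point outside them signals a special cause — either a problem requiring intervention or, after a deliberate change, evidence that a quality-improvement effort had an effect.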
Leg lengthening in patients with congenital fibular hemimelia.
Jasiewicz, Barbara; Kacki, Wojciech; Koniarski, Arkadiusz; Kasprzyk, Marcin; Zarzycka, Maja; Tesiorowski, Maciej
2002-08-30
Background. Anisomelia in patients with congenital fibular deficiencies is a difficult orthopedic problem due to concomitant deformities of the ankle and knee. The goal of the present study was to analyze outcomes of tibia lengthening in these patients. Material and methods. In the period 1989-2001 we performed lengthening of 26 limbs in 21 patients with congenital fibular deficiency (11 female, 10 male, average age 10.1 years). Under the Achterman-Kalamchi classification, 8 tibiae were Type 1a, 3 were Type 1b, and 10 were Type 2 (including one case with bilateral defect). The average baseline shortening was 4.6 cm, i.e. 15.3%. The Ilizarov method was used in 24 cases, chondrial lengthening in the others. We measured time of lengthening, time of stabilization, total healing time, amount of lengthening, and the lengthening index, as well as the range of ankle and knee movement, the positioning of the foot, and the axis of the tibia at each stage. Problems and complications were classified according to Paley. The average follow-up was 4.9 years. Results. The mean time of lengthening was 101 days, stabilization time 177 days, total healing time 269 days, mean lengthening 5.6 cm (22.9%). As of the last examination only 7 patients did not require follow-up surgery, 6 with Type 1a and 1 with Type 1b. Conclusions. Tibia lengthening with axis correction constitutes an alternative to amputation in congenital fibular deficiency. It is a difficult procedure, however, encumbered by a significant risk of complications.
Substrate noise coupling: a pain for mixed-signal systems (Keynote Address)
NASA Astrophysics Data System (ADS)
Wambacq, Piet; Van der Plas, Geert; Donnay, Stephane; Badaroglu, Mustafa; Soens, Charlotte
2005-06-01
Crosstalk from digital to analog in mixed-signal ICs is recognized as one of the major roadblocks for systems-on-chip (SoC) in future CMOS technologies. This crosstalk mainly happens via the semiconducting silicon substrate, which is usually treated as a ground node by analog and RF designers. The substrate noise coupling problem leads more and more to malfunctioning or extra design iterations. One of the reasons is that the phenomenon of substrate noise coupling is difficult to model and hence difficult to understand. It can be caused by the switching of thousands or millions of gates and depends on layout details. From the generation side (the digital domain), coping with the large amount of noise generators can be solved by macromodeling. On the other hand, the impact of substrate noise on the analog circuits requires careful modeling at the level of transistors and of the parasitics of layout, power supply, package, and PCB. Comparison with measurements shows that, with macromodeling on the digital side and careful modeling on the analog side, both the generation and the impact of substrate noise can be predicted with an accuracy of a few dB. In addition, this combination of macromodeling at the digital side and careful modeling at the analog side leads to an understanding of the problem, which can be used for digital low-noise design techniques to minimize the generation of noise, and for substrate-noise-immune design of analog/RF circuits.
Multiobjective Optimization Using a Pareto Differential Evolution Approach
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.; Biegel, Bryan A. (Technical Monitor)
2002-01-01
Differential Evolution is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. In this paper, the Differential Evolution algorithm is extended to multiobjective optimization problems by using a Pareto-based approach. The algorithm performs well when applied to several test optimization problems from the literature.
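The single-objective Differential Evolution baseline the paper extends can be sketched as the classic DE/rand/1/bin scheme. The Pareto-based multiobjective extension itself is not shown; all names and parameter defaults below are illustrative choices, not the authors' settings:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin sketch for minimising f over box bounds.
    F is the differential weight, CR the crossover rate."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # mutation: combine three distinct members other than i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # ensure at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))  # clip to the box
            tc = f(trial)
            if tc <= cost[i]:  # greedy one-to-one replacement
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]
```

In the multiobjective setting, the scalar comparison `tc <= cost[i]` is what gets replaced by a Pareto-dominance test over the objective vectors.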
ERIC Educational Resources Information Center
Yates, Jennifer L.
2011-01-01
The purpose of this research study was to explore the process of learning and development of problem solving skills in radiologic technologists. The researcher sought to understand the nature of difficult problems encountered in clinical practice, to identify specific learning practices leading to the development of professional expertise, and to…
ERIC Educational Resources Information Center
Westbrook, Amy F.
2011-01-01
It can be difficult to find adequate strategies when teaching problem solving in a standard based mathematics classroom. The purpose of this study was to improve students' problem solving skills and attitudes through differentiated instruction when working on lengthy performance tasks in cooperative groups. This action research studied for 15 days…
ERIC Educational Resources Information Center
Tawfik, Andrew A.
2017-01-01
Theorists have argued instructional strategies that emphasize ill-structured problem solving are an effective means to support higher order learning skills such as argumentation. However, argumentation is often difficult because novices lack the expertise or experience needed to solve contextualized problems. One way to supplement this lack of…
ERIC Educational Resources Information Center
Vick, Malcolm
2006-01-01
Current reviews of teacher education pay considerable attention to problems associated with the practicum, and often claim to propose major changes in order to improve the quality of new graduates. Many of the problems they address concerning the practicum and its relation to the "theoretical" component of programs are longstanding, and…
ERIC Educational Resources Information Center
Dolby, J. L.; and others
The study is concerned with the linguistic problem involved in text compression--extracting, indexing, and the automatic creation of special-purpose citation dictionaries. In spite of early success in using large-scale computers to automate certain human tasks, these problems remain among the most difficult to solve. Essentially, the problem is to…
EEG Estimates of Cognitive Workload and Engagement Predict Math Problem Solving Outcomes
ERIC Educational Resources Information Center
Beal, Carole R.; Galan, Federico Cirett
2012-01-01
In the present study, the authors focused on the use of electroencephalography (EEG) data about cognitive workload and sustained attention to predict math problem solving outcomes. EEG data were recorded as students solved a series of easy and difficult math problems. Sequences of attention and cognitive workload estimates derived from the EEG…
The Compound Atwood Machine Problem
ERIC Educational Resources Information Center
Coelho, R. Lopes
2017-01-01
The present paper accounts for progress in physics teaching in the sense that a problem, which has been closed to students for being too difficult, is gained for the high school curriculum. This problem is the compound Atwood machine with three bodies. Its introduction into high school classes is based on a recent study on the weighing of an Atwood machine.
The Crisis Manual for Early Childhood Teachers: How To Handle the Really Difficult Problems.
ERIC Educational Resources Information Center
Miller, Karen
More and more teachers report the increasing incidence of crises in children's lives, crises that interfere with the child's ability to learn. Noting that developmentally appropriate curricula must respond to the issues of immediate concern and interest to children, this source book assists teachers in facing difficult issues in the classroom and…
ERIC Educational Resources Information Center
Kochanska, Grazyna; Kim, Sanghag
2013-01-01
Background: Research has shown that interactions between young children's temperament and the quality of care they receive predict the emergence of positive and negative socioemotional developmental outcomes. This multimethod study addresses such interactions, using observed and mother-rated measures of difficult temperament, children's…
Computer Problems Can Infuriate Even the Most Tech Savvy
ERIC Educational Resources Information Center
Goldsborough, Reid
2004-01-01
Which high-tech products give you the most grief? Surprisingly, more people singled out TiVo and replay digital recording systems than personal computers, according to a recent survey by Best Buy. Nine percent of people said they found these TV devices difficult to use. The same percentage found PDAs (personal digital assistants) difficult. Only 2…
Insight into the ten-penny problem: guiding search by constraints and maximization.
Öllinger, Michael; Fedor, Anna; Brodt, Svenja; Szathmáry, Eörs
2017-09-01
For a long time, insight problem solving has been understood either as nothing special or as a particular class of problem solving. The first view implies the necessity of finding efficient heuristics that restrict the search space; the second, the necessity of overcoming self-imposed constraints. Recently, promising hybrid cognitive models have attempted to merge both approaches. In this vein, we were interested in the interplay of constraints and heuristic search when problem solvers were asked to solve a difficult multi-step problem, the ten-penny problem. In three experimental groups and one control group (N = 4 × 30) we aimed to reveal which constraints drive problem difficulty in this problem, and how relaxing constraints and providing an efficient search criterion facilitate the solution. We also investigated how the search behavior of successful problem solvers and non-solvers differs. We found that relaxing constraints was necessary but not sufficient to solve the problem. Without efficient heuristics that facilitate the restriction of the search space and testing of the progress of the problem solving process, the relaxation of constraints was not effective. Relaxing constraints and applying the search criterion are both necessary to effectively increase solution rates. We also found that successful solvers showed promising moves earlier and had higher maximization and variation rates across solution attempts. We propose that this finding sheds light on how different strategies contribute to solving difficult problems. Finally, we speculate about the implications of our findings for insight problem solving.
Counting Tree Growth Rings Moderately Difficult to Distinguish
C. B. Briscoe; M. Chudnoff
1964-01-01
There is an extensive literature dealing with techniques and gadgets to facilitate counting tree growth rings. A relatively simple method is described below, satisfactory for species too difficult to count in the field, but not sufficiently difficult to require the preparation of microscope slides or staining techniques.
Interpretations of Quantum Theory in the Light of Modern Cosmology
NASA Astrophysics Data System (ADS)
Castagnino, Mario; Fortin, Sebastian; Laura, Roberto; Sudarsky, Daniel
2017-11-01
The difficult issues related to the interpretation of quantum mechanics and, in particular, the "measurement problem" are revisited using as motivation the process of generation of structure from quantum fluctuations in inflationary cosmology. The unessential mathematical complexity of the particular problem is bypassed, facilitating the discussion of the conceptual issues, by considering, within the paradigm set up by the cosmological problem, another problem where symmetry serves as a focal point: a simplified version of Mott's problem.
The Foreman Problem in Japanese Industry.
ERIC Educational Resources Information Center
Thurley, Keith
Britain studied supervisory training in Japan in order to gain insight into its own training problems. Traditional supervision in Japanese industry had produced incapable foremen through seniority promotion, caused difficult relationships because of authoritarian attitudes, and failed to clarify authority roles. The government recommended more…
Lessons Learned During Solutions of Multidisciplinary Design Optimization Problems
NASA Technical Reports Server (NTRS)
Patnaik, Suna N.; Coroneos, Rula M.; Hopkins, Dale A.; Lavelle, Thomas M.
2000-01-01
Optimization research at NASA Glenn Research Center has addressed the design of structures, aircraft and airbreathing propulsion engines. During solution of the multidisciplinary problems several issues were encountered. This paper lists four issues and discusses the strategies adopted for their resolution: (1) The optimization process can lead to an inefficient local solution. This deficiency was encountered during design of an engine component. The limitation was overcome through an augmentation of animation into optimization. (2) Optimum solutions obtained were infeasible for aircraft and air-breathing propulsion engine problems. Alleviation of this deficiency required a cascading of multiple algorithms. (3) Profile optimization of a beam produced an irregular shape. Engineering intuition restored the regular shape for the beam. (4) The solution obtained for a cylindrical shell by a subproblem strategy converged to a design that can be difficult to manufacture. Resolution of this issue remains a challenge. The issues and resolutions are illustrated through six problems: (1) design of an engine component, (2) synthesis of a subsonic aircraft, (3) operation optimization of a supersonic engine, (4) design of a wave-rotor-topping device, (5) profile optimization of a cantilever beam, and (6) design of a cylindrical shell. The combined effort of designers and researchers can bring the optimization method from academia to industry.
An advanced artificial intelligence tool for menu design.
Khan, Abdus Salam; Hoffmann, Achim
2003-01-01
Computer-assisted menu design remains a difficult task. Usually, the knowledge that aids menu design by computer is hard-coded, so a computerised menu planner cannot handle the menu design problem for an unanticipated client. To address this problem we developed a menu design tool, MIKAS (menu construction using incremental knowledge acquisition system), an artificial intelligence system that allows the incremental development of a knowledge base for menu design. We allow an incremental knowledge acquisition process in which the expert is only required to provide hints to the system in the context of actual problem instances during menu design, using menus stored in a so-called Case Base. Our system incorporates Case-Based Reasoning (CBR), an Artificial Intelligence (AI) technique developed to mimic human problem solving behaviour. Ripple Down Rules (RDR) are a proven technique for acquiring classification knowledge from experts directly while they are using the system, which complements CBR in a very fruitful way. This combination allows the incremental improvement of the menu design system while it is already in routine use. We believe MIKAS allows better dietary practice by leveraging a dietitian's skills and expertise. As such, MIKAS has the potential to be helpful for any institution where dietary advice is practised.
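A minimal single-classification ripple-down-rules structure can be sketched as follows; the dietary rules and case attributes below are hypothetical illustrations, not taken from MIKAS.

```python
class RDRNode:
    """Single-classification Ripple Down Rules node: if the condition fires,
    descend the 'except' branch for refinements; otherwise try the 'else'
    branch. The most specific satisfied conclusion wins."""
    def __init__(self, cond, conclusion, except_=None, else_=None):
        self.cond, self.conclusion = cond, conclusion
        self.except_, self.else_ = except_, else_

    def classify(self, case, default=None):
        if self.cond(case):
            # rule fires: its conclusion holds unless an exception refines it
            refined = self.except_.classify(case) if self.except_ else None
            return refined if refined is not None else self.conclusion
        return self.else_.classify(case, default) if self.else_ else default

# hypothetical dietary rules: conditions are plain predicates on a case dict
root = RDRNode(lambda c: c["calories"] > 600, "reduce portion")
# the expert later adds an exception in context: athletes keep large portions
root.except_ = RDRNode(lambda c: c.get("athlete"), "portion ok")

print(root.classify({"calories": 700}))                       # reduce portion
print(root.classify({"calories": 700, "athlete": True}))      # portion ok
print(root.classify({"calories": 400}, default="no advice"))  # no advice
```

The incremental character of RDR is visible in the exception: adding it changes the verdict only for cases matching the new context, leaving earlier behaviour intact.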
NASA Astrophysics Data System (ADS)
Nichols, Jeri Ann
This study examined the relationship between mathematics background and performance on graph-related problems in physics before and after instruction on the graphical analysis of motion and several microcomputer-based laboratory experiences. Students identified as either having or not having a graphing technology enhanced precalculus mathematics background were further categorized into one of four groups according to mathematics placement at the university. The performances of these groups were compared to identify differences. Pre- and post-test data were collected from 589 students and 312 students during Autumn Quarter 1990 and Winter Quarter 1991 respectively. Background information was collected from each student. Significant differences were found between students with the technology enhanced mathematics background and those without when considering the entire populations both quarters. The students with the technology background were favored Autumn quarter and students without the technology background were favored Winter quarter. However, the entire population included an underrepresentation of students at the highest and lowest placements; hence, these were eliminated from the analyses. No significant differences were found between the technology/no technology groups after the elimination of the underrepresented groups. All categories of students increased their mean scores from pretest to post-test; the average increase was 8.23 points Autumn Quarter and 11.41 points Winter Quarter. Males consistently outperformed females on both the pretest and the post-test in Autumn 1990. All students found questions involving the concept of acceleration more difficult than questions involving velocity or distance. Questions requiring students to create graphs were more difficult than questions requiring students to interpret graphs.
Further research involving a qualitative component is recommended to identify the specific skills students use when solving graph-related physics problems. In addition, it is recommended that a similar study be conducted to include a control group not participating in the microcomputer-based laboratory experiments.
The National Center for Atmospheric Research (NCAR) Research Data Archive: a Data Education Center
NASA Astrophysics Data System (ADS)
Peng, G. S.; Schuster, D.
2015-12-01
The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA), rda.ucar.edu, is not just another data center or data archive. It is a data education center. We not only serve data, we TEACH data. Weather and climate data is the original "Big Data" dataset and lessons learned while playing with weather data are applicable to a wide range of data investigations. Erroneous data assumptions are the Achilles heel of Big Data. It doesn't matter how much data you crunch if the data is not what you think it is. Each dataset archived at the RDA is assigned to a data specialist (DS) who curates the data. If a user has a question not answered in the dataset information web pages, they can call or email a skilled DS for further clarification. The RDA's diverse staff—with academic training in meteorology, oceanography, engineering (electrical, civil, ocean and database), mathematics, physics, chemistry and information science—means we likely have someone who "speaks your language." Data discovery is another difficult Big Data problem; one can only solve problems with data if one can find the right data. Metadata, both machine and human-generated, underpin the RDA data search tools. Users can quickly find datasets by name or dataset ID number. They can also perform a faceted search that successively narrows the options by user requirements or simply kick off an indexed search with a few words. Weather data formats can be difficult to read for non-expert users; it's usually packed in binary formats requiring specialized software and parameter names use specialized vocabularies. DSs create detailed information pages for each dataset and maintain lists of helpful software, documentation and links of information around the web. We further grow the level of sophistication of the users with tips, tutorials and data stories on the RDA Blog, http://ncarrda.blogspot.com/. 
How-to video tutorials are also posted on the NCAR Computational and Information Systems Laboratory (CISL) YouTube channel.
Worst-Case Energy Efficiency Maximization in a 5G Massive MIMO-NOMA System.
Chinnadurai, Sunil; Selvaprabhu, Poongundran; Jeong, Yongchae; Jiang, Xueqin; Lee, Moon Ho
2017-09-18
In this paper, we examine the robust beamforming design to tackle the energy efficiency (EE) maximization problem in a 5G massive multiple-input multiple-output (MIMO)-non-orthogonal multiple access (NOMA) downlink system with imperfect channel state information (CSI) at the base station. A novel joint user pairing and dynamic power allocation (JUPDPA) algorithm is proposed to minimize the inter-user interference and also to enhance the fairness between the users. This work assumes imperfect CSI by adding uncertainties to channel matrices with a worst-case model, i.e., the ellipsoidal uncertainty model (EUM). A fractional non-convex optimization problem is formulated to maximize the EE subject to the transmit power constraints and the minimum rate requirement for the cell edge user. The designed problem is difficult to solve due to its nonlinear fractional objective function. We first employ the properties of fractional programming to transform the non-convex problem into its equivalent parametric form. Then, an efficient iterative algorithm based on the constrained concave-convex procedure (CCCP) is proposed that achieves convergence to a stationary point of the above problem. Finally, Dinkelbach's algorithm is employed to determine the maximum energy efficiency. Comprehensive numerical results illustrate that the proposed scheme attains higher worst-case energy efficiency than the existing NOMA schemes and the conventional orthogonal multiple access (OMA) scheme.
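The final step, Dinkelbach's algorithm, is easy to sketch for a generic fractional objective max f(x)/g(x) with g > 0: repeatedly solve the parametric problem max f(x) - λg(x) and update λ to the achieved ratio. The rate and power functions below are illustrative stand-ins, not the paper's beamforming formulation.

```python
import numpy as np

def dinkelbach(f, g, candidates, tol=1e-9, max_iter=100):
    """Maximize f(x)/g(x) over a finite candidate set (g > 0 assumed)."""
    lam = 0.0
    for _ in range(max_iter):
        # inner problem: maximize the parametric objective f(x) - lam * g(x)
        vals = f(candidates) - lam * g(candidates)
        x_star = candidates[np.argmax(vals)]
        F = f(x_star) - lam * g(x_star)   # optimal parametric value F(lam)
        if F < tol:                       # F(lam) = 0 exactly at the optimum
            return x_star, lam
        lam = f(x_star) / g(x_star)       # update lam to the achieved ratio
    return x_star, lam

# toy "rate / consumed power" energy-efficiency ratio on a grid of power levels
p = np.linspace(0.1, 10.0, 1000)
f = lambda x: np.log2(1.0 + 2.0 * x)   # achievable rate (illustrative)
g = lambda x: x + 1.0                  # transmit plus circuit power (illustrative)
x_opt, ee = dinkelbach(f, g, p)
```

Because λ increases monotonically toward the optimal ratio, the loop typically terminates after only a handful of inner maximizations.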
Magsat attitude dynamics and control: Some observations and explanations
NASA Technical Reports Server (NTRS)
Stengle, T. H.
1980-01-01
Before its reentry 7 months after launch, Magsat transmitted an abundance of valuable data for mapping the Earth's magnetic field. As an added benefit, a wealth of attitude data for study by spacecraft dynamicists was also collected. Because of its unique configuration, Magsat presented new control problems. With its aerodynamic trim boom, attitude control was given an added dimension. Minimization of attitude drift, which could be mapped in relative detail, became the goal. Momentum control, which was accomplished by pitching the spacecraft in order to balance aerodynamic and gravity gradient torques, was seldom difficult to achieve. Several interesting phenomena observed as part of this activity included occasional momentum wheel instability and a rough correlation between solar flux and the pitch angle required to maintain acceptable momentum. An overview is presented of the attitude behavior of Magsat and some of the control problems encountered. Plausible explanations for some of this behavior are offered. Some of the control philosophy used during the mission is examined and aerodynamic trimming operations are summarized.
Chrominance watermark for mobile applications
NASA Astrophysics Data System (ADS)
Reed, Alastair; Rogers, Eliot; James, Dan
2010-01-01
Creating an imperceptible watermark which can be read by a broad range of cell phone cameras is a difficult problem. The difficulty is caused by the inherently low resolution and high noise levels of typical cell phone cameras. The quality limitations of these devices compared with a typical digital camera stem from the small size of the cell phone and cost trade-offs made by the manufacturer. To be readable under these constraints, a low resolution watermark is required which can be resolved by a typical cell phone camera. The visibility of a traditional luminance watermark was too great at this lower resolution, so a chrominance watermark was developed. The chrominance watermark takes advantage of the relatively low sensitivity of the human visual system to chrominance changes. This enables a chrominance watermark to be inserted into an image which is imperceptible to the human eye but can be read using a typical cell phone camera. Sample images will be presented showing images with very low watermark visibility which can be easily read by a typical cell phone camera.
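The underlying idea, a keyed pattern added to a chrominance plane and detected by correlation, can be sketched as follows. This is a hypothetical spread-spectrum illustration in the Cb plane of the full-range JPEG YCbCr space, not the actual watermarking scheme described in the paper.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Full-range JPEG RGB -> YCbCr conversion."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(img):
    y, cb, cr = img[..., 0], img[..., 1] - 128.0, img[..., 2] - 128.0
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255)

def embed(img_rgb, key, strength=2.0):
    """Add a keyed pseudo-random +/-1 pattern to the Cb (blue-difference) plane."""
    pattern = np.random.default_rng(key).choice([-1.0, 1.0], size=img_rgb.shape[:2])
    ycc = rgb_to_ycbcr(img_rgb.astype(float))
    ycc[..., 1] += strength * pattern
    return ycbcr_to_rgb(ycc)

def detect(img_rgb, key):
    """Correlate the Cb plane against the keyed pattern; a high score means marked."""
    pattern = np.random.default_rng(key).choice([-1.0, 1.0], size=img_rgb.shape[:2])
    cb = rgb_to_ycbcr(img_rgb.astype(float))[..., 1]
    cb = cb - cb.mean()
    return float((cb * pattern).mean())

rng = np.random.default_rng(0)
image = rng.uniform(60, 200, size=(128, 128, 3))   # synthetic stand-in for a photo
marked = embed(image, key=42)
```

With the correct key the correlation score sits near the embedding strength, while a wrong key (or an unmarked image) yields a score near zero.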
From Information Society to Knowledge Society: The Ontology Issue
NASA Astrophysics Data System (ADS)
Roche, Christophe
2002-09-01
Information society, virtual enterprise, and e-business rely more and more on communication and knowledge sharing between heterogeneous actors. But no communication is possible, and all the more so no co-operation or collaboration, if those actors do not share the same, or at least a compatible, meaning for the terms they use. An ontology, understood as an agreed vocabulary of common terms and meanings, is a solution to that problem. Nevertheless, although there is quite a lot of experience in using ontologies, several barriers remain that stand in the way of real use of ontologies. As a matter of fact, it is very difficult to build, reuse and share ontologies. We claim that the ontology problem requires a multidisciplinary approach based on sound epistemological, logical and linguistic principles. This article presents the Ontological Knowledge Station (OK Station©), a software environment for building and using ontologies which relies on such principles. The OK Station is currently being used in several industrial applications.
Cross-Identification of Astronomical Catalogs on Multiple GPUs
NASA Astrophysics Data System (ADS)
Lee, M. A.; Budavári, T.
2013-10-01
One of the most fundamental problems in observational astronomy is the cross-identification of sources. Observations are made in different wavelengths, at different times, and from different locations and instruments, resulting in a large set of independent observations. The scientific outcome is often limited by our ability to quickly perform meaningful associations between detections. The matching, however, is difficult scientifically, statistically, as well as computationally. The former two require detailed physical modeling and advanced probabilistic concepts; the latter is due to the large volumes of data and the problem's combinatorial nature. In order to tackle the computational challenge and to prepare for future surveys, whose measurements will be exponentially increasing in size past the scale of feasible CPU-based solutions, we developed a new implementation which addresses the issue by performing the associations on multiple Graphics Processing Units (GPUs). Our implementation utilizes up to 6 GPUs in combination with the Thrust library to achieve an over 40x speed up versus the previous best implementation running on a multi-CPU SQL Server.
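The core matching kernel can be sketched on the CPU with NumPy; the GPU codes batch essentially this same pairwise angular-separation test. The toy catalogs and the 2 arcsecond radius below are illustrative assumptions, not data from the paper.

```python
import numpy as np

def to_unit_vectors(ra_deg, dec_deg):
    """Convert equatorial coordinates (degrees) to 3D unit vectors."""
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.stack([np.cos(dec) * np.cos(ra),
                     np.cos(dec) * np.sin(ra),
                     np.sin(dec)], axis=-1)

def cross_match(cat_a, cat_b, radius_arcsec):
    """Return index pairs (i, j) whose angular separation is below the radius.
    Brute-force N*M comparison; production codes prune with spatial indexing."""
    va, vb = to_unit_vectors(*cat_a), to_unit_vectors(*cat_b)
    # cos(separation) via dot products of unit vectors
    cos_sep = np.clip(va @ vb.T, -1.0, 1.0)
    thresh = np.cos(np.radians(radius_arcsec / 3600.0))
    return np.argwhere(cos_sep >= thresh)

# two tiny synthetic catalogs: B is A jittered by ~0.5 arcsec, plus one orphan
ra_a = np.array([10.0, 10.01, 250.3])
dec_a = np.array([-5.0, -5.002, 36.0])
jitter = 0.5 / 3600.0
ra_b = np.append(ra_a + jitter, 180.0)
dec_b = np.append(dec_a, 40.0)
pairs = cross_match((ra_a, dec_a), (ra_b, dec_b), radius_arcsec=2.0)
```

Each source in A pairs with its jittered counterpart in B, and the orphan stays unmatched.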
Rocket exhaust plume impingement on the Voyager spacecraft
NASA Technical Reports Server (NTRS)
Baerwald, R. K.
1978-01-01
In connection with the conduct of the long-duration Voyager missions to the outer planets and the sophisticated propulsion systems required, it was necessary to carry out an investigation to avoid exhaust plume impingement problems. The rarefied gas dynamics literature indicates that, for most engineering surfaces, the assumption of diffuse reemission and complete thermal accommodation is warranted in the free molecular flow regime. This assumption was applied to an analysis of a spacecraft plume impingement problem in the near-free molecular flow regime and yielded results to within a few percent of flight data. The importance of a correct treatment of the surface temperature was also demonstrated. Specular reflection, on the other hand, was shown to yield results which may be unconservative by a factor of 2 or 3. It is pointed out that one of the most difficult portions of an exhaust plume impingement analysis is the simulation of the impinged hardware. The geometry involved must be described as accurately and completely as possible.
Visualization for Hyper-Heuristics: Back-End Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Luke
Modern society is faced with increasingly complex problems, many of which can be formulated as generate-and-test optimization problems. Yet, general-purpose optimization algorithms may sometimes require too much computational time. In these instances, hyper-heuristics may be used. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario, finding the solution significantly faster than its predecessor. However, it may be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics and an easy-to-understand scientific visualization for the produced solutions. To support the development of this GUI, my portion of the research involved developing algorithms that would allow for parsing of the data produced by the hyper-heuristics. This data would then be sent to the front-end, where it would be displayed to the end user.
NASA Technical Reports Server (NTRS)
Macfarlane, J. J.
1992-01-01
We investigate the convergence properties of Lambda-acceleration methods for non-LTE radiative transfer problems in planar and spherical geometry. Matrix elements of the 'exact' Lambda-operator are used to accelerate convergence to a solution in which both the radiative transfer and atomic rate equations are simultaneously satisfied. Convergence properties of two-level and multilevel atomic systems are investigated for methods using: (1) the complete Lambda-operator, and (2) the diagonal of the Lambda-operator. We find that the convergence properties of the method utilizing the complete Lambda-operator are significantly better than those of the diagonal Lambda-operator method, often reducing the number of iterations needed for convergence by a factor of between two and seven. However, the overall computational time required for large-scale calculations - that is, those with many atomic levels and spatial zones - is typically a factor of a few larger for the complete Lambda-operator method, suggesting that the approach is best applied to problems in which convergence is especially difficult.
High Performance Computing of Meshless Time Domain Method on Multi-GPU Cluster
NASA Astrophysics Data System (ADS)
Ikuno, Soichiro; Nakata, Susumu; Hirokawa, Yuta; Itoh, Taku
2015-01-01
High performance computing of the Meshless Time Domain Method (MTDM) on multiple GPUs using the supercomputer HA-PACS (Highly Accelerated Parallel Advanced system for Computational Sciences) at the University of Tsukuba is investigated. Generally, the finite difference time domain (FDTD) method is adopted for numerical simulation of electromagnetic wave propagation phenomena. However, the numerical domain must be divided into rectangular meshes, and it is difficult to apply the method to problems in complex domains. On the other hand, MTDM can easily be adapted to such problems because it does not require meshes. In the present study, we implement MTDM on a multi-GPU cluster to speed up the method, and numerically investigate its performance on the cluster. To reduce the computation time, the communication between the decomposed domains is hidden behind the perfectly matched layer (PML) calculation procedure. The results show that MTDM on 128 GPUs is 173 times faster than a single-CPU calculation.
The handicapped child: psychological effects of parental, marital, and sibling relationships.
Fisman, S; Wolf, L
1991-03-01
Although the nature and severity of a handicapping condition are not the sole determinants of family functioning, the presence of a child with a pervasive developmental disorder has a significant effect on family members. Maternal mental health suffers, and the resulting depression affects her role as mother and marriage partner. Unlike other handicapping conditions with obvious physical stigmata, the invisible handicap of the autistic child and the frequent delay in diagnosis contribute to the mother's self-doubt about her parental competence. While the impact on paternal psychological health is less, the fathers of autistic children are nevertheless highly stressed and appear to be particularly vulnerable to the stress generated by these difficult children. Living within this family climate, the risks for emotional and behavioral problems for siblings must be evaluated, along with their intrinsic strengths, to plan preventive interventions for these children. Effective work with these families requires an understanding of the evolution of family system problems and their dynamic and reciprocal interaction over time.
Soh, Jae Seung; Yang, Dong-Hoon; Lee, Sang Soo; Lee, Seohyun; Bae, Jungho; Byeon, Jeong-Sik; Myung, Seung-Jae; Yang, Suk-Kyun
2015-09-01
Patients with altered anatomy, such as a Roux-en-Y anastomosis, often present with various pancreaticobiliary problems requiring therapeutic intervention. However, a conventional endoscopic approach to the papilla is very difficult owing to the long afferent limb and acute angle of a Roux-en-Y anastomosis. Balloon-assisted enteroscopy can be used for endoscopic retrograde cholangiopancreatography (ERCP) in patients with altered anatomy. We encountered six cases of Roux-en-Y anastomosis with biliary problems and attempted ERCP using single balloon enteroscopy (SBE). SBE insertion followed by replacement with a conventional endoscope was attempted in five of six patients. The papilla was successfully approached using SBE in all cases. However, therapeutic intervention was completed in only three cases because of poor maneuverability caused by postoperative adhesion. We conclude that in patients with Roux-en-Y anastomosis, the ampulla can be readily accessed with SBE, but longer dedicated accessories are necessary to improve this therapeutic intervention.
Bailey, E. G.; Jensen, J.; Nelson, J.; Wiberg, H. K.; Bell, J. D.
2017-01-01
First-year students often become discouraged during introductory biology courses when repeated attempts to understand concepts nevertheless result in poor test scores. This challenge is exacerbated by traditional course structures that impose premature judgments on students’ achievements. Repeated testing has been shown to benefit student ability to recognize and recall information, but an effective means to similarly facilitate skill with higher-order problems in introductory courses is needed. Here, we show that an innovative format that uses a creative grading scheme together with weekly formative midterm exams produced significant gains in student success with difficult items requiring analysis and interpretation. This format is designed to promote tenacity and avoid discouragement by providing multiple opportunities to attempt demanding problems on exams, detailed immediate feedback, and strong incentives to retain hope and improve. Analysis of individual performance trajectories with heat maps reveals the diversity of learning patterns and provides rational means for advising students. PMID:28130269
The choice of the energy embedding law in the design of heavy ionic fusion cylindrical targets
NASA Astrophysics Data System (ADS)
Dolgoleva, GV; Zykova, A. I.
2017-10-01
The paper considers the numerical design of heavy ion fusion (FIHIF) targets, one of the branches of controlled thermonuclear fusion (CTF). One of the important tasks in target design for controlled thermonuclear fusion is the selection of the energy embedding whereby it is possible to obtain “burning” (the presence of thermonuclear reactions) of the working DT region. The work is devoted to the rapid ignition of FIHIF targets by means of an additional short-term energy contribution to the DT substance already compressed by a much longer main energy embedding. This problem has been fairly well studied for laser targets, but it is new for heavy ion fusion targets. Increasing the maximum impulse is technically very difficult and expensive on modern FIHIF installations. The work shows that the additional energy embedding (the “igniting” impulse) reduces the requirements on the maximum impulse. The purpose of this work is to research the effect of the ignition impulse on the FIHIF target parameters.
NASA Astrophysics Data System (ADS)
Sanan, P.; Schnepp, S. M.; May, D.; Schenk, O.
2014-12-01
Geophysical applications require efficient forward models for non-linear Stokes flow on high resolution spatio-temporal domains. The bottleneck in applying the forward model is solving the linearized, discretized Stokes problem, which takes the form of a large, indefinite (saddle point) linear system. Due to the heterogeneity of the effective viscosity in the elliptic operator, devising effective preconditioners for saddle point problems has proven challenging and highly problem-dependent. Nevertheless, at least three approaches show promise for preconditioning these difficult systems in an algorithmically scalable way using multigrid and/or domain decomposition techniques. The first is to work with a hierarchy of coarser or smaller saddle point problems. The second is to use the Schur complement method to decouple and sequentially solve for the pressure and velocity. The third is to use the Schur decomposition to devise preconditioners for the full operator. These involve sub-solves resembling inexact versions of the sequential solve. The choice of approach and sub-methods depends crucially on the motivating physics, the discretization, and available computational resources. Here we examine the performance trade-offs for preconditioning strategies applied to idealized models of mantle convection and lithospheric dynamics, characterized by large viscosity gradients. Due to the arbitrary topological structure of the viscosity field in geodynamical simulations, we utilize low order, inf-sup stable mixed finite element spatial discretizations which are suitable when sharp viscosity variations occur in element interiors.
Particular attention is paid to possibilities within the decoupled and approximate Schur complement factorization-based monolithic approaches to leverage recently-developed flexible, communication-avoiding, and communication-hiding Krylov subspace methods in combination with `heavy' smoothers, which require solutions of large per-node sub-problems, well-suited to solution on hybrid computational clusters. To manage the combinatorial explosion of solver options (which include hybridizations of all the approaches mentioned above), we leverage the modularity of the PETSc library.
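The decoupled (Schur complement) approach described above can be illustrated on a dense toy problem. This is a minimal sketch with a randomly generated SPD viscous block; a production solver of the kind discussed here would replace the direct solves with multigrid-preconditioned Krylov iterations:

```python
import numpy as np

# Toy saddle-point system  [A  B^T; B  0] [u; p] = [f; g]
# A is SPD (viscous block), B is the discrete divergence operator.
rng = np.random.default_rng(0)
n, m = 8, 3                      # velocity and pressure dof counts
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # SPD elliptic block
B = rng.standard_normal((m, n))  # full-rank constraint block
f = rng.standard_normal(n)
g = rng.standard_normal(m)

# Decoupled (Schur complement) solve:
#   S p = B A^{-1} f - g,  with  S = B A^{-1} B^T
#   A u = f - B^T p
Ainv_f = np.linalg.solve(A, f)
Ainv_Bt = np.linalg.solve(A, B.T)
S = B @ Ainv_Bt
p = np.linalg.solve(S, B @ Ainv_f - g)
u = np.linalg.solve(A, f - B.T @ p)

# Verify against a monolithic solve of the full indefinite system
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
x = np.linalg.solve(K, np.concatenate([f, g]))
assert np.allclose(np.concatenate([u, p]), x)
```

In practice each `np.linalg.solve` on `A` becomes an inner sub-solve, and how inexactly those sub-solves may be performed is precisely what distinguishes the three preconditioning families above.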
ARE TIDAL EFFECTS RESPONSIBLE FOR EXOPLANETARY SPIN–ORBIT ALIGNMENT?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Gongjie; Winn, Joshua N., E-mail: gli@cfa.harvard.edu
The obliquities of planet-hosting stars are clues about the formation of planetary systems. Previous observations led to the hypothesis that for close-in giant planets, spin–orbit alignment is enforced by tidal interactions. Here, we examine two problems with this hypothesis. First, Mazeh and coworkers recently used a new technique—based on the amplitude of starspot-induced photometric variability—to conclude that spin–orbit alignment is common even for relatively long-period planets, which would not be expected if tides were responsible. We re-examine the data and find a statistically significant correlation between photometric variability and planetary orbital period that is qualitatively consistent with tidal interactions. However, it is still difficult to explain quantitatively, as it would require tides to be effective for periods as long as tens of days. Second, Rogers and Lin argued against a particular theory for tidal re-alignment by showing that initially retrograde systems would fail to be re-aligned, in contradiction with the observed prevalence of prograde systems. We investigate a simple model that overcomes this problem by taking into account the dissipation of inertial waves and the equilibrium tide, as well as magnetic braking. We identify a region of parameter space where re-alignment can be achieved, but it only works for close-in giant planets, and requires some fine tuning. Thus, while we find both problems to be more nuanced than they first appeared, the tidal model still has serious shortcomings.
College Basketball on the Line.
ERIC Educational Resources Information Center
Suggs, Welch
1999-01-01
The National Collegiate Athletic Association (NCAA) has convened a working group to address problems in recruiting, gambling, academic standards, and other corrupt practices in college basketball programs. Such problems are neither new nor unique to basketball, and changing college sports has proven to be difficult. Recommendations are anticipated…
Adaptive grid methods for RLV environment assessment and nozzle analysis
NASA Technical Reports Server (NTRS)
Thornburg, Hugh J.
1996-01-01
Rapid access to highly accurate data about complex configurations is needed for multi-disciplinary optimization and design. In order to efficiently meet these requirements, a closer coupling between the analysis algorithms and the discretization process is needed. In some cases, such as free surfaces, temporally varying geometries, and fluid-structure interaction, the need is unavoidable. In other cases the need is to rapidly generate and modify high quality grids. Techniques such as unstructured and/or solution-adaptive methods can be used to speed the grid generation process and to automatically cluster mesh points in regions of interest. Global features of the flow can be significantly affected by isolated regions of inadequately resolved flow. These regions may not exhibit high gradients and can be difficult to detect. Thus, excessive resolution in certain regions does not necessarily increase the accuracy of the overall solution. Several approaches have been employed for both structured and unstructured grid adaptation. The most widely used involve grid point redistribution, local grid point enrichment/derefinement, or local modification of the actual flow solver. However, the success of any one of these methods ultimately depends on the feature detection algorithm used to determine solution domain regions which require a fine mesh for their accurate representation. Typically, weight functions are constructed to mimic the local truncation error and may require substantial user input. Most problems of engineering interest involve multi-block grids and widely disparate length scales. Hence, it is desirable that the adaptive grid feature detection algorithm be developed to recognize flow structures of different types as well as differing intensity, and adequately address scaling and normalization across blocks.
These weight functions can then be used to construct blending functions for algebraic redistribution, interpolation functions for unstructured grid generation, forcing functions to attract/repel points in an elliptic system, or to trigger local refinement, based upon application of an equidistribution principle. The popularity of solution-adaptive techniques is growing in tandem with unstructured methods. The difficulty of precisely controlling mesh densities and orientations with current unstructured grid generation systems has driven the use of solution-adaptive meshing. Derivatives of density or pressure are widely used for the construction of such weight functions and have proven very successful for inviscid flows with shocks. However, less success has been realized for flowfields with viscous layers, vortices, or shocks of disparate strength. It is difficult to maintain the appropriate mesh point spacing in the various regions which require a fine spacing for adequate resolution. Mesh points often migrate from important regions due to refinement of dominant features. An example of this is the well-known tendency of adaptive methods to increase the resolution of shocks in the flowfield around airfoils, but in the incorrect location due to inadequate resolution of the stagnation region. This problem has been the motivation for this research.
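The equidistribution principle mentioned above can be sketched in one dimension: choose a gradient-based weight (monitor) function and redistribute the nodes so that each cell carries an equal share of its integral. The monitor and test profile below are illustrative choices, not the weight functions used in the report:

```python
import numpy as np

# 1-D grid adaptation by equidistribution of a gradient-based weight.
# Nodes are redistributed so each cell carries an equal share of the
# cumulative weight integral.
def equidistribute(x, u, alpha=1.0):
    """Redistribute nodes x according to w = sqrt(1 + alpha*|du/dx|^2)."""
    dudx = np.gradient(u, x)
    w = np.sqrt(1.0 + alpha * dudx**2)          # arc-length-type monitor
    # Cumulative integral of w (trapezoidal rule), normalized to [0, 1]
    W = np.concatenate([[0.0],
                        np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    W /= W[-1]
    # Invert: new nodes sit at equal increments of the weight integral
    targets = np.linspace(0.0, 1.0, x.size)
    return np.interp(targets, W, x)

x = np.linspace(0.0, 1.0, 41)
u = np.tanh(50.0 * (x - 0.5))       # sharp interior layer at x = 0.5
x_new = equidistribute(x, u, alpha=100.0)
# x_new clusters nodes near the layer while keeping the endpoints fixed
```

This also illustrates the failure mode noted above: with a single dominant feature, the monitor pulls points toward it, so multi-feature flowfields need weights that are scaled and normalized across features (and blocks).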
Teaching the Pressure-Flow Hypothesis of Phloem Transport in a Problem-Solving Session
ERIC Educational Resources Information Center
Clifford, Paul
2004-01-01
Problem solving is an ideal learning strategy, especially for topics that are perceived as difficult to teach. As an example, a format is described for a problem-solving session designed to help students understand the pressure-flow hypothesis of phloem transport in plants. Five key facts and their discussion can lead to the conclusion that a…
A Comparative Analysis of Word Problems in Selected United States and Russian First Grade Textbooks
ERIC Educational Resources Information Center
Grishchenko, Svetlana
2009-01-01
The purpose of this study was to explore word problems as a subject matter in mathematics textbook curricula. The motivation for the study derived from the following evidence: (a) American students find some word problems are more difficult than others (Garcia, Jimenez, & Hess, 2006; Riley & Green, 1988; Stern, 2001), and (b) one of the…
Preschoolers Grow Their Brains: Shifting Mindsets for Greater Resiliency and Better Problem Solving
ERIC Educational Resources Information Center
Pawlina, Shelby; Stanford, Christie
2011-01-01
Challenges, mistakes, and problems are inherent every day in learning activities and social interactions. How children think about and respond to those difficult situations has an impact on how they see themselves as being able to shape their own learning and on how they handle the next problem that comes their way. Building resilience means…
ERIC Educational Resources Information Center
Lehr, Susan; Lehr, Robert
This monograph aims to assist parents in dealing with behavior problems of children with disabilities. It begins with a case history of an 8-year-old girl with learning disabilities, emotional problems, and behavior problems and her parents' advocacy efforts to obtain an appropriate educational environment for her. Aversive interventions are…
Vortex generator design for aircraft inlet distortion as a numerical optimization problem
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Levy, Ralph
1991-01-01
Aerodynamic compatibility of aircraft/inlet/engine systems is a difficult design problem for aircraft that must operate in many different flight regimes. Takeoff, subsonic cruise, supersonic cruise, transonic maneuvering, and high-altitude loiter each place different constraints on inlet design. Vortex generators, small wing-like sections mounted on the inside surfaces of the inlet duct, are used to control flow separation and engine face distortion. The design of vortex generator installations in an inlet is defined as a problem addressable by numerical optimization techniques. A performance parameter is suggested to account for both inlet distortion and total pressure loss at a series of design flight conditions. The resulting optimization problem is difficult since some of the design parameters take on integer values. If numerical procedures could be used to reduce multimillion-dollar development test programs to a small set of verification tests, numerical optimization could have a significant impact on both cost and elapsed time to design new aircraft.
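Because some design variables are integers (e.g. the number of generators) while others are continuous (e.g. their height), even a brute-force sketch conveys the structure of the problem. The penalty model below is entirely invented for illustration; it merely mimics the stated trade-off between distortion reduction and total pressure loss across several flight conditions:

```python
import numpy as np

# Hypothetical sketch: choose the number of vortex generators (integer)
# and their height (continuous, here discretized) to minimize a weighted
# penalty over several flight conditions. The "performance model" is
# invented purely for illustration.
def penalty(n_vg, height, condition):
    distortion = condition["d0"] / (1.0 + 0.3 * n_vg * height)  # VGs cut distortion
    loss = condition["k"] * n_vg * height**2                    # but add pressure loss
    return distortion + condition["w"] * loss

conditions = [
    {"d0": 1.0, "k": 0.05, "w": 2.0},   # e.g. transonic maneuver
    {"d0": 0.4, "k": 0.08, "w": 2.0},   # e.g. subsonic cruise
]

# Exhaustive search over the mixed integer/continuous design space
best = min(
    ((n, h) for n in range(0, 21) for h in np.linspace(0.1, 1.0, 10)),
    key=lambda nh: sum(penalty(nh[0], nh[1], c) for c in conditions),
)
n_opt, h_opt = best
```

Exhaustive search only works for toy spaces like this one; the integer variables are what push realistic versions of the problem toward specialized mixed-integer optimization methods.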
A New Classification of Endodontic-Periodontal Lesions
Al-Fouzan, Khalid S.
2014-01-01
The interrelationship between periodontal and endodontic disease has always aroused confusion, queries, and controversy. Differentiating between a periodontal and an endodontic problem can be difficult. A symptomatic tooth may have pain of periodontal and/or pulpal origin. The nature of that pain is often the first clue in determining the etiology of such a problem. Radiographic and clinical evaluation can help clarify the nature of the problem. In some cases, the influence of pulpal pathology may cause the periodontal involvement and vice versa. The simultaneous existence of pulpal problems and inflammatory periodontal disease can complicate diagnosis and treatment planning. An endo-perio lesion can have a varied pathogenesis, ranging from the simple to the relatively complex. The differential diagnosis of endodontic and periodontal diseases can sometimes be difficult, but it is of vital importance to make a correct diagnosis so that the appropriate treatment can be provided. This paper aims to discuss a modified clinical classification to be considered for accurately diagnosing and treating endo-perio lesions. PMID:24829580
ERIC Educational Resources Information Center
Ohio Commission on Dispute Resolution and Conflict Management, Columbus.
Conflict is a natural and inevitable part of living, but managing conflict is difficult for many people because they have not been taught how to resolve differences in cooperative, nonviolent ways. Communication problems can lead to misunderstanding and make conflicts more difficult to resolve. The Governor of Ohio has designated May 1-7, 2000 as…
Easing the Transition from Schooling to Work. New Directions for Community Colleges, Number 16.
ERIC Educational Resources Information Center
Silberman, Harry F., Ed.; Ginsburg, Mark B., Ed.
1976-01-01
The transition from schooling to work has been recognized as a difficult one. As society has become more modernized, the problem of transition has become even more aggravated. American postsecondary education has a role to play in making this transition less difficult, and in integrating the educational process into the world of work. This…
ERIC Educational Resources Information Center
Maina, Michael P.; Maina, Julie Schlegel; Hunt, Kevin
2016-01-01
Often students have a difficult time when asked to use critical thinking skills to solve a problem. Perhaps students have a difficult time adjusting because teachers frequently tell them exactly what to do and how to do it. When asked to use critical thinking skills, students may suddenly become confused and discouraged because the teacher no…
Medical and psychosocial experiences of family caregivers with children fed enterally at home.
Enrione, Evelyn B; Thomlison, Barbara; Rubin, Aviva
2005-01-01
Pediatric home enteral nutrition (HEN) studies that evaluate the psychosocial aspects of caregiving are limited. Overlooking the psychosocial needs of the caregiver may result in negative outcomes such as lack of adherence to the HEN regimen. This study determined whether caregivers report psychosocial situations as more frequent and more difficult to manage than medical situations. A questionnaire, which identified 10 psychosocial and 10 medical issues related to pediatric HEN, was mailed to 150 caregivers (37 responded), who rated the statements for frequency and difficulty. Each statement was ranked from most frequent/difficult to least frequent/difficult by mean cross-product score (frequency x difficulty). To indicate overall burden, a medical total composite score (MTCS) and a psychosocial total composite score (PTCS) were calculated by summing the cross-products of the respective problems. Paired t tests compared MTCS to PTCS and also compared the psychosocial frequency and difficulty means to those of the medical problems. Of the top 10 problems, 7 were psychosocial, whereas 3 were medical. Caregivers reported incidences of psychosocial problems more frequently (p < .003) than medical problems, and they had more difficulty (p < .001) with the psychosocial situations than with the medical ones. The PTCS was significantly higher (p < .001) than the MTCS. The psychosocial situations were perceived as causing a greater burden and greater difficulty in coping with everyday life. Health professionals need to understand and address the psychosocial difficulties of the caregiver in order to provide support for the caregiver and promote positive growth and development of the child.
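The scoring scheme described (frequency x difficulty cross-products, summed into composite scores per domain) can be sketched with invented ratings; none of the numbers below come from the study:

```python
# Sketch of the scoring scheme described above: each problem is rated
# for frequency and difficulty, ranked by the cross-product, and the
# cross-products are summed into a total composite score per domain.
# All ratings are invented for illustration.
ratings = {
    "medical":      {"vomiting": (3.1, 2.4), "tube blockage": (2.2, 3.0)},
    "psychosocial": {"social isolation": (4.0, 3.8), "sleep disruption": (3.5, 3.2)},
}

composites = {}
for domain, problems in ratings.items():
    cross = {name: f * d for name, (f, d) in problems.items()}
    ranked = sorted(cross, key=cross.get, reverse=True)  # most to least burdensome
    composites[domain] = sum(cross.values())             # total composite score
```

With these made-up numbers the psychosocial composite exceeds the medical one, mirroring the direction of the reported PTCS/MTCS comparison (the actual study, of course, established this with paired t tests, not raw sums).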
Simulating variable source problems via post processing of individual particle tallies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.
2000-10-20
Monte Carlo is an extremely powerful method of simulating complex, three dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors, which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
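The post-processing idea can be sketched as importance re-weighting of recorded per-particle tallies: each particle keeps its source coordinates, and a candidate source is evaluated by weighting each tally contribution by the ratio of the new source density to the reference density used in the original run. Everything below (spectrum, tally model) is illustrative, not the actual BNCT setup:

```python
import numpy as np

# Each recorded particle carries its source coordinates (here just its
# energy) and a tally contribution scored under a flat reference source.
# A new source spectrum is applied afterward by re-weighting the stored
# tallies, with no re-simulation. All numbers are illustrative.
rng = np.random.default_rng(1)
n = 100_000
energy = rng.uniform(0.0, 10.0, n)              # flat reference source on [0, 10]
tally = np.exp(-0.3 * energy) * rng.random(n)   # fake per-particle dose score

def reweighted_mean(source_pdf):
    """Tally estimate under a new source, weight = new_pdf / reference_pdf."""
    ref_pdf = 1.0 / 10.0                        # flat density on [0, 10]
    w = source_pdf(energy) / ref_pdf
    return np.mean(w * tally)

# Compare two candidate sources in seconds instead of re-running transport
soft = reweighted_mean(lambda e: np.where(e < 2.0, 0.5, 0.0))   # flat on [0, 2)
hard = reweighted_mean(lambda e: np.where(e >= 8.0, 0.5, 0.0))  # flat on [8, 10]
```

The catch, as in any importance-sampling scheme, is that the reference run must cover the support of every candidate source; regions the reference never sampled cannot be re-weighted into existence.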
Innovation and behavioral flexibility in wild redfronted lemurs (Eulemur rufifrons).
Huebner, Franziska; Fichtel, Claudia
2015-05-01
Innovations and problem-solving abilities can provide animals with important ecological advantages as they allow individuals to deal with novel social and ecological challenges. Innovation is a solution to a novel problem or a novel solution to an old problem, with the latter being especially difficult. Finding a new solution to an old problem requires individuals to inhibit previously applied solutions to invent new strategies and to behave flexibly. We examined the role of experience on cognitive flexibility to innovate and to find new problem-solving solutions with an artificial feeding task in wild redfronted lemurs (Eulemur rufifrons). Four groups of lemurs were tested with feeding boxes, each offering three different techniques to extract food, with only one technique being available at a time. After the subjects learned a technique, this solution was no longer successful and subjects had to invent a new technique. For the first transition between task 1 and 2, subjects had to rely on their experience of the previous technique to solve task 2. For the second transition, subjects had to inhibit the previously learned technique to learn the new task 3. Tasks 1 and 2 were solved by most subjects, whereas task 3 was solved by only a few subjects. In this task, besides behavioral flexibility, especially persistence, i.e., constant trying, was important for individual success during innovation. Thus, wild strepsirrhine primates are able to innovate flexibly, suggesting a general ecological relevance of behavioral flexibility and persistence during innovation and problem solving across all primates.
NASA Astrophysics Data System (ADS)
Amalia, A.; Gunawan, D.; Hardi, S. M.; Rachmawati, D.
2018-02-01
The Internal Quality Assurance System (in Indonesian: SPMI, Sistem Penjaminan Mutu Internal) is a systemic quality-assurance activity for higher education in Indonesia. All higher education institutions in Indonesia must implement SPMI under the Regulation of the Minister of Research, Technology and Higher Education of the Republic of Indonesia Number 62 of 2016. Implementation of SPMI must follow the SPMI principles: independent, standardized, accurate, well planned and sustainable, documented, and systematic. To support the SPMI cycle properly, universities need software to monitor all SPMI activities. In practice, however, many universities fall short in building such SPMI monitoring systems; one obstacle is that system requirements supporting the SPMI principles are difficult to determine. In this paper, we examine the initial phase of requirements engineering: elicitation. Unlike methods that collect system requirements from users and stakeholders, we derive the system requirements for the SPMI principles from the SPMI guideline book. The result of this paper can serve as an option when determining SPMI software requirements. It can also help developers and users understand the SPMI scenario and thereby overcome comprehension problems between the two parties.
Exact solution of large asymmetric traveling salesman problems.
Miller, D L; Pekny, J F
1991-02-15
The traveling salesman problem is one of a class of difficult problems in combinatorial optimization that is representative of a large number of important scientific and engineering problems. A survey is given of recent applications and methods for solving large problems. In addition, an algorithm for the exact solution of the asymmetric traveling salesman problem is presented along with computational results for several classes of problems. The results show that the algorithm performs remarkably well for some classes of problems, determining an optimal solution even for problems with large numbers of cities, yet for other classes, even small problems thwart determination of a provably optimal solution.
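For very small instances, the asymmetric TSP can be solved exactly by Held-Karp dynamic programming; this is a sketch of exact solution in miniature, not the branch-and-bound style algorithm presented in the paper:

```python
from itertools import combinations

# Exact asymmetric TSP by Held-Karp dynamic programming, O(n^2 * 2^n).
# Practical only for tiny instances, but it is provably optimal.
def atsp(dist):
    n = len(dist)
    # dp[(S, j)] = cheapest cost to start at city 0, visit the set S
    # (as a bitmask over cities 1..n-1), and end at city j.
    dp = {(1 << j, j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            bits = sum(1 << j for j in S)
            for j in S:
                prev = bits ^ (1 << j)
                dp[(bits, j)] = min(
                    dp[(prev, k)] + dist[k][j] for k in S if k != j
                )
    full = (1 << n) - 2            # every city except 0
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

# Asymmetric cost matrix: dist[i][j] is the cost of traveling i -> j
dist = [
    [0, 5, 9, 4],
    [6, 0, 2, 7],
    [8, 3, 0, 1],
    [4, 7, 6, 0],
]
best = atsp(dist)   # optimal tour cost: 0 -> 1 -> 2 -> 3 -> 0
```

The exponential state space is exactly why large instances require the specialized exact methods the paper describes rather than dynamic programming.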
NASA Technical Reports Server (NTRS)
Arnold, S. M.
2006-01-01
Materials property information such as composition and thermophysical/mechanical properties abound in the literature. Oftentimes, however, the corresponding response curves from which these data are determined are missing or at the very least difficult to retrieve. Further, the paradigm for collecting materials property information has historically centered on (1) properties for materials comparison/selection purposes and (2) input requirements for conventional design/analysis methods. However, just as not all materials are alike or equal, neither are all constitutive models (and thus design/ analysis methods) equal; each model typically has its own specific and often unique required materials parameters, some directly measurable and others indirectly measurable. Therefore, the type and extent of materials information routinely collected is not always sufficient to meet the current, much less future, needs of the materials modeling community. Informatics has been defined as the science concerned with gathering, manipulating, storing, retrieving, and classifying recorded information. A key aspect of informatics is its focus on understanding problems and applying information technology as needed to address those problems. The primary objective of this article is to highlight the need for a paradigm shift in materials data collection, analysis, and dissemination so as to maximize the impact on both practitioners and researchers. Our hope is to identify and articulate what constitutes "sufficient" data content (i.e., quality and quantity) for developing, characterizing, and validating sophisticated nonlinear time- and history-dependent (hereditary) constitutive models. Likewise, the informatics infrastructure required for handling the potentially massive amounts of materials data will be discussed.
COPS: Large-scale nonlinearly constrained optimization problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bondarenko, A.S.; Bortz, D.M.; More, J.J.
2000-02-10
The authors have started the development of COPS, a collection of large-scale nonlinearly Constrained Optimization Problems. The primary purpose of this collection is to provide difficult test cases for optimization software. Problems in the current version of the collection come from fluid dynamics, population dynamics, optimal design, and optimal control. For each problem they provide a short description of the problem, notes on the formulation of the problem, and results of computational experiments with general optimization solvers. They currently have results for DONLP2, LANCELOT, MINOS, SNOPT, and LOQO.
Lindblad, Ida; Engström, Ann-Charlotte; Nylander, Charlotte; Fernell, Elisabeth
2017-12-01
Managing type 1 diabetes mellitus requires efficient cognitive and executive skills, and adolescents who have attention-deficit/hyperactivity disorder (ADHD) may face specific challenges. This study explored young people's experiences of diabetes treatment and care. In a population-based study, comprising 175 patients aged 5-16 years with type 1 diabetes mellitus in two Swedish counties, we found that eight also met criteria for ADHD. Six of these, aged 14.5-16 years, participated 2013-2014 in interviews that targeted aspects of their diabetes treatment. Conducted by two psychologists, these used the inductive qualitative, semi-structured interview format. The two boys and four girls all reported difficulties in creating routines for their diabetes treatment and that problems were aggravated during stress. They had been criticised by their parents and the diabetes team when their blood levels indicated inadequate diabetes control. They requested ongoing information, involvement of their friends, group meetings and easy access to the healthcare system during difficult times. Patients with type 1 diabetes mellitus and concomitant ADHD faced problems with their diabetes management, especially during stressful situations. Diabetes care provision should pay particular attention to patients with co-existing neuropsychiatric and neurodevelopmental disorders such as ADHD. ©2017 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
Vernick, Jon S; Webster, Daniel W; Bulzacchelli, Maria T; Mair, Julie Samia
2006-01-01
Firearms were associated with 30,136 deaths in the United States in 2003. Most guns are initially sold to the public through a network of retail dealers. Licensed firearm dealers are an important source of guns for criminals and gun traffickers. Just one percent of licensed dealers were responsible for more than half of all guns traced to crime. Federal law makes it difficult for ATF to inspect and revoke the licenses of problem gun dealers. State licensing systems, however, are a greatly under-explored opportunity for firearm dealer oversight. We identify and categorize these state systems to identify opportunities for interventions to prevent problem dealers from supplying guns to criminals, juveniles, or gun traffickers. Just seventeen states license gun dealers. Twenty-three states permit routine inspections of dealers, but only two mandate that those inspections occur on a regular basis. Twenty-six states impose record-keeping requirements for gun sales. Only thirteen states require some form of store security measures to minimize firearm theft. We conclude with recommendations for a comprehensive system of state licensing and oversight of gun dealers. Our findings can be useful for the coalition of more than fifty U.S. mayors that recently announced it would work together to combat illegal gun trafficking.
Schaffartzik, Anke; Haberl, Helmut; Kastner, Thomas; Wiedenhofer, Dominik; Eisenmenger, Nina; Erb, Karl-Heinz
2015-10-01
Land use is recognized as a pervasive driver of environmental impacts, including climate change and biodiversity loss. Global trade leads to "telecoupling" between the land use of production and the consumption of biomass-based goods and services. Telecoupling is captured by accounts of the upstream land requirements associated with traded products, also commonly referred to as land footprints. These accounts face challenges in two main areas: (1) the allocation of land to products traded and consumed and (2) the metrics to account for differences in land quality and land-use intensity. For two main families of accounting approaches (biophysical, factor-based and environmentally extended input-output analysis), this review discusses conceptual differences and compares results for land footprints. Biophysical approaches are able to capture a large number of products and different land uses, but suffer from a truncation problem. Economic approaches solve the truncation problem, but are hampered by the limited disaggregation of sectors and products. In light of the conceptual differences, the overall similarity of results generated by both types of approaches is remarkable. Diametrically opposed results for some of the world's largest producers and consumers of biomass-based products, however, make interpretation difficult. This review aims to provide clarity on some of the underlying conceptual issues of accounting for land footprints.
Safe laser application requires more than laser safety
NASA Astrophysics Data System (ADS)
Frevel, A.; Steffensen, B.; Vassie, L.
1995-02-01
An overview is presented concerning aspects of laser safety in European industrial laser use. Surveys indicate that there is a large variation in the safety strategies amongst industrial laser users. Some key problem areas are highlighted. Emission of hazardous substances is a major problem for users of laser material processing systems where the majority of the particulate is of a sub-micrometre size, presenting a respiratory hazard. Studies show that in many cases emissions are not frequently monitored in factories and uncertainty exists over the hazards. Operators of laser machines do not receive adequate job training or safety training. The problem is compounded by a plethora of regulations and standards which are difficult to interpret and implement, and inspectors who are not conversant with the technology or the issues. A case is demonstrated for a more integrated approach to laser safety, taking into account the development of laser applications, organizational and personnel development, in addition to environmental and occupational health and safety aspects. It is necessary to achieve a harmonization between these elements in any organization involved in laser technology. This might be achieved through establishing technology transfer centres in laser technology.
NASA Astrophysics Data System (ADS)
Matovnikov, Sergei; Matovnikova, Natalia; Samoylenko, Polina
2018-03-01
The paper considers the design of modern courtyard spaces for high-rise buildings in Volgograd, aiming at a multifunctional environment through the arrangement of new recreational territories and the search for innovative planning methods in urban landscape design. In professionals' opinion, the problem of designing and building recreational zones and planting greenery is acute for Volgograd; such territories are often absent in many districts of the city. In general, a declining natural component and a low level of improvement of recreational territories are typical for Volgograd. In addition, the problem of designing a modern urban courtyard space for high-rise buildings as a multifunctional environment exists and requires thorough investigation. The question is whether these difficult tasks can be solved by local design methods alone or whether a complex approach is needed at the stage of forming master plans for modern residential areas, and which modern design methods can ensure the creation of a courtyard space as a multifunctional environment. These and other questions are the topic of our paper.
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III
1991-01-01
Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach that is being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB's new FORTRAN standards, methodology, and the concepts for a software environment.
Generating Data Flow Programs from Nonprocedural Specifications.
1983-03-01
With the I-structures, Gajski points out, it is difficult to know ahead of time the optimal memory allocation scheme to partition large arrays. Memory contention problems may occur for frequently accessed elements stored in the same memory module. Gajski observes that these are the same problems which
A Novel Concept for a Deformable Membrane Mirror for Correction of Large Amplitude Aberrations
NASA Technical Reports Server (NTRS)
Moore, Jim; Patrick, Brian
2006-01-01
Very large, lightweight mirrors are being developed for applications in space. Due to launch mass and volume restrictions, these mirrors will need to be much more flexible than traditional optics. The use of primary mirrors with these characteristics will lead to requirements for adaptive optics capable of correcting wavefront errors with large-amplitude, relatively low spatial frequency aberrations. The use of low-modulus membrane mirrors actuated with electrostatic attraction forces is a potential solution for this application. Several different electrostatic membrane mirrors are now available commercially. However, as the dynamic range requirement of the adaptive mirror is increased, the separation distance between the membrane and the electrodes must increase to accommodate the required face sheet deformations. The actuation force applied to the mirror decreases inversely with the square of the separation distance; thus, for large dynamic ranges the voltage requirement can rapidly increase into the high-voltage regime. Experimentation with mirrors operating in the kV range has shown that at the higher voltages a serious problem with electrostatic field cross coupling between actuators can occur. Voltage changes on individual actuators affect the voltage of other actuators, making the system very difficult to control. A novel solution has been proposed that combines high-voltage electrodes with mechanical actuation to overcome this problem. In this design an array of electrodes is mounted to a backing structure via lightweight, large dynamic range flextensional actuators. With this design the control input becomes the separation distance between the electrode and the mirror. The voltage on each of the actuators is set to a uniform, relatively high voltage; thus the problem of cross talk between actuators is avoided and the favorable distributed load characteristic of electrostatic actuation is retained.
Initial testing and modeling of this concept demonstrates that this is an attractive concept for increasing the dynamic range capability of electrostatic deformable mirrors.
A novel fully automatic scheme for fiducial marker-based alignment in electron tomography.
Han, Renmin; Wang, Liansan; Liu, Zhiyong; Sun, Fei; Zhang, Fa
2015-12-01
Although the topic of fiducial marker-based alignment in electron tomography (ET) has been widely discussed for decades, alignment without human intervention remains a difficult problem. Specifically, the emergence of subtomogram averaging has increased the demand for batch processing during tomographic reconstruction; fully automatic fiducial marker-based alignment is the main technique in this process. However, the lack of an accurate method for detecting and tracking fiducial markers precludes fully automatic alignment. In this paper, we present a novel, fully automatic alignment scheme for ET. Our scheme has two main contributions: First, we present a series of algorithms to ensure a high recognition rate and precise localization during the detection of fiducial markers. Our proposed solution reduces fiducial marker detection to a sampling and classification problem and further introduces an algorithm to solve the parameter dependence of marker diameter and marker number. Second, we propose a novel algorithm to solve the tracking of fiducial markers by reducing the tracking problem to an incomplete point set registration problem. Because a global optimization of a point set registration occurs, the result of our tracking is independent of the initial image position in the tilt series, allowing for the robust tracking of fiducial markers without pre-alignment. The experimental results indicate that our method achieves accurate tracking, almost identical to that of the best current semi-automatic scheme in IMOD. Furthermore, our scheme is fully automatic, depends on fewer parameters (only requiring a gross value of the marker diameter) and does not require any manual interaction, providing the possibility of automatic batch processing of electron tomographic reconstruction. Copyright © 2015 Elsevier Inc. All rights reserved.
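As a toy illustration (not the authors' algorithm), the core idea of registering an incomplete point set without pre-alignment can be sketched for a translation-only 2-D case: every (reference marker, tilt-image marker) pair proposes a shift, and the shift that explains the most markers wins, even when some markers are missing. All coordinates below are invented.

```python
def register_translation(src, dst, tol=1e-6):
    """Estimate the translation aligning src onto dst by voting:
    each (src point, dst point) pair proposes a translation, and the
    proposal matching the most points wins. Tolerates missing points."""
    best_t, best_inliers = (0.0, 0.0), -1
    for (sx, sy) in src:
        for (dx, dy) in dst:
            t = (dx - sx, dy - sy)
            inliers = sum(
                any(abs(px + t[0] - qx) < tol and abs(py + t[1] - qy) < tol
                    for (qx, qy) in dst)
                for (px, py) in src)
            if inliers > best_inliers:
                best_t, best_inliers = t, inliers
    return best_t, best_inliers

# Markers in the reference image, and a shifted tilt image missing one marker.
ref = [(10.0, 20.0), (35.0, 5.0), (50.0, 42.0), (7.0, 33.0)]
tilt = [(x + 3.5, y - 2.0) for (x, y) in ref[:3]]  # one marker lost
t, n = register_translation(ref, tilt)
```

Because the best shift is chosen by a global vote over all pairs, the result does not depend on any initial guess, which mirrors why global point set registration allows tracking without pre-alignment.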
NASA Technical Reports Server (NTRS)
Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.
1992-01-01
Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.
A Bandwidth-Efficient Dissemination Scheme of Non-Safety Information in Urban VANETs †
Garcia-Lozano, Estrella; Campo, Celeste; Garcia-Rubio, Carlos; Rodriguez-Carrion, Alicia
2016-01-01
The recent release of standards for vehicular communications will hasten the development of smart cities in the following years. Many applications for vehicular networks, such as blocked road warnings or advertising, will require multi-hop dissemination of information to all vehicles in a region of interest. However, these networks present special features and difficulties that may require special measures. The dissemination of information may cause broadcast storms. Urban scenarios are especially sensitive to broadcast storms because of the high density of vehicles in downtown areas. They also present numerous crossroads and signal blocking due to buildings, which make dissemination more difficult than in open, almost straight interurban roadways. In this article, we discuss several options to avoid the broadcast storm problem while trying to achieve the maximum coverage of the region of interest. Specifically, we evaluate through simulations different ways to detect and take advantage of intersections and a strategy based on store-carry-forward to overcome short disconnections between groups of vehicles. Our conclusions are varied, and we propose two different solutions, depending on the requirements of the application. PMID:27355956
NASA Astrophysics Data System (ADS)
Döge, Stefan; Hingerl, Jürgen
2018-03-01
The improvement of the number of extractable ultracold neutrons (UCNs) from converters based on solid deuterium (sD2) crystals requires a good understanding of the UCN transport and how the crystal's morphology influences its transparency to the UCNs. Measurements of the UCN transmission through cryogenic liquids and solids of interest, such as hydrogen (H2) and deuterium (D2), require sample containers with thin, highly polished and optically transparent windows and a well defined sample thickness. One of the most difficult sealing problems is that of light gases like hydrogen and helium at low temperatures against high vacuum. Here we report on the design of a sample container with two 1 mm thin amorphous silica windows cold-welded to aluminum clamps using indium wire gaskets, in order to form a simple, reusable, and hydrogen-tight cryogenic seal. The container meets the above-mentioned requirements and withstands up to 2 bar hydrogen gas pressure against isolation vacuum in the range of 10-5 to 10-7 mbar at temperatures down to 4.5 K. Additionally, photographs of the crystallization process are shown and discussed.
A Bandwidth-Efficient Dissemination Scheme of Non-Safety Information in Urban VANETs.
Garcia-Lozano, Estrella; Campo, Celeste; Garcia-Rubio, Carlos; Rodriguez-Carrion, Alicia
2016-06-27
The recent release of standards for vehicular communications will hasten the development of smart cities in the following years. Many applications for vehicular networks, such as blocked road warnings or advertising, will require multi-hop dissemination of information to all vehicles in a region of interest. However, these networks present special features and difficulties that may require special measures. The dissemination of information may cause broadcast storms. Urban scenarios are especially sensitive to broadcast storms because of the high density of vehicles in downtown areas. They also present numerous crossroads and signal blocking due to buildings, which make dissemination more difficult than in open, almost straight interurban roadways. In this article, we discuss several options to avoid the broadcast storm problem while trying to achieve the maximum coverage of the region of interest. Specifically, we evaluate through simulations different ways to detect and take advantage of intersections and a strategy based on store-carry-forward to overcome short disconnections between groups of vehicles. Our conclusions are varied, and we propose two different solutions, depending on the requirements of the application.
New realisation of Preisach model using adaptive polynomial approximation
NASA Astrophysics Data System (ADS)
Liu, Van-Tsai; Lin, Chun-Liang; Wing, Home-Young
2012-09-01
Modelling systems with hysteresis has received considerable attention recently due to increasing accuracy requirements in engineering applications. The classical Preisach model (CPM) is the most popular model to demonstrate hysteresis, which can be represented by infinite but countable first-order reversal curves (FORCs). The usage of look-up tables is one way to approach the CPM in actual practice. The data in those tables correspond with the samples of a finite number of FORCs. This approach, however, faces two major problems: firstly, it requires a large amount of memory space to obtain an accurate prediction of hysteresis; secondly, it is difficult to derive efficient ways to modify the data table to reflect the timing effect of elements with hysteresis. To overcome these problems, this article proposes the idea of using a set of polynomials to emulate the CPM instead of table look-up. The polynomial approximation requires less memory space for data storage. Furthermore, the polynomial coefficients can be obtained accurately by using least-squares approximation or an adaptive identification algorithm, enabling accurate tracking of hysteresis model parameters.
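A minimal sketch of the table-to-polynomial idea, using only a plain least-squares fit over invented FORC samples (not the paper's adaptive identification algorithm): a hundred stored table samples collapse to three polynomial coefficients that can be evaluated anywhere with Horner's rule.

```python
def polyfit_ls(xs, ys, deg):
    """Least-squares polynomial fit via the normal equations
    (adequate for the low degrees and short sample runs used here)."""
    n = deg + 1
    # Build A = V^T V and b = V^T y for the Vandermonde matrix V.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for i in reversed(range(n)):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, n))) / A[i][i]
    return coeffs  # coeffs[i] multiplies x**i

def horner(coeffs, x):
    """Evaluate sum(coeffs[i] * x**i) efficiently."""
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

# 100 stored FORC samples replaced by 3 polynomial coefficients.
xs = [i / 10 for i in range(100)]
ys = [0.5 * x * x - 2.0 * x + 1.0 for x in xs]  # invented FORC shape
coeffs = polyfit_ls(xs, ys, 2)
```

Updating three coefficients is also far cheaper than rewriting a data table, which is the memory and timing advantage the abstract points to.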
Secondary Students' Perceptions about Learning Qualitative Analysis in Inorganic Chemistry
NASA Astrophysics Data System (ADS)
Tan, Kim-Chwee Daniel; Goh, Ngoh-Khang; Chia, Lian-Sai; Treagust, David F.
2001-02-01
Grade 10 students in Singapore find qualitative analysis one of the more difficult topics in their external examinations. Fifty-one grade 10 students (15-17 years old) from three schools were interviewed to investigate their perceptions about learning qualitative analysis and the aspects of qualitative analysis they found difficult. The results showed that students found qualitative analysis tedious, difficult to understand and found the practical sessions unrelated to what they learned in class. They also believed that learning qualitative analysis required a great amount of memory work. It is proposed that their difficulties may arise from not knowing explicitly what is required in qualitative analysis, the content of qualitative analysis, the lack of motivation to understand qualitative analysis, cognitive overloading, and the lack of mastery of the required process skills.
A mechanical system for tensile testing of supported films at the nanoscale.
Pantano, Maria F; Speranza, G; Galiotis, Costas; Pugno, Nicola M
2018-06-27
Standard tensile tests of materials are usually performed on freestanding specimens. However, such a requirement is difficult to implement when the materials of interest are of nanoscopic dimensions, due to problems related to their handling and manipulation. In the present paper, a new device is presented for tensile testing of thin nanomaterials, which allows tests to be carried out on specimens initially deposited onto a macroscopic pre-notched substrate. On loading, however, no substrate effects are introduced, allowing the films to be freely stretched. The results obtained from a variety of thin metal or polymeric films are very promising for the further development of this technique as a standard method for nanomaterial mechanical testing. © 2018 IOP Publishing Ltd.
Stability issues of black hole in non-local gravity
NASA Astrophysics Data System (ADS)
Myung, Yun Soo; Park, Young-Jai
2018-04-01
We discuss stability issues of Schwarzschild black hole in non-local gravity. It is shown that the stability analysis of black hole for the unitary and renormalizable non-local gravity with γ2 = - 2γ0 cannot be performed in the Lichnerowicz operator approach. On the other hand, for the unitary and non-renormalizable case with γ2 = 0, the black hole is stable against the metric perturbations. For non-unitary and renormalizable local gravity with γ2 = - 2γ0 = const (fourth-order gravity), the small black holes are unstable against the metric perturbations. This implies that what makes the problem difficult in stability analysis of black hole is the simultaneous requirement of unitarity and renormalizability around the Minkowski spacetime.
A new decision sciences for complex systems.
Lempert, Robert J
2002-05-14
Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
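The ensemble-comparison idea can be sketched with a toy minimax-regret calculation (the policy names and cost numbers are invented; Computer-Assisted Reasoning itself works over far richer scenario ensembles): a robust policy is one whose worst-case regret across the ensemble is smallest.

```python
# Rows: candidate abatement policies; columns: cost under each scenario
# in the ensemble (hypothetical numbers).
costs = {
    "aggressive": [9, 7, 12],
    "moderate":   [6, 8, 15],
    "adaptive":   [7, 7, 10],
}

def minimax_regret(costs):
    """Pick the policy whose worst-case regret (cost above the best
    achievable cost in that scenario) is smallest across the ensemble."""
    n = len(next(iter(costs.values())))
    best_per_scenario = [min(c[s] for c in costs.values()) for s in range(n)]
    regret = {p: max(c[s] - best_per_scenario[s] for s in range(n))
              for p, c in costs.items()}
    return min(regret, key=regret.get), regret

choice, regret = minimax_regret(costs)
```

Here the "adaptive" policy wins not by being cheapest in any one scenario but by never being far from the best, which is the hallmark of a robust strategy under deep uncertainty.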
Enhanced Conformational Sampling Using Replica Exchange with Collective-Variable Tempering.
Gil-Ley, Alejandro; Bussi, Giovanni
2015-03-10
The computational study of conformational transitions in RNA and proteins with atomistic molecular dynamics often requires suitable enhanced sampling techniques. We here introduce a novel method where concurrent metadynamics are integrated in a Hamiltonian replica-exchange scheme. The ladder of replicas is built with different strengths of the bias potential exploiting the tunability of well-tempered metadynamics. Using this method, free-energy barriers of individual collective variables are significantly reduced compared with simple force-field scaling. The introduced methodology is flexible and allows adaptive bias potentials to be self-consistently constructed for a large number of simple collective variables, such as distances and dihedral angles. The method is tested on alanine dipeptide and applied to the difficult problem of conformational sampling in a tetranucleotide.
Decision Manifold Approximation for Physics-Based Simulations
NASA Technical Reports Server (NTRS)
Wong, Jay Ming; Samareh, Jamshid A.
2016-01-01
With the recent surge of success in big-data driven deep learning problems, many of these frameworks focus on the notion of architecture design and utilizing massive databases. However, in some scenarios massive sets of data may be difficult, and in some cases infeasible, to acquire. In this paper we discuss a trajectory-based framework that quickly learns the underlying decision manifold of binary simulation classifications while judiciously selecting exploratory target states to minimize the number of required simulations. Furthermore, we draw particular attention to the simulation prediction application idealized to the case where failures in simulations can be predicted and avoided, providing machine intelligence to novice analysts. We demonstrate this framework in various forms of simulations and discuss its efficacy.
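A one-dimensional caricature of judicious target-state selection (assuming a hypothetical `simulate` with a single hidden failure boundary, which is not the paper's framework): bisection locates the success/failure boundary with a handful of simulations instead of a dense parameter sweep.

```python
calls = 0

def simulate(entry_angle):
    """Stand-in for an expensive physics simulation: returns True if the
    (hypothetical) run succeeds. Hidden failure boundary at 42.7 degrees."""
    global calls
    calls += 1
    return entry_angle < 42.7

def find_boundary(lo, hi, tol=0.01):
    """Bisect for the success/failure boundary, choosing each new target
    state where it is most informative rather than sampling a dense grid."""
    assert simulate(lo) and not simulate(hi)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if simulate(mid):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

boundary = find_boundary(0.0, 90.0)
```

About sixteen simulations pin the boundary to within 0.01 degrees, where a uniform sweep at that resolution would need thousands; in higher dimensions the same economy motivates learning the decision manifold from selectively chosen trajectories.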
About the use of vector optimization for company's contractors selection
NASA Astrophysics Data System (ADS)
Medvedeva, M. A.; Medvedev, M. A.
2017-07-01
For the effective functioning of an enterprise it is necessary to make the right choice of partners: suppliers of raw material, buyers of finished products, and others with which the company interacts in the course of its business. However, the presence of a large number of enterprises on the market makes the choice of the most appropriate among them very difficult and requires the ability to objectively assess potential partners, based on multilateral analysis of their activities. This analysis can be carried out based on the solution of a multiobjective problem of mathematical programming using the methods of vector optimization. The present work addresses the theoretical foundations of this approach and also describes an algorithm realizing the proposed method on a practical example.
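The simplest scalarization used in vector optimization, a weighted sum over criteria, can be sketched as follows (contractor names, scores, and weights are all invented for illustration):

```python
# Hypothetical contractor scores on three criteria (higher is better):
# price competitiveness, reliability, delivery speed.
contractors = {
    "SupplierA": (0.9, 0.6, 0.7),
    "SupplierB": (0.5, 0.9, 0.8),
    "SupplierC": (0.7, 0.7, 0.9),
}
weights = (0.5, 0.3, 0.2)  # management's priorities (illustrative)

def weighted_sum_choice(contractors, weights):
    """Scalarize the multiobjective comparison with a weighted sum --
    the simplest vector-optimization scheme; other scalarizations
    (e.g. reference-point methods) slot into the same loop."""
    score = {name: sum(w * v for w, v in zip(weights, vals))
             for name, vals in contractors.items()}
    return max(score, key=score.get), score

best, score = weighted_sum_choice(contractors, weights)
```

Varying the weight vector traces out different Pareto-optimal choices, which is how a multilateral assessment of partners maps onto a multiobjective programming problem.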
Using Arabidopsis to understand centromere function: progress and prospects.
Copenhaver, Gregory P
2003-01-01
Arabidopsis thaliana has emerged in recent years as a leading model for understanding the structure and function of higher eukaryotic centromeres. Arabidopsis centromeres, like those of virtually all higher eukaryotes, encompass large DNA domains consisting of a complex combination of unique, dispersed middle repetitive and highly repetitive DNA. For this reason, they have required creative analysis using molecular, genetic, cytological and genomic techniques. This synergy of approaches, reinforced by rapid progress in understanding how proteins interact with the centromere DNA to form a complete functional unit, has made Arabidopsis one of the best understood centromere systems. Yet major problems remain to be solved: gaining a complete structural definition of the centromere has been surprisingly difficult, and developing synthetic mini-chromosomes in plants has been even more challenging.
Method for simulating discontinuous physical systems
Baty, Roy S.; Vaughn, Mark R.
2001-01-01
The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.
In-network processing of joins in wireless sensor networks.
Kang, Hyunchul
2013-03-11
The join or correlated filtering of sensor readings is one of the fundamental query operations in wireless sensor networks (WSNs). Although the join in centralized or distributed databases is a well-researched problem, join processing in WSNs has quite different characteristics and is much more difficult to perform due to the lack of statistics on sensor readings and the resource constraints of sensor nodes. Since data transmission is orders of magnitude more costly than processing at a sensor node, in-network processing of joins is essential. In this paper, the state-of-the-art techniques for join implementation in WSNs are surveyed. The requirements and challenges, join types, and components of join implementation are described. The open issues for further research are identified.
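A toy sketch of why in-network processing pays off (data invented; real WSN joins must also cope with routing and the lack of statistics): shipping only the join keys first, semijoin-style, lets a node filter locally so far fewer tuples cross the costly radio link.

```python
# Hypothetical readings at two sensor clusters.
left = [("r1", 21.5), ("r2", 30.2), ("r3", 18.9), ("r2", 29.8)]  # (region, temp)
right = [("r2", 0.71), ("r4", 0.55)]                             # (region, humidity)

def semijoin_then_join(left, right):
    """In-network join sketch: ship only the join keys of `right` to the
    `left` node, filter there, and join the survivors -- far fewer tuples
    cross the radio link than in a naive ship-everything join."""
    keys = {r[0] for r in right}                   # tiny message: just keys
    survivors = [l for l in left if l[0] in keys]  # filtered in-network
    return [(l[0], l[1], r[1]) for l in survivors
            for r in right if r[0] == l[0]], len(survivors)

joined, shipped = semijoin_then_join(left, right)
```

Only the two matching tuples are transmitted instead of all four, and since transmission costs dwarf local computation in a WSN, this kind of filtering is exactly what in-network join processing exploits.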
In-Network Processing of Joins in Wireless Sensor Networks
Kang, Hyunchul
2013-01-01
The join or correlated filtering of sensor readings is one of the fundamental query operations in wireless sensor networks (WSNs). Although the join in centralized or distributed databases is a well-researched problem, join processing in WSNs has quite different characteristics and is much more difficult to perform due to the lack of statistics on sensor readings and the resource constraints of sensor nodes. Since data transmission is orders of magnitude more costly than processing at a sensor node, in-network processing of joins is essential. In this paper, the state-of-the-art techniques for join implementation in WSNs are surveyed. The requirements and challenges, join types, and components of join implementation are described. The open issues for further research are identified. PMID:23478603
Recent advances in basic and clinical nanomedicine.
Morrow, K John; Bawa, Raj; Wei, Chiming
2007-09-01
Nanomedicine is a global business enterprise. Industry and governments clearly are beginning to envision nanomedicine's enormous potential. A clear definition of nanotechnology is an issue that requires urgent attention. This problem exists because nanotechnology represents a cluster of technologies, each of which may have different characteristics and applications. Although numerous novel nanomedicine-related applications are under development or nearing commercialization, the process of converting basic research in nanomedicine into commercially viable products will be long and difficult. Although realization of the full potential of nanomedicine may be years or decades away, recent advances in nanotechnology-related drug delivery, diagnosis, and drug development are beginning to change the landscape of medicine. Site-specific targeted drug delivery and personalized medicine are just a few concepts that are on the horizon.
Cross-Talk in Superconducting Transmon Quantum Computing Architecture
NASA Astrophysics Data System (ADS)
Abraham, David; Chow, Jerry; Corcoles, Antonio; Rothwell, Mary; Keefe, George; Gambetta, Jay; Steffen, Matthias; IBM Quantum Computing Team
2013-03-01
Superconducting transmon quantum computing test structures often exhibit significant undesired cross-talk. For experiments with only a handful of qubits this cross-talk can be quantified and understood, and therefore corrected. As quantum computing circuits become more complex, and thereby contain increasing numbers of qubits and resonators, it becomes more vital that the inadvertent coupling between these elements is minimized. The task of accurately controlling each single qubit to the level of precision required throughout the realization of a quantum algorithm is difficult by itself; coupled with the need to null out leakage signals from neighboring qubits or resonators, it would quickly become impossible. We discuss an approach to solve this critical problem. We acknowledge support from IARPA under contract W911NF-10-1-0324.
Autonomous reinforcement learning with experience replay.
Wawrzyński, Paweł; Tanwani, Ajay Kumar
2013-05-01
This paper considers the issues of efficiency and autonomy that are required to make reinforcement learning suitable for real-life control tasks. A real-time reinforcement learning algorithm is presented that repeatedly adjusts the control policy with the use of previously collected samples, and autonomously estimates the appropriate step-sizes for the learning updates. The algorithm is based on the actor-critic with experience replay whose step-sizes are determined on-line by an enhanced fixed point algorithm for on-line neural network training. An experimental study with simulated octopus arm and half-cheetah demonstrates the feasibility of the proposed algorithm to solve difficult learning control problems in an autonomous way within reasonably short time. Copyright © 2012 Elsevier Ltd. All rights reserved.
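The experience-replay mechanism itself (not the paper's actor-critic algorithm or its on-line step-size estimator) can be sketched as a bounded buffer of transitions sampled uniformly for repeated policy updates:

```python
import random
from collections import deque

class ReplayBuffer:
    """Minimal experience-replay buffer: store transitions, evict the
    oldest when full, and sample uniformly at random so past experience
    can be reused for many learning updates."""
    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)

    def add(self, state, action, reward, next_state):
        self.buffer.append((state, action, reward, next_state))

    def sample(self, batch_size):
        return random.sample(list(self.buffer), min(batch_size, len(self.buffer)))

    def __len__(self):
        return len(self.buffer)

buf = ReplayBuffer(capacity=100)
for t in range(250):                 # old transitions are evicted
    buf.add(t, t % 4, float(t), t + 1)
batch = buf.sample(32)
```

Reusing each stored transition in many updates is what buys the sample efficiency that makes reinforcement learning viable for real-time control tasks like the octopus arm and half-cheetah benchmarks.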
Spacecraft on-board SAR image generation for EOS-type missions
NASA Technical Reports Server (NTRS)
Liu, K. Y.; Arens, W. E.; Assal, H. M.; Vesecky, J. F.
1987-01-01
Spacecraft on-board synthetic aperture radar (SAR) image generation is an extremely difficult problem because of the requirements for high computational rates (usually on the order of giga-operations per second), high reliability (some missions last up to 10 years), and low power dissipation and mass (typically less than 500 watts and 100 kilograms). Recently, a JPL study was performed to assess the feasibility of on-board SAR image generation for EOS-type missions. This paper summarizes the results of that study. Specifically, it proposes a processor architecture using a VLSI time-domain parallel array for azimuth correlation. Using available space-qualifiable technology to implement the proposed architecture, an on-board SAR processor having acceptable power and mass characteristics appears feasible for EOS-type applications.
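The kernel that a time-domain parallel array accelerates is, at heart, a sliding correlation of the received signal against a reference replica. A scalar sketch (invented signals, no parallelism, real SAR correlates complex-valued chirps) shows the operation:

```python
def xcorr_peak(signal, replica):
    """Direct time-domain correlation of a received signal against a
    reference replica; returns the lag with the largest correlation.
    Each lag's sum is independent, which is what a VLSI parallel
    array exploits to reach giga-operations per second."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(signal) - len(replica) + 1):
        val = sum(signal[lag + i] * replica[i] for i in range(len(replica)))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

replica = [1.0, -1.0, 1.0, 1.0]
signal = [0.0] * 5 + replica + [0.0] * 3   # echo buried at lag 5
lag = xcorr_peak(signal, replica)
```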
Some advantages of methane in an aircraft gas turbine
NASA Technical Reports Server (NTRS)
Graham, R. W.; Glassman, A. J.
1980-01-01
Liquid methane, which can be manufactured from any of the hydrocarbon sources such as coal, shale, biomass, and organic waste, is considered as a petroleum replacement for aircraft fuels. A simple cycle analysis is carried out for a turboprop engine flying at Mach 0.8 and 10,668 meters (35,000 ft) altitude. Cycle performance comparisons are rendered for four cases in which the turbine cooling air is or is not cooled by the methane fuel. The advantages and disadvantages of involving the fuel in the turbine cooling system are discussed. Methane combustion characteristics are appreciably different from those of Jet A and will require different combustor designs. Although a number of difficult technical problems exist, a highly fuel-efficient turboprop engine burning methane appears to be feasible.
Multicasting for all-optical multifiber networks
NASA Astrophysics Data System (ADS)
Köksal, Fatih; Ersoy, Cem
2007-02-01
All-optical wavelength-routed WDM WANs can support the high bandwidth and the long session duration requirements of application scenarios such as interactive distance learning or on-line diagnosis of patients simultaneously in different hospitals. However, multifiber and limited sparse light splitting and wavelength conversion capabilities of switches result in a difficult optimization problem. We attack this problem using a layered graph model. The problem is defined as a k-edge-disjoint degree-constrained Steiner tree problem for routing and fiber and wavelength assignment of k multicasts. A mixed integer linear programming formulation for the problem is given, and a solution using CPLEX is provided. However, the complexity of the problem grows quickly with respect to the number of edges in the layered graph, which depends on the number of nodes, fibers, wavelengths, and multicast sessions. Hence, we propose two heuristics [the layered all-optical multicast algorithm (LAMA) and conservative fiber and wavelength assignment (C-FWA)] to compare with CPLEX, existing work, and unicasting. Extensive computational experiments show that LAMA's performance is very close to CPLEX, and it is significantly better than existing work and C-FWA for nearly all metrics, since LAMA jointly optimizes the routing and fiber-wavelength assignment phases, whereas the other candidates attack the problem by decomposing it into two phases. Experiments also show that important metrics (e.g., session and group blocking probability, transmitter wavelength, and fiber conversion resources) are adversely affected by the separation of the two phases. Finally, the fiber-wavelength assignment strategy of C-FWA (Ex-Fit) uses wavelength and fiber conversion resources more effectively than First Fit.
Euthanasia from the perspective of hospice care.
Gillett, G
1994-01-01
The hospice believes in the concept of a gentle and harmonious death. In most hospice settings there is also a rejection of active euthanasia. This set of two apparently conflicting principles can be defended on the basis of two arguments. The first is that doctors should not foster the intent to kill as part of their moral and clinical character. This allows proper sensitivity to the complex and difficult situation that arises in many of the most difficult terminal care situations. The second argument turns on the seduction of technological solutions to human problems and the slippery slope that may arise in the presence of a quick and convenient way of dealing with problems of death and dying.
Self-inflicted injuries. Challenging knowledge, skill, and compassion.
Haswell, D E; Graham, M
1996-09-01
Self-inflicted injuries and other serious self-destructive behaviours are common and difficult to recognize, prevent, and manage. Although they have previously been understood as repeated, failed attempts at suicide, they are better understood as maladaptive coping strategies. Women who present repeatedly with self-inflicted injuries need help to control this self-destructive behaviour and substitute more positive coping strategies. Physicians also need help in working with patients who respond to problems in this way. The program is made up of two broad sections. The first section involves understanding the problem and its origins in post-traumatic stress disorders. The second section offers a practical approach to helping patients presenting with injuries inflicted upon themselves. A deeper understanding of the etiology and management of repeated self-inflicted injuries will enable physicians to help patients with this difficult problem while minimizing their own anxiety and frustration.
Compressing Aviation Data in XML Format
NASA Technical Reports Server (NTRS)
Patel, Hemil; Lau, Derek; Kulkarni, Deepak
2003-01-01
Design, operations, and maintenance activities in aviation involve analysis of a variety of aviation data. This data is typically in disparate formats, making it difficult to use with different software packages. Use of a self-describing and extensible standard called XML provides a solution to this interoperability problem. XML provides a standardized language for describing the contents of an information stream, performing the same kind of definitional role for Web content as a database schema performs for relational databases. XML data can be easily customized for display using Extensible Style Sheets (XSL). While the self-describing nature of XML makes it easy to reuse, it also increases the size of data significantly. Therefore, transferring a dataset in XML form can decrease throughput and increase data transfer time significantly. It also increases storage requirements significantly. A natural solution to the problem is to compress the data using a suitable algorithm and transfer it in the compressed form. We found that XML-specific compressors such as Xmill and XMLPPM generally outperform traditional compressors. However, optimal use of Xmill requires discovery of the optimal options to use while running Xmill, which in turn depends on the nature of the data used. Manual discovery of optimal settings can require an engineer to experiment for weeks. We have devised an XML compression advisory tool that can analyze sample data files and recommend which compression tool would work best for the data and what optimal settings to use with an XML compression tool.
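The size penalty of XML's verbose, repetitive tags, and the gain from compressing them, can be illustrated with a minimal sketch. The record fields and the use of the standard-library zlib compressor are illustrative assumptions (zlib stands in for Xmill/XMLPPM, which are standalone tools rather than Python modules):

```python
import zlib

# Build a synthetic "aviation" dataset: the tag names repeat for
# every record, which is where most of the XML size overhead lives.
records = "".join(
    f"<flight><id>{i}</id><alt>30000</alt><speed>450</speed></flight>"
    for i in range(1000)
)
xml = f"<flights>{records}</flights>".encode()

raw = len(xml)
compressed = len(zlib.compress(xml, level=9))

# Highly repetitive markup compresses to a small fraction of its raw size.
print(f"raw: {raw} bytes, compressed: {compressed} bytes "
      f"({compressed / raw:.1%} of original)")
```

The same repetitiveness is what XML-specific compressors exploit more aggressively, e.g. by separating tag structure from data values before compressing each stream.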
ERIC Educational Resources Information Center
Burns, Barbara A.; Jordan, Thomas M.
2006-01-01
Business managers are faced with complex decisions involving a wide range of issues--technical, social, environmental, and financial--and their interaction. Our education system focuses heavily on presenting structured problems and teaching students to apply a set of tools or methods to solve these problems. Yet the most difficult thing to teach…
When 95% Accurate Isn't: Exploring Bayes's Theorem
ERIC Educational Resources Information Center
CadwalladerOlsker, Todd D.
2011-01-01
Bayes's theorem is notorious for being a difficult topic to learn and to teach. Problems involving Bayes's theorem (either implicitly or explicitly) generally involve calculations based on two or more given probabilities and their complements. Further, a correct solution depends on students' ability to interpret the problem correctly. Most people…
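The classic pitfall the title alludes to can be shown with a short worked calculation. The specific numbers (95% sensitivity and specificity, 1% base rate) are illustrative assumptions, not figures from the article:

```python
# Bayes's theorem for a "95% accurate" diagnostic test on a rare condition.
sensitivity = 0.95   # P(positive | condition)
specificity = 0.95   # P(negative | no condition)
base_rate = 0.01     # P(condition), i.e. the prior

# Law of total probability: P(positive)
p_positive = (sensitivity * base_rate
              + (1 - specificity) * (1 - base_rate))

# Bayes's theorem: P(condition | positive)
posterior = sensitivity * base_rate / p_positive
print(f"P(condition | positive) = {posterior:.3f}")
```

Despite the test being "95% accurate", a positive result implies only about a 16% chance of actually having the condition, because false positives from the large healthy population swamp the true positives; failing to weight by the prior in this way is exactly the misinterpretation the abstract describes.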