Developing the skills required for evidence-based practice.
French, B
1998-01-01
The current health care environment requires practitioners with the skills to find and apply the best currently available evidence for effective health care, to contribute to the development of evidence-based practice protocols, and to evaluate the impact of utilizing validated research findings in practice. Current approaches to teaching research are based mainly on gaining skills by participation in the research process. Emphasis on the requirement for rigour in the process of creating new knowledge is assumed to lead to skill in the process of using research information created by others. This article reflects upon the requirements for evidence-based practice, and the degree to which current approaches to teaching research prepare practitioners who are able to find, evaluate and best use currently available research information. The potential for using the principles of systematic review as a teaching and learning strategy for research is explored, and some of the possible strengths and weaknesses of this approach are highlighted.
Ocular Changes and Approaches of Ophthalmopathy in Basedow-Graves-Parry-Flajani Disease
SARACI, George; TRETA, Anamaria
2011-01-01
ABSTRACT Basedow-Graves disease is an autoimmune condition with multiple local and systemic aspects. Among these, oculopathy has a major impact on the patient's life from both a functional and an esthetic point of view. Basedow-Graves oculopathy requires an appropriate positive and differential diagnosis using clinical and imaging approaches. Treatment is always required in moderate or severe forms; it begins with simple general measures and continues with medical and surgical therapies. The current article focuses on the most characteristic clinical signs of thyroid ophthalmopathy and the currently required therapeutic approaches. PMID:22205899
An Exploratory Survey of Information Requirements for Instrument Approach Charts
DOT National Transportation Integrated Search
1995-03-01
This report documents a user-centered survey and interview effort conducted to analyze the information content of current Instrument Approach Plates (IAP). In the pilot opinion survey of approach chart information requirements, respondents indica...
Systems and context modeling approach to requirements analysis
NASA Astrophysics Data System (ADS)
Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick
2014-08-01
Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.
SCOS 2: ESA's new generation of mission control system
NASA Technical Reports Server (NTRS)
Jones, M.; Head, N. C.; Keyte, K.; Howard, P.; Lynenskjold, S.
1994-01-01
New mission-control infrastructure is currently being developed by ESOC, which will constitute the second generation of the Spacecraft Control Operations system (SCOS 2). The financial, functional and strategic requirements lying behind the new development are explained, and the SCOS 2 approach is described. The technological implications of these approaches are also described: in particular, it is explained how this leads to the use of object-oriented techniques to provide the required 'building block' approach. The paper summarizes the way in which the financial, functional and strategic requirements have been met through this combination of solutions. Finally, the paper outlines the development process to date, noting how risk reduction was achieved in the approach to new technologies, and summarizes the current status and future plans.
Crew Exploration Vehicle Environmental Control and Life Support Fire Protection Approach
NASA Technical Reports Server (NTRS)
Lewis, John F.; Barido, Richard; Tuan, George C.
2007-01-01
As part of preparing for the Crew Exploration Vehicle (CEV), the National Aeronautics and Space Administration (NASA) worked on developing the requirements to manage the fire risk. The new CEV poses unique challenges to current fire protection systems. The size and configuration of the vehicle resemble the Apollo capsule more than the current Space Shuttle or the International Space Station. The smaller free air volume and fully cold-plated avionics bays of the CEV require a different approach to fire protection than the ones currently utilized. The fire protection approach discussed in this paper incorporates historical lessons learned and fire detection and suppression system design philosophy spanning from Apollo to the International Space Station. Working with NASA fire and materials experts, this approach outlines the best requirements for both the closed-out areas of the vehicle, such as the avionics bay, and the crew cabin area to address the unique challenges posed by the size and configuration of the CEV.
Calculating Electrical Requirements for Direct Current Electric Actuators
2017-11-29
These requirements lead to the determination of multiple design decisions such as: operating voltage, regenerative energy capture/dissipation, and... Subject terms: electro-mechanical actuation, regenerative energy, electrical power, servo control, direct current (DC). The report covers power supply requirements, approaches to handling regenerative energy, conductor selection, and a worked example.
Financial Management: An Organic Approach
ERIC Educational Resources Information Center
Laux, Judy
2013-01-01
Although textbooks present corporate finance using a topical approach, good financial management requires an organic approach that integrates the various assignments financial managers confront every day. Breaking the tasks into meaningful subcategories, the current article offers one approach.
Alternative approach for fire suppression of class A, B and C fires in gloveboxes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberger, Mark S; Tsiagkouris, James A
2011-02-10
Department of Energy (DOE) Orders and National Fire Protection Association (NFPA) Codes and Standards require fire suppression in gloveboxes. Several potential solutions have been and are currently being considered at Los Alamos National Laboratory (LANL). The objective is to provide reliable, minimally invasive, and seismically robust fire suppression capable of extinguishing Class A, B, and C fires; achieve compliance with DOE and NFPA requirements; and provide value-added improvements to fire safety in gloveboxes. This report provides a brief summary of current approaches and also documents the successful fire tests conducted to prove that one approach, specifically Fire Foe™ tubes, is capable of achieving the requirement to provide reliable fire protection in gloveboxes in a cost-effective manner.
Supervising Writing: Helping Postgraduate Students Develop as Researchers
ERIC Educational Resources Information Center
Lee, Anne; Murray, Rowena
2015-01-01
Research and enquiry skills are increasingly required of students at all levels of the higher education curriculum, and this requires a sophisticated pedagogical response. The question is: how can we integrate current knowledge about academic writing with current knowledge about supervision? This article integrates different approaches to writing…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-14
... development of procedures to determine the competency of designees, to perform system audits and review, to... systems approach currently required for the importation of Hass avocados into all States of the United States from Michoacán, Mexico. The systems approach requirements include trapping, orchard...
Artificial intelligence in the materials processing laboratory
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Kaukler, William F.
1990-01-01
Materials science and engineering provides a vast arena for applications of artificial intelligence. Advanced materials research is an area in which challenging requirements confront the researcher, from the drawing board through production and into service. Advanced techniques result in the development of new materials for specialized applications. Hand-in-hand with these new materials are also requirements for state-of-the-art inspection methods to determine the integrity or fitness for service of structures fabricated from these materials. Two problems of current interest to the Materials Processing Laboratory at UAH are an expert system to assist in eddy current inspection of graphite epoxy components for aerospace and an expert system to assist in the design of superalloys for high temperature applications. Each project requires a different approach to reach the defined goals. Results to date are described for the eddy current analysis, but only the original concepts and approaches considered are given for the expert system to design superalloys.
One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...
Cryogenic Fluid Transfer for Exploration
NASA Technical Reports Server (NTRS)
Chato, David J.
2007-01-01
This paper discusses current plans and issues for exploration that involve the use of cryogenic transfer. The benefits of cryogenic transfer to exploration missions are examined. The current state of the art of transfer technology is reviewed. Mission concepts of operation for exploration are presented, and used to qualitatively discuss the performance benefits of transfer. The paper looks at the challenges faced to implement a cryogenic transfer system and suggests approaches to address them with advanced development research. Transfer rates required for exploration are shown to have already been achieved in ground test. Cost-effective approaches to the required on-orbit demonstration are suggested.
Cryogenic Fluid Transfer for Exploration
NASA Technical Reports Server (NTRS)
Chato, David J.
2008-01-01
This paper discusses current plans and issues for exploration that involve the use of cryogenic transfer. The benefits of cryogenic transfer to exploration missions are examined. The current state of the art of transfer technology is reviewed. Mission concepts of operation for exploration are presented, and used to qualitatively discuss the performance benefits of transfer. The paper looks at the challenges faced to implement a cryogenic transfer system and suggests approaches to address them with advanced development research. Transfer rates required for exploration are shown to have already been achieved in ground test. Cost-effective approaches to the required on-orbit demonstration are suggested.
Application Perspective of 2D+SCALE Dimension
NASA Astrophysics Data System (ADS)
Karim, H.; Rahman, A. Abdul
2016-09-01
Different applications or users need different abstractions of spatial models, dimensionalities and specifications of their datasets due to variations in the required analysis and output. Various approaches, data models and data structures are now available to support most current application models in Geographic Information System (GIS). One of the focus trends in the GIS multi-dimensional research community is the implementation of a scale dimension with spatial datasets to suit various scale application needs. In this paper, 2D spatial datasets that have been scaled up as the third dimension are addressed as 2D+scale (or 3D-scale) dimension. Nowadays, various data structures, data models, approaches, schemas, and formats have been proposed as the best approaches to support a variety of applications and dimensionalities in 3D topology. However, only a few of them consider the element of scale as their targeted dimension. As far as the scale dimension is concerned, the implementation approach can be either multi-scale or vario-scale (with any available data structures and formats) depending on application requirements (topology, semantics and function). This paper discusses current and new potential applications which could be integrated with the 3D-scale dimension approach. The previous and current works on the scale dimension, as well as the requirements to be preserved for any given application, implementation issues and future potential applications form the major discussion of this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, D; Fasenfest, B; Rieben, R
2006-09-08
We are concerned with the solution of time-dependent electromagnetic eddy current problems using a finite element formulation on three-dimensional unstructured meshes. We allow for multiple conducting regions, and our goal is to develop an efficient computational method that does not require a computational mesh of the air/vacuum regions. This requires a sophisticated global boundary condition specifying the total fields on the conductor boundaries. We propose a Biot-Savart law based volume-to-surface boundary condition to meet this requirement. This Biot-Savart approach is demonstrated to be very accurate. In addition, this approach can be accelerated via a low-rank QR approximation of the discretized Biot-Savart law.
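The acceleration idea in the abstract above can be illustrated generically. The following sketch (not the authors' code) uses an invented smooth kernel standing in for a discretized Biot-Savart volume-to-surface coupling: when source and observation regions are well separated, the dense coupling matrix is numerically low-rank, and a QR-based factorization lets it be stored and applied at a fraction of the dense cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 300 "volume" source points on [3, 4] and 300 "surface"
# observation points on [0, 1]; the kernel mimics Biot-Savart-like decay.
x = np.linspace(0.0, 1.0, 300)            # observation (surface) points
y = np.linspace(3.0, 4.0, 300)            # source (volume) points
A = 1.0 / (x[:, None] - y[None, :]) ** 2  # dense coupling matrix

# Randomized range finder: QR of A applied to a thin Gaussian test matrix
# yields an orthonormal basis Q for (approximately) the range of A.
r = 12                                    # target rank
Q, _ = np.linalg.qr(A @ rng.standard_normal((300, r)))
B = Q.T @ A                               # A ≈ Q @ B; apply cost drops to O(n*r)

rel_err = np.linalg.norm(A - Q @ B) / np.linalg.norm(A)
print(f"rank-{r} relative error: {rel_err:.2e}")
```

Because the clusters are well separated, a very small rank already reproduces the matrix to high accuracy, which is what makes the low-rank acceleration worthwhile.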
Early Fixation of Calcaneus Fractures.
Swords, Michael P; Penny, Phillip
2017-03-01
The treatment of calcaneus fractures is controversial. Historically, most operatively treated fractures have been approached with a lateral extensile incision requiring delay in operative treatment until swelling has improved. There is a current trend and interest in small incision approaches allowing, and in some cases requiring, earlier operative fixation. Clinical scenarios amenable to consideration for early fixation are reviewed. The sinus tarsi surgical approach and reduction techniques are outlined in detail. Copyright © 2016 Elsevier Inc. All rights reserved.
Models for forecasting hospital bed requirements in the acute sector.
Farmer, R D; Emami, J
1990-01-01
STUDY OBJECTIVE--The aim was to evaluate the current approach to forecasting hospital bed requirements. DESIGN--The study was a time series and regression analysis. The time series of mean duration of stay for general surgery in the age group 15-44 years (1969-1982) was used to evaluate different methods of forecasting future values of mean duration of stay and their subsequent use in forecasting hospital bed requirements. RESULTS--It is suggested that the simple trend fitting approach suffers from model specification error and imposes unjustified restrictions on the data. The time series approach (Box-Jenkins method) was shown to be a more appropriate way of modelling the data. CONCLUSION--The simple trend fitting approach is inferior to the time series approach in modelling hospital bed requirements. PMID:2277253
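The study's conclusion can be made concrete with a toy example. The sketch below uses a synthetic mean-duration-of-stay series (invented, not the study's data): naive linear-trend extrapolation of a mean-reverting series drifts toward impossible values, while a simple AR(1) model, the kind of structure Box-Jenkins identification would capture, respects the series' floor.

```python
# Synthetic series: 14 annual observations decaying toward a 5-day floor.
t = list(range(14))                        # e.g. 1969-1982
stay = [5.0 + 10.0 * 0.8 ** k for k in t]  # mean duration of stay (days)

# Simple trend fitting: ordinary least squares for stay ~ a + b*t.
n = len(t)
tbar, ybar = sum(t) / n, sum(stay) / n
b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, stay)) / \
    sum((ti - tbar) ** 2 for ti in t)
a = ybar - b * tbar
trend_forecast = a + b * 20                # extrapolate 7 years ahead

# AR(1) fit: regress stay[k+1] on stay[k], then iterate the recursion.
x0, x1 = stay[:-1], stay[1:]
xbar0, xbar1 = sum(x0) / len(x0), sum(x1) / len(x1)
phi = sum((u - xbar0) * (v - xbar1) for u, v in zip(x0, x1)) / \
      sum((u - xbar0) ** 2 for u in x0)
c = xbar1 - phi * xbar0
ar_forecast = stay[-1]
for _ in range(7):                         # forecast years 14..20
    ar_forecast = c + phi * ar_forecast

print(f"trend: {trend_forecast:.2f} days, AR(1): {ar_forecast:.2f} days")
```

On this series the straight-line extrapolation falls below zero days of stay, a clear specification error, while the AR(1) forecast settles just above the 5-day floor.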
Data compression for full motion video transmission
NASA Technical Reports Server (NTRS)
Whyte, Wayne A., Jr.; Sayood, Khalid
1991-01-01
Clearly, transmission of visual information will be a major, if not dominant, factor in determining the requirements for, and assessing the performance of, the Space Exploration Initiative (SEI) communications systems. Projected image/video requirements which are currently anticipated for SEI mission scenarios are presented. Based on this information and projected link performance figures, the image/video data compression requirements which would allow link closure are identified. Finally, several approaches which could satisfy some of the compression requirements are presented, and possible future approaches which show promise for more substantial compression performance improvement are discussed.
Integral Sensor Fault Detection and Isolation for Railway Traction Drive.
Garramiola, Fernando; Del Olmo, Jon; Poza, Javier; Madina, Patxi; Almandoz, Gaizka
2018-05-13
Due to the increasing importance of reliability and availability of electric traction drives in Railway applications, early detection of faults has become a key concern for Railway traction drive manufacturers. Sensor faults are important sources of failures. Among the different fault diagnosis approaches, in this article an integral diagnosis strategy for sensors in traction drives is presented. Such a strategy is composed of an observer-based approach for direct current (DC)-link voltage and catenary current sensors, a frequency analysis approach for motor phase current sensors and a hardware redundancy solution for speed sensors. None of them requires any hardware change in the existing traction drive. All the fault detection and isolation approaches have been validated on a Hardware-in-the-loop platform comprising a Real Time Simulator and a commercial Traction Control Unit for a tram. In comparison to safety-critical systems in Aerospace applications, Railway applications do not need instantaneous detection, and the diagnosis is validated over a short time period for a reliable decision. Combining the different approaches and existing hardware redundancy, an integral fault diagnosis solution is provided to detect and isolate faults in all the sensors installed in the traction drive.
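The observer-based part of such a strategy can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: an observer tracks a hypothetical first-order plant model, and the residual (measurement minus estimate) stays near zero in healthy operation but jumps when a sensor bias fault is injected, which is the basic mechanism behind observer-based sensor diagnosis.

```python
# Hypothetical discrete first-order plant x[k+1] = a*x + b*u, observer gain L.
a, b, L = 0.9, 0.1, 0.5
x = xh = 0.0                  # true state and observer estimate
u = 1.0                       # constant input
detected_at = None

for k in range(200):
    fault = 1.0 if k >= 100 else 0.0    # sensor bias injected at step 100
    y = x + fault                       # (possibly faulty) measurement
    residual = y - xh                   # innovation used for diagnosis
    if detected_at is None and abs(residual) > 0.5:
        detected_at = k
    xh = a * xh + b * u + L * residual  # observer update
    x = a * x + b * u                   # plant update

print("fault flagged at step", detected_at)
```

In a noiseless simulation the residual is exactly zero until the bias appears, so the threshold test fires at the very step the fault is injected; with measurement noise, the threshold and decision window would be tuned to trade detection delay against false alarms, matching the abstract's remark that Railway applications can validate the diagnosis over a short period.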
Integral Sensor Fault Detection and Isolation for Railway Traction Drive
del Olmo, Jon; Poza, Javier; Madina, Patxi; Almandoz, Gaizka
2018-01-01
Due to the increasing importance of reliability and availability of electric traction drives in Railway applications, early detection of faults has become a key concern for Railway traction drive manufacturers. Sensor faults are important sources of failures. Among the different fault diagnosis approaches, in this article an integral diagnosis strategy for sensors in traction drives is presented. Such a strategy is composed of an observer-based approach for direct current (DC)-link voltage and catenary current sensors, a frequency analysis approach for motor phase current sensors and a hardware redundancy solution for speed sensors. None of them requires any hardware change in the existing traction drive. All the fault detection and isolation approaches have been validated on a Hardware-in-the-loop platform comprising a Real Time Simulator and a commercial Traction Control Unit for a tram. In comparison to safety-critical systems in Aerospace applications, Railway applications do not need instantaneous detection, and the diagnosis is validated over a short time period for a reliable decision. Combining the different approaches and existing hardware redundancy, an integral fault diagnosis solution is provided to detect and isolate faults in all the sensors installed in the traction drive. PMID:29757251
Department of Defense Precise Time and Time Interval program improvement plan
NASA Technical Reports Server (NTRS)
Bowser, J. R.
1981-01-01
The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations, including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. This paper presents an overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings on organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users.
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
Hermoso, Maria; Tabacchi, Garden; Iglesia-Altaba, Iris; Bel-Serrat, Silvia; Moreno-Aznar, Luis A; García-Santos, Yurena; García-Luzardo, Ma del Rosario; Santana-Salguero, Beatriz; Peña-Quintana, Luis; Serra-Majem, Lluis; Moran, Victoria Hall; Dykes, Fiona; Decsi, Tamás; Benetou, Vassiliki; Plada, Maria; Trichopoulou, Antonia; Raats, Monique M; Doets, Esmée L; Berti, Cristiana; Cetin, Irene; Koletzko, Berthold
2010-10-01
This paper presents a review of the current knowledge regarding the macro- and micronutrient requirements of infants and discusses issues related to these requirements during the first year of life. The paper also reviews the current reference values used in European countries and the methodological approaches used to derive them by a sample of seven European and international authoritative committees from which background scientific reports are available. Throughout the paper, the main issues contributing to disparities in micronutrient reference values for infants are highlighted. The identification of these issues in relation to the specific physiological aspects of infants is important for informing future initiatives aimed at providing standardized approaches to overcome variability of micronutrient reference values across Europe for this age group. © 2010 Blackwell Publishing Ltd.
The transition to digital media in biocommunications.
Lynch, P J
1996-01-01
As digital audiovisual media become dominant in biomedical communications, the skills of human interface design and the technology of client-server multimedia data networks will underlie and influence virtually every aspect of biocommunications professional practice. The transition to digital communications media will require financial, organizational, and professional changes in current biomedical communications departments, and will require a multi-disciplinary approach that will blur the boundaries of the current biocommunications professions.
2015-04-30
approach directly contrasts with the traditional DoD acquisition model designed for a single big-bang waterfall approach (Broadus, 2013). Currently... progress, reduce technical and programmatic risk, and respond to feedback and changes more quickly than traditional waterfall methods (Modigliani... requirements, and contracting. The DoD can address these barriers by utilizing a proactively tailored Agile acquisition model, implementing an IT Box
Characterization of a High Current, Long Life Hollow Cathode
NASA Technical Reports Server (NTRS)
VanNoord, Jonathan L.; Kamhawi, Hani; McEwen, Heather K.
2006-01-01
The advent of higher power spacecraft makes it desirable to use higher power electric propulsion thrusters such as ion thrusters or Hall thrusters. Higher power thrusters require cathodes that are capable of producing higher currents. One application of these higher power spacecraft is deep-space missions that require tens of thousands of hours of operation. This paper presents the approach used to design a high current, long life hollow cathode assembly for that application, along with test results from the corresponding hollow cathode. The design approach used for the candidate hollow cathode was to reduce the temperature gradient in the insert, yielding a lower peak temperature and allowing current to be produced more uniformly along the insert. The lower temperatures result in a hollow cathode with increased life. The hollow cathode designed was successfully operated at currents from 10 to 60 A with flow rates of 5 to 19 sccm, with a maximum orifice temperature measured of 1100 °C. Data including discharge voltage, keeper voltage, discharge current, flow rates, and orifice plate temperatures are presented.
Early detection of non-indigenous species (NIS), newly introduced species at low abundance in the monitoring area, can strengthen current management strategies including NIS control and eradication. A practical early detection strategy requires achieving balance between efficient...
Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...
Supporting Blended-Learning: Tool Requirements and Solutions with OWLish
ERIC Educational Resources Information Center
Álvarez, Ainhoa; Martín, Maite; Fernández-Castro, Isabel; Urretavizcaya, Maite
2016-01-01
Currently, most of the educational approaches applied to higher education combine face-to-face (F2F) and computer-mediated instruction in a Blended-Learning (B-Learning) approach. One of the main challenges of these approaches is fully integrating the traditional brick-and-mortar classes with online learning environments in an efficient and…
Current Source Based on H-Bridge Inverter with Output LCL Filter
NASA Astrophysics Data System (ADS)
Blahnik, Vojtech; Talla, Jakub; Peroutka, Zdenek
2015-09-01
The paper deals with the control of a current source with an LCL output filter. The controlled current source is realized as a single-phase inverter, and the output LCL filter provides low ripple of the output current. However, systems incorporating LCL filters require more complex control strategies, and there are several interesting approaches to the control of this type of converter. This paper presents the inverter control algorithm, which combines model-based control with direct current control based on resonant controllers and single-phase vector control. The primary goal is to reduce the current ripple and distortion below the required limits and to provide fast and precise control of the output current. The proposed control technique is verified by measurements on a laboratory model.
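The role of the resonant controllers mentioned above can be sketched numerically. A (quasi-)resonant term gives the controller very large gain exactly at the reference frequency, which drives the steady-state tracking error at that frequency toward zero; the gains below are illustrative, not the paper's values.

```python
import math

# Quasi-resonant controller C(s) = Kp + Ki*s / (s^2 + 2*wc*s + w0^2).
Kp, Ki, wc = 1.0, 1000.0, 1.0           # wc sets the resonance bandwidth
w0 = 2 * math.pi * 50.0                 # 50 Hz sinusoidal current reference

def gain(w):
    """Magnitude of the controller's frequency response at angular frequency w."""
    s = 1j * w
    return abs(Kp + Ki * s / (s * s + 2 * wc * s + w0 * w0))

print(f"|C(jw0)|  = {gain(w0):.1f}")     # very large gain at the reference frequency
print(f"|C(2jw0)| = {gain(2 * w0):.2f}") # modest gain away from resonance
```

At exactly s = jw0 the gain evaluates to Kp + Ki/(2*wc), so shrinking wc (or raising Ki) pushes the resonance peak up; an ideal resonant controller (wc = 0) would have infinite gain there, which is why resonant control tracks a sinusoidal current reference with zero steady-state error.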
Kolahi, Jafar; Abrishami, Mohamadreza; Davidovitch, Zéev
2009-09-01
Direct electric current is a potent biologic means to accelerate periodontal tissue turnover and orthodontic tooth movement. The main problem associated with this approach is the source of electricity. A noninvasive, removable enzymatic micro-battery that administers minute electric currents to the alveolar bone and oral soft tissues, utilizing glucose as a fuel, is a possible source of the electrical power required for accelerating the velocity of orthodontic tooth movement.
ERIC Educational Resources Information Center
Pecorella, Patricia A.; Bowers, David G.
Analyses preparatory to construction of a suitable file for generating a system of future performance trend indicators are described. Such a system falls into the category of a current value approach to human resources accounting. It requires that there be a substantial body of data which: (1) uses the work group or unit, not the individual, as…
Oxygen Sensing for Industrial Safety — Evolution and New Approaches
Willett, Martin
2014-01-01
The requirement for the detection of oxygen in industrial safety applications has historically been met by electrochemical technologies based on the consumption of metal anodes. Products using this approach have been technically and commercially successful for more than three decades. However, a combination of new requirements is driving the development of alternative approaches offering fresh opportunities and challenges. This paper reviews some key aspects in the evolution of consumable anode products and highlights recent developments in alternative technologies aimed at meeting current and anticipated future needs in this important application. PMID:24681673
Oxygen sensing for industrial safety - evolution and new approaches.
Willett, Martin
2014-03-27
The requirement for the detection of oxygen in industrial safety applications has historically been met by electrochemical technologies based on the consumption of metal anodes. Products using this approach have been technically and commercially successful for more than three decades. However, a combination of new requirements is driving the development of alternative approaches offering fresh opportunities and challenges. This paper reviews some key aspects in the evolution of consumable anode products and highlights recent developments in alternative technologies aimed at meeting current and anticipated future needs in this important application.
AIR QUALITY MODELING OF HAZARDOUS POLLUTANTS: CURRENT STATUS AND FUTURE DIRECTIONS
The paper presents a review of current air toxics modeling applications and discusses possible advanced approaches. Many applications require the ability to predict hot spots from industrial sources or large roadways that are needed for community health and Environmental Justice...
A NEW APPROACH FOR BIODIESEL PRODUCTION FROM ALGAE
The supply of energy for the United States and world is currently dependent on extraction of fossil fuels. Eventually, a novel or sustainable source of energy will be required for industrial societies. In particular, transportation fuels are currently dependent on dwindling su...
The MICRO-BOSS scheduling system: Current status and future efforts
NASA Technical Reports Server (NTRS)
Sadeh, Norman M.
1993-01-01
In this paper, a micro-opportunistic approach to factory scheduling was described that closely monitors the evolution of bottlenecks during the construction of the schedule, and continuously redirects search towards the bottleneck that appears to be most critical. This approach differs from earlier opportunistic approaches, as it does not require scheduling large resource subproblems or large job subproblems before revising the current scheduling strategy. This micro-opportunistic approach was implemented in the context of the MICRO-BOSS factory scheduling system. A study comparing MICRO-BOSS against a macro-opportunistic scheduler suggests that the additional flexibility of the micro-opportunistic approach to scheduling generally yields important reductions in both tardiness and inventory.
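The micro-opportunistic idea described above can be caricatured in a few lines. This toy sketch (not MICRO-BOSS itself, and with invented operation data) re-evaluates which resource is currently the bottleneck after every single scheduling decision, then dispatches the most urgent operation contending for it, rather than committing to a whole resource or job subproblem at once.

```python
# (name, resource, duration, due date) -- all values invented for illustration.
ops = [
    ("A", "m1", 3, 10), ("B", "m1", 2, 4), ("C", "m2", 4, 6),
    ("D", "m2", 1, 3), ("E", "m1", 1, 8),
]

schedule = []
while ops:
    # Re-identify the bottleneck: the resource with the largest remaining workload.
    workload = {}
    for _, res, dur, _ in ops:
        workload[res] = workload.get(res, 0) + dur
    bottleneck = max(workload, key=workload.get)
    # Among operations contending for the bottleneck, dispatch the most urgent.
    pick = min((op for op in ops if op[1] == bottleneck), key=lambda op: op[3])
    schedule.append(pick[0])
    ops.remove(pick)

print("dispatch order:", schedule)  # bottleneck re-evaluated at every step
```

Note how the perceived bottleneck can flip between m1 and m2 as operations are placed, which is exactly the flexibility the micro-opportunistic approach exploits; a macro-opportunistic scheduler would have committed to one resource's entire subproblem first.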
Screening for endocrine-disrupting chemicals (EDCs) requires sensitive, scalable assays. Current high-throughput screening (HTPS) approaches for estrogenic and androgenic activity yield rapid results, but many are not sensitive to physiological hormone concentrations, suggesting ...
SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK
All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...
Planar/dpiX common military avionics AMLCDs: roadmap and production
NASA Astrophysics Data System (ADS)
Wanner, John; Gard, Allen; Roselle, Paul; Lewis, Alan
2000-08-01
This paper reviews the current production approach and status at Planar and dpiX utilizing a common design architecture within a family of cockpit AMLCD displays. The present status of low-volume production requirements to support military applications, together with the unique display formats and performance requirements dictated by specific cockpit applications, has resulted in a manufacturing approach requiring common TFT substrate design flexibility and the use of a common foundation for the assembly of AMLCD displays suitable for a variety of high-performance military cockpits.
Fast Formal Analysis of Requirements via "Topoi Diagrams"
NASA Technical Reports Server (NTRS)
Menzies, Tim; Powell, John; Houle, Michael E.; Kelly, John C. (Technical Monitor)
2001-01-01
Early testing of requirements can decrease the cost of removing errors in software projects. However, unless done carefully, that testing process can significantly add to the cost of requirements analysis. We show here that requirements expressed as topoi diagrams can be built and tested cheaply using our SP2 algorithm: the formal temporal properties of a large class of topoi can be proven very quickly, in time nearly linear in the number of nodes and edges in the diagram. There are two limitations to our approach. Firstly, topoi diagrams cannot express certain complex concepts such as iteration and subroutine calls. Hence, our approach is more useful for requirements engineering than for traditional model-checking domains. Secondly, our approach is better for exploring the temporal occurrence of properties than the temporal ordering of properties. Within these restrictions, we can express a useful range of concepts currently seen in requirements engineering, and a wide range of interesting temporal properties.
Practice Patterns of Speech-Language Pathologists in Pediatric Vocal Health.
Hartley, Naomi A; Braden, Maia; Thibeault, Susan L
2017-05-17
The purpose of this study was to investigate current practices of speech-language pathologists (SLPs) in the management of pediatric vocal health, with specific analysis of the influence of clinical specialty and workplace setting on management approaches. American Speech-Language-Hearing Association-certified clinicians providing services within the United States (1%-100% voice caseload) completed an anonymous online survey detailing clinician demographics; employment location and service delivery models; approaches to continuing professional development; and specifics of case management, including assessment, treatment, and discharge procedures. Current practice patterns were analyzed for 100 SLPs (0-42 years of experience; 77 self-identifying as voice specialists) providing services in 34 U.S. states across a range of metropolitan and nonmetropolitan workplace settings. In general, SLPs favored a multidisciplinary approach to management; included perceptual, instrumental, and quality of life measures during evaluation; and tailored intervention to the individual using a combination of therapy approaches. In contrast with current practice guidelines, only half reported requiring an otolaryngology evaluation prior to initiating treatment. Both clinical specialty and workplace setting were found to affect practice patterns. SLPs in school settings were significantly less likely to consider themselves voice specialists compared with all other work environments. Those SLPs who considered themselves voice specialists were significantly more likely to utilize voice-specific assessment and treatment approaches. SLP practice largely mirrors current professional practice guidelines; however, potential exists to further enhance client care. To ensure that SLPs are best able to support children in successful communication, further research, education, and advocacy are required.
An avionics sensitivity study. Volume 1: Operational considerations
NASA Technical Reports Server (NTRS)
Scott, R. W.; Mcconkey, E. D.
1976-01-01
Equipment and operational concepts affecting aircraft in the terminal area are reported. Curved approach applications and modified climb and descent procedures for minimum fuel consumption are considered. The curved approach study involves the application of MLS guidance to enable execution of the current visual approach to Washington National Airport under instrument flight conditions. The operational significance and the flight path control requirements involved in the application of curved approach paths to this situation are considered. Alternative flight path control regimes are considered to achieve minimum fuel consumption subject to constraints related to air traffic control requirements, flight crew and passenger reactions, and airframe and powerplant limitations.
Marciano, Michael A; Adelman, Jonathan D
2017-03-01
The deconvolution of DNA mixtures remains one of the most critical challenges in the field of forensic DNA analysis. In addition, of all the data features required to perform such deconvolution, the number of contributors in the sample is widely considered the most important, and, if incorrectly chosen, the most likely to negatively influence the mixture interpretation of a DNA profile. Unfortunately, most current approaches to mixture deconvolution require the assumption that the number of contributors is known by the analyst, an assumption that can prove to be especially faulty when faced with increasingly complex mixtures of 3 or more contributors. In this study, we propose a probabilistic approach for estimating the number of contributors in a DNA mixture that leverages the strengths of machine learning. To assess this approach, we compare classification performances of six machine learning algorithms and evaluate the model from the top-performing algorithm against the current state of the art in the field of contributor number classification. Overall results show over 98% accuracy in identifying the number of contributors in a DNA mixture of up to 4 contributors. Comparative results showed 3-person mixtures had a classification accuracy improvement of over 6% compared to the current best-in-field methodology, and that 4-person mixtures had a classification accuracy improvement of over 20%. The Probabilistic Assessment for Contributor Estimation (PACE) also accomplishes classification of mixtures of up to 4 contributors in less than 1s using a standard laptop or desktop computer. Considering the high classification accuracy rates, as well as the significant time commitment required by the current state of the art model versus seconds required by a machine learning-derived model, the approach described herein provides a promising means of estimating the number of contributors and, subsequently, will lead to improved DNA mixture interpretation. 
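The classification step described above can be pictured with a deliberately simplified sketch. This is not the PACE implementation: the single distinct-allele-count feature and the centroid values are illustrative assumptions, standing in for the richer feature set and machine-learned model the abstract describes.

```python
# Hypothetical sketch of contributor-number estimation (NOT the PACE model):
# classify a mixture by comparing its mean distinct-allele count per locus
# against illustrative centroids for known contributor counts.
import statistics

# Toy "trained" centroids: mean distinct alleles per locus for mixtures with
# a known number of contributors (made-up values for illustration only).
CENTROIDS = {1: 1.8, 2: 3.2, 3: 4.4, 4: 5.3}

def estimate_contributors(allele_counts_per_locus):
    """Return the contributor count whose centroid lies nearest to the
    mean distinct-allele count observed in the questioned mixture."""
    mean_count = statistics.mean(allele_counts_per_locus)
    return min(CENTROIDS, key=lambda k: abs(CENTROIDS[k] - mean_count))
```

A real model would use many per-locus features and a trained classifier; the point here is only the shape of the decision, mapping summary statistics of the profile onto a discrete contributor count.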
The QuEST for multi-sensor big data ISR situation understanding
NASA Astrophysics Data System (ADS)
Rogers, Steven; Culbertson, Jared; Oxley, Mark; Clouse, H. Scott; Abayowa, Bernard; Patrick, James; Blasch, Erik; Trumpfheller, John
2016-05-01
The challenges of providing war fighters with the best possible actionable information from diverse sensing modalities, using advances in big data and machine learning, are addressed in this paper. We start by presenting intelligence, surveillance, and reconnaissance (ISR) related big-data challenges associated with the Third Offset Strategy. Current approaches to big data are shown to be limited with respect to reasoning and understanding. We present a discussion of what meaning making and understanding require. We posit that, for human-machine collaborative solutions to address the requirements of the strategy, a new approach, Qualia Exploitation of Sensor Technology (QuEST), will be required. The requirements for developing a QuEST theory of knowledge are discussed and, finally, an engineering approach for achieving situation understanding is presented.
Jackson, D; Smith, K; Wood, M D
2014-07-01
Over recent years, a number of approaches have been developed that enable the calculation of dose rates to animals and plants following the release of radioactivity to the environment. These approaches can be used to assess the potential impacts of activities that may release radioactivity to the environment, such as the operation of waste repositories. A number of national and international studies have identified screening criteria to indicate those assessment results below which further consideration is not generally required. However, no internationally agreed criteria are currently available and consistency in criteria between countries has not been achieved. Furthermore, since screening criteria are not intended to be applied as limits, it is clear that they cannot always form a sufficient basis for assessing the adequacy of protection afforded. Typically, exceeding a screening value leads to a regulatory requirement to undertake a further, more detailed assessment. It does not, per se, imply that there is inadequate protection of the organism types at the specific site under assessment. Therefore, there is a need to develop a more structured approach to dealing with situations in which current screening criteria are exceeded. As a contribution to the developing international discussions, and as an interim measure for application where assessments are required currently, a two-tier, three-zone framework is proposed here, relevant to the long-term assessment of potential impacts from the deep disposal of radioactive wastes. The purpose of the proposed framework is to promote a proportionate and risk-based approach to the level of effort required in undertaking and interpreting an assessment.
A NEW APPROACH TO PIP CROP MONITORING USING REMOTE SENSING
Current plantings of 25+ million acres of transgenic corn in the United States require a new approach to monitor this important crop for the development of pest resistance. Remote sensing by aerial or satellite images may provide a method of identifying transgenic pesticidal cro...
(Q)SARs to predict environmental toxicities: current status and future needs.
Cronin, Mark T D
2017-03-22
The current state of the art of (Quantitative) Structure-Activity Relationships ((Q)SARs) to predict environmental toxicity is assessed along with recommendations to develop these models further. The acute toxicity of compounds acting by the non-polar narcotic mechanism of action can be well predicted, however other approaches, including read-across, may be required for compounds acting by specific mechanisms of action. The chronic toxicity of compounds to environmental species is more difficult to predict from (Q)SARs, with robust data sets and more mechanistic information required. In addition, the toxicity of mixtures is little addressed by (Q)SAR approaches. Developments in environmental toxicology including Adverse Outcome Pathways (AOPs) and omics responses should be utilised to develop better, more mechanistically relevant, (Q)SAR models.
DOT National Transportation Integrated Search
1987-09-01
This analysis determines the number, mix, and home ports of vessels required to replace the aging fleet of WLB (seagoing) and WLM (coastal) buoy tenders currently servicing aids to navigation. A case study approach was used. Differing values of vesse...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-10
... that plan for any mergers, (3) obtain prior written approvals for the use of certain approaches for... of its continuing effort to reduce paperwork and respondent burden, invites the general public and... Rules: Standardized Approach for Risk-Weighted Assets; Market Discipline and Disclosure Requirements (77...
An approach for conducting PM source apportionment will be developed, tested, and applied that directly addresses limitations in current SA methods, in particular variability, biases, and intensive resource requirements. Uncertainties in SA results and sensitivities to SA inpu...
Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource intensive in terms of time, money and animal use to evaluate all chemicals under development or alread...
Teachers' Reflections on Distributive Leadership in Public Primary Schools in Soweto
ERIC Educational Resources Information Center
Naicker, Suraiya R.; Mestry, Raj
2013-01-01
Schooling has become increasingly complex in purpose and structure and therefore requires appropriate forms of leadership to address this challenge. One current leadership approach that is receiving national and global attention is distributive leadership. A qualitative approach was employed to investigate teachers' experiences and perceptions of…
Salt, Time, and Metaphor: Examining Norms in Scientific Culture
ERIC Educational Resources Information Center
Brady, Anna G.
2017-01-01
As has been widely discussed, the National Research Council's (NRC) current policy in United States education advocates supporting students toward acquiring skills to engage in scientific practices. NRC policy also suggests that supporting students in the practices of science may require different approaches than what is required for supporting…
A GIS WEB MAPPING APPROACH FOR IDENTIFYING SPECIES AND LOCATIONS FOR ECOLOGICAL RISK ASSESSMENTS
In many countries, numerous tests are required prior to chemical registration for the protection of human health and the environment from the unintended effects of chemical releases. Currently, plant testing in the United States requires the use of ten species, selected because t...
Get Your Requirements Straight: Storyboarding Revisited
NASA Astrophysics Data System (ADS)
Haesen, Mieke; Luyten, Kris; Coninx, Karin
Current user-centred software engineering (UCSE) approaches provide many techniques to combine know-how available in multidisciplinary teams. Although the involvement of various disciplines is beneficial for the user experience of the future application, the transition from a user needs analysis to a structured interaction analysis and UI design is not always straightforward. We propose storyboards, enriched by metadata, to specify functional and non-functional requirements. Accompanying tool support should facilitate the creation and use of storyboards. We used a meta-storyboard for the verification of storyboarding approaches.
NASA Technical Reports Server (NTRS)
Webb, J. T.
1988-01-01
A new approach to the training, certification, recertification, and proficiency maintenance of the Shuttle launch team is proposed. Previous training approaches are first reviewed. Short term program goals include expanding current training methods, improving the existing simulation capability, and scheduling training exercises with the same priority as hardware tests. Long-term goals include developing user requirements which would take advantage of state-of-the-art tools and techniques. Training requirements for the different groups of people to be trained are identified, and future goals are outlined.
TOPICAL REVIEW: Progress in engineering high strain lead-free piezoelectric ceramics
NASA Astrophysics Data System (ADS)
Leontsev, Serhiy O.; Eitel, Richard E.
2010-08-01
Environmental concerns are strongly driving the need to replace the lead-based piezoelectric materials currently employed as multilayer actuators. The current review describes both compositional and structural engineering approaches to achieve enhanced piezoelectric properties in lead-free materials. The review of the compositional engineering approach focuses on compositional tuning of the properties and phase behavior in three promising families of lead-free perovskite ferroelectrics: the titanate, alkaline niobate and bismuth perovskites and their solid solutions. The 'structural engineering' approaches focus instead on optimization of microstructural features including grain size, grain orientation or texture, ferroelectric domain size and electrical bias field as potential paths to induce large piezoelectric properties in lead-free piezoceramics. It is suggested that a combination of both compositional and novel structural engineering approaches will be required in order to realize viable lead-free alternatives to current lead-based materials for piezoelectric actuator applications.
Transoral robotic thyroid surgery
Clark, James H.; Kim, Hoon Yub
2015-01-01
There is currently significant demand for minimally invasive thyroid surgery; however the majority of proposed surgical approaches necessitate a compromise between minimal tissue dissection with a visible cervical scar or extensive tissue dissection with a remote, hidden scar. The development of transoral endoscopic thyroid surgery however provides an approach which is truly minimally invasive, as it conceals the incision within the oral cavity without significantly increasing the amount of required dissection. The transoral endoscopic approach however presents multiple technical challenges, which could be overcome with the incorporation of a robotic operating system. This manuscript summarizes the literature on the feasibility and current clinical experience with transoral robotic thyroid surgery. PMID:26425456
The MICRO-BOSS scheduling system: Current status and future efforts
NASA Technical Reports Server (NTRS)
Sadeh, Norman M.
1992-01-01
In this paper, a micro-opportunistic approach to factory scheduling was described that closely monitors the evolution of bottlenecks during the construction of the schedule and continuously redirects search towards the bottleneck that appears to be most critical. This approach differs from earlier opportunistic approaches, as it does not require scheduling large resource subproblems or large job subproblems before revising the current scheduling strategy. This micro-opportunistic approach was implemented in the context of the MICRO-BOSS factory scheduling system. A study comparing MICRO-BOSS against a macro-opportunistic scheduler suggests that the additional flexibility of the micro-opportunistic approach to scheduling generally yields important reductions in both tardiness and inventory. Current research efforts include: adaptation of MICRO-BOSS to deal with sequence-dependent setups and development of micro-opportunistic reactive scheduling techniques that will enable the system to patch the schedule in the presence of contingencies such as machine breakdowns, raw materials arriving late, job cancellations, etc.
Interferometer for Space Station Windows
NASA Technical Reports Server (NTRS)
Hall, Gregory
2003-01-01
Inspection of space station windows for micrometeorite damage would be a difficult task in situ using current inspection techniques. Commercially available optical profilometers and inspection systems are relatively large, about the size of a desktop computer tower, and require a stable platform to inspect the test object. Also, many devices currently available are designed for laboratory or controlled environments requiring external computer control. This paper presents an approach that uses a highly developed optical interferometer to inspect the windows from inside the space station itself with a self-contained, handheld device. The interferometer would be capable, at a minimum, of detecting damage as small as one ten-thousandth of an inch in diameter and depth while interrogating a relatively large area. The device is still at the proof-of-concept stage. The background section of this paper discusses the current state of the art in profilometers as well as the desired configuration of the self-contained, handheld device. The developments and findings that will allow the configuration change are then discussed, with suggested approaches appearing in the proof-of-concept section.
We demonstrate an approach for evaluating the level of protection attained using a variety of forms and levels of past, current, and proposed Air Quality Standards (AQSs). The U.S. Clean Air Act requires the establishment of ambient air quality standards to protect health and pub...
Institutional barriers and incentives for ecosystem management: a problem analysis.
H.J. Cortner; M.A. Shannon; M.G. Wallace; S. Burke; M.A. Moote
1996-01-01
Ecosystem management is currently being proposed as a new resource management philosophy. This approach to resource management will require changes in how society approaches nature, science, and politics. Further, if efforts to implement ecosystem management are to succeed, institutional issues must be examined. This report identifies five problem areas where social...
Expanding Omani Learners' Horizons through Project-Based Learning: A Case Study
ERIC Educational Resources Information Center
Dauletova, Victoria
2014-01-01
As a relatively innovative teaching/learning approach in the Arabian Gulf region, in general, and in Oman, in particular, project-based learning requires progressive amendments and adaptations to the national culture of the learner. This article offers analysis of the current state of the approach in the local educational environment. Furthermore,…
1991-12-01
6.5 Summary of Current or Potential Approaches: many approaches to context analysis were discussed by the group, including causal trees and SWOT analysis.
Global GNSS processing based on the raw observation approach
NASA Astrophysics Data System (ADS)
Strasser, Sebastian; Zehentner, Norbert; Mayer-Gürr, Torsten
2017-04-01
Many global navigation satellite system (GNSS) applications, e.g. Precise Point Positioning (PPP), require high-quality GNSS products, such as precise GNSS satellite orbits and clocks. These products are routinely determined by analysis centers of the International GNSS Service (IGS). The current processing methods of the analysis centers make use of the ionosphere-free linear combination to reduce the ionospheric influence. Some of the analysis centers also form observation differences, in general double-differences, to eliminate several additional error sources. The raw observation approach is a new GNSS processing approach that was developed at Graz University of Technology for kinematic orbit determination of low Earth orbit (LEO) satellites and subsequently adapted to global GNSS processing in general. This new approach offers some benefits compared to well-established approaches, such as a straightforward incorporation of new observables due to the avoidance of observation differences and linear combinations. This becomes especially important in view of the changing GNSS landscape with two new systems, the European system Galileo and the Chinese system BeiDou, currently in deployment. GNSS products generated at Graz University of Technology using the raw observation approach currently comprise precise GNSS satellite orbits and clocks, station positions and clocks, code and phase biases, and Earth rotation parameters. To evaluate the new approach, products generated using the Global Positioning System (GPS) constellation and observations from the global IGS station network are compared to those of the IGS analysis centers. The comparisons show that the products generated at Graz University of Technology are on a similar level of quality to the products determined by the IGS analysis centers. This confirms that the raw observation approach is applicable to global GNSS processing. 
Some areas requiring further work have been identified, enabling future improvements of the method.
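For context, the ionosphere-free linear combination that the raw observation approach avoids is a textbook construction and can be written down directly. The sketch below uses the GPS L1/L2 carrier frequencies; it is illustrative background, not code from the processing chain described above.

```python
# Standard dual-frequency ionosphere-free combination (the technique the raw
# observation approach replaces). GPS L1/L2 carrier frequencies in Hz.
f1, f2 = 1575.42e6, 1227.60e6

def ionosphere_free(p1, p2):
    """Combine pseudoranges on two frequencies so the first-order
    ionospheric delay, which scales as 1/f^2, cancels exactly."""
    a = f1 ** 2 / (f1 ** 2 - f2 ** 2)
    b = f2 ** 2 / (f1 ** 2 - f2 ** 2)
    return a * p1 - b * p2
```

Because the combination amplifies noise and discards the ionospheric signal, avoiding it (as the raw observation approach does) keeps each observable intact at the cost of having to model the ionosphere explicitly.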
Lopez, Felipe L; Ernest, Terry B; Tuleu, Catherine; Gul, Mine Orlu
2015-01-01
Introduction: Most conventional drug delivery systems are not acceptable for pediatric patients, who differ in developmental status and dosing requirements from other subsets of the population. Technology platforms are required to aid the development of age-appropriate medicines that maximize patient acceptability while maintaining safety, efficacy, accessibility and affordability. Areas covered: The current approaches and novel developments in the field of age-appropriate drug delivery for pediatric patients are critically discussed, including patient-centric formulations, administration devices and packaging systems. Expert opinion: Despite the incentives provided by recent regulatory modifications and the efforts of formulation scientists, there is still a need for pharmaceutical technologies that enable the manufacture of licensed age-appropriate formulations. Harmonization of efforts from regulators, industry and academia, by sharing the learning associated with data from pediatric investigation plans, product development pathways and scientific projects, would be the way forward to speed up bench-to-market development of age-appropriate formulations. A collaborative approach will benefit not only pediatric patients; other populations, such as geriatric patients, would also benefit from an accelerated patient-centric approach to drug delivery. PMID:26165848
Advanced Wet Tantalum Capacitors: Design, Specifications and Performance
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2016-01-01
Insertion of new types of commercial, high volumetric efficiency wet tantalum capacitors in space systems requires reassessment of the existing quality assurance approaches that have been developed for capacitors manufactured to MIL-PRF-39006 requirements. The specifics of wet electrolytic capacitors is that leakage currents flowing through electrolyte can cause gas generation resulting in building up of internal gas pressure and rupture of the case. The risk associated with excessive leakage currents and increased pressure is greater for high value advanced wet tantalum capacitors, but it has not been properly evaluated yet. This presentation gives a review of specifics of the design, performance, and potential reliability risks associated with advanced wet tantalum capacitors. Problems related to setting adequate requirements for DPA, leakage currents, hermeticity, stability at low and high temperatures, ripple currents for parts operating in vacuum, and random vibration testing are discussed. Recommendations for screening and qualification to reduce risks of failures have been suggested.
Signal timing on a shoestring.
DOT National Transportation Integrated Search
2005-03-01
The conventional approach to signal timing optimization and field deployment requires current traffic flow data, experience with optimization models, familiarity with the signal controller hardware, and knowledge of field operations including signal ...
Structures and Materials Technologies for Extreme Environments Applied to Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Scotti, Stephen J.; Clay, Christopher; Rezin, Marc
2003-01-01
This paper provides an overview of the evolution of structures and materials technology approaches to survive the challenging extreme environments encountered by earth-to-orbit space transportation systems, with emphasis on more recent developments in the USA. The evolution of technology requirements and experience in the various approaches to meeting these requirements has significantly influenced the technology approaches. While previous goals were primarily performance driven, more recently dramatic improvements in costs/operations and in safety have been paramount goals. Technologies that focus on the cost/operations and safety goals in the area of hot structures and thermal protection systems for reusable launch vehicles are presented. Assessments of the potential ability of the various technologies to satisfy the technology requirements, and their current technology readiness status are also presented.
de Araújo, Brenda R S; Linares León, José J
2018-05-15
This study presents the results of the electrochemical degradation of the emulsifier cetrimonium chloride (CTAC) on a boron-doped diamond (BDD) anode under different current densities and flow rates. Higher values of these parameters result in a more rapid removal. Nevertheless, operation at low current reduces the required applied charge and increases the chemical oxygen demand (COD) removal efficiency, as there is less development of ineffective parasitic reactions. On the other hand, high flow rates reduce the required volumetric applied charge and increase the COD removal current efficiency. In order to assist and enrich the study, an economic analysis has been performed. For short expected plant lifespans, operation at low current is advantageous due to the lower investment required, whereas for longer expected lifespans, the operational costs make the lower current densities less costly. High flow rates are always advantageous from a financial point of view.
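The two charge metrics the abstract contrasts can be made concrete with standard electrochemistry definitions. This is a general sketch under textbook assumptions (Faraday constant, 8 g of O2 per mole of electrons), not data or code from the study itself.

```python
# General bookkeeping for electrochemical COD removal (illustrative only).
F = 96485.0  # Faraday constant, C/mol

def applied_charge_ah_per_l(current_a, time_h, volume_l):
    """Volumetric applied charge Q = I * t / V, in Ah/L."""
    return current_a * time_h / volume_l

def cod_current_efficiency(cod0_g_l, codt_g_l, volume_l, current_a, time_s):
    """Fraction of the applied charge spent on COD removal:
    ICE = F * V * (COD0 - CODt) / (8 * I * t),
    with COD in g O2/L and 8 g O2 per mole of electrons."""
    return F * volume_l * (cod0_g_l - codt_g_l) / (8.0 * current_a * time_s)
```

These definitions make the trade-off in the abstract visible: lowering the current lengthens treatment but raises the fraction of charge that does useful oxidation, while parasitic side reactions show up directly as an efficiency below one.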
Development and verification of local/global analysis techniques for laminated composites
NASA Technical Reports Server (NTRS)
Griffin, O. Hayden, Jr.
1989-01-01
Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years, and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, either from a resource or time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique to global/local analysis, where a global analysis is run, with the results of that analysis applied to a smaller region as boundary conditions, in as many iterations as is required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well known behavior when used for analysis of laminated composites.
Development of a structured approach for decomposition of complex systems on a functional basis
NASA Astrophysics Data System (ADS)
Yildirim, Unal; Felician Campean, I.
2014-07-01
The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology to decompose a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discusses practical requirements of the SSFD in the context of the current literature and current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).
The NASA Space Launch System Program Systems Engineering Approach for Affordability
NASA Technical Reports Server (NTRS)
Hutt, John J.; Whitehead, Josh; Hanson, John
2017-01-01
The National Aeronautics and Space Administration is currently developing the Space Launch System (SLS) to provide the United States with a capability to launch large payloads into low Earth orbit and deep space. One of the development tenets of the SLS Program is affordability. One initiative to enhance affordability is the SLS approach to requirements definition, verification and system certification. The key aspects of this initiative include: 1) Minimizing the number of requirements, 2) Elimination of explicit verification requirements, 3) Use of certified models of subsystem capability in lieu of requirements when appropriate and 4) Certification of capability beyond minimum required capability. Implementation of each aspect is described and compared to a "typical" systems engineering implementation, including a discussion of relative risk. Examples of each implementation within the SLS Program are provided.
Frontiers in the pathogenesis of Alzheimer’s disease
Sambamurti, Kumar; Jagannatha Rao, K. S.; Pappolla, Miguel A.
2009-01-01
Alzheimer’s disease (AD) is characterized by progressive dementia and brain deposits of the amyloid β protein (Aβ) as senile plaques and the microtubule-associated protein, Tau, as neurofibrillary tangles (NFT). The current treatment of AD is limited to drugs that attempt to correct deficits in the cholinergic pathway or glutamate toxicity. These drugs show some improvement over a short period of time but the disease ultimately requires treatment to prevent and stop the neurodegeneration that affects multiple pathways. The currently favored hypothesis is that Aβ aggregates to toxic forms that induce neurodegeneration. Drugs that reduce Aβ successfully treat transgenic mouse models of AD, but the most promising anti-Aβ vaccination approach did not successfully treat AD in a clinical trial. These studies suggest that AD pathogenesis is a complex phenomenon and requires a more broad-based approach to identify mechanisms of neurodegeneration. Multiple hypotheses have been proposed and the field is ready for a new generation of ideas to develop early diagnostic approaches and develop successful treatment plans. PMID:21416019
Patient Accounting Systems: Are They Fit with the Users' Requirements?
Ayatollahi, Haleh; Nazemi, Zahra; Haghani, Hamid
2016-01-01
A patient accounting system is a subsystem of a hospital information system. This system, like other information systems, should be carefully designed to be able to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs or not. This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. The mean value of ≥3 showed the importance of each data element and the capability of the system. Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. The current patient accounting systems need to be improved to be able to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information.
Hierarchical Approach to 'Atomistic' 3-D MOSFET Simulation
NASA Technical Reports Server (NTRS)
Asenov, Asen; Brown, Andrew R.; Davies, John H.; Saini, Subhash
1999-01-01
We present a hierarchical approach to the 'atomistic' simulation of aggressively scaled sub-0.1 micron MOSFET's. These devices are so small that their characteristics depend on the precise location of dopant atoms within them, not just on their average density. A full-scale three-dimensional drift-diffusion atomistic simulation approach is first described and used to verify more economical, but restricted, options. To reduce processor time and memory requirements at high drain voltage, we have developed a self-consistent option based on a solution of the current continuity equation restricted to a thin slab of the channel. This is coupled to the solution of the Poisson equation in the whole simulation domain in the Gummel iteration cycles. The accuracy of this approach is investigated in comparison to the full self-consistent solution. At low drain voltage, a single solution of the nonlinear Poisson equation is sufficient to extract the current with satisfactory accuracy. In this case, the current is calculated by solving the current continuity equation in a drift approximation only, also in a thin slab containing the MOSFET channel. The regions of applicability for the different components of this hierarchical approach are illustrated in example simulations covering the random dopant-induced threshold voltage fluctuations, threshold voltage lowering, threshold voltage asymmetry, and drain current fluctuations.
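The decoupled solution strategy mentioned above (Gummel iteration cycles alternating a Poisson solve with a current-continuity solve until self-consistency) can be sketched schematically. The toy "solvers" below are linear placeholders for illustration only, not the CSM Testbed device physics.

```python
import numpy as np

def solve_poisson(charge):
    # Placeholder Poisson solve: potential responds linearly to charge.
    return 0.5 * charge

def solve_continuity(potential):
    # Placeholder continuity solve: carrier density follows the potential.
    return 1.0 + 0.3 * potential

def gummel(tol=1e-10, max_iter=100):
    """Alternate the two solves until the potential is self-consistent."""
    charge = np.ones(5)                     # initial guess on a tiny 1-D grid
    potential = solve_poisson(charge)
    for it in range(1, max_iter + 1):
        charge = solve_continuity(potential)
        new_potential = solve_poisson(charge)
        if np.max(np.abs(new_potential - potential)) < tol:
            return new_potential, it
        potential = new_potential
    raise RuntimeError("Gummel loop did not converge")

potential, iterations = gummel()            # converges to 0.5 / 0.85 everywhere
```

The skeleton shows only the iteration structure: in the real simulation each placeholder is a full nonlinear PDE solve, with the continuity equation restricted to a thin slab of the channel and the Poisson equation solved over the whole domain.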
Crash Certification by Analysis - Are We There Yet?
NASA Technical Reports Server (NTRS)
Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.
2006-01-01
This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."
A new approach for instrument software at Gemini
NASA Astrophysics Data System (ADS)
Gillies, Kim; Nunez, Arturo; Dunn, Jennifer
2008-07-01
Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly, requiring large, distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has taken its experience from the previous generation of instruments and current hardware and software technology to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and our own operations needs. This paper describes this new software approach that couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, which is currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.
Synthesis: Intertwining product and process
NASA Technical Reports Server (NTRS)
Weiss, David M.
1990-01-01
Synthesis is a proposed systematic process for rapidly creating different members of a program family. Family members are described by variations in their requirements. Requirements variations are mapped to variations on a standard design to generate production quality code and documentation. The approach is made feasible by using principles underlying design for change. Synthesis incorporates ideas from rapid prototyping, application generators, and domain analysis. The goals of Synthesis and the Synthesis process are discussed. The technology needed and the feasibility of the approach are also briefly discussed. The status of current efforts to implement Synthesis methodologies is presented.
A Pocock Approach to Sequential Meta-Analysis of Clinical Trials
ERIC Educational Resources Information Center
Shuster, Jonathan J.; Neu, Josef
2013-01-01
Three recent papers have provided sequential methods for meta-analysis of two-treatment randomized clinical trials. This paper provides an alternate approach that has three desirable features. First, when carried out prospectively (i.e., we only have the results up to the time of our current analysis), we do not require knowledge of the…
NASA Technical Reports Server (NTRS)
Meakin, Robert L.
1995-01-01
Grid related issues of the Chimera overset grid method are discussed in the context of a method of solution and analysis of unsteady three-dimensional viscous flows. The state of maturity of the various pieces of support software required to use the approach is considered. Current limitations of the approach are identified.
Enhancing the Graduate Information Systems Curriculum: A Career Skills Oriented Approach
ERIC Educational Resources Information Center
Khoo, Benjamin; Harris, Peter
2009-01-01
The Information Systems (IS) curriculum needs to be updated frequently due to the rapid rate of advances in information systems (IS) and the technologies that drive IS, and also industry's skill requirement of IS graduates. This paper describes a Career Skills Oriented Approach to enhance the graduate IS curriculum based on current information…
Morishige, Ken-ichi; Yoshioka, Taku; Kawawaki, Dai; Hiroe, Nobuo; Sato, Masa-aki; Kawato, Mitsuo
2014-11-01
One of the major obstacles in estimating cortical currents from MEG signals is the disturbance caused by magnetic artifacts derived from extra-cortical current sources such as heartbeats and eye movements. To remove the effect of such extra-brain sources, we improved the hybrid hierarchical variational Bayesian method (hyVBED) proposed by Fujiwara et al. (NeuroImage, 2009). hyVBED simultaneously estimates cortical and extra-brain source currents by placing dipoles on cortical surfaces as well as extra-brain sources. This method requires EOG data for an EOG forward model that describes the relationship between eye dipoles and electric potentials. In contrast, our improved approach requires no EOG and less a priori knowledge about the current variance of extra-brain sources. We propose a new method, "extra-dipole," that optimally selects hyper-parameter values regarding current variances of the cortical surface and extra-brain source dipoles. With the selected parameter values, the cortical and extra-brain dipole currents were accurately estimated from the simulated MEG data. The performance of this method was demonstrated to be better than conventional approaches, such as principal component analysis and independent component analysis, which use only statistical properties of MEG signals. Furthermore, we applied our proposed method to measured MEG data during covert pursuit of a smoothly moving target and confirmed its effectiveness. Copyright © 2014 Elsevier Inc. All rights reserved.
Franco, Antonio; Price, Oliver R; Marshall, Stuart; Jolliet, Olivier; Van den Brink, Paul J; Rico, Andreu; Focks, Andreas; De Laender, Frederik; Ashauer, Roman
2017-03-01
Current regulatory practice for chemical risk assessment suffers from the lack of realism in conventional frameworks. Despite significant advances in exposure and ecological effect modeling, the implementation of novel approaches as high-tier options for prospective regulatory risk assessment remains limited, particularly among general chemicals such as down-the-drain ingredients. While reviewing the current state of the art in environmental exposure and ecological effect modeling, we propose a scenario-based framework that enables a better integration of exposure and effect assessments in a tiered approach. Global- to catchment-scale spatially explicit exposure models can be used to identify areas of higher exposure and to generate ecologically relevant exposure information for input into effect models. Numerous examples of mechanistic ecological effect models demonstrate that it is technically feasible to extrapolate from individual-level effects to effects at higher levels of biological organization and from laboratory to environmental conditions. However, the data required to parameterize effect models that can embrace the complexity of ecosystems are large and require a targeted approach. Experimental efforts should, therefore, focus on vulnerable species and/or traits and ecological conditions of relevance. We outline key research needs to address the challenges that currently hinder the practical application of advanced model-based approaches to risk assessment of down-the-drain chemicals. Integr Environ Assess Manag 2017;13:233-248. © 2016 SETAC.
An experimental study of nonlinear dynamic system identification
NASA Technical Reports Server (NTRS)
Stry, Greselda I.; Mook, D. Joseph
1990-01-01
A technique for robust identification of nonlinear dynamic systems is developed and illustrated using both simulations and analog experiments. The technique is based on the Minimum Model Error optimal estimation approach. A detailed literature review is included in which fundamental differences between the current approach and previous work are described. The most significant feature of the current work is the ability to identify nonlinear dynamic systems without prior assumptions regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches which usually require detailed assumptions of the nonlinearities. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.
Hemostatic strategies for traumatic and surgical bleeding
Behrens, Adam M.; Sikorski, Michael J.; Kofinas, Peter
2017-01-01
Wide interest in new hemostatic approaches has stemmed from unmet needs in the hospital and on the battlefield. Many current commercial hemostatic agents fail to fulfill the design requirements of safety, efficacy, cost, and storage. Academic focus has led to the improvement of existing strategies as well as new developments. This review will identify and discuss the three major classes of hemostatic approaches: biologically derived materials, synthetically derived materials, and intravenously administered hemostatic agents. The general class is first discussed, then specific approaches discussed in detail, including the hemostatic mechanisms and the advancement of the method. As hemostatic strategies evolve and synthetic-biologic interactions are more fully understood, current clinical methodologies will be replaced. PMID:24307256
A climate-compatible approach to development practice by international humanitarian NGOs.
Clarke, Matthew; de Cruz, Ian
2015-01-01
If current climate-change predictions prove accurate, non-linear change, including potentially catastrophic change, is possible and the environments in which international humanitarian NGOs operate will change figuratively and literally. This paper proposes that a new approach to development is required that takes changing climate into account. This 'climate-compatible approach' to development is a bleak shift from some of the current orthodox positions and will be a major challenge to international humanitarian NGOs working with the most vulnerable. However, it is necessary to address the challenges and context such NGOs face, and the need to be resilient and adaptive to these changes. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.
Patient Safety Incident Reporting: Current Trends and Gaps Within the Canadian Health System.
Boucaud, Sarah; Dorschner, Danielle
2016-01-01
Patient safety incidents are a national-level phenomenon, requiring a pan-Canadian approach to ensure that incidents are reported and lessons are learned and broadly disseminated. This work explores the variation in current provincial and local approaches to reporting through a literature review. Trends are consolidated and recommendations are offered to foster better alignment of existing systems. These include adopting a common terminology, defining the patient role in reporting, increasing system users' perception of safety and further investigating the areas of home and community care in ensuring standard approaches at the local level. These steps can promote alignment, reducing barriers to a future pan-Canadian reporting and learning system.
Strategies and Innovative Approaches for the Future of Space Weather Forecasting
NASA Astrophysics Data System (ADS)
Hoeksema, J. T.
2012-12-01
The real and potential impacts of space weather have been well documented, yet neither the required research and operations programs, nor the data, modeling and analysis infrastructure necessary to develop and sustain a reliable space weather forecasting capability for society, are in place. The recently published decadal survey "Solar and Space Physics: A Science for a Technological Society" presents a vision for the coming decade and calls for a renewed national commitment to a comprehensive program in space weather and climatology. New resources are imperative. Particularly in the current fiscal environment, implementing a responsible strategy to address these needs will require broad participation across agencies and innovative approaches to make the most of existing resources, capitalize on current knowledge, span gaps in capabilities and observations, and focus resources on overcoming immediate roadblocks.
Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha; Spacek, Martin; Stehlikova, Olga; Gambell, Peter; McIver-Brown, Neil; Villamor, Neus; Psarra, Katherina; Arroz, Maria; Milani, Raffaella; de la Serna, Javier; Cedena, M Teresa; Jaksic, Ozren; Nomdedeu, Josep; Moreno, Carol; Rigolin, Gian Matteo; Cuneo, Antonio; Johansen, Preben; Johnsen, Hans E; Rosenquist, Richard; Niemann, Carsten Utoft; Kern, Wolfgang; Westerman, David; Trneny, Marek; Mulligan, Stephen; Doubek, Michael; Pospisilova, Sarka; Hillmen, Peter; Oscier, David; Hallek, Michael; Ghia, Paolo; Montserrat, Emili
2018-01-01
The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the diagnosis of CLL. ERIC/ESCCA members classified 14 of 35 potential markers as "required" or "recommended" for CLL diagnosis, consensus being defined as >75% and >50% agreement, respectively. An approach to validate "required" markers using normal peripheral blood was developed. Responses were received from 150 participants with a diagnostic workload >20 CLL cases per week in 23/150 (15%), 5-20 in 82/150 (55%), and <5 cases per week in 45/150 (30%). The consensus for "required" diagnostic markers included: CD19, CD5, CD20, CD23, Kappa, and Lambda. "Recommended" markers potentially useful for differential diagnosis were: CD43, CD79b, CD81, CD200, CD10, and ROR1. Reproducible criteria for component reagents were assessed retrospectively in 14,643 cases from 13 different centers and showed >97% concordance with current approaches. A pilot study to validate staining quality was completed in 11 centers. Markers considered as "required" for the diagnosis of CLL by the participants in this study (CD19, CD5, CD20, CD23, Kappa, and Lambda) are consistent with current diagnostic criteria and practice. Importantly, a reproducible approach to validate and apply these markers in individual laboratories has been identified. Finally, a consensus "recommended" panel of markers to refine diagnosis in borderline cases (CD43, CD79b, CD81, CD200, CD10, and ROR1) has been defined and will be prospectively evaluated. © 2017 International Clinical Cytometry Society. © 2017 The Authors. Cytometry Part B: Clinical Cytometry published by Wiley Periodicals, Inc. on behalf of International Clinical Cytometry Society.
Identification of linearised RMS-voltage dip patterns based on clustering in renewable plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
García-Sánchez, Tania; Gómez-Lázaro, Emilio; Muljadi, Edward
Generation units connected to the grid are currently required to meet low-voltage ride-through (LVRT) requirements. In most developed countries, these requirements also apply to renewable sources, mainly wind power plants and photovoltaic installations connected to the grid. This study proposes an alternative characterisation solution to classify and visualise a large number of collected events in light of current limits and requirements. The authors' approach is based on linearised root-mean-square (RMS) voltage trajectories, taking into account LVRT requirements, and a clustering process to identify the most likely pattern trajectories. The proposed solution gives extensive information on an event's severity by providing a simple but complete visualisation of the linearised RMS-voltage patterns. In addition, these patterns are compared to current LVRT requirements to determine similarities or discrepancies. A large number of collected events can then be automatically classified and visualised for comparative purposes. Real disturbances collected from renewable sources in Spain are used to assess the proposed solution. Extensive results and discussions are also included in this study.
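The clustering step described above can be sketched as k-means over per-event features extracted from the linearised RMS-voltage trajectories. The two-feature parameterisation (residual voltage, dip duration), the synthetic events, and the deterministic seeding below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic event families: (residual voltage in p.u., dip duration in s).
shallow = rng.normal([0.85, 0.10], 0.02, size=(20, 2))   # mild, short dips
deep = rng.normal([0.30, 0.50], 0.02, size=(20, 2))      # severe, long dips
events = np.vstack([shallow, deep])

def kmeans(x, k, iters=50):
    # Deterministic seeding: centroids taken from points spread across the set.
    centroids = x[np.linspace(0, len(x) - 1, k).astype(int)].copy()
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        dists = ((x[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)                # nearest centroid
        centroids = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(events, k=2)
```

Each resulting centroid plays the role of a "pattern trajectory" that can then be compared against the LVRT limit curve to judge event severity.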
China’s Comprehensive Approach: Refining the U.S. Targeting Process to Inform U.S. Strategy
2018-04-20
control demonstrated by China, the subject matter expertise required to generate a comprehensive approach like China's does exist. However, due to a vast... NATIONAL DEFENSE UNIVERSITY, JOINT FORCES STAFF COLLEGE, JOINT ADVANCED WARFIGHTING SCHOOL: CHINA'S COMPREHENSIVE APPROACH
Design Tools for Cost-Effective Implementation of Planetary Protection Requirements
NASA Technical Reports Server (NTRS)
Hamlin, Louise; Belz, Andrea; Evans, Michael; Kastner, Jason; Satter, Celeste; Spry, Andy
2006-01-01
Since the Viking missions to Mars in the 1970s, accounting for the costs associated with planetary protection implementation has not been done systematically during early project formulation phases, leading to unanticipated costs during subsequent implementation phases of flight projects. The simultaneous development of more stringent planetary protection requirements, resulting from new knowledge about the limits of life on Earth, together with current plans to conduct life-detection experiments on a number of different solar system target bodies motivates a systematic approach to integrating planetary protection requirements and mission design. A current development effort at NASA's Jet Propulsion Laboratory is aimed at integrating planetary protection requirements more fully into the early phases of mission architecture formulation and at developing tools to more rigorously predict associated cost and schedule impacts of architecture options chosen to meet planetary protection requirements.
NASA Technical Reports Server (NTRS)
Harrison Fleming, Cody; Spencer, Melissa; Leveson, Nancy; Wilkinson, Chris
2012-01-01
The generation of minimum operational, safety, performance, and interoperability requirements is an important aspect of safely integrating new NextGen components into the Communication Navigation Surveillance and Air Traffic Management (CNS/ATM) system. These requirements are used as part of the implementation and approval processes. In addition, they provide guidance to determine the levels of design assurance and performance that are needed for each element of the new NextGen procedures, including aircraft, operator, and Air Navigation Service Provider. Using the enhanced Airborne Traffic Situational Awareness In-Trail Procedure (ATSA-ITP) as an example, this report describes some limitations of the current process used for generating safety requirements and levels of required design assurance. An alternative process is described, as well as the argument for why the alternative can generate more comprehensive requirements and greater safety assurance than the current approach.
Personalized medicine and chronic obstructive pulmonary disease.
Wouters, E F M; Wouters, B B R A F; Augustin, I M L; Franssen, F M E
2017-05-01
The current review summarizes ongoing developments in personalized medicine and precision medicine in chronic obstructive pulmonary disease (COPD). Our current approach is far from personalized management algorithms, as current recommendations for COPD are largely based on a reductionist disease description, operationally defined by results of spirometry. Besides precision medicine developments, a personalized medicine approach in COPD is described based on a holistic approach to the patient, considering illness as the consequence of dynamic interactions within and between multiple interacting and self-adjusting systems. Pulmonary rehabilitation is described as a model of personalized medicine. Largely based on current understanding of inflammatory processes in COPD, targeted interventions in COPD are reviewed. Augmentation therapy for α-1-antitrypsin deficiency is described as a model of precision medicine in COPD based on a profound understanding of the related genetic endotype. Future developments of precision medicine in COPD require identification of relevant endotypes combined with proper identification of phenotypes involved in the complex and heterogeneous manifestations of COPD.
Toward an electrical power utility for space exploration
NASA Technical Reports Server (NTRS)
Bercaw, Robert W.
1989-01-01
Plans for space exploration depend on today's technology programs addressing the novel requirements of space-based enterprise. The requirements for electrical power will be formidable: megawatts in magnitude, reliability for multi-year missions and the flexibility to adapt to needs unanticipated at design time. The reasons for considering the power management and distribution in the various systems from a total mission perspective, rather than simply extrapolating current spacecraft design practice, are discussed. A utility approach to electric power being developed at the Lewis Research Center is described. It integrates requirements from a broad selection of current development programs with studies in which both space and terrestrial technologies are conceptually applied to exploration mission scenarios.
Tomblin Murphy, Gail; Birch, Stephen; MacKenzie, Adrian; Rigby, Janet
2016-12-12
As part of efforts to inform the development of a global human resources for health (HRH) strategy, a comprehensive methodology for estimating HRH supply and requirements was described in a companion paper. The purpose of this paper is to demonstrate the application of that methodology, using data publicly available online, to simulate the supply of and requirements for midwives, nurses, and physicians in the 32 high-income member countries of the Organisation for Economic Co-operation and Development (OECD) up to 2030. A model combining a stock-and-flow approach to simulate the future supply of each profession in each country, adjusted according to levels of HRH participation and activity, and a needs-based approach to simulate future HRH requirements was used. Most of the data to populate the model were obtained from the OECD's online indicator database. Other data were obtained from targeted internet searches and documents gathered as part of the companion paper. Relevant recent measures for each model parameter were found for at least one of the included countries. In total, 35% of the desired current data elements were found; assumed values were used for the other current data elements. Multiple scenarios were used to demonstrate the sensitivity of the simulations to different assumed future values of model parameters. Depending on the assumed future values of each model parameter, the simulated HRH gaps across the included countries could range from shortfalls of 74 000 midwives, 3.2 million nurses, and 1.2 million physicians to surpluses of 67 000 midwives, 2.9 million nurses, and 1.0 million physicians by 2030. Despite important gaps in the data publicly available online and the short time available to implement it, this paper demonstrates the basic feasibility of a more comprehensive, population needs-based approach to estimating HRH supply and requirements than most of those currently being used.
HRH planners in individual countries, working with their respective stakeholder groups, would have more direct access to data on the relevant planning parameters and would thus be in an even better position to implement such an approach.
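A stock-and-flow supply projection compared against a needs-based requirement, in the spirit of the methodology described above, can be sketched as follows. Every rate and starting stock below is invented for illustration.

```python
def project(stock, entrants, exit_rate, participation, activity,
            population, need_per_capita, productivity, years):
    """Yearly (effective_supply, requirement) pairs from a stock-and-flow model."""
    results = []
    for _ in range(years):
        stock = stock + entrants - exit_rate * stock      # inflows minus outflows
        supply = stock * participation * activity         # effective supply
        requirement = population * need_per_capita / productivity
        results.append((supply, requirement))
    return results

# Hypothetical parameters for one profession in one country.
sim = project(stock=10_000, entrants=500, exit_rate=0.04,
              participation=0.9, activity=1.0,
              population=5_000_000, need_per_capita=0.002,
              productivity=1.0, years=15)
final_supply, final_requirement = sim[-1]
gap = final_requirement - final_supply                    # >0 means shortfall
```

Scenario analysis of the kind the paper reports amounts to re-running such a projection while varying the assumed future values of each parameter.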
Revolutionizing Toxicity Testing For Predicting Developmental Outcomes (DNT4)
Characterizing risk from environmental chemical exposure currently requires extensive animal testing; however, alternative approaches are being researched to increase throughput of chemicals screened, decrease reliance on animal testing, and improve accuracy in predicting adverse...
Current-State Constrained Filter Bank for Wald Testing of Spacecraft Conjunctions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2012-01-01
We propose a filter bank consisting of an ordinary current-state extended Kalman filter, and two similar but constrained filters: one is constrained by a null hypothesis that the miss distance between two conjuncting spacecraft is inside their combined hard body radius at the predicted time of closest approach, and one is constrained by an alternative complementary hypothesis. The unconstrained filter is the basis of an initial screening for close approaches of interest. Once the initial screening detects a possibly risky conjunction, the unconstrained filter also governs measurement editing for all three filters, and predicts the time of closest approach. The constrained filters operate only when conjunctions of interest occur. The computed likelihoods of the innovations of the two constrained filters form a ratio for a Wald sequential probability ratio test. The Wald test guides risk mitigation maneuver decisions based on explicit false alarm and missed detection criteria. Since only current-state Kalman filtering is required to compute the innovations for the likelihood ratio, the present approach does not require the mapping of probability density forward to the time of closest approach. Instead, the hard-body constraint manifold is mapped to the filter update time by applying a sigma-point transformation to a projection function. Although many projectors are available, we choose one based on Lambert-style differential correction of the current-state velocity. We have tested our method using a scenario based on the Magnetospheric Multi-Scale mission, scheduled for launch in late 2014. This mission involves formation flight in highly elliptical orbits of four spinning spacecraft equipped with antennas extending 120 meters tip-to-tip. Eccentricities range from 0.82 to 0.91, and close approaches generally occur in the vicinity of perigee, where rapid changes in geometry may occur. 
Testing the method using two 12,000-case Monte Carlo simulations, we found the method achieved a missed detection rate of 0.1%, and a false alarm rate of 2%.
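The sequential decision logic described above can be sketched with a generic Wald SPRT; this is a minimal illustration, not the authors' implementation. The per-update log-likelihood ratios are assumed to come from the two constrained filters, and the hypothesis labels (here H1 for a risky conjunction, H0 for a safe miss) and thresholds follow Wald's standard approximations for the stated false-alarm and missed-detection rates.

```python
import math

def wald_sprt(llr_stream, alpha=0.02, beta=0.001):
    """Generic Wald sequential probability ratio test.

    llr_stream: iterable of per-update innovation log-likelihood ratios,
        log p(innovation | H1) - log p(innovation | H0).
    alpha: allowed false-alarm rate; beta: allowed missed-detection rate.
    Returns "H1", "H0", or "undecided" if the stream ends first.
    """
    upper = math.log((1 - beta) / alpha)   # crossing accepts H1
    lower = math.log(beta / (1 - alpha))   # crossing accepts H0
    s = 0.0
    for llr in llr_stream:
        s += llr                           # accumulate evidence
        if s >= upper:
            return "H1"
        if s <= lower:
            return "H0"
    return "undecided"
```

With the 2% false-alarm and 0.1% missed-detection criteria quoted above, the boundaries are roughly +3.9 and -6.9 in cumulative log-likelihood.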
1992-02-01
Purchased from: National Technical Information Service, 5285 Port Royal Road, Springfield VA 22161. Federal Government agencies and their contractors registered... Engineering Incorporated (IME) to organize and execute a technical approach to the... Subject terms: Mission Area Requirements, REST Escape System... the aerodynamic stabilization subsystems to become effective (drogue parachutes, or fins for the S4S), and the time required for the recovery parachute
Terrestrial Planet Finder Coronagraph and Enabling Technologies
NASA Technical Reports Server (NTRS)
Ford, Virginia G.
2005-01-01
Starlight suppression research is advancing rapidly to approach the required contrast ratio. The current analysis of the TPF Coronagraph system indicates that it is feasible to achieve the stability required by using developing technologies: a) Wave Front Sensing and Control (DMs, control algorithms, and sensing); b) Laser metrology. Still needed: a) Property data measured with great precision in the required environments; b) Modeling tools that are verified with testbeds.
Determination of eddy current response with magnetic measurements.
Jiang, Y Z; Tan, Y; Gao, Z; Nakamura, K; Liu, W B; Wang, S Z; Zhong, H; Wang, B B
2017-09-01
Accurate mutual inductances between magnetic diagnostics and poloidal field coils are an essential requirement for determining the poloidal flux for plasma equilibrium reconstruction. The mutual inductance calibration of the flux loops and magnetic probes requires time-varying coil currents, which also simultaneously drive eddy currents in electrically conducting structures. The eddy current-induced field appearing in the magnetic measurements can substantially increase the calibration error if the eddy currents are neglected in the model. In this paper, an expression for the magnetic diagnostic response to the coil currents is used to calibrate the mutual inductances, estimate the conductor time constant, and predict the eddy current response. It is found that the eddy current effects in magnetic signals can be well explained by the eddy current response determination. A set of experiments using a specially shaped saddle coil diagnostic is conducted to measure the SUNIST-like eddy current response and to examine the accuracy of this method. In shots that include plasmas, this approach can more accurately determine the plasma-related response in the magnetic signals by eliminating the field due to the eddy currents produced by the external field.
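A minimal sketch of the calibration idea, assuming a single dominant eddy current mode responding to dI/dt with time constant tau; the model form, function names, and the grid search over tau are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def eddy_basis(t, i_coil, tau):
    """Discretize tau*de/dt + e = dI/dt: first-order eddy response to the coil current."""
    dt = t[1] - t[0]
    didt = np.gradient(i_coil, dt)
    e = np.zeros_like(i_coil)
    for n in range(1, len(t)):
        # implicit Euler step of the first-order lag
        e[n] = (e[n - 1] + dt * didt[n] / tau) / (1.0 + dt / tau)
    return e

def calibrate(t, i_coil, signal, taus):
    """Fit signal ~ M*I + k*e_tau: scan candidate taus, linear least squares for (M, k)."""
    best = None
    for tau in taus:
        A = np.column_stack([i_coil, eddy_basis(t, i_coil, tau)])
        coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
        r = np.sum((A @ coef - signal) ** 2)       # residual for this tau
        if best is None or r < best[0]:
            best = (r, tau, coef)
    return best[1], best[2]                        # tau_hat, (M_hat, k_hat)
```

Because the fit is linear once tau is fixed, the mutual inductance M and eddy coupling k come from a single least-squares solve per candidate time constant.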
ERIC Educational Resources Information Center
Heslin, J. Alexander, Jr.
In senior-level undergraduate research courses in Computer Information Systems (CIS), students are required to read and assimilate a large volume of current research literature. One course objective is to demonstrate to the student that there are patterns or models or paradigms of research. A new approach in identifying research paradigms is…
Global flowfield about the V-22 Tiltrotor Aircraft
NASA Technical Reports Server (NTRS)
Meakin, Robert L.
1995-01-01
The Chimera overset grid method is reviewed and discussed in the context of a method of solution and analysis of unsteady three-dimensional viscous flows. The state of maturity of the various pieces of support software required to use the approach is discussed. A variety of recent applications of the method is presented. Current limitations of the approach are identified.
The Chimera Method of Simulation for Unsteady Three-Dimensional Viscous Flow
NASA Technical Reports Server (NTRS)
Meakin, Robert L.
1996-01-01
The Chimera overset grid method is reviewed and discussed in the context of a method of solution and analysis of unsteady three-dimensional viscous flows. The state of maturity of the various pieces of support software required to use the approach is discussed. A variety of recent applications of the method is presented. Current limitations of the approach are defined.
Current opinion in Alzheimer's disease therapy by nanotechnology-based approaches.
Ansari, Shakeel Ahmed; Satar, Rukhsana; Perveen, Asma; Ashraf, Ghulam Md
2017-03-01
Nanotechnology typically deals with the measuring and modeling of matter at the nanometer scale by incorporating the fields of engineering and technology. The most prominent feature of these engineered materials is that they can be manipulated or modified to impart new functional properties. The current review covers the most recent findings in Alzheimer's disease (AD) therapeutics based on nanoscience and technology. Current studies involve the application of nanotechnology in developing novel diagnostic and therapeutic tools for neurological disorders. Nanotechnology-based approaches can be exploited to limit or reverse these diseases and to promote functional regeneration of damaged neurons. These strategies offer neuroprotection by facilitating the delivery of drugs and small molecules more effectively across the blood-brain barrier. Nanotechnology-based approaches show promise in improving AD therapeutics. Further work on the synthesis and surface modification of nanoparticles, longer-term clinical trials, and efforts to increase their impact in treating AD are required.
Environmental release of living modified organisms: current approaches and case studies.
Nickson, Thomas E
2005-01-01
Agricultural biotechnology is being rapidly adopted, as evidenced by the acreage of genetically modified (GM) crops planted and tonnes of product (grain and fiber) harvested. Concurrent with this technological progress is a growing concern that the world's biological diversity is coming under increasing threat from human activities. As such, ecological risk assessment approaches are being developed for GM crop plants as international agreements regulating the transboundary movements of these products are being implemented. This paper reviews the ecological risk assessment approach that has been used to date to approve GM crops. The process has been case-by-case, using a comparative, science-based approach balancing the potential risks and benefits of the new technology versus those present with currently accepted practices. The approach used to evaluate and approve these products is consistent with the conditions and requirements outlined in the Cartagena Protocol.
Advanced General Aviation Turbine Engine (GATE) study
NASA Technical Reports Server (NTRS)
Smith, R.; Benstein, E. H.
1979-01-01
The small engine technology requirements suitable for general aviation service in the 1987 to 1988 time frame were defined. The market analysis showed potential United States engine sales of 31,500 per year, provided that the turbine engine sales price approaches current reciprocating engine prices. An optimum engine design was prepared for four categories of fixed wing aircraft and for rotary wing applications. A common core approach was derived from the optimum engines that maximizes engine commonality over the power spectrum, with a projected price competitive with reciprocating piston engines. The advanced technology features reduced engine cost by approximately 50 percent compared with current technology.
Current state and future prospects of immunotherapy for glioma.
Kamran, Neha; Alghamri, Mahmoud S; Nunez, Felipe J; Shah, Diana; Asad, Antonela S; Candolfi, Marianela; Altshuler, David; Lowenstein, Pedro R; Castro, Maria G
2018-02-01
There is a large unmet need for effective therapeutic approaches for glioma, the most malignant brain tumor. Clinical and preclinical studies have enormously expanded our knowledge about the molecular aspects of this deadly disease and its interaction with the host immune system. In this review we highlight the wide array of immunotherapeutic interventions that are currently being tested in glioma patients. Given the molecular heterogeneity, tumor immunoediting and the profound immunosuppression that characterize glioma, it has become clear that combinatorial approaches targeting multiple pathways tailored to the genetic signature of the tumor will be required in order to achieve optimal therapeutic efficacy.
NASA Technical Reports Server (NTRS)
Rocco, David A.
1994-01-01
Redefining the approach and philosophy that operations management uses to define, develop, and implement space missions will be a central element in achieving high efficiency mission operations for the future. The goal of a cost-effective space operations program cannot be realized if the attitudes and methodologies we currently employ to plan, develop, and manage space missions do not change. A management philosophy that is in sync with the environment in terms of budget, technology, and science objectives must be developed. Changing our basic perception of mission operations will require a shift in the way we view the mission. This requires a transition from current practices of viewing the mission as a unique end product to a 'mission development concept' built on the visualization of the end-to-end mission. To achieve this change we must define realistic mission success criteria and develop pragmatic approaches to achieve our goals. Custom mission development for all but the largest and most unique programs is not practical in the current budget environment, and we simply do not have the resources to implement all of our planned science programs. We need to shift our management focus to allow us the opportunity to make use of methodologies and approaches which are based on common building blocks that can be utilized in the space, ground, and mission-unique segments of all missions.
Mechanistic materials modeling for nuclear fuel performance
Tonks, Michael R.; Andersson, David; Phillpot, Simon R.; ...
2017-03-15
Fuel performance codes are critical tools for the design, certification, and safety analysis of nuclear reactors. However, their ability to predict fuel behavior under abnormal conditions is severely limited by their considerable reliance on empirical materials models correlated to burn-up (a measure of the number of fission events that have occurred, but not a unique measure of the history of the material). In this paper, we propose a different paradigm for fuel performance codes to employ mechanistic materials models that are based on the current state of the evolving microstructure rather than burn-up. In this approach, a series of state variables are stored at material points and define the current state of the microstructure. The evolution of these state variables is defined by mechanistic models that are functions of fuel conditions and other state variables. The material properties of the fuel and cladding are determined from microstructure/property relationships that are functions of the state variables and the current fuel conditions. Multiscale modeling and simulation is being used in conjunction with experimental data to inform the development of these models. Finally, this mechanistic, microstructure-based approach has the potential to provide a more predictive fuel performance capability, but will require a team of researchers to complete the required development and to validate the approach.
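The state-variable paradigm described in this abstract can be illustrated with a toy sketch; the state variables (porosity, grain size), rate equations, and property correlation below are hypothetical placeholders, not models from the paper.

```python
from dataclasses import dataclass

@dataclass
class MaterialPoint:
    """Illustrative microstructure state at one material point (names hypothetical)."""
    porosity: float = 0.0     # volume fraction of pores
    grain_size: float = 10.0  # microns

    def evolve(self, fission_rate, temperature, dt):
        # Mechanistic-style rate equations: rates depend on local conditions
        # and the current state, not on accumulated burn-up.
        self.porosity += 1e-4 * fission_rate * dt * (1.0 - self.porosity)
        self.grain_size += 1e-3 * (temperature / 1000.0) * dt

    def thermal_conductivity(self, k_dense=5.0):
        # Microstructure/property relationship: porosity degrades conductivity.
        return k_dense * (1.0 - self.porosity) / (1.0 + 2.0 * self.porosity)
```

The key point of the paradigm is visible in the structure: properties are queried from the current state variables, and the evolution functions could be replaced by models informed by multiscale simulation without changing the code interface.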
Kratchman, Louis B.; Schurzig, Daniel; McRackan, Theodore R.; Balachandran, Ramya; Noble, Jack H.; Webster, Robert J.; Labadie, Robert F.
2014-01-01
The current technique for cochlear implantation (CI) surgery requires a mastoidectomy to gain access to the cochlea for electrode array insertion. It has been shown that microstereotactic frames can enable an image-guided, minimally invasive approach to CI surgery called percutaneous cochlear implantation (PCI) that uses a single drill hole for electrode array insertion, avoiding a more invasive mastoidectomy. Current clinical methods for electrode array insertion are not compatible with PCI surgery because they require a mastoidectomy to access the cochlea; thus, we have developed a manually operated electrode array insertion tool that can be deployed through a PCI drill hole. The tool can be adjusted using a preoperative CT scan for accurate execution of the advance off-stylet (AOS) insertion technique and requires less skill to operate than is currently required to implant electrode arrays. We performed three cadaver insertion experiments using the AOS technique and determined that all insertions were successful using CT and microdissection. PMID:22851233
A Node Linkage Approach for Sequential Pattern Mining
Navarro, Osvaldo; Cumplido, René; Villaseñor-Pineda, Luis; Feregrino-Uribe, Claudia; Carrasco-Ochoa, Jesús Ariel
2014-01-01
Sequential Pattern Mining is a widely addressed problem in data mining, with applications such as analyzing Web usage, examining purchase behavior, and text mining, among others. Nevertheless, with the dramatic increase in data volume, the current approaches prove inefficient when dealing with large input datasets, a large number of different symbols and low minimum supports. In this paper, we propose a new sequential pattern mining algorithm, which follows a pattern-growth scheme to discover sequential patterns. Unlike most pattern growth algorithms, our approach does not build a data structure to represent the input dataset, but instead accesses the required sequences through pseudo-projection databases, achieving better runtime and reducing memory requirements. Our algorithm traverses the search space in a depth-first fashion and only preserves in memory a pattern node linkage and the pseudo-projections required for the branch being explored at the time. Experimental results show that our new approach, the Node Linkage Depth-First Traversal algorithm (NLDFT), has better performance and scalability in comparison with state of the art algorithms. PMID:24933123
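The pseudo-projection idea can be illustrated with a minimal PrefixSpan-style pattern-growth sketch; this is a generic depth-first implementation for single-item sequence elements, not the NLDFT algorithm itself.

```python
def prefixspan(db, minsup):
    """Minimal PrefixSpan-style pattern growth.

    db: list of sequences (lists of hashable items); minsup: absolute support.
    A pseudo-projection is a list of (sequence_id, start_offset) pairs, so the
    input sequences are never copied.
    Returns {pattern_tuple: support}.
    """
    results = {}

    def grow(prefix, proj):
        # Count each item once per projected sequence suffix.
        counts = {}
        for sid, start in proj:
            seen = set()
            for item in db[sid][start:]:
                if item not in seen:
                    seen.add(item)
                    counts[item] = counts.get(item, 0) + 1
        for item, sup in counts.items():
            if sup < minsup:
                continue
            pattern = prefix + (item,)
            results[pattern] = sup
            # New pseudo-projection: position just after the first occurrence.
            new_proj = []
            for sid, start in proj:
                seq = db[sid]
                for pos in range(start, len(seq)):
                    if seq[pos] == item:
                        new_proj.append((sid, pos + 1))
                        break
            grow(pattern, new_proj)

    grow((), [(i, 0) for i in range(len(db))])
    return results
```

As in the abstract, memory holds only the offsets for the branch currently being explored, which is what keeps the memory requirements low relative to structure-building approaches.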
Industrial WSN Based on IR-UWB and a Low-Latency MAC Protocol
NASA Astrophysics Data System (ADS)
Reinhold, Rafael; Underberg, Lisa; Wulf, Armin; Kays, Ruediger
2016-07-01
Wireless sensor networks for industrial communication require high reliability and low latency. As current wireless sensor networks do not entirely meet these requirements, novel system approaches need to be developed. Since ultra wideband communication systems seem to be a promising approach, this paper evaluates the performance of the IEEE 802.15.4 impulse-radio ultra-wideband physical layer and the IEEE 802.15.4 Low Latency Deterministic Network (LLDN) MAC for industrial applications. Novel approaches and system adaptations are proposed to meet the application requirements. In this regard, a synchronization approach based on circular average magnitude difference functions (CAMDF) and on a clean template (CT) is presented for the correlation receiver. An adapted MAC protocol titled aggregated low latency (ALL) MAC is proposed to significantly reduce the resulting latency. Based on the system proposals, a hardware prototype has been developed, which proves the feasibility of the system and demonstrates the real-time performance of the MAC protocol.
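The CAMDF metric mentioned in this abstract can be sketched as follows, assuming a known real-valued template and a circularly shifted received frame; this is a simplified illustration of the metric, not the paper's receiver.

```python
import numpy as np

def camdf(received, template):
    """Circular average magnitude difference function.

    For each circular lag k, averages |received[(n + k) % N] - template[n]|.
    The synchronization offset estimate is the lag minimizing the CAMDF.
    """
    n = len(template)
    out = np.empty(n)
    for k in range(n):
        out[k] = np.mean(np.abs(np.roll(received, -k) - template))
    return out

def sync_offset(received, template):
    """Return the circular lag at which the received frame best matches the template."""
    return int(np.argmin(camdf(received, template)))
```

Unlike a correlation peak search, the AMDF family looks for a minimum of the average magnitude difference, which avoids multiplications in hardware implementations.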
Realizing steady-state tokamak operation for fusion energy
NASA Astrophysics Data System (ADS)
Luce, T. C.
2011-03-01
Continuous operation of a tokamak for fusion energy has clear engineering advantages but requires conditions beyond those sufficient for a burning plasma. The fusion reactions and external sources must support both the pressure and the current equilibrium without inductive current drive, leading to demands on stability, confinement, current drive, and plasma-wall interactions that exceed those for pulsed tokamaks. These conditions have been met individually, and significant progress has been made in the past decade to realize scenarios where the required conditions are obtained simultaneously. Tokamaks are operated routinely without disruptions near pressure limits, as needed for steady-state operation. Fully noninductive sustainment with more than half of the current from intrinsic currents has been obtained for a resistive time with normalized pressure and confinement approaching those needed for steady-state conditions. One remaining challenge is handling the heat and particle fluxes expected in a steady-state tokamak without compromising the core plasma performance.
Digital Repositories and the Question of Data Usefulness
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Downs, R. R.
2017-12-01
The advent of ISO standards for trustworthy long-term digital repositories provides both a set of principles to develop long-term data repositories and the instruments to assess them for trustworthiness. Such mandatory high-level requirements are broad enough to be achievable, to some extent, by many scientific data centers, archives, and other repositories. But the requirement that the data be useful in the future, the requirement that is usually considered to be most relevant to the value of the repository for its user communities, largely remains subject to various interpretations and misunderstanding. However, current and future users will be relying on repositories to preserve and disseminate the data and information needed to discover, understand, and utilize these resources to support their research, learning, and decision-making objectives. Therefore, further study is needed to determine the approaches that can be adopted by repositories to make data useful to future communities of users. This presentation will describe approaches for enabling scientific data and related information, such as software, to be useful for current and potential future user communities and will present the methodology chosen to make one science discipline's data useful for both current and future users. The method uses an ontology-based information model to define and capture the information necessary to make the data useful for contemporary and future users.
Oud, Emerentiana Veronica; de Vrieze, Nynke Hesselina Neeltje; de Meij, Arjan; de Vries, Henry John C
2014-06-01
Current lymphogranuloma venereum (LGV) guidelines mainly focus on anorectal infections. Inguinal LGV infections have been rare in the current epidemic among men who have sex with men (MSM), but might require a different approach not yet recommended in current guidelines for the treatment and diagnosis of LGV. We describe 4 inguinal LGV cases. Three MSM developed inguinal LGV infection several weeks after a previous consultation, of whom two had received azithromycin after being notified for LGV. Three failed the recommended 21-day doxycycline treatment. These inguinal LGV cases highlight 3 pitfalls in the current standard management of LGV: (1) Urethral chlamydia infections in MSM can be caused by LGV biovars that, in contrast to non-LGV biovars, require prolonged antibiotic therapy. (2) The recommended one-gram azithromycin contact treatment seems insufficient to prevent established infections. (3) Inguinal LGV may require prolonged courses of doxycycline, exceeding the currently advised 21-day regimen. Published by the BMJ Publishing Group Limited.
A risk-based classification scheme for genetically modified foods. II: Graded testing.
Chao, Eunice; Krewski, Daniel
2008-12-01
This paper presents a graded approach to the testing of crop-derived genetically modified (GM) foods based on concern levels in a proposed risk-based classification scheme (RBCS) and currently available testing methods. A graded approach offers the potential for more efficient use of testing resources by focusing less on lower concern GM foods, and more on higher concern foods. In this proposed approach to graded testing, products that are classified as Level I would have met baseline testing requirements that are comparable to what is widely applied to premarket assessment of GM foods at present. In most cases, Level I products would require no further testing, or very limited confirmatory analyses. For products classified as Level II or higher, additional testing would be required, depending on the type of the substance, prior dietary history, estimated exposure level, prior knowledge of toxicity of the substance, and the nature of the concern related to unintended changes in the modified food. Level III testing applies only to the assessment of toxic and antinutritional effects from intended changes and is tailored to the nature of the substance in question. Since appropriate test methods are not currently available for all effects of concern, future research to strengthen the testing of GM foods is discussed.
Hain, Christopher R; Anderson, Martha C
2017-10-16
Observations of land surface temperature (LST) are crucial for the monitoring of surface energy fluxes from satellite. Methods that require high temporal resolution LST observations (e.g., from geostationary orbit) can be difficult to apply globally because several geostationary sensors are required to attain near-global coverage (60°N to 60°S). While these LST observations are available from polar-orbiting sensors, providing global coverage at higher spatial resolutions, the temporal sampling (twice daily observations) can pose significant limitations. For example, the Atmosphere Land Exchange Inverse (ALEXI) surface energy balance model, used for monitoring evapotranspiration and drought, requires an observation of the morning change in LST - a quantity not directly observable from polar-orbiting sensors. Therefore, we have developed and evaluated a data-mining approach to estimate the mid-morning rise in LST from a single sensor (two observations per day), the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Aqua platform. In general, the data-mining approach produced estimates with low relative error (5 to 10%) and statistically significant correlations when compared against geostationary observations. This approach will facilitate global, near real-time applications of ALEXI at higher spatial and temporal coverage from a single sensor than is achievable with current geostationary datasets.
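The single-sensor estimation idea can be sketched as a plain regression from twice-daily observations to the target quantity; the feature set (e.g., night LST, day LST, day-night amplitude) and the least-squares model are illustrative assumptions, since the abstract does not specify the actual data-mining method.

```python
import numpy as np

def fit_rise_model(X, y):
    """Least-squares regression mapping twice-daily LST features to the
    mid-morning LST rise.

    X: (n, k) feature matrix (hypothetical features such as night LST,
       day LST, day-night amplitude); y: (n,) training targets, e.g.
       geostationary-derived morning rise values.
    """
    A = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_rise(coef, X):
    """Apply the fitted coefficients to new twice-daily observations."""
    return np.column_stack([np.ones(len(X)), X]) @ coef
```

In practice the mapping would be trained where geostationary truth exists and then applied globally wherever only the polar-orbiting observations are available.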
NASA Astrophysics Data System (ADS)
Ginzburg, D.; Knafo, Y.; Manor, A.; Seif, R.; Ghelman, M.; Ellenbogen, M.; Pushkarsky, V.; Ifergan, Y.; Semyonov, N.; Wengrowicz, U.; Mazor, T.; Kadmon, Y.; Cohen, Y.; Osovizky, A.
2015-06-01
There is a need to develop new personal radiation detector (PRD) technologies that can be mass produced. In August 2013, DARPA released a request for information (RFI) seeking innovative radiation detection technologies. In addition, in December 2013, a Broad Agency Announcement (BAA) for the SIGMA program was released. The RFI requirements focused on a sensor that should possess three main properties: low cost, high compactness and radioisotope identification capabilities. The identification performance should facilitate the detection of a hidden threat, ranging from special nuclear materials (SNM) to commonly used radiological sources. Subsequently, the BAA presented the specific requirements at an instrument level and provided a comparison between the current market status (state of the art) and the SIGMA program objectives. This work presents an alternative both for the detection technology (a sensor with communication output and without a user interface) sought in DARPA's initial RFI and for the PRD required by the SIGMA program. A broad discussion is dedicated to the method proposed to fulfill the program objectives and to the selected alternative, which is based on the PDS-GO design and technology. The PDS-GO is the first commercially available PRD based on a scintillation crystal optically coupled with a silicon photomultiplier (SiPM), a solid-state light sensor. This work presents the current performance of the instrument and possible future upgrades based on recent technological improvements in SiPM design. The approach of utilizing the SiPM with a commonly available CsI(Tl) crystal is the key to achieving the program objectives. This approach provides the appropriate performance, low cost, mass production and small dimensions; however, it requires a creative approach to overcome the obstacles of solid-state detector dark current (noise) and gain stabilization over a wide temperature range.
Based on the presented results, we presume that the proposed approach of a SiPM with a 35 μm pixel size, coupled to a scintillation material (for gamma and neutron detection), ensures the availability and low cost of the key components. Furthermore, an automated manufacturing process enables mass production, thereby fulfilling the SIGMA program requirements, both as a sensor (integrated with a mobile device) and as a full detection device.
Toward a standard lexicon for ecosystem services
The complex, widely dispersed, and cumulative environmental challenges currently facing society require holistic, transdisciplinary approaches to resolve. The concept of ecosystem services (ES) has become more widely accepted both as a framework that cuts across the dimensions of...
PI controller design for indirect vector controlled induction motor: A decoupling approach.
Jain, Jitendra Kr; Ghosh, Sandip; Maity, Somnath; Dworak, Pawel
2017-09-01
Decoupling of the stator currents is important for a smoother torque response in indirect vector controlled induction motors. Typically, feedforward decoupling is used to handle the current coupling, which requires exact knowledge of motor parameters, additional circuitry, and signal processing. In this paper, a method is proposed to design the regulating proportional-integral gains that minimize coupling without requiring an additional decoupler. The variation of the coupling terms for changes in load torque is considered as the performance measure. An iterative linear matrix inequality based H∞ control design approach is used to obtain the controller gains. A comparison between the feedforward and the proposed decoupling schemes is presented through simulation and experimental results. The results show that the proposed scheme is simple yet effective, without an additional block or burden on signal processing. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
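A discrete PI current regulator for one axis (d or q) can be sketched as follows; the gains kp and ki would come from a design procedure such as the paper's iterative-LMI H∞ approach, and the class name, clamping logic, and values here are illustrative.

```python
class PIRegulator:
    """Discrete PI current regulator for one stator-current axis.

    kp, ki: proportional and integral gains (placeholders here; in the
    paper's scheme these would be the LMI-designed values that also
    minimize cross-coupling). dt: sample time; limit: optional voltage clamp.
    """
    def __init__(self, kp, ki, dt, limit=None):
        self.kp, self.ki, self.dt, self.limit = kp, ki, dt, limit
        self.integral = 0.0

    def step(self, ref, meas):
        err = ref - meas
        self.integral += self.ki * err * self.dt   # integrate the error
        u = self.kp * err + self.integral
        if self.limit is not None:                 # simple output clamp
            u = max(-self.limit, min(self.limit, u))
        return u
```

In a feedforward scheme a separate decoupling term would be added to `u`; the point of the paper's approach is that well-chosen (kp, ki) alone keep the cross-coupling small without that extra block.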
An approach for configuring space photovoltaic tandem arrays based on cell layer performance
NASA Technical Reports Server (NTRS)
Flora, C. S.; Dillard, P. A.
1991-01-01
Meeting solar array performance goals of 300 W/kg requires use of solar cells with orbital efficiencies greater than 20 percent. Only multijunction cells and cell layers operating in tandem produce this required efficiency. An approach for defining solar array design concepts that use tandem cell layers involves the following: transforming cell layer performance at standard test conditions to on-orbit performance; optimizing circuit configuration with tandem cell layers; evaluating circuit sensitivity to cell current mismatch; developing array electrical design around the selected circuit; and predicting array orbital performance, including seasonal variations.
NASA Technical Reports Server (NTRS)
Kezirian, Michael; Cook, Anthony; Dick, Brandon; Phoenix, S. Leigh
2012-01-01
To supply oxygen and nitrogen to the International Space Station, a COPV tank is being developed to meet requirements beyond those previously flown. In order to "ship full" and support compatibility with a range of launch site operations, the vessel was designed for certification to International Standards (ISO) that take a different approach than current NASA certification approaches. These requirements had to be met in addition to existing NASA certification standards. Initial risk-reduction development tests have been successful. Qualification is in progress.
Mimura, C; Griffiths, P
2003-01-01
The effectiveness of current approaches to workplace stress management for nurses was assessed through a systematic review. Seven randomised controlled trials and three prospective cohort studies assessing the effectiveness of stress management programmes were identified and reviewed. The quality of the research identified was weak. There is more evidence for the effectiveness of programmes based on providing personal support than on environmental management to reduce stressors. However, since the number and quality of studies are low, the question as to which, if any, approach is more effective cannot be answered definitively. Further research is required before clear recommendations can be made for the use of particular interventions for nursing work-related stress. PMID:12499451
Telerehabilitation and emerging virtual reality approaches to stroke rehabilitation.
Putrino, David
2014-12-01
Stroke is the leading cause of permanent motor disability in the United States, and the rapidly aging population makes finding large-scale treatment solutions to this problem a national priority. Telerehabilitation is an emerging approach that is being used for the effective treatment of multiple diseases, and is beginning to show promise for stroke. The purpose of this review is to identify and highlight the areas of telerehabilitation that require the most research attention. Although there are many different forms of telerehabilitation approaches being attempted for stroke, the only approach that is currently showing moderate-strong evidence for efficacy is videogame-driven telerehabilitation (VGDT). However, targeted research is still required to determine the feasibility of VGDT: metrics regarding system usability, cost-effectiveness, and data privacy concerns still require major attention. VGDT is an emerging approach that shows enormous promise for stroke rehabilitation. Future studies should focus less on developing custom task controllers and therapy games and more on developing innovative, online data acquisition and analytics pipelines, as well as understanding the patient population so that the rehabilitation experience can be better customized.
Linking Goal-Oriented Requirements and Model-Driven Development
NASA Astrophysics Data System (ADS)
Pastor, Oscar; Giachetti, Giovanni
In the context of Goal-Oriented Requirement Engineering (GORE) there are interesting modeling approaches for the analysis of complex scenarios that are oriented to obtain and represent the relevant requirements for the development of software products. However, the way to use these GORE models in an automated Model-Driven Development (MDD) process is not clear, and, in general terms, the translation of these models into the final software products is still manually performed. Therefore, in this chapter, we show an approach to automatically link GORE models and MDD processes, which has been elaborated by considering the experience obtained from linking the i * framework with an industrially applied MDD approach. The linking approach proposed is formulated by means of a generic process that is based on current modeling standards and technologies in order to facilitate its application for different MDD and GORE approaches. Special attention is paid to how this process generates appropriate model transformation mechanisms to automatically obtain MDD conceptual models from GORE models, and how it can be used to specify validation mechanisms to assure the correct model transformations.
Patient Accounting Systems: Are They Fit with the Users' Requirements?
Ayatollahi, Haleh; Nazemi, Zahra
2016-01-01
Objectives A patient accounting system is a subsystem of a hospital information system. Like other information systems, it should be carefully designed to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs. Methods This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. A mean value of ≥3 indicated the importance of each data element and the capability of the system. Results Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. Conclusions The current patient accounting systems need to be improved to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information. PMID:26893945
Designing eHealth that Matters via a Multidisciplinary Requirements Development Approach.
Van Velsen, Lex; Wentzel, Jobke; Van Gemert-Pijnen, Julia Ewc
2013-06-24
Requirements development is a crucial part of eHealth design. It entails all the activities devoted to requirements identification, the communication of requirements to other developers, and their evaluation. Currently, a requirements development approach geared towards the specifics of the eHealth domain is lacking. This is likely to result in a mismatch between the developed technology and end user characteristics, physical surroundings, and the organizational context of use. It also makes it hard to judge the quality of eHealth design, since it makes it difficult to gear evaluations of eHealth to the main goals it is supposed to serve. In order to facilitate the creation of eHealth that matters, we present a practical, multidisciplinary requirements development approach which is embedded in a holistic design approach for eHealth (the Center for eHealth Research roadmap) that incorporates both human-centered design and business modeling. Our requirements development approach consists of five phases. In the first, preparatory, phase the project team is composed and the overall goal(s) of the eHealth intervention are decided upon. Second, primary end users and other stakeholders are identified by means of audience segmentation techniques and our stakeholder identification method. Third, the designated context of use is mapped and end users are profiled by means of requirements elicitation methods (eg, interviews, focus groups, or observations). Fourth, stakeholder values and eHealth intervention requirements are distilled from data transcripts, which leads to phase five, in which requirements are communicated to other developers using a requirements notation template we developed specifically for the context of eHealth technologies. 
The end result of our requirements development approach for eHealth interventions is a design document which includes functional and non-functional requirements, a list of stakeholder values, and end user profiles in the form of personas (fictitious end users, representative of a primary end user group). The requirements development approach presented in this article enables eHealth developers to apply a systematic and multidisciplinary approach to the creation of requirements. The cooperation between health, engineering, and social sciences creates a situation in which a mismatch between design, end users, and the organizational context can be avoided. Furthermore, we suggest evaluating eHealth on a feature-specific level in order to learn exactly why such a technology does or does not live up to its expectations.
Surgical quality assessment. A simplified approach.
DeLong, D L
1991-10-01
The current approach to QA primarily involves taking action when problems are discovered and designing a documentation system that records the delivery of quality care. Involving the entire staff helps eliminate problems before they occur. By keeping abreast of current problems and soliciting input from staff members, the QA program at our hospital has improved dramatically. The cross-referencing of JCAHO and AORN standards on the assessment form and the single-sheet reporting form expedite the evaluation process and simplify record keeping. The bulletin board increases staff members' understanding of QA and boosts morale and participation. A sound and effective QA program does not require reorganizing an entire department, nor should it invoke negative connotations. Developing an effective QA program merely requires rethinking current processes. The program must meet the department's specific needs, and although many departments concentrate on documentation, auditing charts does not give a complete picture of the quality of care delivered. The QA committee must employ a variety of data collection methods on multiple indicators to ensure an accurate representation of the care delivered, and they must not overlook any issues that directly affect patient outcomes.
CAD/CAM data management needs, requirements and options
NASA Technical Reports Server (NTRS)
Lopatka, R. S.; Johnson, T. G.
1978-01-01
The requirements for a data management system in support of technical or scientific applications and possible courses of action were reviewed. Specific requirements were evolved while working towards higher level integration impacting all phases of the current design process and through examination of commercially marketed systems and related data base research. Arguments are proposed for varied approaches in implementing data base systems ranging from no action necessary to immediate procurement of an existing data base management system.
Cross-cultural training of general practitioner registrars: how does it happen?
Watt, Kelly; Abbott, Penny; Reath, Jenny
2016-01-01
An equitable multicultural society requires general practitioners (GPs) to be proficient in providing health care to patients from diverse backgrounds. GPs are required to have certain attitudes, knowledge and skills, known as cultural competence. Given its importance to registrar training, the aim of this study was to explore the ways in which GP registrars are currently developing cultural competence. This study employed a survey design for GP registrars in Western Sydney. Training approaches to cultural competence that are relevant to the Australian general practice setting include exposure to diversity and the development of attitudes, knowledge and skills. The 43 GP registrar respondents in Western Sydney are exposed to a culturally diverse patient load during training. Registrars report a variety of teaching related to cross-cultural training, but there is little consistency, with the most common approach entailing listening to patients' personal stories. Exposure to cultural diversity appears to be an important way in which cultural competency is developed. However, guidance and facilitation of skills development throughout this exposure is required and currently may occur opportunistically rather than consistently.
Space Telescope Sensitivity and Controls for Exoplanet Imaging
NASA Technical Reports Server (NTRS)
Lyon, Richard G.; Clampin, Mark
2012-01-01
Herein we address design considerations and outline requirements for space telescopes with capabilities for high contrast imaging of exoplanets. The approach taken is to identify the span of potentially detectable Earth-sized terrestrial planets in the habitable zone of the nearest stars within 30 parsecs and estimate their inner working angles, flux ratios, SNR, sensitivities, wavefront error requirements and sensing and control times parametrically versus aperture size. We consider 1, 2, 4, 8 and 16-meter diameter telescope apertures. The achievable science, range of telescope architectures, and the coronagraphic approach are all active areas of research and are all subject to change in a rapidly evolving field. Thus, presented is a snapshot of our current understanding with the goal of limiting the choices to those that appear currently technically feasible. We describe the top-level metrics of inner working angle, contrast and photometric throughput and explore how they are related to the range of target stars. A critical point is that for each telescope architecture and coronagraphic choice the telescope stability requirements have differing impacts on the design for open versus closed-loop sensing and control.
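The parametric scaling the abstract describes can be illustrated with a small calculation. The sketch below computes a diffraction-limited inner working angle versus the aperture sizes considered; the choice of IWA = 3λ/D and an observing wavelength of 550 nm are illustrative assumptions, not values taken from the paper.

```python
import math

def inner_working_angle_mas(aperture_m, wavelength_m=550e-9, n_lambda_over_d=3.0):
    """Approximate coronagraph inner working angle (IWA) in milliarcseconds,
    assuming IWA = n * lambda / D (n and wavelength are illustrative choices)."""
    iwa_rad = n_lambda_over_d * wavelength_m / aperture_m
    return iwa_rad * 180.0 / math.pi * 3600.0 * 1000.0  # radians -> mas

# The five aperture diameters considered in the abstract:
for d in (1, 2, 4, 8, 16):
    print(f"D = {d:2d} m  IWA ~ {inner_working_angle_mas(d):6.1f} mas")
```

The IWA shrinks inversely with aperture diameter, which is one reason larger apertures reach habitable zones of more distant stars.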
A-type potassium currents in smooth muscle.
Amberg, Gregory C; Koh, Sang Don; Imaizumi, Yuji; Ohya, Susumu; Sanders, Kenton M
2003-03-01
A-type currents are voltage-gated, calcium-independent potassium (Kv) currents that undergo rapid activation and inactivation. Commonly associated with neuronal and cardiac cell types, A-type currents have also been identified and characterized in vascular, genitourinary, and gastrointestinal smooth muscle cells. This review examines the molecular identity, biophysical properties, pharmacology, regulation, and physiological function of smooth muscle A-type currents. In general, this review is intended to facilitate the comparison of A-type currents present in different smooth muscles by providing a comprehensive report of the literature to date. This approach should also aid in the identification of areas of research requiring further attention.
Transforming the Way DOD Looks at Energy. An Approach to Establishing an Energy Strategy
2007-04-01
than the aromatic-containing, petroleum-based gasoline and diesel, reducing emissions. Two biofuels in current use are B20 and E85. These fuels...areas of disconnect between DoD’s current energy consumption practices and the capability requirements of its strategic goals: Strategic. DoD seeks
Development of the Releasable Asbestos Field Sampler
A risk assessment for intermittent, low-level exposure to asbestos requires personal breathing concentration data. Currently, activity-based sampling (ABS) is the preferred approach to measurement of a person’s inhalation exposure; i.e., asbestos structures per cubic centimeter ...
Combustor materials requirements and status of ceramic matrix composites
NASA Technical Reports Server (NTRS)
Hecht, Ralph J.; Johnson, Andrew M.
1992-01-01
The HSCT combustor will be required to operate with either extremely rich or lean fuel/air ratios to reduce NO(x) emission. NASA High Speed Research (HSR) sponsored programs at Pratt & Whitney (P&W) and GE Aircraft Engines (GEAE) have been studying rich and lean burn combustor design approaches which are capable of achieving the aggressive HSCT NO(x) emission goals. In both of the combustor design approaches under study, high temperature (2400-3000 F) materials are necessary to meet the HSCT emission goals of 3-8 gm/kg. Currently available materials will not meet the projected requirements for the HSCT combustor. The development of new materials is an enabling technology for the successful introduction to service of the HSCT.
Development and analysis of SCR requirements tables for system scenarios
NASA Technical Reports Server (NTRS)
Callahan, John R.; Morrison, Jeffery L.
1995-01-01
We describe the use of scenarios to develop and refine requirements tables for parts of the Earth Observing System Data and Information System (EOSDIS). The National Aeronautics and Space Administration (NASA) is developing EOSDIS as part of its Mission-To-Planet-Earth (MTPE) project to accept instrument/platform observation requests from end-user scientists, schedule and perform requested observations of the Earth from space, collect and process the observed data, and distribute data to scientists and archives. Current requirements for the system are managed with tools that allow developers to trace the relationships between requirements and other development artifacts, including other requirements. In addition, the user community (e.g., earth and atmospheric scientists), in conjunction with NASA, has generated scenarios describing the actions of EOSDIS subsystems in response to user requests and other system activities. As part of a research effort in verification and validation techniques, this paper describes our efforts to develop requirements tables from these scenarios for the EOSDIS Core System (ECS). The tables specify event-driven mode transitions based on techniques developed by the Naval Research Laboratory's (NRL) Software Cost Reduction (SCR) project. The SCR approach has proven effective in specifying requirements for large systems in an unambiguous, terse format that enhances identification of incomplete and inconsistent requirements. We describe the development of SCR tables from user scenarios and identify the strengths and weaknesses of our approach in contrast to the requirements tracing approach. We also evaluate the capabilities of both approaches to respond to the volatility of requirements in large, complex systems.
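The SCR-style tables described above specify event-driven mode transitions. As a hedged illustration (the modes and events below are hypothetical examples, not taken from the EOSDIS specification), such a table can be represented as a lookup keyed on (mode, event), which makes determinism easy to check:

```python
# Hypothetical SCR-style mode transition table: (current_mode, event) -> next_mode.
# Events use SCR notation informally: "@T(cond)" means cond became true.
TRANSITIONS = {
    ("Idle",       "@T(request_received)"): "Scheduling",
    ("Scheduling", "@T(observation_planned)"): "Observing",
    ("Observing",  "@T(data_collected)"): "Processing",
    ("Processing", "@T(data_distributed)"): "Idle",
}

def next_mode(mode, event):
    """Deterministic lookup; unknown (mode, event) pairs leave the mode unchanged."""
    return TRANSITIONS.get((mode, event), mode)

def check_deterministic(table):
    """An SCR table is deterministic if no (mode, event) key maps to two modes.
    A dict enforces this by construction; here we just confirm key uniqueness."""
    return len(table) == len(set(table))

mode = "Idle"
for ev in ("@T(request_received)", "@T(observation_planned)", "@T(data_collected)"):
    mode = next_mode(mode, ev)
print(mode)  # Processing
```

A missing (mode, event) pair corresponds to an incomplete requirement; a duplicate key would be an inconsistent one, which is exactly what the tabular format makes easy to spot.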
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2010-05-23
The increasing asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.
Solid Waste Program technical baseline description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlson, A.B.
1994-07-01
The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saifee, T.; Konnerth, A. III
1991-11-01
Solar Kinetics, Inc. (SKI) has been developing point-focus concentrating PV modules since 1986. SKI is currently in a position to manufacture between 200 and 600 kilowatts annually of the current design by a combination of manual and semi-automated methods. This report reviews the current status of module manufacture and specifies the approach required to achieve a high-volume manufacturing capability and low cost. The approach taken will include process development concurrent with module design for automated manufacturing. The current effort reviews the major manufacturing costs and identifies components and processes whose improvement would produce the greatest effect on manufacturability and cost reduction. The Fresnel lens is one such key component. Investigating specific alternative manufacturing methods and sources has substantially reduced the lens costs and has exceeded the DOE cost-reduction goals. 15 refs.
Advances in mechanisms, diagnosis, and treatment of pernicious anemia.
Rojas Hernandez, Cristhiam M; Oo, Thein Hlaing
2015-03-01
Pernicious anemia (PA) is an entity initially described in 1849 as a condition that consisted of pallor, weakness, and progressive health decline. Since then several advances led to the conclusion that PA is an autoimmune disease characterized by the deficient absorption of dietary cobalamin. It is currently recognized as the most common cause of cobalamin deficiency worldwide. We hereby review the current understanding of the disease and its neurological, hematological, and biochemical manifestations with emphasis on the diagnostic approach, treatment, and monitoring strategies. We propose an algorithm for the diagnostic approach considering the current performance and limitations of the available diagnostic tools for evaluation of cobalamin status and the presence of autoimmune chronic atrophic gastritis (CAG). Patients with PA require lifelong treatment with cobalamin replacement therapy. The current widely available treatment can be provided through enteral or parenteral cobalamin supplements, with comparable efficacy and tolerability.
Fatigue Risk Management: A Maritime Framework
Grech, Michelle Rita
2016-01-01
It is evident that despite efforts directed at mitigating the risk of fatigue through the adoption of hours of work and rest regulations and the development of codes and guidelines, fatigue still remains a concern in shipping. Lack of fatigue management has been identified as a contributory factor in a number of recent accidents. This is further substantiated through research reports with shortfalls highlighted in current fatigue management approaches. These approaches mainly focus on prescriptive hours of work and rest and include an individualistic approach to managing fatigue. The expectation is that seafarers are responsible to manage and tolerate fatigue as part of their working life at sea. This attitude is an accepted part of a seafarer’s role. Poor compliance is one manifestation of this problem, with shipboard demands making it hard for seafarers to follow hours of work and rest regulations, forcing them into this “poor compliance” trap. This makes current fatigue management approaches ineffective. This paper proposes a risk-based approach and way forward for the implementation of a fatigue risk management framework for shipping, aiming to support the hours of work and rest requirements. This forms part of the work currently underway to review and update the International Maritime Organization Guidelines on Fatigue. PMID:26840326
Lock-in thermography approach for imaging the efficiency of light emitters and optical coolers
NASA Astrophysics Data System (ADS)
Radevici, Ivan; Tiira, Jonna; Oksanen, Jani
2017-02-01
Developing optical cooling technologies requires access to reliable efficiency measurement techniques and ability to detect spatial variations in the efficiency and light emission of the devices. We investigate the possibility to combine the calorimetric efficiency measurement principles with lock-in thermography (LIT) and conventional luminescence microscopy to enable spatially resolved measurement of the efficiency, current spreading and local device heating of double diode structures (DDS) serving as test vessels for developing thermophotonic cooling devices. Our approach enables spatially resolved characterization and localization of the losses of the double diode structures as well as other light emitting semiconductor devices. In particular, the approach may allow directly observing effects like current crowding and surface recombination on the light emission and heating of the DDS devices.
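Lock-in detection of the kind underlying LIT recovers the amplitude and phase of a signal component at a known modulation frequency by correlating the signal with quadrature references and low-pass filtering. A minimal sketch (generic demodulation, not the authors' instrument code):

```python
import math

def lock_in(signal, fs, f_ref):
    """Recover amplitude and phase of the component of `signal` at f_ref
    by demodulating with quadrature references and low-pass filtering (mean)."""
    n = len(signal)
    x = sum(s * math.cos(2 * math.pi * f_ref * i / fs) for i, s in enumerate(signal)) / n
    y = sum(s * math.sin(2 * math.pi * f_ref * i / fs) for i, s in enumerate(signal)) / n
    amplitude = 2.0 * math.hypot(x, y)   # factor 2 undoes the A/2 mixing loss
    phase = math.atan2(y, x)
    return amplitude, phase

# A 5 Hz test tone of amplitude 0.3, sampled at 1 kHz for 2 s (10 full periods):
fs, f = 1000.0, 5.0
sig = [0.3 * math.cos(2 * math.pi * f * i / fs) for i in range(2000)]
amp, _ = lock_in(sig, fs, f)
print(round(amp, 3))  # 0.3
```

In LIT, the same demodulation is applied pixel by pixel to an infrared image sequence, which suppresses thermal signals uncorrelated with the excitation.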
A prototype analysis of forgiveness.
Kearns, Jill N; Fincham, Frank D
2004-07-01
Many definitions of forgiveness currently exist in the literature. The current research adds to this discussion by utilizing a prototype approach to examine lay conceptions of forgiveness. A prototype approach involves categorizing objects or events in terms of their similarity to a good example, whereas a classical approach requires that there are essential elements that must be present. In Study 1, participants listed the features of forgiveness. Study 2 obtained centrality ratings for these features. In Studies 3 and 4, central features were found to be more salient in memory than peripheral features. Study 5 showed that feature centrality influenced participants' ratings of victims involved in hypothetical transgressions. Thus, the two criteria for demonstrating prototype structure (that participants find it meaningful to judge features in terms of their centrality and that centrality affects cognition) were met.
Kim, Katherine K; Browe, Dennis K; Logan, Holly C; Holm, Roberta; Hack, Lori; Ohno-Machado, Lucila
2014-01-01
There is currently limited information on best practices for the development of governance requirements for distributed research networks (DRNs), an emerging model that promotes clinical data reuse and improves timeliness of comparative effectiveness research. Much of the existing information is based on a single type of stakeholder such as researchers or administrators. This paper reports on a triangulated approach to developing DRN data governance requirements based on a combination of policy analysis with experts, interviews with institutional leaders, and patient focus groups. This approach is illustrated with an example from the Scalable National Network for Effectiveness Research, which resulted in 91 requirements. These requirements were analyzed against the Fair Information Practice Principles (FIPPs) and Health Insurance Portability and Accountability Act (HIPAA) protected versus non-protected health information. The requirements addressed all FIPPs, showing how a DRN's technical infrastructure is able to fulfill HIPAA regulations, protect privacy, and provide a trustworthy platform for research. PMID:24302285
Biomarkers for optimal requirements of amino acids by animals and humans.
Lin, Gang; Liu, Chuang; Wang, Taiji; Wu, Guoyao; Qiao, Shiyan; Li, Defa; Wang, Junjun
2011-06-01
Amino acids are building blocks of proteins and key regulators of nutrient metabolism in cells. However, excessive intake of amino acids can be toxic to the body. Therefore, it is important to precisely determine the amino acid requirements of organisms. To date, no method is completely satisfactory for generating comprehensive data on the amino acid requirements of animals or humans. Because of many influencing factors, amino acid requirements remain a complex and controversial issue in nutrition that warrants further investigation. Benefiting from the rapid advances in the emerging omics technologies and bioinformatics, biomarker discovery shows great potential for obtaining an in-depth understanding of regulatory networks in protein metabolism. This review summarizes the current approaches to assessing the amino acid requirements of animals and humans, as well as the recent development of biomarkers as potentially functional parameters for recommending requirements of individual amino acids in health and disease. Identification of biomarkers in plasma or serum, which is a noninvasive approach, holds great promise in rapidly advancing the field of protein nutrition.
Feasibility of a fetal measurement electrode system
NASA Technical Reports Server (NTRS)
1977-01-01
The findings of the study are summarized; they conclude that not all monitoring requirements are currently satisfied. An approach is presented to provide a multiparametric monitoring system through combinations of existing transducers. This monitoring system would be appropriate not only for intrapartum monitoring, but also for neonatal and adult blood gas evaluations. A literature search was conducted to provide insight into the current state of the art in fetal monitoring.
NASA Technical Reports Server (NTRS)
Roman, Monserrate C.; Jones, Kathy U.; Oubre, Cherie M.; Castro, Victoria; Ott, Mark C.; Birmele, Michele; Venkateswaran, Kasthuri J.; Vaishampayan, Parag A.
2013-01-01
Current methods for microbial detection: (a) labor- and time-intensive cultivation-based approaches that can fail to detect or characterize all cells present; (b) collection of samples on orbit and their transportation back to the ground for analysis. Disadvantages of current detection methods: (a) inability to perform quick and reliable detection on orbit; (b) lengthy sampling intervals; (c) no microbe identification.
Proactive detection of bones in poultry processing
NASA Astrophysics Data System (ADS)
Daley, W. D. R.; Stewart, John
2009-05-01
Bones continue to be a problem of concern for the poultry industry. Most further-processed products begin with the requirement for raw material with minimal bones. The current process for generating deboned product requires systems for monitoring and inspecting the output product. The current detection systems are either people palpating the product or X-ray systems. The performance of these inspection techniques is below the desired levels of accuracy, and they are costly. We propose a technique for monitoring bones that conducts the inspection operation during the deboning process, so as to have enough time to take action to reduce the probability that bones will end up in the final product. This is accomplished by developing active cones with built-in illumination to backlight the cage (skeleton) on the deboning line. If the bones of interest are still on the cage, then the bones are not in the associated meat. This approach also allows for the ability to practice process control on the deboning operation to keep the process under control, as opposed to the current system, where detection is done post-production and does not easily present the opportunity to adjust the process. The proposed approach shows overall accuracies of about 94% for the detection of the clavicle bones.
Direct Reconstruction of Two-Dimensional Currents in Thin Films from Magnetic-Field Measurements
NASA Astrophysics Data System (ADS)
Meltzer, Alexander Y.; Levin, Eitan; Zeldov, Eli
2017-12-01
An accurate determination of microscopic transport and magnetization currents is of central importance for the study of the electric properties of low-dimensional materials and interfaces, of superconducting thin films, and of electronic devices. Current distribution is usually derived from the measurement of the perpendicular component of the magnetic field above the surface of the sample, followed by numerical inversion of the Biot-Savart law. The inversion is commonly obtained by deriving the current stream function g, which is then differentiated in order to obtain the current distribution. However, this two-step procedure requires filtering at each step and, as a result, oversmooths the solution. To avoid this oversmoothing, we develop a direct procedure for inversion of the magnetic field that avoids use of the stream function. This approach provides enhanced accuracy of current reconstruction over a wide range of noise levels. We further introduce a reflection procedure that allows for the reconstruction of currents that cross the boundaries of the measurement window. The effectiveness of our approach is demonstrated by several numerical examples.
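The paper's direct inversion scheme is more involved, but the role of regularization in this kind of ill-conditioned inversion can be illustrated generically. The sketch below recovers a source vector from data passed through a smoothing forward operator using Tikhonov-regularized least squares; the operator, data, and regularization parameter are arbitrary illustrative choices, not the paper's method.

```python
# Tikhonov-regularized inversion sketch (generic, not the paper's algorithm):
# recover x from b = A x via x = (A^T A + lam*I)^{-1} A^T b.
# Pure-Python linear algebra keeps the example self-contained.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def tikhonov(A, b, lam):
    At = transpose(A)
    AtA = matmul(At, A)
    for i in range(len(AtA)):
        AtA[i][i] += lam          # damping suppresses noise amplification
    Atb = [sum(r[j] * b[j] for j in range(len(b))) for r in At]
    return solve(AtA, Atb)

# A smoothing forward operator (loosely analogous to the field measured
# at some height above the current-carrying plane):
A = [[1.0, 0.5, 0.2],
     [0.5, 1.0, 0.5],
     [0.2, 0.5, 1.0]]
x_true = [0.0, 1.0, 0.0]
b = [sum(a * x for a, x in zip(row, x_true)) for row in A]   # noiseless data
x_rec = tikhonov(A, b, lam=1e-6)
print([round(v, 3) for v in x_rec])  # recovers approximately [0, 1, 0]
```

With noisy data, lam trades off data fidelity against smoothness, which is the over-smoothing the paper's single-step procedure is designed to minimize.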
Toward an electrical power utility for space exploration
NASA Technical Reports Server (NTRS)
Bercaw, Robert W.
1989-01-01
Future electrical power requirements for space exploration are discussed. Megawatts of power with enough reliability for multi-year missions and with enough flexibility to adapt to needs unanticipated at design time are some of the criteria which space power systems must be able to meet. The reasons for considering the power management and distribution in the various systems, from a total mission perspective rather than simply extrapolating current spacecraft design practice, are discussed. A utility approach to electric power integrating requirements from a broad selection of current development programs, with studies in which both space and terrestrial technologies are conceptually applied to exploration mission scenarios, is described.
Propulsion Trade Studies for Spacecraft Swarm Mission Design
NASA Technical Reports Server (NTRS)
Dono, Andres; Plice, Laura; Mueting, Joel; Conn, Tracie; Ho, Michael
2018-01-01
Spacecraft swarms constitute a challenge from an orbital mechanics standpoint. Traditional mission design involves the application of methodical processes where predefined maneuvers for an individual spacecraft are planned in advance. This approach does not scale to spacecraft swarms consisting of many satellites orbiting in close proximity; non-deterministic maneuvers cannot be preplanned due to the large number of units and the uncertainties associated with their differential deployment and orbital motion. For autonomous small sat swarms in LEO, we investigate two approaches for controlling the relative motion of a swarm. The first method involves modified miniature phasing maneuvers, where maneuvers are prescribed that cancel the differential delta V of each CubeSat's deployment vector. The second method relies on artificial potential functions (APFs) to contain the spacecraft within a volumetric boundary and avoid collisions. Performance results and required delta V budgets are summarized, indicating that each method has advantages and drawbacks for particular applications. The mini phasing maneuvers are more predictable and sustainable. The APF approach provides a more responsive and distributed performance, but at considerable propellant cost. After considering the current state of the art in CubeSat propulsion systems, we conclude that the first approach is feasible, but the modified APF method requires too much control authority to be enabled by current propulsion systems.
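As a hedged illustration of the APF idea (the gains, safe distance, and specific potential shape below are assumptions for the sketch, not the paper's formulation), the commanded acceleration can be taken as the negative gradient of an attractive containment term plus pairwise repulsive terms:

```python
import math

def apf_accel(pos, others, center=(0.0, 0.0), k_att=1.0, k_rep=1.0, d_safe=1.0):
    """Commanded 2-D acceleration = -grad(U) with
    U = k_att/2 * |pos - center|^2                      (containment)
      + k_rep/2 * sum (1/d - 1/d_safe)^2                (collision avoidance,
        summed over neighbors closer than d_safe)."""
    ax = -k_att * (pos[0] - center[0])
    ay = -k_att * (pos[1] - center[1])
    for ox, oy in others:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d_safe:
            # gradient of the repulsive term pushes away from the neighbor
            g = k_rep * (1.0 / d - 1.0 / d_safe) / (d ** 3)
            ax += g * dx
            ay += g * dy
    return ax, ay

# Alone at the center: no commanded acceleration.
print(apf_accel((0.0, 0.0), []))
# A neighbor 0.5 units away in +x adds a repulsive push along -x:
ax, ay = apf_accel((0.0, 0.0), [(0.5, 0.0)])
print(ax < 0, ay == 0.0)  # True True
```

The propellant-cost drawback noted in the abstract follows directly from this structure: the controller commands continuous corrective accelerations rather than a few discrete phasing burns.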
Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini
2013-01-01
Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182
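Many of the detection approaches reviewed for WSNs are statistical. A minimal sliding-window z-score detector, shown purely as an illustration of that class (the window size and threshold below are arbitrary choices, not from the review):

```python
from collections import deque

class ZScoreDetector:
    """Flag a sensor reading as anomalous when it lies more than `k` standard
    deviations from the mean of the last `window` readings (illustrative only)."""
    def __init__(self, window=20, k=3.0):
        self.buf = deque(maxlen=window)
        self.k = k

    def is_anomaly(self, x):
        if len(self.buf) >= 3:                      # need a minimal history
            n = len(self.buf)
            mean = sum(self.buf) / n
            var = sum((v - mean) ** 2 for v in self.buf) / n
            std = var ** 0.5
            anomalous = std > 0 and abs(x - mean) > self.k * std
        else:
            anomalous = False
        self.buf.append(x)
        return anomalous

det = ZScoreDetector(window=10, k=3.0)
stream = [20.0, 20.1, 19.9, 20.0, 20.2, 19.8, 20.1, 35.0, 20.0]
flags = [det.is_anomaly(x) for x in stream]
print(flags.index(True))  # 7: the 35.0 spike is flagged
```

Such lightweight node-local statistics trade detection power for the low memory and energy budgets the review identifies as key WSN design requirements.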
NASA Astrophysics Data System (ADS)
Contos, Adam R.; Acton, D. Scott; Atcheson, Paul D.; Barto, Allison A.; Lightsey, Paul A.; Shields, Duncan M.
2006-06-01
The opto-mechanical design of the 6.6 meter James Webb Space Telescope (JWST), with its actively-controlled secondary and 18-segment primary mirror, presents unique challenges from a system engineering perspective. To maintain the optical alignment of the telescope on-orbit, a process called wavefront sensing and control (WFS&C) is employed to determine the current state of the mirrors and calculate the optimal mirror move updates. The needed imagery is downloaded to the ground, where the WFS&C algorithms to process the images reside, and the appropriate commands are uploaded to the observatory. Rather than use a dedicated wavefront sensor for the imagery as is done in most other applications, a science camera is used instead. For the success of the mission, WFS&C needs to perform flawlessly using the assets available among the combination of separate elements (ground operations, spacecraft, science instruments, optical telescope, etc.) that cross institutional as well as geographic borders. Rather than be yet another distinct element with its own set of requirements to flow to the other elements as was originally planned, a novel approach was selected. This approach entails reviewing and auditing other documents for the requirements needed to satisfy the needs of WFS&C. Three actions are taken: (1) when appropriate requirements exist, they are tracked by WFS&C; (2) when an existing requirement is insufficient to meet the need, a requirement change is initiated; and finally (3) when a needed requirement is missing, a new requirement is established in the corresponding document. This approach, deemed a "best practice" at the customer's independent audit, allows for program confidence that the necessary requirements are complete, while still maintaining the responsibility for the requirement with the most appropriate entity.
This paper describes the details and execution of the approach; the associated WFS&C requirements and verification documentation; and the implementation of the primary database tool for the project, DOORS (Dynamic Object-Oriented Requirements System).
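The three-action audit logic can be sketched in code. A minimal illustration with made-up requirement IDs, where a deliberately crude text-matching rule stands in for the engineering judgment the program actually applies:

```python
# Sketch of the three-action requirements audit described above.
# All names and data are illustrative, not from the JWST program.

def audit_requirement(need, existing_reqs):
    """Classify a WFS&C need against requirements found in other documents.

    existing_reqs maps requirement id -> text; a need is 'satisfied' by a
    requirement when the requirement text covers it (crudely modeled here
    as substring containment; a partial word overlap triggers a change request).
    """
    for req_id, text in existing_reqs.items():
        if need in text:
            return ("track", req_id)          # action 1: track as-is
        if any(word in text for word in need.split()):
            return ("change", req_id)         # action 2: request a change
    return ("create", None)                   # action 3: new requirement

existing = {
    "OTE-101": "telescope shall downlink wavefront sensing imagery daily",
    "SI-210": "science camera shall support defocused imagery",
}
assert audit_requirement("wavefront sensing imagery", existing)[0] == "track"
assert audit_requirement("mirror move commands", existing) == ("create", None)
```

The point of the pattern is that every need ends up owned by exactly one document, which is what gives the completeness confidence the abstract describes.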
NASA Astrophysics Data System (ADS)
Ruuskanen, J.; Stenvall, A.; Lahtinen, V.; Pardo, E.
2017-02-01
Superconducting magnets are the most expensive series of components produced for the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN). When developing such magnets beyond state-of-the-art technology, one possible option is to use high-temperature superconductors (HTS), which are capable of tolerating much higher magnetic fields than low-temperature superconductors (LTS) while simultaneously carrying high current densities. Significant cost reductions, due to decreased prototype construction needs, can be achieved by careful modelling of the magnets. Simulations are used, e.g., for designing magnets that fulfil the field quality requirements of the beampipe, and for ensuring adequate protection by studying the losses occurring during charging and discharging. We model the hysteresis losses and the magnetic field nonlinearity in the beampipe as a function of the magnet's current. These simulations rely on the minimum magnetic energy variation principle, with optimization algorithms provided by the open-source Interior Point OPTimizer (IPOPT) library. We utilize this methodology to investigate a research and development accelerator magnet prototype made of REBCO Roebel cable. The applicability of this approach when the magnetic field dependence of the superconductor's critical current density is considered is discussed. We also scrutinize the influence of the necessary modelling decisions one needs to make with this approach. The results show that different decisions can lead to notably different results, and experiments are required to study the electromagnetic behaviour of such magnets further.
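The field dependence of the critical current density mentioned above is often captured phenomenologically. A minimal sketch using the Kim model, with illustrative parameter values rather than the paper's actual fit:

```python
# Kim-model sketch of a field-dependent critical current density,
#     Jc(B) = Jc0 / (1 + |B|/B0).
# Parameter values below are illustrative only, not from the paper.

def jc_kim(B, Jc0=2.5e10, B0=0.5):
    """Critical current density (A/m^2) at local flux density B (T)."""
    return Jc0 / (1.0 + abs(B) / B0)

# At zero field the full Jc0 is available; at B = B0 it halves.
assert jc_kim(0.0) == 2.5e10
assert jc_kim(0.5) == 1.25e10
```

In a hysteresis-loss simulation a relation like this couples the local current distribution to the self-field, which is one reason different modelling decisions can lead to notably different results.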
MOEX: Solvent extraction approach for recycling enriched 98Mo/ 100Mo material
Tkac, Peter; Brown, M. Alex; Momen, Abdul; ...
2017-03-20
Several promising pathways exist for the production of 99Mo/99mTc using enriched 98Mo or 100Mo. The use of Mo targets requires a major change in current generator technology and necessitates an efficient recycling pathway to recover the valuable enriched Mo material; high recovery yields, purity, suitable chemical form and particle size are all required. Results on the development of the MOEX (molybdenum solvent extraction) approach to recycling enriched Mo material are presented. The advantages of the MOEX process include very high decontamination factors from potassium and other elements, high throughput, easy scalability, automation, and minimal waste generation.
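The decontamination factors cited as an advantage reduce to a simple ratio. A minimal sketch with hypothetical numbers:

```python
# Decontamination factor (DF) sketch: the ratio of contaminant activity
# (or concentration) before and after a separation step.
# The numbers below are hypothetical, not measured MOEX values.

def decontamination_factor(before, after):
    if after <= 0:
        raise ValueError("post-separation amount must be positive")
    return before / after

# e.g. reducing a potassium impurity from 1000 ppm to 0.5 ppm gives DF = 2000
assert decontamination_factor(1000.0, 0.5) == 2000.0
```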
A modeling technique for STOVL ejector and volume dynamics
NASA Technical Reports Server (NTRS)
Drummond, C. K.; Barankiewicz, W. S.
1990-01-01
New models for thrust augmenting ejector performance prediction and feeder duct dynamic analysis are presented and applied to a proposed Short Take Off and Vertical Landing (STOVL) aircraft configuration. Central to the analysis is the nontraditional treatment of the time-dependent volume integrals in the otherwise conventional control-volume approach. In the case of the thrust augmenting ejector, the analysis required a new relationship for the transfer of kinetic energy from the primary flow to the secondary flow. Extraction of the required empirical corrections from current steady-state experimental data is discussed, and a possible approach for gaining modeling insight through Computational Fluid Dynamics (CFD) is presented.
Beyond the Natural Proteome: Nondegenerate Saturation Mutagenesis-Methodologies and Advantages.
Ferreira Amaral, M M; Frigotto, L; Hine, A V
2017-01-01
Beyond the natural proteome, high-throughput mutagenesis offers the protein engineer an opportunity to "tweak" the wild-type activity of a protein to create a recombinant protein with the required attributes. Of the various approaches available, saturation mutagenesis is one of the core techniques employed by protein engineers, and in recent times, nondegenerate saturation mutagenesis is emerging as the approach of choice. This review compares the current methodologies available for conducting nondegenerate saturation mutagenesis with traditional degenerate saturation mutagenesis, and briefly outlines the options available for screening the resulting libraries to discover a novel protein with the required activity and/or specificity. © 2017 Elsevier Inc. All rights reserved.
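The contrast between degenerate and nondegenerate saturation can be made concrete by enumerating the classic NNK degenerate codon set, which needs 32 codons (including one stop) to cover all 20 amino acids, whereas a nondegenerate scheme synthesizes exactly 20. A short sketch using the standard genetic code:

```python
# Degenerate NNK saturation vs the 20-codon nondegenerate ideal.
# Builds the standard codon table, then enumerates the NNK set
# (N = A/C/G/T, K = G/T): 32 codons for 20 amino acids plus one stop.

BASES = "TCAG"
AA = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
      "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON_TABLE = {
    b1 + b2 + b3: AA[16 * i + 4 * j + k]
    for i, b1 in enumerate(BASES)
    for j, b2 in enumerate(BASES)
    for k, b3 in enumerate(BASES)
}

nnk = [b1 + b2 + b3 for b1 in "ACGT" for b2 in "ACGT" for b3 in "GT"]
encoded = {CODON_TABLE[c] for c in nnk}

assert len(nnk) == 32                        # degenerate NNK library size
assert len(encoded - {"*"}) == 20            # all 20 amino acids covered
assert [c for c in nnk if CODON_TABLE[c] == "*"] == ["TAG"]  # one stop codon
# A nondegenerate approach instead synthesizes exactly 20 codons,
# one per amino acid, removing both the redundancy and the stop codon.
```

The 32-vs-20 gap compounds per randomized position, which is why nondegenerate methods shrink the library a screen must cover.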
Stem cells as biological heart pacemakers.
Gepstein, Lior
2005-12-01
Abnormalities in the pacemaker function of the heart or in cardiac impulse conduction may result in the appearance of a slow heart rate, traditionally requiring the implantation of a permanent electronic pacemaker. In recent years, a number of experimental approaches have been developed in an attempt to generate biological alternatives to implantable electronic devices. These strategies include, initially, a number of gene therapy approaches (aiming to manipulate the expression of ionic currents or their modulators and thereby convert quiescent cardiomyocytes into pacemaking cells) and, more recently, the use of cell therapy and tissue engineering. The latter approach explored the possibility of grafting pacemaking cells, either derived directly during the differentiation of human embryonic stem cells or engineered from mesenchymal stem cells, into the myocardium. This review will describe each of these approaches, focusing mainly on the stem cell strategies, their possible advantages and shortcomings, as well as the avenues required to make biological pacemaking a clinical reality.
Biology-inspired Architecture for Situation Management
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng
2006-01-01
Situation Management is a rapidly developing science combining new techniques for data collection with advanced methods of data fusion to facilitate the process leading to correct decisions prescribing action. Current research focuses on reducing increasing amounts of diverse data to knowledge used by decision makers and on reducing time between observations, decisions and actions. No new technology is more promising for increasing the diversity and fidelity of observations than sensor networks. However, current research on sensor networks concentrates on a centralized network architecture. We believe this trend will not realize the full potential of situation management. We propose a new architecture modeled after biological ecosystems where motes are autonomous and intelligent, yet cooperate with local neighborhoods. Providing a layered approach, they sense and act independently when possible, and cooperate with neighborhoods when necessary. The combination of their local actions results in global effects. While situation management research is currently dominated by military applications, advances envisioned for industrial and business applications have similar requirements. NASA has requirements for intelligent and autonomous systems in future missions that can benefit from advances in situation management. We describe requirements for the Integrated Vehicle Health Management program where our biology-inspired architecture provides a layered approach and decisions can be made at the proper level to improve safety, reduce costs, and improve efficiency in making diagnostic and prognostic assessments of the structural integrity, aerodynamic characteristics, and operation of aircraft.
Design, Analysis and Testing of a PRSEUS Pressure Cube to Investigate Assembly Joints
NASA Technical Reports Server (NTRS)
Yovanof, Nicolette; Lovejoy, Andrew E.; Baraja, Jaime; Gould, Kevin
2012-01-01
Because of its potential to significantly increase fuel efficiency, the hybrid wing body (HWB) aircraft is the current focus of NASA's Environmentally Responsible Aviation Program. Due to the complex load condition that exists in HWB structure, as compared to traditional aircraft configurations, lightweight, cost-effective and manufacturable structural concepts are required to enable the HWB. The Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) concept is one such structural concept. A building block approach for technology development of the PRSEUS concept is being conducted. As part of this approach, a PRSEUS pressure cube was developed as a risk reduction test article to examine a new integral cap joint concept. This paper describes the design, analysis and testing of the PRSEUS pressure cube test article. The pressure cube was required to withstand a 2P (18.4 psi) overpressure load requirement. The pristine pressure cube was tested to 2.2P with no catastrophic failure. After the addition of barely visible impact damage, the cube was pressure loaded to 48 psi, where catastrophic failure occurred, meeting the scale-up requirement. Comparison of pretest and posttest analyses with the cube test response agrees well and indicates that current analysis methods can be used to accurately analyze PRSEUS structure for initial failure response.
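The margins quoted above can be checked with simple arithmetic:

```python
# Worked arithmetic for the pressure margins quoted in the abstract.
two_P = 18.4                  # psi, the 2P overpressure requirement
P = two_P / 2                 # design pressure, psi
assert P == 9.2

failure_psi = 48.0            # catastrophic failure after impact damage
failure_factor = failure_psi / P
assert abs(failure_factor - 5.217) < 0.001   # roughly 5.2P, well above 2P
```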
Energy balance framework for Net Zero Energy buildings
Approaching a Net Zero Energy (NZE) building goal based on current definitions is flawed for two principal reasons - they only deal with energy quantities required for operations, and they do not establish a threshold, which ensures that buildings are optimized for reduced consum...
An approach to the origin of self-replicating system. I - Intermolecular interactions
NASA Technical Reports Server (NTRS)
Macelroy, R. D.; Coeckelenbergh, Y.; Rein, R.
1978-01-01
The present paper deals with the characteristics and potentialities of a recently developed computer-based molecular modeling system. Some characteristics of current coding systems are examined and are extrapolated to the apparent requirements of primitive prebiological coding systems.
Read-across predictions require high quality measured data for source analogues. These data are typically retrieved from structured databases, but biomedical literature data are often untapped because current literature mining approaches are resource intensive. Our high-throughpu...
Is Special Education Certification a Guarantee of Teaching Excellence?
ERIC Educational Resources Information Center
Maple, Cathe Cross
1983-01-01
Based on experiences in Kansas, the problems discussed include: discrepancies between competency-based teacher education and current certification practices; categorical approaches to training and certification; reciprocal agreements for coursework and certification requirements; and the supply/demand of teachers. Possible solutions cited include…
Better understanding of toxicological mechanisms, enhanced testing capabilities, and demands for more sophisticated data for safety and health risk assessment have generated international interest in improving the current testing paradigm for agricultural chemicals. To address th...
Current Approaches to Bone Tissue Engineering: The Interface between Biology and Engineering.
Li, Jiao Jiao; Ebied, Mohamed; Xu, Jen; Zreiqat, Hala
2018-03-01
The successful regeneration of bone tissue to replace areas of bone loss in large defects or at load-bearing sites remains a significant clinical challenge. Over the past few decades, major progress has been achieved in the field of bone tissue engineering to provide alternative therapies, particularly through approaches that are at the interface of biology and engineering. To satisfy the diverse regenerative requirements of bone tissue, the field is moving toward highly integrated approaches incorporating the knowledge and techniques of multiple disciplines, and typically involves the use of biomaterials as an essential element for supporting or inducing bone regeneration. This review summarizes the types of approaches currently used in bone tissue engineering, beginning with those primarily based on biology or engineering, and moving into integrated approaches in the areas of biomaterial development, biomimetic design, and scalable methods for treating large or load-bearing bone defects, while highlighting potential areas for collaboration and providing an outlook on future developments. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Physics-of-Failure Approach to Prognostics
NASA Technical Reports Server (NTRS)
Kulkarni, Chetan S.
2017-01-01
As electric vehicles progressively become part of daily operation, a very critical challenge lies in the accurate prediction of the behavior of the electrical components present in the system. In the case of electric vehicles, computing the remaining battery charge is safety-critical. In order to tackle and solve the prediction problem, it is essential to have awareness of the current state and health of the system, especially since it is necessary to perform condition-based predictions. To be able to predict the future state of the system, it is also required to possess knowledge of the current and future operations of the vehicle. In this presentation, our approach to developing a system-level health-monitoring safety indicator for different electronic components is presented; it runs estimation and prediction algorithms to determine state of charge and estimate the remaining useful life of the respective components. Given models of the current and future system behavior, the general approach of model-based prognostics can be employed as a solution to the prediction problem and, further, for decision making.
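As a toy stand-in for the estimation-and-prediction pipeline described (not NASA's actual algorithm), a Coulomb-counting sketch that turns a state-of-charge estimate and an assumed constant future load into an end-of-discharge prediction:

```python
# Minimal model-based prognostics sketch: given an estimated state of
# charge and an assumed constant future load, predict hours until the
# end-of-discharge threshold is reached. All values are hypothetical.

def predict_eod(capacity_ah, soc, current_a, eod_soc=0.2):
    """Hours until end-of-discharge at a constant current draw."""
    usable_ah = (soc - eod_soc) * capacity_ah   # charge above the cutoff
    return max(usable_ah, 0.0) / current_a

# 100 Ah pack at 70% charge, 10 A constant load, 20% cutoff -> about 5 h
remaining_h = predict_eod(100.0, 0.7, 10.0)
assert abs(remaining_h - 5.0) < 1e-9
```

A real prognoser replaces the constant-load assumption with predicted future usage and carries uncertainty through the prediction, but the structure (state estimate in, remaining-life distribution out) is the same.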
Designing eHealth that Matters via a Multidisciplinary Requirements Development Approach
Wentzel, Jobke; Van Gemert-Pijnen, Julia EWC
2013-01-01
Background Requirements development is a crucial part of eHealth design. It entails all the activities devoted to requirements identification, the communication of requirements to other developers, and their evaluation. Currently, a requirements development approach geared towards the specifics of the eHealth domain is lacking. This is likely to result in a mismatch between the developed technology and end user characteristics, physical surroundings, and the organizational context of use. It also makes it hard to judge the quality of eHealth design, since it makes it difficult to gear evaluations of eHealth to the main goals it is supposed to serve. Objective In order to facilitate the creation of eHealth that matters, we present a practical, multidisciplinary requirements development approach which is embedded in a holistic design approach for eHealth (the Center for eHealth Research roadmap) that incorporates both human-centered design and business modeling. Methods Our requirements development approach consists of five phases. In the first, preparatory, phase the project team is composed and the overall goal(s) of the eHealth intervention are decided upon. Second, primary end users and other stakeholders are identified by means of audience segmentation techniques and our stakeholder identification method. Third, the designated context of use is mapped and end users are profiled by means of requirements elicitation methods (eg, interviews, focus groups, or observations). Fourth, stakeholder values and eHealth intervention requirements are distilled from data transcripts, which leads to phase five, in which requirements are communicated to other developers using a requirements notation template we developed specifically for the context of eHealth technologies. 
Results The end result of our requirements development approach for eHealth interventions is a design document which includes functional and non-functional requirements, a list of stakeholder values, and end user profiles in the form of personas (fictitious end users, representative of a primary end user group). Conclusions The requirements development approach presented in this article enables eHealth developers to apply a systematic and multidisciplinary approach to the creation of requirements. The cooperation between health, engineering, and social sciences creates a situation in which a mismatch between design, end users, and the organizational context can be avoided. Furthermore, we suggest evaluating eHealth on a feature-specific level in order to learn exactly why such a technology does or does not live up to its expectations. PMID:23796508
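The kind of structured requirements notation the approach calls for might be captured in a small data structure. The attribute names here are assumptions for illustration, not the authors' actual template:

```python
# Hypothetical data structure for an eHealth requirement record linking
# functional/non-functional requirements to stakeholder values and personas.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    description: str
    kind: str                      # "functional" or "non-functional"
    stakeholder_values: list = field(default_factory=list)
    personas: list = field(default_factory=list)   # linked end user profiles

r = Requirement("R1", "Nurses can log wound photos offline", "functional",
                stakeholder_values=["continuity of care"],
                personas=["Nurse Anna (primary end user)"])
assert r.kind == "functional" and r.personas
```

Keeping values and personas attached to each requirement is what lets a later evaluation trace a feature back to the goal it was supposed to serve.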
Gaussian Processes for Data-Efficient Learning in Robotics and Control.
Deisenroth, Marc Peter; Fox, Dieter; Rasmussen, Carl Edward
2015-02-01
Autonomous learning has been a promising direction in control and robotics for more than a decade, since data-driven learning reduces the amount of engineering knowledge that is otherwise required. However, autonomous reinforcement learning (RL) approaches typically require many interactions with the system to learn controllers, which is a practical limitation in real systems, such as robots, where many interactions can be impractical and time consuming. To address this problem, current learning approaches typically require task-specific knowledge in the form of expert demonstrations, realistic simulators, pre-shaped policies, or specific knowledge about the underlying dynamics. In this paper, we follow a different approach and speed up learning by extracting more information from data. In particular, we learn a probabilistic, non-parametric Gaussian process transition model of the system. By explicitly incorporating model uncertainty into long-term planning and controller learning, our approach reduces the effects of model errors, a key problem in model-based learning. Compared to state-of-the-art RL, our model-based policy search method achieves an unprecedented speed of learning. We demonstrate its applicability to autonomous learning in real robot and control tasks.
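The core ingredient, a probabilistic transition model, can be sketched in a few lines of Gaussian process regression. This toy version fixes the RBF kernel hyperparameters rather than optimizing them, and learns the one-step dynamics of an assumed linear system:

```python
import numpy as np

def gp_predict(X, y, Xs, ell=1.0, sf2=1.0, sn2=1e-4):
    """GP posterior mean and variance at test inputs Xs (RBF kernel)."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sf2 * np.exp(-0.5 * d2 / ell**2)
    K = k(X, X) + sn2 * np.eye(len(X))        # training covariance + noise
    Ks = k(Xs, X)                             # test/train cross-covariance
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    var = sf2 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mean, var

# One-step dynamics data from a noise-free toy system: x_{t+1} = 0.9 * x_t
X = np.linspace(-1, 1, 20).reshape(-1, 1)
y = 0.9 * X.ravel()
mean, var = gp_predict(X, y, np.array([[0.5]]))
assert abs(mean[0] - 0.45) < 0.01    # prediction close to the true dynamics
assert var[0] < 0.01                 # confident inside the data region
```

The posterior variance is the piece the paper exploits: propagating it through long-term planning is what keeps model errors from being treated as truth.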
FRED: an innovative approach to nursing home level-of-care assignments.
Morris, J N; Sherwood, S; May, M I; Bernstein, E
1987-04-01
A clear need currently exists to consider new approaches for classifying nursing home residents. The traditional intermediate care facility/skilled nursing facility (ICF/SNF) dichotomy cannot provide adequate information on the type of care required by any one individual, and it provides only the most limited information required to address the care and quality-of-life needs of the total patient population within a facility, as well as the level of reimbursement appropriate for their care. This article describes an alternative procedure for allocating nursing home residents according to a more comprehensive array of internally homogeneous categories. This system is based on an operational perspective focused on the total nursing and staffing requirements for types of nursing home residents. The tool is titled "Functionally Ranked Explanatory Designations," or FRED.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
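For scenario-describable systems of the kind the authors target, the requirements-to-model step can be caricatured as compiling scenarios into a checkable transition table. This toy sketch is an illustration of the idea, not the authors' formal method, and the scenario names are invented:

```python
# Toy "requirements to model" step: a finite set of scenario rules
# (state, event, next state) compiled into a transition table that can be
# checked for determinism and then executed or used for code generation.

scenarios = [
    ("idle", "start_command", "sensing"),
    ("sensing", "anomaly_detected", "safe_mode"),
    ("sensing", "stop_command", "idle"),
]

def compile_model(rules):
    model = {}
    for state, event, nxt in rules:
        if (state, event) in model:           # conflicting scenarios
            raise ValueError("nondeterministic scenario set")
        model[(state, event)] = nxt
    return model

def run(model, start, events):
    state = start
    for e in events:
        state = model[(state, e)]             # KeyError = unspecified scenario
    return state

m = compile_model(scenarios)
assert run(m, "idle", ["start_command", "anomaly_detected"]) == "safe_mode"
```

The compile step is where a real method would carry a proof that the model is equivalent to the customer's scenarios; here it only checks consistency.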
New technologies for HWIL testing of WFOV, large-format FPA sensor systems
NASA Astrophysics Data System (ADS)
Fink, Christopher
2016-05-01
Advancements in FPA density and associated wide-field-of-view infrared sensors (>=4000x4000 detectors) have outpaced the current-art HWIL technology. Whether testing in optical projection or digital signal injection modes, current-art technologies for infrared scene projection, digital injection interfaces, and scene generation systems simply lack the required resolution and bandwidth. For example, the L3 Cincinnati Electronics ultra-high resolution MWIR camera deployed in some UAV reconnaissance systems features 16MP resolution at 60Hz, while the current upper limit of IR emitter arrays is ~1MP, and single-channel dual-link DVI throughput of COTS graphics cards is limited to 2560x1580 pixels at 60Hz. Moreover, there are significant challenges in real-time, closed-loop, physics-based IR scene generation for large-format FPAs, including the size and spatial detail required for very large area terrains, and multi-channel low-latency synchronization to achieve the required bandwidth. In this paper, the author's team presents some of their ongoing research and technical approaches toward HWIL testing of large-format FPAs with wide-FOV optics. One approach presented is a hybrid projection/injection design, where digital signal injection is used to augment the resolution of current-art IRSPs, utilizing a multi-channel, high-fidelity physics-based IR scene simulator in conjunction with a novel image composition hardware unit, to allow projection in the foveal region of the sensor, while non-foveal regions of the sensor array are simultaneously stimulated via direct injection into the post-detector electronics.
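The resolution/bandwidth gap described above can be quantified with back-of-the-envelope pixel-rate arithmetic. The sensor and link figures are taken from the abstract; the channel-count conclusion is an illustration, not a design claim:

```python
# Pixel-rate arithmetic for the sensor/link mismatch described above.
sensor_px = 4096 * 4096            # a 16 MP large-format FPA
sensor_rate = sensor_px * 60       # pixels/s at 60 Hz

dvi_px = 2560 * 1580               # per-channel dual-link DVI frame (from the text)
dvi_rate = dvi_px * 60             # pixels/s per channel at 60 Hz

channels = -(-sensor_rate // dvi_rate)   # ceiling division
assert channels == 5                     # >= 5 such links just to keep up
```

This is before accounting for latency and synchronization, which is why the hybrid foveal projection/injection split is attractive: only the foveal region needs the projector's full fidelity.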
Modeling Requirements for Cohort and Register IT.
Stäubert, Sebastian; Weber, Ulrike; Michalik, Claudia; Dress, Jochen; Ngouongo, Sylvie; Stausberg, Jürgen; Winter, Alfred
2016-01-01
The project KoRegIT (funded by TMF e.V.) aimed to develop a generic catalog of requirements for research networks such as cohort studies and registers (KoReg). The catalog supports such research networks in building up and managing their organizational and IT infrastructure. The goal was to make transparent the complex relationships between requirements, which are described in use cases in a given text catalog; by analyzing and modeling the requirements, a better understanding and optimization of the catalog were intended. There were two subgoals: a) to investigate one cohort study and two registers and to model the current state of their IT infrastructure; b) to analyze the current-state models and to find simplifications within the generic catalog. Processing of the generic catalog was performed by means of text extraction, conceptualization and concept mapping. Methods of enterprise architecture planning (EAP) were then used to model the extracted information. For subgoal a), questionnaires were developed utilizing the model; they were used for semi-structured interviews, whose results were evaluated via qualitative content analysis, after which the current state was modeled. Subgoal b) was addressed by model analysis. A given generic text catalog of requirements was thus transferred into a model. As the result of subgoal a), current-state models of one existing cohort study and two registers were created and analyzed. An optimized model, called the KoReg reference model, is the result of subgoal b). It is possible to use methods of EAP to model requirements. This enables a better overview of the partly connected requirements by means of visualization. The model-based approach also enables the analysis and comparison of the empirical data from the current-state models. Information managers could reduce the effort of planning the IT infrastructure by utilizing the KoReg reference model. Modeling of the current state and the generation of reports from the model, which could be used as a requirements specification for bids, are supported as well.
Virtual reality: past, present and future.
Gobbetti, E; Scateni, R
1998-01-01
This report provides a short survey of the field of virtual reality, highlighting application domains, technological requirements, and currently available solutions. The report is organized as follows: section 1 presents the background and motivation of virtual environment research and identifies typical application domains, section 2 discusses the characteristics a virtual reality system must have in order to exploit the perceptual and spatial skills of users, section 3 surveys current input/output devices for virtual reality, section 4 surveys current software approaches to support the creation of virtual reality systems, and section 5 summarizes the report.
Assisted Writing in Spin Transfer Torque Magnetic Tunnel Junctions
NASA Astrophysics Data System (ADS)
Ganguly, Samiran; Ahmed, Zeeshan; Datta, Supriyo; Marinero, Ernesto E.
2015-03-01
Spin transfer torque driven MRAM devices are now in an advanced state of development, and the importance of reducing the current requirement for writing information is well recognized. Different approaches to assist the writing process have been proposed, such as spin orbit torque, spin Hall effect, voltage controlled magnetic anisotropy and thermal excitation. In this work, we report on our comparative study, using the Spin-Circuit Approach, of the total energy, the switching speed and the energy-delay products for different assisted writing approaches in STT-MTJ devices using PMA magnets.
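The energy-delay-product comparison can be sketched as simple Joule-heating arithmetic. All device values below are hypothetical, chosen only to show how a lower write current reduces both metrics:

```python
# Energy-delay-product sketch for comparing write-assist schemes.
# Device numbers are hypothetical, not from the paper.

def write_metrics(current_a, resistance_ohm, switch_time_s):
    energy = current_a**2 * resistance_ohm * switch_time_s   # Joule heating, J
    return energy, energy * switch_time_s                    # (E, EDP)

e_stt, edp_stt = write_metrics(100e-6, 1e3, 10e-9)     # unassisted STT write
e_asst, edp_asst = write_metrics(40e-6, 1e3, 10e-9)    # assisted, lower current
assert e_asst < e_stt and edp_asst < edp_stt           # quadratic win in current
```

Because the energy scales with the square of the write current, even a modest current reduction from an assist mechanism pays off disproportionately, which is the motivation behind the comparisons in the study.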
A Hybrid Satellite-Terrestrial Approach to Aeronautical Communication Networks
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.; Chomos, Gerald J.; Griner, James H.; Mainger, Steven W.; Martzaklis, Konstantinos S.; Kachmar, Brian A.
2000-01-01
Rapid growth in air travel has been projected to continue for the foreseeable future. To maintain a safe and efficient national and global aviation system, significant advances in communications systems supporting aviation are required. Satellites will increasingly play a critical role in the aeronautical communications network. At the same time, current ground-based communications links, primarily very high frequency (VHF), will continue to be employed due to cost advantages and legacy issues. Hence a hybrid satellite-terrestrial network, or group of networks, will emerge. The increased complexity of future aeronautical communications networks dictates that system-level modeling be employed to obtain an optimal system fulfilling a majority of user needs. The NASA Glenn Research Center is investigating the current and potential future state of aeronautical communications, and is developing a simulation and modeling program to research future communications architectures for national and global aeronautical needs. This paper describes the primary requirements, the current infrastructure, and emerging trends of aeronautical communications, including a growing role for satellite communications. The need for a hybrid communications system architecture approach including both satellite and ground-based communications links is explained. Future aeronautical communication network topologies and key issues in simulation and modeling of future aeronautical communications systems are described.
Towards tailored and targeted adherence assessment to optimise asthma management
van Boven, Job FM; Trappenburg, Jaap CA; van der Molen, Thys; Chavannes, Niels H
2015-01-01
In this paper, we aim to emphasise the need for a more comprehensive and tailored approach to manage the broad nature of non-adherence, to personalise current asthma management. Although currently several methods are available to measure the extent of asthma patients’ adherence, the vast majority do not incorporate confirmation of the actual inhalation, dose and inhalation technique. Moreover, most current measures lack detailed information on the individual consequences of non-adherence and on when and how to take action if non-adherence is identified. Notably, one has to realise there are several forms of non-adherence (erratic non-adherence, intelligent non-adherence and unwitting non-adherence), each requiring a different approach. To improve asthma management, more accurate methods are needed that integrate measures of non-adherence, asthma disease control and patient preferences. Integrating information from the latest inhaler devices and patient-reported outcomes using mobile monitoring and feedback systems (‘mHealth’) is considered a promising strategy, but requires careful implementation. Key issues to be considered before large-scale implementation include patient preferences, large heterogeneity in patient and disease characteristics, economic consequences, and long-term persistence with new digital technologies. PMID:26181850
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendell, Mark J.
This report briefly summarizes, based on recent review articles and selected more recent research reports, current scientific knowledge on two topics: assessing unhealthy levels of indoor dampness and mold (D/M) in homes, and remediating home dampness-related problems to protect health. Based on a comparison of current scientific knowledge to that required to support effective, evidence-based, health-protective policies on home D/M, gaps in knowledge are highlighted, prior questions and research questions are specified, and necessary research activities and approaches are recommended.
Redefining cancer: a new paradigm for better and faster treatment innovation.
Stewart, David J; Batist, Gerald
2014-01-01
Common cancers may arise from several different mutations, and each causative mutation may require different treatment approaches. There are also several mechanisms by which malignancies may become resistant to therapy, and each mechanism will also require a different therapeutic strategy. Hence, the paradigm of devising therapies based on tumor type is suboptimal. Each common malignancy may now be regarded as a collection of morphologically similar but molecularly distinct orphan diseases, each requiring unique approaches. Current strategies that employ randomized clinical trials (RCTs) in unselected patients carry a high risk of misleading results. Available data suggest that it is reasonable to grant marketing approval for new anticancer agents based solely on high single-agent response rates in small phase I-II studies involving molecularly-defined patient groups where benefit from other therapies is unlikely. This could markedly speed patient access to important therapies while reducing health care costs by slashing drug development costs. Feasible post-approval surveillance procedures could provide ongoing monitoring of drug safety. While assessment of drug combinations would be more complex due to variable contributions of each component, new strategies have been proposed. In addition to savings from more efficient clinical trials methods, it is essential that we also markedly reduce costs of complying with clinical research regulations. Compliance is too cumbersome and expensive, and current regulatory inflexibility markedly slows progress while escalating health care costs. This requires urgent attention. Regulatory approaches intended to enhance safety may instead potentially cost far more life-years than they save by delaying approval of effective therapies.
Treatment options for moderate-to-very severe chronic obstructive pulmonary disease.
Cazzola, Mario; Rogliani, Paola; Ora, Josuel; Matera, Maria Gabriella
2016-01-01
The appropriate drug management of COPD is still based on the use of bronchodilators, possibly associated with an anti-inflammatory agent. However, there are still fundamental questions that require clarification to optimise their use, and major unmet clinical needs that must be addressed. The advances obtained with the currently consolidated pharmacological options, and the different approaches often used in an attempt to respond to unmet therapeutic needs, are reviewed. Expert opinion: In view of the unsatisfactory status of current treatments for COPD, there is an urgent need for alternative and more effective therapeutic approaches that will help to relieve patient symptoms and affect the natural course of COPD, inhibiting chronic inflammation and reversing the disease process or preventing its progression. However, new pharmacologic options have proved difficult to develop, so it is mandatory to optimise the use of the treatment options at our disposal. Fundamental questions regarding their use, including the step-up and step-down pharmacological approach, still require clarification. It is likely that phenotyping COPD patients would help in identifying the right treatment for each patient and improve the effectiveness of therapies.
Modular techniques for dynamic fault-tree analysis
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Dugan, Joanne B.
1992-01-01
It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
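The coexistence of Markov and combinatoric modules rests on solving each static module combinatorially and feeding its result into the parent tree as a single pseudo-event. A minimal sketch of the combinatorial side only, with invented gate structure and failure probabilities (the sequence-dependent modules would go to a Markov solver instead):

```python
# Hypothetical illustration: combinatorial evaluation of an independent
# fault-tree module. Event names and probabilities are invented.

def p_and(*probs):
    """Probability that all independent basic events fail (AND gate)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    """Probability that at least one independent basic event fails (OR gate)."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Static module solved combinatorially; its result enters the parent tree as
# one pseudo-event, keeping the Markov chain needed elsewhere small.
power_a, power_b, controller = 0.01, 0.01, 0.001
module_prob = p_or(p_and(power_a, power_b), controller)
print(round(module_prob, 6))  # 0.0011
```

The same modularization is what reduces truncation: only the truly sequence-dependent submodules pay the state-space cost of a Markov solution.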
DOT National Transportation Integrated Search
2015-06-01
Diverse states like Virginia, with a mix of urban, suburban, and rural environments and transportation systems, cannot rely on a single approach to increasing transportation sustainability, but require an understanding of what has worked and what mig...
The Required Business Computing Course: Peer-Group Learning with a Managerial Emphasis
ERIC Educational Resources Information Center
Schonberger, Richard J.; Franz, Lori
1978-01-01
This paper describes course design aimed at catching up with current business practice. Discussion concerns differences between old and new pedagogies, as well as obstacles in the way of adopting the new approach, particularly in the introductory course. (Author/IRT)
NASA Technical Reports Server (NTRS)
Lee, S. C.; Lollar, Louis F.
1988-01-01
The overall approach currently being taken in the development of AMPERES (Autonomously Managed Power System Extendable Real-time Expert System), a knowledge-based expert system for fault monitoring and diagnosis of space power systems, is discussed. The system architecture, knowledge representation, and fault monitoring and diagnosis strategy are examined. A 'component-centered' approach developed in this project is described. Critical issues requiring further study are identified.
NASA Astrophysics Data System (ADS)
Isern-Fontanet, Jordi; Ballabrera-Poy, Joaquim; Turiel, Antonio; García-Ladona, Emilio
2017-10-01
Ocean currents play a key role in Earth's climate - they impact almost any process taking place in the ocean and are of major importance for navigation and human activities at sea. Nevertheless, their observation and forecasting are still difficult. First, no observing system is able to provide direct measurements of global ocean currents on synoptic scales. Consequently, it has been necessary to use sea surface height and sea surface temperature measurements and refer to dynamical frameworks to derive the velocity field. Second, the assimilation of the velocity field into numerical models of ocean circulation is difficult mainly due to lack of data. Recent experiments that assimilate coastal-based radar data have shown that ocean currents will contribute to increasing the forecast skill of surface currents, but require application in multidata assimilation approaches to better identify the thermohaline structure of the ocean. In this paper we review the current knowledge in these fields and provide a global and systematic view of the technologies to retrieve ocean velocities in the upper ocean and the available approaches to assimilate this information into ocean models.
Managing Complex IT Security Processes with Value Based Measures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali
2009-01-01
Current trends indicate that IT security measures will need to greatly expand to counter the increasingly sophisticated, well-funded and/or economically motivated threat space. Traditional risk management approaches provide an effective method for guiding courses of action for assessment and mitigation investments. However, such approaches, no matter how popular, demand very detailed knowledge about the IT security domain and the enterprise/cyber architectural context. Typically, the critical nature and/or high stakes require careful consideration and adaptation of a balanced approach that provides reliable and consistent methods for rating vulnerabilities. As reported in earlier works, the Cyberspace Security Econometrics System provides a more comprehensive measure of the reliability, security and safety of a system, one that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. This paper advocates a dependability measure that acknowledges the aggregate structure of complex system specifications, and accounts for variations by stakeholder, by specification component, and by verification and validation impact.
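The stake-weighted idea described above can be illustrated with a toy computation; the stakeholder names, stakes, and failure probabilities below are invented, and this is only a sketch in the spirit of the Cyberspace Security Econometrics System, not its actual formulation:

```python
# Illustrative sketch (all names and numbers invented): a stake-weighted
# dependability measure, where each stakeholder's loss from a requirement
# violation is weighted by that requirement's estimated failure likelihood.

stakeholders = ["operator", "customer"]
requirements = ["confidentiality", "availability"]

# stakes[i][j]: cost to stakeholder i if requirement j is violated.
stakes = [[100.0, 50.0],
          [20.0, 200.0]]

# p_fail[j]: estimated probability that requirement j is violated per unit time.
p_fail = [0.001, 0.01]

# Mean failure cost per stakeholder: each stakes row dotted with p_fail.
mean_failure_cost = [sum(s * p for s, p in zip(row, p_fail)) for row in stakes]
print(dict(zip(stakeholders, mean_failure_cost)))
```

The measure varies by stakeholder and by specification component exactly because the stakes matrix, not a single scalar risk score, carries the structure.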
MONTANO, Diego
2016-01-01
The present study proposes a set of quality requirements to management practices by taking into account the empirical evidence on their potential effects on health, the systemic nature of social organisations, and the current conceptualisations of management functions within the framework of comprehensive quality management systems. Systematic reviews and meta-analyses focusing on the associations between leadership and/or supervision and health in occupational settings are evaluated, and the core elements of an ISO 9001 standardisation approach are presented. Six major occupational health requirements to high-quality management practices are identified pertaining to communication processes, organisational justice, role clarity, decision making, social influence processes and management support. It is concluded that the quality of management practices may be improved by developing a quality management system of management practices that ensures not only conformity to product but also to occupational safety and health requirements. Further research may evaluate the practicability of the proposed approach. PMID:26860787
What makes a hospital manager competent at the middle and senior levels?
Liang, Zhanming; Leggat, Sandra G; Howard, Peter F; Koh, Lee
2013-11-01
The purpose of this paper is to confirm the core competencies required for middle to senior level managers in Victorian public hospitals in both metropolitan and regional/rural areas. This exploratory mixed-methods study used a three-step approach which included position description content analysis, focus group discussions and online competency verification and identification survey. The study validated a number of key tasks required for senior and middle level hospital managers (levels II, III and IV) and identified and confirmed the essential competencies for completing these key tasks effectively. As a result, six core competencies have been confirmed as common to the II, III and IV management levels in both the Melbourne metropolitan and regional/rural areas. Six core competencies are required for middle to senior level managers in public hospitals which provide guidance to the further development of the competency-based educational approach for training the current management workforce and preparing future health service managers. With the detailed descriptions of the six core competencies, healthcare organisations and training institutions will be able to assess the competency gaps and managerial training needs of current health service managers and develop training programs accordingly.
Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard; de Sousa, Gracinda
2015-04-01
The screening laboratory has a critical role in post-transfusion safety. Its success and efficiency depend on the management system used. Even though European Union directive 2002/98/EC requires a quality management system in blood establishments, its requirements for screening laboratories are generic. Complementary approaches are needed to implement a quality management system focused on screening laboratories. This article briefly discusses current good manufacturing practices and good laboratory practices, as well as trends in quality management system standards. ISO 9001 is widely accepted in some European Union blood establishments as the quality management standard; however, this is not synonymous with its successful application. The ISO "risk-based thinking" is interrelated with the quality risk-management process of the EuBIS "Standards and criteria for the inspection of blood establishments". ISO 15189 should be the next step in the quality assurance of a screening laboratory, since it is focused on the medical laboratory. To standardize quality management systems in blood establishments' screening laboratories, new national and European requirements focused on the technical requirements of ISO 15189 are needed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Current concepts in cleft care: A multicenter analysis.
Thiele, Oliver C; Kreppel, Matthias; Dunsche, Anton; Eckardt, Andre M; Ehrenfeld, Michael; Fleiner, Bernd; Gaßling, Volker; Gehrke, Gerd; Gerressen, Marcus; Gosau, Martin; Gröbe, Alexander; Haßfeld, Stefan; Heiland, Max; Hoffmeister, Bodo; Hölzle, Frank; Klein, Cornelius; Krüger, Maximilian; Kübler, Alexander C; Kübler, Norbert R; Kuttenberger, Johannes J; Landes, Constantin; Lauer, Günter; Martini, Markus; Merholz, Erich T; Mischkowski, Robert A; Al-Nawas, Bilal; Nkenke, Emeka; Piesold, Jörn U; Pradel, Winnie; Rasse, Michael; Rachwalski, Martin; Reich, Rudolf H; Rothamel, Daniel; Rustemeyer, Jan; Scheer, Martin; Schliephake, Henning; Schmelzeisen, Rainer; Schramm, Alexander; Schupp, Wiebke; Spitzer, Wolfgang J; Stocker, Erwin; Stoll, Christian; Terheyden, Hendrik; Voigt, Alexander; Wagner, Wilfried; Weingart, Dieter; Werkmeister, Richard; Wiltfang, Jörg; Ziegler, Christoph M; Zöller, Joachim E
2018-04-01
The current surgical techniques used in cleft repair are well established, but different centers use different approaches. To determine the best treatment for patients, a multi-center comparative study is required. In this study, we surveyed all craniofacial departments registered with the German Society of Maxillofacial Surgery to determine which cleft repair techniques are currently in use. Our findings revealed much variation in cleft repair between different centers. Although most centers did use a two-stage approach, the operative techniques and timing of lip and palate closure were different in every center. This shows that a retrospective comparative analysis of patient outcome between the participating centers is not possible and illustrates the need for prospective comparative studies to establish the optimal technique for reconstructive cleft surgery. Copyright © 2018 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
1976-01-01
The six themes identified by the Workshop have many common navigation, guidance and control needs. All the earth orbit themes have a strong requirement for attitude, figure and stabilization control of large space structures, a requirement not currently being supported. All but the space transportation theme need precision pointing of spacecraft and instruments. In addition, all the themes have requirements for increasingly autonomous operations for such activities as spacecraft and experiment operations, onboard mission modification, rendezvous and docking, spacecraft assembly and maintenance, navigation and guidance, and self-checkout, test and repair. Major new efforts are required to conceptualize new approaches to large space antennas and arrays that are lightweight, readily deployable, and capable of precise attitude and figure control. Conventional approaches offer little hope of meeting these requirements. Functions that can benefit from increasing automation or autonomous operations are listed.
Intelligent fault diagnosis and failure management of flight control actuation systems
NASA Technical Reports Server (NTRS)
Bonnice, William F.; Baker, Walter
1988-01-01
The real-time fault diagnosis and failure management (FDFM) of current operational and experimental dual tandem aircraft flight control system actuators was investigated. Dual tandem actuators were studied because of the active FDFM capability required to manage the redundancy of these actuators. The FDFM methods used on current dual tandem actuators were determined by examining six specific actuators. The FDFM capability on these six actuators was also evaluated. One approach for improving the FDFM capability on dual tandem actuators may be through the application of artificial intelligence (AI) technology. Existing AI approaches and applications of FDFM were examined and evaluated. Based on the general survey of AI FDFM approaches, the potential role of AI technology for real-time actuator FDFM was determined. Finally, FDFM and maintainability improvements for dual tandem actuators were recommended.
Self-Recovery Experiments in Extreme Environments Using a Field Programmable Transistor Array
NASA Technical Reports Server (NTRS)
Stoica, Adrian; Keymeulen, Didier; Arslan, Tughrul; Duong, Vu; Zebulum, Ricardo; Ferguson, Ian; Guo, Xin
2004-01-01
Temperature- and radiation-tolerant electronics, as well as long-life survivability, are key capabilities required for future NASA missions. Current approaches to electronics for extreme environments focus on component-level robustness and hardening. However, current technology can only ensure a very limited lifetime in extreme environments. This paper describes novel experiments that allow adaptive in-situ circuit redesign/reconfiguration during operation in extreme temperature and radiation environments. This technology would complement material/device advancements and increase the capability of missions to survive harsh environments. The approach is demonstrated on a mixed-signal programmable chip (FPTA-2), which recovers functionality at temperatures up to 28 C and at total radiation doses up to 250 kRad.
Beaconless Pointing for Deep-Space Optical Communication
NASA Technical Reports Server (NTRS)
Swank, Aaron J.; Aretskin-Hariton, Eliot; Le, Dzu K.; Sands, Obed S.; Wroblewski, Adam
2016-01-01
Free space optical communication is of interest to NASA as a complement to existing radio frequency communication methods. The potential for an increase in science data return capability over current radio-frequency communications is the primary objective. Deep space optical communication requires laser beam pointing accuracy on the order of a few microradians. The laser beam pointing approach discussed here operates without the aid of a terrestrial uplink beacon. Precision pointing is obtained from an on-board star tracker in combination with inertial rate sensors and an outgoing beam reference vector. The beaconless optical pointing system presented in this work is the current approach for the Integrated Radio and Optical Communication (iROC) project.
A Segmented Ion-Propulsion Engine
NASA Technical Reports Server (NTRS)
Brophy, John R.
1992-01-01
New design approach for high-power (100-kW class or greater) ion engines conceptually divides single engine into combination of smaller discharge chambers integrated to operate as single large engine. Analogous to multicylinder automobile engine, benefits include reduction in required accelerator system span-to-gap ratio for large-area engines, reduction in required hollow-cathode emission current, mitigation of plasma-uniformity problem, increased tolerance to accelerator system faults, and reduction in vacuum-system pumping speed.
NASA Technical Reports Server (NTRS)
Christiansen, Eric
2006-01-01
This paper describes International Space Station (ISS) shielding for micrometeoroid orbital debris (MMOD) protection, requirements for protection, and the technical approach to meeting requirements. Current activities in MMOD protection for ISS will be described, including efforts to augment MMOD protection by adding shields on-orbit. Observed MMOD impacts on ISS elements such as radiators, modules and returned hardware will be described. Comparisons of the observed damage with predicted damage using risk assessment software will be made.
Utilizing inheritance in requirements engineering
NASA Technical Reports Server (NTRS)
Kaindl, Hermann
1994-01-01
The scope of this paper is the utilization of inheritance for requirements specification, i.e., the tasks of analyzing and modeling the domain, as well as forming and defining requirements. Our approach and the tool supporting it are named RETH (Requirements Engineering Through Hypertext). Actually, RETH uses a combination of various technologies, including object-oriented approaches and artificial intelligence (in particular frames). We do not attempt to exclude or replace formal representations, but try to complement and provide means for gradually developing them. Among others, RETH has been applied in the CERN (Conseil Europeen pour la Recherche Nucleaire) Cortex project. While it would be impossible to explain this project in detail here, it should be sufficient to know that it deals with a generic distributed control system. Since this project is not finished yet, it is difficult to state its size precisely. In order to give an idea, its final goal is to substitute the many existing similar control systems at CERN by this generic approach. Currently, RETH is also tested using real-world requirements for the Pastel Mission Planning System at ESOC in Darmstadt. First, we outline how hypertext is integrated into a frame system in our approach. Moreover, the usefulness of inheritance is demonstrated as performed by the tool RETH. We then summarize our experiences of utilizing inheritance in the Cortex project. Lastly, RETH is related to existing work.
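The role of inheritance in such frame-based requirements modeling can be illustrated with a toy example (this is not RETH itself; class and attribute names are invented): a generic requirement frame supplies defaults that specialized requirement classes refine or simply inherit.

```python
# Toy illustration of inheritance in requirements modeling: defaults flow
# down the hierarchy and specializations override only what differs.

class Requirement:
    priority = "medium"  # generic default, inherited unless overridden

    def describe(self):
        return f"{type(self).__name__}(priority={self.priority})"

class SafetyRequirement(Requirement):
    priority = "high"  # specialization overrides the inherited default

class InterlockRequirement(SafetyRequirement):
    pass  # inherits 'high' through the hierarchy without restating it

print(InterlockRequirement().describe())  # InterlockRequirement(priority=high)
```

The payoff mirrors the paper's argument: shared properties are stated once at the most general frame, so gradually refining a specification means adding subclasses rather than re-editing every requirement.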
Low-dielectric constant insulators for future integrated circuits and packages.
Kohl, Paul A
2011-01-01
Future integrated circuits and packages will require extraordinary dielectric materials for interconnects to allow transistor advances to be translated into system-level advances. Exceedingly low-permittivity and low-loss materials are required at every level of the electronic system, from chip-level insulators to packages and printed wiring boards. In this review, the requirements and goals for future insulators are discussed followed by a summary of current state-of-the-art materials and technical approaches. Much work needs to be done for insulating materials and structures to meet future needs.
Generic Engineering Competencies: A Review and Modelling Approach
ERIC Educational Resources Information Center
Male, Sally A.
2010-01-01
This paper puts forward the view that engineering educators have a responsibility to prepare graduates for engineering work and careers. The current literature reveals gaps between the competencies required for engineering work and those developed in engineering education. Generic competencies feature in these competency gaps. Literature suggests…
75 FR 60617 - Review and Approval of Projects
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-01
... cease taking water at given flow levels are in fact abiding by passby limitations. In addition, the... notice than the current contiguous property owner requirement that is based on proximity, not science... conservative management approach helping to ensure that applications are supported by science. Rather than...
Mapping More than Aboriginal Studies: Pedagogy, Professional Practice and Knowledge
ERIC Educational Resources Information Center
Norman, Heidi
2014-01-01
As undergraduate curriculum is increasingly required to meet a range of intellectual, professional practice and personal learning outcomes, what purpose does Australian Aboriginal Studies have in curriculum? Most Australian universities are currently in the process of developing institution-wide approaches to Indigenous Australian content in…
State-of-Science Approaches to Determine Sensitive Taxa for Water Quality Criteria Derivation
Current Ambient Water Quality Criteria (AWQC) guidelines specify pre-defined taxa diversity requirements, which have limited chemical-specific criteria development in the U.S. to fewer than 100 chemicals. A priori knowledge of sensitive taxa to toxicologically similar groups of che...
Differentiation from First Principles Using Spreadsheets
ERIC Educational Resources Information Center
Lim, Kieran F.
2008-01-01
In the teaching of calculus, the algebraic derivation of the derivative (gradient function) enables the student to obtain an analytic "global" gradient function. However, to the best of this author's knowledge, all current technology-based approaches require the student to obtain the derivative (gradient) at a single point by…
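The numerical idea underlying such technology-based approaches, estimating the gradient at a single point from nearby function values, can be sketched outside a spreadsheet as well; the function and point below are arbitrary examples, not taken from the article:

```python
# Minimal sketch of point-wise numerical differentiation, the counterpart to
# a spreadsheet column of difference quotients with shrinking step size.

def derivative_at(f, x0, h=1e-5):
    """Central-difference estimate of f'(x0); O(h^2) error for smooth f."""
    return (f(x0 + h) - f(x0 - h)) / (2 * h)

# Example: d/dx of x^2 at x = 3 should approach 6 as h shrinks.
est = derivative_at(lambda x: x * x, 3.0)
print(round(est, 6))  # 6.0
```

This is the "local" estimate the article contrasts with the analytic "global" gradient function obtained algebraically.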
Testing the efficacy of eGFP-transformed Aspergillus flavus as biocontrol strains
USDA-ARS?s Scientific Manuscript database
Current biological control methods to prevent pre-harvest aflatoxin contamination of corn, cottonseed, and ground and tree nuts involve field inoculation of non-aflatoxigenic Aspergillus flavus. To date, the efficacy of this approach requires annual reapplication of the biocontrol agent. The reason ...
Psychosocial interventions in patients with dual diagnosis
Subodh, B.N; Sharma, Nidhi; Shah, Raghav
2018-01-01
Management of patients with dual diagnosis (mental illness and substance use disorders) is a challenge. A lack of improvement in either disorder can lead to a relapse in both. The current consensus opinion favours an integrated approach to management of both disorders, wherein the same team of professionals manages both in the same setting. The role of pharmacotherapy for such dual diagnosis patients is well established, but the non-pharmacological approaches to their management are still evolving. After stabilization of the acute phase of illness, non-pharmacological management takes centre stage. Evidence points to the beneficial effect of psychosocial approaches in maintaining abstinence, adherence to medication, maintenance of a healthy lifestyle, better integration into the community, occupational rehabilitation and an overall improvement in functioning. Psychosocial approaches, although beneficial, are difficult to implement. They require teamwork, involving professionals other than psychiatrists and psychologists alone. These approaches need to be comprehensive and individualized, and require a level of training that is difficult to achieve in most Indian settings. In this article we provide a brief review of these approaches. PMID:29540920
NASA Technical Reports Server (NTRS)
Wilson, D. A.
1976-01-01
Specific requirements for a wash/rinse capability to support Spacelab biological experimentation and to identify various concepts for achieving this capability were determined. This included the examination of current state-of-the-art and emerging technology designs that would meet the wash/rinse requirements. Once several concepts were identified, including the disposable utensils, tools and gloves or other possible alternatives, a tradeoff analysis involving system cost, weight, volume utilization, functional performance, maintainability, reliability, power utilization, safety, complexity, etc., was performed so as to determine an optimum approach for achieving a wash/rinse capability to support future space flights. Missions of varying crew size and durations were considered.
Examples of current radar technology and applications, chapter 5, part B
NASA Technical Reports Server (NTRS)
1975-01-01
Basic principles and tradeoff considerations for SLAR are summarized. There are two fundamental types of SLAR sensors available to the remote sensing user: real aperture and synthetic aperture. The primary difference between the two types is that a synthetic aperture system is capable of significant improvements in target resolution but requires equally significant added complexity and cost. The advantages of real aperture SLAR include long range coverage, all-weather operation, in-flight processing and image viewing, and lower cost. The fundamental limitation of the real aperture approach is target resolution. Synthetic aperture processing is the most practical approach for remote sensing problems that require resolution higher than 30 to 40 m.
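The resolution tradeoff described above follows from the standard aperture relations: real-aperture azimuth resolution is roughly the slant range times the antenna beamwidth (about R·λ/D), while a fully focused synthetic aperture achieves roughly D/2 independent of range. A back-of-the-envelope sketch with illustrative numbers, not taken from the chapter:

```python
# Illustrative azimuth-resolution comparison for the two SLAR types.
# Wavelength, antenna length, and range are invented example values.

wavelength = 0.03   # m (X-band, ~10 GHz)
antenna_len = 2.0   # m, physical antenna length D
slant_range = 20e3  # m

real_ap_res = slant_range * wavelength / antenna_len  # ~ R * lambda / D
sar_res = antenna_len / 2.0                           # ~ D / 2, range-independent

print(real_ap_res, sar_res)  # 300.0 1.0
```

The example makes the text's point concrete: at 20 km range the real aperture resolves only hundreds of metres, which is why problems requiring 30 to 40 m resolution or better push users to synthetic aperture processing despite its cost.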
Quantum sequencing: opportunities and challenges
NASA Astrophysics Data System (ADS)
di Ventra, Massimiliano
Personalized or precision medicine refers to the ability to tailor drugs to the specific genome and transcriptome of each individual. It is, however, not yet feasible, due to the high costs and slow speed of present DNA sequencing methods. I will discuss a sequencing protocol that requires the measurement of the distributions of transverse tunneling currents during the translocation of single-stranded DNA into nanochannels. I will show that such a quantum sequencing approach can reach unprecedented speeds, without requiring any chemical preparation, amplification or labeling. I will discuss recent experiments that support these theoretical predictions and the advantages of this approach over other sequencing methods, and stress the challenges that need to be overcome to render it commercially viable.
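The statistical flavor of distribution-based base calling can be sketched with a toy example; the reference values and current samples below are entirely invented and bear no relation to real tunneling-current magnitudes:

```python
# Toy illustration: identify a base from the mean of its transverse
# tunneling-current samples by nearest reference value. Numbers are invented.

reference_means = {"A": 1.0, "C": 2.0, "G": 4.0, "T": 8.0}  # arbitrary units

def call_base(current_samples):
    """Return the base whose reference mean is closest to the sample mean."""
    mean = sum(current_samples) / len(current_samples)
    return min(reference_means, key=lambda b: abs(reference_means[b] - mean))

print(call_base([3.8, 4.1, 4.3]))  # G
```

In practice the protocol compares full distributions rather than means, which is what makes the approach robust to the overlap between single-measurement current values.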
Stoney, David A; Stoney, Paul L
2015-08-01
An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
1978-01-01
Cost-effective approaches for placing automated payloads into circular and elliptical orbits using energy requirements significantly lower than that provided by the smallest currently planned shuttle upper stage, SSUS-D, were investigated. Launch costs were derived using both NASA existing/planned launch approaches and new propulsion concepts meeting low-energy regime requirements. Candidate new propulsion approaches considered were solid (tandem, cluster, and controlled), solid/liquid combinations, and all-liquid stages. Results show that the most economical way to deliver the large majority of the 129 low-energy payloads is a new modular, short liquid-bipropellant stage system. For the remaining payloads, the Shuttle with integral OMS would be used, with the Scout serving a few specialized payloads until the Shuttle becomes operational.
Novel optical strategies for biodetection
NASA Astrophysics Data System (ADS)
Sakamuri, Rama M.; Wolfenden, Mark S.; Anderson, Aaron S.; Swanson, Basil I.; Schmidt, Jurgen S.; Mukundan, Harshini
2013-09-01
Although bio-detection strategies have significantly evolved in the past decade, they still suffer from many disadvantages. First, current approaches still require confirmation of pathogen viability by culture, which is the 'gold-standard' method and can take several days to result. Second, current methods typically target protein and nucleic acid signatures and cannot be applied to other biochemical categories of biomarkers (e.g., lipidated sugars). Lipidated sugars (e.g., lipopolysaccharide, lipoarabinomannan) are bacterial virulence factors that are significant to pathogenicity. Herein, we present two different optical strategies for biodetection to address these two limitations. We have exploited bacterial iron sequestration mechanisms to develop a simple, specific assay for the selective detection of viable bacteria, without the need for culture. We are currently working on the use of this technology for the differential detection of two different bacteria, using siderophores. Second, we have developed a novel strategy termed 'membrane insertion' for the detection of amphiphilic biomarkers (e.g., lipidated glycans) that cannot be detected by conventional approaches. We have extended this technology to the detection of small-molecule amphiphilic virulence factors, such as phenolic glycolipid-1 from leprosy, which could not be directly detected before. Together, these strategies address two critical limitations in current biodetection approaches. We are currently working on the optimization of these methods and their extension to real-world clinical samples.
Advances in chemical labeling of proteins in living cells.
Yan, Qi; Bruchez, Marcel P
2015-04-01
The pursuit of quantitative biological information via imaging requires robust labeling approaches that can be used in multiple applications and with a variety of detectable colors and properties. In addition to conventional fluorescent proteins, chemists and biologists have come together to provide a range of approaches that combine dye chemistry with the convenience of genetic targeting. This hybrid-tagging approach amalgamates the rational design of properties available through synthetic dye chemistry with the robust biological targeting available with genetic encoding. In this review, we discuss the current range of approaches that have been exploited for dye targeting or for targeting and activation and some of the recent applications that are uniquely permitted by these hybrid-tagging approaches.
NASA Technical Reports Server (NTRS)
Rogers, Ralph V.
1992-01-01
This research project addresses the need to provide an efficient and safe mechanism to investigate the effects and requirements of the tiltrotor aircraft's commercial operations on air transportation infrastructures, particularly air traffic control. The mechanism of choice is computer simulation. Unfortunately, the fundamental paradigms of the current air traffic control simulation models do not directly support the broad range of operational options and environments necessary to study tiltrotor operations. Modification of current air traffic simulation models to meet these requirements does not appear viable given the range and complexity of issues needing resolution. As a result, the investigation of systemic, infrastructure issues surrounding the effects of tiltrotor commercial operations requires new approaches to simulation modeling. These models should be based on perspectives and ideas closer to those associated with tiltrotor air traffic operations.
Field of Psychiatry: Current Trends and Future Directions: An Indian Perspective.
Dave, Kishore P
2016-01-01
Attempting to predict the future is dangerous. This is particularly true in medical science, where change is often the result of chance discoveries. Currently, practicing psychiatrists are aware of deficiencies in psychiatric practice. However, we have a number of genuine reasons for optimism and excitement. Genetics, novel treatment approaches, new investigative techniques, large-scale treatment trials, and research in general medicine and neurology will give better insights into psychiatric disorders and their management. Psychiatric services in rural India can be reached by telemedicine. There are some threat perceptions that require solutions and remedies. Subspecialties in psychiatry are the need of the hour, and there is also a requirement for common practice guidelines. The Mental Health Care Bill, 2013, requires suitable amendments before it is passed by the Indian Parliament. Research in psychiatry is yet to be fully developed, as adequate resources are not available.
Alternative approaches to control--quo vadit?
Jackson, Frank; Miller, Jim
2006-07-31
The increasing prevalence of anthelmintic resistance has provided a spur for research into 'alternative/novel' approaches to the control of helminthoses that are intended to reduce our reliance upon using chemoprophylaxis. The different approaches either target the parasite population in the host or on pasture, but the goal of all of them is to restrict host parasite contact to levels which minimise the impact of helminths on host welfare and/or performance. Infrapopulation regulation can be achieved through methods that enhance immunity such as optimised nutrition (immunonutrition), genetic selection and vaccination, or by an 'anthelmintic' route using bioactive forages, copper oxide wire particles, or use of targeted selective treatment strategies such as FAMACHA, which reduce the selection pressure for the development of resistance by maintaining a population in refugia. Suprapopulation control can be achieved through grazing management, or by using predacious fungi such as Duddingtonia flagrans. All of these approaches have been developed beyond the proof of concept stage and some are capable of being employed currently. However, some still require knowledge transfer, or commercialisation before they can be tested and widely applied in the field. All of the different approaches present unique challenges to the researchers engaged in developing them, and in comparison to simple prescriptive anthelmintic treatments, their use appears complex and requires some expertise on behalf of the advisor and/or end user. At present, most of our data are derived from trials using single approaches, but it is apparent that we need to move towards integrating some of these technologies which again represents a further challenge to the extension/advisory services. 
Progress in establishing these different approaches requires funding to support not only their scientific development but also the development of computer-based models that can be used to highlight deficiencies in our understanding of the control mechanisms and to identify impediments to their introduction. It is inevitable that some of the approaches currently under investigation will fail to become widely applied, for a variety of reasons that are not solely financial. These include issues concerned with practicability/applicability, affordability/appropriateness, availability/deliverability and, above all, the failure to provide a consistent, reliable effect when used under commercial farming conditions.
Self-Regulated Learning: Examining the Baccalaureate Millennial Nursing Student's Approach.
Robb, Meigan K
2016-01-01
Pre-licensure baccalaureate nursing programs are facing the demand to retain and graduate students with the skills needed for the complex health care environment. Nursing faculty are challenged to identify the best pedagogical methods for educating the current generation of students. The influence of student-centered approaches is documented in the literature. However, the effective use of these methods requires a collaborative partnership. The cognitive, self-regulated approaches used by millennial nursing students are not well understood. This article describes the findings of a study that examined the relationship between self-regulated approaches to learning, self-efficacy, independent study behaviors, and grade point average.
Automated Power Assessment for Helicopter Turboshaft Engines
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Litt, Jonathan S.
2008-01-01
An accurate indication of available power is required for helicopter mission planning purposes. Available power is currently estimated on U.S. Army Blackhawk helicopters by performing a Maximum Power Check (MPC), a manual procedure performed by maintenance pilots on a periodic basis. The MPC establishes Engine Torque Factor (ETF), an indication of available power. It is desirable to replace the current manual MPC procedure with an automated approach that will enable continuous real-time assessment of available power utilizing normal mission data. This report presents an automated power assessment approach which processes data currently collected within helicopter Health and Usage Monitoring System (HUMS) units. The overall approach consists of: 1) a steady-state data filter which identifies and extracts steady-state operating points within HUMS data sets; 2) engine performance curve trend monitoring and updating; and 3) automated ETF calculation. The algorithm is coded in MATLAB (The MathWorks, Inc.) and currently runs on a PC. Results from the application of this technique to HUMS mission data collected from UH-60L aircraft equipped with T700-GE-701C engines are presented and compared to manually calculated ETF values. Potential future enhancements are discussed.
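The steady-state filtering step (step 1 of the pipeline above) can be sketched as follows. This is a minimal illustration, not the HUMS algorithm itself; the window length and relative tolerance are assumed values, as the report's actual filter criteria are not reproduced here:

```python
import numpy as np

def steady_state_windows(signal, window=30, rel_tol=0.01):
    """Flag windows whose variation is small relative to their mean.

    A run of `window` consecutive samples is treated as a steady-state
    operating point when its standard deviation is below rel_tol times
    the absolute window mean.  Returns (start, stop) index pairs for
    every qualifying window.
    """
    signal = np.asarray(signal, dtype=float)
    hits = []
    for start in range(len(signal) - window + 1):
        seg = signal[start:start + window]
        mean = seg.mean()
        if mean != 0 and seg.std() < rel_tol * abs(mean):
            hits.append((start, start + window))
    return hits

# A torque-like trace: a 50-sample ramp followed by a steady plateau.
trace = np.concatenate([np.linspace(0.0, 100.0, 50), np.full(50, 100.0)])
windows = steady_state_windows(trace)
```

Only windows dominated by the plateau qualify; transient (ramp) data is rejected, which is the role of the filter before any performance-curve trending.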
Study of advanced atmospheric entry systems for Mars
NASA Technical Reports Server (NTRS)
1978-01-01
Entry system designs are described for various advanced Mars missions, including sample return, hard lander, and Mars airplane. The Mars exploration systems for sample return and the hard lander require deceleration from direct-approach entry velocities of about 6 km/s to terminal velocities consistent with surface landing requirements. The Mars airplane entry system is decelerated from orbit at 4.6 km/s to deployment near the surface. Mass performance characteristics are estimated for the major elements of the required entry systems using Viking technology, or logical extensions of that technology, in order to provide a common basis of comparison for the three mission mode approaches. The entry systems, although not optimized, are based on Viking designs and reflect current hardware performance capability and realistic mass relationships.
Generic Sensor Failure Modeling for Cooperative Systems.
Jäger, Georg; Zug, Sebastian; Casimiro, António
2018-03-20
The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application's fault tolerance and thereby promises maintainability of such a system's safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data of a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques.
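The extraction of a failure model from empirical data can be illustrated with a minimal sketch: given paired sensor readings and ground truth, estimate the bias and spread of the measurement error per range bin. The binning scheme, the Gaussian-style (bias, std) summary, and the GP2D12-like numbers are assumptions for illustration, not the paper's mathematically defined model:

```python
import numpy as np

def extract_failure_model(measured, reference, n_bins=5):
    """Derive a range-dependent error model from empirical data.

    Bins the reference (ground-truth) distance and, per bin, estimates
    the bias and standard deviation of the measurement error.  The
    resulting table is a minimal stand-in for a failure model that an
    application can check against its fault-tolerance budget.
    """
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    error = measured - reference
    edges = np.linspace(reference.min(), reference.max(), n_bins + 1)
    model = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (reference >= lo) & (reference <= hi)
        if mask.any():
            model.append({"range": (float(lo), float(hi)),
                          "bias": float(error[mask].mean()),
                          "std": float(error[mask].std())})
    return model

rng = np.random.default_rng(0)
truth = rng.uniform(10, 80, 1000)            # cm, a GP2D12-like working range
noisy = truth + rng.normal(0.5, 2.0, 1000)   # synthetic bias 0.5 cm, sigma 2.0 cm
model = extract_failure_model(noisy, truth)
```

Each bin's recovered bias and spread should track the injected error statistics; a real processing chain would also validate the distributional form rather than assume it.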
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. The innovative, value-based idea is realized on the basis of stakeholder requirements. The quality of a VBS system depends on a concrete set of valuable requirements, and these can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. Existing value-based approaches focus on the design of VBS systems; their focus on valuable stakeholders and requirements is inadequate. Current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems: they are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, in view of these problems, this research contributes a new SIQ framework called 'StakeMeter', which is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure, in contrast to other methods. It addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation, and it helps in the selection of highly critical stakeholders for VBS systems with less judgmental error.
Romero, Javier A; Domínguez, Gabriela A; Anoardo, Esteban
2017-03-01
An important requirement for a gradient coil is that the uniformity of the generated magnetic field gradient should be maximal within the active volume of the coil. For a cylindrical geometry, the radial uniformity of the gradient becomes critical, particularly in cases where the gradient unit has to be designed to fit into the inner bore of a compact magnet of reduced dimensions, like those typically used in fast-field-cycling NMR. In this paper we present two practical solutions aimed at fulfilling this requirement. We propose a matrix-inversion optimization algorithm based on the Biot-Savart law that, through a proper cost function, maximizes the uniformity of the gradient and the power efficiency. The methodology and the simulation code were validated on a single-current design by comparing the computer-simulated field map with experimental data measured on a real prototype. After comparing the obtained results with the target-field approach, a multiple-element coil driven by independent current sources is discussed and a real prototype evaluated. Opposed, equispaced independent windings are connected in pairs, forming an arrangement of independent anti-Helmholtz units. This last coil covers 80% of its radial dimension with a gradient uniformity better than 5%. The design also provides an adaptable region of uniformity along with adjustable coil efficiency. Copyright © 2017 Elsevier Inc. All rights reserved.
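The matrix-inversion idea can be illustrated for the simplest case of on-axis fields: build a Biot-Savart system matrix for a set of coaxial windings and solve for the currents that best reproduce a target linear gradient. The geometry, discretization, and plain least-squares cost below are illustrative assumptions; the paper's actual cost function also balances power efficiency, and its field maps are not restricted to the axis:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def loop_bz(z, z0, radius):
    """On-axis axial field per unit current of a circular loop at z0 (Biot-Savart)."""
    return MU0 * radius**2 / (2.0 * (radius**2 + (z - z0) ** 2) ** 1.5)

# Hypothetical geometry: 8 equispaced coaxial windings of 5 cm radius.
radius = 0.05
z_coils = np.linspace(-0.10, 0.10, 8)
z_target = np.linspace(-0.04, 0.04, 41)   # active volume along the axis
g_target = 0.01                           # desired gradient, T/m
b_target = g_target * z_target

# System matrix: field at each target point per unit current in each winding.
A = np.array([[loop_bz(z, z0, radius) for z0 in z_coils] for z in z_target])

# The "matrix inversion" step as a least-squares solve; a power-efficiency
# penalty could be added via Tikhonov regularization of the currents.
currents, *_ = np.linalg.lstsq(A, b_target, rcond=None)
b_achieved = A @ currents
```

With independent current sources, each element of `currents` can be driven separately, which is what makes the adaptable region of uniformity possible.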
Disclosure: what works now and what can work even better.
2004-01-01
This three-part series on disclosure of unanticipated outcomes in health care is intended to provide an overview of the current thinking about disclosure and steps the organization can take to develop an approach to disclosure that is comprehensive and supportive of the needs of patients, families and providers. What should be apparent is that disclosure is not simply a requirement--it is a philosophy and part of a comprehensive approach to patient/family communication.
Can a Focus on Preventable Events Help Untangle the Quality Measurement Mess?
Miller, Michael
2016-01-01
The success of a shift from paying for volume to paying for value depends on our ability to measure quality. Unfortunately, current approaches to measuring quality and linking quality to payment have frustrated providers and failed to provide essential information to patients. Shifting to a focus on preventable events could go a long way toward clarifying and simplifying quality measurement, but successful adoption of that approach requires overcoming several substantive and political challenges.
Bridging paradigms: hybrid mechanistic-discriminative predictive models.
Doyle, Orla M; Tsaneva-Atansaova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa
2013-03-01
Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.
Richard W. Haynes; Kenneth E. Skog; Richard Aubuchon
2016-01-01
The USDA Forest Service is required to appraise timber prior to it being offered for sale. Currently the Forest Service uses a transaction evidence based approach, but concerns have been raised about availability, both in number and applicability, of timber sales used as the basis of this approach. In addition to the problem of few sales, in certain situations a notable...
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann
1988-01-01
Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.
Braddy, April C; Davit, Barbara M; Stier, Ethan M; Conner, Dale P
2015-01-01
The objective of this article is to discuss the similarities and differences in accepted bioequivalence (BE) approaches for generic topical dermatological drug products between international regulatory authorities and organizations. These drug products are locally applied and not intended for systemic absorption. Therefore, the BE approaches which serve as surrogates to establish safety and efficacy for topical dosage forms tend to differ from those for traditional solid oral dosage forms. We focused on 15 different international jurisdictions and organizations that currently participate in the International Generic Drug Regulators Pilot Project. These are Australia, Brazil, Canada, China, Chinese Taipei, the European Medicines Agency (EMA), Japan, Mexico, New Zealand, Singapore (a member of the Association of Southeast Asian Nations), South Africa, South Korea, Switzerland, the USA and the World Health Organization (WHO). Upon evaluation, we observed that currently only Canada, the EMA, Japan, and the USA have specific guidance documents for topical drug products. Across all jurisdictions and organizations, the three approaches consistently required are (1) BE studies with clinical endpoints for most topical drug products; (2) in vivo pharmacodynamic studies, in particular the vasoconstrictor assay for topical corticosteroids; and (3) waivers from BE study requirements for topical solutions. Japan, South Africa, the USA, and the WHO are also making strides to accept other BE approaches such as in vivo pharmacokinetic studies for BE assessment, in vivo dermatopharmacokinetic studies and/or BE studies with in vitro endpoints.
Kim, Katherine K; Browe, Dennis K; Logan, Holly C; Holm, Roberta; Hack, Lori; Ohno-Machado, Lucila
2014-01-01
There is currently limited information on best practices for the development of governance requirements for distributed research networks (DRNs), an emerging model that promotes clinical data reuse and improves timeliness of comparative effectiveness research. Much of the existing information is based on a single type of stakeholder such as researchers or administrators. This paper reports on a triangulated approach to developing DRN data governance requirements based on a combination of policy analysis with experts, interviews with institutional leaders, and patient focus groups. This approach is illustrated with an example from the Scalable National Network for Effectiveness Research, which resulted in 91 requirements. These requirements were analyzed against the Fair Information Practice Principles (FIPPs) and Health Insurance Portability and Accountability Act (HIPAA) protected versus non-protected health information. The requirements addressed all FIPPs, showing how a DRN's technical infrastructure is able to fulfill HIPAA regulations, protect privacy, and provide a trustworthy platform for research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
NASA Astrophysics Data System (ADS)
Lambert, Jean-Christopher; Bojkov, Bojan
The Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV) is developing a global data quality strategy for the Global Earth Observation System of Systems (GEOSS). In this context, CEOS WGCV elaborated the GEOSS Quality Assurance framework for Earth Observation (QA4EO, http://qa4eo.org). QA4EO encompasses a documentary framework and a set of ten guidelines, which describe the top-level approach of QA activities and the key requirements that drive the QA process. QA4EO is applicable to virtually all Earth Observation data. Calibration and validation activities are a cornerstone of the GEOSS data quality strategy. Proper uncertainty assessment of the satellite measurements and their derived data products is essential, and needs to be continuously monitored and traceable to standards. As a practical application of QA4EO, CEOS WGCV has undertaken to establish a set of best practices, methodologies and guidelines for satellite calibration and validation. The present paper reviews current developments of best practices and guidelines for the validation of atmospheric composition satellites. Aimed as a community effort, the approach is to start with current practices that can be improved over time. The present review addresses current validation capabilities, achievements, caveats, harmonization efforts, and challenges. Terminologies and general principles of validation are recalled. Going beyond elementary definitions of validation, such as the assessment of uncertainties, the specific GEOSS context also requires considering the validation of individual service components and validation against user requirements.
NASA Astrophysics Data System (ADS)
Rastogi, Richa; Srivastava, Abhishek; Khonde, Kiran; Sirasala, Kirannmayi M.; Londhe, Ashutosh; Chavhan, Hitesh
2015-07-01
This paper presents an efficient parallel 3D Kirchhoff depth migration algorithm suitable for the current class of multicore architectures. The fundamental Kirchhoff depth migration algorithm exhibits inherent parallelism; however, when it comes to 3D data migration, the resource requirements of the algorithm grow with the data size. This challenges its practical implementation even on current-generation high performance computing systems, so a smart parallelization approach is essential to handle 3D data for migration. The most compute-intensive part of the Kirchhoff depth migration algorithm is the calculation of traveltime tables, due to its resource requirements such as memory/storage and I/O. In the current research work, we target this area and develop a competent parallel algorithm for post- and prestack 3D Kirchhoff depth migration, using hybrid MPI+OpenMP programming techniques. We introduce a concept of flexi-depth iterations while depth migrating data in parallel imaging space, using optimized traveltime table computations. This concept provides flexibility to the algorithm by migrating data in a number of depth iterations, which depends upon the available node memory and the size of the data to be migrated at runtime. Furthermore, it minimizes the requirements of storage, I/O and inter-node communication, thus making it advantageous over conventional parallelization approaches. The developed parallel algorithm is demonstrated and analysed on Yuva II, a PARAM series supercomputer. Optimization, performance and scalability experiment results along with the migration outcome show the effectiveness of the parallel algorithm.
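The flexi-depth idea of sizing depth iterations to the memory available at runtime can be sketched as follows. The memory figure, safety factor, and volume dimensions are illustrative assumptions, not values from the paper:

```python
def flexi_depth_iterations(nx, ny, nz, bytes_per_sample=4,
                           node_memory_bytes=8 * 2**30, safety=0.5):
    """Pick the fewest depth iterations whose image slab fits in node memory.

    Each iteration migrates one slab of consecutive depth slices; the slab
    thickness is chosen from the memory available at runtime, leaving a
    safety margin for traveltime tables and communication buffers.
    """
    slab_bytes = nx * ny * bytes_per_sample        # one depth slice of the image
    usable = node_memory_bytes * safety
    slices_per_iter = max(1, int(usable // slab_bytes))
    iterations = -(-nz // slices_per_iter)         # ceiling division
    return iterations, min(slices_per_iter, nz)

# A 2000 x 2000 x 1500 image volume on a node with 8 GiB usable RAM.
iters, slices = flexi_depth_iterations(nx=2000, ny=2000, nz=1500)
```

Fewer, thicker slabs reduce the number of passes over the input traces, while thinner slabs fit smaller nodes; the iteration count adapts automatically to whichever node the job lands on.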
Regression Analysis of a Disease Onset Distribution Using Diagnosis Data
Young, Jessica G.; Jewell, Nicholas P.; Samuels, Steven J.
2008-01-01
Summary We consider methods for estimating the effect of a covariate on a disease onset distribution when the observed data structure consists of right-censored data on diagnosis times and current status data on onset times amongst individuals who have not yet been diagnosed. Dunson and Baird (2001, Biometrics 57, 306–403) approached this problem using maximum likelihood, under the assumption that the ratio of the diagnosis and onset distributions is monotonic nondecreasing. As an alternative, we propose a two-step estimator, an extension of the approach of van der Laan, Jewell, and Petersen (1997, Biometrika 84, 539–554) in the single sample setting, which is computationally much simpler and requires no assumptions on this ratio. A simulation study is performed comparing estimates obtained from these two approaches, as well as that from a standard current status analysis that ignores diagnosis data. Results indicate that the Dunson and Baird estimator outperforms the two-step estimator when the monotonicity assumption holds, but the reverse is true when the assumption fails. The simple current status estimator loses only a small amount of precision in comparison to the two-step procedure but requires monitoring time information for all individuals. In the data that motivated this work, a study of uterine fibroids and chemical exposure to dioxin, the monotonicity assumption is seen to fail. Here, the two-step and current status estimators both show no significant association between the level of dioxin exposure and the hazard for onset of uterine fibroids; the two-step estimator of the relative hazard associated with increasing levels of exposure has the least estimated variance amongst the three estimators considered. PMID:17680832
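For context, the standard nonparametric estimator for pure current status data (the third estimator considered, which ignores diagnosis data) reduces to isotonic regression of the status indicators on monitoring time, computable with the pool-adjacent-violators algorithm. The sketch below is this generic estimator, not the authors' two-step procedure; the tiny data set is invented for illustration:

```python
def pava(values):
    """Pool-adjacent-violators: least-squares nondecreasing fit."""
    level, weight, count = [], [], []
    for v in values:
        level.append(float(v)); weight.append(1.0); count.append(1)
        # Merge adjacent blocks while the fitted values would decrease.
        while len(level) > 1 and level[-2] > level[-1]:
            w = weight[-2] + weight[-1]
            level[-2] = (level[-2] * weight[-2] + level[-1] * weight[-1]) / w
            weight[-2] = w
            count[-2] += count[-1]
            level.pop(); weight.pop(); count.pop()
    fit = []
    for lv, c in zip(level, count):
        fit.extend([lv] * c)
    return fit

# Current status data: pairs (monitoring time C, indicator 1{onset <= C}).
# Sorting by C and isotonizing the indicators gives the NPMLE of the onset
# distribution F evaluated at the observed monitoring times.
obs = sorted([(2.0, 1), (1.0, 0), (5.0, 1), (3.0, 0), (4.0, 1)])
f_hat = pava([d for _, d in obs])
```

The violating pair at times 2 and 3 (a positive indicator followed by a negative one) is pooled to a common level, which is exactly how the NPMLE enforces monotonicity of F.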
Supporting Positive Behaviour in Alberta Schools: An Intensive Individualized Approach
ERIC Educational Resources Information Center
Souveny, Dwaine
2008-01-01
Drawing on current research and best practices, this third part of the three-part resource, "Supporting Positive Behaviour in Alberta Schools," provides information and strategies for providing intensive, individualized support and instruction for the small percentage of students requiring a high degree of intervention. This system of…
Memory Span and General Intelligence: A Latent-Variable Approach
ERIC Educational Resources Information Center
Colom, Roberto; Abad, Francisco J.; Rebollo, Irene; Chun Shih, Pei
2005-01-01
There are several studies showing that working memory and intelligence are strongly related. However, working memory tasks require simultaneous processing and storage, so the causes of their relationship with intelligence are currently a matter of discussion. The present study examined the simultaneous relationships among short-term memory (STM),…
DOT National Transportation Integrated Search
2012-03-31
This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...
Using the Scientific Method to Improve Mentoring
ERIC Educational Resources Information Center
McGuire, Saundra Yancy
2007-01-01
Many students who enter colleges and universities seem to be focused on memorizing and regurgitating information rather than on developing critical thinking and problem solving skills. Mentoring is crucial to help these students transition from the current approach to one that will be successful in college. Successful mentoring requires a…
Hassanein, Tarek
2017-04-01
Hepatic encephalopathy is a devastating complication of end-stage liver disease. In its severe grades, it requires intervention beyond the standard medical approaches. In this article we review the role of liver support systems in managing hepatic encephalopathy.
Guidelines for line-oriented flight training, volume 2
NASA Technical Reports Server (NTRS)
Lauber, J. K.; Foushee, H. C.
1981-01-01
Current approaches to line-oriented flight training used by six American airlines are described. This recurrent training methodology makes use of a full-crew and full-mission simulation to teach and assess resource management skills, but does not necessarily fulfill requirements for the training and manipulation of all skills.
USDA-ARS's Scientific Manuscript database
Satellite remote sensing technologies have been widely used to map spatiotemporal variability in consumptive water use (or evapotranspiration; ET) for agricultural water management applications. However, current satellite-based sensors with the high spatial resolution required to map ET at sub-field...
Simulating potato gas exchange as influenced by CO2 and irrigation
USDA-ARS's Scientific Manuscript database
Recent research suggests that an energy balance approach is required for crop models to adequately respond to current and future climatic conditions associated with elevated CO2, higher temperatures, and water scarcity. More realistic models are needed in order to understand the impact of, and deve...
2011-04-11
countries. PepsiCo, IBM and Nike are current examples of the so-called "game planning" approach to succession and talent management. Annual... Rothwell and Associates, Inc., a full-service succession planning consulting firm, provides other partner industries with the value of strategic
ERIC Educational Resources Information Center
Khribi, Mohamed Koutheair; Jemni, Mohamed; Nasraoui, Olfa
2009-01-01
In this paper, we describe an automatic personalization approach aiming to provide online automatic recommendations for active learners without requiring their explicit feedback. Recommended learning resources are computed based on the current learner's recent navigation history, as well as exploiting similarities and dissimilarities among…
The European Project Semester at ISEP: The Challenge of Educating Global Engineers
ERIC Educational Resources Information Center
Malheiro, Benedita; Silva, Manuel; Ribeiro, Maria Cristina; Guedes, Pedro; Ferreira, Paulo
2015-01-01
Current engineering education challenges require approaches that promote scientific, technical, design and complementary skills while fostering autonomy, innovation and responsibility. The European Project Semester (EPS) at Instituto Superior de Engenharia do Porto (ISEP) (EPS@ISEP) is a one semester project-based learning programme (30 European…
Childhood Lead Poisoning: Developing Prevention Programs and Mobilizing Resources.
ERIC Educational Resources Information Center
Rochow, K. W. James
The current approach to dealing with childhood lead poisoning has led to repeated diagnoses of poisoning because such children are treated and then returned to their hazardous environments. This handbook describes in detail the program requirements for effective childhood lead poisoning prevention programs at the local level based on the…
Using Longitudinal Scales Assessment for Instrumental Music Students
ERIC Educational Resources Information Center
Simon, Samuel H.
2014-01-01
In music education, current assessment trends emphasize student reflection, tracking progress over time, and formative as well as summative measures. This view of assessment requires instrumental music educators to modernize their approaches without interfering with methods that have proven to be successful. To this end, the Longitudinal Scales…
A Comparative Survey of Education Systems: Structure, Organization and Development.
ERIC Educational Resources Information Center
King, Edmund
1990-01-01
Education must disengage from current accountancy concerns and serve learners now and in their real contexts. The massive efficiency of our industrialized apparatus for processing people in formal education prevents us from recognizing that a new approach is needed to satisfy tomorrow's uncertain and unlimited requirements. An international…
Current approaches to the management of new-onset ulcerative colitis
Marchioni Beery, Renée; Kane, Sunanda
2014-01-01
Ulcerative colitis (UC) is an idiopathic, inflammatory gastrointestinal disease of the colon. As a chronic condition, UC follows a relapsing and remitting course with medical maintenance during periods of quiescent disease and appropriate escalation of therapy during times of flare. Initial treatment strategies must not only take into account current clinical presentation (with specific regard for extent and severity of disease activity) but must also take into consideration treatment options for the long-term. The following review offers an approach to new-onset UC with a focus on early treatment strategies. An introduction to the disease entity is provided along with an approach to initial diagnosis. Stratification of patients based on clinical parameters, disease extent, and severity of illness is paramount to determining course of therapy. Frequent assessments are required to determine clinical response, and treatment intensification may be warranted if expected improvement goals are not appropriately reached. Mild-to-moderate UC can be managed with aminosalicylates, mesalamine, and topical corticosteroids with oral corticosteroids reserved for unresponsive cases. Moderate-to-severe UC generally requires oral or intravenous corticosteroids in the short-term with consideration of long-term management options such as biologic agents (as initial therapy or in transition from steroids) or thiopurines (as bridging therapy). Patients with severe or fulminant UC who are recalcitrant to medical therapy or who develop disease complications (such as toxic megacolon) should be considered for colectomy. Early surgical referral in severe or refractory UC is crucial, and colectomy may be a life-saving procedure.
The authors provide a comprehensive evidence-based approach to current treatment options for new-onset UC with discussion of long-term therapeutic efficacy and safety, patient-centered perspectives including quality of life and medication compliance, and future directions in related inflammatory bowel disease care. PMID:24872716
Pentti, Marita; Muller, Jennifer; Janda, Monika; Newman, Beth
2009-02-01
To describe the views of supervisors of colonoscopy training in regard to colonoscopy training capacity and quality in Australia. Anonymous postal surveys from March to May 2007 were posted to 127 colonoscopy training supervisors (30.2% estimated response rate). The surveys queried colonoscopy training capacity and quality, supervisors' views and opinions on innovative approaches to colonoscopy training, number of colonoscopies and time required by trainees to gain competence in colonoscopy. Approximately 50% of trainers agreed and 27% disagreed that current numbers of training places were adequate to maintain a skilled colonoscopy workforce in preparation for the National Bowel Cancer Screening Program (NBCSP). A collaborative approach with the private sector was seen as beneficial by 65%. Non-gastroenterologists (non-GEs) were more likely than gastroenterologists (GEs) to be of the opinion that simulators are beneficial for colonoscopy training (χ²-test = 5.55, P = 0.026). The majority of trainers did not support training either nurses (73%) or general practitioners (GPs) in colonoscopy (71%). Approximately 60% of trainers considered that the current requirements for recognition of training in colonoscopy could be insufficient for trainees to gain competence and 80% of those indicated that ≥ 200 colonoscopies were needed. Colonoscopy training in Australia has traditionally followed the apprenticeship model. Projected increases in demand for colonoscopy with the introduction of the NBCSP may require additional training places and new and innovative approaches to training in order to ensure the provision of high-quality colonoscopy services under the NBCSP.
Current treatment paradigms in rheumatoid arthritis.
Fries, J F
2000-06-01
Rheumatoid arthritis (RA) has traditionally been treated using the pyramid approach, in which non-steroidal anti-inflammatory drugs (NSAIDs) are the first-line treatment and disease-modifying anti-rheumatic drugs (DMARDs) are introduced relatively late in the disease. This approach is no longer valid. Previously regarded as a benign disease, RA is now recognized as causing substantial morbidity and mortality, as do the NSAIDs used in treatment. DMARDs are more effective in controlling the pain and disability of RA than NSAIDs, and are often no more toxic. The current treatment paradigm emphasizes early, consistent use of DMARDs. A 'sawtooth' strategy of DMARD use has been proposed, in which a rising but low level of disability triggers a change in therapy. Determining the most clinically useful DMARD combinations and the optimal sequence of DMARD use requires effectiveness studies, Bayesian approaches and analyses of long-term outcomes. Such approaches will allow optimization of multiple drug therapies in RA, and should substantially improve the long-term outcome for many patients.
Bianchi, Paolo Pietro; Petz, Wanda; Luca, Fabrizio; Biffi, Roberto; Spinoglio, Giuseppe; Montorsi, Marco
2014-01-01
The current standard treatment for rectal cancer is based on a multimodality approach with preoperative radiochemotherapy in advanced cases and complete surgical removal through total mesorectal excision (TME). The most frequent surgical approach is traditional open surgery, as laparoscopic TME requires high technical skill, a long learning curve, and is not widespread, still being confined to centers with great experience in minimally invasive techniques. Nevertheless, in several studies, the laparoscopic approach, when compared to open surgery, has shown some better short-term clinical outcomes and at least comparable oncologic results. Robotic surgery for the treatment of rectal cancer is an emerging technique, which could overcome some of the technical difficulties posed by standard laparoscopy, but evidence from the literature regarding its oncologic safety and clinical outcomes is still lacking. This brief review analyses the current status of minimally invasive surgery for rectal cancer therapy, focusing on oncologic safety and the new robotic approach. PMID:24834429
A Fully Automated Approach to Spike Sorting.
Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F
2017-09-13
Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.
New techniques for assessing response after hypofractionated radiotherapy for lung cancer
Mattonen, Sarah A.; Huang, Kitty; Ward, Aaron D.; Senan, Suresh
2014-01-01
Hypofractionated radiotherapy (HFRT) is an effective and increasingly-used treatment for early stage non-small cell lung cancer (NSCLC). Stereotactic ablative radiotherapy (SABR) is a form of HFRT and delivers biologically effective doses (BEDs) in excess of 100 Gy10 in 3-8 fractions. Excellent long-term outcomes have been reported; however, response assessment following SABR is complicated as radiation induced lung injury can appear similar to a recurring tumor on CT. Current approaches to scoring treatment responses include Response Evaluation Criteria in Solid Tumors (RECIST) and positron emission tomography (PET), both of which appear to have a limited role in detecting recurrences following SABR. Novel approaches to assess response are required, but new techniques should be easily standardized across centers, cost effective, with sensitivity and specificity that improves on current CT and PET approaches. This review examines potential novel approaches, focusing on the emerging field of quantitative image feature analysis, to distinguish recurrence from fibrosis after SABR. PMID:24688782
Information requirements and methodology for development of an EVA crewmember's heads up display
NASA Astrophysics Data System (ADS)
Petrek, J. S.
This paper presents a systematic approach for developing a Heads Up Display (HUD) to be used within the helmet of the Extra Vehicular Activity (EVA) crewmember. The information displayed on the EVA HUD will be analogous to EVA Flight Data File (FDF) information, which is an integral part of NASA's current Space Transportation System. Another objective is to determine information requirements and media techniques ultimately leading to the helmet-mounted HUD presentation technique.
Current State of the Art Historic Building Information Modelling
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2017-08-01
In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.
NASA Astrophysics Data System (ADS)
Rouillon, M.; Taylor, M. P.; Dong, C.
2016-12-01
This research assesses the advantages of integrating field portable X-ray Fluorescence (pXRF) technology to reduce risk and increase confidence in decision making for metal-contaminated site assessments. Metal-contaminated sites are often highly heterogeneous and require a high sampling density to accurately characterize the distribution and concentration of contaminants. The current regulatory assessment approaches rely on a small number of samples processed using standard wet-chemistry methods. In New South Wales (NSW), Australia, the current notification trigger for characterizing metal-contaminated sites requires the upper 95% confidence interval of the site mean to equal or exceed the relevant guidelines. The method's low 'minimum' sampling requirements can misclassify sites due to the heterogeneous nature of soil contamination, leading to inaccurate decision making. To address this issue, we propose integrating infield pXRF analysis with the established sampling method to overcome sampling limitations. This approach increases the minimum sampling resolution and reduces the 95% CI of the site mean. Infield pXRF analysis at contamination hotspots enhances sample resolution efficiently and without the need to return to the site. In this study, the current and proposed pXRF site assessment methods are compared at five heterogeneous metal-contaminated sites by analysing the spatial distribution of contaminants, 95% confidence intervals of site means, and the sampling and analysis uncertainty associated with each method. Finally, an analysis of costs associated with both the current and proposed methods is presented to demonstrate the advantages of incorporating pXRF into metal-contaminated site assessments. The data show that pXRF-integrated site assessments allow faster, more cost-efficient characterisation of metal-contaminated sites with greater confidence for decision making.
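The statistical mechanism behind the abstract above — a denser sampling campaign narrowing the 95% confidence interval of the site mean — can be sketched in a few lines. The soil-lead concentrations below are hypothetical illustrations, not the study's data, and the sketch uses the simple normal approximation (z = 1.96) rather than whatever estimator the NSW guidance prescribes:

```python
import math
import statistics

def mean_ci95(samples):
    """Return (mean, half_width) of the 95% CI of the mean,
    using the normal approximation (z = 1.96)."""
    n = len(samples)
    mean = statistics.fmean(samples)
    sem = statistics.stdev(samples) / math.sqrt(n)  # standard error of the mean
    return mean, 1.96 * sem

# Hypothetical soil Pb concentrations (mg/kg) from a heterogeneous site.
sparse = [120, 450, 90, 800, 150, 300]              # standard sampling density
dense = sparse + [110, 520, 95, 760, 140, 310,
                  130, 480, 100, 700, 160, 280]     # pXRF-augmented density

m1, hw1 = mean_ci95(sparse)
m2, hw2 = mean_ci95(dense)
print(f"sparse: {m1:.0f} \u00b1 {hw1:.0f} mg/kg")
print(f"dense:  {m2:.0f} \u00b1 {hw2:.0f} mg/kg")  # narrower interval
```

With a comparable spread of values, tripling the sample count shrinks the half-width roughly by a factor of √3, which is exactly the effect the proposed pXRF-augmented method exploits.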
Scholz, Stefan; Sela, Erika; Blaha, Ludek; Braunbeck, Thomas; Galay-Burgos, Malyka; García-Franco, Mauricio; Guinea, Joaquin; Klüver, Nils; Schirmer, Kristin; Tanneberger, Katrin; Tobor-Kapłon, Marysia; Witters, Hilda; Belanger, Scott; Benfenati, Emilio; Creton, Stuart; Cronin, Mark T D; Eggen, Rik I L; Embry, Michelle; Ekman, Drew; Gourmelon, Anne; Halder, Marlies; Hardy, Barry; Hartung, Thomas; Hubesch, Bruno; Jungmann, Dirk; Lampi, Mark A; Lee, Lucy; Léonard, Marc; Küster, Eberhard; Lillicrap, Adam; Luckenbach, Till; Murk, Albertinka J; Navas, José M; Peijnenburg, Willie; Repetto, Guillermo; Salinas, Edward; Schüürmann, Gerrit; Spielmann, Horst; Tollefsen, Knut Erik; Walter-Rohde, Susanne; Whale, Graham; Wheeler, James R; Winter, Matthew J
2013-12-01
Tests with vertebrates are an integral part of environmental hazard identification and risk assessment of chemicals, plant protection products, pharmaceuticals, biocides, feed additives and effluents. These tests raise ethical and economic concerns and are considered as inappropriate for assessing all of the substances and effluents that require regulatory testing. Hence, there is a strong demand for replacement, reduction and refinement strategies and methods. However, until now alternative approaches have only rarely been used in regulatory settings. This review provides an overview on current regulations of chemicals and the requirements for animal tests in environmental hazard and risk assessment. It aims to highlight the potential areas for alternative approaches in environmental hazard identification and risk assessment. Perspectives and limitations of alternative approaches to animal tests using vertebrates in environmental toxicology, i.e. mainly fish and amphibians, are discussed. Free access to existing (proprietary) animal test data, availability of validated alternative methods and a practical implementation of conceptual approaches such as the Adverse Outcome Pathways and Integrated Testing Strategies were identified as major requirements towards the successful development and implementation of alternative approaches. Although this article focusses on European regulations, its considerations and conclusions are of global relevance. Copyright © 2013 Elsevier Inc. All rights reserved.
Traister, Russell S.
2008-01-01
Arthritis is among the leading causes of disability in the developed world. There remains no cure for this disease and the current treatments are only modestly effective at slowing the disease's progression and providing symptomatic relief. The clinical effectiveness of current treatment regimens has been limited by short half-lives of the drugs and the requirement for repeated systemic administration. Utilizing gene transfer approaches for the treatment of arthritis may overcome some of the obstacles associated with current treatment strategies. The present review examines recent developments in gene therapy for arthritis. Delivery strategies, gene transfer vectors, candidate genes, and safety are also discussed. PMID:18176779
Biomarkers of tolerance: searching for the hidden phenotype.
Perucha, Esperanza; Rebollo-Mesa, Irene; Sagoo, Pervinder; Hernandez-Fuentes, Maria P
2011-08-01
Induction of transplantation tolerance remains the ideal long-term clinical and logistic solution to the current challenges facing the management of renal allograft recipients. In this review, we describe the recent studies and advances made in identifying biomarkers of renal transplant tolerance, from study inceptions, to the lessons learned and their implications for current and future studies with the same goal. With the age of biomarker discovery entering a new dimension of high-throughput technologies, here we also review the current approaches, developments, and pitfalls faced in the subsequent statistical analysis required to identify valid biomarker candidates.
Firmware Development Improves System Efficiency
NASA Technical Reports Server (NTRS)
Chern, E. James; Butler, David W.
1993-01-01
Most manufacturing processes require physical pointwise positioning of the components or tools from one location to another. Typical mechanical systems utilize either stop-and-go or fixed feed-rate progression to accomplish the task. The first approach achieves positional accuracy but prolongs overall time and increases wear on the mechanical system. The second approach sustains the throughput but compromises positional accuracy. A computer firmware approach has been developed to optimize this pointwise mechanism by utilizing programmable interrupt controls to synchronize engineering processes 'on the fly'. This principle has been implemented in an eddy current imaging system to demonstrate the improvement. Software programs were developed that enable a mechanical controller card to transmit interrupts to a system controller as a trigger signal to initiate an eddy current data acquisition routine. The advantages are: (1) optimized manufacturing processes, (2) increased throughput of the system, (3) improved positional accuracy, and (4) reduced wear and tear on the mechanical system.
Overview of Heavy Ion Fusion Accelerator Research in the U. S.
NASA Astrophysics Data System (ADS)
Friedman, Alex
2002-12-01
This article provides an overview of current U.S. research on accelerators for Heavy Ion Fusion, that is, inertial fusion driven by intense beams of heavy ions with the goal of energy production. The concept, beam requirements, approach, and major issues are introduced. An overview of a number of new experiments is presented. These include: the High Current Experiment now underway at Lawrence Berkeley National Laboratory; studies of advanced injectors (and in particular an approach based on the merging of multiple beamlets), being investigated experimentally at Lawrence Livermore National Laboratory; the Neutralized (chamber) Transport Experiment being assembled at Lawrence Berkeley National Laboratory; and smaller experiments at the University of Maryland and at Princeton Plasma Physics Laboratory. The comprehensive program of beam simulations and theory is outlined. Finally, prospects and plans for further development of this promising approach to fusion energy are discussed.
Multidisciplinary treatment approach in Treacher Collins syndrome.
Hylton, Joseph B; Leon-Salazar, Vladimir; Anderson, Gary C; De Felippe, Nanci L O
2012-01-01
Treacher Collins syndrome (TCS) is a common genetic disorder with high penetrance and phenotypic variability. First and second branchial arches are affected in TCS, resulting in craniofacial and intraoral anomalies such as: severe convex facial profile; mid-face hypoplasia; microtia; eyelid colobomas; mandibular retrognathism; cleft palate; dental hypoplasia; heterotopic teeth; maxillary transverse hypoplasia; anterior open bite; and Angle Class II molar relationship. A high incidence of caries is also a typical finding in TCS patients. Nonetheless, even simple dental restorative procedures can be challenging in this patient population due to other associated medical conditions, such as: congenital heart defects; decreased oropharyngeal airways; hearing loss; and anxiety toward treatment. These patients often require a multidisciplinary treatment approach, including: audiology; speech and language pathology; otorhinolaryngology; general dentistry; orthodontics; oral and maxillofacial surgery; and plastic and reconstructive surgeries to improve facial appearance. This paper's purpose was to present a current understanding of Treacher Collins syndrome etiology, phenotype, and current treatment approaches.
[The current approach to hemangiomas and vascular malformations of the head and neck].
Raveh, E; Waner, M; Kornreich, L; Segal, K; Ben-Amitai, D; Kalish, E; Lapidot, M; Mimon, S; Shalev, B; Feinmesser, R
2002-09-01
Though most hemangiomas do not need treatment, a significant minority are associated with complications and external deformities that demand intervention. Steroids play an important role in therapy, but not infrequently afford only partial and temporary benefit. Thanks to improvements in the surgical approach and equipment, hemostasis control devices and laser techniques, we can now treat patients who would otherwise go untreated. Moreover, in certain cases, we can now recommend earlier intervention, saving patients from years of living with deformities and the concomitant psychosocial problems. Vascular anomalies of the head and neck include venular, venous and arteriovenous malformations. These lesions are slow-growing vascular ectasias that never involute spontaneously and almost always require intervention. Treatment includes laser therapy, injection of sclerosing agents, embolization through angiography and surgery, which in many cases is the only definitive treatment. We present the current treatment approach and describe our experience in the treatment of 16 patients.
Input current shaped ac-to-dc converters
NASA Technical Reports Server (NTRS)
1985-01-01
Input current shaping techniques for ac-to-dc converters were investigated. Input frequencies much higher than normal, up to 20 kHz were emphasized. Several methods of shaping the input current waveform in ac-to-dc converters were reviewed. The simplest method is the LC filter following the rectifier. The next simplest method is the resistor emulation approach in which the inductor size is determined by the converter switching frequency and not by the line input frequency. Other methods require complicated switch drive algorithms to construct the input current waveshape. For a high-frequency line input, on the order of 20 kHz, the simple LC cannot be discarded so peremptorily, since the inductor size can be compared with that for the resistor emulation method. In fact, since a dc regulator will normally be required after the filter anyway, the total component count is almost the same as for the resistor emulation method, in which the filter is effectively incorporated into the regulator.
Environmental flow assessments for transformed estuaries
NASA Astrophysics Data System (ADS)
Sun, Tao; Zhang, Heyue; Yang, Zhifeng; Yang, Wei
2015-01-01
Here, we propose an approach to environmental flow assessment that considers spatial pattern variations in potential habitats affected by river discharges and tidal currents in estuaries. The approach comprises four steps: identifying and simulating the distributions of critical environmental factors for habitats of typical species in an estuary; mapping of suitable habitats based on spatial distributions of the Habitat Suitability Index (HSI) and adopting the habitat aggregation index to understand fragmentation of potential suitable habitats; defining variations in water requirements for a certain species using trade-off analysis for different protection objectives; and recommending environmental flows in the estuary considering the compatibility and conflict of freshwater requirements for different species. This approach was tested using a case study in the Yellow River Estuary. Recommended environmental flows were determined by incorporating the requirements of four types of species into the assessments. Greater variability in freshwater inflows could be incorporated into the recommended environmental flows considering the adaptation of potential suitable habitats with variations in the flow regime. Environmental flow allocations should be conducted in conjunction with land use conflict management in estuaries. Based on the results presented here, the proposed approach offers flexible assessment of environmental flow for aquatic ecosystems that may be subject to future change.
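The habitat-mapping step described in the abstract above can be illustrated with a minimal sketch. The suitability curves, factor optima, grid cells, and 0.5 threshold below are all invented for illustration, not taken from the Yellow River Estuary study; real HSI curves are species-specific and empirically derived:

```python
import math

def suitability(value, optimal, tolerance):
    """Bell-shaped suitability index in [0, 1] for one environmental
    factor (illustrative form; real curves are species-specific)."""
    return math.exp(-((value - optimal) / tolerance) ** 2)

def hsi(salinity, depth):
    """Combine per-factor indices with a geometric mean,
    one common way of forming a Habitat Suitability Index."""
    s = suitability(salinity, optimal=15.0, tolerance=10.0)  # psu, assumed optimum
    d = suitability(depth, optimal=2.0, tolerance=1.5)       # m, assumed optimum
    return math.sqrt(s * d)

# Hypothetical estuary grid cells: (salinity, depth)
cells = [(14.0, 2.1), (30.0, 6.0), (18.0, 1.5), (5.0, 0.3)]
suitable = [c for c in cells if hsi(*c) >= 0.5]  # flag potential habitat
print(suitable)
```

Running the same classification across flow scenarios (each scenario shifting the simulated salinity field) is what lets the method trade off freshwater requirements between species.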
Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B
2015-03-04
The limited efficiency of current air traffic systems will require a next-generation of Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However efforts for developing such tools need to be inspired on a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper is focused on airborne HCI into SATS where cockpit inputs came from aircraft navigation systems, surrounding traffic situation, controllers' indications, etc. So the HCI is intended to enhance situation awareness and decision-making through pilot cockpit. This work approach considers SATS as a system distributed on a large-scale with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.
Canino-Rodríguez, José M.; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G.; Travieso-González, Carlos; Alonso-Hernández, Jesús B.
2015-01-01
The limited efficiency of current air traffic systems will require a next-generation of Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However efforts for developing such tools need to be inspired on a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper is focused on airborne HCI into SATS where cockpit inputs came from aircraft navigation systems, surrounding traffic situation, controllers’ indications, etc. So the HCI is intended to enhance situation awareness and decision-making through pilot cockpit. This work approach considers SATS as a system distributed on a large-scale with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications. PMID:25746092
Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen
Here, we propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator–coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.
Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits
Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen; ...
2018-03-12
Here, we propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator–coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.
Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits
NASA Astrophysics Data System (ADS)
Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen; Gauthier, Daniel J.
2018-03-01
We propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator-coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.
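For the smallest dimension mentioned in the abstract above, d = 2, the d + 1 = 3 mutually unbiased bases can be written out explicitly as the Pauli eigenbases (a standard textbook result, shown here for context rather than taken from the paper):

```latex
\[
\mathcal{B}_1 = \{\,|0\rangle,\ |1\rangle\,\},\qquad
\mathcal{B}_2 = \left\{\tfrac{1}{\sqrt{2}}\bigl(|0\rangle + |1\rangle\bigr),\
\tfrac{1}{\sqrt{2}}\bigl(|0\rangle - |1\rangle\bigr)\right\},\qquad
\mathcal{B}_3 = \left\{\tfrac{1}{\sqrt{2}}\bigl(|0\rangle + i|1\rangle\bigr),\
\tfrac{1}{\sqrt{2}}\bigl(|0\rangle - i|1\rangle\bigr)\right\},
\]
\[
\text{with } |\langle \psi \mid \phi \rangle|^2 = \tfrac{1}{d} = \tfrac{1}{2}
\quad \text{for any } |\psi\rangle,\, |\phi\rangle \text{ drawn from different bases.}
\]
```

That constant overlap of 1/d is the "mutually unbiased" property: a measurement in one basis reveals nothing about a state prepared in another, which is what makes these bases useful for quantum key distribution.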
Life extending control for rocket engines
NASA Technical Reports Server (NTRS)
Lorenzo, C. F.; Saus, J. R.; Ray, A.; Carpino, M.; Wu, M.-K.
1992-01-01
The concept of life extending control is defined. A brief discussion of current fatigue life prediction methods is given and the need for an alternative life prediction model based on a continuous functional relationship is established. Two approaches to life extending control are considered: (1) the implicit approach which uses cyclic fatigue life prediction as a basis for control design; and (2) the continuous life prediction approach which requires a continuous damage law. Progress on an initial formulation of a continuous (in time) fatigue model is presented. Finally, nonlinear programming is used to develop initial results for life extension for a simplified rocket engine (model).
Advanced order management in ERM systems: the tic-tac-toe algorithm
NASA Astrophysics Data System (ADS)
Badell, Mariana; Fernandez, Elena; Puigjaner, Luis
2000-10-01
The concept behind improved enterprise resource planning (ERP) systems is the overall integration of the whole enterprise functionality into the management systems through financial links. Converting current software into real management decision tools requires crucial changes in the current approach to ERP systems. This evolution must be able to incorporate technological achievements both properly and in time. The exploitation phase of plants needs an open web-based environment for collaborative business-engineering with on-line schedulers. Today's short lifecycles of products and processes require sharp and finely tuned management actions that must be guided by scheduling tools. Additionally, such actions must be able to keep track of money movements related to supply chain events. Thus, the necessary outputs require financial-production integration at the scheduling level, as proposed in the new approach of enterprise management systems (ERM). Within this framework, the economic analysis of the due-date policy and its optimization become essential to dynamically manage realistic and optimal delivery dates with a price-time trade-off during marketing activities. In this work we propose a scheduling tool with a web-based interface conducted by autonomous agents, given precise economic information on plant and business actions and their effects. It aims to attain a better arrangement of marketing and production events in order to face the bid/bargain process during e-commerce. Additionally, management systems require real-time execution and an efficient transaction-oriented approach capable of dynamically adopting realistic and optimal actions to support marketing management. To this end, the TicTacToe algorithm provides sequence optimization with acceptable tolerances in realistic time.
NASA Astrophysics Data System (ADS)
Ellery, A.
Since the remarkable British Interplanetary Society starship study of the late 1970s - Daedalus - there have been significant developments in the areas of artificial intelligence and robotics. These will be critical technologies for any starship, as indeed they are for the current generation of exploratory spacecraft and in-situ planetary robotic explorers. Although early visions of truly intelligent robots have yet to materialize (reasons for which will be outlined), there have nonetheless been revolutionary developments which attempt to address at least some of these earlier unperceived deficiencies. The current state of the art comprises a number of separate strands of research which provide components of robotic intelligence, though no overarching approach has been forthcoming. The first question to be considered is the level of intelligent functionality required to support a long-duration starship mission. This will, at a minimum, need to be extensive, a level imposed by the requirement for complex reconfigurability and repair. The second question concerns the tools that we have at our disposal to implement the required intelligent functions of the starship. These are based on two very different approaches: good old-fashioned artificial intelligence (GOFAI), based on logical theorem-proving and knowledge-encoding, recently augmented by modal, temporal, circumscriptive and fuzzy logics to address the well-known "frame problem"; and the more recent soft computing approaches, based on artificial neural networks, evolutionary algorithms and immunity models and their variants, to implement learning. The former has some flight heritage through the Remote Agent architecture whilst the latter has yet to be deployed on any space mission.
However, the notion of reconfigurable hardware, of recent interest in the space community, warrants the use of evolutionary algorithms and neural networks implemented on field programmable gate array technology, blurring the distinction between hardware and software. The primary question in space engineering has traditionally been one of predictability and controllability, which online learning compromises. A further factor to be accounted for is the notion that intelligence is derived primarily from robot-environment interaction, which stresses the sensory and actuation capabilities (exemplified by the behavioural or situated robotics paradigm). One major concern is whether the major deficiency of current methods, their lack of scalability, can be overcome using a highly distributed approach rather than the hierarchical approach suggested by the NASREM architecture. It is contended here that a mixed solution will be required, in which a priori programming is augmented by a posteriori learning, resembling the biological distinction between fixed, genetically inherited behaviour and learned, neurally implemented behaviour in animals. In particular, a biomimetic approach is proffered which exploits the neural processes and architecture of the human brain through the use of forward models, attempting to marry the conflicting requirements of learning and predictability. Some small-scale efforts in this direction will be outlined.
NASA Technical Reports Server (NTRS)
Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.
1981-01-01
An active controls technology (ACT) system architecture was selected based on current-technology system elements, and optimal control theory was evaluated for use in analyzing and synthesizing ACT multiple control laws. The system selected employs three redundant computers to implement all of the ACT functions, four redundant smaller computers to implement the crucial pitch-augmented stability function, and a separate maintenance and display computer. The reliability objective of a probability of crucial-function failure of less than 1 × 10^-9 per 1-hour flight can be met with current-technology system components, if the software is assumed fault-free and coverage approaching 1.0 can be provided. The optimal control theory approach to ACT control law synthesis yielded comparable control law performance much more systematically and directly than the classical s-domain approach. The ACT control law performance, although somewhat degraded by the inclusion of representative nonlinearities, remained quite effective. Certain high-frequency gust-load alleviation functions may require increased surface rate capability.
Space Solar Power Concepts: Demonstrations to Pilot Plants
NASA Technical Reports Server (NTRS)
Carrington, Connie K.; Feingold, Harvey; Howell, Joe T. (Technical Monitor)
2002-01-01
The availability of abundant, affordable power where needed is a key to the future exploration and development of space, as well as to future sources of clean terrestrial power. One innovative approach to providing such power is the use of wireless power transmission (WPT). There are at least two WPT methods that appear feasible: microwave and laser. Microwave concepts have been generated, analyzed and demonstrated. Technologies required to provide an end-to-end system have been identified, and roadmaps have been generated to guide technology development requirements. Recently, laser WPT approaches have gained increased interest. These approaches appear to be very promising and may solve some of the major challenges that exist with the microwave option. Therefore, emphasis is currently being placed on the laser WPT activity. This paper will discuss the technology requirements, technology roadmaps, and the flight experiments and demonstrations required to lead toward a pilot plant demonstration. Concepts will be discussed along with the modeling techniques used in developing them. Feasibility will be addressed along with the technology needs, issues and capabilities for particular concepts. Flight experiments and demonstrations will be identified that will pave the road from demonstrations to pilot plants and beyond.
Depression and Anxiety in Parkinson's Disease.
Schrag, Anette; Taddei, Raquel N
2017-01-01
Depression and anxiety are among the most common comorbidities arising in patients with Parkinson's disease. However, their timely recognition and diagnosis are often hindered by overlap with other somatic features and a low rate of self-report. There is a need for greater awareness and for better assessment and treatment options. Currently available scales can serve as tools to monitor change over time and the effect of interventional strategies. Development of new therapeutic strategies, including nonpharmacological approaches such as transcranial magnetic stimulation and deep brain stimulation, may provide alternatives to currently available treatment approaches. In this chapter we will give an overview of the most recent advances in the diagnosis and treatment of these important nonmotor symptoms. © 2017 Elsevier Inc. All rights reserved.
Bendable X-ray Optics for High Resolution Imaging
NASA Technical Reports Server (NTRS)
Gubarev, M.; Ramsey, B.; Kilaru, K.; Atkins, C.; Broadway, D.
2014-01-01
The current state of the art in x-ray optics fabrication calls for either the polishing of massive substrates into high-angular-resolution mirrors or the replication of thin, lower-resolution mirrors from perfectly figured mandrels. Future x-ray missions will require a change in this optics fabrication paradigm in order to achieve sub-arcsecond resolution in light-weight optics. One possible approach is to start with a perfectly flat, light-weight surface, bend it into a perfect cone, form the desired mirror figure by material deposition, and insert the resulting mirror into a telescope structure. Such an approach is currently being investigated at MSFC, and a status report will be presented detailing the results of finite element analyses, bending tests and differential deposition experiments.
A survey of hard X-ray imaging concepts currently proposed for viewing solar flares
NASA Technical Reports Server (NTRS)
Campbell, Jonathan W.; Davis, John M.; Emslie, A. G.
1991-01-01
Several approaches to imaging hard X-rays emitted from solar flares have been proposed. These include the fixed modulation collimator, the rotating modulation collimator, the spiral Fresnel zone pattern, and the redundantly coded aperture. These techniques are under consideration for use in the Solar Maximum '91 balloon program, the Japanese Solar-A satellite, the Controls, Astrophysics, and Structures Experiment in Space, and the Pinhole/Occulter Facility. They are outlined and discussed in the context of preliminary results from numerical modeling and the requirements derived from current ideas as to the expected hard X-ray structures in the impulsive phase of solar flares. Preliminary indications are that all of the approaches are promising, but each has its own unique set of limitations.
NASA Technical Reports Server (NTRS)
Doxley, Charles A.
2016-01-01
In the current world of applications that use reconfigurable technology implemented on field programmable gate arrays (FPGAs), there is a need for flexible architectures that can grow as the systems evolve. A project has limited resources and a fixed set of requirements that development efforts are tasked to meet. Designers must develop robust solutions that practically meet the current customer demands and also have the ability to grow for future performance. This paper describes the development of a high speed serial data streaming algorithm that allows for transmission of multiple data channels over a single serial link. The technique has the ability to change to meet new applications developed for future design considerations. This approach uses the Xilinx Serial RapidIO LOGICORE Solution to implement a flexible infrastructure to meet the current project requirements with the ability to adapt future system designs.
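The core idea of carrying multiple logical data channels over one serial link is framing: each payload is tagged with a channel identifier and a length so the receiver can demultiplex the stream. The sketch below shows that idea in miniature; the 3-byte header layout is a hypothetical illustration for this document, since Serial RapidIO defines its own packet format.

```python
import struct

def mux(channels):
    """Interleave several logical channels into one serial byte stream.
    Each frame: 1-byte channel ID, 2-byte big-endian length, payload.
    (Illustrative framing only; not the Serial RapidIO packet format.)"""
    stream = bytearray()
    for ch_id, payload in channels:
        stream += struct.pack(">BH", ch_id, len(payload)) + payload
    return bytes(stream)

def demux(stream):
    """Recover (channel, payload) pairs from the single serial link."""
    out, i = [], 0
    while i < len(stream):
        ch_id, n = struct.unpack_from(">BH", stream, i)
        i += 3
        out.append((ch_id, stream[i:i + n]))
        i += n
    return out

frames = [(0, b"telemetry"), (1, b"video"), (0, b"more-telemetry")]
assert demux(mux(frames)) == frames
```

Because channel IDs travel with the data, new channels can be added later without changing the link itself, which is the kind of growth path the paper's flexible infrastructure is aiming for.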
Ginsburg, Paul B
2012-09-01
Many health policy analysts envision provider payment reforms currently under development as replacements for the traditional fee-for-service payment system. Reforms include per episode bundled payment and elements of capitation, such as global payments or accountable care organizations. But even if these approaches succeed and are widely adopted, the core method of payment to many physicians for the services they provide is likely to remain fee-for-service. It is therefore critical to address the current shortcomings in the Medicare physician fee schedule, because it will affect physician incentives and will continue to play an important role in determining the payment amounts under payment reform. This article reviews how the current payment system developed and is applied, and it highlights areas that require careful review and modification to ensure the success of broader payment reform.
NASA Astrophysics Data System (ADS)
Murphy, K. L.; Rygalov, V. Ye.; Johnson, S. B.
2009-04-01
All artificial systems and components in space degrade at higher rates than on Earth, depending in part on environmental conditions, design approach, assembly technologies, and the materials used. This degradation involves not only the hardware and software systems but the humans that interact with those systems. All technological functions and systems can be expressed through the functional dependence

  [Function] ~ ([ERU] × [RUIS] × [ISR]) / [DR]

where
  [ERU] = efficiency (rate) of environmental resource utilization,
  [RUIS] = resource utilization infrastructure,
  [ISR] = in situ resources,
  [DR] = degradation rate.

The limited resources of spaceflight and open space for autonomous missions require a high reliability (maximum possible, approaching 100%) for system functioning and operation, and must minimize the rate of any system degradation. To date, only a continuous human presence with a system in the spaceflight environment can absolutely mitigate those degradations. This mitigation is based on environmental amelioration for both the technology systems (repair and spare parts) and the humans (exercise and psychological support). Such maintenance now requires huge infrastructures, including research and development complexes and management agencies, which currently cannot move beyond the Earth. When considering what is required to move manned spaceflight from near-Earth stations to remote locations such as Mars, what are the minimal technologies and infrastructures necessary for autonomous restoration of a degrading system in space? Among all of the known system factors of a mission to Mars that reduce the mass load, increase the reliability, and reduce the mission's overall risk, the current common denominator is the use of undeveloped or untested technologies. None of the technologies required to significantly reduce the risk for critical systems are currently available at acceptable readiness levels.
Long term interplanetary missions require that space programs produce a craft with all systems integrated so that they are of the highest reliability. Right now, with current technologies, we cannot guarantee this reliability for a crew of six for 1000 days to Mars and back. Investigation of the technologies to answer this need and a focus of resources and research on their advancement would significantly improve chances for a safe and successful mission.
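The functional dependence quoted in the abstract is a simple proportionality, which makes its design implication easy to probe numerically: sustainable function scales inversely with the degradation rate, so halving degradation doubles the function that the same resources can support. A minimal sketch, with notional units and magnitudes:

```python
def function_capacity(eru, ruis, isr, dr):
    """Functional dependence from the abstract:
    [Function] ~ [ERU] * [RUIS] * [ISR] / [DR].
    Units and scales are notional; only the proportionality matters."""
    if dr <= 0:
        raise ValueError("degradation rate must be positive")
    return eru * ruis * isr / dr

# Halving the degradation rate doubles sustainable function,
# with all resource terms held fixed.
base = function_capacity(0.8, 1.0, 100.0, 0.02)
improved = function_capacity(0.8, 1.0, 100.0, 0.01)
assert abs(improved / base - 2.0) < 1e-12
```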
Space science in the 1990's and beyond
NASA Astrophysics Data System (ADS)
Huntress, Wesley T., Jr.; Kicza, Mary E.; Feeley, T. Jens
NASA's Office of Space Sciences is changing its approach to our missions, both current and future. Budget realities are necessitating that we change the way we do business and the way we look at our role in the Federal Government. These challenges are being met by a new and innovative approach that focuses on achieving a balanced world-class space science program that requires far less resources while providing an enhanced role for technology and education as integral components of our Research and Development (R&D) programs.
Campos, Samuel K; Barry, Michael A
2004-11-01
There are extensive efforts to develop cell-targeting adenoviral vectors for gene therapy wherein endogenous cell-binding ligands are ablated and exogenous ligands are introduced by genetic means. Although current approaches can genetically manipulate the capsid genes of adenoviral vectors, these approaches can be time-consuming and require multiple steps to produce a modified viral genome. We present here the use of the bacteriophage lambda Red recombination system as a valuable tool for the easy and rapid construction of capsid-modified adenoviral genomes.
Crew considerations in the design for Space Station Freedom modules on-orbit maintenance
NASA Technical Reports Server (NTRS)
Stokes, Jack W.; Williams, Katherine A.
1992-01-01
The paper presents an approach to the maintenance process currently planned for the Space Station Freedom modules. In particular, it describes the planned crew interfaces with maintenance items, and the anticipated implications for the crew in performing the interior and exterior maintenance of modules developed by the U.S., ESA, and NASDA. Special consideration is given to the maintenance requirements, allocations, and approach; the maintenance design; the Maintenance Workstation; the robotic mechanisms; and the development of maintenance techniques.
Pandey, Manisha; Sekuloski, Silvana; Batzloff, Michael R
2009-07-01
Infections caused by group A streptococcus (GAS) represent a public health problem in both developing and developed countries. The currently available methods of prevention are either inadequate or ineffective, which is highlighted by the resurgence in invasive GAS infections over the past two decades. The management of GAS and associated diseases requires new and improved approaches. This review discusses various potential approaches to controlling GAS infections, ranging from prophylactic vaccines to antibody immunotherapy.
Dark matter in the coming decade: Complementary paths to discovery and beyond
Bauer, Daniel; Buckley, James; Cahill-Rowley, Matthew; ...
2015-05-27
Here, we summarize the many dark matter searches currently being pursued through four complementary approaches: direct detection, indirect detection, collider experiments, and astrophysical probes. The essential features of broad classes of experiments are described, each with their own strengths and weaknesses. Furthermore, we discuss the complementarity of the different dark matter searches qualitatively and illustrate it quantitatively in two simple theoretical frameworks. Our primary conclusion is that the diversity of possible dark matter candidates requires a balanced program drawing from all four approaches.
Space science in the 1990's and beyond
NASA Technical Reports Server (NTRS)
Huntress, Wesley T., Jr.; Kicza, Mary E.; Feeley, T. Jens
1994-01-01
NASA's Office of Space Sciences is changing its approach to our missions, both current and future. Budget realities are necessitating that we change the way we do business and the way we look at our role in the Federal Government. These challenges are being met by a new and innovative approach that focuses on achieving a balanced world-class space science program that requires far less resources while providing an enhanced role for technology and education as integral components of our Research and Development (R&D) programs.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
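The class of systems the authors target, those whose behavior can be described as a finite set of scenarios, can be illustrated with a toy transformation: merge the scenarios (event sequences) into a single deterministic transition table that accepts exactly the specified behaviors. This is only a hedged stand-in for the paper's method, which starts from restricted natural language and produces a *provably* equivalent formal model; the sketch below shows the scenario-merging step alone.

```python
def scenarios_to_fsm(scenarios):
    """Merge a finite set of scenarios (event sequences) into one
    deterministic transition table (a prefix-tree automaton)."""
    transitions, accepting, next_state = {}, set(), 1
    for events in scenarios:
        state = 0
        for e in events:
            if (state, e) not in transitions:
                transitions[(state, e)] = next_state
                next_state += 1
            state = transitions[(state, e)]
        accepting.add(state)
    return transitions, accepting

def accepts(model, events):
    """Check whether an event sequence is one of the specified behaviors."""
    transitions, accepting = model
    state = 0
    for e in events:
        if (state, e) not in transitions:
            return False
        state = transitions[(state, e)]
    return state in accepting

model = scenarios_to_fsm([("arm", "fire"), ("arm", "safe")])
assert accepts(model, ("arm", "fire")) and not accepts(model, ("fire",))
```

A formal model in this shape can then feed code generation directly, e.g. by emitting a switch over the transition table.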
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Quality, risk management and governance in mental health: an overview.
Callaly, Tom; Arya, Dinesh; Minas, Harry
2005-03-01
To consider the origin, current emphasis and relevance of the concepts of quality, risk management and clinical governance in mental health. Increasingly, health service boards and management teams are required to give attention to clinical governance rather than corporate governance alone. Clinical governance is a unifying quality concept that aims to produce a structure and systems to assure and improve the quality of clinical services by promoting an integrated and organization-wide approach towards continuous quality improvement. Many psychiatrists will find the reduction in clinical autonomy, the need to consider the welfare of the whole population as well as the individual patient for whom they are responsible, and the requirement that they play a part in a complex systems approach to quality improvement to be a challenge. Avoiding or ignoring this challenge will potentially lead to conflict with modern management approaches and increased loss of influence on future developments in mental health services.
Consistent approach to describing aircraft HIRF protection
NASA Technical Reports Server (NTRS)
Rimbey, P. R.; Walen, D. B.
1995-01-01
The high intensity radiated fields (HIRF) certification process as currently implemented comprises an inconsistent combination of factors that tend to emphasize worst-case scenarios in assessing commercial airplane certification requirements. By examining these factors, which include the process definition, the external HIRF environment, the aircraft coupling and corresponding internal fields, and the methods of measuring equipment susceptibilities, an approach to appraising airplane vulnerability to HIRF is proposed. This approach utilizes technically based criteria to evaluate the nature of the threat, including the probability of encountering the external HIRF environment. No single test or analytic method comprehensively addresses the full HIRF threat frequency spectrum; additional tools such as statistical methods must be adopted to arrive at more realistic requirements reflecting commercial aircraft vulnerability to the HIRF threat. Test and analytic data are provided to support the conclusions of this report. This work was performed under NASA contract NAS1-19360, Task 52.
Meckel's cave access: anatomic study comparing the endoscopic transantral and endonasal approaches.
Van Rompaey, Jason; Suruliraj, Anand; Carrau, Ricardo; Panizza, Benedict; Solares, C Arturo
2014-04-01
Recent advances in endonasal endoscopy have facilitated surgical access to the lateral skull base, including areas such as Meckel's cave. This approach has been well documented; however, few studies have outlined transantral-specific access to Meckel's cave. A transantral approach provides a direct pathway to this region, obviating the need for extensive endonasal and transsphenoidal resection. Our aim in this study is to compare the anatomical perspectives obtained in the endonasal and transantral approaches. We prepared 14 cadaveric specimens with intravascular injections of colored latex. Eight cadavers underwent endoscopic endonasal transpterygoid approaches to Meckel's cave. Six additional specimens underwent an endoscopic transantral approach to the same region. Photographic evidence was obtained for review. Thirty CT scans were analyzed to measure comparative distances to Meckel's cave for both approaches. The endoscopic approaches provided direct access to the anterior and inferior portions of Meckel's cave. However, the transantral approach required shorter instrumentation and did not require clearing of the endonasal corridor. This approach gave an anterior view of Meckel's cave, making posterior dissection more difficult. A transantral approach to Meckel's cave provides access similar to the endonasal approach with minimal invasiveness, and some of the morbidity associated with extensive endonasal resection could possibly be avoided. Better understanding of the complex skull base anatomy, from different perspectives, helps to improve current endoscopic skull base surgery and to develop new alternatives, consequently leading to improvements in safety and efficacy.
Reducing the Bottleneck in Discovery of Novel Antibiotics.
Jones, Marcus B; Nierman, William C; Shan, Yue; Frank, Bryan C; Spoering, Amy; Ling, Losee; Peoples, Aaron; Zullo, Ashley; Lewis, Kim; Nelson, Karen E
2017-04-01
Most antibiotics were discovered by screening soil actinomycetes, but the efficiency of the discovery platform collapsed in the 1960s. By now, more than 3000 antibiotics have been described, and most of the current discovery effort is focused on the rediscovery of known compounds, making the approach impractical. The last marketed broad-spectrum antibiotics discovered were daptomycin, linezolid, and fidaxomicin. The current state of the art in the development of new anti-infectives is a non-existent pipeline in the absence of a discovery platform. This is particularly troubling given the emergence of pan-resistant pathogens. The current practice in dealing with the background of known compounds is to use chemical dereplication of extracts to assess the relative novelty of the compounds they contain. Dereplication typically requires scale-up, extraction, and often fractionation before an accurate mass and structure can be produced by MS analysis in combination with 2D NMR. Here, we describe a transcriptome analysis approach using RNA sequencing (RNA-Seq) to identify promising novel antimicrobial compounds from microbial extracts. Our pipeline permits identification of antimicrobial compounds that produce distinct transcription profiles using unfractionated cell extracts. This efficient pipeline will eliminate the requirement for purification and structure determination of compounds from extracts and will facilitate high-throughput screening of cell extracts for identification of novel compounds.
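The dereplication-by-transcriptome idea can be sketched as profile matching: an extract whose expression response correlates strongly with that of a known antibiotic is deprioritized, while uncorrelated profiles are flagged as potentially novel. Everything below (the Pearson-correlation metric, the 0.95 threshold, and the toy 5-gene profiles) is an illustrative assumption, not the published pipeline.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def classify(profile, references, threshold=0.95):
    """Return the known compound class whose transcription profile the
    extract matches, or 'novel' if none correlates strongly enough."""
    for name, ref in references.items():
        if pearson(profile, ref) >= threshold:
            return name
    return "novel"

# Toy 5-gene expression profiles (hypothetical numbers).
known = {
    "cell-wall inhibitor": [5.0, 1.0, 0.5, 4.0, 2.0],
    "ribosome inhibitor":  [0.5, 6.0, 3.0, 0.8, 1.5],
}
extract_a = [5.1, 1.1, 0.4, 3.9, 2.1]   # tracks the cell-wall signature
extract_b = [2.0, 2.0, 5.0, 0.2, 4.0]   # matches neither reference
print(classify(extract_a, known))        # expected: cell-wall inhibitor
print(classify(extract_b, known))        # expected: novel
```

The key saving matches the abstract's claim: classification happens on unfractionated extracts, before any purification or structure determination.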
Nováková, Lucie; Pavlík, Jakub; Chrenková, Lucia; Martinec, Ondřej; Červený, Lukáš
2018-01-05
This review is Part II of a series aiming to provide a comprehensive overview of currently used antiviral drugs and to show modern approaches to their analysis. While Part I addressed antivirals against herpes viruses and antivirals against respiratory viruses, this part concerns antivirals against hepatitis viruses (B and C) and human immunodeficiency virus (HIV). Many novel antivirals against hepatitis C virus (HCV) and HIV have been introduced into clinical practice over the last decade. The recent broadening portfolio of these groups of antivirals is reflected in the increasing number of analytical methods developed to meet the needs of clinical practice. Part II summarizes the mechanisms of action of antivirals against hepatitis B virus (HBV), HCV, and HIV, their use in clinical practice, and analytical methods for the individual classes. It also provides an expert opinion on the state of the art in the field of bioanalysis of these drugs. The analytical methods reflect the novelty of these chemical structures and use by far the most current approaches, such as simple and high-throughput sample preparation and fast separation, often by means of UHPLC-MS/MS. Proper method validation based on the requirements of bioanalytical guidelines is an inherent part of the developed methods. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Piłatowicz, Grzegorz; Budde-Meiwes, Heide; Kowal, Julia; Sarfert, Christel; Schoch, Eberhard; Königsmann, Martin; Sauer, Dirk Uwe
2016-11-01
Micro-hybrid vehicles (μH) are currently starting to dominate the European market and to seize a constantly growing share of the other leading markets in the world. On the one hand, the additional functionality of μH reduces CO2 emissions and improves fuel economy; on the other hand, the additional stress imposed on the lead-acid battery significantly reduces its expected service life in comparison to conventional vehicles. Because of that, μH require highly accurate battery state detection solutions. These are necessary to ensure vehicle reliability requirements, prolong service life and reduce warranty costs. This paper presents an electrical model based on the Butler-Volmer equation. The main novelty of the presented approach is its ability to accurately predict the dynamic response of a battery over a wide range of discharge current rates, states-of-charge and temperatures. The approach is fully implementable and adaptable on state-of-the-art low-cost platforms. Additionally, the results indicate that it is applicable as a supporting tool for state-of-charge and state-of-health estimation and is scalable to different battery technologies and sizes. Validation using both static pulses and a dynamic driving profile resulted in an average absolute error of 124 mV at a cranking current rate of 800 A.
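The Butler-Volmer equation underlying the authors' model relates electrode current to overpotential through two exponential branches, which is what lets a battery model stay accurate from small loads up to cranking currents. The sketch below evaluates the generic textbook form; the exchange current and transfer coefficients are illustrative placeholders, not the paper's fitted lead-acid parameters.

```python
import math

F, R = 96485.0, 8.314  # Faraday constant (C/mol), gas constant (J/mol/K)

def butler_volmer(eta, i0=1.0, alpha_a=0.5, alpha_c=0.5, T=298.15):
    """Butler-Volmer electrode current (A) as a function of the
    overpotential eta (V). i0 and the transfer coefficients are
    illustrative, not fitted lead-acid parameters."""
    f = F / (R * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))

# Near zero overpotential the response is nearly linear; at large |eta|
# one exponential branch dominates. Capturing that nonlinearity is what
# allows accuracy across the wide current range cited in the abstract.
assert butler_volmer(0.0) == 0.0
assert butler_volmer(0.1) > 0 > butler_volmer(-0.1)
```

Inverting this relation (current to overpotential) is one way such a model can feed voltage prediction for state-of-charge and state-of-health estimators on a low-cost platform.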
The implementation of problem-based learning: changing pedagogy in nurse education.
Creedy, D; Hand, B
1994-10-01
Problem-based learning (PBL) employs approaches to teaching and learning in nurse education that develop meaningful links between theory and practice. The adoption of such approaches, however, may require changes in pedagogical beliefs and practices which reflect a student-centred approach to teaching and learning. This paper focuses on a group of volunteer nurse educators (n = 14) who attended a 7-month professional development programme centred on introducing pedagogical changes when adopting PBL. From this group, three nurse educators participated in an in-depth study which aimed to examine the processes of conceptual change associated with adopting PBL as part of alternative teaching strategies. These three participants held common concerns about the changes required to their current teaching practices when moving to a new pedagogical approach. On completion of the programme, varying degrees of change in existing instructional practices were evident. This change was found to result from engaging educators in reflection about practice, providing opportunities to implement the new approaches on a trial basis, and providing feedback and support throughout the change process.
Tissue Engineering of Blood Vessels: Functional Requirements, Progress, and Future Challenges.
Kumar, Vivek A; Brewster, Luke P; Caves, Jeffrey M; Chaikof, Elliot L
2011-09-01
Vascular disease results in the decreased utility and decreased availability of autologous vascular tissue for small diameter (< 6 mm) vessel replacements. While synthetic polymer alternatives to date have failed to match the performance of autogenous conduits, tissue-engineered replacement vessels represent an ideal solution to this clinical problem. Ongoing progress requires combined approaches from biomaterials science, cell biology, and translational medicine to develop feasible solutions with the requisite mechanical support, a non-fouling surface for blood flow, and tissue regeneration. Over the past two decades interest in blood vessel tissue engineering has soared on a global scale, resulting in the first clinical implants of multiple technologies, steady progress with several other systems, and critical lessons learned. This review will highlight the current inadequacies of autologous and synthetic grafts, the engineering requirements for implantation of tissue-engineered grafts, and the current status of tissue-engineered blood vessel research.
Towards a Competency-based Vision for Construction Safety Education
NASA Astrophysics Data System (ADS)
Pedro, Akeem; Hai Chien, Pham; Park, Chan Sik
2018-04-01
Accidents still prevail in the construction industry, resulting in injuries and fatalities all over the world. Educational programs in construction should deliver safety knowledge and skills to students who will become responsible for ensuring safe construction work environments in the future. However, there is a gap between the competencies that current pedagogical approaches target and those required for safety in practice. This study contributes to addressing this issue in three steps. Firstly, a vision for competency-based construction safety education is conceived. Building upon this, a research scheme to achieve the vision is developed, and the first step of the scheme is initiated in this study. The critical competencies required for safety education are investigated through analyses of the literature, and confirmed through surveys with construction and safety management professionals. Results from the study would be useful in establishing and orienting education programs towards current industry safety needs and requirements.
A template-based approach for responsibility management in executable business processes
NASA Astrophysics Data System (ADS)
Cabanillas, Cristina; Resinas, Manuel; Ruiz-Cortés, Antonio
2018-05-01
Process-oriented organisations need to manage the different types of responsibilities their employees may have w.r.t. the activities involved in their business processes. Although several approaches provide support for responsibility modelling, in current Business Process Management Systems (BPMS) the only responsibility considered at runtime is the one related to performing the work required for activity completion. Others, like accountability or consultation, must be implemented by manually adding activities to the executable process model, which is time-consuming and error-prone. In this paper, we address this limitation by enabling current BPMS to execute processes in which people with different responsibilities interact to complete the activities. We introduce a metamodel based on Responsibility Assignment Matrices (RAM) to model the responsibility assignment for each activity, and a flexible template-based mechanism that automatically transforms such information into BPMN elements, which can be interpreted and executed by a BPMS. Thus, our approach does not enforce any specific behaviour for the different responsibilities; instead, new templates can be modelled to specify the interaction that best suits the activity requirements. Furthermore, libraries of templates can be created and reused in different processes. We provide a reference implementation and build a library of templates for a well-known set of responsibilities.
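The template idea can be sketched minimally as below; the activity, the RACI-style assignments, and the template bodies are all hypothetical (the actual approach generates BPMN elements, not strings):

```python
# Hypothetical Responsibility Assignment Matrix: one activity, three
# responsibility types (R = responsible, A = accountable, C = consulted).
RAM = {
    "Approve invoice": {"R": "clerk", "A": "manager", "C": "auditor"},
}

# One template per responsibility type, describing how that responsibility
# surfaces as extra executable work in the process model.
TEMPLATES = {
    "R": lambda act, who: [f"User task '{act}' assigned to {who}"],
    "A": lambda act, who: [f"Review task 'Sign off {act}' assigned to {who}"],
    "C": lambda act, who: [f"Send consultation request for '{act}' to {who}",
                           f"Receive consultation reply for '{act}' from {who}"],
}

def expand(ram):
    """Expand each (activity, responsibility) pair via its template."""
    tasks = []
    for activity, assignments in ram.items():
        for resp, person in assignments.items():
            tasks.extend(TEMPLATES[resp](activity, person))
    return tasks

for task in expand(RAM):
    print(task)
```

The point of the template indirection is that the consultation pattern (or any other) can be swapped without touching the RAM itself.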
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-04-01
This article describes a system developed for rapid light-off of underbody catalysts that has shown potential to meet Euro Stage III emissions targets and to be more cost-effective than some alternatives. Future emissions legislation will require SI engine aftertreatment systems to approach full operating efficiency within the first few seconds after starting to reduce the high total-emissions fraction currently contributed by the cold phase of driving. A reduction of cold-start emissions during Phase 1 (Euro) or Bag 1 (FTP), which in many cases can be as much as 80% of the total for the cycle, has been achieved by electrical heating of the catalytic converter. But electrically heated catalyst (EHC) systems require high currents (100-200 A) to heat the metallic substrate to light-off temperatures over the first 15-20 seconds. Other viable approaches to reducing cold-start emissions include use of a fuel-powered burner upstream of the catalyst. However, as with EHC, the complexity of parts and the introduction of raw fuel into the exhaust system make this device unsatisfactory. Still another approach, an exhaust gas ignition (EGI) system, was first demonstrated in 1991. The operation of a system developed by engineers at Ford Motor Co., Ltd., Cambustion Ltd., and Tickford Ltd. is described here.
An Internet of Things platform architecture for supporting ambient assisted living environments.
Tsirmpas, Charalampos; Kouris, Ioannis; Anastasiou, Athanasios; Giokas, Kostas; Iliopoulou, Dimitra; Koutsouris, Dimitris
2017-01-01
The Internet of Things (IoT) is the logical further development of today's Internet, enabling a huge number of devices to communicate, compute, sense and act. IoT sensors placed in Ambient Assisted Living (AAL) environments enable context awareness and allow the support of the elderly in their daily routines, ultimately allowing an independent and safe lifestyle. The vast amount of data generated and exchanged between the IoT nodes requires innovative context modeling approaches that go beyond currently used models. This paper presents and evaluates an open, interoperable platform architecture that utilizes the technical characteristics of IoT and handles the large amount of generated data, as a solution to the technical requirements of AAL applications.
NASA Technical Reports Server (NTRS)
Gendreau, Keith; Cash, Webster; Gorenstein, Paul; Windt, David; Kaaret, Phil; Reynolds, Chris
2004-01-01
The Beyond Einstein Program in NASA's Office of Space Science Structure and Evolution of the Universe theme spells out the top-level scientific requirements for a Black Hole Imager in its strategic plan. The MAXIM mission will provide better than one tenth of a microarcsecond imaging in the X-ray band in order to satisfy these requirements. We provide an overview of the driving requirements to achieve these goals and ultimately to resolve the event horizon of a supermassive black hole. We present the current status of this effort, which includes a study of a baseline design as well as two alternative approaches.
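A back-of-envelope check, using textbook constants and commonly quoted figures for Sgr A* (values assumed here, not taken from the abstract), shows why sub-microarcsecond resolution is the driving requirement: even the nearest supermassive black hole's event horizon subtends only tens of microarcseconds:

```python
G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                 # speed of light, m/s
M_SUN = 1.989e30            # solar mass, kg
PC = 3.086e16               # parsec, m
RAD_TO_UAS = 206264.8e6     # radians -> microarcseconds

def horizon_angular_diameter_uas(mass_msun, distance_pc):
    """Angular diameter (microarcsec) of the 2*R_s event-horizon disk."""
    r_s = 2 * G * mass_msun * M_SUN / C**2   # Schwarzschild radius, m
    return (2 * r_s / (distance_pc * PC)) * RAD_TO_UAS

# Sgr A*: roughly 4e6 solar masses at roughly 8 kpc
print(round(horizon_angular_diameter_uas(4e6, 8000), 1))  # ~20 microarcsec
```

At ~20 microarcseconds across, a 0.1 microarcsecond imager would place a couple of hundred resolution elements across the horizon, which is what "resolving" it means in practice.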
High flexible Hydropower Generation concepts for future grids
NASA Astrophysics Data System (ADS)
Hell, Johann
2017-04-01
The ongoing changes in electric power generation are resulting in new requirements for classical generating units. In consequence, a paradigm change in the operation of power systems is necessary and a new approach to finding solutions is needed. This paper deals with the new requirements for current and future energy systems, with a focus on hydro power generation. A power generation landscape for some European regions is shown, and generation and operational flexibility is explained. Based on the requirements of the Transmission System Operator in the UK, the transient performance of a pumped storage installation is discussed.
NASA Technical Reports Server (NTRS)
Barber, Bryan; Kahn, Laura; Wong, David
1990-01-01
Offshore operations such as oil drilling and radar monitoring require semisubmersible platforms to remain stationary at specific locations in the Gulf of Mexico. Ocean currents, wind, and waves in the Gulf of Mexico tend to move platforms away from their desired locations. A computer model was created to predict the station-keeping requirements of a platform. The computer simulation uses remote sensing data from satellites and buoys as input. A background of the project, alternative approaches, and the details of the simulation are presented.
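A simplified sketch of the kind of environmental-load estimate such a station-keeping model rests on, using a quadratic drag law with invented platform dimensions and coefficients (none of these numbers come from the study):

```python
def drag_force(density, cd, area, speed):
    """Quadratic drag law: F = 0.5 * rho * Cd * A * v^2, in newtons."""
    return 0.5 * density * cd * area * speed**2

# Hypothetical platform: a 1 m/s surface current acting on 800 m^2 of
# wetted hull area, and a 20 m/s wind on 600 m^2 of exposed topsides.
current_load = drag_force(1025.0, 1.0, 800.0, 1.0)   # seawater density
wind_load = drag_force(1.225, 1.2, 600.0, 20.0)      # air density

# Total steady environmental load the mooring/thrusters must balance, N.
print(round(current_load + wind_load))
```

A real simulation adds wave drift forces and uses measured current and wind fields (here is where the satellite and buoy data enter) rather than constants.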
Jibb, Lindsay A; Stevens, Bonnie J; Nathan, Paul C; Seto, Emily; Cafazzo, Joseph A; Stinson, Jennifer N
2014-03-19
Pain that occurs both within and outside of the hospital setting is a common and distressing problem for adolescents with cancer. The use of smartphone technology may facilitate rapid, in-the-moment pain support for this population. To ensure the best possible pain management advice is given, evidence-based and expert-vetted care algorithms and system design features, which are designed using user-centered methods, are required. To develop the decision algorithm and system requirements that will inform the pain management advice provided by a real-time smartphone-based pain management app for adolescents with cancer. A systematic approach to algorithm development and system design was utilized. Initially, a comprehensive literature review was undertaken to understand the current body of knowledge pertaining to pediatric cancer pain management. A user-centered approach to development was used as the results of the review were disseminated to 15 international experts (clinicians, scientists, and a consumer) in pediatric pain, pediatric oncology and mHealth design, who participated in a 2-day consensus conference. This conference used nominal group technique to develop consensus on important pain inputs, pain management advice, and system design requirements. Using data generated at the conference, a prototype algorithm was developed. Iterative qualitative testing was conducted with adolescents with cancer, as well as pediatric oncology and pain health care providers to vet and refine the developed algorithm and system requirements for the real-time smartphone app. The systematic literature review established the current state of research related to nonpharmacological pediatric cancer pain management. The 2-day consensus conference established which clinically important pain inputs by adolescents would require action (pain management advice) from the app, the appropriate advice the app should provide to adolescents in pain, and the functional requirements of the app. 
These results were used to build a detailed prototype algorithm capable of providing adolescents with pain management support based on their individual pain. Analysis of qualitative interviews with 9 multidisciplinary health care professionals and 10 adolescents resulted in 4 themes that helped to adapt the algorithm and requirements to the needs of adolescents. Specifically, themes were overall endorsement of the system, the need for a clinical expert, the need to individualize the system, and changes to the algorithm to improve potential clinical effectiveness. This study used a phased and user-centered approach to develop a pain management algorithm for adolescents with cancer and the system requirements of an associated app. The smartphone software is currently being created and subsequent work will focus on the usability, feasibility, and effectiveness testing of the app for adolescents with cancer pain.
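A toy rule-based sketch of a single decision step like those such an algorithm encodes; the thresholds and advice strings below are hypothetical, not the study's expert-vetted content:

```python
def pain_advice(score_0_to_10, interferes_with_sleep=False):
    """Map a self-reported pain input to tiered advice (illustrative rules)."""
    if score_0_to_10 >= 7:
        return "severe: contact your care team now"
    if score_0_to_10 >= 4 or interferes_with_sleep:
        return "moderate: try a guided relaxation strategy and reassess in 30 min"
    return "mild: continue current self-management"

print(pain_advice(8))
print(pain_advice(3, interferes_with_sleep=True))
```

The study's iterative interviews effectively tuned exactly this kind of mapping: which inputs require action, and what the action should be.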
Security Requirements Management in Software Product Line Engineering
NASA Astrophysics Data System (ADS)
Mellado, Daniel; Fernández-Medina, Eduardo; Piattini, Mario
Security requirements engineering is both a central task and a critical success factor in product line development due to the complexity and extensive nature of product lines. However, most of the current product line practices in requirements engineering do not adequately address security requirements engineering. Therefore, in this chapter we will propose a security requirements engineering process (SREPPLine) driven by security standards and based on a security requirements decision model along with a security variability model to manage the variability of the artefacts related to security requirements. The aim of this approach is to deal with security requirements from the early stages of the product line development in a systematic way, in order to facilitate conformance with the most relevant security standards with regard to the management of security requirements, such as ISO/IEC 27001 and ISO/IEC 15408.
New architectural paradigms for multi-petabyte distributed storage systems
NASA Technical Reports Server (NTRS)
Lee, Richard R.
1994-01-01
In the not too distant future, programs such as NASA's Earth Observing System, the NSF/ARPA/NASA Digital Libraries Initiative and the Intelligence Community's (NSA, CIA, NRO, etc.) mass storage system upgrades will all require multi-petabyte (petabyte: 10^15 bytes of bitfile data) or larger distributed storage solutions. None of these requirements, as currently defined, will meet their objectives using either today's architectural paradigms or storage solutions. Radically new approaches will be required not only to store and manage veritable 'mountain ranges of data', but to make the cost of ownership affordable, much less practical, in today's (and certainly the future's) austere budget environment. Within this paper we explore new architectural paradigms and project systems performance benefits and dollars per petabyte of information stored. We discuss essential 'top down' approaches to achieving an overall systems-level performance capability sufficient to meet the challenges of these major programs.
Essential use cases for pedagogical patterns
NASA Astrophysics Data System (ADS)
Derntl, Michael; Botturi, Luca
2006-06-01
Coming from architecture, through computer science, pattern-based design spread into other disciplines and is nowadays recognized as a powerful way of capturing and reusing effective design practice. However, current pedagogical pattern approaches lack widespread adoption, both by users and authors, and are still limited to individual initiatives. This paper contributes to creating a shared understanding of what a pattern system is by defining the key terms. Moreover, the paper builds upon and extends a set of existing functional and non-functional requirements for pattern systems, adds structure to these requirements, and derives essential use cases following a goal-based approach for both pattern maintenance and pattern application. Finally, implications concerning the pedagogical use of pattern-based design are drawn, concluding that a stronger focus on the underlying (pedagogical) value system is required in order to make a pattern system a meaningful tool for effective educational design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaufmann, John R.; Hand, James R.; Halverson, Mark A.
This report evaluates how and when to best integrate renewable energy requirements into building energy codes. The basic goals were to: (1) provide a rough guide of where we're going and how to get there; (2) identify key issues that need to be considered, including a discussion of various options with pros and cons, to help inform code deliberations; and (3) help foster alignment among energy code-development organizations. The authors researched current approaches nationally and internationally, conducted a survey of key stakeholders to solicit input on various approaches, and evaluated the key issues related to integration of renewable energy requirements and various options to address those issues. The report concludes with recommendations and a plan to engage stakeholders. This report does not evaluate whether the use of renewable energy should be required on buildings; that question involves a political decision that is beyond the scope of this report.
Culture change, leadership and the grass-roots workforce.
Edwards, Mark; Penlington, Clare; Kalidasan, Varadarajan; Kelly, Tony
2014-08-01
The NHS is arguably entering its most challenging era. It is being asked to do more for less and, in parallel, a cultural shift in response to its described weaknesses has been prescribed. The definition of culture, the form this change should take and the mechanism to achieve it are not well understood. The complexity of modern healthcare requires that we evolve our approach to the workforce and enhance our understanding of the styles of leadership that are required in order to bring about this cultural change. Identification of leaders within the workforce and dissemination of a purposeful and strategic quality improvement agenda, in part defined by the general workforce, are important components in establishing the change that the organisation currently requires. We are implementing this approach locally by identifying and developing grassroots networks linked to a portfolio of safety and quality projects. © 2014 Royal College of Physicians.
Operations Assessment of Launch Vehicle Architectures using Activity Based Cost Models
NASA Technical Reports Server (NTRS)
Ruiz-Torres, Alex J.; McCleskey, Carey
2000-01-01
The growing emphasis on affordability for space transportation systems requires the assessment of new space vehicles for all life cycle activities, from design and development, through manufacturing and operations. This paper addresses the operational assessment of launch vehicles, focusing on modeling the ground support requirements of a vehicle architecture, and estimating the resulting costs and flight rate. This paper proposes the use of Activity Based Costing (ABC) modeling for this assessment. The model uses expert knowledge to determine the activities, the activity times and the activity costs based on vehicle design characteristics. The approach provides several advantages to current approaches to vehicle architecture assessment including easier validation and allowing vehicle designers to understand the cost and cycle time drivers.
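At its core, the ABC approach rolls per-activity time and cost estimates, driven by vehicle design characteristics, up into cycle time and cost per flight. A toy roll-up with invented activities, hours, and labor rates:

```python
# Hypothetical ground-processing activities for one flight cycle:
# (name, labor hours, labor rate in $/hr) - all values illustrative.
activities = [
    ("vehicle mate/demate",  120, 95.0),
    ("propellant loading",    40, 110.0),
    ("avionics checkout",     60, 105.0),
]

# Cycle time drives achievable flight rate; cost rolls up per activity.
cycle_time_hr = sum(hours for _, hours, _ in activities)
cost_per_flight = sum(hours * rate for _, hours, rate in activities)

print(cycle_time_hr, round(cost_per_flight, 2))
```

The advantage the paper claims follows directly from this structure: because each line item is an identifiable activity, designers can see which vehicle characteristics drive which costs, and experts can validate the model entry by entry.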
NASA Technical Reports Server (NTRS)
Drake, Jeffrey T.; Prasad, Nadipuram R.
1999-01-01
This paper surveys recent advances in communications that utilize soft computing approaches to phase synchronization. Soft computing, as opposed to hard computing, is a collection of complementary methodologies that act in producing the most desirable control, decision, or estimation strategies. Recently, the communications area has explored the use of the principal constituents of soft computing, namely, fuzzy logic, neural networks, and genetic algorithms, for modeling, control, and most recently for the estimation of phase in phase-coherent communications. If the receiver in a digital communications system is phase-coherent, as is often the case, phase synchronization is required. Synchronization thus requires estimation and/or control at the receiver of an unknown or random phase offset.
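For context, the classical baseline that soft-computing phase estimators are compared against is the maximum-likelihood estimate of a constant phase offset from known (data-wiped) samples; a minimal noiseless sketch:

```python
import cmath
import math

def estimate_phase(received, transmitted):
    """ML estimate of a constant offset: theta_hat = arg(sum_k r_k * conj(s_k))."""
    acc = sum(r * s.conjugate() for r, s in zip(received, transmitted))
    return cmath.phase(acc)

# Simulate a complex baseband tone rotated by an unknown constant offset.
true_offset = 0.3  # radians
sent = [cmath.exp(1j * 2 * math.pi * k / 8) for k in range(64)]
recv = [s * cmath.exp(1j * true_offset) for s in sent]

print(round(estimate_phase(recv, sent), 3))  # recovers 0.3 rad
```

Fuzzy or neural estimators surveyed in the paper target the harder cases this closed form does not cover, such as unknown data, time-varying phase, or non-Gaussian noise.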
Machine learning action parameters in lattice quantum chromodynamics
NASA Astrophysics Data System (ADS)
Shanahan, Phiala E.; Trewartha, Daniel; Detmold, William
2018-05-01
Numerical lattice quantum chromodynamics studies of the strong interaction are important in many aspects of particle and nuclear physics. Such studies require significant computing resources to undertake. A number of proposed methods promise improved efficiency of lattice calculations, and access to regions of parameter space that are currently computationally intractable, via multi-scale action-matching approaches that necessitate parametric regression of generated lattice datasets. The applicability of machine learning to this regression task is investigated, with deep neural networks found to provide an efficient solution even in cases where approaches such as principal component analysis fail. The high information content and complex symmetries inherent in lattice QCD datasets require custom neural network layers to be introduced and present opportunities for further development.
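A much-simplified stand-in for the parametric regression step, with synthetic data and a single linear model in place of a deep network: recover the action parameter beta that generated a measured observable.

```python
import random

random.seed(0)
# Synthetic "lattice" data: observable = 2.0 * beta + small noise.
data = [(beta, 2.0 * beta + random.gauss(0, 0.01))
        for beta in (5.0 + 0.01 * i for i in range(100))]

# Fit the inverse map (observable -> beta) by plain gradient descent on
# squared error; a deep network replaces this linear model in practice.
w, lr = 0.0, 0.004
for _ in range(200):
    grad = sum(2 * (w * obs - beta) * obs for beta, obs in data) / len(data)
    w -= lr * grad

print(round(w, 2))  # ~0.5, the inverse of the generating slope 2.0
```

The real task differs in scale and structure (high-dimensional, symmetry-constrained inputs), which is why the paper introduces custom network layers, but the regression objective is the same shape.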
Bondü, Rebecca; Scheithauer, Herbert
2009-01-01
The school shootings in Winnenden and Ansbach in March and September 2009 once again demonstrated the need for preventive approaches to avert further offences in Germany. Due to the low frequency of such offences and the low specificity of the risk factors known so far, however, prediction and prevention are difficult. Nonetheless, several preventive approaches are currently discussed. The present article highlights these approaches and their specific advantages and disadvantages. As school shootings are multicausally determined, approaches focusing only on single aspects (e.g. prohibiting violent computer games or further strengthening gun laws) do not meet the requirements. Other measures, such as installing technical safety devices or optimizing the actions of police and school attendants, are intended to reduce harm in case of emergency. Scientifically founded and promising preventive approaches instead focus on secondary prevention and for this purpose employ the threat assessment approach, which is widespread in the USA. In this framework, responsible occupational groups such as teachers, school psychologists and police officers are trained to identify students' warning signs, to assess the danger such students pose to themselves and others in a systematic process, and to initiate suitable interventions.
Development of practical high temperature superconducting wire for electric power application
NASA Technical Reports Server (NTRS)
Hawsey, Robert A.; Sokolowski, Robert S.; Haldar, Pradeep; Motowidlo, Leszek R.
1995-01-01
The technology of high temperature superconductivity has moved beyond mere scientific curiosity and into the manufacturing environment. Single lengths of multifilamentary wire are now produced that are over 200 meters long and carry over 13 amperes at 77 K. Short-sample critical current densities approach 5 x 10^4 A/sq cm at 77 K. Conductor requirements such as high critical current density in a magnetic field, strain-tolerant sheathing materials, and other engineering properties are addressed. A new process for fabricating round BSCCO-2212 wire has produced wires with critical current densities as high as 165,000 A/sq cm at 4.2 K and 53,000 A/sq cm at 40 K. This process eliminates the costly multiple pressing and rolling steps that are commonly used to develop texture in the wires. New multifilamentary wires with strengthened sheathing materials have shown yield strengths up to a factor of five better than those made with pure silver. Many electric power devices require the wire to be formed into coils for the production of strong magnetic fields. Requirements for coils and magnets for electric power applications are described.
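The quoted figures connect through the basic relation I_c = J_c x A. For example, the reported short-sample current density is consistent with the 13 A long-length rating if the superconducting cross-section is about 2.6e-4 sq cm; that cross-section is inferred here for illustration, not stated in the abstract:

```python
def critical_current(jc_a_per_cm2, area_cm2):
    """Critical current (A) = critical current density * superconducting area."""
    return jc_a_per_cm2 * area_cm2

# Assumed filament-bundle cross-section of 2.6e-4 sq cm at the short-sample
# J_c of 5e4 A/sq cm (77 K) gives roughly the reported 13 A.
print(round(critical_current(5e4, 2.6e-4), 3))
```

The same relation explains why in-field J_c retention matters: if J_c drops in a magnet's own field, the coil must carry proportionally more superconductor cross-section.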
Loads and low frequency dynamics - An ENVIRONET data base
NASA Technical Reports Server (NTRS)
Garba, John A.
1988-01-01
The loads and low frequency dynamics data base, part of Environet, is described with particular attention given to its development and contents. The objective of the data base is to provide the payload designer with design approaches and design data to meet STS safety requirements. Currently the data base consists of the following sections: abstract, scope, glossary, requirements, interaction with other environments, summary of the loads analysis process, design considerations, guidelines for payload design loads, information data base, and references.
The challenges of simulating wake vortex encounters and assessing separation criteria
NASA Technical Reports Server (NTRS)
Dunham, R. E.; Stuever, Robert A.; Vicroy, Dan D.
1993-01-01
During landings and take-offs, the longitudinal spacing between airplanes is in part determined by the safe separation required to avoid the trailing vortex wake of the preceding aircraft. Safe exploration of the feasibility of reducing longitudinal separation standards will require use of aircraft simulators. This paper discusses the approaches to vortex modeling, methods for modeling the aircraft/vortex interaction, some of the previous attempts of defining vortex hazard criteria, and current understanding of the development of vortex hazard criteria.
Human Planetary Landing System (HPLS) Capability Roadmap NRC Progress Review
NASA Technical Reports Server (NTRS)
Manning, Rob; Schmitt, Harrison H.; Graves, Claude
2005-01-01
Capability Roadmap Team. Capability Description, Scope and Capability Breakdown Structure. Benefits of the HPLS. Roadmap Process and Approach. Current State-of-the-Art, Assumptions and Key Requirements. Top Level HPLS Roadmap. Capability Presentations by Leads. Mission Drivers Requirements. "AEDL" System Engineering. Communication & Navigation Systems. Hypersonic Systems. Super to Subsonic Decelerator Systems. Terminal Descent and Landing Systems. A Priori In-Situ Mars Observations. AEDL Analysis, Test and Validation Infrastructure. Capability Technical Challenges. Capability Connection Points to other Roadmaps/Crosswalks. Summary of Top Level Capability. Forward Work.
Nichols, J.D.; Runge, M.C.; Johnson, F.A.; Williams, B.K.
2007-01-01
Since 1995, the US Fish and Wildlife Service has used an adaptive approach to the management of sport harvest of mid-continent Mallard ducks (Anas platyrhynchos) in North America. This approach differs from many current approaches to conservation and management in requiring close collaboration between managers and scientists. Key elements of this process are objectives, alternative management actions, models permitting prediction of system responses, and a monitoring program. The iterative process produces optimal management decisions and leads to reduction in uncertainty about response of populations to management. This general approach to management has a number of desirable features and is recommended for use in many other programs of management and conservation.
Multi-aircraft dynamics, navigation and operation
NASA Astrophysics Data System (ADS)
Houck, Sharon Wester
Air traffic control stands on the brink of a revolution. Fifty years from now, we will look back and marvel that we ever flew by radio beacons and radar alone, much as we now marvel that early aviation pioneers flew by chronometer and compass alone. The microprocessor, satellite navigation systems, and air-to-air data links are the technical keys to this revolution. Many airports are near or at capacity now for at least portions of the day, making it clear that major increases in airport capacity will be required in order to support the projected growth in air traffic. This can be accomplished by adding airports, adding runways at existing airports, or increasing the capacity of the existing runways. Technology that allows use of ultra closely spaced (750 ft to 2500 ft) parallel approaches would greatly reduce the environmental impact of airport capacity increases. This research tackles the problem of multi aircraft dynamics, navigation, and operation, specifically in the terminal area, and presents new findings on how ultra closely spaced parallel approaches may be accomplished. The underlying approach considers how multiple aircraft are flown in visual conditions, where spacing criteria are much less stringent, and then uses this data to study the critical parameters for collision avoidance during an ultra closely spaced parallel approach. Also included are experimental and analytical investigations of advanced guidance systems that are critical components of precision approaches. Together, these investigations form a novel approach to the design and analysis of parallel approaches for runways spaced less than 2500 ft apart. This research has concluded that it is technically feasible to reduce the required runway spacing during simultaneous instrument approaches to less than the current minimum of 3400 ft with the use of advanced navigation systems while maintaining the currently accepted levels of safety.
On a smooth day with both pilots flying a tunnel-in-the-sky display and being guided by a Category I LAAS, it is technically feasible to reduce the runway spacing to 1100 ft. If a Category I LAAS and an "intelligent auto-pilot" that executes both the approach and emergency escape maneuver are used, the technically achievable required runway spacing is reduced to 750 ft. Both statements presume full aircraft state information, including position, velocity, and attitude, is being reliably passed between aircraft at a rate equal to or greater than one Hz.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low-level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure, unlike the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490
A Modular Approach To Developing A Large Deployable Reflector
NASA Astrophysics Data System (ADS)
Pittman, R.; Leidich, C.; Mascy, F.; Swenson, B.
1984-01-01
NASA is currently studying the feasibility of developing a Large Deployable Reflector (LDR) astronomical facility to perform astrophysical studies of the infrared and submillimeter portion of the spectrum in the mid 1990's. The LDR concept was recommended by the Astronomy Survey Committee of the National Academy of Sciences as one of two space based projects to be started this decade. The current baseline calls for a 20 m (65.6 ft) aperture telescope diffraction limited at 30 μm and automatically deployed from a single Shuttle launch. The volume, performance, and single launch constraints place great demands on the technology and place LDR beyond the state-of-the-art in certain areas such as lightweight reflector segments. The advent of the Shuttle is opening up many new options and capabilities for producing large space systems. Until now, LDR has always been conceived as an integrated system, deployed autonomously in a single launch. This paper will look at a combination of automatic deployment and on-orbit assembly that may reduce the technological complexity and cost of the LDR system. Many technological tools are now in use or under study that will greatly enhance our capabilities to do assembly in space. Two Shuttle volume budget scenarios will be examined to assess the potential of these tools to reduce the LDR system complexity. Further study will be required to reach the full optimal combination of deployment and assembly, since in most cases the capabilities of these new tools have not been demonstrated. In order to take maximum advantage of these concepts, the design of LDR must be flexible and allow one subsystem to be modified without adversely affecting the entire system. One method of achieving this flexibility is to use a modular design approach in which the major subsystems are physically separated during launch and assembled on orbit. 
A modular design approach facilitates this flexibility but requires that the subsystems be interfaced in a simple, straightforward, and controlled manner. NASA is currently defining a technology development plan for LDR which will identify the technology advances that are required. The modular approach offers the flexibility to easily incorporate these new advances into the design.
Software beamforming: comparison between a phased array and synthetic transmit aperture.
Li, Yen-Feng; Li, Pai-Chi
2011-04-01
The data-transfer and computation requirements are compared between software-based beamforming using a phased array (PA) and a synthetic transmit aperture (STA). The advantages of a software-based architecture are reduced system complexity and lower hardware cost. Although this architecture can be implemented using commercial CPUs or GPUs, the high computation and data-transfer requirements limit its real-time beamforming performance. In particular, transferring the raw rf data from the front-end subsystem to the software back-end remains challenging with current state-of-the-art electronics technologies, which offsets the cost advantage of the software back-end. This study investigated the tradeoff between the data-transfer and computation requirements. Two beamforming methods based on a PA and STA, respectively, were used: the former requires a higher data-transfer rate and the latter requires more memory operations. The beamformers were implemented in an NVIDIA GeForce GTX 260 GPU and an Intel Core i7 920 CPU. The frame rate of PA beamforming was 42 fps with a 128-element array transducer, with 2048 samples per firing and 189 beams per image (with a 95 MB/frame data-transfer requirement). The frame rate of STA beamforming was 40 fps with 16 firings per image (with an 8 MB/frame data-transfer requirement). Both approaches achieved real-time beamforming performance, but each had its own bottleneck: the required data-transfer speed was considerably lower in STA beamforming, but it required more memory operations, which limited the overall computation time. The advantages of the GPU approach over the CPU approach were clearly demonstrated.
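The per-frame data-transfer figures quoted above can be reproduced from the stated acquisition parameters. The sketch below assumes 16-bit samples and that every element digitizes every firing; the byte width is not stated in the abstract and is an assumption, chosen because it reproduces the quoted figures.

```python
# Rough per-frame raw RF data volumes implied by the abstract's numbers.
ELEMENTS = 128          # array transducer elements
SAMPLES = 2048          # samples recorded per firing
BYTES_PER_SAMPLE = 2    # assumed 16-bit ADC samples

def frame_bytes(firings):
    """Raw RF data per frame: every element records every firing."""
    return ELEMENTS * SAMPLES * firings * BYTES_PER_SAMPLE

pa_mb = frame_bytes(189) / 2**20   # phased array: one firing per beam, 189 beams
sta_mb = frame_bytes(16) / 2**20   # synthetic transmit aperture: 16 firings

print(f"PA:  {pa_mb:.1f} MB/frame")   # ~94.5 MB, matching the quoted ~95 MB
print(f"STA: {sta_mb:.1f} MB/frame")  # 8.0 MB, matching the quoted 8 MB
```

That both results land on the quoted ~95 MB and 8 MB per frame is why 16-bit samples seem a reasonable assumption here.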
Respiratory sensitization and allergy: Current research approaches and needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boverhof, Darrell R.; Billington, Richard; Gollapudi, B. Bhaskar
2008-01-01
There are currently no accepted regulatory models for assessing the potential of a substance to cause respiratory sensitization and allergy. In contrast, a number of models exist for the assessment of contact sensitization and allergic contact dermatitis (ACD). Research indicates that respiratory sensitizers may be identified through contact sensitization assays such as the local lymph node assay, although only a small subset of the compounds that yield positive results in these assays are actually respiratory sensitizers. Due to the increasing health concerns associated with occupational asthma and the impending directives on the regulation of respiratory sensitizers and allergens, an approach which can identify these compounds and distinguish them from contact sensitizers is required. This report discusses some of the important contrasts between respiratory allergy and ACD, and highlights several prominent in vivo, in vitro and in silico approaches that are being applied or could be further developed to identify compounds capable of causing respiratory allergy. Although a number of animal models have been used for researching respiratory sensitization and allergy, protocols and endpoints for these approaches are often inconsistent, costly and difficult to reproduce, thereby limiting meaningful comparisons of data between laboratories and development of a consensus approach. A number of emerging in vitro and in silico models show promise for use in the characterization of contact sensitization potential and should be further explored for their ability to identify and differentiate contact and respiratory sensitizers. Ultimately, the development of a consistent, accurate and cost-effective model will likely incorporate a number of these approaches and will require effective communication, collaboration and consensus among all stakeholders.
Advanced EUV mask and imaging modeling
NASA Astrophysics Data System (ADS)
Evanschitzky, Peter; Erdmann, Andreas
2017-10-01
The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.
From Piloting e-Submission to Electronic Management of Assessment (EMA): Mapping Grading Journeys
ERIC Educational Resources Information Center
Vergés Bausili, Anna
2018-01-01
The increasing interest in electronic management of assessment is a sign of a gradual institutionalisation of e-submission and e-marking technologies in UK Higher Education. The effective adoption of these technologies requires a managed approach, especially a detailed understanding of current assessment practices within the institution and the…
Bayesian Statistics in Educational Research: A Look at the Current State of Affairs
ERIC Educational Resources Information Center
König, Christoph; van de Schoot, Rens
2018-01-01
The ability of a scientific discipline to build cumulative knowledge depends on its predominant method of data analysis. A steady accumulation of knowledge requires approaches which allow researchers to consider results from comparable prior research. Bayesian statistics is especially relevant for establishing a cumulative scientific discipline,…
Academic Airframe Icing Perspective
NASA Technical Reports Server (NTRS)
Bragg, Mike; Rothmayer, Alric; Thompson, David
2009-01-01
2-D ice accretion and aerodynamics reasonably well understood for engineering applications To significantly improve our current capabilities we need to understand 3-D: a) Important ice accretion physics and modeling not well understood in 3-D; and b) Aerodynamics unsteady and 3-D especially near stall. Larger systems issues important and require multidisciplinary team approach
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... inquiries and investigations. The current approach of requiring members to report the reference time instead... proposing amendments to the equity trade reporting rules relating to reporting (i) an additional time field for specified trades, (ii) execution time in milliseconds, (iii) reversals, (iv) trades executed on...
Blended Training on Scientific Software: A Study on How Scientific Data Are Generated
ERIC Educational Resources Information Center
Skordaki, Efrosyni-Maria; Bainbridge, Susan
2018-01-01
This paper presents the results of a research study on scientific software training in blended learning environments. The investigation focused on training approaches followed by scientific software users whose goal is the reliable application of such software. A key issue in current literature is the requirement for a theory-substantiated…
Health supply chain management.
Zimmerman, Rolf; Gallagher, Pat
2010-01-01
This chapter gives an educational overview of: * The actual application of supply chain practice and disciplines required for service delivery improvement within the current health environment. * A rationale for the application of Supply Chain Management (SCM) approaches to the Health sector. * The tools and methods available for supply chain analysis and benchmarking. * Key supply chain success factors.
Ethics Education in Australian Preservice Teacher Programs: A Hidden Imperative?
ERIC Educational Resources Information Center
Boon, Helen J.; Maxwell, Bruce
2016-01-01
This paper provides a snapshot of the current approach to ethics education in accredited Australian pre-service teacher programs. Methods included a manual calendar search of ethics related subjects required in teacher programs using a sample of 24 Australian universities and a survey of 26 university representatives. Findings show a paucity of…
Parental Perceptions and Recommendations of Computing Majors: A Technology Acceptance Model Approach
ERIC Educational Resources Information Center
Powell, Loreen; Wimmer, Hayden
2017-01-01
Currently, there are more technology-related jobs than there are graduates in supply. The need to understand user acceptance of computing degrees is the first step in increasing enrollment in computing fields. Additionally, valid measurement scales for predicting user acceptance of Information Technology degree programs are required. The majority…
Special Education Teachers' Views on Using Technology in Teaching Mathematics
ERIC Educational Resources Information Center
Baglama, Basak; Yikmis, Ahmet; Demirok, Mukaddes Sakalli
2017-01-01
Individuals with special needs require support in acquiring various academic and social skills, and mathematical skills are among the most important skills that individuals with special needs must acquire in order to maintain their daily lives. Current approaches in education emphasize the importance of integrating technology into special…
2005-06-01
of current military C2 organizations. The unit of analysis for organizational diagnosis is the Joint Task Force (JTF). It represents a multi-Service...Strategy as Structured Chaos Boston, MA: Harvard Business School Press (1998). [5] Burton, R.M. and Obel, B., Strategic Organizational Diagnosis and
A GIS APPROACH FOR IDENTIFYING SPECIES AND LOCATIONS AT RISK FROM OFF-TARGET MOVEMENT OF PESTICIDES
In many countries, numerous tests are required prior to pesticide registration for the protection of human health and the environment from the unintended effects of chemical releases. Current methodology used by the US EPA for determining plant species at risk from off site movem...
Word Processing and Its Implications for Business Communications Courses.
ERIC Educational Resources Information Center
Kruk, Leonard B.
Word processing, a systematic approach to office work, is currently based on the use of sophisticated dictating and typing machines. The word processing market is rapidly increasing with the paper explosion brought on by such factors as increasing governmental regulation, Internal Revenue Service requirements, and the need for stockholders to be…
A simulated approach to estimating PM10 and PM2.5 concentrations downwind from cotton gins
USDA-ARS?s Scientific Manuscript database
Cotton gins are required to obtain operating permits from state air pollution regulatory agencies (SAPRA), which regulate the amount of particulate matter that can be emitted. Industrial Source Complex Short Term version 3 (ISCST3) is the Gaussian dispersion model currently used by some SAPRAs to pr...
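ISCST3 and similar SAPRA dispersion models are built on the textbook Gaussian plume relation. The sketch below evaluates that relation directly, with the dispersion coefficients supplied as inputs (real models derive them from atmospheric stability class and downwind distance); all numeric values are purely illustrative, not gin emission data.

```python
import math

def plume_conc(q, u, y, z, h, sig_y, sig_z):
    """Gaussian plume concentration (g/m^3) at a receptor.
    q: emission rate (g/s), u: wind speed (m/s), y: crosswind offset (m),
    z: receptor height (m), h: effective release height (m),
    sig_y/sig_z: lateral/vertical dispersion coefficients (m)."""
    lateral = math.exp(-y**2 / (2 * sig_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sig_z**2)) +
                math.exp(-(z + h)**2 / (2 * sig_z**2)))  # ground reflection term
    return q / (2 * math.pi * u * sig_y * sig_z) * lateral * vertical

# Illustrative receptor at breathing height, directly downwind of a 10 m source
c = plume_conc(q=5.0, u=3.0, y=0.0, z=1.5, h=10.0, sig_y=30.0, sig_z=15.0)
print(f"{c:.2e} g/m^3")
```

Regulatory models layer source geometry, terrain, and averaging-time corrections on top of this kernel, but the dependence on emission rate, wind speed, and dispersion coefficients is already visible here.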
"Drinking water quality at the consumer's tap is the center piece of U.S. drinking water regulations to protect people's health. Recently promulgated Stage II DBP rules are an example, which requires a system approach in a multi-barrier strategy for compliance and risk managemen...
Rethinking the Roles of Assessment in Music Education
ERIC Educational Resources Information Center
Scott, Sheila J.
2012-01-01
In music education, current attention to student-centered approaches for learning affects our understanding of student assessment. This view to curriculum reform requires new perspectives for assessment. There is a need to move beyond the summative use of assessment to assign grades to examining the roles of assessment in supporting and enhancing…
LBM-EP: Lattice-Boltzmann method for fast cardiac electrophysiology simulation from 3D images.
Rapaka, S; Mansi, T; Georgescu, B; Pop, M; Wright, G A; Kamen, A; Comaniciu, Dorin
2012-01-01
Current treatments of heart rhythm disorders require careful planning and guidance for optimal outcomes. Computational models of cardiac electrophysiology are being proposed for therapy planning, but current approaches are either too simplified or too computationally intensive for patient-specific simulations in clinical practice. This paper presents a novel approach, LBM-EP, to solve any type of mono-domain cardiac electrophysiology model in near real time that is especially tailored for patient-specific simulations. The domain is discretized on a Cartesian grid with a level-set representation of the patient's heart geometry, previously estimated from images automatically. The cell model is calculated node-wise, while the transmembrane potential is diffused using the Lattice-Boltzmann method within the domain defined by the level-set. Experiments on synthetic cases, on a data set from CESC'10, and on one patient with myocardium scar showed that LBM-EP provides results comparable to an FEM implementation, while being 10-45 times faster. Fast, accurate, scalable and requiring no specific meshing, LBM-EP paves the way to efficient and detailed models of cardiac electrophysiology for therapy planning.
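The operator splitting described above (a node-wise reaction step from the cell model, then diffusion of the transmembrane potential on a Cartesian grid masked by the heart geometry) can be illustrated with a much simpler stand-in. The sketch below uses plain explicit finite differences and a toy cubic reaction term, not the paper's Lattice-Boltzmann solver or cell model; every parameter value is an illustrative assumption.

```python
import numpy as np

def step(v, mask, dt=0.01, d=1.0, h=1.0):
    """One time step: node-wise reaction, then diffusion inside the mask."""
    # Reaction step (toy cubic excitation term, an assumption for illustration)
    v = v + dt * v * (1.0 - v) * (v - 0.1)
    # Diffusion step: 5-point Laplacian on the Cartesian grid
    lap = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
           np.roll(v, 1, 1) + np.roll(v, -1, 1) - 4 * v) / h**2
    v = v + dt * d * lap
    # Restrict the potential to the level-set-style "heart" mask
    return np.where(mask, v, 0.0)

# Tiny domain: a circular "heart" mask with its left half stimulated
n = 32
yy, xx = np.mgrid[0:n, 0:n]
mask = (xx - n // 2) ** 2 + (yy - n // 2) ** 2 < (n // 3) ** 2
v = np.where((xx < n // 2) & mask, 1.0, 0.0)
for _ in range(100):
    v = step(v, mask)
print(f"max potential after 100 steps: {v.max():.3f}")
```

The real method replaces the explicit Laplacian with a Lattice-Boltzmann update, which is what makes it massively parallel and 10-45x faster than FEM; the reaction/diffusion structure is the same.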
Materials Requirements for Advanced Propulsion Systems
NASA Technical Reports Server (NTRS)
Whitaker, Ann F.; Cook, Mary Beth; Clinton, R. G., Jr.
2005-01-01
NASA's mission to "reach the Moon and Mars" will be obtained only if research begins now to develop materials with expanded capabilities to reduce mass, cost and risk to the program. Current materials cannot function satisfactorily in the deep space environments and do not meet the requirements of long term space propulsion concepts for manned missions. Directed research is needed to better understand materials behavior for optimizing their processing. This research, generating a deeper understanding of material behavior, can lead to enhanced implementation of materials for future exploration vehicles. materials providing new approaches for manufacture and new options for In response to this need for more robust materials, NASA's Exploration Systems Mission Directorate (ESMD) has established a strategic research initiative dedicated to materials development supporting NASA's space propulsion needs. The Advanced Materials for Exploration (AME) element directs basic and applied research to understand material behavior and develop improved materials allowing propulsion systems to operate beyond their current limitations. This paper will discuss the approach used to direct the path of strategic research for advanced materials to ensure that the research is indeed supportive of NASA's future missions to the moon, Mars, and beyond.
Minimally invasive surgery of the anterior skull base: transorbital approaches
Gassner, Holger G.; Schwan, Franziska; Schebesch, Karl-Michael
2016-01-01
Minimally invasive approaches are becoming increasingly popular to access the anterior skull base. With interdisciplinary cooperation, in particular endonasal endoscopic approaches have seen an impressive expansion of indications over the past decades. The more recently described transorbital approaches represent minimally invasive alternatives with a differing spectrum of access corridors. The purpose of the present paper is to discuss transorbital approaches to the anterior skull base in the light of the current literature. The transorbital approaches allow excellent exposure of areas that are difficult to reach like the anterior and posterior wall of the frontal sinus; working angles may be more favorable and the paranasal sinus system can be preserved while exposing the skull base. Because of their minimal morbidity and the cosmetically excellent results, the transorbital approaches represent an important addition to established endonasal endoscopic and open approaches to the anterior skull base. Their execution requires an interdisciplinary team approach. PMID:27453759
Classification of DNA nucleotides with transverse tunneling currents
NASA Astrophysics Data System (ADS)
Nyvold Pedersen, Jonas; Boynton, Paul; Di Ventra, Massimiliano; Jauho, Antti-Pekka; Flyvbjerg, Henrik
2017-01-01
It has been theoretically suggested and experimentally demonstrated that fast and low-cost sequencing of DNA, RNA, and peptide molecules might be achieved by passing such molecules between electrodes embedded in a nanochannel. The experimental realization of this scheme faces major challenges, however. In realistic liquid environments, typical currents in tunneling devices are of the order of picoamps. This corresponds to only six electrons per microsecond, and this number affects the integration time required to do current measurements in real experiments. This limits the speed of sequencing, though current fluctuations due to Brownian motion of the molecule average out during the required integration time. Moreover, data acquisition equipment introduces noise, and electronic filters create correlations in time-series data. We discuss how these effects must be included in the analysis of, e.g., the assignment of specific nucleobases to current signals. As the signals from different molecules overlap, unambiguous classification is impossible with a single measurement. We argue that the assignment of molecules to a signal is a standard pattern classification problem and calculation of the error rates is straightforward. The ideas presented here can be extended to other sequencing approaches of current interest.
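The abstract's central point, that single-measurement current distributions of different bases overlap while averaging many tunneling-current readings separates them, can be illustrated with a toy two-class threshold classifier. The mean currents and noise level below are invented illustrative values, not measured data.

```python
import numpy as np

rng = np.random.default_rng(0)

mu_a, mu_b, sigma = 1.0, 1.2, 0.5   # assumed mean currents (pA) and noise level
threshold = (mu_a + mu_b) / 2       # midpoint decision rule between the two bases

def error_rate(n_avg, trials=20000):
    """Fraction of base-A events misread as base B when averaging n_avg readings."""
    samples = rng.normal(mu_a, sigma, size=(trials, n_avg)).mean(axis=1)
    return float(np.mean(samples > threshold))

print(f"1 reading:    {error_rate(1):.3f}")    # heavy overlap -> error near 0.42
print(f"100 readings: {error_rate(100):.3f}")  # averaging shrinks the error
```

Averaging N readings scales the effective noise by 1/sqrt(N), which is exactly why the few-electrons-per-microsecond current level sets the integration time, and hence the sequencing speed, in this scheme.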
The induced electric field due to a current transient
NASA Astrophysics Data System (ADS)
Beck, Y.; Braunstein, A.; Frankental, S.
2007-05-01
Calculations and measurements of the electric fields induced by a lightning strike are important for understanding the phenomenon and developing effective protection systems. In this paper, a novel, relativistic approach to the calculation of the electric fields due to lightning strikes is presented. It is based on a known current wave-pair model representing the lightning current wave, either at the first stage of the descending charge wave from the cloud or at the later stage of the return stroke. The electric fields computed are cylindrically symmetric. A simplified method for the calculation of the electric field is achieved by using special relativity theory and relativistic considerations. The proposed approach is based on simple expressions (by applying Coulomb's law) compared with the much more complicated partial differential equations based on Maxwell's equations. A straightforward method of calculating the electric field due to a lightning strike, modelled as a negative-positive (NP) wave-pair, is obtained by using special relativity theory to calculate the 'velocity field' and relativistic concepts to calculate the 'acceleration field'. These fields are the basic elements required for calculating the total field resulting from the current wave-pair model. Moreover, a modified, simpler method using sub-models is presented. The sub-models are filaments of either static charges or charges at constant velocity only. Combining these simple sub-models yields the total wave-pair model. The results fully agree with those obtained by solving Maxwell's equations for the discussed problem.
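The 'velocity field' building block mentioned above has a standard closed form for a point charge in uniform motion, which reduces to Coulomb's law at zero velocity. A sketch of that formula follows; the charge magnitude is an arbitrary illustrative value, not taken from the paper's lightning model.

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity (F/m)

def velocity_field(q, beta, r, theta):
    """|E| (V/m) of a uniformly moving point charge at distance r,
    angle theta from the velocity direction, with beta = v/c:
    E = q(1-beta^2) / (4*pi*eps0 * r^2 * (1 - beta^2 sin^2 theta)^(3/2))."""
    factor = (1 - beta**2) / (1 - beta**2 * math.sin(theta)**2) ** 1.5
    return q * factor / (4 * math.pi * EPS0 * r**2)

q = 1e-3  # 1 mC, an assumed illustrative charge scale

# At beta = 0 the expression reduces to Coulomb's law:
static = velocity_field(q, beta=0.0, r=100.0, theta=0.0)
coulomb = q / (4 * math.pi * EPS0 * 100.0**2)
print(abs(static - coulomb) < 1e-9)  # True

# For beta > 0 the field is compressed toward the transverse plane:
print(velocity_field(q, 0.5, 100.0, math.pi / 2) > static)  # True
```

Summing such contributions along charge filaments, plus the corresponding 'acceleration field' terms, is the kind of assembly the wave-pair approach performs in place of solving Maxwell's equations directly.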
NASA Astrophysics Data System (ADS)
Mukherjee, Saptarshi; Rosell, Anders; Udpa, Lalita; Udpa, Satish; Tamburrino, Antonello
2017-02-01
The modeling of the U-bend segment in steam generator tubes for predicting eddy current probe signals from cracks, wear and pitting in this region poses challenges and is non-trivial. Meshing the geometry in the Cartesian coordinate system might require a large number of elements to model the U-bend region. Also, since the lift-off distance between the probe and tube wall is usually very small, a very fine mesh is required near the probe region to accurately describe the eddy current field. This paper presents a U-bend model using differential geometry principles that exploits the result that Maxwell's equations are covariant with respect to changes of coordinates and independent of metrics. The equations remain unaltered in their form, regardless of the choice of the coordinate system, provided the field quantities are represented in the proper covariant and contravariant form. The complex shapes are mapped into simple straight sections, while small lift-off is mapped to larger values, thus reducing the intrinsic dimension of the mesh and stiffness matrix. In this contribution, the numerical implementation of the above approach is discussed with regard to field and current distributions within the U-bend tube wall. For the sake of simplicity, a two-dimensional test case is considered. The approach is evaluated in terms of efficiency and accuracy by comparing the results with those obtained using a conventional FE model in Cartesian coordinates.
Novel aspects of plasma control in ITER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphreys, D.; Jackson, G.; Walker, M.
2015-02-15
ITER plasma control design solutions and performance requirements are strongly driven by its nuclear mission, aggressive commissioning constraints, and limited number of operational discharges. In addition, high plasma energy content, heat fluxes, neutron fluxes, and very long pulse operation place novel demands on control performance in many areas ranging from plasma boundary and divertor regulation to plasma kinetics and stability control. Both commissioning and experimental operations schedules provide limited time for tuning of control algorithms relative to operating devices. Although many aspects of the control solutions required by ITER have been well-demonstrated in present devices and even designed satisfactorily for ITER application, many elements unique to ITER including various crucial integration issues are presently under development. We describe selected novel aspects of plasma control in ITER, identifying unique parts of the control problem and highlighting some key areas of research remaining. Novel control areas described include control physics understanding (e.g., current profile regulation, tearing mode (TM) suppression), control mathematics (e.g., algorithmic and simulation approaches to high confidence robust performance), and integration solutions (e.g., methods for management of highly subscribed control resources). We identify unique aspects of the ITER TM suppression scheme, which will pulse gyrotrons to drive current within a magnetic island, and turn the drive off following suppression in order to minimize use of auxiliary power and maximize fusion gain. The potential role of active current profile control and approaches to design in ITER are discussed. Issues and approaches to fault handling algorithms are described, along with novel aspects of actuator sharing in ITER.
Novel aspects of plasma control in ITER
Humphreys, David; Ambrosino, G.; de Vries, Peter; ...
2015-02-12
ITER plasma control design solutions and performance requirements are strongly driven by its nuclear mission, aggressive commissioning constraints, and limited number of operational discharges. In addition, high plasma energy content, heat fluxes, neutron fluxes, and very long pulse operation place novel demands on control performance in many areas ranging from plasma boundary and divertor regulation to plasma kinetics and stability control. Both commissioning and experimental operations schedules provide limited time for tuning of control algorithms relative to operating devices. Although many aspects of the control solutions required by ITER have been well-demonstrated in present devices and even designed satisfactorily for ITER application, many elements unique to ITER including various crucial integration issues are presently under development. We describe selected novel aspects of plasma control in ITER, identifying unique parts of the control problem and highlighting some key areas of research remaining. Novel control areas described include control physics understanding (e.g. current profile regulation, tearing mode (TM) suppression), control mathematics (e.g. algorithmic and simulation approaches to high confidence robust performance), and integration solutions (e.g. methods for management of highly subscribed control resources). We identify unique aspects of the ITER TM suppression scheme, which will pulse gyrotrons to drive current within a magnetic island, and turn the drive off following suppression in order to minimize use of auxiliary power and maximize fusion gain. The potential role of active current profile control and approaches to design in ITER are discussed. Finally, issues and approaches to fault handling algorithms are described, along with novel aspects of actuator sharing in ITER.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKone, Thomas E.; Ryan, P. Barry; Ozkaynak, Haluk
2007-02-01
Understanding and quantifying outdoor and indoor sources of human exposure are essential but often not adequately addressed in health-effects studies for air pollution. Air pollution epidemiology, risk assessment, health tracking and accountability assessments are examples of health-effects studies that require but often lack adequate exposure information. Recent advances in exposure modeling along with better information on time-activity and exposure factors data provide us with unique opportunities to improve the assignment of exposures for both future and ongoing studies linking air pollution to health impacts. In September 2006, scientists from the US Environmental Protection Agency (EPA) and the Centers for Disease Control and Prevention (CDC) along with scientists from the academic community and state health departments convened a symposium on air pollution exposure and health in order to identify, evaluate, and improve current approaches for linking air pollution exposures to disease. This manuscript presents the key issues, challenges and recommendations identified by the exposure working group, who used case studies of particulate matter, ozone, and toxic air pollutant exposure to evaluate health-effects for air pollution. One of the over-arching lessons of this workshop is that obtaining better exposure information for these different health-effects studies requires both goal-setting for what is needed and mapping out the transition pathway from current capabilities to meeting these goals. Meeting our long-term goals requires definition of incremental steps that provide useful information for the interim and move us toward our long-term goals. Another over-arching theme among the three different pollutants and the different health study approaches is the need for integration among alternate exposure assessment approaches.
For example, different groups may advocate exposure indicators, biomonitoring, mapping methods (GIS), modeling, environmental media monitoring, and/or personal exposure modeling. However, emerging research reveals that the greatest progress comes from integration among two or more of these efforts.« less
Restoration of Secondary Containment in Double Shell Tank (DST) Pits
DOE Office of Scientific and Technical Information (OSTI.GOV)
SHEN, E.J.
2000-10-05
Cracks found in many of the double-shell tank (DST) pump and valve pits bring into question the ability of the pits to provide secondary containment and remain in compliance with State and Federal regulations. This study was commissioned to identify viable options for maintaining/restoring secondary containment capability in these pits. The basis for this study is the decision analysis process, which identifies the requirements to be met and the desired goals (decision criteria) that each option will be weighed against. A facilitated workshop was convened with individuals knowledgeable of Tank Farms Operations, engineering practices, and safety/environmental requirements. The outcome of this workshop was the validation or identification of the critical requirements, definition of the current problem, identification and weighting of the desired goals, baselining of the current repair methods, and identification of potential alternate solutions. The workshop was followed up with further investigations into the potential solutions that were identified in the workshop and through other efforts. These solutions are identified in the body of this report. Each of the potential solutions was screened against the list of requirements, and only those meeting the requirements were considered viable options. To expand the field of viable options, hybrid concepts that combine the strongest features of different individual approaches were also examined. Several were identified. The decision analysis process then ranked each of the viable options against the weighted decision criteria, which resulted in a recommended solution. The recommended approach is based upon installing a sprayed-on coating system.
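The ranking step described above can be sketched as a weighted-sum scoring of each viable option against the decision criteria. This is a minimal illustration only; the option names, criteria, scores, and weights below are hypothetical, not taken from the report.

```python
# Minimal sketch of decision-analysis ranking: each viable option is
# scored against weighted decision criteria and ranked by weighted sum.
# All names and numbers below are hypothetical placeholders.

def rank_options(options, weights):
    """Rank options by weighted score; `options` maps name -> criterion scores."""
    scored = {
        name: sum(w * s for w, s in zip(weights, scores))
        for name, scores in options.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical criteria: [cost, durability, ease of installation]
weights = [0.5, 0.3, 0.2]
options = {
    "sprayed-on coating": [8, 7, 9],
    "welded liner": [4, 9, 3],
    "epoxy patch": [7, 5, 8],
}
ranking = rank_options(options, weights)
```

The highest-ranked entry of `ranking` is the recommended option under the chosen weights; changing the weights re-ranks the same alternatives, which is the essence of the decision-analysis process.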
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.
2013-01-01
Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). Ideally, these two papers would be accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and their linkage to design. As a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.
Aghdasi, Nava; Whipple, Mark; Humphreys, Ian M; Moe, Kris S; Hannaford, Blake; Bly, Randall A
2018-06-01
Successful multidisciplinary treatment of skull base pathology requires precise preoperative planning. Current surgical approach (pathway) selection for these complex procedures depends on an individual surgeon's experiences and background training. Because of anatomical variation in both normal tissue and pathology (eg, tumor), a successful surgical pathway used on one patient is not necessarily the best approach on another patient. The question is how to define and obtain optimized patient-specific surgical approach pathways. In this article, we demonstrate that the surgeon's knowledge and decision making in preoperative planning can be modeled by a multiobjective cost function in a retrospective analysis of actual complex skull base cases. Two different approaches, a weighted-sum approach and Pareto optimality, were used with a defined cost function to derive optimized surgical pathways based on preoperative computed tomography (CT) scans and manually designated pathology. With the first method, the surgeon's preferences were input as a set of weights for each objective before the search. In the second approach, the surgeon's preferences were used to select a surgical pathway from the computed Pareto optimal set. Using preoperative CT and magnetic resonance imaging, the patient-specific surgical pathways derived by these methods were similar (85% agreement) to the actual approaches performed on patients. In one case where the actual surgical approach was different, revision surgery was required and was performed utilizing the computationally derived approach pathway.
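The two selection strategies in the abstract can be sketched generically: a weighted-sum scalarization picks one pathway under fixed preference weights, while Pareto filtering keeps all non-dominated pathways for the surgeon to choose among. The candidate names and objective costs below are hypothetical illustrations, not the study's actual cost function.

```python
# Hedged sketch of weighted-sum vs. Pareto-optimal pathway selection
# over multiobjective costs (lower is better for both objectives).

def weighted_sum_best(candidates, weights):
    """Pick the pathway minimizing the weighted sum of objective costs."""
    return min(candidates, key=lambda c: sum(w * o for w, o in zip(weights, c[1])))

def pareto_set(candidates):
    """Keep pathways not dominated by any other candidate (minimization)."""
    def dominates(b, a):  # b is at least as good everywhere, strictly better somewhere
        return all(y <= x for x, y in zip(a, b)) and any(y < x for x, y in zip(a, b))
    return [c for c in candidates
            if not any(dominates(o[1], c[1]) for o in candidates if o is not c)]

# (name, (tissue_disruption, path_length)) -- hypothetical costs
candidates = [
    ("transnasal", (2.0, 5.0)),
    ("transorbital", (3.0, 3.0)),
    ("open craniotomy", (6.0, 4.0)),
]
best = weighted_sum_best(candidates, weights=[0.7, 0.3])
front = pareto_set(candidates)
```

With these numbers the weighted sum selects one pathway outright, whereas the Pareto front retains the two non-dominated trade-offs, mirroring the "weights before the search" versus "choose from the computed set" distinction in the abstract.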
On-board fault management for autonomous spacecraft
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Doyle, Susan C.; Martin, Eric; Sellers, Suzanne
1991-01-01
The dynamic nature of the Cargo Transfer Vehicle's (CTV) mission and the high level of autonomy required mandate a complete fault management system capable of operating under uncertain conditions. Such a fault management system must take into account the current mission phase and the environment (including the target vehicle), as well as the CTV's state of health. This level of capability is beyond the scope of current on-board fault management systems. This presentation will discuss work in progress at TRW to apply artificial intelligence to the problem of on-board fault management. The goal of this work is to develop fault management systems that can meet the needs of spacecraft that have long-range autonomy requirements. We have implemented a model-based approach to fault detection and isolation that does not require explicit characterization of failures prior to launch. It is thus able to detect failures that were not considered in the failure modes and effects analysis. We have applied this technique to several different subsystems and tested our approach against both simulations and an electrical power system hardware testbed. We present findings from simulation and hardware tests which demonstrate the ability of our model-based system to detect and isolate failures, and describe our work in porting the Ada version of this system to a flight-qualified processor. We also discuss current research aimed at expanding our system to monitor the entire spacecraft.
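The core of a model-based detection scheme of the kind described above is residual checking: compare sensor readings against a model's predictions and flag components whose residuals exceed a threshold, without enumerating failure modes in advance. The "model" below is a trivial stand-in and every telemetry name and value is hypothetical.

```python
# Minimal residual-based fault detection sketch: a component is flagged
# when |measured - predicted| exceeds a threshold. Names are hypothetical.

def detect_faults(predicted, measured, threshold):
    """Return component names whose residual exceeds the threshold."""
    return [name for name in predicted
            if abs(measured[name] - predicted[name]) > threshold]

predicted = {"bus_voltage": 28.0, "battery_temp": 20.0, "wheel_speed": 2000.0}
measured  = {"bus_voltage": 27.9, "battery_temp": 35.5, "wheel_speed": 2001.0}
faults = detect_faults(predicted, measured, threshold=5.0)
print(faults)  # → ['battery_temp']
```

A real system would derive `predicted` from a physical model of each subsystem and isolate the fault by reasoning over which component's misbehavior explains the full residual pattern; the sketch shows only the detection step.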
Efficient non-hydrostatic modelling of 3D wave-induced currents using a subgrid approach
NASA Astrophysics Data System (ADS)
Rijnsdorp, Dirk P.; Smit, Pieter B.; Zijlema, Marcel; Reniers, Ad J. H. M.
2017-08-01
Wave-induced currents are a ubiquitous feature in coastal waters that can spread material over the surf zone and the inner shelf. These currents are typically under-resolved in non-hydrostatic wave-flow models due to computational constraints. Specifically, the low vertical resolutions adequate to describe the wave dynamics, and required to feasibly compute at the scales of a field site, are too coarse to account for the relevant details of the three-dimensional (3D) flow field. To describe the relevant dynamics of both waves and currents, while retaining a model framework that can be applied at field scales, we propose a two-grid approach to solve the governing equations. With this approach, the vertical accelerations and non-hydrostatic pressures are resolved on a relatively coarse vertical grid (which is sufficient to accurately resolve the wave dynamics), whereas the horizontal velocities and turbulent stresses are resolved on a much finer subgrid (whose resolution is dictated by the vertical scale of the mean flows). This approach ensures that the discrete pressure Poisson equation, the solution of which dominates the computational effort, is evaluated on the coarse grid scale, thereby greatly improving efficiency, while providing a fine vertical resolution to resolve the vertical variation of the mean flow. This work presents the general methodology, and discusses the numerical implementation in the SWASH wave-flow model. Model predictions are compared with observations from three flume experiments to demonstrate that the subgrid approach captures both the nearshore evolution of the waves and the wave-induced flows such as the undertow profile and longshore current. The accuracy of the subgrid predictions is comparable to fully resolved 3D simulations, but at much reduced computational costs.
The findings of this work thereby demonstrate that the subgrid approach has the potential to make 3D non-hydrostatic simulations feasible at the scale of a realistic coastal region.
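The essence of the two-grid idea can be illustrated with the transfer operators between the grids: the expensive quantity (non-hydrostatic pressure) lives on a coarse vertical grid obtained by layer averaging, while velocities stay on the fine subgrid. This is only a conceptual sketch with made-up layer values; the actual SWASH implementation is far more involved.

```python
# Conceptual sketch of vertical two-grid transfer: restriction averages
# groups of fine layers to one coarse layer; prolongation copies a
# coarse-layer value back onto its fine layers. Values are illustrative.

def restrict(fine, ratio):
    """Average each group of `ratio` fine layers to one coarse layer."""
    return [sum(fine[i:i + ratio]) / ratio for i in range(0, len(fine), ratio)]

def prolong(coarse, ratio):
    """Copy each coarse-layer value onto its `ratio` fine layers."""
    return [v for v in coarse for _ in range(ratio)]

u_fine = [0.1, 0.2, 0.4, 0.5, 0.7, 0.8]   # a velocity column on 6 fine layers
p_coarse = restrict(u_fine, ratio=3)       # pressure solved on 2 coarse layers
u_back = prolong(p_coarse, ratio=3)        # coarse field mapped back to 6 layers
```

The computational saving comes from solving the Poisson equation only on the coarse layers while the fine layers retain the vertical structure of the mean flow.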
An open, component-based information infrastructure for integrated health information networks.
Tsiknakis, Manolis; Katehakis, Dimitrios G; Orphanoudakis, Stelios C
2002-12-18
A fundamental requirement for achieving continuity of care is the seamless sharing of multimedia clinical information. Different technological approaches can be adopted for enabling the communication and sharing of health record segments. In the context of the emerging global information society, the creation of and access to the integrated electronic health record (I-EHR) of a citizen has been assigned high priority in many countries. This requirement is complementary to an overall requirement for the creation of a health information infrastructure (HII) to support the provision of a variety of health telematics and e-health services. In developing a regional or national HII, the components or building blocks that make up the overall information system ought to be defined and an appropriate component architecture specified. This paper discusses current international priorities and trends in developing the HII. It presents technological challenges and alternative approaches towards the creation of an I-EHR, being the aggregation of health data created during all interactions of an individual with the healthcare system. It also presents results from an ongoing Research and Development (R&D) effort towards the implementation of the HII in HYGEIAnet, the regional health information network of Crete, Greece, using a component-based software engineering approach. Critical design decisions and related trade-offs, involved in the process of component specification and development, are also discussed and the current state of development of an I-EHR service is presented. Finally, Human Computer Interaction (HCI) and security issues, which are important for the deployment and use of any I-EHR service, are considered.
Meeting EHR security requirements: SeAAS approach.
Katt, Basel; Trojer, Thomas; Breu, Ruth; Schabetsberger, Thomas; Wozak, Florian
2010-01-01
In the last few years, Electronic Health Record (EHR) systems have received great attention in the literature, as well as in the industry. They are expected to lead to health care savings, increase health care quality and reduce medical errors. This interest has been accompanied by the development of different standards and frameworks to meet EHR challenges. One of the most important initiatives developed to solve problems of EHR is IHE (Integrating the Healthcare Enterprise), which adopts a distributed approach to store and manage healthcare data. IHE aims at standardizing the way healthcare systems exchange information in distributed environments. For this purpose it defines several so-called Integration Profiles that specify the interactions and the interfaces (Transactions) between various healthcare systems (Actors) or entities. Security was also considered in a few profiles that tackle the main security requirements, mainly authentication and audit trails. The security profiles of IHE currently suffer from two drawbacks. First, they apply an end-point security methodology, which has recently been shown to be insufficient and cumbersome in distributed and heterogeneous environments. Second, the current security profiles for more complex security requirements are oversimplified and vague, and do not consider architectural design. This has recently changed to some extent, e.g., with the introduction of newly published white papers regarding privacy [5] and access control [9]. In order to solve the first problem, we utilize results of previous studies conducted in the area of security-aware IHE-based systems and the state-of-the-art Security-as-a-Service approach as a convenient methodology to group domain-wide security needs and overcome the end-point security shortcomings.
Andersen, Melvin E.; Clewell, Harvey J.; Carmichael, Paul L.; Boekelheide, Kim
2013-01-01
The 2007 report “Toxicity Testing in the 21st Century: A Vision and A Strategy” argued for a change in toxicity testing for environmental agents and discussed federal funding mechanisms that could be used to support this transformation within the USA. The new approach would test for in vitro perturbations of toxicity pathways using human cells with high throughput testing platforms. The NRC report proposed a deliberate timeline, spanning about 20 years, to implement a wholesale replacement of current in-life toxicity test approaches focused on apical responses with in vitro assays. One approach to accelerating implementation is to focus on well-studied prototype compounds with known toxicity pathway targets. Through a series of carefully executed case studies with four or five pathway prototypes, the various steps required for implementation of an in vitro toxicity pathway approach to risk assessment could be developed and refined. In this article, we discuss alternative approaches for implementation and also outline advantages of a case study approach and the manner in which the case studies could be pursued using current methodologies. A case study approach would be complementary to recently proposed efforts to map the human toxome, while representing a significant extension toward more formal risk assessment compared to the profiling and prioritization approaches offered by programs such as the EPA’s ToxCast effort. PMID:21993955
Watch what you say, your computer might be listening: A review of automated speech recognition
NASA Technical Reports Server (NTRS)
Degennaro, Stephen V.
1991-01-01
Spoken language is the most convenient and natural means by which people interact with each other and is, therefore, a promising candidate for human-machine interactions. Speech also offers an additional channel for hands-busy applications, complementing the use of motor output channels for control. Current speech recognition systems vary considerably across a number of important characteristics, including vocabulary size, speaking mode, training requirements for new speakers, robustness to acoustic environments, and accuracy. Algorithmically, these systems range from rule-based techniques through more probabilistic or self-learning approaches such as hidden Markov modeling and neural networks. This tutorial begins with a brief summary of the relevant features of current speech recognition systems and the strengths and weaknesses of the various algorithmic approaches.
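Of the algorithmic approaches named above, hidden Markov modeling can be illustrated with the forward algorithm, which computes the likelihood of an observed symbol sequence under an HMM; recognizers score competing word models this way. The two-state model and all probabilities below are toy values for illustration only.

```python
# Toy forward algorithm for a two-state HMM: returns P(obs) by summing
# over all hidden state paths. All probabilities are made up.

def forward(obs, start_p, trans_p, emit_p):
    """Return P(obs) under the HMM via the forward algorithm."""
    states = list(start_p)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[r] * trans_p[r][s] for r in states) * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

start_p = {"voiced": 0.6, "unvoiced": 0.4}
trans_p = {"voiced": {"voiced": 0.7, "unvoiced": 0.3},
           "unvoiced": {"voiced": 0.4, "unvoiced": 0.6}}
emit_p = {"voiced": {"a": 0.8, "s": 0.2},
          "unvoiced": {"a": 0.1, "s": 0.9}}
likelihood = forward(["a", "s", "a"], start_p, trans_p, emit_p)
```

In a real recognizer the observations are acoustic feature vectors rather than symbols, and the word (or phone) model with the highest likelihood wins; the recursion is the same.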
Deep ultraviolet light-emitting and laser diodes
NASA Astrophysics Data System (ADS)
Khan, Asif; Asif, Fatima; Muhtadi, Sakib
2016-02-01
Nearly all air-water purification/polymer curing systems and bio-medical instruments require 250-300 nm wavelength ultraviolet light, for which mercury lamps are primarily used. As a potential replacement for these hazardous mercury lamps, several global research teams are developing AlGaN-based Deep Ultraviolet (DUV) light emitting diodes (LEDs), DUV LED lamps, and laser diodes over sapphire and AlN substrates. In this paper, we review the current research focus and the latest device results. In addition to the current results, we also discuss a new quasi-pseudomorphic device design approach. This approach, which is much easier to integrate into a commercial production setting, was successfully used to demonstrate UVC devices on sapphire substrates with performance levels equal to or better than those of conventional relaxed device designs.
X-56A MUTT: Aeroservoelastic Modeling
NASA Technical Reports Server (NTRS)
Ouellette, Jeffrey A.
2015-01-01
For the NASA X-56A Program, Armstrong Flight Research Center has been developing a set of linear state-space models that integrate the flight dynamics and structural dynamics. These high-order models are needed for control design, control evaluation, and test input design. The current focus has been on developing stiff-wing models to validate the current modeling approach. The extension of the modeling approach to the flexible wings requires only a change in the structural model. Individual subsystem models (actuators, inertial properties, etc.) have been validated by component-level ground tests. Closed-loop simulation of maneuvers designed to validate the flight dynamics of these models correlates very well with flight test data. The open-loop structural dynamics are also shown to correlate well with the flight test data.
Prediction of car cabin environment by means of 1D and 3D cabin model
NASA Astrophysics Data System (ADS)
Fišer, J.; Pokorný, J.; Jícha, M.
2012-04-01
Thermal comfort, as well as the reduction of the energy requirements of air-conditioning systems in vehicle cabins, is currently a very intensively investigated and up-to-date issue. The article deals with two approaches to modelling the car cabin environment: the first model was created in the simulation language Modelica (a typical 1D approach without cabin geometry) and the second one was created in the specialized software Theseus-FE (a 3D approach with cabin geometry). The performance and capabilities of these tools are demonstrated on the example of the car cabin, and the results from simulations are compared with the results from real car cabin climate chamber measurements.
Mesenchymal stem cell therapy for acute radiation syndrome.
Fukumoto, Risaku
2016-01-01
Acute radiation syndrome affects military personnel and civilians following the uncontrolled dispersal of radiation, such as that caused by detonation of nuclear devices and inappropriate medical treatments. Therefore, there is a growing need for medical interventions that facilitate the improved recovery of victims and patients. One promising approach may be cell therapy, which, when appropriately implemented, may facilitate recovery from whole body injuries. This editorial highlights the current knowledge regarding the use of mesenchymal stem cells for the treatment of acute radiation syndrome, the benefits and limitations of which are under investigation. Establishing successful therapies for acute radiation syndrome may require using such a therapeutic approach in addition to conventional approaches.
Vote Stuffing Control in IPTV-based Recommender Systems
NASA Astrophysics Data System (ADS)
Bhatt, Rajen
Vote stuffing is a general problem in the functioning of content rating-based recommender systems. Currently, IPTV viewers browse various contents based on the program ratings. In this paper, we propose a fuzzy clustering-based approach to remove the effects of vote stuffing and consider only the genuine ratings for programs over multiple genres. The approach requires only one authentic rating, which is generally available from recommendation system administrators or program broadcasters. The entire process is automated using fuzzy c-means clustering. Computational experiments performed over one real-world program rating database show that the proposed approach is very efficient for controlling vote stuffing.
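The filtering idea above can be sketched with a plain 1-D fuzzy c-means: cluster the submitted ratings, then keep only the ratings whose highest membership falls in the cluster whose center lies closest to the single authentic rating. The ratings below are hypothetical, and this bare-bones c-means (m = 2, two clusters) stands in for whatever configuration the paper actually uses.

```python
# Hedged sketch of a vote-stuffing filter via 1-D fuzzy c-means.
# All rating data are hypothetical.

def fuzzy_cmeans_1d(xs, c=2, m=2.0, iters=100):
    """Return cluster centers and membership matrix u[i][k] for 1-D data."""
    centers = [min(xs), max(xs)][:c]
    u = [[0.0] * c for _ in xs]
    for _ in range(iters):
        for i, x in enumerate(xs):
            for k in range(c):
                d_k = abs(x - centers[k]) or 1e-12
                u[i][k] = 1.0 / sum((d_k / (abs(x - cj) or 1e-12)) ** (2 / (m - 1))
                                    for cj in centers)
        centers = [sum(u[i][k] ** m * x for i, x in enumerate(xs)) /
                   sum(u[i][k] ** m for i in range(len(xs)))
                   for k in range(c)]
    return centers, u

ratings = [4.5, 4.7, 4.6, 4.8, 1.0, 1.1, 0.9]   # last three look like stuffing
authentic = 4.6                                  # the one trusted rating
centers, u = fuzzy_cmeans_1d(ratings)
genuine_cluster = min(range(len(centers)), key=lambda k: abs(centers[k] - authentic))
genuine = [x for i, x in enumerate(ratings)
           if max(range(len(centers)), key=lambda k: u[i][k]) == genuine_cluster]
```

Only the ratings assigned to the authentic rating's cluster survive the filter; the stuffed block of low votes falls in the other cluster and is discarded.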
Heavy atom labeled nucleotides for measurement of kinetic isotope effects.
Weissman, Benjamin P; Li, Nan-Sheng; York, Darrin; Harris, Michael; Piccirilli, Joseph A
2015-11-01
Experimental analysis of kinetic isotope effects represents an extremely powerful approach for gaining information about the transition state structure of complex reactions not available through other methodologies. The implementation of this approach to the study of nucleic acid chemistry requires the synthesis of nucleobases and nucleotides enriched for heavy isotopes at specific positions. In this review, we highlight current approaches to the synthesis of nucleic acids enriched site-specifically for heavy oxygen and nitrogen and their application in heavy atom isotope effect studies. This article is part of a special issue titled: Enzyme Transition States from Theory and Experiment.
Development of Miniaturized Optimized Smart Sensors (MOSS) for space plasmas
NASA Technical Reports Server (NTRS)
Young, D. T.
1993-01-01
The cost of space plasma sensors is high for several reasons: (1) Most are one-of-a-kind and state-of-the-art, (2) the cost of launch to orbit is high, (3) ruggedness and reliability requirements lead to costly development and test programs, and (4) overhead is added by overly elaborate or generalized spacecraft interface requirements. Possible approaches to reducing costs include development of small 'sensors' (defined as including all necessary optics, detectors, and related electronics) that will ultimately lead to cheaper missions by reducing (2), improving (3), and, through work with spacecraft designers, reducing (4). Despite this logical approach, there is no guarantee that smaller sensors are necessarily either better or cheaper. We have previously advocated applying analytical 'quality factors' to plasma sensors (and spacecraft) and have begun to develop miniaturized particle optical systems by applying quantitative optimization criteria. We are currently designing a Miniaturized Optimized Smart Sensor (MOSS) in which miniaturized electronics (e.g., employing new power supply topology and extensive use of gate arrays and hybrid circuits) are fully integrated with newly developed particle optics to give significant savings in volume and mass. The goal of the SwRI MOSS program is development of a fully self-contained and functional plasma sensor weighing 1 lb and requiring 1 W. MOSS will require only a typical spacecraft DC power source (e.g., 30 V) and command/data interfaces in order to be fully functional, and will provide measurement capabilities comparable in most ways to current sensors.
Improvement of Current Drive Efficiency in Projected FNSF Discharges
NASA Astrophysics Data System (ADS)
Prater, R.; Chan, V.; Garofalo, A.
2012-10-01
The Fusion Nuclear Science Facility - Advanced Tokamak (FNSF-AT) is envisioned as a facility that uses the tokamak approach to address the development of the AT path to fusion and fusion's energy objectives. It uses copper coils for a compact device with high βN and moderate power gain. The major radius is 2.7 m and the central toroidal field is 5.44 T. Achieving the required confinement and stability at βN ≈ 3.7 requires a current profile with negative central shear and qmin > 1. Off-axis Electron Cyclotron Current Drive (ECCD), in addition to a high bootstrap current fraction, can help support this current profile. Using the applied EC frequency and launch location as free parameters, a systematic study has been carried out to optimize the ECCD in the range ρ = 0.5-0.7. Using a top launch, making use of a large toroidal component to the launch direction, adjusting the vertical launch angle so that the rays propagate nearly parallel to the resonance, and adjusting the frequency for optimum total current give a high dimensionless efficiency of 0.44 for a broad ECCD profile peaked at ρ = 0.7, and the driven current is 17 kA/MW for n20 = 2.1 and Te = 10.3 keV locally.
A Conceptual Modeling Approach for OLAP Personalization
NASA Astrophysics Data System (ADS)
Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan
Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large, and multidimensional structures become increasingly complex to be understood at a glance. Even if a departmental data warehouse (also known as a data mart) is used, these structures would still be too complex. As a consequence, acquiring the required information is more costly than expected, and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker, contributing to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.
Islam, Mohammed A
2010-01-01
Despite the emerging new insights into our understandings of the cellular mechanisms underlying cardiac arrhythmia, medical therapy for this disease remains unsatisfactory. Atrial fibrillation (AF), the most prevalent arrhythmia, is responsible for significant morbidity and mortality. On the other hand, ventricular fibrillation results in sudden cardiac deaths in many instances. Prolongation of cardiac action potential (AP) is a proven principle of antiarrhythmic therapy. Class III antiarrhythmic agents prolong AP and QT interval by blocking rapidly activating delayed rectifier current (I(Kr)). However, I(Kr) blocking drugs carry the risk of life-threatening proarrhythmia. Recently, modulation of atrial-selective ultra-rapid delayed rectifier current (I(Kur)), has emerged as a novel therapeutic approach to treat AF. A number of I(Kur) blockers are being evaluated for the treatment of AF. The inhibition of slowly activating delayed rectifier current (I(Ks)) has also been proposed as an effective and safer antiarrhythmic approach because of its distinguishing characteristics that differ in remarkable ways from other selective class III agents. Selective I(Ks) block may prolong AP duration (APD) at rapid rates without leading to proarrhythmia. This article reviews the pathophysiological roles of I(Kur) and I(Ks) in cardiac repolarization and the implications of newly developed I(Kur) and I(Ks) blocking agents as promising antiarrhythmic approaches. Several recent patents pertinent to antiarrhythmic drug development have been discussed. Further research will be required to evaluate the efficacy and safety of these agents in the clinical setting.
Human Papillomavirus Vaccination Requirements in US Schools: Recommendations for Moving Forward.
North, Anna L; Niccolai, Linda M
2016-10-01
Safe and effective human papillomavirus (HPV) vaccines have been available and recommended for adolescents for a decade in the United States, yet vaccination rates remain suboptimal. School entry requirements have increased uptake of other vaccines for adolescents and made coverage more equitable. However, only 3 jurisdictions require HPV vaccine for school. We summarize the current status of HPV vaccine requirements and discuss the rationales for and against these policies. The rationales for requirements include HPV vaccine efficacy and safety, effectiveness of requirements for increasing vaccine uptake and making it more equitable, and use of requirements as "safety nets" and to achieve herd immunity. The rationales against requirements include low parental acceptance of HPV vaccine, the financial burden on educational systems and health departments, and the possibility for alternatives to increase vaccine uptake. Many challenges to HPV vaccine requirements are addressable, and we conclude with recommendations on how to approach these challenges.
Basic-CPR and AIDS: are volunteer life-savers prepared for a storm?
Bierens, J J; Berden, H J
1996-10-01
Professional health care workers have access to guidelines, equipment and techniques to reduce exposure to infectious material during resuscitation. The current official content of national courses for volunteer life-savers does not address this issue, as far as we know. Concern about the risks of infection due to resuscitation is increasing in this group. This article describes a rational approach to the problem, which includes data on the infection risk of basic-CPR, and an approach that accepts that the concern cannot be controlled by objective data. In such an emotional approach, direct contact has to be minimised by using devices. Requirements for resuscitation devices with a barrier function are listed. Although both approaches will reduce the fear of infection, we advise a rational approach.
NASA Astrophysics Data System (ADS)
Lewison, R. L.; Saumweber, W. J.; Erickson, A.; Martone, R. G.
2016-12-01
Dynamic ocean management (DOM), or management that uses near real-time data to guide the spatial distribution of commercial activities, is an emerging approach to balance ocean resource use and conservation. Employing a wide range of data types, dynamic ocean management in a fisheries context can be used to meet multiple objectives: managing target quota, bycatch reduction, and reducing interactions with species of conservation concern. There is a growing list of DOM applications currently in practice in fisheries around the world, yet the approach is new enough that both fishers and fisheries managers are unclear about how DOM can be applied to their fishery. Here, we use the experience from dynamic ocean management applications that are currently in practice to address the commonly asked question "How can dynamic management approaches be implemented in a traditionally managed fishery?". Combining knowledge from the DOM participants with a review of regulatory frameworks and incentive structures, stakeholder participation, and technological requirements of DOM in practice, we identify ingredients that have supported successful implementation of this new management approach.
Numerical modeling of hydrodynamics and sediment transport—an integrated approach
NASA Astrophysics Data System (ADS)
Gic-Grusza, Gabriela; Dudkowska, Aleksandra
2017-10-01
Point measurement-based estimation of bedload transport in the coastal zone is very difficult. The only way to assess the magnitude and direction of bedload transport in larger areas, particularly those characterized by complex bottom topography and hydrodynamics, is to use a holistic approach. This requires modeling of waves, currents, and the critical bed shear stress and bedload transport magnitude, with due consideration of realistic bathymetry and the distribution of surface sediment types. Such a holistic approach is presented in this paper, which describes modeling of bedload transport in the Gulf of Gdańsk. Extreme storm conditions, defined on the basis of 138 years of NOAA data, were assumed. The SWAN model (Booij et al. 1999) was used to define wind-wave fields, whereas wave-induced currents were calculated using the Kołodko and Gic-Grusza (2015) model, and the magnitude of bedload transport was estimated using the modified Meyer-Peter and Müller (1948) formula. The calculations were performed using a GIS model. The results obtained are innovative. The approach presented appears to be a valuable source of information on bedload transport in the coastal zone.
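The classical Meyer-Peter and Müller (1948) formula referenced above relates the dimensionless bedload rate to the excess Shields stress, Phi = 8 (theta - theta_cr)^1.5 for theta above a critical value (commonly taken as 0.047), with the volumetric rate recovered by scaling with the grain size. The sketch below shows the unmodified classical form with illustrative input values; the paper uses a modified variant whose details the abstract does not give.

```python
# Classical Meyer-Peter & Müller bedload formula (dimensionless form).
# Input values are illustrative only.

import math

def mpm_bedload(theta, d50, s=2.65, g=9.81, theta_cr=0.047):
    """Volumetric bedload transport rate per unit width [m^2/s]."""
    if theta <= theta_cr:
        return 0.0                                  # below motion threshold
    phi = 8.0 * (theta - theta_cr) ** 1.5           # dimensionless rate
    return phi * math.sqrt((s - 1.0) * g * d50 ** 3)

q_b = mpm_bedload(theta=0.2, d50=0.0002)   # fine sand under strong storm stress
```

Run per grid cell over the modeled wave-current bed shear stress field, this yields the bedload transport magnitude map the abstract describes.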
Multidisciplinary approach to successful implementation of production information system (PRISM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shariff, M.R.; Gopalakrishnan, S.G.; Francis, N.
1995-12-31
A company-wide corporate and regional production database supporting all production areas was envisaged as critical to the current expansion within Petronas Carigali Sdn Bhd (PCSB). A multidisciplinary project team was thus formed to analyze the requirements prior to developing, testing, implementing and training users. PCSB has currently evolved into a mature E & P company on par with other E & P companies within the region. This expansion necessitates a common Production Information System for the efficient dissemination of vital Production Information for Production Surveillance, Reservoir Management, Reserve Assessment, Special Studies and Standardized Group-wide Reporting. This paper discusses all the phases involved in the project, which include Systems Requirement Study, Data Migration, System Development, System Implementation and Post-Implementation Plan.
Charge-based MOSFET model based on the Hermite interpolation polynomial
NASA Astrophysics Data System (ADS)
Colalongo, Luigi; Richelli, Anna; Kovacs, Zsolt
2017-04-01
An accurate charge-based compact MOSFET model is developed using the third order Hermite interpolation polynomial to approximate the relation between surface potential and inversion charge in the channel. This new formulation of the drain current retains the same simplicity of the most advanced charge-based compact MOSFET models such as BSIM, ACM and EKV, but it is developed without requiring the crude linearization of the inversion charge. Hence, the asymmetry and the non-linearity in the channel are accurately accounted for. Nevertheless, the expression of the drain current can be worked out to be analytically equivalent to BSIM, ACM and EKV. Furthermore, thanks to this new mathematical approach the slope factor is rigorously defined in all regions of operation and no empirical assumption is required.
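The paper's specific charge-potential relation is not reproduced in the abstract, but the interpolation machinery it names is standard. A minimal sketch of a third-order Hermite interpolant, which matches both values and derivatives at the endpoints (the property that lets the model avoid the crude linearization of the inversion charge), is:

```python
def hermite3(t, p0, m0, p1, m1):
    """Third-order Hermite interpolant on [0, 1] matching the endpoint
    values (p0, p1) and the endpoint derivatives (m0, m1)."""
    t2, t3 = t * t, t * t * t
    return ((2 * t3 - 3 * t2 + 1) * p0 + (t3 - 2 * t2 + t) * m0
            + (-2 * t3 + 3 * t2) * p1 + (t3 - t2) * m1)
```

Matching the derivative at both ends is what distinguishes Hermite interpolation from a simple cubic fit through four points, and it preserves smoothness of the resulting drain-current expression across regions of operation.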
Applicability of empirical data currently used in predicting solid propellant exhaust plumes
NASA Technical Reports Server (NTRS)
Tevepaugh, J. A.; Smith, S. D.; Penny, M. M.; Greenwood, T.; Roberts, B. B.
1977-01-01
Theoretical and experimental approaches to exhaust plume analysis are compared. A two-phase model is extended to include treatment of reacting gas chemistry, and thermodynamical modeling of the gaseous phase of the flow field is considered. The applicability of empirical data currently available to define particle drag coefficients, heat transfer coefficients, mean particle size, and particle size distributions is investigated. Experimental and analytical comparisons are presented for subscale solid rocket motors operating at three altitudes with attention to pitot total pressure and stagnation point heating rate measurements. The mathematical treatment input requirements are explained. The two-phase flow field solution adequately predicts gasdynamic properties in the inviscid portion of two-phase exhaust plumes. It is found that prediction of exhaust plume gas pressures requires an adequate model of flow field dynamics.
Research on scheme of applying ASON to current networks
NASA Astrophysics Data System (ADS)
Mao, Y. F.; Li, J. R.; Deng, L. J.
2008-10-01
Automatically Switched Optical Network (ASON) is currently a new and hot research subject worldwide. It can provide high bandwidth, high assembly flexibility, and high network security and reliability at a low management cost, and it was introduced to meet the requirements for high-throughput optical access with stringent Quality of Service (QoS). But as a brand new technology, ASON is not supported by traditional protocol software and network equipment. Building a new ASON network by completely abandoning the traditional optical network facilities is not desirable, because it costs too much and wastes a large amount of network resources that could still be used. How to apply ASON to current networks and realize a smooth transition between the existing network and ASON has therefore become a serious problem for many network operators. In this research, the status quo of ASON is introduced first, and the key problems that should be considered when applying ASON to current networks are then discussed. Based on this, the strategies that should be followed to overcome these problems are listed. Finally, an approach to applying ASON to current optical networks is proposed and analyzed.
Towards Measuring the Economic Value of Higher Education: Lessons from South Africa
ERIC Educational Resources Information Center
Allais, Stephanie
2017-01-01
A crisis of student funding has led to most South African universities being closed for weeks, after protests in 2015 and again in 2016. A policy response to these events requires insight into relationships between higher education, society, and the economy. This paper interrogates the assumptions which underpin current approaches to measuring…
The Role of Teacher Imagination in Conceptualising the Child as a Second Language Learner
ERIC Educational Resources Information Center
Guz, Ewa; Tetiurka, Małgorzata
2013-01-01
In order to initiate and maintain meaningful interaction in a young learner L2 classroom, an adult teacher needs to approach children in ways consistent with their developmental profile and adjust teaching methodology so as to accommodate young learners' current skills. This requires the ability to predict the child's possible responses to…
The Case for Implementing the Levels of Prevention Model: Opiate Abuse on American College Campuses
ERIC Educational Resources Information Center
Daniels-Witt, Quri; Thompson, Amy; Glassman, Tavis; Federman, Sara; Bott, Katie
2017-01-01
Opiate abuse in the United States is on the rise among the college student population. This public health crisis requires immediate action from professionals and stakeholders who are committed to addressing the needs of prospective, current, and recovering opiate users using comprehensive prevention methods. Such approaches have been used to…
Comparison of Acoustic and Kinematic Approaches to Measuring Utterance-Level Speech Variability
ERIC Educational Resources Information Center
Howell, Peter; Anderson, Andrew J.; Bartrip, Jon; Bailey, Eleanor
2009-01-01
Purpose: The spatiotemporal index (STI) is one measure of variability. As currently implemented, kinematic data are used, requiring equipment that cannot be used with some patient groups or in scanners. An experiment is reported that addressed whether STI can be extended to an audio measure of sound pressure of the speech envelope over time that…
ERIC Educational Resources Information Center
Ragin, Tracey B.
2013-01-01
Fundamental computer skills are vital in the current technology-driven society. The purpose of this study was to investigate the development needs of students at a rural community college in the Southeast who lacked the computer literacy skills required in a basic computer course. Guided by Greenwood's pragmatic approach as a reformative force in…
Ethical Leadership and Moral Literacy: Incorporating Ethical Dilemmas in a Case-Based Pedagogy
ERIC Educational Resources Information Center
Jenlink, Patrick M.; Jenlink, Karen Embry
2015-01-01
In this paper the authors examine an ethical dilemma approach to case-based pedagogy for leadership preparation, which was used in a doctoral studies program. Specifically, the authors argue that preparing educational leaders for the ethical dilemmas and moral decision-making that define schools requires assessing current programs and pedagogical…
Hardiness of Adolescents with Special Educational Needs: Research Results
ERIC Educational Resources Information Center
Shchipanova, Dina Ye.; Tserkovnikova, Nataliya G.; Uskova, Bella A.; Puzyrev, Viktor V.; Markova, Anastasia S.; Fomin, Evgenii P.
2016-01-01
The relevance of the problem under study is due to the fact that the current worldwide trend shows high dynamics of the increase in the number of children with SEN, which requires that society in general and the education system, in particular, should develop approaches to the socialization and adaptation of people with SEN through strengthening…
An overview of computer vision
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1982-01-01
An overview of computer vision is provided. Image understanding and scene analysis are emphasized, and pertinent aspects of pattern recognition are treated. The basic approach to computer vision systems, the techniques utilized, applications, the current existing systems and state-of-the-art issues and research requirements, who is doing it and who is funding it, and future trends and expectations are reviewed.
ERIC Educational Resources Information Center
McGuigan, Nicholas; Kern, Thomas
2016-01-01
The future employment markets our graduates are likely to face are increasingly complex and unpredictable. Demands are being placed on higher-education providers to become more holistic and integrated in their approach. For business schools across Australia, this requires a significant (re)conceptualisation of how student learning is facilitated,…
ERIC Educational Resources Information Center
Rhode, William E.; And Others
In order to examine the possibilities for an advanced multimedia instructional system, a review and assessment of current instructional media was undertaken in terms of a functional description, instructional flexibility, support requirements, and costs. Following this, a model of an individual instructional system was developed as a basis for…
Electronic Learning Courses as a Means to Activate Students' Independent Work in Studying Physics
ERIC Educational Resources Information Center
Shurygin, Viktor Yurjevich; Krasnova, Lyubov Alekseevna
2016-01-01
Currently, there are special requirements for the system of higher education, focused not only on imparting knowledge to students, but also on forming a continuous need for independent self-education and a creative approach to acquiring knowledge throughout their active lives. In this regard, the role of students' independent work with its…
Using nocturnal cold air drainage flow to monitor ecosystem processes in complex terrain
Thomas G. Pypker; Michael H. Unsworth; Alan C. Mix; William Rugh; Troy Ocheltree; Karrin Alstad; Barbara J. Bond
2007-01-01
This paper presents initial investigations of a new approach to monitor ecosystem processes in complex terrain on large scales. Metabolic processes in mountainous ecosystems are poorly represented in current ecosystem monitoring campaigns because the methods used for monitoring metabolism at the ecosystem scale (e.g., eddy covariance) require flat study sites. Our goal...
ERIC Educational Resources Information Center
Santiago, Deborah Albright
2012-01-01
Although teachers implement differentiated instructional techniques to provide students with enriching hands-on activities related to real life experiences, the implementation of instructional techniques has required teachers to rethink and revise their approaches to classroom management (CM). While a gap in research exists on current practices in…
Helicopter noise prediction - The current status and future direction
NASA Technical Reports Server (NTRS)
Brentner, Kenneth S.; Farassat, F.
1992-01-01
The paper takes stock of the progress, assesses the current prediction capabilities, and forecasts the direction of future helicopter noise prediction research. The acoustic analogy approach, specifically, theories based on the Ffowcs Williams-Hawkings equations, are the most widely used for deterministic noise sources. Thickness and loading noise can be routinely predicted given good plane motion and blade loading inputs. Blade-vortex interaction noise can also be predicted well with measured input data, but prediction of airloads with the high spatial and temporal resolution required for BVI is still difficult. Current semiempirical broadband noise predictions are useful and reasonably accurate. New prediction methods based on a Kirchhoff formula and direct computation appear to be very promising, but are currently very demanding computationally.
Tan, Aimin; Saffaj, Taoufiq; Musuku, Adrien; Awaiye, Kayode; Ihssane, Bouchaib; Jhilal, Fayçal; Sosse, Saad Alaoui; Trabelsi, Fethi
2015-03-01
The current approach in regulated LC-MS bioanalysis, which evaluates the precision and trueness of an assay separately, has long been criticized for inadequately balancing laboratory and customer risks. Accordingly, different total error approaches have been proposed. The aims of this research were to evaluate the aforementioned risks in reality and the differences among four common total error approaches (β-expectation, β-content, uncertainty, and risk profile) through retrospective analysis of regulated LC-MS projects. Twenty-eight projects (14 validations and 14 productions) were randomly selected from two GLP bioanalytical laboratories, representing a wide variety of assays. The results show that the risk of accepting unacceptable batches did exist with the current approach (9% and 4% of the evaluated QC levels failed for validation and production, respectively). The fact that the risk was not widespread was only because the precision and bias of modern LC-MS assays are usually much better than the minimum regulatory requirements. Despite minor differences in magnitude, very similar accuracy profiles and/or conclusions were obtained from the four different total error approaches. High correlation was even observed in the width of bias intervals. For example, the mean width of SFSTP's β-expectation interval is 1.10-fold (CV=7.6%) that of Saffaj-Ihssane's uncertainty approach, while the latter is 1.13-fold (CV=6.0%) that of Hoffman-Kringle's β-content approach. To conclude, the risk of accepting unacceptable batches was real with the current approach, suggesting that total error approaches should be used instead. Moreover, any of the four total error approaches may be used because of their overall similarity. Lastly, the difficulties/obstacles associated with the application of total error approaches in routine analysis and their desirable future improvements are discussed.
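As an illustration of the simplest of the four, a β-expectation tolerance interval (the SFSTP-style accuracy profile) for a set of QC results is mean ± t·s·√(1 + 1/n). A minimal sketch, with the Student-t quantile supplied by the caller rather than looked up (the QC data below are invented):

```python
from statistics import mean, stdev

def beta_expectation_interval(results, t_quantile):
    """Beta-expectation tolerance interval: mean +/- t * s * sqrt(1 + 1/n).
    The caller supplies the two-sided Student-t quantile t_{(1+beta)/2, n-1}
    (from tables or scipy.stats.t.ppf)."""
    n = len(results)
    m, s = mean(results), stdev(results)
    half_width = t_quantile * s * (1.0 + 1.0 / n) ** 0.5
    return m - half_width, m + half_width

# Six QC recoveries (% of nominal), beta = 0.90, so t_{0.95, 5} = 2.015
lo, hi = beta_expectation_interval([98.0, 102.0, 100.0, 99.0, 101.0, 100.0], 2.015)
```

An accuracy profile then compares this interval against the regulatory acceptance limits (e.g. ±15% of nominal): batch-acceptance risk is controlled because the interval bounds where future individual results are expected to fall, not merely where the mean lies.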
Solar fuels production by artificial photosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ager, Joel W., E-mail: JWAger@lbl.gov; Lee, Min-Hyung; Javey, Ali
2013-12-10
A practical method to use sunlight to generate storable chemical energy could dramatically change the landscape of global energy generation. One of the fundamental requirements of such an "artificial photosynthesis" scheme is a light capture and conversion approach capable of generating the required chemical potentials (e.g. >1.23 V for splitting water into H₂ and O₂). An approach based on inorganic light absorbers coupled directly to oxidation and reduction catalysts is being developed in the Joint Center for Artificial Photosynthesis (JCAP). P-type III-V semiconductors with a high surface area can be used as high current density photocathodes. The longevity under operation of these photocathodes can be improved by the use of conformal metal oxides deposited by atomic layer deposition.
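The 1.23 V figure quoted above follows directly from the standard Gibbs free energy of water splitting via E = ΔG/(nF); a one-line check:

```python
# Minimum cell voltage for water splitting from thermodynamics: E = dG / (n F)
DELTA_G = 237.1e3   # J/mol, standard Gibbs free energy of H2O -> H2 + 1/2 O2
N_ELECTRONS = 2     # electrons transferred per H2 molecule
FARADAY = 96485.0   # C/mol, Faraday constant

e_min = DELTA_G / (N_ELECTRONS * FARADAY)  # ~1.23 V, the requirement quoted above
```

Any practical scheme must supply more than this, since overpotentials at both the oxidation and reduction catalysts add to the thermodynamic minimum.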
Machine learning action parameters in lattice quantum chromodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shanahan, Phiala; Trewartha, Daniel; Detmold, William
Numerical lattice quantum chromodynamics studies of the strong interaction underpin theoretical understanding of many aspects of particle and nuclear physics. Such studies require significant computing resources to undertake. A number of proposed methods promise improved efficiency of lattice calculations, and access to regions of parameter space that are currently computationally intractable, via multi-scale action-matching approaches that necessitate parametric regression of generated lattice datasets. The applicability of machine learning to this regression task is investigated, with deep neural networks found to provide an efficient solution even in cases where approaches such as principal component analysis fail. Finally, the high information content and complex symmetries inherent in lattice QCD datasets require custom neural network layers to be introduced and present opportunities for further development.
NASA Technical Reports Server (NTRS)
Maddalon, J. M.; Hayhurst, K. J.; Neogi, N. A.; Verstynen, H. A.; Clothier, R. A.
2016-01-01
One of the key challenges to the development of a commercial Unmanned Air-craft System (UAS) market is the lack of explicit consideration of UAS in the current regulatory framework. Despite recent progress, additional steps are needed to enable broad UAS types and operational models. This paper discusses recent research that examines how a risk-based approach for safety might change the process and substance of airworthiness requirements for UAS. The project proposed risk-centric airworthiness requirements for a midsize un-manned rotorcraft used for agricultural spraying and also identified factors that may contribute to distinguishing safety risk among different UAS types and operational concepts. Lessons learned regarding how a risk-based approach can expand the envelope of UAS certification are discussed.
TargetSpy: a supervised machine learning approach for microRNA target prediction.
Sturm, Martin; Hackenberg, Michael; Langenberger, David; Frishman, Dmitrij
2010-05-28
Virtually all currently available microRNA target site prediction algorithms require the presence of a (conserved) seed match to the 5' end of the microRNA. Recently however, it has been shown that this requirement might be too stringent, leading to a substantial number of missed target sites. We developed TargetSpy, a novel computational approach for predicting target sites regardless of the presence of a seed match. It is based on machine learning and automatic feature selection using a wide spectrum of compositional, structural, and base pairing features covering current biological knowledge. Our model does not rely on evolutionary conservation, which allows the detection of species-specific interactions and makes TargetSpy suitable for analyzing unconserved genomic sequences.In order to allow for an unbiased comparison of TargetSpy to other methods, we classified all algorithms into three groups: I) no seed match requirement, II) seed match requirement, and III) conserved seed match requirement. TargetSpy predictions for classes II and III are generated by appropriate postfiltering. On a human dataset revealing fold-change in protein production for five selected microRNAs our method shows superior performance in all classes. In Drosophila melanogaster not only our class II and III predictions are on par with other algorithms, but notably the class I (no-seed) predictions are just marginally less accurate. We estimate that TargetSpy predicts between 26 and 112 functional target sites without a seed match per microRNA that are missed by all other currently available algorithms. Only a few algorithms can predict target sites without demanding a seed match and TargetSpy demonstrates a substantial improvement in prediction accuracy in that class. Furthermore, when conservation and the presence of a seed match are required, the performance is comparable with state-of-the-art algorithms. 
TargetSpy was trained on mouse and performs well in human and drosophila, suggesting that it may be applicable to a broad range of species. Moreover, we have demonstrated that the application of machine learning techniques in combination with upcoming deep sequencing data results in a powerful microRNA target site prediction tool http://www.targetspy.org.
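A toy illustration of the feature-extraction side of such an approach (the actual TargetSpy feature spectrum and trained model are far richer; the function and sequences below are invented for illustration): compositional features are computed for a candidate site, and the seed match becomes just one feature among several rather than a hard filter:

```python
def site_features(mirna, site):
    """Toy compositional features for a candidate target site, in the spirit
    of a feature-based (rather than seed-mandatory) predictor."""
    rc = {"A": "U", "U": "A", "G": "C", "C": "G"}
    # Reverse complement of miRNA positions 2-8: the canonical seed-match motif
    seed = "".join(rc[b] for b in mirna[1:8])[::-1]
    return {
        "gc_content": (site.count("G") + site.count("C")) / len(site),
        "au_content": (site.count("A") + site.count("U")) / len(site),
        "seed_match": seed in site,   # informative, but NOT required
        "site_len": len(site),
    }

f = site_features("UGAGGUAGUAGGUUGUAUAGUU", "AACUACCUCA")  # let-7a vs. a toy site
```

A classifier trained on such feature vectors can then score sites with and without a seed match alike, which is exactly what permits the class I (no-seed) predictions discussed above.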
An integrated approach towards future ballistic neck protection materials selection.
Breeze, John; Helliker, Mark; Carr, Debra J
2013-05-01
Ballistic protection for the neck has historically taken the form of collars attached to the ballistic vest (removable or fixed), but other approaches, including the development of prototypes incorporating ballistic material into the collar of an under body armour shirt, are now being investigated. Current neck collars incorporate the same ballistic protective fabrics as the soft armour of the remaining vest, reflecting how ballistic protective performance alone has historically been perceived as the most important property for neck protection. However, the neck has fundamental differences from the thorax in terms of anatomical vulnerability, flexibility and equipment integration, necessitating a separate solution from the thorax in terms of optimal materials selection. An integrated approach towards the selection of the most appropriate combination of materials to be used for each of the two potential designs of future neck protection has been developed. This approach requires evaluation of the properties of each potential material in addition to ballistic performance alone, including flexibility, mass, wear resistance and thermal burden. The aim of this article is to provide readers with an overview of this integrated approach towards ballistic materials selection and an update of its current progress in the development of future ballistic neck protection.
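The integrated selection the authors describe is essentially a multi-criteria trade study. A minimal weighted-sum sketch (every score, weight and material name below is invented for illustration, not taken from the article):

```python
# Hypothetical weighted-sum trade study for neck-protection materials.
weights = {"ballistic": 0.4, "flexibility": 0.2, "mass": 0.2,
           "wear": 0.1, "thermal": 0.1}
candidates = {
    "para-aramid fabric": {"ballistic": 9, "flexibility": 8, "mass": 6,
                           "wear": 7, "thermal": 5},
    "UHMWPE laminate":    {"ballistic": 8, "flexibility": 5, "mass": 9,
                           "wear": 8, "thermal": 6},
}

def score(material):
    # Weighted sum across all criteria, not ballistic performance alone
    return sum(weights[k] * candidates[material][k] for k in weights)

ranked = sorted(candidates, key=score, reverse=True)
```

In practice the weights are the contentious part: a collar worn against the neck may weight flexibility and thermal burden far more heavily than a thoracic plate would, which is the article's central point.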
FY11 Facility Assessment Study for Aeronautics Test Program
NASA Technical Reports Server (NTRS)
Loboda, John A.; Sydnor, George H.
2013-01-01
This paper presents the approach and results for the Aeronautics Test Program (ATP) FY11 Facility Assessment Project. ATP commissioned assessments in FY07 and FY11 to aid in understanding the current condition and reliability of its facilities and their ability to meet current and future (five-year horizon) test requirements. The principal output of the assessment was a database of facility-unique, prioritized investment projects with budgetary cost estimates. This database was also used to identify trends in the condition of facility systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendell, Mark J.
2015-06-01
This report briefly summarizes, based on recent review articles and selected more recent research reports, current scientific knowledge on two topics: assessing unhealthy levels of indoor dampness and mold (D/M) in homes, and remediating home dampness-related problems to protect health. Based on a comparison of current scientific knowledge to that required to support effective, evidence-based, health-protective policies on home D/M, gaps in knowledge are highlighted, priority research questions are specified, and necessary research activities and approaches are recommended.
Alternative approaches to conventional antiepileptic drugs in the management of paediatric epilepsy
Kneen, R; Appleton, R E
2006-01-01
Over the last two decades, there has been a rapid expansion in the number and types of available antiepileptic drugs (AEDs), but there is increasing concern amongst parents and carers about their unwanted side effects. Seizure control is achieved in approximately 75% of children treated with conventional AEDs, but non‐conventional (or non‐standard) medical treatments, surgical procedures, dietary approaches, and other non‐pharmacological treatment approaches may have a role to play in those with intractable seizures or AED toxicity. Many of the approaches are largely common sense and are already incorporated into our current practice, including, for example, avoidance techniques and lifestyle advice, while others require further investigation or appear to be impractical in children. PMID:17056869
A low-cost inertial smoothing system for landing approach guidance
NASA Technical Reports Server (NTRS)
Niessen, F. R.
1973-01-01
Accurate position and velocity information with low noise content for instrument approaches and landings is required for both control and display applications. In a current VTOL automatic instrument approach and landing research program, radar-derived landing guidance position reference signals, which are noisy, have been mixed with acceleration information derived from low-cost onboard sensors to provide high-quality position and velocity information. An in-flight comparison of signal quality and accuracy has shown good agreement between the low-cost inertial smoothing system and an aided inertial navigation system. Furthermore, the low-cost inertial smoothing system has been proven to be satisfactory in control and display system applications for both automatic and pilot-in-the-loop instrument approaches and landings.
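The mixing described above is the classic complementary-filter arrangement: the accelerometer path supplies the high-frequency content, the radar path the drift-free low-frequency content. A minimal discrete-time sketch (gains, time step and the hover scenario are invented for illustration, not taken from the flight program):

```python
class ComplementaryFilter:
    """Blend a noisy radar position reference with onboard acceleration."""
    def __init__(self, k1=1.0, k2=0.25):
        self.pos, self.vel = 0.0, 0.0
        self.k1, self.k2 = k1, k2  # position / velocity correction gains

    def update(self, radar_pos, accel, dt):
        err = radar_pos - self.pos           # radar innovation
        self.vel += (accel + self.k2 * err) * dt
        self.pos += (self.vel + self.k1 * err) * dt
        return self.pos, self.vel

f = ComplementaryFilter()
for _ in range(400):                         # hover toward a fixed point 10 m away
    p, v = f.update(radar_pos=10.0, accel=0.0, dt=0.1)
```

Because the radar correction enters only through the low-gain error terms, radar noise is smoothed out, while step changes in acceleration pass through immediately.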
'How I do it': TEM for tumors of the rectum.
Collinson, Rowan J; McC Mortensen, Neil J
2009-02-01
Transanal endoscopic microsurgery (TEM) has an established role in the management of benign rectal tumors. It also has an expanding role in the management of malignant tumors, which is more demanding for the clinician. It requires accurate histological and radiological assessment and draws on an expert understanding of the nature of local recurrence, metastasis, and the place of adjuvant therapies. A multidisciplinary approach is recommended. This paper discusses our institutional approach to TEM for benign and malignant tumors and covers some of the current management controversies.
Choice of Outcome Measure in an Economic Evaluation: A Potential Role for the Capability Approach.
Lorgelly, Paula K
2015-08-01
The last decade has seen a renewed interest in Sen's capability approach; health economists have been instrumental in leading much of this work. One particular stream of research is the application of the approach to outcome measurement. To date, there have been a dozen attempts (some combined) to operationalise the approach, and produce an outcome measure that offers a broader evaluative space than health-related quality-of-life measures. Applications have so far been confined to public health, physical, mental health and social care interventions, but the capability approach could be of benefit to evaluations of pharmacotherapies and other technologies. This paper provides an introduction to the capability approach, reviews the measures that are available for use in an economic evaluation, including their current applications, and then concludes with a discussion of a number of issues that require further consideration before the approach is adopted more widely to inform resource allocation decisions.
Regenerative endodontics as a tissue engineering approach: past, current and future.
Malhotra, Neeraj; Mala, Kundabala
2012-12-01
With the reported startling statistics of high incidence of tooth decay and tooth loss, current interest is focused on the development of alternative dental tissue replacement therapies. This has led to the application of dental tissue engineering as a clinically relevant method for the regeneration of dental tissues and the generation of a bioengineered whole tooth. Although the tissue engineering approach requires three key elements (stem cells, a scaffold and morphogens), a conducive environment (the fourth element) is equally important for successful engineering of any tissue and/or organ. The applications of this science have evolved continuously in dentistry, beginning from the application of Ca(OH)₂ in vital pulp therapy to the development of a fully functional bioengineered tooth (in mice). Thus, with advances in basic research, recent reports and studies have shown successful application of tissue engineering in the field of dentistry. However, certain practical obstacles are yet to be overcome before dental tissue regeneration can be applied as an evidence-based approach in clinics. The article highlights the past achievements, current developments and future prospects of tissue engineering and regenerative therapy in the field of endodontics and bioengineered teeth (bioteeth).
Wood, Alexander
2004-01-01
This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher pollution-control costs to meet their regulatory obligations by making more cost-effective pollutant-reduction decisions. Efficient Hg management requires methods to translate that science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Survey's Western Geographic Science Center in collaboration with Stanford University toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost-effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decisionmaking process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish. 
The research presented here focuses on an alternative analytical approach to the current use of safety factors and deterministic methods for Hg TMDL decision support, one that is fully compatible with an adaptive management approach. This alternative approach uses empirical data and informed judgment to provide a scientific and technical basis for helping National Pollutant Discharge Elimination System (NPDES) permit holders make management decisions. An Hg-offset system would be an option if a wastewater-treatment plant could not achieve NPDES permit requirements for HgT reduction. We develop a probabilistic decision-analytical model consisting of three submodels for HgT loading, MeHg, and cost mitigation within a Bayesian network that integrates information of varying rigor and detail into a simple model of a complex system. Hg processes are identified and quantified by using a combination of historical data, statistical models, and expert judgment. Such an integrated approach to uncertainty analysis allows easy updating of prediction and inference when observations of model variables are made. We demonstrate our approach with data from the Cache Creek watershed (a subbasin of the Sacramento River watershed). The empirical models used to generate the needed probability distributions are based on the same empirical models currently being used by the Central Valley Regional Water Quality Control Board's Cache Creek Hg TMDL working group. The significant difference is that input uncertainty and error are explicitly included in the model and propagated throughout its algorithms. This work demonstrates how to integrate uncertainty into the complex and highly uncertain Hg TMDL decisionmaking process. The various sources of uncertainty are propagated as decision risk, which allows decisionmakers to simultaneously consider uncertainties in remediation/implementation costs while attempting to meet environmental/ecologic targets. We must note that this research is ongoing.
As more data are collected, the HgT and cost-mitigation submodels are updated and the uncer
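The uncertainty-propagation idea described above can be illustrated with a minimal Monte Carlo sketch. This is not the authors' Bayesian-network model; the lognormal distributions, parameters, and the simple load = flow x concentration relation below are all hypothetical, chosen only to show how input uncertainty becomes decision risk (a percentile band) rather than a single deterministic number.

```python
import random
import statistics

def simulate_hg_loading(n_draws=10_000, seed=42):
    """Monte Carlo sketch: propagate uncertainty in flow and HgT
    concentration through a toy loading model. All distributions
    and parameter values are illustrative only."""
    rng = random.Random(seed)
    loads = []
    for _ in range(n_draws):
        flow = rng.lognormvariate(2.0, 0.5)    # hypothetical flow distribution
        conc = rng.lognormvariate(-1.0, 0.8)   # hypothetical HgT concentration
        loads.append(flow * conc)              # load = flow x concentration
    loads.sort()
    return {
        "median": statistics.median(loads),
        "p05": loads[int(0.05 * n_draws)],     # lower decision bound
        "p95": loads[int(0.95 * n_draws)],     # upper decision bound
    }
```

A decisionmaker would compare the 5th-95th percentile band, not the median alone, against the TMDL target, which is the essence of treating uncertainty as decision risk.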
Wörsching, Jana; Padberg, Frank; Ertl-Wagner, Birgit; Kumpf, Ulrike; Kirsch, Beatrice; Keeser, Daniel
2016-10-01
Transcranial current stimulation approaches include neurophysiologically distinct non-invasive brain stimulation techniques widely applied in basic, translational and clinical research: transcranial direct current stimulation (tDCS), oscillating transcranial direct current stimulation (otDCS), transcranial alternating current stimulation (tACS) and transcranial random noise stimulation (tRNS). Prefrontal tDCS seems to be an especially promising tool for clinical practice. In order to effectively modulate relevant neural circuits, systematic research on prefrontal tDCS is needed that uses neuroimaging and neurophysiology measures to specifically target and adjust this method to physiological requirements. This review therefore analyses the various neuroimaging methods used in combination with prefrontal tDCS in healthy and psychiatric populations. First, we provide a systematic overview on applications, computational models and studies combining neuroimaging or neurophysiological measures with tDCS. Second, we categorise these studies in terms of their experimental designs and show that many studies do not vary the experimental conditions to the extent required to demonstrate specific relations between tDCS and its behavioural or neurophysiological effects. Finally, to support best-practice tDCS research we provide a methodological framework for orientation among experimental designs. Copyright © 2016 Elsevier Ltd. All rights reserved.
Sparse approximation of currents for statistics on curves and surfaces.
Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas
2008-01-01
Computing, processing, and visualizing statistics on shapes such as curves or surfaces is a real challenge, with applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids both feature-based approaches and point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, since complexity increases as the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
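The idea of a sparse decomposition on an adapted basis can be sketched with a plain matching-pursuit loop: greedily pick the dictionary atom most correlated with the residual and subtract its contribution. This is a generic illustration of greedy sparse approximation, not the authors' specific algorithm for currents.

```python
def matching_pursuit(signal, dictionary, n_atoms=2):
    """Greedy sparse decomposition sketch (matching pursuit).
    `dictionary` is a list of unit-norm atoms; the returned coeffs
    give a sparse representation of `signal`."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    residual = list(signal)
    coeffs = {}
    for _ in range(n_atoms):
        # pick the atom most correlated with the current residual
        k = max(range(len(dictionary)),
                key=lambda i: abs(dot(residual, dictionary[i])))
        c = dot(residual, dictionary[k])
        coeffs[k] = coeffs.get(k, 0.0) + c
        # subtract the selected atom's contribution
        residual = [r - c * d for r, d in zip(residual, dictionary[k])]
    return coeffs, residual
```

With a few atoms capturing most of the energy, the mean shape or a principal mode needs far fewer terms than the full redundant representation, which is the computational and interpretive gain the abstract describes.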
Development of 10B-Based 3He Replacement Neutron Detectors
NASA Astrophysics Data System (ADS)
King, Michael J.; Gozani, Tsahi; Hilliard, Donald B.
2011-12-01
Radiation portal monitors (RPM) are currently deployed at United States border crossings to passively inspect vehicles and persons for any emission of neutrons and/or gamma rays, which may indicate the presence of unshielded nuclear materials. The RPM module contains an organic scintillator and 3He proportional counters to detect gamma rays and thermalized neutrons, respectively. The supply of 3He is rapidly dwindling, requiring alternative detectors to provide the same function and performance. Our alternative approach consists of a thinly coated 10B flat-panel ionization chamber neutron detector that can be deployed as a direct drop-in replacement for current RPM 3He detectors. The uniqueness of our approach in providing a large-area detector lies in the simplicity of construction, the scalability of the unit cell detector, the ease of adaptability to a variety of applications, and low cost. Currently, Rapiscan Laboratories and Helicon Thin Film Systems have designed and developed an operational 100 cm2 multi-layer prototype 10B-based ionization chamber.
Hendren, Christine Ogilvie; Lowry, Michael; Grieger, Khara D; Money, Eric S; Johnston, John M; Wiesner, Mark R; Beaulieu, Stephen M
2013-02-05
As the use of engineered nanomaterials becomes more prevalent, the likelihood of unintended exposure to these materials also increases. Given the current scarcity of experimental data regarding fate, transport, and bioavailability, determining potential environmental exposure to these materials requires an in-depth analysis of modeling techniques that can be used in both the near and long term. Here, we provide a critical review of traditional and emerging exposure modeling approaches to highlight the challenges that scientists and decision-makers face when developing environmental exposure and risk assessments for nanomaterials. We find that accounting for nanospecific properties, overcoming data gaps, realizing model limitations, and handling uncertainty are key to developing informative and reliable environmental exposure and risk assessments for engineered nanomaterials. We find methods suited to recognizing and addressing significant uncertainty to be most appropriate for near-term environmental exposure modeling, given the current state of information and the current insufficiency of established deterministic models to address environmental exposure to engineered nanomaterials.
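When deterministic fate-and-transport models are unavailable, one near-term option consistent with the review's emphasis on uncertainty is a simple bounding (interval) estimate. The sketch below is a generic illustration; the parameter names and ranges are hypothetical, not drawn from the paper.

```python
def bounding_exposure(release_range, partition_range, volume):
    """Interval sketch: bound an environmental concentration estimate
    when only parameter ranges are known. `release_range` is (lo, hi)
    mass released, `partition_range` is (lo, hi) fraction reaching the
    compartment, `volume` is the receiving compartment volume.
    All inputs are hypothetical placeholders."""
    lo = release_range[0] * partition_range[0] / volume
    hi = release_range[1] * partition_range[1] / volume
    return lo, hi  # (lower bound, upper bound) on concentration
```

If even the upper bound is below a level of concern, detailed modeling may be deferred; if not, the bound identifies where better data are needed, which matches the review's data-gap framing.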
Mean-field dynamo in a turbulence with shear and kinetic helicity fluctuations.
Kleeorin, Nathan; Rogachevskii, Igor
2008-03-01
We study the effects of kinetic helicity fluctuations in a turbulence with large-scale shear using two different approaches: the spectral tau approximation and the second-order correlation approximation (or first-order smoothing approximation). These two approaches demonstrate that homogeneous kinetic helicity fluctuations alone with zero mean value in a sheared homogeneous turbulence cannot cause a large-scale dynamo. A mean-field dynamo is possible when the kinetic helicity fluctuations are inhomogeneous, which causes a nonzero mean alpha effect in a sheared turbulence. On the other hand, the shear-current effect can generate a large-scale magnetic field even in a homogeneous nonhelical turbulence with large-scale shear. This effect was investigated previously for large hydrodynamic and magnetic Reynolds numbers. In this study we examine the threshold required for the shear-current dynamo versus Reynolds number. We demonstrate that there is no need for a developed inertial range in order to maintain the shear-current dynamo (e.g., the threshold in the Reynolds number is of the order of 1).
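For orientation, the mean-field framework underlying these results can be summarized by the standard mean-field induction equation (a textbook form, not the paper's specific derivation):

```latex
\frac{\partial \overline{\boldsymbol{B}}}{\partial t}
  = \nabla \times \left( \overline{\boldsymbol{U}} \times \overline{\boldsymbol{B}}
  + \alpha \, \overline{\boldsymbol{B}}
  - \eta_T \, \nabla \times \overline{\boldsymbol{B}} \right)
```

Here $\overline{\boldsymbol{U}}$ carries the large-scale shear, $\eta_T$ is the turbulent diffusivity, and $\alpha$ is the alpha effect, which, per the abstract, acquires a nonzero mean only when the kinetic helicity fluctuations are inhomogeneous.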
An alternative approach for computing seismic response with accidental eccentricity
NASA Astrophysics Data System (ADS)
Fan, Xuanhua; Yin, Jiacong; Sun, Shuli; Chen, Pu
2014-09-01
Accidental eccentricity is a non-standard assumption for the seismic design of tall buildings. Taking it into consideration requires reanalysis of seismic resistance: either time-consuming computation of the natural vibration of eccentric structures, or finding a static displacement solution by applying an approximated equivalent torsional moment for each eccentric case. This study proposes an alternative modal response spectrum analysis (MRSA) approach to calculate seismic responses with accidental eccentricity. The proposed approach, called the Rayleigh Ritz Projection-MRSA (RRP-MRSA), is developed based on MRSA and two strategies: (a) an RRP method to obtain a fast calculation of approximate modes of eccentric structures; and (b) an approach to assemble mass matrices of eccentric structures. The efficiency of RRP-MRSA is tested via engineering examples and compared with the standard MRSA (ST-MRSA) and one approximate method, i.e., the equivalent torsional moment hybrid MRSA (ETM-MRSA). Numerical results show that RRP-MRSA not only achieves almost the same precision as ST-MRSA, and is much better than ETM-MRSA, but is also more economical. Thus, RRP-MRSA can take the place of current accidental eccentricity computations in seismic design.
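The core Rayleigh-Ritz idea, projecting the eccentric structure's matrices onto trial modes from the nominal model to avoid a full eigenanalysis, can be sketched in its simplest (single trial mode) form. This is a generic Rayleigh-quotient illustration, not the paper's full RRP-MRSA implementation; the 2-DOF matrices below are made up.

```python
def rayleigh_ritz_frequency(K, M, phi):
    """Rayleigh-Ritz sketch: estimate a natural frequency of a
    (possibly eccentric) structure by projecting stiffness K and
    mass M onto a trial mode `phi` taken from the nominal model.
    K and M are nested lists (square matrices), phi a vector."""
    n = len(phi)

    def quad(A, v):  # v^T A v
        return sum(v[i] * A[i][j] * v[j] for i in range(n) for j in range(n))

    omega_sq = quad(K, phi) / quad(M, phi)  # Rayleigh quotient ~ omega^2
    return omega_sq ** 0.5
```

With several trial modes, the same projection yields a small reduced eigenproblem (V^T K V, V^T M V) that is cheap to solve for every eccentric case, which is the source of the economy the abstract reports.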
NASA Occupant Protection Standards Development
NASA Technical Reports Server (NTRS)
Somers, Jeffrey; Gernhardt, Michael; Lawrence, Charles
2012-01-01
Historically, spacecraft landing systems have been tested with human volunteers, because analytical methods for estimating injury risk were insufficient. These tests were conducted with flight-like suits and seats to verify the safety of the landing systems. Currently, NASA uses the Brinkley Dynamic Response Index to estimate injury risk, although applying it to the NASA environment has drawbacks: (1) it does not indicate the severity or anatomical location of injury; and (2) it is unclear whether the model applies to NASA applications. Because of these limitations, a new validated, analytical approach was desired. Leveraging the current state of the art in automotive safety and racing, a new approach was developed. The approach has several aspects: (1) define the acceptable level of injury risk by injury severity; (2) determine the appropriate human surrogate for testing and modeling; (3) mine existing human injury data to determine appropriate Injury Assessment Reference Values (IARVs); (4) rigorously validate the IARVs with sub-injurious human testing; and (5) use the validated IARVs to update standards and vehicle requirements.
Current strategies in multiphasic scaffold design for osteochondral tissue engineering: A review.
Yousefi, Azizeh-Mitra; Hoque, Md Enamul; Prasad, Rangabhatala G S V; Uth, Nicholas
2015-07-01
The repair of osteochondral defects requires a tissue engineering approach that aims at mimicking the physiological properties and structure of two different tissues (cartilage and bone) using specifically designed scaffold-cell constructs. Biphasic and triphasic approaches utilize two or three different architectures, materials, or composites to produce a multilayered construct. This article gives an overview of some of the current strategies in multiphasic/gradient-based scaffold architectures and compositions for tissue engineering of osteochondral defects. In addition, the application of finite element analysis (FEA) in scaffold design and simulation of in vitro and in vivo cell growth outcomes has been briefly covered. FEA-based approaches can potentially be coupled with computer-assisted fabrication systems for controlled deposition and additive manufacturing of the simulated patterns. Finally, a summary of the existing challenges associated with the repair of osteochondral defects as well as some recommendations for future directions have been brought up in the concluding section of this article. © 2014 Wiley Periodicals, Inc.
The third wave of biological psychiatry
Walter, Henrik
2013-01-01
In this article I will argue that we are witnessing at this moment the third wave of biological psychiatry. This framework conceptualizes mental disorders as brain disorders of a special kind that requires a multilevel approach ranging from genes to psychosocial mechanisms. In contrast to earlier biological psychiatry approaches, the mental plays a more prominent role in the third wave. This will become apparent by discussing the recent controversy evolving around the recently published DSM-5 and the competing transdiagnostic Research Domain Criteria approach of the National Institute of Mental Health, which is built on concepts of cognitive neuroscience. A look at current conceptualizations in biological psychiatry, as well as at some discussions in current philosophy of mind on situated cognition, reveals that the thesis that mental disorders are brain disorders has to be qualified with respect to how mental states are constituted and with respect to multilevel explanations of which factors contribute to stable patterns of psychopathological signs and symptoms. PMID:24046754
Affordable proteomics: the two-hybrid systems.
Gillespie, Marc
2003-06-01
Numerous proteomic methodologies exist, but most require a heavy investment in expertise and technology. This puts these approaches out of reach for many laboratories and small companies, rarely allowing proteomics to be used as a pilot approach for biomarker or target identification. Two proteomic approaches, 2D gel electrophoresis and the two-hybrid systems, are currently available to most researchers. The two-hybrid systems, though accommodating to large-scale experiments, were originally designed as practical screens that, by comparison to current proteomics tools, were small-scale, affordable and technically feasible. The screens rapidly generated data, identifying protein interactions that were previously uncharacterized. The foundation for a two-hybrid proteomic investigation can be purchased as separate kits from a number of companies. The true power of the technique lies not in its affordability, but rather in its portability. The two-hybrid system puts proteomics back into laboratories where the output of the screens can be evaluated by researchers with experience in the particular fields of basic research, cancer biology, toxicology or drug development.
Moller, Jerry
2005-01-01
The example of fall injury among older people is used to define and illustrate how current Australian systems for allocation of health resources perform for funding emerging public health issues. While the examples are Australian, the allocation and priority setting methods are common in the health sector in all developed western nations. With an ageing population the number of falls injuries in Australia and the cost of treatment will rise dramatically over the next 20-50 years. Current methods of allocating funds within the health system are not well suited to meeting this coming epidemic. The information requirements for cost-benefit and cost-effectiveness measures cannot be met. Marginal approaches to health funding are likely to continue to fund already well-funded treatment or politically driven prevention processes and to miss the opportunity for new prevention initiatives in areas that do not have a high political profile. Fall injury is one of many emerging areas that struggle to make claims for funding because the critical mass of intervention and evidence of its impact is not available. The beneficiaries of allocation failure may be those who treat the disease burden that could have been easily prevented. Changes to allocation mechanisms, data systems and new initiative funding practices are required to ensure that preventative strategies are able to compete on an equal footing with treatment approaches for mainstream health funding.
Microfluidic Transduction Harnesses Mass Transport Principles to Enhance Gene Transfer Efficiency.
Tran, Reginald; Myers, David R; Denning, Gabriela; Shields, Jordan E; Lytle, Allison M; Alrowais, Hommood; Qiu, Yongzhi; Sakurai, Yumiko; Li, William C; Brand, Oliver; Le Doux, Joseph M; Spencer, H Trent; Doering, Christopher B; Lam, Wilbur A
2017-10-04
Ex vivo gene therapy using lentiviral vectors (LVs) is a proven approach to treat and potentially cure many hematologic disorders and malignancies but remains stymied by cumbersome, cost-prohibitive, and scale-limited production processes that cannot meet the demands of current clinical protocols for widespread clinical utilization. In particular, limitations in LV manufacture, coupled with inefficient transduction protocols requiring significant excess amounts of vector, currently limit widespread implementation. Herein, we describe a microfluidic, mass transport-based approach that overcomes the diffusion limitations of current transduction platforms to enhance LV gene transfer kinetics and efficiency. This novel ex vivo LV transduction platform is flexible in design, easy to use, scalable, and compatible with standard cell transduction reagents and LV preparations. Using hematopoietic cell lines, primary human T cells, and primary hematopoietic stem and progenitor cells (HSPCs) of both murine (Sca-1 + ) and human (CD34 + ) origin, microfluidic transduction using clinically processed LVs occurs up to 5-fold faster and requires as little as one-twentieth of LV. As an in vivo validation of the microfluidic-based transduction technology, HSPC gene therapy was performed in hemophilia A mice using limiting amounts of LV. Compared to the standard static well-based transduction protocols, only animals transplanted with microfluidic-transduced cells displayed clotting levels restored to normal. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Exploring the performance of large-N radio astronomical arrays
NASA Astrophysics Data System (ADS)
Lonsdale, Colin J.; Doeleman, Sheperd S.; Cappallo, Roger J.; Hewitt, Jacqueline N.; Whitney, Alan R.
2000-07-01
New radio telescope arrays are currently being contemplated which may be built using hundreds, or even thousands, of relatively small antennas. These include the One Hectare Telescope of the SETI Institute and UC Berkeley, the LOFAR telescope planned for the New Mexico desert surrounding the VLA, and possibly the ambitious international Square Kilometer Array (SKA) project. Recent and continuing advances in signal transmission and processing technology make it realistic to consider full cross-correlation of signals from such a large number of antennas, permitting the synthesis of an aperture with much greater fidelity than in the past. In principle, many advantages in instrumental performance are gained by this 'large-N' approach to the design, most of which require the development of new algorithms. Because new instruments of this type are expected to outstrip the performance of current instruments by wide margins, much of their scientific productivity is likely to come from the study of objects which are currently unknown. For this reason, instrumental flexibility is of special importance in design studies. A research effort has begun at Haystack Observatory to explore large-N performance benefits, and to determine what array design properties and data reduction algorithms are required to achieve them. The approach to these problems, involving a sophisticated data simulator, algorithm development, and exploration of array configuration parameter space, will be described, and progress to date will be summarized.
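The "large-N" trade-off described above ultimately comes down to baseline counting: full cross-correlation pairs every antenna with every other, so correlator cost grows roughly quadratically with N. A back-of-envelope sketch:

```python
def correlator_cost(n_antennas, n_channels=1):
    """Back-of-envelope scaling for a fully cross-correlated array:
    the number of baselines is N(N-1)/2, so correlator work grows
    roughly as N^2 (times the number of frequency channels)."""
    baselines = n_antennas * (n_antennas - 1) // 2
    return baselines * n_channels
```

For example, a 27-antenna array (VLA-like) has 351 baselines, while a 1000-antenna array has 499,500, which is why large-N designs only became realistic with recent advances in signal transmission and processing.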
A zero-power warming chamber for investigating plant responses to rising temperature
Lewin, Keith F.; McMahon, Andrew M.; Ely, Kim S.; ...
2017-09-19
Advances in understanding and model representation of plant and ecosystem responses to rising temperature have typically required temperature manipulation of research plots, particularly when considering warming scenarios that exceed current climate envelopes. In remote or logistically challenging locations, passive warming using solar radiation is often the only viable approach for temperature manipulation. However, current passive warming approaches are only able to elevate the mean daily air temperature by ~1.5 °C. Motivated by our need to understand temperature acclimation in the Arctic, where warming has been markedly greater than the global average and where future warming is projected to be ~2–3 °C by the middle of the century, we have developed an alternative approach to passive warming. Our zero-power warming (ZPW) chamber requires no electrical power for fully autonomous operation. It uses a novel system of internal and external heat exchangers that allow differential actuation of pistons in coupled cylinders to control chamber venting. This enables the ZPW chamber venting to respond to the difference between the external and internal air temperatures, thereby increasing the potential for warming and eliminating the risk of overheating. During the thaw season on the coastal tundra of northern Alaska our ZPW chamber was able to elevate the mean daily air temperature 2.6 °C above ambient, double the warming achieved by an adjacent passively warmed control chamber that lacked our hydraulic system. We describe the construction, evaluation and performance of our ZPW chamber and discuss the impact of potential artefacts associated with the design and its operation on the Arctic tundra. Our approach is highly flexible and tunable, enabling customization for use in many different environments where significantly greater temperature manipulation than that possible with existing passive warming approaches is desired.
What is the future of diabetic wound care?
Sweitzer, Sarah M; Fann, Stephen A; Borg, Thomas K; Baynes, John W; Yost, Michael J
2006-01-01
With diabetes affecting 5% to 10% of the US population, development of a more effective treatment for chronic diabetic wounds is imperative. Clinically, the current treatment in topical wound management includes debridement, topical antibiotics, and a state-of-the-art topical dressing. A state-of-the-art dressing is a multi-layer system that can include a collagen cellulose substrate, neonatal foreskin fibroblasts, a growth factor-containing cream, and a silicone sheet covering for moisture control. Wound healing time can be up to 20 weeks. The future of diabetic wound healing lies in the development of more effective artificial "smart" matrix skin substitutes. This review article will highlight the need for novel smart matrix therapies. These smart matrices will release a multitude of growth factors, cytokines, and bioactive peptide fragments in a temporally and spatially specific, event-driven manner. This timed and focal release of cytokines, enzymes, and pharmacological agents should promote optimal tissue regeneration and repair of full-thickness wounds. Development of these kinds of therapies will require multidisciplinary translational research teams. This review article outlines how current advances in proteomics and genomics can be incorporated into a multidisciplinary translational research approach for developing novel smart matrix dressings for ulcer treatment. With the recognition that the research approach will require both time and money, the best treatment approach is the prevention of diabetic ulcers through better foot care, education, and glycemic control.
Buselli, R; Cristaudo, A
2009-01-01
In Italy the recent safety legislation requires a new commitment from the company occupational physician. His duty is a balance between legal requirements and the state of the art of prevention. There are many tools to tackle stress at work as a general preventive intervention. The hard challenge for the company physician is to maintain all the guarantees in terms of prevention and social security for the worker at risk of stress. This paper examines some of the difficulties with current approaches and looks at possible solutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edgar, Thomas W.; Hadley, Mark D.; Manz, David O.
This document provides the methods to secure routable control system communication in the electric sector. The approach of this document yields a long-term vision for a future of secure communication, while also providing near term steps and a roadmap. The requirements for the future secure control system environment were spelled out to provide a final target. Additionally a survey and evaluation of current protocols was used to determine if any existing technology could achieve this goal. In the end a four-step path was described that brought about increasing requirement completion and culminates in the realization of the long term vision.
Orbiter wheel and tire certification
NASA Technical Reports Server (NTRS)
Campbell, C. C., Jr.
1985-01-01
The orbiter wheel and tire development has required a unique series of certification tests to demonstrate the ability of the hardware to meet severe performance requirements. Early tests of the main landing gear wheel using conventional slow-roll testing resulted in hardware failures. These failures created a need to conduct high-velocity tests with crosswind effects for assurance that the hardware was safe for a limited number of flights. Currently, this approach, together with the conventional slow-roll and static tests, is used to certify the wheel/tire assembly for operational use.
Siegel, Jason T; Navarro, Mario A; Thomson, Andrew L
2015-10-01
Investigations conducted through Amazon's Mechanical Turk (MTurk) sometimes explicitly note eligibility requirements when recruiting participants; however, the impact of this practice on data integrity is relatively unexplored within the MTurk context. Contextualized in the organ donor registration domain, the current study assessed whether overtly listing eligibility requirements impairs the accuracy of data collected on MTurk. On day 1, the first and third rounds of data collection did not list eligibility requirements; the second and fourth rounds overtly listed a qualification requirement: status as a non-registered organ donor. On day 2, the approach was identical, except the order was reversed: the first and third rounds overtly listed the study qualifications, while the second and fourth did not. These procedures provided eight different waves of data. In addition, all participants were randomly assigned to read an elevating (i.e., morally inspiring) story or a story not intended to elicit any emotion. Regardless of recruitment approach, only participants who were not registered as donors were included in the analysis. Results indicated that the recruitment script that explicitly requested non-registered donors resulted in the collection of participants with higher mean intentions scores than the script that did not overtly list the eligibility requirements. Further, even though the elevation induction increased intentions to register as a donor, there was not a significant interaction between recruitment approach and the influence of the elevation manipulation on registration intentions. Explicitly listing eligibility requirements can influence the accuracy of estimates derived from data collected through MTurk. Copyright © 2015 Elsevier Ltd. All rights reserved.
An approach for software-driven and standard-based support of cross-enterprise tumor boards.
Mangesius, Patrick; Fischer, Bernd; Schabetsberger, Thomas
2015-01-01
For tumor boards, the networking of different medical disciplines' expertise continues to gain importance. However, interdisciplinary tumor boards spread across several institutions are rarely supported by information technology tools today. The aim of this paper is to point out an approach for a tumor board management system prototype. For analyzing the requirements, an incremental process was used. The requirements were surveyed using the Informal Conversational Interview and documented with Use Case Diagrams defined by the Unified Modeling Language (UML). Analyses of current EHR standards were conducted to evaluate technical requirements. Functional and technical requirements of clinical conference applications were evaluated and documented. In several steps, workflows were derived and application mockups were created. Although there is a vast amount of common understanding concerning how clinical conferences should be conducted and how their workflows should be structured, these are hardly standardized on either a functional or a technical level. This results in drawbacks for participants and patients. Using modern EHR technologies based on profiles such as IHE Cross-Enterprise Document Sharing (XDS), these deficits could be overcome.
NASA Technical Reports Server (NTRS)
Folta, David C.; Carpenter, J. Russell
1999-01-01
A decentralized control is investigated for applicability to the autonomous formation flying control algorithm developed by GSFC for the New Millennium Program Earth Observer-1 (EO-1) mission. This decentralized framework has the following characteristics: The approach is non-hierarchical, and coordination by a central supervisor is not required; Detected failures degrade the system performance gracefully; Each node in the decentralized network processes only its own measurement data, in parallel with the other nodes; Although the total computational burden over the entire network is greater than it would be for a single, centralized controller, fewer computations are required locally at each node; Requirements for data transmission between nodes are limited to only the dimension of the control vector, at the cost of maintaining a local additional data vector. The data vector compresses all past measurement history from all the nodes into a single vector of the dimension of the state; and The approach is optimal with respect to standard cost functions. The current approach is valid for linear time-invariant systems only. Similar to the GSFC formation flying algorithm, the extension to linear LQG time-varying systems requires that each node propagate its filter covariance forward (navigation) and controller Riccati matrix backward (guidance) at each time step. Extension of the GSFC algorithm to non-linear systems can also be accomplished via linearization about a reference trajectory in the standard fashion, or linearization about the current state estimate as with the extended Kalman filter. To investigate the feasibility of the decentralized integration with the GSFC algorithm, an existing centralized LQG design for a single spacecraft orbit control problem is adapted to the decentralized framework while using the GSFC algorithm's state transition matrices and framework.
The existing GSFC design uses reference trajectories for each spacecraft in the formation and, through an appropriate choice of coordinates and simplified measurement modeling, is formulated as a linear time-invariant system. Results for improvements to the GSFC algorithm and a multiple satellite formation will be addressed. The goal of this investigation is to progressively relax the assumptions that result in linear time-invariance, ultimately to the point of linearization of the non-linear dynamics about the current state estimate as in the extended Kalman filter. An assessment will then be made about the feasibility of the decentralized approach to the realistic formation flying application of the EO-1/Landsat 7 formation flying experiment.
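The per-node recursions the abstract describes for the time-varying LQG case can be sketched in a few lines. The double-integrator dynamics, weights, and noise covariances below are illustrative assumptions, not the EO-1/Landsat 7 models:

```python
import numpy as np

# Per-node recursions described in the abstract (illustrative system):
# control Riccati matrix propagated backward (guidance), filter
# covariance propagated forward (navigation).
A = np.array([[1.0, 1.0], [0.0, 1.0]])       # assumed double-integrator dynamics
B = np.array([[0.5], [1.0]])
Q, R = np.eye(2), np.array([[1.0]])          # assumed LQR weights
H = np.array([[1.0, 0.0]])                   # assumed measurement model
W, V = 0.01 * np.eye(2), np.array([[0.1]])   # process / measurement noise

N = 20

# Backward Riccati recursion (guidance): S_k from S_{k+1}
S = Q.copy()
for _ in range(N):
    K = np.linalg.solve(R + B.T @ S @ B, B.T @ S @ A)   # feedback gain
    S = Q + A.T @ S @ (A - B @ K)
    S = 0.5 * (S + S.T)                                 # guard against round-off asymmetry

# Forward covariance propagation (navigation): predict/update
P = np.eye(2)
for _ in range(N):
    P = A @ P @ A.T + W                                 # predict
    G = P @ H.T @ np.linalg.inv(H @ P @ H.T + V)        # Kalman gain
    P = (np.eye(2) - G @ H) @ P                         # update

print(S.round(2))
print(P.round(4))
```

In the decentralized framework each node would run both recursions locally, exchanging only control-vector-sized data with its neighbors.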
Health impact assessment in Australia: A review and directions for progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, Patrick, E-mail: patrick.harris@unsw.edu.a; Spickett, Jeff, E-mail: J.Spickett@curtin.edu.a
2011-07-15
This article provides an overview of Health Impact Assessment (HIA) within Australia. We discuss the development and current position of HIA and offer some directions for HIA's progression. Since the early 1990s HIA activity in Australia has increased and diversified in application and practice. This article first highlights the emergent streams of HIA practice across environmental, policy and health equity foci, and how these have developed within Australia. The article then provides summaries of current practice provided by each Australian state and territory. We then offer some insight into current issues that require further progression or resolution if HIA is to progress effectively in Australia. This progress rests both on developing broad system support for HIA across government, led by the health sector, and developing system capacity to undertake, commission or review HIAs. We argue that a unified and clear HIA approach is required as a prerequisite to gaining the understanding and support for HIA in the public and private sectors and the wider community.
Load Balancing Strategies for Multiphase Flows on Structured Grids
NASA Astrophysics Data System (ADS)
Olshefski, Kristopher; Owkes, Mark
2017-11-01
The computation time required to perform large simulations of complex systems is currently one of the leading bottlenecks of computational research. Parallelization allows multiple processing cores to perform calculations simultaneously and reduces computational times. However, load imbalances between processors waste computing resources as processors wait for others to complete imbalanced tasks. In multiphase flows, these imbalances arise due to the additional computational effort required at the gas-liquid interface. However, many current load balancing schemes are only designed for unstructured grid applications. The purpose of this research is to develop a load balancing strategy while maintaining the simplicity of a structured grid. Several approaches are investigated, including brute force oversubscription, node oversubscription through Message Passing Interface (MPI) commands, and shared memory load balancing using OpenMP. Each of these strategies is tested with a simple one-dimensional model prior to implementation into the three-dimensional NGA code. Current results show load balancing will reduce computational time by at least 30%.
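The kind of one-dimensional model mentioned above can be illustrated with a toy cost-based partitioner. The cell counts, the 10x interface cost, and the partitioning scheme are assumptions for illustration, not NGA's actual cost model:

```python
import numpy as np

# Toy 1-D multiphase grid: interface cells cost more than bulk cells,
# so equal-sized blocks give unequal work (the imbalance in the abstract).
ncells, nproc = 1000, 4
cost = np.ones(ncells)
cost[480:520] = 10.0   # assumed gas-liquid interface band: 10x work per cell

def max_load(bounds):
    """Worst per-rank workload for contiguous blocks [bounds[i], bounds[i+1])."""
    return max(cost[a:b].sum() for a, b in zip(bounds[:-1], bounds[1:]))

# Naive structured split: equal cell counts per rank.
naive = list(range(0, ncells + 1, ncells // nproc))

# Balanced split: cut at equal shares of cumulative cost, while keeping
# the grid structured (each rank still owns one contiguous 1-D block).
cum = np.concatenate([[0.0], np.cumsum(cost)])
targets = np.linspace(0.0, cum[-1], nproc + 1)
balanced = [int(np.searchsorted(cum, t)) for t in targets]

print(max_load(naive), max_load(balanced))
```

With these numbers the naive split leaves the interface-owning ranks with roughly 30% more work than the cost-balanced split, which is the gap oversubscription or OpenMP threading would otherwise have to absorb.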
NASA's In Space Manufacturing Initiatives: Conquering the Challenges of In-Space Manufacturing
NASA Technical Reports Server (NTRS)
Clinton, R. G., Jr.
2017-01-01
Current maintenance logistics strategy will not be effective for deep space exploration missions. ISM (In Space Manufacturing) offers the potential to: Significantly reduce maintenance logistics mass requirements; Enable the use of recycled materials and in-situ resources for more dramatic reductions in mass requirements; Enable flexibility, giving systems a broad capability to adapt to unanticipated circumstances; Mitigate risks that are not covered by current approaches to maintainability. Multiple projects are currently underway to develop and validate these capabilities for infusion into ISM exploration systems. ISS is a critical testbed for demonstrating ISM technologies, proving out these capabilities, and performing operational validation of deep space ISM applications. Developing and testing FabLab is a major milestone and a springboard to DSG/cis-lunar space applications. ISM is a necessary paradigm shift in space operations; a design-for-repair culture must be embraced. The ISM team needs to be working with exploration system designers now to identify high-value application areas and influence design.
Optimal current waveforms for brushless permanent magnet motors
NASA Astrophysics Data System (ADS)
Moehle, Nicholas; Boyd, Stephen
2015-07-01
In this paper, we give energy-optimal current waveforms for a permanent magnet synchronous motor that result in a desired average torque. Our formulation generalises previous work by including a general back-electromotive force (EMF) wave shape, voltage and current limits, an arbitrary phase winding connection, a simple eddy current loss model, and a trade-off between power loss and torque ripple. Determining the optimal current waveforms requires solving a small convex optimisation problem. We show how to use the alternating direction method of multipliers to find the optimal current in milliseconds or hundreds of microseconds, depending on the processor used, which allows the possibility of generating optimal waveforms in real time. This allows us to adapt in real time to changes in the operating requirements or in the model, such as a change in resistance with winding temperature, or even gross changes like the failure of one winding. Suboptimal waveforms are available in tens or hundreds of microseconds, allowing for quick response after abrupt changes in the desired torque. We demonstrate our approach on a simple numerical example, in which we give the optimal waveforms for a motor with a sinusoidal back-EMF, and for a motor with a more complicated, nonsinusoidal waveform, in both the constant-torque region and constant-power region.
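Stripped of the voltage/current limits, eddy-current loss, and ripple trade-off that make the paper's formulation a convex program (solved there via ADMM), the core per-angle problem has a closed-form least-norm solution. The sinusoidal back-EMF shape and torque constant below are illustrative assumptions:

```python
import numpy as np

def optimal_currents(theta, tau_des, Kt=0.1):
    """Minimum-copper-loss phase currents: argmin sum(i^2) s.t. k(theta) @ i = tau_des."""
    # Assumed sinusoidal back-EMF shape, three phases 120 degrees apart.
    k = Kt * np.array([np.sin(theta),
                       np.sin(theta - 2 * np.pi / 3),
                       np.sin(theta + 2 * np.pi / 3)])
    # Lagrange-multiplier solution of the equality-constrained least-norm problem.
    return k * tau_des / (k @ k)

# For balanced sinusoidal k, ||k||^2 = 1.5 * Kt^2 is constant, so no singularity.
i = optimal_currents(0.7, tau_des=1.5)
print(i.round(3))
```

Because the balanced sinusoidal back-EMF components sum to zero, the resulting phase currents also sum to zero, consistent with a wye-connected winding; adding the limits and loss terms from the paper would turn each evaluation into the small convex problem the authors solve in real time.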
Enhanced use of phylogenetic data to inform public health approaches to HIV among MSM
German, Danielle; Grabowski, Mary Kate; Beyrer, Chris
2017-01-01
The multi-dimensional nature and continued evolution of HIV epidemics among men who have sex with men (MSM) requires innovative intervention approaches. Strategies are needed that recognize the individual, social, and structural factors driving HIV transmission; that can pinpoint networks with heightened transmission risk; and that can help target intervention in real-time. HIV phylogenetics is a rapidly evolving field with strong promise for informing innovative responses to the HIV epidemic among MSM. Currently, HIV phylogenetic insights are providing new understandings of characteristics of HIV epidemics involving MSM, social networks influencing transmission, characteristics of HIV transmission clusters involving MSM, targets for antiretroviral and other prevention strategies, and dynamics of emergent epidemics. Maximizing the potential of HIV phylogenetics for HIV responses among MSM will require attention to key methodological challenges and ethical considerations, as well as resolving key implementation and scientific questions. Enhanced and integrated use of HIV surveillance, socio-behavioral, and phylogenetic data resources are becoming increasingly critical for informing public health approaches to HIV among MSM. PMID:27584826
A secured e-tendering modeling using misuse case approach
NASA Astrophysics Data System (ADS)
Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman
2016-08-01
Major risk factors relating to electronic transactions may lead to destructive impacts on trust and transparency in the process of tendering. Currently, electronic tendering (e-tendering) systems still remain uncertain in issues relating to legal and security compliance and, most importantly, lack a clear security framework. In particular, the available systems fall short in addressing integrity, confidentiality, authentication, and non-repudiation in e-tendering requirements. Thus, one of the challenges in developing an e-tendering system is to ensure the system requirements include functions for a secured and trusted environment. Therefore, this paper aims to model a secured e-tendering system using a misuse case approach. The modeling process begins with identifying the e-tendering process, which is based on the Australian Standard Code of Tendering (AS 4120-1994). It is followed by identifying security threats and their countermeasures. Then the e-tendering process is modelled using the misuse case approach. The model can contribute to e-tendering developers and also to other researchers or experts in the e-tendering domain.
Next-generation sequencing library construction on a surface.
Feng, Kuan; Costa, Justin; Edwards, Jeremy S
2018-05-30
Next-generation sequencing (NGS) has revolutionized almost all fields of biology, agriculture and medicine, and is widely utilized to analyse genetic variation. Over the past decade, the NGS pipeline has been steadily improved, and the entire process is currently relatively straightforward. However, NGS instrumentation still requires upfront library preparation, which can be a laborious process, requiring significant hands-on time. Herein, we present a simple but robust approach to streamline library preparation by utilizing surface bound transposases to construct DNA libraries directly on a flowcell surface. The surface bound transposases directly fragment genomic DNA while simultaneously attaching the library molecules to the flowcell. We sequenced and analysed a Drosophila genome library generated by this surface tagmentation approach, and we showed that our surface bound library quality was comparable to the quality of the library from a commercial kit. In addition to the time and cost savings, our approach does not require PCR amplification of the library, which eliminates potential problems associated with PCR duplicates. We described the first study to construct libraries directly on a flowcell. We believe our technique could be incorporated into the existing Illumina sequencing pipeline to simplify the workflow, reduce costs, and improve data quality.
NASA Astrophysics Data System (ADS)
Gallagher, John A.
2016-04-01
The desired operating range of ferroelectric materials with compositions near the morphotropic phase boundary is limited by field induced phase transformations. In [001]C cut and poled relaxor ferroelectric single crystals the mechanically driven ferroelectric rhombohedral to ferroelectric orthorhombic phase transformation is hindered by antagonistic electrical loading. Instability around the phase transformation makes the current experimental technique for characterization of the large field behavior very time consuming. Characterization requires specialized equipment and involves an extensive set of measurements under combined electrical, mechanical, and thermal loads. In this work a mechanism-based model is combined with a more limited set of experiments to obtain the same results. The model utilizes a work-energy criterion that calculates the mechanical work required to induce the transformation and the required electrical work that is removed to reverse the transformation. This is done by defining energy barriers to the transformation. The results of the combined experiment and modeling approach are compared to the fully experimental approach and error is discussed. The model shows excellent predictive capability and is used to substantially reduce the total number of experiments required for characterization. This decreases the time and resources required for characterization of new compositions.
NSLS-II HIGH LEVEL APPLICATION INFRASTRUCTURE AND CLIENT API DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, G.; Yang, L.
2011-03-28
The beam commissioning software framework of the NSLS-II project adopts a client/server based architecture to replace the more traditional monolithic high level application approach. It is an open structure platform, and we try to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures could be ported to our platform with small modifications. This paper describes the system infrastructure design, client API and system integration, and latest progress. As a new 3rd generation synchrotron light source with ultra low emittance, there are new requirements and challenges to control and manipulate the beam. A use case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high level application (HLA) software environment. To satisfy those requirements and challenges, an adequate system architecture of the software framework is critical for beam commissioning, study and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches can satisfy the requirements. A new design has been proposed by introducing service oriented architecture technology. The HLA is a combination of tools for accelerator physicists and operators, as in the traditional approach. In NSLS-II, these include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed based on a common set of APIs. Physicists and operators are users of these APIs, while control system engineers and a few accelerator physicists are the developers of these APIs.
With our client/server based approach, we leave how to retrieve information to the developers of the APIs, and how to use them to form a physics application to the users. For example, how the channels are related to a magnet, and what the current real-time setting of a magnet is in physics units, are internals of the APIs; routines such as chromaticity measurement are users of the APIs. All users of the APIs work with magnet and instrument names in physics units. Low level communications in current or voltage units are minimized. In this paper, we discuss recent progress on our infrastructure development and the client API.
Guide for developing an information technology investment road map for population health management.
Hunt, Jacquelyn S; Gibson, Richard F; Whittington, John; Powell, Kitty; Wozney, Brad; Knudson, Susan
2015-06-01
Many health systems recovering from a massive investment in electronic health records are now faced with the prospect of maturing into accountable care organizations. This maturation includes the need to cooperate with new partners, involve substantially new data sources, require investment in additional information technology (IT) solutions, and become proficient in managing care from a new perspective. Adding to the confusion, there are hundreds of population health management (PHM) vendors with overlapping product functions. This article proposes an organized approach to investing in PHM IT. The steps include assessing the organization's business and clinical goals, establishing governance, agreeing on business requirements, evaluating the ability of current IT systems to meet those requirements, setting time lines and budgets, rationalizing current and future needs and capabilities, and installing the new systems in the context of a continuously learning organization. This article will help organizations chart their position on the population health readiness spectrum and enhance their chances for a successful transition from volume-based to value-based care.
Achieving competitive advantage through strategic human resource management.
Fottler, M D; Phillips, R L; Blair, J D; Duran, C A
1990-01-01
The framework presented here challenges health care executives to manage human resources strategically as an integral part of the strategic planning process. Health care executives should consciously formulate human resource strategies and practices that are linked to and reinforce the broader strategic posture of the organization. This article provides a framework for (1) determining and focusing on desired strategic outcomes, (2) identifying and implementing essential human resource management actions, and (3) maintaining or enhancing competitive advantage. The strategic approach to human resource management includes assessing the organization's environment and mission; formulating the organization's business strategy; assessing the human resources requirements based on the intended strategy; comparing the current inventory of human resources in terms of numbers, characteristics, and human resource management practices with respect to the strategic requirements of the organization and its services or product lines; formulating the human resource strategy based on the differences between the assessed requirements and the current inventory; and implementing the appropriate human resource practices to reinforce the strategy and attain competitive advantage.
The future of 3D and video coding in mobile and the internet
NASA Astrophysics Data System (ADS)
Bivolarski, Lazar
2013-09-01
The current success of the Internet has already changed our social and economic world and is still revolutionizing information exchange. The exponential increase in the amount and types of data currently exchanged on the Internet represents a significant challenge for the design of future architectures and solutions. This paper reviews the current status and trends in the design of solutions and research activities in the future Internet from the point of view of managing the growth of bandwidth requirements and the complexity of the multimedia being created and shared. It outlines the challenges facing video coding and approaches to the design of standardized media formats and protocols, while considering the expected convergence of multimedia formats and exchange interfaces. The rapid growth of connected mobile devices adds to the current and future challenges, in combination with the expected near-future arrival of a multitude of connected devices. The new Internet technologies connecting the Internet of Things with wireless visual sensor networks and 3D virtual worlds require conceptually new approaches to media content handling, from acquisition to presentation, in the 3D Media Internet. Accounting for the properties of the entire transmission system and enabling real-time adaptation to context and content throughout the media processing path will be paramount in enabling the new media architectures as well as the new applications and services. The common video coding formats will need to be conceptually redesigned to allow for the implementation of the necessary 3D Media Internet features.
Kazi, Ada; Chuah, Candy; Majeed, Abu Bakar Abdul; Leow, Chiuan Herng; Lim, Boon Huat; Leow, Chiuan Yee
2018-03-12
Immunoinformatics plays a pivotal role in vaccine design, immunodiagnostic development, and antibody production. In the past, antibody design and vaccine development depended exclusively on immunological experiments which are relatively expensive and time-consuming. However, recent advances in the field of immunological bioinformatics have provided feasible tools which can be used to lessen the time and cost required for vaccine and antibody development. This approach allows the selection of immunogenic regions from the pathogen genomes. The ideal regions could be developed as potential vaccine candidates to trigger protective immune responses in the hosts. At present, epitope-based vaccines are attractive concepts which have been successfully trailed to develop vaccines which target rapidly mutating pathogens. In this article, we provide an overview of the current progress of immunoinformatics and their applications in the vaccine design, immune system modeling and therapeutics.
NASA Technical Reports Server (NTRS)
Sampson, Paul G.; Sny, Linda C.
1992-01-01
The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).
Prospects and challenges for fungal metatranscriptomes of complex communities
Kuske, Cheryl Rae; Hesse, Cedar Nelson; Challacombe, Jean Faust; ...
2015-01-22
We report that the ability to extract and purify messenger RNA directly from plants, decomposing organic matter and soil, followed by high-throughput sequencing of the pool of expressed genes, has spawned the emerging research area of metatranscriptomics. Each metatranscriptome provides a snapshot of the composition and relative abundance of actively transcribed genes, and thus provides an assessment of the interactions between soil microorganisms and plants, and collective microbial metabolic processes in many environments. We highlight current approaches for analysis of fungal transcriptome and metatranscriptome datasets across a gradient of community complexity, and note benefits and pitfalls associated with those approaches. Finally, we discuss knowledge gaps that limit our current ability to interpret metatranscriptome datasets and suggest future research directions that will require concerted efforts within the scientific community.
Korjus, Kristjan; Hebart, Martin N.; Vicente, Raul
2016-01-01
Supervised machine learning methods typically require splitting data into multiple chunks for training, validating, and finally testing classifiers. For finding the best parameters of a classifier, training and validation are usually carried out with cross-validation. This is followed by application of the classifier with optimized parameters to a separate test set for estimating the classifier’s generalization performance. With limited data, this separation of test data creates a difficult trade-off between having more statistical power in estimating generalization performance versus choosing better parameters and fitting a better model. We propose a novel approach that we term “Cross-validation and cross-testing” improving this trade-off by re-using test data without biasing classifier performance. The novel approach is validated using simulated data and electrophysiological recordings in humans and rodents. The results demonstrate that the approach has a higher probability of discovering significant results than the standard approach of cross-validation and testing, while maintaining the nominal alpha level. In contrast to nested cross-validation, which is maximally efficient in re-using data, the proposed approach additionally maintains the interpretability of individual parameters. Taken together, we suggest an addition to currently used machine learning approaches which may be particularly useful in cases where model weights do not require interpretation, but parameters do. PMID:27564393
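For contrast, the standard pipeline the authors improve upon (cross-validation on a training chunk to pick the parameter, then a single evaluation on a held-out test chunk) can be sketched as follows. The toy Gaussian data and the k-nearest-neighbour classifier are illustrative assumptions; the paper used simulated data and electrophysiological recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy two-class Gaussian data in 2-D.
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(1.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
perm = rng.permutation(200)
X, y = X[perm], y[perm]

def knn_predict(Xtr, ytr, Xte, k):
    """Plain k-nearest-neighbour vote; k is the parameter being tuned."""
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return (ytr[idx].mean(axis=1) > 0.5).astype(int)

# Split once: training chunk for cross-validated parameter selection,
# held-out test chunk touched exactly once at the end.
Xtr, ytr, Xte, yte = X[:150], y[:150], X[150:], y[150:]

def cv_score(k, folds=5):
    parts = np.array_split(np.arange(len(Xtr)), folds)
    accs = []
    for f in parts:
        tr = np.setdiff1d(np.arange(len(Xtr)), f)
        accs.append((knn_predict(Xtr[tr], ytr[tr], Xtr[f], k) == ytr[f]).mean())
    return float(np.mean(accs))

best_k = max([1, 5, 15, 45], key=cv_score)               # odd k values avoid vote ties
test_acc = float((knn_predict(Xtr, ytr, Xte, best_k) == yte).mean())
print(best_k, round(test_acc, 2))
```

The trade-off the abstract targets is visible here: the 50 held-out samples buy an unbiased performance estimate but shrink the data available for choosing `k`; the proposed "cross-validation and cross-testing" scheme re-uses that test data without biasing the estimate.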
A current-carrying coil design with improved liquid cooling arrangement
NASA Astrophysics Data System (ADS)
Ricci, Leonardo; Martini, Luca Matteo; Franchi, Matteo; Bertoldi, Andrea
2013-06-01
The design of an electromagnet requires the compliance with a number of constraints such as power supply characteristics, coil inductance and resistance, and, above all, heat dissipation, which poses the limit to the maximum achievable magnetic field. A common solution consists in using copper tubes in which a coolant flows. This approach, however, introduces further hydrodynamic concerns. To overcome these difficulties, we developed a new kind of electromagnet in which the pipe concept is replaced by a duct formed by the windings. Here we report on the realization and characterization of a compact model system in which the conductors carry a current that is one order of magnitude higher than the current allowable with conventional designs.
Objectives and metrics for wildlife monitoring
Sauer, J.R.; Knutson, M.G.
2008-01-01
Monitoring surveys allow managers to document system status and provide the quantitative basis for management decision-making, and large amounts of effort and funding are devoted to monitoring. Still, monitoring surveys often fall short of providing required information; inadequacies exist in survey designs, analysis procedures, or in the ability to integrate the information into an appropriate evaluation of management actions. We describe current uses of monitoring data, provide our perspective on the value and limitations of current approaches to monitoring, and set the stage for 3 papers that discuss current goals and implementation of monitoring programs. These papers were derived from presentations at a symposium at The Wildlife Society's 13th Annual Conference in Anchorage, Alaska, USA, 2006.
Dave, Vivek S; Gupta, Deepak; Yu, Monica; Nguyen, Phuong; Varghese Gupta, Sheeba
2017-02-01
The Biopharmaceutics Classification System (BCS) classifies pharmaceutical compounds based on their aqueous solubility and intestinal permeability. The BCS Class III compounds are hydrophilic molecules (high aqueous solubility) with low permeability across the biological membranes. While these compounds are pharmacologically effective, poor absorption due to low permeability becomes the rate-limiting step in achieving adequate bioavailability. Several approaches have been explored and utilized for improving the permeability profiles of these compounds. The approaches include traditional methods such as prodrugs, permeation enhancers, ion-pairing, etc., as well as relatively modern approaches such as nanoencapsulation and nanosizing. The most recent approaches include a combination/hybridization of one or more traditional approaches to improve drug permeability. While some of these approaches have been extremely successful, i.e., drug products utilizing the approach have progressed through USFDA approval for marketing, others require further investigation to be applicable. This article discusses the commonly studied approaches for improving the permeability of BCS Class III compounds.
Seim, Andreas R; Sandberg, Warren S
2010-12-01
To review the current state of anesthesiology for operative and invasive procedures, with an eye toward possible future states. Anesthesiology is at once a mature specialty and one in crisis, requiring a breakthrough to move forward. The cost of care now approaches reimbursement, and outcomes as commonly measured approach perfection. Thus, the cost of further improvements seems ready to topple the field, just as the specialty is realizing that seemingly innocuous anesthetic choices have long-term consequences, and better practice is required. Anesthesiologists must create more headroom between costs and revenues in order to sustain the academic vigor and creativity required to create better clinical practice. We outline three areas in which technological and organizational innovation in anesthesiology can improve competitiveness and become a driving force in collaborative efforts to develop the operating rooms and perioperative systems of the future: increasing the profitability of operating rooms; increasing the efficiency of anesthesia; and technological and organizational innovation to foster improved patient flow, communication, coordination, and organizational learning.
NASA Technical Reports Server (NTRS)
Schoeberl, Mark; Rychekewkitsch, Michael; Andrucyk, Dennis; McConaughy, Gail; Meeson, Blanche; Hildebrand, Peter; Einaudi, Franco (Technical Monitor)
2000-01-01
NASA's Earth Science Enterprise's long-range vision is to enable the development of a national proactive environmental predictive capability through targeted scientific research and technological innovation. Proactive environmental prediction means the prediction of environmental events and their secondary consequences. These consequences range from disasters and disease outbreaks to improved food production and reduced transportation, energy, and insurance costs. The economic advantage of this predictive capability will greatly outweigh the cost of development. Developing this predictive capability requires a greatly improved understanding of the earth system and the interaction of the various components of that system. It also requires a change in our approach to gathering data about the earth and a change in our current methodology for processing that data, including its delivery to customers. And, most importantly, it requires a renewed partnership between NASA and its sister agencies. We identify six application themes that summarize the potential of proactive environmental prediction. We also identify four technology themes that articulate our approach to implementing proactive environmental prediction.
Software as a service approach to sensor simulation software deployment
NASA Astrophysics Data System (ADS)
Webster, Steven; Miller, Gordon; Mayott, Gregory
2012-05-01
Traditionally, military simulation has been problem-domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations that require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS), predicated on the virtualization of Night Vision Electronic Sensors (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, the enabled and managed system of simulations yields durable SaaS delivery without requiring simulation expertise from users. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and allow the domain community to benefit from immediate deployment of lessons learned.
Internal NASA Study: NASA's Protoflight Research Initiative
NASA Technical Reports Server (NTRS)
Coan, Mary R.; Hirshorn, Steven R.; Moreland, Robert
2015-01-01
The NASA Protoflight Research Initiative is an internal NASA study conducted within the Office of the Chief Engineer to better understand the use of Protoflight within NASA. Extensive literature reviews and interviews with key NASA members with experience in both robotic and human spaceflight missions have resulted in three main conclusions and two observations. The first conclusion is that NASA's Protoflight method is not considered to be "prescriptive." The current policies and guidance allow each Program/Project to tailor the Protoflight approach to better meet its needs, goals, and objectives. Second, Risk Management plays a key role in implementation of the Protoflight approach. Any deviations from full qualification will be based on the level of acceptable risk, with guidance found in NPR 8705.4. Finally, over the past decade (2004-2014) only 6% of NASA's Protoflight missions and 6% of NASA's full qualification missions experienced a publicly disclosed mission failure. In other words, the data indicate that the Protoflight approach, in and of itself, does not increase the mission risk of in-flight failure. The first observation is that it would be beneficial to document the decision-making process on the implementation and use of Protoflight. The second observation is that if a Program/Project chooses to use the Protoflight approach with relevant heritage, it is extremely important that the Program/Project Manager ensure that the current project's requirements fall within the heritage design, component, instrument, and/or subsystem's requirements for both the planned and operational use, and that the documentation of the relevant heritage is comprehensive and sufficient and the decision well documented.
To further inform this study, it is recommended that a deep dive be performed into 30 missions with accessible data on their testing/verification methodology and decision process, in order to research the differences between Protoflight and full qualification missions' design requirements and Verification & Validation (V&V) (without any impact on, or special requests directly to, the projects).
NASA Astrophysics Data System (ADS)
Song, Young-Joo; Bae, Jonghee; Kim, Young-Rok; Kim, Bang-Yeop
2016-12-01
In this study, the uncertainty requirements for orbit, attitude, and burn performance were estimated and analyzed for the execution of the 1st lunar orbit insertion (LOI) maneuver of the Korea Pathfinder Lunar Orbiter (KPLO) mission. During the early design phase of the system, this associated analysis is an essential design factor, as the 1st LOI maneuver is the largest burn that utilizes the onboard propulsion system; the success of the lunar capture is directly affected by the performance achieved. For the analysis, the spacecraft is assumed to have already approached the periselene on a hyperbolic arrival trajectory around the moon. In addition, diverse arrival conditions and mission constraints were considered, such as varying periselene approach velocity, altitude, and orbital period of the capture orbit after execution of the 1st LOI maneuver. The current analysis assumed an impulsive LOI maneuver, and two-body equations of motion were adopted to simplify the problem for a preliminary analysis. Monte Carlo simulations were performed for the statistical analysis of the diverse uncertainties that might arise at the moment the maneuver is executed. As a result, three major requirements were analyzed and estimated for the early design phase. First, the minimum requirements were estimated for the burn performance needed to be captured around the moon. Second, the requirements for orbit, attitude, and maneuver burn performance were simultaneously estimated and analyzed to maintain the 1st elliptical orbit achieved around the moon within the specified orbital period. Finally, the dispersion requirements on the B-plane aiming at target points to meet the target insertion goal were analyzed and can be utilized as reference target guidelines for a mid-course correction (MCC) maneuver during the transfer.
More detailed system requirements for the KPLO mission, particularly for the spacecraft bus itself and for the flight dynamics subsystem at the ground control center, are expected to be prepared and established based on the current results, including a contingency trajectory design plan.
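The capture criterion behind such a Monte Carlo analysis can be sketched with two-body energy alone: after an impulsive retrograde burn at periselene, capture requires negative specific orbital energy. A minimal sketch follows; the arrival speed, burn magnitude, and dispersion values are illustrative assumptions, not KPLO figures:

```python
import random

MU_MOON = 4902.8  # km^3/s^2, lunar gravitational parameter

def capture_fraction(r_peri_km, v_arrival, dv_nominal, dv_sigma, n=100_000):
    """Monte Carlo estimate of capture probability for an impulsive LOI burn.

    Capture requires the post-burn specific orbital energy to be negative:
        eps = v^2 / 2 - mu / r < 0
    Burn-magnitude errors are drawn from a Gaussian (illustrative model only).
    """
    captured = 0
    for _ in range(n):
        dv = random.gauss(dv_nominal, dv_sigma)   # executed burn, km/s
        v_after = v_arrival - dv                  # retrograde burn at periselene
        eps = v_after**2 / 2 - MU_MOON / r_peri_km
        if eps < 0:
            captured += 1
    return captured / n
```

For example, at a 100 km periselene altitude (r = 1837.4 km) with a 2.45 km/s arrival speed, a 0.20 km/s nominal burn with 0.01 km/s dispersion captures essentially always, while a 0.05 km/s burn essentially never does.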
But How Do We Learn? Talking to Vietnamese Children about How They Learn in and out of School
ERIC Educational Resources Information Center
Phelps, Renata; Nhung, Ha Thi Tuyet; Graham, Anne; Geeves, Richard
2012-01-01
Vietnam is currently striving to introduce more child-centred approaches to pedagogy. From a Western perspective, child-centred education requires teachers to perceive children as capable, active partners in learning and to develop deep understandings of their students, including the variety of ways in which they learn. This paper draws from a…
ERIC Educational Resources Information Center
Packard, Richard D.; Dereshiwsky, Mary I.
Despite current interest in the concept of the "New American School" model discussed in "America 2000," school systems continue to approach educational reform and restructuring by tinkering with key organizational components in isolation. The total school organization requires assessment and profiling to determine which key components are drags…
Francisco Rodríguez y Silva; Armando González-Cabán
2013-01-01
The abandonment of land, the high energy load generated and accumulated by vegetation covers, climate change and interface scenarios in Mediterranean forest ecosystems are demanding serious attention to forest fire conditions. This is particularly true when dealing with the budget requirements for undertaking protection programs related to the state of current and...
ERIC Educational Resources Information Center
Rusk, Kara
2012-01-01
For teachers who work with students with severe disabilities, it is a challenge to find ways to incorporate the core content and academic standards with the functional skills that the student will need to be independent as he or she transitions into adulthood. "Current federal requirements challenge educators to bring about achievement of a…
A retrospective perspective: evaluating population changes by repeating historic bird surveys
Lawrence D. Igl; Douglas H. Johnson
2005-01-01
Acquiring an accurate picture of the changes in bird populations often involves a trade-off between the time and effort required to complete the surveys and the number of years spent surveying the bird populations. An alternative approach to long-term monitoring efforts is to collect current data and contrast those with data collected earlier in a similar fashion on...
Rolling circle amplification of metazoan mitochondrial genomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simison, W. Brian; Lindberg, D.R.; Boore, J.L.
2005-07-31
Here we report the successful use of rolling circle amplification (RCA) for the amplification of complete metazoan mitochondrial (mt) genomes to make a product that is amenable to high-throughput genome sequencing techniques. The benefits of RCA over PCR are many, and with further development and refinement of RCA, the sequencing of organellar genomes will require far less time and effort than current long-PCR approaches.
ERIC Educational Resources Information Center
Elmelegy, Reda Ibrahim
2015-01-01
The current research aims at clarifying how school-based management (SBM) can contribute to achieve the decision-making quality in Egyptian general secondary schools and determine the requirements of quality decision-making. It depends on the descriptive method in order to acknowledge the basics of the SBM and its relationship with the quality of…
ERIC Educational Resources Information Center
Sieglová, Dagmar; Stejskalová, Lenka; Kocurová-Giurgiu, Ioana
2017-01-01
The job expectations and requirements of the information age bring with them a need for a change in teaching and studying. A quantitative approach to working with information and a frontal style of teaching, still a wide practice in many institutions, no longer seem to be suitable preparation for current students' needs. One of the areas affected…
NASA Programs in Space Photovoltaics
NASA Technical Reports Server (NTRS)
Flood, Dennis J.
1992-01-01
Highlighted here are some of the current programs in advanced space solar cell and array development conducted by NASA in support of its future mission requirements. Recent developments are presented for a variety of solar cell types, including both single crystal and thin film cells. A brief description of an advanced concentrator array capable of AM0 efficiencies approaching 25 percent is also provided.
ERIC Educational Resources Information Center
Muzoora, Michael; Terry, Daniel R.; Asiimwe, Agatha A.
2014-01-01
This paper highlights the challenges of current language policies in education in Africa, with reference to Uganda. Also examined are the likely challenges to language policy in education, while indicating how these challenges can be curtailed or overcome. The authors suggest a different view is required when approaching this topic with a paradigm…
ERIC Educational Resources Information Center
LaFlair, Geoffrey T.; Staples, Shelley
2017-01-01
Investigations of the validity of a number of high-stakes language assessments are conducted using an argument-based approach, which requires evidence for inferences that are critical to score interpretation (Chapelle, Enright, & Jamieson, 2008b; Kane, 2013). The current study investigates the extrapolation inference for a high-stakes test of…
The vaccines consistency approach project: an EPAA initiative.
De Mattia, F; Hendriksen, C; Buchheit, K H; Chapsal, J M; Halder, M; Lambrigts, D; Redhead, K; Rommel, E; Scharton-Kersten, T; Sesardic, T; Viviani, L; Ragan, I
2015-01-01
The consistency approach for release testing of established vaccines promotes the use of in vitro, analytical, non-animal based systems allowing the monitoring of quality parameters during the whole production process. By using highly sensitive non-animal methods, the consistency approach has the potential to improve the quality of testing and to foster the 3Rs (replacement, refinement and reduction of animal use) for quality control of established vaccines. This concept offers an alternative to the current quality control strategy which often requires large numbers of laboratory animals. In order to facilitate the introduction of the consistency approach for established human and veterinary vaccine quality control, the European Partnership for Alternatives to Animal Testing (EPAA) initiated a project, the "Vaccines Consistency Approach Project", aiming at developing and validating the consistency approach with stakeholders from academia, regulators, OMCLs, EDQM, European Commission and industry. This report summarises progress since the project's inception.
The State of Space Propulsion Research
NASA Technical Reports Server (NTRS)
Sackheim, R. L.; Cole, J. W.; Litchford, R. J.
2006-01-01
The current state of space propulsion research is assessed from both a historical perspective, spanning the decades since Apollo, and a forward-looking perspective, as defined by the enabling technologies required for a meaningful and sustainable human and robotic exploration program over the forthcoming decades. Previous research and technology investment approaches are examined and a course of action suggested for obtaining a more balanced portfolio of basic and applied research. The central recommendation is the establishment of a robust national Space Propulsion Research Initiative that would run parallel with systems development and include basic research activities. The basic framework and technical approach for this proposed initiative are defined and a potential implementation approach is recommended.
Perception system and functions for autonomous navigation in a natural environment
NASA Technical Reports Server (NTRS)
Chatila, Raja; Devy, Michel; Lacroix, Simon; Herrb, Matthieu
1994-01-01
This paper presents the approach, algorithms, and processes we developed for the perception system of a cross-country autonomous robot. After a presentation of the tele-programming context we favor for intervention robots, we introduce an adaptive navigation approach, well suited to the characteristics of complex natural environments. This approach led us to develop a heterogeneous perception system that manages several different terrain representations. The perception functionalities required during navigation are listed, along with the corresponding representations we consider. The main perception processes we developed are presented. They are integrated within an on-board control architecture we developed. First results of an ambitious experiment currently underway at LAAS are then presented.
A sustainable system of systems approach: a new HFE paradigm.
Thatcher, Andrew; Yeow, Paul H P
2016-01-01
Sustainability issues such as natural resource depletion, pollution and poor working conditions have no geographical boundaries in our interconnected world. To address these issues requires a paradigm shift within human factors and ergonomics (HFE), to think beyond a bounded, linear model understanding towards a broader systems framework. For this reason, we introduce a sustainable system of systems model that integrates the current hierarchical conceptualisation of possible interventions (i.e., micro-, meso- and macro-ergonomics) with important concepts from the sustainability literature, including the triple bottom line approach and the notion of time frames. Two practical examples from the HFE literature are presented to illustrate the model. The implications of this paradigm shift for HFE researchers and practitioners are discussed and include the long-term sustainability of the HFE community and comprehensive solutions to problems that consider the emergent issues that arise from this interconnected world. A sustainable world requires a broader systems thinking than that which currently exists in ergonomics. This study proposes a sustainable system of systems model that incorporates ideas from the ecological sciences, notably a nested hierarchy of systems and a hierarchical time dimension. The implications for sustainable design and the sustainability of the HFE community are considered.
Esophageal tissue engineering: an in-depth review on scaffold design.
Tan, J Y; Chua, C K; Leong, K F; Chian, K S; Leong, W S; Tan, L P
2012-01-01
Treatment of esophageal cancer often requires surgical procedures that involve removal of the affected segment. The current approaches to restore esophageal continuity, however, are known to have limitations and may not result in full functional recovery. In theory, using a tissue-engineered esophagus developed from the patient's own cells to replace the removed esophageal segment would be the ideal method of reconstruction. One of the key elements involved in the tissue engineering process is the scaffold, which acts as a template for organization of cells and tissue development. While a number of scaffolds, ranging from traditional non-biodegradable tubing to bioactive decellularized matrix, have been proposed for engineering the esophagus in the past decade, results are still not favorable, and many challenges relating to tissue quality need to be met before improvements are realized. The success of new esophageal tissue formation will ultimately depend on the scaffold being able to meet the essential requirements specific to the esophageal tissue. Here, the design of the scaffold and its fabrication approaches are reviewed. In this paper, we review the current state of development in bioengineering the esophagus, with particular emphasis on scaffold design. Copyright © 2011 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Tonitto, C.; Gurwick, N. P.
2012-12-01
Policy initiatives to reduce greenhouse gas emissions (GHG) have promoted the development of agricultural management protocols to increase SOC storage and reduce GHG emissions. We review approaches for quantifying N2O flux from agricultural landscapes. We summarize the temporal and spatial extent of observations across representative soil classes, climate zones, cropping systems, and management scenarios. We review applications of simulation and empirical modeling approaches and compare validation outcomes across modeling tools. Subsequently, we review current model application in agricultural management protocols. In particular, we compare approaches adapted for compliance with the California Global Warming Solutions Act, the Alberta Climate Change and Emissions Management Act, and by the American Carbon Registry. In the absence of regional data to drive model development, policies that require GHG quantification often use simple empirical models based on highly aggregated data of N2O flux as a function of applied N - Tier 1 models according to IPCC categorization. As participants in development of protocols that could be used in carbon offset markets, we observed that stakeholders outside of the biogeochemistry community favored outcomes from simulation modeling (Tier 3) rather than empirical modeling (Tier 2). In contrast, scientific advisors were more accepting of outcomes based on statistical approaches that rely on local observations, and their views sometimes swayed policy practitioners over the course of policy development. Both Tier 2 and Tier 3 approaches have been implemented in current policy development, and it is important that the strengths and limitations of both approaches, in the face of available data, be well-understood by those drafting and adopting policies and protocols. The reliability of all models is contingent on sufficient observations for model development and validation. 
Simulation models applied without site calibration generally result in poor validation results, and this point particularly needs to be emphasized during policy development. For cases where sufficient calibration data are available, simulation models have demonstrated the ability to capture seasonal patterns of N2O flux. The reliability of statistical models likewise depends on data availability. Because soil moisture is a significant driver of N2O flux, the best outcomes occur when empirical models are applied to systems with relevant soil classification and climate. The structure of current carbon offset protocols is not well aligned with a budgetary approach to GHG accounting. Current protocols credit field-scale reductions in N2O flux resulting from reduced fertilizer use, but do not award farmers credit for reductions in CO2 emissions resulting from reduced production of synthetic N fertilizer. Achieving the greatest GHG emission reductions through reduced synthetic N production and reduced landscape N saturation requires a re-envisioning of the agricultural landscape to include cropping systems with legume and manure N sources. The current focus on on-farm GHG sources concentrates credits on simple reductions of N applied in conventional systems rather than on developing cropping systems that promote higher recycling and retention of N.
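For reference, the Tier 1 calculation that the abstract contrasts with Tier 2/3 modeling reduces to a single default emission factor applied to N inputs. A minimal sketch, assuming the IPCC 2006 default EF1 = 0.01 kg N2O-N per kg N applied for direct soil emissions (specific protocols and later refinements may use different factors):

```python
# IPCC (2006) Tier 1 default for direct soil emissions from managed soils
EF1 = 0.01               # kg N2O-N emitted per kg N applied
N2O_N_TO_N2O = 44.0 / 28.0  # molecular-weight conversion from N2O-N to N2O

def direct_n2o_tier1(n_applied_kg):
    """Direct soil N2O emissions (kg N2O) from applied N, Tier 1 approach."""
    return n_applied_kg * EF1 * N2O_N_TO_N2O

# Applying 100 kg of fertilizer N yields about 1.57 kg N2O under Tier 1
print(round(direct_n2o_tier1(100.0), 3))
```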
Stewart, J; Breslin, W J; Beyer, B K; Chadwick, K; De Schaepdrijver, L; Desai, M; Enright, B; Foster, W; Hui, J Y; Moffat, G J; Tornesi, B; Van Malderen, K; Wiesner, L; Chen, C L
2016-03-01
The Health and Environmental Sciences Institute (HESI) Developmental and Reproductive Toxicology Technical Committee sponsored a pharmaceutical industry survey on current industry practices for contraception use during clinical trials. The objectives of the survey were to improve our understanding of the current industry practices for contraception requirements in clinical trials, the governance processes set up to promote consistency and/or compliance with contraception requirements, and the effectiveness of current contraception practices in preventing pregnancies during clinical trials. Opportunities for improvements in current practices were also considered. The survey results from 12 pharmaceutical companies identified significant variability among companies with regard to contraception practices and governance during clinical trials. This variability was due primarily to differences in definitions, areas of scientific uncertainty or misunderstanding, and differences in company approaches to enrollment in clinical trials. The survey also revealed that few companies collected data in a manner that would allow a retrospective understanding of the reasons for failure of birth control during clinical trials. In this article, suggestions are made for topics where regulatory guidance or scientific publications could facilitate best practice. These include provisions for a pragmatic definition of women of childbearing potential, guidance on how animal data can influence the requirements for male and female birth control, evidence-based guidance on birth control and pregnancy testing regimes suitable for low- and high-risk situations, plus practical methods to ascertain the risk of drug-drug interactions with hormonal contraceptives.
Stratified Diffractive Optic Approach for Creating High Efficiency Gratings
NASA Technical Reports Server (NTRS)
Chambers, Diana M.; Nordin, Gregory P.
1998-01-01
Gratings with high efficiency in a single diffracted order can be realized with both volume holographic and diffractive optical elements. However, each method has limitations that restrict the applications in which they can be used. For example, high efficiency volume holographic gratings require an appropriate combination of thickness and permittivity modulation throughout the bulk of the material. Possible combinations of those two characteristics are limited by properties of currently available materials, thus restricting the range of applications for volume holographic gratings. Efficiency of a diffractive optic grating is dependent on its approximation of an ideal analog profile using discrete features. The size of constituent features and, consequently, the number that can be used within a required grating period restricts the applications in which diffractive optic gratings can be used. These limitations imply that there are applications which cannot be addressed by either technology. In this paper we propose to address a number of applications in this category with a new method of creating high efficiency gratings which we call stratified diffractive optic gratings. In this approach diffractive optic techniques are used to create an optical structure that emulates volume grating behavior. To illustrate the stratified diffractive optic grating concept we consider a specific application, a scanner for a space-based coherent wind lidar, with requirements that would be difficult to meet by either volume holographic or diffractive optic methods. The lidar instrument design specifies a transmissive scanner element with the input beam normally incident and the exiting beam deflected at a fixed angle from the optical axis. The element will be rotated about the optical axis to produce a conical scan pattern. The wavelength of the incident beam is 2.06 microns and the required deflection angle is 30 degrees, implying a grating period of approximately 4 microns. 
Creating a high efficiency volume grating with these parameters would require a grating thickness that cannot be attained with current photosensitive materials. For a diffractive optic grating, the number of binary steps necessary to produce high efficiency combined with the grating period requires feature sizes and alignment tolerances that are also unattainable with current techniques. Rotation of the grating and integration into a space-based lidar system impose the additional requirements that it be insensitive to polarization orientation, that its mass be minimized and that it be able to withstand launch and space environments.
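The quoted period follows directly from the grating equation at normal incidence, d * sin(theta) = m * lambda; a quick check of the abstract's numbers (first diffraction order assumed):

```python
import math

def first_order_period_um(wavelength_um, deflection_deg):
    """Grating period producing first-order diffraction at the stated angle
    for normal incidence: d * sin(theta) = m * lambda, with m = 1."""
    return wavelength_um / math.sin(math.radians(deflection_deg))

# A 2.06 micron beam deflected 30 degrees implies a period of about 4 microns
print(round(first_order_period_um(2.06, 30.0), 2))  # -> 4.12
```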
Computational design optimization for microfluidic magnetophoresis
Plouffe, Brian D.; Lewis, Laura H.; Murthy, Shashi K.
2011-01-01
Current macro- and microfluidic approaches for the isolation of mammalian cells are limited in both efficiency and purity. In order to design a robust platform for the enumeration of a target cell population, high collection efficiencies are required. Additionally, the ability to isolate pure populations with minimal biological perturbation and efficient off-chip recovery will enable subcellular analyses of these cells for applications in personalized medicine. Here, a rational design approach for a simple and efficient device that isolates target cell populations via magnetic tagging is presented. In this work, two magnetophoretic microfluidic device designs are described, with optimized dimensions and operating conditions determined from a force balance equation that considers two dominant and opposing driving forces exerted on a magnetic-particle-tagged cell, namely, magnetic and viscous drag. Quantitative design criteria for an electromagnetic field displacement-based approach are presented, wherein target cells labeled with commercial magnetic microparticles flowing in a central sample stream are shifted laterally into a collection stream. Furthermore, the final device design is constrained to fit on a standard rectangular glass coverslip (60 (L) × 24 (W) × 0.15 (H) mm) to accommodate small sample volumes and point-of-care design considerations. The anticipated performance of the device is examined via a parametric analysis of several key variables within the model. It is observed that minimal currents (<500 mA) are required to generate magnetic fields sufficient to separate cells from sample streams flowing at rates as high as 7 ml/h, comparable to the performance of state-of-the-art magnet-activated cell sorting systems currently used in clinical settings.
Experimental validation of the presented model illustrates that a device designed according to the derived rational optimization can effectively isolate (∼100%) a magnetic-particle-tagged cell population from a homogeneous suspension, even at low abundance. Overall, this design analysis provides a rational basis for selecting the operating conditions, including chamber and wire geometry, flow rates, and applied currents, for a magnetic-microfluidic cell separation device. PMID:21526007
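The force balance underpinning such designs equates the magnetic force with Stokes drag, giving a terminal lateral drift velocity. A minimal sketch; the picoNewton-scale force and cell radius below are illustrative assumptions, not values from the study:

```python
import math

def lateral_drift_velocity(f_mag_newtons, viscosity_pa_s, radius_m):
    """Terminal lateral velocity of a magnetically tagged cell, from the
    balance of magnetic force against Stokes drag: F_mag = 6*pi*eta*r*v."""
    return f_mag_newtons / (6 * math.pi * viscosity_pa_s * radius_m)

# Illustrative: 1 pN force on a 5-micron-radius cell in water (eta ~ 1e-3 Pa*s)
# gives a drift of roughly 10 microns per second
v = lateral_drift_velocity(1e-12, 1e-3, 5e-6)
print(f"{v * 1e6:.1f} um/s")
```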
Wind Sensing, Analysis, and Modeling
NASA Technical Reports Server (NTRS)
Corvin, Michael A.
1995-01-01
The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed in, and demonstrate the use of, prototype tools implementing an environment in which to realize a unified system. At the convenience of the customer, due to resource limitations, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.
A requirements engineering approach for improving the quality of diabetes education websites.
Shabestari, Omid; Roudsari, Abdul
2011-01-01
Diabetes Mellitus is a major chronic disease with multi-organ involvement and high-cost complications. Although it has been shown that structured education can reduce the risk of developing these complications, there is considerable room for improvement in the educational services for these patients, and e-learning can be a good solution to fill this gap. Most current e-learning solutions for diabetes were designed by computer experts and healthcare professionals, but the patients, as end-users of these systems, have not been deeply involved in the design process. Considering the expectations of the patients, this article investigates a requirements engineering process comparing the level of importance given to different attributes of e-learning by patients and healthcare professionals. The results of this comparison can be used to improve currently deployed online diabetes education systems.
Management Approach for NASA's Earth Venture-1 (EV-1) Airborne Science Investigations
NASA Technical Reports Server (NTRS)
Guillory, Anthony R.; Denkins, Todd C.; Allen, B. Danette
2013-01-01
The Earth System Science Pathfinder (ESSP) Program Office (PO) is responsible for programmatic management of the National Aeronautics and Space Administration's (NASA) Science Mission Directorate's (SMD) Earth Venture (EV) missions. EV is composed of both orbital and suborbital Earth science missions. The first of the Earth Venture missions is EV-1: Principal Investigator-led, temporally sustained, suborbital (airborne) science investigations cost-capped at $30M each over five years. Traditional orbital procedures, processes and standards used to manage previous ESSP missions, while effective, are disproportionately comprehensive for suborbital missions. Conversely, existing airborne practices are primarily intended for smaller, temporally shorter investigations, and are traditionally managed directly by a program scientist rather than by a program office such as ESSP. In 2010, ESSP crafted a management approach for the successful implementation of the EV-1 missions within the constructs of current governance models. NASA Research and Technology Program and Project Management Requirements form the foundation of the approach for EV-1. Additionally, requirements from other existing NASA Procedural Requirements (NPRs), systems engineering guidance and management handbooks were adapted to manage programmatic, technical, schedule and cost elements and risk. As the EV-1 missions near the end of their successful execution and project life cycle, and as the submission deadline for the next mission proposals approaches, the ESSP PO has taken the lessons learned and updated the programmatic management approach for all future Earth Venture Suborbital (EVS) missions, yielding an even more flexible and streamlined management approach.
The Digital Twin Paradigm for Future NASA and U.S. Air Force Vehicles
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Stargel, D. S.
2012-01-01
Future generations of NASA and U.S. Air Force vehicles will require lighter mass while being subjected to higher loads and more extreme service conditions over longer time periods than the present generation. Current approaches for certification, fleet management and sustainment are largely based on statistical distributions of material properties, heuristic design philosophies, physical testing and assumed similitude between testing and operational conditions, and will likely be unable to address these extreme requirements. To address the shortcomings of conventional approaches, a fundamental paradigm shift is needed. This paradigm shift, the Digital Twin, integrates ultra-high fidelity simulation with the vehicle's on-board integrated vehicle health management system, maintenance history and all available historical and fleet data to mirror the life of its flying twin and enable unprecedented levels of safety and reliability.
Unifying Human Centered Design and Systems Engineering for Human Systems Integration
NASA Technical Reports Server (NTRS)
Boy, Guy A.; McGovernNarkevicius, Jennifer
2013-01-01
Despite the holistic approach of systems engineering (SE), systems still fail, and sometimes spectacularly. Requirements, solutions and the world constantly evolve and are very difficult to keep current. SE requires more flexibility, and new approaches to SE have to be developed that include creativity as an integral part and that appropriately allocate the functions of people and technology within our highly interconnected, complex organizations. Instead of disregarding complexity because it is too difficult to handle, we should take advantage of it, discovering the behavioral attractors and emergent properties that it generates. Human-centered design (HCD) provides the creativity factor that SE lacks. It promotes modeling and simulation from the early stages of design and throughout the life cycle of a product. Unifying HCD and SE will shape appropriate human-systems integration (HSI) and produce successful systems.
Technique for Very High Order Nonlinear Simulation and Validation
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.
2001-01-01
Finding the sources of sound in large nonlinear fields via direct simulation currently requires excessive computational cost. This paper describes a simple technique for efficiently solving the multidimensional nonlinear Euler equations that significantly reduces this cost and demonstrates a useful approach for validating high-order nonlinear methods. Methods of up to 15th-order accuracy in space and time were compared, and it is shown that an algorithm with a fixed design accuracy approaches its maximal utility and then its usefulness decays exponentially unless higher accuracy is used. It is concluded that at least a 7th-order method is required to efficiently propagate a harmonic wave using the nonlinear Euler equations to a distance of 5 wavelengths while maintaining an overall error tolerance that is low enough to capture both the mean flow and the acoustics.
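The design accuracy discussed above is conventionally verified through grid refinement: if the error behaves as err ∼ C·h^p, halving the grid spacing reveals the observed order p. As a generic illustration of that standard check (not code from the report itself, and with hypothetical error values):

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Estimate the observed order of accuracy p from errors measured on
    two grids related by a uniform refinement factor:
    err ~ C * h**p  implies  p = log(err_coarse/err_fine) / log(refinement)."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

# A scheme whose error drops by a factor of 2**7 = 128 when the grid
# spacing is halved is behaving as a 7th-order method, the minimum order
# the paper finds necessary for efficient long-distance wave propagation.
p = observed_order(1.0e-2, 1.0e-2 / 128.0)
```

The same two-grid formula applies to refinement in time step as well as in grid spacing, which is why schemes of matched spatial and temporal order are compared in the paper.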
McCormick, Stephen; Romero, L. Michael
2017-01-01
Endocrinologists can make significant contributions to conservation biology by helping to understand the mechanisms by which organisms cope with changing environments. Field endocrine techniques have advanced rapidly in recent years and can provide substantial information on the growth, stress, and reproductive status of individual animals, thereby providing insight into current and future responses of populations to changes in the environment. Environmental stressors and reproductive status can be detected nonlethally by measuring a number of endocrine-related endpoints, including steroids in plasma, living and nonliving tissue, urine, and feces. Information on the environmental or endocrine requirements of individual species for normal growth, development, and reproduction will provide critical information for species and ecosystem conservation. For many taxa, basic information on endocrinology is lacking, and advances in conservation endocrinology will require approaches that are both “basic” and “applied” and include integration of laboratory and field approaches.
High-resolution wavefront control of high-power laser systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brase, J; Brown, C; Carrano, C
1999-07-08
Nearly every new large-scale laser system application at LLNL has requirements for beam control which exceed the current level of available technology. For applications such as inertial confinement fusion, laser isotope separation, and laser machining, the ability to transport significant power to a target while maintaining good beam quality is critical. There are many ways that laser wavefront quality can be degraded. Thermal effects due to the interaction of high-power laser or pump light with the internal optical components or with the ambient gas are common causes of wavefront degradation. For many years, adaptive optics based on thin deformable glass mirrors with piezoelectric or electrostrictive actuators have been used to remove the low-order wavefront errors from high-power laser systems. These adaptive optics systems have successfully improved laser beam quality, but have also generally revealed additional high-spatial-frequency errors, both because the low-order errors have been reduced and because deformable mirrors have often introduced some high-spatial-frequency components due to manufacturing errors. Many current and emerging laser applications fall into the high-resolution category, where there is an increased need for the correction of high-spatial-frequency aberrations, which requires correctors with thousands of degrees of freedom. The largest deformable mirrors currently available have fewer than one thousand degrees of freedom at a cost of approximately $1M, so a deformable mirror capable of meeting these high-spatial-resolution requirements would be cost prohibitive. Therefore a new approach using a different wavefront control technology is needed. One new wavefront control approach is the use of liquid-crystal (LC) spatial light modulator (SLM) technology for controlling the phase of linearly polarized light. Current LC SLM technology provides high-spatial-resolution wavefront control, with hundreds of thousands of degrees of freedom, more than two orders of magnitude greater than the best deformable mirrors currently made. Even with the increased spatial resolution, the cost of these devices is nearly two orders of magnitude less than the cost of the largest deformable mirror.
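The basic drive law for a phase-only corrector like an LC SLM can be sketched in a few lines. This is a generic illustration of phase-conjugate correction, not the LLNL implementation: the commanded phase is the negative of the measured aberration, wrapped into one 2π stroke, since such a modulator can only apply phase modulo one wave.

```python
import math

TWO_PI = 2.0 * math.pi

def slm_command(aberration_phase_rad):
    """Phase-conjugate command for a phase-only spatial light modulator:
    negate each measured aberration sample and wrap it into [0, 2*pi)."""
    return [(-phi) % TWO_PI for phi in aberration_phase_rad]

# Per-pixel commands for three sample aberration values (radians).
commands = slm_command([0.0, math.pi / 2, -math.pi])
```

In a real system the same wrapping step is applied per pixel across the full modulator array, which is where the hundreds of thousands of degrees of freedom come in.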
Greason, Kevin L; Pochettino, Alberto; Sandhu, Gurpreet S; King, Katherine S; Holmes, David R
2016-04-01
Transfemoral transcatheter aortic valve insertion may be performed in a catheterization laboratory (ie, the minimalist approach). It seems reasonable when considering this approach to avoid it in patients at risk for intraoperative morbidity that would require surgical intervention. We hypothesized that it would be possible to associate baseline characteristics with such morbidity, which would help heart teams select patients for the minimalist approach. We reviewed the records of 215 consecutive patients who underwent transfemoral transcatheter aortic valve insertion with a current commercially available device from November 2008 through July 2015. Demographic characteristics of the patients included a mean age of 78.9 ± 10.6 years, female sex in 73 patients (34.0%), and a mean Society of Thoracic Surgeons predicted risk of mortality of 8.7% ± 5.4%. Valve prostheses were balloon-expandable in 126 patients (58.6%) and self-expanding in 89 patients (41.4%). Significant intraoperative morbidity occurred in 22 patients (10.2%) and included major vascular injury in 12 patients (5.6%), hemodynamic compromise requiring cardiopulmonary bypass support in 4 patients (1.9%), cardiac tamponade requiring intervention in 3 patients (1.4%), ventricular valve embolization in 2 patients (0.9%), and inability to obtain percutaneous access requiring open vascular access in 1 patient (0.5%). Intraoperative morbidity was similarly distributed across all valve types (P = .556) and sheath sizes (P = .369). There were no baseline patient characteristics predictive of intraoperative morbidity. Patient and valve characteristics are not predictive of significant intraoperative morbidity during transfemoral transcatheter aortic valve insertion. The finding has implications for patient selection for the minimalist approach. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Towards new approaches in phenological modelling
NASA Astrophysics Data System (ADS)
Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas
2014-05-01
Modelling of phenological stages has been based on temperature sums for many decades, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. Now there is a growing body of opinion that asks for new methods in phenological modelling and more in-depth studies on dormancy release of woody plants. This requirement is easily understandable if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised against observations, although some basic physiological knowledge of the chilling and forcing requirement of plants is already considered in these approaches (semi-mechanistic models). Limiting for a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution for this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop this kind of model, changes of metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr) and high-resolution (weekly probes between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot study on sweet cherries. The suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
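The Reaumur-style temperature sum underlying these models can be sketched in a few lines. This is a generic growing degree-day accumulator with an assumed base temperature of 5 °C, not the authors' calibrated model:

```python
def growing_degree_days(daily_mean_temps_c, base_temp_c=5.0):
    """Forcing sum in the growing degree-day sense: accumulate each day's
    mean-temperature excess above a base threshold; days at or below the
    base contribute nothing."""
    return sum(max(t - base_temp_c, 0.0) for t in daily_mean_temps_c)

# A mild week: only the four days above 5 degC contribute to the sum.
gdd = growing_degree_days([3.0, 4.5, 6.0, 8.0, 10.5, 2.0, 7.5])
```

Semi-mechanistic phenology models combine such a forcing sum with an analogous chilling accumulation, and predict leafing or flowering once both accumulated requirements are met; the base temperature and the two thresholds are exactly the kind of parameters that, as the abstract notes, still have to be optimised against observations.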