Sample records for simplified process basically

  1. Simplify Web Development for Faculty and Promote Instructional Design.

    ERIC Educational Resources Information Center

    Pedersen, David C.

    Faculty members are often overwhelmed with the prospect of implementing Web-based instruction. In an effort to simplify the process and incorporate some basic instructional design elements, the Educational Technology Team at Embry Riddle Aeronautical University created a course template for WebCT. Utilizing rapid prototyping, the template…

  2. Simplifying the writing process for the novice writer.

    PubMed

    Redmond, Mary Connie

    2002-10-01

    Nurses take responsibility for reading information to update their professional knowledge and to meet relicensure requirements. However, nurses are less enthusiastic about writing for professional publication. This article explores the reluctance of nurses to write, the reasons why writing for publication is important to the nursing profession, the importance of mentoring to potential writers, and basic information about simplifying the writing process for novice writers. Copyright 2002 by American Society of PeriAnesthesia Nurses.

  3. LOW-COST PERSONNEL DOSIMETER.

    DTIC Science & Technology

    specification was achieved by simplifying and improving the basic Bendix dosimeter design, using plastics for component parts, minimizing direct labor, and making the instrument suitable for automated processing and assembly. (Author)

  4. A Game Simulation of Multilateral Trade for Classroom Use.

    ERIC Educational Resources Information Center

    Thompson, Gary L.; Carter, Ronald L.

    An alternative to existing methods for teaching elementary economic geography courses was developed in a game format to teach the basic process of trade through role playing. Simplifying the complexities of multilateral trade to a few basic decisions and acts, the cognitive objectives are to develop in the student: 1) an understanding of regional…

  5. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

    The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing them with an application that has the basic functionality PDs need, as well as a list of simplified resources, in the User Requirements Collection (URC) application. The planners maintained a mapping of the URC resources to the CPS resources. Manually converting PDs' science requirements from the simplified representation to the more complex CPS representation is time-consuming and tedious. The goal is to provide a software solution that allows the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.
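
    The mapping-and-conversion idea in this record can be sketched in a few lines. The resource names and weights below are invented for illustration; they are not actual POIC, URC, or CPS identifiers.

```python
# Hypothetical sketch of a planner-maintained table that expands each
# simplified URC resource request into the CPS constraints it stands for.
# All resource names and split weights are illustrative assumptions.
URC_TO_CPS = {
    "power_kw": [("EPS_CHANNEL_1A", 0.5), ("EPS_CHANNEL_4B", 0.5)],
    "crew_time_min": [("USOS_CREW", 1.0)],
}

def convert_requirement(resource, amount):
    """Expand one simplified requirement into weighted CPS constraints."""
    return [(cps_name, share * amount) for cps_name, share in URC_TO_CPS[resource]]
```

    With a table of this shape, export to CPS becomes a mechanical expansion instead of a manual translation step.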

  6. NLM's Medical Library Resource Improvement Grant for Consortia Development: a proposed outline to simplify the application process.

    PubMed

    Kabler, A W

    1980-01-01

    The National Library of Medicine's Resource Improvement Grant for Consortia is available to assist with developing hospital library consortia and to support the development of basic health information collections. In an effort to simplify the grant application process, this paper presents suggestions for writing the narrative section of the first budget-period application, using the outline in NLM's Application Instructions for Consortium Applicants. Suggestions for writing the narratives of the second budget-period application and the collection development application are also included.

  7. A Guide to Program Planning Vol. II.

    ERIC Educational Resources Information Center

    Allen, Earl, Sr.

    This booklet is a simplified guide for program planning and is intended to complement a somewhat lengthier companion booklet on program evaluation. It spells out in outline fashion the basic elements and steps involved in the planning process. Brief sections focus in turn on different phases of the planning process, including problem…

  8. An Image Understanding Environment for DARPA Supported Research and Applications, Second Annual Report

    DTIC Science & Technology

    1992-05-01

    relatively independent of the Basic Objects, Support Objects, GUI Access Objects, Displays, Display Mapping, Menus, Pixel Snapshot, Gizmos/Widgets...a user interactively or set from some gizmo/widget, or that a particular browser field is to be updated when some state occurs or a process completes...also want to distinguish tree graph browsers. 4.3.2 Simplified access to GUI objects: Gizmos and Widgets: The IUE should provide simplified

  9. United States Navy Contracting Officer Warranting Process

    DTIC Science & Technology

    2011-03-01

    by 30% or more of the respondents: Contract Law, Cost Analysis, Market Research, Contract Source Selection, Simplified Acquisition Procedures, and...that the majority of AOs found the following courses at least somewhat important: Contract Law, Cost Analysis, Market Research, Contract Source...the budget and appropriation cycle 4. Ethics and conduct standards 5. Basic contract laws and regulations 6. Socio-economic requirements in

  10. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

  11. Nonvolatile GaAs Random-Access Memory

    NASA Technical Reports Server (NTRS)

    Katti, Romney R.; Stadler, Henry L.; Wu, Jiin-Chuan

    1994-01-01

    Proposed random-access integrated-circuit electronic memory offers nonvolatile magnetic storage. Bits stored magnetically and read out with Hall-effect sensors. Advantages include short reading and writing times and high degree of immunity to both single-event upsets and permanent damage by ionizing radiation. Use of same basic material for both transistors and sensors simplifies fabrication process, with consequent benefits in increased yield and reduced cost.

  12. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the basic techniques used to process Landsat images with a digital computer, and of the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display, and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. Examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by classification of land use in Portland, Oregon. The VICAR image processing software system is described; it consists of a language translator that simplifies execution of image processing programs and provides a general-purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs.

  13. Simplified model to describe the dissociative recombination of linear polyatomic ions of astrophysical interest

    NASA Astrophysics Data System (ADS)

    Fonseca Dos Santos, Samantha; Douguet, Nicolas; Kokoouline, Viatcheslav; Orel, Ann

    2013-05-01

    We will present theoretical results on the dissociative recombination (DR) of the linear polyatomic ions HCNH+, HCO+, and N2H+. Besides their astrophysical importance, these ions share the characteristic that at low electron impact energies their DR proceeds via the indirect DR mechanism. We apply a general simplified model, successfully implemented to treat the DR process of the highly symmetric non-linear molecules H3+, CH3+, H3O+, and NH4+, to calculate cross sections and DR rates for these ions. The model is based on multichannel quantum defect theory and accounts for all the main ingredients of indirect DR. New perspectives on dissociative recombination of HCO+ will also be discussed, including the possible role of HOC+ in storage ring experimental results. This work is supported by the DOE Office of Basic Energy Science and the National Science Foundation, Grant Nos. PHY-11-60611 and PHY-10-68785.

  14. Prediction of plasma properties in mercury ion thrusters

    NASA Technical Reports Server (NTRS)

    Longhurst, G. R.

    1978-01-01

    A simplified theoretical model was developed which obtains to first order the plasma properties in the discharge chamber of a mercury ion thruster from basic thruster design and controllable operating parameters. The basic operation and design of ion thrusters is discussed, and the important processes which influence the plasma properties are described in terms of the design and control parameters. The conservation equations for mass, charge, and energy were applied to the ion production region, which was defined as the region of the discharge chamber having as its outer boundary the surface of revolution of the innermost field line to intersect the anode. Mass conservation and the equations describing the various processes involved with mass addition to and removal from the ion production region are satisfied by a Maxwellian electron density spatial distribution in that region.

  15. Computerized Dental Comparison: A Critical Review of Dental Coding and Ranking Algorithms Used in Victim Identification.

    PubMed

    Adams, Bradley J; Aschheim, Kenneth W

    2016-01-01

    Comparison of antemortem and postmortem dental records is a leading method of victim identification, especially for incidents involving a large number of decedents. This process may be expedited with computer software that provides a ranked list of best possible matches. This study compares the conventional coding and sorting algorithms most commonly used in the United States (WinID3) with a simplified coding format that uses an optimized sorting algorithm. The simplified system consists of seven basic codes and an optimized algorithm based largely on the percentage of matches. To perform this research, a large reference database of approximately 50,000 antemortem and postmortem records was created. For most disaster scenarios, the proposed simplified codes, paired with the optimized algorithm, performed better than WinID3, which uses more complex codes. The detailed coding system does show better performance with extremely large numbers of records and/or significant body fragmentation. © 2015 American Academy of Forensic Sciences.
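
    A percent-of-matches ranking of the kind this record describes can be sketched as follows. The tooth numbers and single-letter codes are invented stand-ins, not the paper's actual seven-code scheme.

```python
# Illustrative sketch: score each antemortem record by the percentage of
# comparable teeth whose codes agree with the postmortem record, then rank.
def match_score(postmortem, antemortem):
    """Percentage of teeth present in both records whose codes agree."""
    common = set(postmortem) & set(antemortem)
    if not common:
        return 0.0
    hits = sum(1 for tooth in common if postmortem[tooth] == antemortem[tooth])
    return 100.0 * hits / len(common)

def rank_candidates(pm, am_records):
    """Return antemortem record IDs sorted by descending match score."""
    return sorted(am_records, key=lambda rid: match_score(pm, am_records[rid]), reverse=True)
```

    A real system would add tie-breaking and handle explainable mismatches (e.g. a tooth extracted after the antemortem record); this shows only the core percentage-of-matches idea.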

  16. Gravity waves and instabilities in the lower and middle atmosphere

    NASA Technical Reports Server (NTRS)

    Klostermeyer, Juergen

    1989-01-01

    Some basic aspects of mesoscale and small-scale gravity waves and instability mechanisms are discussed. Internal gravity waves with wavelengths between ten and less than one kilometer and periods between several hours and several minutes appear to play a central role in atmospheric wavenumber and frequency spectra. Therefore, the author discusses the propagation of gravity waves in simplified atmospheric models. Their interaction with the wind as well as their mutual interaction and stability mechanisms based on these processes are discussed. Mesosphere stratosphere troposphere radar observations showing the relevant hydrodynamic processes are stressed.

  17. Atmospheric, Climatic, and Environmental Research

    NASA Technical Reports Server (NTRS)

    Broecker, Wallace S.; Gornitz, Vivien M.

    1994-01-01

    The climate and atmospheric modeling project involves analysis of basic climate processes, with special emphasis on studies of the atmospheric CO2 and H2O source/sink budgets and studies of the climatic role of CO2, trace gases, and aerosols. These studies are carried out based in part on use of simplified climate models and climate process models developed at GISS. The principal models currently employed are a variable-resolution 3-D general circulation model (GCM) and an associated "tracer" model which simulates the advection of trace constituents using the winds generated by the GCM.

  18. Geocoded data structures and their applications to Earth science investigations

    NASA Technical Reports Server (NTRS)

    Goldberg, M.

    1984-01-01

    A geocoded data structure is a means for digitally representing a geographically referenced map or image. The characteristics of representative cellular, linked, and hybrid geocoded data structures are reviewed. The data processing requirements of Earth science projects at the Goddard Space Flight Center and the basic tools of geographic data processing are described. Specific ways that new geocoded data structures can be used to adapt these tools to scientists' needs are presented. These include: expanding analysis and modeling capabilities; simplifying the merging of data sets from diverse sources; and saving computer storage space.

  19. Astrophysical Flows

    NASA Astrophysics Data System (ADS)

    Pringle, James E.; King, Andrew

    2003-07-01

    Almost all conventional matter in the Universe is fluid, and fluid dynamics plays a crucial role in astrophysics. This new graduate textbook provides a basic understanding of the fluid dynamical processes relevant to astrophysics. The mathematics used to describe these processes is simplified to bring out the underlying physics. The authors cover many topics, including wave propagation, shocks, spherical flows, stellar oscillations, the instabilities caused by effects such as magnetic fields, thermal driving, gravity, and shear flows, and the basic concepts of compressible fluid dynamics and magnetohydrodynamics. The authors are Directors of the UK Astrophysical Fluids Facility (UKAFF) at the University of Leicester, and editors of the Cambridge Astrophysics Series. This book has been developed from a course in astrophysical fluid dynamics taught at the University of Cambridge. It is suitable for graduate students in astrophysics, physics and applied mathematics, and requires only a basic familiarity with fluid dynamics.

    • Provides coverage of the fundamental fluid dynamical processes an astrophysical theorist needs to know
    • Introduces new mathematical theory and techniques in a straightforward manner
    • Includes end-of-chapter problems to illustrate the course and introduce additional ideas

  20. Simulated breeding with QU-GENE graphical user interface.

    PubMed

    Hathorn, Adrian; Chapman, Scott; Dieters, Mark

    2014-01-01

    Comparing the efficiencies of breeding methods with field experiments is a costly, long-term process. QU-GENE is a highly flexible genetic and breeding simulation platform capable of simulating the performance of a range of different breeding strategies and for a continuum of genetic models ranging from simple to complex. In this chapter we describe some of the basic mechanics behind the QU-GENE user interface and give a simplified example of how it works.

  21. Linear Quadratic Gaussian Controller Design Using a Graphical User Interface: Application to the Beam-Waveguide Antennas

    NASA Astrophysics Data System (ADS)

    Maneri, E.; Gawronski, W.

    1999-10-01

    The linear quadratic Gaussian (LQG) design algorithms described in [2] and [5] have been used in the controller design of JPL's beam-waveguide [5] and 70-m [6] antennas. This algorithm significantly improves tracking precision in a windy environment. This article describes the graphical user interface (GUI) software for the design of LQG controllers. It consists of two parts: the basic LQG design and the fine-tuning of the basic design using a constrained optimization algorithm. The GUI was developed to simplify the design process, to make it user-friendly, and to enable design of an LQG controller by users with a limited control engineering background. The user manipulates the GUI sliders and radio buttons and observes the antenna performance; simple rules are given on the GUI display.

  22. User's manual for MMLE3, a general FORTRAN program for maximum likelihood parameter estimation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.; Iliff, K. W.

    1980-01-01

    A user's manual for the FORTRAN IV computer program MMLE3 is described. It is a maximum likelihood parameter estimation program capable of handling general bilinear dynamic equations of arbitrary order with measurement noise and/or state noise (process noise). The theory and use of the program are described. The basic MMLE3 program is quite general and, therefore, applicable to a wide variety of problems. The basic program can interact with a set of user-written, problem-specific routines to simplify the use of the program on specific systems. A set of user routines for the aircraft stability and control derivative estimation problem is provided with the program.

  23. compomics-utilities: an open-source Java library for computational proteomics.

    PubMed

    Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart

    2011-03-08

    The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. And even though the purpose of these tools can vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows the developers to focus on the novel aspects of their tools, rather than on the basic functions, which can contribute substantially to faster development, and better tools for proteomics.

  24. Designing an evaluation framework for WFME basic standards for medical education.

    PubMed

    Tackett, Sean; Grant, Janet; Mmari, Kristin

    2016-01-01

    To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.

  25. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing, and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  26. Thermodynamics--A Practical Subject.

    ERIC Educational Resources Information Center

    Jones, Hugh G.

    1984-01-01

    Provides a simplified, synoptic overview of the area of thermodynamics, enumerating and explaining the four basic laws, and introducing the mathematics involved in a stepwise fashion. Discusses such basic tools of thermodynamics as enthalpy, entropy, Helmholtz free energy, and Gibbs free energy, and their uses in problem solving. (JM)

  27. Mixed monofunctional extractants for trivalent actinide/lanthanide separations: TALSPEAK-MME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Aaron T.; Nash, Kenneth L.

    The basic features of an f-element extraction process based on a solvent composed of equimolar mixtures of Cyanex-923 (a mixed trialkyl phosphine oxide) and 2-ethylhexylphosphonic acid mono-2-ethylhexyl ester (HEH[EHP]) extractants in n-dodecane are investigated in this report. This system, which combines features of the TRPO and TALSPEAK processes, is based on co-extraction of trivalent lanthanides and actinides from 0.1 to 1.0 M HNO3 followed by application of a buffered aminopolycarboxylate solution strip to accomplish a Reverse TALSPEAK selective removal of actinides. This mixed-extractant medium could enable a simplified approach to selective trivalent f-element extraction and actinide partitioning in a single process. As compared with other combined process applications in development for more compact actinide partitioning processes (DIAMEX-SANEX, GANEX, TRUSPEAK, ALSEP), this combination features only monofunctional extractants with high solubility limits and comparatively low molar mass. Selective actinide stripping from the loaded extractant phase is done using a glycine-buffered solution containing N-(2-hydroxyethyl)ethylenediaminetriacetic acid (HEDTA) or triethylenetetramine-N,N,N',N'',N''',N'''-hexaacetic acid (TTHA). Lastly, the results reported provide evidence for simplified interactions between the two extractants and demonstrate a pathway toward using mixed monofunctional extractants to separate trivalent actinides (An) from fission product lanthanides (Ln).

  28. Mixed monofunctional extractants for trivalent actinide/lanthanide separations: TALSPEAK-MME

    DOE PAGES

    Johnson, Aaron T.; Nash, Kenneth L.

    2015-08-20

    The basic features of an f-element extraction process based on a solvent composed of equimolar mixtures of Cyanex-923 (a mixed trialkyl phosphine oxide) and 2-ethylhexylphosphonic acid mono-2-ethylhexyl ester (HEH[EHP]) extractants in n-dodecane are investigated in this report. This system, which combines features of the TRPO and TALSPEAK processes, is based on co-extraction of trivalent lanthanides and actinides from 0.1 to 1.0 M HNO3 followed by application of a buffered aminopolycarboxylate solution strip to accomplish a Reverse TALSPEAK selective removal of actinides. This mixed-extractant medium could enable a simplified approach to selective trivalent f-element extraction and actinide partitioning in a single process. As compared with other combined process applications in development for more compact actinide partitioning processes (DIAMEX-SANEX, GANEX, TRUSPEAK, ALSEP), this combination features only monofunctional extractants with high solubility limits and comparatively low molar mass. Selective actinide stripping from the loaded extractant phase is done using a glycine-buffered solution containing N-(2-hydroxyethyl)ethylenediaminetriacetic acid (HEDTA) or triethylenetetramine-N,N,N',N'',N''',N'''-hexaacetic acid (TTHA). Lastly, the results reported provide evidence for simplified interactions between the two extractants and demonstrate a pathway toward using mixed monofunctional extractants to separate trivalent actinides (An) from fission product lanthanides (Ln).

  29. Development of simplified process for environmentally resistant cells

    NASA Technical Reports Server (NTRS)

    King, W. J.

    1980-01-01

    This report describes a program to develop a simple, foolproof, all-vacuum solar cell manufacturing process which can be completely automated and which results in medium-efficiency cells that are inherently environmentally resistant. All components of the completed cells are integrated into a monolithic structure with no material interfaces. The exposed materials (Si, Al2O3, Al, Ni) are all resistant to atmospheric attack, and the junction, per se, is passivated to prevent long-term degradation. Such cells are intended to be incorporated into a simple module consisting basically of a press-formed metallic superstructure with a separate glass cover for missile, etc., protection.

  30. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions

    PubMed Central

    Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348
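
    The fuzzy gate arithmetic this record alludes to can be sketched under the usual independence approximation: each basic event's probability is a trapezoidal fuzzy number (a, b, c, d) with a <= b <= c <= d, and gates combine the four components pointwise. The event names and numbers below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of trapezoidal fuzzy probabilities through AND/OR gates.
from math import prod

def and_gate(*events):
    """AND gate: componentwise product of the fuzzy probabilities."""
    return tuple(prod(e[i] for e in events) for i in range(4))

def or_gate(*events):
    """OR gate: componentwise 1 - prod(1 - p) over the input events."""
    return tuple(1 - prod(1 - e[i] for e in events) for i in range(4))

ignition = (0.01, 0.02, 0.03, 0.05)    # illustrative fuzzy probability of an ignition source
dust_cloud = (0.10, 0.15, 0.20, 0.25)  # illustrative fuzzy probability of a suspended dust cloud
explosion = and_gate(ignition, dust_cloud)  # top event needs both conditions
```

    Componentwise combination preserves the a <= b <= c <= d ordering because both gate formulas are monotone in each input; a full analysis would add expert weighting and defuzzification on top of this arithmetic.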

  31. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions.

    PubMed

    Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.

  32. Baseline mathematics and geodetics for tracking operations

    NASA Technical Reports Server (NTRS)

    James, R.

    1981-01-01

    Various geodetic and mapping algorithms are analyzed as they apply to radar tracking systems and tested in extended BASIC computer language for real time computer applications. Closed-form approaches to the solution of converting Earth centered coordinates to latitude, longitude, and altitude are compared with classical approximations. A simplified approach to atmospheric refractivity called gradient refraction is compared with conventional ray tracing processes. An extremely detailed set of documentation which provides the theory, derivations, and application of algorithms used in the programs is included. Validation methods are also presented for testing the accuracy of the algorithms.
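
    One widely used closed-form approach to the Earth-centered-to-geodetic conversion compared in this record is Bowring's approximation; here is a sketch on the WGS-84 ellipsoid (the report's exact constants and chosen method may differ).

```python
# Bowring's closed-form approximation: ECEF (x, y, z) in metres to geodetic
# latitude, longitude (degrees) and altitude (metres) on the WGS-84 ellipsoid.
from math import sqrt, atan2, sin, cos, degrees

def ecef_to_geodetic(x, y, z):
    a = 6378137.0                     # WGS-84 semi-major axis, m
    f = 1 / 298.257223563             # WGS-84 flattening
    b = a * (1 - f)                   # semi-minor axis
    e2 = f * (2 - f)                  # first eccentricity squared
    ep2 = (a * a - b * b) / (b * b)   # second eccentricity squared
    p = sqrt(x * x + y * y)           # distance from the spin axis
    theta = atan2(z * a, p * b)       # parametric (reduced) latitude estimate
    lat = atan2(z + ep2 * b * sin(theta) ** 3, p - e2 * a * cos(theta) ** 3)
    lon = atan2(y, x)
    n = a / sqrt(1 - e2 * sin(lat) ** 2)  # prime-vertical radius of curvature
    alt = p / cos(lat) - n
    return degrees(lat), degrees(lon), alt
```

    The altitude formula degrades near the poles (cos(lat) approaches zero), which is exactly the kind of trade-off against iterative methods such a comparison would examine.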

  33. Evaluation of an Atmosphere Revitalization Subsystem for Deep Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Abney, Morgan B.; Conrad, Ruth E.; Frederick, Kenneth R.; Greenwood, Zachary W.; Kayatin, Matthew J.; Knox, James C.; Newton, Robert L.; Parrish, Keith J.; Takada, Kevin C.

    2015-01-01

    An Atmosphere Revitalization Subsystem (ARS) suitable for deployment aboard deep space exploration mission vehicles has been developed and functionally demonstrated. This modified ARS process design architecture was derived from the International Space Station's (ISS) basic ARS. Primary functions considered in the architecture include trace contaminant control, carbon dioxide removal, carbon dioxide reduction, and oxygen generation. Candidate environmental monitoring instruments were also evaluated. The process architecture rearranges unit operations and employs equipment operational changes to reduce mass, simplify, and improve the functional performance for trace contaminant control, carbon dioxide removal, and oxygen generation. Results from integrated functional demonstration are summarized and compared to the performance observed during previous testing conducted on an ISS-like subsystem architecture and a similarly evolved process architecture. Considerations for further subsystem architecture and process technology development are discussed.

  14. GENASIS Basics: Object-oriented utilitarian functionality for large-scale physics simulations (Version 2)

    NASA Astrophysics Data System (ADS)

    Cardall, Christian Y.; Budiardja, Reuben D.

    2017-05-01

    GenASiS Basics provides Fortran 2003 classes furnishing extensible object-oriented utilitarian functionality for large-scale physics simulations on distributed memory supercomputers. This functionality includes physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. This revision, Version 2 of Basics, makes mostly minor additions to functionality and includes some simplifying name changes.

  15. Determining the Number of Clusters in a Data Set Without Graphical Interpretation

    NASA Technical Reports Server (NTRS)

    Aguirre, Nathan S.; Davies, Misty D.

    2011-01-01

    Cluster analysis is a data mining technique that is meant to simplify the process of classifying data points. The basic clustering process requires an input of data points and the number of clusters wanted. The clustering algorithm will then pick starting C points for the clusters, which can be either random spatial points or random data points. It then assigns each data point to the nearest C point, where "nearest" usually means Euclidean distance, but some algorithms use another criterion. The next step is determining whether the clustering arrangement thus found is within a certain tolerance. If it falls within this tolerance, the process ends. Otherwise the C points are adjusted based on how many data points are in each cluster, and the steps repeat until the algorithm converges.
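The basic clustering loop described in this record (the standard k-means iteration) can be sketched as follows; the tolerance test here compares how far the C points moved between iterations, which is one common choice among several.

```python
import math
import random

def k_means(points, c, tol=1e-6, seed=0):
    """Basic clustering loop: pick C starting points, assign each data
    point to the nearest centre (Euclidean distance), recompute the
    centres from their clusters, and repeat until the centres move less
    than the tolerance."""
    rng = random.Random(seed)
    centres = rng.sample(points, c)          # start from random data points
    while True:
        clusters = [[] for _ in range(c)]
        for p in points:
            i = min(range(c), key=lambda j: math.dist(p, centres[j]))
            clusters[i].append(p)
        # adjust each C point to the mean of the points assigned to it
        new_centres = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centres[i]
            for i, cl in enumerate(clusters)
        ]
        shift = max(math.dist(a, b) for a, b in zip(centres, new_centres))
        centres = new_centres
        if shift < tol:                      # within tolerance: converged
            return centres, clusters
```

On two well-separated groups of points, the loop converges in a handful of iterations regardless of which data points are drawn as starting centres.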

  16. Principal component regression analysis with SPSS.

    PubMed

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used to diagnose multicollinearity, the basic principle of principal component regression, and a method for determining the 'best' equation. The paper uses an example to describe how to do principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and the operation of the linear regression, factor analysis, descriptives, compute variable, and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity. Performing principal component regression with SPSS makes the analysis simpler, faster, and accurate.
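The record's procedure is tied to SPSS, but the underlying calculation can be sketched in a few lines of NumPy as an illustration: standardize the predictors, take the leading eigenvectors of their correlation matrix, regress the response on the component scores, and map the coefficients back to the standardized predictors. Function names and the eigendecomposition route are this sketch's choices, not SPSS output.

```python
import numpy as np

def pcr_fit(x, y, n_components):
    """Principal component regression sketch: standardize predictors,
    keep the leading principal components of their correlation matrix,
    regress y on the component scores, then express the coefficients
    in terms of the standardized predictors."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    mean, std = x.mean(axis=0), x.std(axis=0)
    z = (x - mean) / std                       # standardized predictors
    corr = np.corrcoef(z, rowvar=False)        # correlation matrix
    eigval, eigvec = np.linalg.eigh(corr)
    order = np.argsort(eigval)[::-1]           # largest eigenvalues first
    v = eigvec[:, order[:n_components]]
    scores = z @ v                             # component scores
    gamma, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    beta_std = v @ gamma                       # coefficients on z
    return beta_std, mean, std, y.mean()

def pcr_predict(x, beta_std, mean, std, intercept):
    z = (np.asarray(x, float) - mean) / std
    return intercept + z @ beta_std
```

Dropping the smallest-eigenvalue components (choosing `n_components` below the number of predictors) is what stabilizes the regression against multicollinearity; with all components retained, the fit reduces to ordinary least squares on the standardized predictors.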

  17. A Simple Picaxe Microcontroller Pulse Source for Juxtacellular Neuronal Labelling.

    PubMed

    Verberne, Anthony J M

    2016-10-19

    Juxtacellular neuronal labelling is a method which allows neurophysiologists to fill physiologically-identified neurons with small positively-charged marker molecules. Labelled neurons are identified by histochemical processing of brain sections along with immunohistochemical identification of neuropeptides, neurotransmitters, neurotransmitter transporters or biosynthetic enzymes. A microcontroller-based pulser circuit and associated BASIC software script is described for incorporation into the design of a commercially-available intracellular electrometer for use in juxtacellular neuronal labelling. Printed circuit board construction has been used for reliability and reproducibility. The current design obviates the need for a separate digital pulse source and simplifies the juxtacellular neuronal labelling procedure.

  18. Algorithm Sorts Groups Of Data

    NASA Technical Reports Server (NTRS)

    Evans, J. D.

    1987-01-01

    For efficient sorting, algorithm finds set containing minimum or maximum most significant data. Sets of data sorted as desired. Sorting process simplified by reduction of each multielement set of data to single representative number. First, each set of data expressed as polynomial with suitably chosen base, using elements of set as coefficients. Most significant element placed in term containing largest exponent. Base selected by examining range in value of data elements. Resulting series summed to yield single representative number. Numbers easily sorted, and each such number converted back to original set of data by successive division. Program written in BASIC.
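The reduction described in this record (elements as coefficients of a polynomial in a base larger than any element, summed to a single representative number, recovered by successive division) can be sketched as follows; the original was written in BASIC, so this Python version is an illustration only.

```python
def encode(data_set, base):
    """Reduce a multi-element set of data to one representative number:
    treat the elements as digits of a base-`base` polynomial, with the
    most significant element in the highest-power term."""
    value = 0
    for element in data_set:
        value = value * base + element
    return value

def decode(value, length, base):
    """Recover the original set by successive division."""
    out = []
    for _ in range(length):
        value, digit = divmod(value, base)
        out.append(digit)
    return out[::-1]

# Sort groups of data by their single representative numbers.
groups = [[3, 1, 4], [2, 7, 1], [3, 0, 9]]
base = 10                    # chosen larger than any element's value
ranked = sorted(groups, key=lambda g: encode(g, base))
```

Because the most significant element occupies the largest exponent, comparing the representative numbers orders the groups exactly as comparing the groups element by element would.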

  19. 75 FR 71376 - Simplified Network Application Processing System, On-Line Registration and Account Maintenance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-23

    ...-02] RIN 0694-AE98 Simplified Network Application Processing System, On-Line Registration and Account...'') electronically via BIS's Simplified Network Application Processing (SNAP-R) system. Currently, parties must... Network Applications Processing System (SNAP-R) in October 2006. The SNAP-R system provides a Web based...

  20. Simplified three-dimensional model provides anatomical insights in lizards' caudal autotomy as printed illustration.

    PubMed

    De Amorim, Joana D C G; Travnik, Isadora; De Sousa, Bernadete M

    2015-03-01

    Lizards' caudal autotomy is a complex and widely employed antipredator mechanism involving thorough anatomical adaptations. Due to its diminished size and intricate structures, vertebral anatomy is hard to convey clearly to students and researchers from other areas. Three-dimensional models are prodigious tools for unveiling anatomical nuances. Some of the techniques used to create them can produce irregular and complicated forms, which, despite being very accurate, lack didactic uniformity and simplicity. Since both are considered fundamental characteristics for comprehension, a simplified model could be the key to improved learning. The model presented here depicts the caudal osteology of Tropidurus itambere and was designed to be concise, so as to be easily assimilated, yet complete, so as not to compromise its informative aspect. The creation process requires only basic skills in manipulating polygons in 3D modeling software, in addition to appropriate knowledge of the structure to be modeled. As reference for the modeling, we used microscopic observation and a photograph database of the caudal structures. This way, no advanced laboratory equipment was needed and all biological materials were preserved for future research. We therefore propose a wider usage of simplified 3D models, both in the classroom and as illustrations for scientific publications.

  1. Enantioselective separation of biologically active basic compounds in ultra-performance supercritical fluid chromatography.

    PubMed

    Geryk, Radim; Kalíková, Květa; Schmid, Martin G; Tesařová, Eva

    2016-08-17

    The enantioseparation of basic compounds represents a challenging task in modern SFC. This work is therefore focused on the development and optimization of fast SFC methods suitable for the enantioseparation of 27 biologically active basic compounds of various structures. The influences of the co-solvent type as well as different mobile phase additives on retention, enantioselectivity and enantioresolution were investigated. The results obtained confirmed that mobile phase additives, especially bases (or the mixture of a base and an acid), improve peak shape and enhance enantioresolution. The best results were achieved with isopropylamine or the mixture of isopropylamine and trifluoroacetic acid as additives. In addition, the effects of temperature and back pressure were evaluated to optimize the enantioseparation process. The immobilized amylose-based chiral stationary phase, i.e. the tris(3,5-dimethylphenylcarbamate) derivative of amylose, proved to be a useful tool for the enantioseparation of a broad spectrum of chiral bases. Chromatographic conditions that yielded baseline enantioseparations of all tested compounds were discovered. The presented work can serve as a guide for simplifying method development for the enantioseparation of basic racemates in SFC. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Quark contact interactions at the LHC

    NASA Astrophysics Data System (ADS)

    Bazzocchi, F.; De Sanctis, U.; Fabbrichesi, M.; Tonero, A.

    2012-06-01

    Quark contact interactions are an important signal of new physics. We introduce a model in which the presence of a symmetry protects these new interactions from giving large corrections in flavor-changing processes at low energies. This minimal model provides the basic set of operators which must be considered to contribute to the high-energy processes. To discuss their experimental signature in jet pairs produced in proton-proton collisions, we simplify the number of possible operators down to two. We show (for a representative integrated luminosity of 200 pb⁻¹ at √s = 7 TeV) how the presence of two operators significantly modifies the bound on the characteristic energy scale of the contact interactions, which is obtained by keeping a single operator.

  3. Observe, simplify, titrate, model, and synthesize: A paradigm for analyzing behavior

    PubMed Central

    Alberts, Jeffrey R.

    2013-01-01

    Phenomena in behavior and their underlying neural mechanisms are exquisitely complex problems. Infrequently do we reflect on our basic strategies of investigation and analysis, or formally confront the actual challenges of achieving an understanding of the phenomena that inspire research. Philip Teitelbaum is distinct in his elegant approaches to understanding behavioral phenomena and their associated neural processes. He also articulated his views on effective approaches to scientific analyses of brain and behavior, his vision of how behavior and the nervous system are patterned, and what constitutes basic understanding. His rubrics involve careful observation and description of behavior, simplification of the complexity, analysis of elements, and re-integration through different forms of synthesis. Research on the development of huddling behavior by individual and groups of rats is reviewed in a context of Teitelbaum’s rubrics of research, with the goal of appreciating his broad and positive influence on the scientific community. PMID:22481081

  4. Study on low intensity aeration oxygenation model and optimization for shallow water

    NASA Astrophysics Data System (ADS)

    Chen, Xiao; Ding, Zhibin; Ding, Jian; Wang, Yi

    2018-02-01

    Aeration/oxygenation is an effective measure for improving the self-purification capacity in shallow water treatment, but high energy consumption, high noise and expensive management have restrained the development and application of this process. Based on two-film theory, a theoretical model of the three-dimensional partial differential equation of aeration in shallow water is established. In order to simplify the equation, the basic assumptions of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction are proposed based on engineering practice, and are tested against simulation results for gas holdup obtained by simulating the gas-liquid two-phase flow in an aeration tank under low-intensity conditions. Based on these assumptions and the theory of shallow permeability, the model of three-dimensional partial differential equations is simplified and a calculation model of low-intensity aeration oxygenation is obtained. The model is verified by comparison with aeration experiments. The conclusions are as follows: (1) The calculation model of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction reflects the process of aeration well; (2) Under low-intensity conditions, long-term aeration and oxygenation is theoretically feasible for enhancing the self-purification capacity of water bodies; (3) For the same total aeration intensity, the effect of multipoint distributed aeration on the diffusion of oxygen concentration in the horizontal direction is pronounced; (4) In shallow water treatment, reducing the volume of aeration equipment through miniaturization, arrays, low intensity and mobility can overcome the problems of high energy consumption, large size and noise, and provides a good reference.

  5. Exact protein distributions for stochastic models of gene expression using partitioning of Poisson processes.

    PubMed

    Pendar, Hodjat; Platini, Thierry; Kulkarni, Rahul V

    2013-04-01

    Stochasticity in gene expression gives rise to fluctuations in protein levels across a population of genetically identical cells. Such fluctuations can lead to phenotypic variation in clonal populations; hence, there is considerable interest in quantifying noise in gene expression using stochastic models. However, obtaining exact analytical results for protein distributions has been an intractable task for all but the simplest models. Here, we invoke the partitioning property of Poisson processes to develop a mapping that significantly simplifies the analysis of stochastic models of gene expression. The mapping leads to exact protein distributions using results for mRNA distributions in models with promoter-based regulation. Using this approach, we derive exact analytical results for steady-state and time-dependent distributions for the basic two-stage model of gene expression. Furthermore, we show how the mapping leads to exact protein distributions for extensions of the basic model that include the effects of posttranscriptional and posttranslational regulation. The approach developed in this work is widely applicable and can contribute to a quantitative understanding of stochasticity in gene expression and its regulation.
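The paper above derives exact analytical protein distributions; for intuition, the basic two-stage model it starts from can also be simulated directly with the Gillespie algorithm. The sketch below is this editor's illustration, not the paper's method, and the rate constants are arbitrary illustrative values.

```python
import random

def mean_protein_two_stage(k_m=1.0, g_m=1.0, k_p=10.0, g_p=0.1,
                           t_end=2000.0, seed=0):
    """Gillespie simulation of the basic two-stage model of gene
    expression: mRNA is transcribed (k_m) and degraded (g_m); protein
    is translated from each mRNA (k_p per mRNA) and degraded (g_p).
    Returns the time-averaged protein copy number."""
    rng = random.Random(seed)
    t, m, p, acc = 0.0, 0, 0, 0.0
    while t < t_end:
        rates = [k_m, g_m * m, k_p * m, g_p * p]
        total = sum(rates)
        dt = rng.expovariate(total)          # time to the next reaction
        acc += p * dt                        # time-weighted protein count
        t += dt
        r = rng.uniform(0.0, total)          # pick which reaction fires
        if r < rates[0]:
            m += 1                           # transcription
        elif r < rates[0] + rates[1]:
            m -= 1                           # mRNA degradation
        elif r < rates[0] + rates[1] + rates[2]:
            p += 1                           # translation
        else:
            p -= 1                           # protein degradation
    return acc / t
```

A quick sanity check on the simulation is the analytic steady-state mean, (k_m/g_m)·(k_p/g_p), which the time average should approach for long runs.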

  6. Exact protein distributions for stochastic models of gene expression using partitioning of Poisson processes

    NASA Astrophysics Data System (ADS)

    Pendar, Hodjat; Platini, Thierry; Kulkarni, Rahul V.

    2013-04-01

    Stochasticity in gene expression gives rise to fluctuations in protein levels across a population of genetically identical cells. Such fluctuations can lead to phenotypic variation in clonal populations; hence, there is considerable interest in quantifying noise in gene expression using stochastic models. However, obtaining exact analytical results for protein distributions has been an intractable task for all but the simplest models. Here, we invoke the partitioning property of Poisson processes to develop a mapping that significantly simplifies the analysis of stochastic models of gene expression. The mapping leads to exact protein distributions using results for mRNA distributions in models with promoter-based regulation. Using this approach, we derive exact analytical results for steady-state and time-dependent distributions for the basic two-stage model of gene expression. Furthermore, we show how the mapping leads to exact protein distributions for extensions of the basic model that include the effects of posttranscriptional and posttranslational regulation. The approach developed in this work is widely applicable and can contribute to a quantitative understanding of stochasticity in gene expression and its regulation.

  7. Advanced manufacturing development of a composite empennage component for L-1011 aircraft

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Work continued toward the development of tooling and processing concepts required for a cocured hat/skin cover assembly. A plan was developed and implemented to establish the process for using preimpregnated T300/5208 with a resin content of 34 ± 2 percent by weight. Use of this material results in a simplified laminating process because resin removal by bleeding or prebleeding is no longer required. The approach to this task consists basically of fabricating and testing flat laminated panels and simulated structural panels to verify known processing techniques relative to end-laminate quality. The flat panels were used to determine the air-bleeding arrangement and required cure cycle. Single- and multihat-stiffened panels were fabricated using the established air-bleeding arrangement and cure cycle; the resulting cured parts showed excellent ply-thickness correlation, with all surfaces clear of porosity and voids.

  8. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal... not to exceed the simplified acquisition threshold. The short selection process described in FAR 36.602-5 is authorized for use for contracts not expected to exceed the simplified acquisition threshold...

  9. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5... for contracts not to exceed the simplified acquisition threshold. (a) In contracts not expected to exceed the simplified acquisition threshold, either or both of the short selection processes set out at...

  10. A Simple Picaxe Microcontroller Pulse Source for Juxtacellular Neuronal Labelling †

    PubMed Central

    Verberne, Anthony J. M.

    2016-01-01

    Juxtacellular neuronal labelling is a method which allows neurophysiologists to fill physiologically-identified neurons with small positively-charged marker molecules. Labelled neurons are identified by histochemical processing of brain sections along with immunohistochemical identification of neuropeptides, neurotransmitters, neurotransmitter transporters or biosynthetic enzymes. A microcontroller-based pulser circuit and associated BASIC software script is described for incorporation into the design of a commercially-available intracellular electrometer for use in juxtacellular neuronal labelling. Printed circuit board construction has been used for reliability and reproducibility. The current design obviates the need for a separate digital pulse source and simplifies the juxtacellular neuronal labelling procedure. PMID:28952589

  11. BCB Bonding Technology of Back-Side Illuminated COMS Device

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Jiang, G. Q.; Jia, S. X.; Shi, Y. M.

    2018-03-01

    The back-side illuminated CMOS (BSI) sensor is a key device in spaceborne hyperspectral imaging technology. Compared with traditional devices, BSI sensors simplify the path of incident light and flatten the spectral response, which meets the requirements of quantitative hyperspectral imaging applications. Wafer bonding is the basic technology and a key process in the fabrication of BSI sensors. A 6-inch bond between a CMOS wafer and a glass wafer was fabricated, exploiting the low bonding temperature and high stability of BCB. The influence of BCB thickness on bonding strength was studied. By adjusting the bonding conditions, wafer bonds with high strength, high stability and no bubbles were obtained.

  12. An easy-to-perform photometric assay for methyltransferase activity measurements.

    PubMed

    Schäberle, Till F; Siba, Christian; Höver, Thomas; König, Gabriele M

    2013-01-01

    Methyltransferases (MTs) catalyze the transfer of a methyl group from S-adenosylmethionine (SAM) to a suitable substrate. Such methylations are important modifications in secondary metabolism, especially of natural products produced by polyketide synthases and nonribosomal peptide synthetases, many of which are of special interest due to their prominent pharmacological activities (e.g., lovastatin, cyclosporin). To gain basic biochemical knowledge of the methylation process, it is highly relevant to simplify methods that face experimental problems caused by the large variety of substrates. Here, we present a photometric method to analyze MT activity by measuring SAM consumption in a coupled enzyme assay. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. The rock-paper-scissors game

    NASA Astrophysics Data System (ADS)

    Zhou, Hai-Jun

    2016-04-01

    Rock-Paper-Scissors (RPS), a game of cyclic dominance, is not merely a popular children's game but also a basic model system for studying decision-making in non-cooperative strategic interactions. Aimed at students of physics with no background in game theory, this paper introduces the concepts of Nash equilibrium and evolutionarily stable strategy, and reviews some recent theoretical and empirical efforts on the non-equilibrium properties of the iterated RPS, including collective cycling, conditional response patterns and microscopic mechanisms that facilitate cooperation. We also introduce several dynamical processes to illustrate the applications of RPS as a simplified model of species competition in ecological systems and price cycling in economic markets.
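The Nash equilibrium concept the review introduces is easy to verify numerically for the standard RPS payoff matrix: the uniform mixed strategy (1/3, 1/3, 1/3) makes every pure strategy equally good, so no player can gain by deviating. The helper below is a minimal sketch of that check.

```python
# Standard RPS payoff matrix for the row player: win = +1, lose = -1, tie = 0.
# Rows and columns are ordered rock, paper, scissors.
PAYOFF = [
    [0, -1, 1],   # rock   vs (rock, paper, scissors)
    [1, 0, -1],   # paper
    [-1, 1, 0],   # scissors
]

def expected_payoffs(opponent_mix):
    """Expected payoff of each pure strategy against a mixed opponent."""
    return [sum(PAYOFF[i][j] * opponent_mix[j] for j in range(3))
            for i in range(3)]

uniform = [1 / 3, 1 / 3, 1 / 3]
payoffs = expected_payoffs(uniform)   # all zero: no profitable deviation
```

Against any non-uniform opponent, one pure strategy does strictly better than the others, which is exactly the best-response pressure that drives the cyclic dynamics the review discusses.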

  14. DIGE compatible labelling of surface proteins on vital cells in vitro and in vivo.

    PubMed

    Mayrhofer, Corina; Krieger, Sigurd; Allmaier, Günter; Kerjaschki, Dontscho

    2006-01-01

    Efficient methods for profiling the cell surface proteome are desirable to gain deeper insight into basic biological processes, to localise proteins and to uncover proteins differentially expressed in disease. Here we present a strategy to target cell-surface-exposed proteins via fluorescence labelling using CyDye DIGE fluors. This method has been applied to human cell lines in vitro as well as to a complex biological system in vivo. It allows detection of fluorophore-tagged cell surface proteins and visualisation of the accessible proteome within a single 2-D gel, simplifying subsequent UV MALDI-MS analysis.

  15. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.

  16. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5... selection process for procurements not to exceed the simplified acquisition threshold. References to FAR 36...

  17. Simplifying informed consent for biorepositories: stakeholder perspectives.

    PubMed

    Beskow, Laura M; Friedman, Joëlle Y; Hardy, N Chantelle; Lin, Li; Weinfurt, Kevin P

    2010-09-01

    Complex and sometimes controversial information must be conveyed during the consent process for participation in biorepositories, and studies suggest that consent documents in general are growing in length and complexity. As a first step toward creating a simplified biorepository consent form, we gathered data from multiple stakeholders about what information was most important for prospective participants to know when making a decision about taking part in a biorepository. We recruited 52 research participants, 12 researchers, and 20 institutional review board representatives from Durham and Kannapolis, NC. These subjects were asked to read a model biorepository consent form and highlight sentences they deemed most important. On average, institutional review board representatives identified 72.3% of the sentences as important; researchers selected 53.0%, and participants 40.4% (P = 0.0004). Participants most often selected sentences about the kinds of individual research results that might be offered, privacy risks, and large-scale data sharing. Researchers highlighted sentences about the biorepository's purpose, privacy protections, costs, and participant access to individual results. Institutional review board representatives highlighted sentences about collection of basic personal information, medical record access, and duration of storage. The differing mandates of these three groups can translate into widely divergent opinions about what information is important and appropriate to include in a consent form. These differences could frustrate efforts to move simplified forms--for biobanking as well as for other kinds of research--into actual use, despite continued calls for such forms.

  18. Solving fatigue-related problems with cardiac arrest survivors living in the community.

    PubMed

    Kim, Young Joo; Rogers, Joan C; Raina, Ketki D; Callaway, Clifton W; Rittenberger, Jon C; Leibold, Mary Lou; Holm, Margo B

    2017-09-01

    The aim was to describe fatigue-related problems reported by post-cardiac arrest adults with chronic fatigue and energy conservation strategies generated using an Energy Conservation plus Problem Solving Therapy intervention. Following an introduction to the intervention process outlined in a Participant Workbook, participants engaged in the telephone intervention by identifying one to two fatigue-related problems. They then brainstormed with the interventionist to identify potential strategies to reduce fatigue, tested them, and either modified the strategies or moved to the next problem over three to five sessions. Eighteen cardiac arrest survivors with chronic fatigue identified instrumental activities of daily living and leisure activities as fatigue-related activities more frequently than basic activities of daily living. Energy Conservation strategies used most frequently were: plan ahead, pace yourself, delegate to others, and simplify the task. Post-cardiac arrest adults living in the community with chronic fatigue can return to previous daily activities by using energy conservation strategies such as planning ahead, pacing tasks, delegating tasks, and simplifying tasks. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Preventing healthcare-associated infections through human factors engineering.

    PubMed

    Jacob, Jesse T; Herwaldt, Loreen A; Durso, Francis T

    2018-05-24

    Human factors engineering (HFE) approaches are increasingly being used in healthcare, but have been applied in relatively limited ways to infection prevention and control (IPC). Previous studies have focused on using selected HFE tools, but newer literature supports a system-based HFE approach to IPC. Cross-contamination and the existence of workarounds suggest that healthcare workers need better support to reduce and simplify steps in delivering care. Simplifying workflow can lead to a better understanding of why a process fails and allow for improvements that reduce errors and increase efficiency. Hand hygiene can be improved using visual cues and nudges based on room layout. Using personal protective equipment appropriately appears simple, but it interacts in complex ways with workload, behavior, emotion, and environmental variables, including product placement. HFE can help prevent pathogen transmission by improving environmental cleaning and the appropriate use of medical devices. Emerging evidence suggests that HFE can be applied in IPC to reduce healthcare-associated infections. Collaboration between HFE and IPC can help improve many basic best practices, including the use of hand hygiene and personal protective equipment by healthcare workers during patient care.

  20. SeaWiFS Science Algorithm Flow Chart

    NASA Technical Reports Server (NTRS)

    Darzi, Michael

    1998-01-01

    This flow chart describes the baseline science algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Data Processing System (SDPS). As such, it includes only processing steps used in the generation of the operational products that are archived by NASA's Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC). It is meant to provide the reader with a basic understanding of the scientific algorithm steps applied to SeaWiFS data. It does not include non-science steps, such as format conversions, and places the greatest emphasis on the geophysical calculations of the level-2 processing. Finally, the flow chart reflects the logic sequences and the conditional tests of the software so that it may be used to evaluate the fidelity of the implementation of the scientific algorithm. In many cases however, the chart may deviate from the details of the software implementation so as to simplify the presentation.

  1. Quantification of Physical Activity During Basic Combat Training and Associated Injuries

    DTIC Science & Technology

    2014-03-31

    sock systems (6) and antiperspirants (7) to reduce foot blisters and the use of mouth guards for the reduction of orofacial injuries (8). These and...simplified list of lying down, sitting, standing, walking or very active. An Israeli Defense Forces study investigated overuse injuries and PA in ... orofacial injuries during United States Army Basic Military Training. Dent Traumatol 2006, 24:86-90. 9. U.S. Army Public Health Command, Injury

  2. Quantization and symmetry in periodic coverage patterns with applications to earth observation. [for satellite ground tracks

    NASA Technical Reports Server (NTRS)

    King, J. C.

    1975-01-01

    The general orbit-coverage problem in a simplified physical model is investigated by application of numerical approaches derived from basic number theory. A system of basic and general properties is defined by which idealized periodic coverage patterns may be characterized, classified, and delineated. The principal common features of these coverage patterns are their longitudinal quantization, determined by the revolution number R, and their overall symmetry.

  3. Theoretical study on the interactions between chlordecone hydrate and acidic surface groups of activated carbon under basic pH conditions.

    PubMed

    Melchor-Rodríguez, Kenia; Gamboa-Carballo, Juan José; Ferino-Pérez, Anthuan; Passé-Coutrin, Nady; Gaspard, Sarra; Jáuregui-Haza, Ulises Javier

    2018-05-01

    A theoretical study of the influence of acidic surface groups (SG) of activated carbon (AC) on chlordecone hydrate (CLDh) adsorption is presented, in order to help understand the adsorption process under basic pH conditions. A seven-ring aromatic system (coronene) with a functional group at the edge was used as a simplified model of AC to evaluate the influence of SG during adsorption from aqueous solution under basic pH conditions. Two SG were modeled in their deprotonated forms, carboxyl and hydroxyl (COO- and O-), interacting with CLDh. To model the solvation process, all systems under study were calculated with up to three water molecules. Multiple Minima Hypersurface (MMH) methodology was employed to study the interactions of CLDh with SG on AC using the PM7 semiempirical Hamiltonian, to explore the potential energy surfaces of the systems and evaluate their thermodynamic association energies. The re-optimization of representative structures obtained from MMH was done using M06-2X Density Functional Theory. The Quantum Theory of Atoms in Molecules (QTAIM) was used to characterize the interaction types. As a result, the association of CLDh with acidic SG under basic pH conditions occurs preferentially between the two alcohol groups of CLDh and the COO- and O- groups, and through dispersive interactions of the chlorine atoms of CLDh with the graphitic surface. On the other hand, the presence of covalent interactions between the negatively charged oxygen of SG and one hydrogen atom of the CLDh alcohol groups (O-⋯HO interactions), without water molecules, was confirmed by the QTAIM study. It can be concluded that the interaction of CLDh with acidic SG of AC under basic pH conditions confirms a physical mechanism for the adsorption process. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Determining Planck's Constant Using a Light-emitting Diode.

    ERIC Educational Resources Information Center

    Sievers, Dennis; Wilson, Alan

    1989-01-01

    Describes a method for making a simple, inexpensive apparatus which can be used to determine Planck's constant. Provides illustrations of a circuit diagram using one or more light-emitting diodes and a BASIC computer program for simplifying calculations. (RT)
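The calculation such a BASIC program would perform can be sketched as follows (here in Python, with illustrative LED wavelengths and threshold voltages, not the article's data): each LED's threshold voltage V and emission frequency f = c/λ give an estimate h ≈ eV/f.

```python
# Estimate Planck's constant from LED threshold voltages (hypothetical data).
# At threshold, the electron energy eV roughly equals the photon energy hf,
# so h ≈ eV / f with f = c / wavelength.
E = 1.602e-19   # elementary charge, C
C = 2.998e8     # speed of light, m/s

# (wavelength in m, measured threshold voltage in V) -- illustrative values only
leds = [(625e-9, 1.95), (590e-9, 2.10), (470e-9, 2.65)]

estimates = [E * v * wl / C for wl, v in leds]   # h = eV/f = eV*wavelength/c
h_mean = sum(estimates) / len(estimates)
print(f"h ≈ {h_mean:.2e} J·s")   # accepted value: 6.626e-34 J·s
```

Averaging over several LEDs, as above, reduces the effect of the somewhat arbitrary choice of threshold voltage for any single diode.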

  5. Photoelectric Effect: Back to Basics.

    ERIC Educational Resources Information Center

    Powell, R. A.

    1978-01-01

    Presents a simplified theoretical analysis of the variation of quantum yield with photon energy in the photoelectric experiment. Describes a way to amplify the experiment and make it more instructive to advanced students through the measurement of quantum yield of a photo cell. (GA)

  6. Programmer's manual for MMLE3, a general FORTRAN program for maximum likelihood parameter estimation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.

    1981-01-01

MMLE3 is a maximum likelihood parameter estimation program capable of handling general bilinear dynamic equations of arbitrary order with measurement noise and/or state noise (process noise). The basic MMLE3 program is quite general and therefore applicable to a wide variety of problems. The basic program can interact with a set of user-written, problem-specific routines to simplify its use on specific systems. A set of user routines for the aircraft stability and control derivative estimation problem is provided with the program. The implementation of the program on specific computer systems is discussed. The structure of the program is diagrammed, and the function and operation of individual routines are described. Complete listings and reference maps of the routines are included on microfiche as a supplement. Four test cases are discussed; listings of the input cards and program output for the test cases are included on microfiche as a supplement.

  7. Educational Video Recording and Editing for The Hand Surgeon

    PubMed Central

    Rehim, Shady A.; Chung, Kevin C.

    2016-01-01

Digital video recordings are increasingly used across various medical and surgical disciplines, including hand surgery, for documentation of patient care, resident education, scientific presentations, and publications. In recent years, the introduction of sophisticated computer hardware and software has simplified the process of digital video production and improved the means of disseminating large digital data files. However, the creation of high-quality surgical video footage requires a basic understanding of key technical considerations, together with the creativity and sound aesthetic judgment of the videographer. In this article we outline the practical steps involved in equipment preparation, video recording, editing, and archiving, as well as guidance on the choice of suitable hardware and software. PMID:25911212

  8. A thermodynamically general theory for convective vortices

    NASA Astrophysics Data System (ADS)

    Renno, Nilton O.

    2008-08-01

Convective vortices are common features of atmospheres that absorb lower-entropy energy at higher temperatures than those at which they reject higher-entropy energy to space. These vortices range from small to large scales and play an important role in the vertical transport of heat, momentum, and tracer species. Thus, the development of theoretical models for convective vortices is important to our understanding of some of the basic features of planetary atmospheres. The heat engine framework is a useful tool for studying convective vortices. However, current theories assume that convective vortices are reversible heat engines. Since there are questions about how reversible real atmospheric heat engines are, the usefulness of such theories for studying real atmospheric vortices is somewhat controversial. To reduce this problem, a theory for convective vortices that includes irreversible processes is proposed. The paper's main result is that the proposed theory provides an expression for the pressure drop along streamlines that includes the effects of irreversible processes. It is shown that a simplified version of this expression is a generalization of Bernoulli's equation to convective circulations. It is speculated that the proposed theory not only explains the intensity of convective vortices but also sheds light on other basic features such as their physical appearance.

  9. Transfer of Perceptual Expertise: The Case of Simplified and Traditional Chinese Character Recognition

    ERIC Educational Resources Information Center

    Liu, Tianyin; Chuk, Tin Yim; Yeh, Su-Ling; Hsiao, Janet H.

    2016-01-01

    Expertise in Chinese character recognition is marked by reduced holistic processing (HP), which depends mainly on writing rather than reading experience. Here we show that, while simplified and traditional Chinese readers demonstrated a similar level of HP when processing characters shared between the simplified and traditional scripts, simplified…

  10. Event-related potentials, cognition, and behavior: a biological approach.

    PubMed

    Kotchoubey, Boris

    2006-01-01

The prevailing cognitive-psychological accounts of event-related brain potentials (ERPs) assume that ERP components manifest information-processing operations leading from stimulus to response. Since this view encounters numerous difficulties, already analyzed in previous studies, an alternative view is presented here that regards cortical control of behavior as a repetitive sensorimotor cycle consisting of two phases: (i) feedforward anticipation and (ii) feedback cortical performance. This view allows numerous data from very different domains of ERP research to be interpreted in an integrative manner, from the biophysics of ERP waves to their relationship to language processing, in which verbal behavior is viewed as likewise controlled by the same two basic processes: feedforward (hypothesis building) and feedback (hypothesis checking). The proposed approach is intentionally simplified, explaining numerous effects on the basis of a few assumptions and relating several levels of analysis: neurophysiology, macroelectrical processes (i.e., ERPs), cognition, and behavior. It can therefore be regarded as a first approximation to a general theory of ERPs.

  11. Teaching Analytical Thinking

    ERIC Educational Resources Information Center

    Behn, Robert D.; Vaupel, James W.

    1976-01-01

    Description of the philosophy and general nature of a course at Drake University that emphasizes basic concepts of analytical thinking, including think, decompose, simplify, specify, and rethink problems. Some sample homework exercises are included. The journal is available from University of California Press, Berkeley, California 94720.…

  12. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal... contracts not to exceed the simplified acquisition threshold. Either of the procedures provided in FAR 36... simplified acquisition threshold. ...

  13. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... to exceed the simplified acquisition threshold. The HCA may include either or both procedures in FAR...

  14. Terry Turbopump Expanded Operating Band Full-Scale Component and Basic Science Detailed Test Plan - Final.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Douglas; Solom, Matthew

    This document details the milestone approach to define the true operating limitations (margins) of the Terry turbopump systems used in the nuclear industry for Milestone 3 (full-scale component experiments) and Milestone 4 (Terry turbopump basic science experiments) efforts. The overall multinational-sponsored program creates the technical basis to: (1) reduce and defer additional utility costs, (2) simplify plant operations, and (3) provide a better understanding of the true margin which could reduce overall risk of operations.

  15. Terry Turbopump Expanded Operating Band Full-Scale Component and Basic Science Detailed Test Plan-Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solom, Matthew; Ross, Kyle; Cardoni, Jeffrey N.

    This document details the milestone approach to define the true operating limitations (margins) of the Terry turbopump systems used in the nuclear industry for Milestone 3 (full-scale component experiments) and Milestone 4 (Terry turbopump basic science experiments) efforts. The overall multinational-sponsored program creates the technical basis to: (1) reduce and defer additional utility costs, (2) simplify plant operations, and (3) provide a better understanding of the true margin which could reduce overall risk of operations.

  16. Feature-Motivated Simplified Adaptive PCNN-Based Medical Image Fusion Algorithm in NSST Domain.

    PubMed

    Ganasala, Padma; Kumar, Vinod

    2016-02-01

Multimodality medical image fusion plays a vital role in diagnosis, treatment planning, and follow-up studies of various diseases. It provides a composite image containing the critical information of the source images required for better localization and definition of different organs and lesions. In state-of-the-art image fusion methods based on the nonsubsampled shearlet transform (NSST) and pulse-coupled neural networks (PCNN), authors have used the normalized coefficient value to motivate the PCNN processing of both low-frequency (LF) and high-frequency (HF) sub-bands. This blurs the fused image and decreases its contrast. The main objective of this work is to design an image fusion method that gives a fused image with better contrast and more detailed information, suitable for clinical use. We propose a novel image fusion method utilizing a feature-motivated adaptive PCNN in the NSST domain for fusion of anatomical images. The basic PCNN model is simplified, and an adaptive linking strength is used. Different features are used to motivate the PCNN processing of the LF and HF sub-bands. The proposed method is extended to the fusion of a functional image with an anatomical image in the improved nonlinear intensity hue and saturation (INIHS) color model. Extensive fusion experiments have been performed on CT-MRI and SPECT-MRI datasets. Visual and quantitative analysis of the experimental results shows that the proposed method provides a satisfactory fusion outcome compared to other image fusion methods.
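As a rough illustration of simplified PCNN dynamics, the sketch below runs a generic simplified PCNN on a tiny grayscale array; this is not the authors' feature-motivated variant, and the parameters (beta, decay, V) are purely illustrative. Brighter pixels fire earlier, which is the property fusion methods exploit.

```python
# Minimal simplified PCNN: each neuron's internal activity U = F * (1 + beta*L),
# where F is the pixel intensity and L sums pulses from 4-neighbours; the neuron
# fires when U exceeds a decaying threshold, which then jumps by V.
def pcnn_fire_times(img, beta=0.2, decay=0.7, V=20.0, steps=20):
    h, w = len(img), len(img[0])
    Y = [[0] * w for _ in range(h)]          # pulse outputs
    theta = [[1.0] * w for _ in range(h)]    # dynamic thresholds
    fired = [[None] * w for _ in range(h)]   # first firing step per neuron
    for t in range(steps):
        newY = [[0] * w for _ in range(h)]
        for i in range(h):
            for j in range(w):
                # linking input L: pulses from 4-neighbours at the previous step
                L = sum(Y[a][b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= a < h and 0 <= b < w)
                U = img[i][j] * (1.0 + beta * L)   # internal activity
                if U > theta[i][j]:
                    newY[i][j] = 1
                    if fired[i][j] is None:
                        fired[i][j] = t
        for i in range(h):
            for j in range(w):
                # threshold decays, then jumps by V whenever the neuron fires
                theta[i][j] = theta[i][j] * decay + V * newY[i][j]
        Y = newY
    return fired

img = [[0.9, 0.9, 0.1],
       [0.9, 0.9, 0.1],
       [0.1, 0.1, 0.1]]
times = pcnn_fire_times(img)   # bright block fires before the dim border
```

In fusion methods, these firing times (or accumulated firing maps) per sub-band coefficient are what drive the choice between source images.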

  17. Va-Room: Motorcycle Safety.

    ERIC Educational Resources Information Center

    Keller, Rosanne

    One of a series of instructional materials produced by the Literacy Council of Alaska, this booklet provides information about motorcycle safety. Using a simplified vocabulary and shorter sentences, it offers statistics concerning motorcycle accidents; information on how to choose the proper machine; basic information about the operation of the…

  18. Shotgun Canceling.

    ERIC Educational Resources Information Center

    Szymanski, Theodore

    1999-01-01

    Discusses a common misunderstanding demonstrated by many students in basic mathematics courses: not knowing how to properly "cancel" factors in simplifying mathematical equations. Asserts that "crossing-out" or "canceling" is not a valid mathematical operation, and that instructors should be wary about using these terms because of the ease with…

  19. Investigation of Pressure Surges in Aircraft Hydraulic Systems

    DTIC Science & Technology

    1952-03-01

Figure 2: Test apparatus for closed-end tube system tests (WADC TR52-37). ... simplified circuit to decrease the labor involved in circuit solutions by manual calculation. The circuit developed for the basic accumulator, valve

  20. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5... for contracts not to exceed the simplified acquisition threshold. At each occurrence, CO approval...-engineer contracts not expected to exceed the simplified acquisition threshold. ...

  1. Design of RISC Processor Using VHDL and Cadence

    NASA Astrophysics Data System (ADS)

    Moslehpour, Saeid; Puliroju, Chandrasekhar; Abu-Aisheh, Akram

The project deals with the development of a basic RISC processor. The processor is designed with a basic architecture consisting of internal modules: a clock generator, memory, program counter, instruction register, accumulator, arithmetic and logic unit, and decoder. This processor is mainly intended for simple general-purpose tasks such as arithmetic operations, and it can be developed into a general-purpose processor by increasing the size of the instruction register. The processor is designed in VHDL using Xilinx 8.1i. The present project also serves as an application of knowledge gained from past studies of the PSPICE program. The study shows how PSPICE can be used to simplify massive, complex circuits designed in VHDL synthesis. The purpose of the project is to explore the designed RISC model piece by piece, examine and understand the Input/Output pins, and show how the VHDL synthesis code can be converted to a simplified PSPICE model. The project also serves as a collection of various research materials about the pieces of the circuit.
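A hypothetical software analogue of the datapath described above can clarify how the modules cooperate: a program counter, an "instruction register" (the fetched pair), an accumulator, and a tiny ALU/decoder. The instruction set below is invented for illustration and is not the project's actual ISA.

```python
# Toy accumulator machine: fetch, decode, and execute a (opcode, operand) list.
def run(program, memory):
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]            # fetch into the "instruction register"
        pc += 1                          # program counter advances
        if op == "LOAD":                 # decoder dispatches to the ALU/memory
            acc = memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "ADD":
            acc += memory[arg]
        elif op == "SUB":
            acc -= memory[arg]
        elif op == "JNZ":                # conditional jump on non-zero accumulator
            pc = arg if acc != 0 else pc
        elif op == "HALT":
            break
    return acc, memory

# Sum memory[0] + memory[1] into memory[2].
prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
acc, mem = run(prog, {0: 7, 1: 5, 2: 0})
```

Widening the instruction word, as the abstract suggests, corresponds here to adding opcodes and addressing modes to the decoder.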

  2. 29 CFR 548.100 - Introductory statement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... simplify bookkeeping and computation of overtime pay. 1 The regular rate is the average hourly earnings of... AUTHORIZATION OF ESTABLISHED BASIC RATES FOR COMPUTING OVERTIME PAY Interpretations Introduction § 548.100... requirements of computing overtime pay at the regular rate, 1 and to allow, under specific conditions, the use...
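The "regular rate" computation the regulation refers to can be illustrated as follows, with hypothetical numbers: the regular rate is the workweek's average hourly earnings, and hours over the overtime threshold earn a 50% premium on top of straight-time pay.

```python
# Illustrative overtime computation at the "regular rate" (average hourly
# earnings for the workweek); all figures are hypothetical.
def overtime_pay(total_straight_earnings, hours_worked, ot_threshold=40):
    regular_rate = total_straight_earnings / hours_worked
    ot_hours = max(0, hours_worked - ot_threshold)
    # straight-time earnings already cover all hours worked;
    # overtime adds a half-rate premium on the overtime hours
    premium = 0.5 * regular_rate * ot_hours
    return total_straight_earnings + premium

pay = overtime_pay(total_straight_earnings=450.0, hours_worked=45)
# regular rate 10.00/h, 5 OT hours -> 25.00 premium -> 475.00 total
```

The "established basic rate" provisions of this part allow an agreed rate to stand in for this per-week average under specific conditions, precisely to simplify the bookkeeping above.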

  3. An efficient and rapid influenza gene cloning strategy for reverse genetics system.

    PubMed

    Shao, Hongxia; Fan, Zhonglei; Wan, Zhimin; Tian, Xiaoyan; Chen, Hongjun; Perez, Daniel R; Qin, Aijian; Ye, Jianqiang

    2015-09-15

Influenza reverse genetics plays a vital role in understanding influenza molecular characteristics and in vaccine development. However, current influenza reverse genetics depends heavily on restriction enzymes and ligation for gene cloning, and the traditional process of cloning the eight influenza fragments for virus rescue generally requires considerable work. To simplify and accelerate gene cloning for the influenza reverse genetics system, we developed a rapid, restriction enzyme-free, Exnase™ II-based in vitro recombination approach for influenza gene cloning. Using this strategy, we rapidly and successfully cloned all eight influenza genes from both the PR8 and H9N2 viruses for virus rescue. Our data demonstrate that the strategy developed here can accelerate the cloning of influenza genes into a reverse genetics system and shows high potential for applications in both basic and applied influenza research. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Computational study on UV curing characteristics in nanoimprint lithography: Stochastic simulation

    NASA Astrophysics Data System (ADS)

    Koyama, Masanori; Shirai, Masamitsu; Kawata, Hiroaki; Hirai, Yoshihiko; Yasuda, Masaaki

    2017-06-01

A computational simulation model of UV curing in nanoimprint lithography based on a simplified stochastic approach is proposed. The activated unit reacts with a randomly selected monomer within a critical reaction radius, and cluster units are chained to each other. Then another monomer is activated and the next chain reaction occurs. This process is repeated until no virgin monomer remains within the reaction radius or until the activated monomers react with each other. The simulation model describes well the basic UV curing characteristics, such as the molecular weight distributions of the reacted monomers and the effect of initiator concentration on the conversion ratio. The effects of film thickness on UV curing characteristics are also studied by the simulation.
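The chain-growth procedure described above can be sketched as a toy 2-D simulation; the geometry, reaction radius, and monomer count below are illustrative, not the paper's parameters.

```python
import random, math

# Toy stochastic chain growth: an activated unit reacts with a randomly chosen
# virgin monomer inside a critical reaction radius; the chain propagates until
# no virgin monomer remains within reach. Repeating this partitions all
# monomers into chains, giving a crude "molecular weight" distribution.
def cure(n_monomers=300, radius=0.12, seed=1):
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n_monomers)]
    virgin = set(range(n_monomers))
    chains = []
    while virgin:
        active = virgin.pop()                # activate a fresh monomer
        chain = [active]
        while True:
            x, y = pos[chain[-1]]
            near = [i for i in virgin
                    if math.hypot(pos[i][0] - x, pos[i][1] - y) < radius]
            if not near:                     # chain terminates: nothing in reach
                break
            nxt = rng.choice(near)           # random partner within the radius
            virgin.discard(nxt)
            chain.append(nxt)                # propagate the chain
        chains.append(chain)
    return chains

chains = cure()
sizes = sorted(len(c) for c in chains)       # chain-length ("weight") distribution
```

Varying the radius or monomer density in such a sketch changes the chain-length distribution, loosely analogous to the initiator-concentration effect the paper studies.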

  5. The practical and pedagogical advantages of an ambigraphic nucleic acid notation.

    PubMed

    Rozak, David A

    2006-01-01

The universally applied IUPAC notation for nucleic acids was adopted primarily to facilitate the mental association of G, A, T, C, and the related ambiguity characters with the bases they represent. However, it is possible to create a notation that offers greater support for the basic manipulations and analyses to which genetic sequences are frequently subjected. By designing a nucleic acid notation around ambigrams, it is possible to simplify the frequently applied process of reverse complementation and aid the visualization of palindromes. The ambigraphic notation presented here also uses common orthographic features such as stems and loops to highlight guanine- and cytosine-rich regions, support the derivation of ambiguity characters, and aid educators in teaching the fundamentals of molecular genetics.
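For comparison, here is the conventional reverse-complement operation that an ambigraphic notation reduces to a 180° rotation of the printed sequence:

```python
# Standard reverse complement in IUPAC notation: complement each base
# (G<->C, A<->T), then reverse the string.
COMPLEMENT = str.maketrans("GATC", "CTAG")

def reverse_complement(seq):
    return seq.translate(COMPLEMENT)[::-1]

rc = reverse_complement("GATTACA")            # -> "TGTAATC"
# A palindrome reads the same as its reverse complement (e.g. the EcoRI site):
is_palindrome = reverse_complement("GAATTC") == "GAATTC"
```

In an ambigraphic notation each glyph becomes its complement when rotated 180°, so both steps above collapse into simply turning the page upside down.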

  6. Study of photon emission by electron capture during solar nuclei acceleration, 1: Temperature-dependent cross section for charge changing processes

    NASA Technical Reports Server (NTRS)

    Perez-Peraza, J.; Alvarez, M.; Laville, A.; Gallegos, A.

    1985-01-01

The study of the charge-changing cross sections of fast ions colliding with matter provides the fundamental basis for analyzing the charge states produced in such interactions. Given the high degree of complexity of the phenomena, there is no theoretical treatment able to give a comprehensive description. In fact, the processes involved depend strongly on the basic parameters of the projectile, such as velocity, charge state, and atomic number, and on the target parameters: its physical state (molecular, atomic, or ionized matter) and density. The target velocity may also influence the process, through the temperature of the traversed medium. In addition, multiple electron transfer in single collisions further complicates the phenomena. In simplified cases, such as protons moving through atomic hydrogen, considerable agreement has been obtained between theory and experiment. In general, however, the available theoretical approaches are valid only in restricted regions of the basic parameters. Since most measurements of charge-changing cross sections are performed in atomic matter at ambient temperature, models are commonly based on the assumption of targets at rest; at astrophysical scales, however, temperature spans a wide range in atomic and ionized matter. Therefore, given the lack of experimental data, an attempt is made here to quantify temperature-dependent cross sections on the basis of somewhat arbitrary, but physically reasonable, assumptions.

  7. Understanding the complexity of redesigning care around the clinical microsystem.

    PubMed

    Barach, P; Johnson, J K

    2006-12-01

The microsystem is an organizing design construct in which social systems cut across traditional discipline boundaries. Because of its interdisciplinary focus, the clinical microsystem provides a conceptual and practical framework for simplifying the complex organizations that deliver care. It also provides an important opportunity for organizational learning. Process mapping and microworld simulation may be especially useful for redesigning care around the microsystem concept. Process mapping, in which the core processes of the microsystem are delineated and assessed from the perspective of how the individual interacts with the system, is an important element of the continuous learning cycle of the microsystem and the healthcare organization. Microworld simulations are interactive, computer-based models that can be used as an experimental platform to test basic questions about decision-making misperceptions, cause-effect inferences, and learning within the clinical microsystem. Together these tools offer the user and the organization the ability to understand the complexity of healthcare systems and to facilitate the redesign of optimal outcomes.

  8. Shape design of an optimal comfortable pillow based on the analytic hierarchy process method

    PubMed Central

    Liu, Shuo-Fang; Lee, Yann-Long; Liang, Jung-Chin

    2011-01-01

Objective Few studies have analyzed the shapes of pillows. The purpose of this study was to investigate the relationship between pillow shape design and subjective comfort level for asymptomatic subjects. Methods Four basic pillow design factors were selected on the basis of a literature review and recombined into 8 configurations, which were tested by ranking their degrees of comfort. The data were analyzed by the analytic hierarchy process method to determine the most comfortable pillow. Results Pillow number 4 was the most comfortable pillow in terms of head, neck, shoulder, height, and overall comfort. Its design combined features of standard, cervical, and shoulder pillows. A prototype of this pillow was developed on the basis of the study results for designing future pillow shapes. Conclusions This study investigated the comfort levels reported by particular users and the redesign features of a pillow. A deconstruction analysis would simplify the process of determining the most comfortable pillow design and aid designers in designing pillows for groups. PMID:22654680
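The analytic hierarchy process step can be illustrated with the common geometric-mean approximation for deriving priority weights from a pairwise-comparison matrix; the matrix values below are hypothetical, not the study's data.

```python
# AHP priority weights via the geometric-mean (logarithmic least squares)
# approximation: take each row's geometric mean, then normalize to sum to 1.
def ahp_weights(M):
    n = len(M)
    gmeans = []
    for row in M:
        p = 1.0
        for x in row:
            p *= x
        gmeans.append(p ** (1.0 / n))
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical pairwise comparisons among three comfort criteria
# (M[i][j] = how much more important criterion i is than criterion j).
M = [[1,     3,   5],
     [1/3,   1,   2],
     [1/5, 1/2,   1]]
w = ahp_weights(M)   # descending weights: criterion 0 dominates
```

A full AHP analysis would also check the consistency ratio of M before trusting the weights; that step is omitted here for brevity.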

  9. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    ERIC Educational Resources Information Center

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
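The Bayesian side of such a comparison can be sketched as a grid-based posterior over a coin's bias under a uniform prior; this is a generic ideal-observer computation, not the authors' exact model.

```python
# Discrete-grid Bayesian update for a coin's bias p after observing
# a number of heads and tails, starting from a uniform prior.
def posterior(heads, tails, grid_size=101):
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    unnorm = [p**heads * (1 - p)**tails for p in grid]   # prior x likelihood
    z = sum(unnorm)                                      # normalizing constant
    return grid, [u / z for u in unnorm]

grid, post = posterior(heads=7, tails=3)
mode = grid[post.index(max(post))]   # posterior mode = 7/10 under uniform prior
```

Human judgments in such experiments are then compared against quantities read off this posterior (its mode, mean, or predictive probabilities).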

  10. The Keynesian Diagram: A Cross to Bear?

    ERIC Educational Resources Information Center

    Fleck, Juergen

    In elementary economics courses students are often introduced to the basic concepts of macroeconomics through very simplified static models, and the concept of a macroeconomic equilibrium is generally explained with the help of an aggregate demand/aggregate supply (AD/AS) model and an income/expenditure model (via the Keynesian cross diagram).…
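The Keynesian-cross equilibrium in the simplest static model can be computed directly: with planned expenditure E = C0 + cY + I, equilibrium requires E = Y, giving Y* = (C0 + I)/(1 − c). The numbers below are illustrative.

```python
# Equilibrium income in the simplest Keynesian-cross model:
# Y = C0 + c*Y + I  =>  Y* = (C0 + I) / (1 - c), where c is the
# marginal propensity to consume and 1/(1-c) is the multiplier.
def equilibrium_income(C0, c, I):
    return (C0 + I) / (1 - c)

Y_star = equilibrium_income(C0=50, c=0.8, I=30)   # multiplier = 5
```

On the usual diagram, Y* is where the expenditure line C0 + cY + I crosses the 45-degree line E = Y.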

  11. Permanent Disability Evaluation

    PubMed Central

    Chovil, A. C.

    1975-01-01

    This paper is a review of the theory and practice of disability evaluation with emphasis on the distinction between medical impairment and disability. The requirements for making an accurate assessment of medical impairments are discussed. The author suggests three basic standards which can be used for establishing a simplified method of assessing physical impairment. PMID:20469213

  12. Simplified bridge load rating methodology using the national bridge inventory file : user manual

    DOT National Transportation Integrated Search

    1988-08-01

    The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...

  13. Simplified bridge load rating methodology using the national bridge inventory file : program listing

    DOT National Transportation Integrated Search

    1987-08-01

    The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...

  14. An Integration and Evaluation Framework for ESPC Coupled Models

    DTIC Science & Technology

    2014-09-30

...the CESM-HYCOM coupled system under the OI for ESPC award. This should be simplified by the use of the MCT datatype in ESMF. Make it available to... ESPC Testbed: basic optimization. Implement the MCT datatype in ESMF and include it in an ESMF release. This was not yet started.

  15. A Comprehensive Real-World Distillation Experiment

    ERIC Educational Resources Information Center

    Kazameas, Christos G.; Keller, Kaitlin N.; Luyben, William L.

    2015-01-01

    Most undergraduate mass transfer and separation courses cover the design of distillation columns, and many undergraduate laboratories have distillation experiments. In many cases, the treatment is restricted to simple column configurations and simplifying assumptions are made so as to convey only the basic concepts. In industry, the analysis of a…

  16. PKSolver: An add-in program for pharmacokinetic and pharmacodynamic data analysis in Microsoft Excel.

    PubMed

    Zhang, Yong; Huo, Meirong; Zhou, Jianping; Xie, Shaofei

    2010-09-01

This study presents PKSolver, a freely available, menu-driven add-in program for Microsoft Excel, written in Visual Basic for Applications (VBA), for solving basic problems in pharmacokinetic (PK) and pharmacodynamic (PD) data analysis. The program provides a range of modules for PK and PD analysis, including noncompartmental analysis (NCA), compartmental analysis (CA), and pharmacodynamic modeling. Two special built-in modules, multiple absorption sites (MAS) and enterohepatic circulation (EHC), were developed for fitting double-peak concentration-time profiles based on the classical one-compartment model. In addition, twenty frequently used pharmacokinetic functions were encoded as macros and can be accessed directly in an Excel spreadsheet. To evaluate the program, a detailed comparison of PK modeling using PKSolver and the professional PK/PD software packages WinNonlin and Scientist was performed. The results showed that the parameters estimated with PKSolver were satisfactory. In conclusion, PKSolver simplifies the PK and PD data analysis process, and its output can be generated in Microsoft Word in the form of an integrated report. The program provides pharmacokinetic researchers with a fast and easy-to-use tool for routine and basic PK and PD data analysis with a user-friendly interface. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
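Two core noncompartmental calculations such a tool performs can be sketched as follows: area under the concentration-time curve (AUC) by the linear trapezoidal rule, and terminal half-life from a log-linear fit to the last sampling points. The data below are hypothetical.

```python
import math

# AUC by the linear trapezoidal rule over sampling times t and concentrations c.
def auc_trapezoid(t, c):
    return sum((t[i+1] - t[i]) * (c[i+1] + c[i]) / 2 for i in range(len(t) - 1))

# Terminal half-life: least-squares slope of ln(C) vs t over the last points,
# then t_half = ln(2) / (-slope).
def terminal_half_life(t, c, n_last=3):
    ts, ys = t[-n_last:], [math.log(x) for x in c[-n_last:]]
    tbar, ybar = sum(ts) / n_last, sum(ys) / n_last
    slope = (sum((a - tbar) * (b - ybar) for a, b in zip(ts, ys))
             / sum((a - tbar) ** 2 for a in ts))
    return math.log(2) / -slope

t = [0, 1, 2, 4, 8, 12]                   # h (hypothetical sampling times)
c = [0.0, 8.0, 6.0, 3.0, 0.75, 0.1875]    # mg/L (hypothetical concentrations)
auc = auc_trapezoid(t, c)                 # partial AUC to the last sample
t_half = terminal_half_life(t, c)
```

A full NCA would also extrapolate AUC to infinity using Clast and the terminal rate constant; that step is omitted here.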

  17. Mathematical analysis of frontal affinity chromatography in particle and membrane configurations.

    PubMed

    Tejeda-Mansir, A; Montesinos, R M; Guzmán, R

    2001-10-30

The scaleup and optimization of large-scale affinity-chromatographic operations in the recovery, separation, and purification of biochemical components is of major industrial importance. The development of mathematical models to describe affinity-chromatographic processes, and the use of these models in computer programs to predict column performance, is an engineering approach that can help accomplish these bioprocess engineering tasks successfully. Most affinity-chromatographic separations are operated in the frontal mode, using fixed-bed columns. Purely diffusive particles, perfusion particles, and membrane-based affinity chromatography are among the main commercially available technologies for these separations. For a particular application, a basic understanding of the main similarities and differences between particle and membrane frontal affinity chromatography, and of how these characteristics are reflected in the transport models, is of fundamental relevance. This review presents the basic theoretical considerations used in the development of particle and membrane affinity chromatography models that can be applied in the design and operation of large-scale affinity separations in fixed-bed columns. A transport model for column affinity chromatography that considers column dispersion, particle internal convection, external film resistance, finite kinetic rate, plus macropore and micropore resistances is analyzed as a framework for exploring the mathematical analysis further. Such models provide a generally realistic description of almost all practical systems. Specific mathematical models that take into account geometric considerations and transport effects have been developed for both particle and membrane affinity chromatography systems. Some of the most common simplified models, based on linear driving-force (LDF) and equilibrium assumptions, are emphasized. Analytical solutions of the corresponding simplified dimensionless affinity models are presented. Particular methods for estimating the parameters that characterize the mass-transfer and adsorption mechanisms in affinity systems are described.

  18. Recent developments in high speed lens design at the NPRL

    NASA Astrophysics Data System (ADS)

    McDowell, M. W.; Klee, H. W.

An account is given of recent South African developments in large-aperture lens design for high-speed photography that are based on the novel zero-power corrector concept. Complex multiple-element lens configurations based on such conventional optical layouts as the Petzval and double-Gauss can, by the means presented, be replaced with greatly simplified lens configurations employing as few as four basic elements. The third-order monochromatic and first-order chromatic aberrations of the basic four-element zero-power corrector design are tabulated.

  19. The Muscle Sensor for on-site neuroscience lectures to pave the way for a better understanding of brain-machine-interface research.

    PubMed

    Koizumi, Amane; Nagata, Osamu; Togawa, Morio; Sazi, Toshiyuki

    2014-01-01

Neuroscience is an expanding field of science investigating enigmas of brain and human body function. However, the majority of the public have never had the chance to learn the basics of neuroscience, or the new knowledge emerging from advanced neuroscience research, through hands-on experience. Here, we report that we produced the Muscle Sensor, a simplified electromyograph, to promote educational understanding of neuroscience. The Muscle Sensor detects myoelectric potentials, which are filtered and processed as 3-V pulse signals to light a bulb and emit beep sounds. With this educational tool, we delivered "On-Site Neuroscience Lectures" in Japanese junior high schools to give students hands-on experience of neuroscientific electrophysiology and to connect their textbook knowledge to advanced neuroscience research. On-site neuroscience lectures with the Muscle Sensor pave the way for a better understanding of the basics of neuroscience and the latest topics, such as how brain-machine-interface technology could help patients with disabilities such as spinal cord injuries. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  20. A simple and inexpensive pulsing device for data-recording cameras

    Treesearch

    David L. Sonderman

    1973-01-01

    In some areas of forestry and wood utilization research, use of automatic data recording equipment has become commonplace. This research note describes the basic electronic components needed to modify an existing intervalometer into a simplified pulsing device for controlling an automatic data recording camera. The pulsing device is easily assembled and inexpensive,...

  1. SigrafW: An Easy-to-Use Program for Fitting Enzyme Kinetic Data

    ERIC Educational Resources Information Center

    Leone, Francisco Assis; Baranauskas, Jose Augusto; Furriel, Rosa Prazeres Melo; Borin, Ivana Aparecida

    2005-01-01

    SigrafW is Windows-compatible software developed using the Microsoft[R] Visual Basic Studio program that uses the simplified Hill equation for fitting kinetic data from allosteric and Michaelian enzymes. SigrafW uses a modified Fibonacci search to calculate maximal velocity (V), the Hill coefficient (n), and the enzyme-substrate apparent…
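
    The fitting task SigrafW performs can be sketched as follows. The program itself uses a modified Fibonacci search; the sketch below substitutes a plain grid search over the Hill coefficient n, with a closed-form least-squares solve for V at each n, purely for clarity. It also assumes the half-saturation constant K is known, and all names are illustrative.

```python
def hill(s, V, K, n):
    """Simplified Hill equation: v = V * s**n / (K**n + s**n)."""
    return V * s**n / (K**n + s**n)

def fit_hill(data, K):
    """Least-squares fit of V and n to (substrate, velocity) pairs.
    Grid search over n stands in for the modified Fibonacci search used
    by the published program."""
    best = (float("inf"), None, None)
    for n in [i / 20 for i in range(10, 81)]:        # n from 0.5 to 4.0
        # For fixed n the model is linear in V, so V has a closed form.
        f = [s**n / (K**n + s**n) for s, _ in data]
        v = [y for _, y in data]
        V = sum(fi * yi for fi, yi in zip(f, v)) / sum(fi * fi for fi in f)
        sse = sum((yi - V * fi) ** 2 for fi, yi in zip(f, v))
        if sse < best[0]:
            best = (sse, V, n)
    return best[1], best[2]
```

    With n fixed at 1 the same code fits a Michaelian enzyme, which is why a single equation serves both allosteric and Michaelian data.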

  2. Catmull-Rom Curve Fitting and Interpolation Equations

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2010-01-01

    Computer graphics and animation experts have been using the Catmull-Rom smooth curve interpolation equations since 1974, but the vector and matrix equations can be derived and simplified using basic algebra, resulting in a simple set of linear equations with constant coefficients. A variety of uses of Catmull-Rom interpolation are demonstrated,…
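
    The simplified constant-coefficient form mentioned above can be written directly as code. For the uniform Catmull-Rom segment between control points P1 and P2, basic algebra reduces the matrix form to a cubic polynomial in t (applied per coordinate; scalars are used here for brevity):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate the uniform Catmull-Rom segment between p1 and p2 at
    t in [0, 1].  The constant coefficients come from expanding the
    standard 1974 matrix form."""
    t2, t3 = t * t, t * t * t
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t3)
```

    The defining interpolation property is visible immediately: t = 0 returns P1 and t = 1 returns P2, so chained segments pass through every control point.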

  3. Sums and Products of Jointly Distributed Random Variables: A Simplified Approach

    ERIC Educational Resources Information Center

    Stein, Sheldon H.

    2005-01-01

    Three basic theorems concerning expected values and variances of sums and products of random variables play an important role in mathematical statistics and its applications in education, business, the social sciences, and the natural sciences. A solid understanding of these theorems requires that students be familiar with the proofs of these…
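
    The three theorems alluded to, linearity of expectation, additivity of variance for independent variables, and the product rule for independent expectations, can be checked by exhaustive enumeration. Two independent fair dice are used here as an illustrative choice, not an example from the article:

```python
from itertools import product
from statistics import mean, pvariance

# Joint distribution of two independent fair dice, enumerated exhaustively.
die = range(1, 7)
sums = [x + y for x, y in product(die, die)]
prods = [x * y for x, y in product(die, die)]

E, Var = mean(die), pvariance(die)          # E[X] = 3.5, Var(X) = 35/12

print("E[X+Y]   =", mean(sums), "   2*E[X]   =", 2 * E)         # linearity
print("Var(X+Y) =", pvariance(sums), "   2*Var(X) =", 2 * Var)  # independence
print("E[XY]    =", mean(prods), "   E[X]^2   =", E * E)        # independence
```

    Note that E[X+Y] = E[X] + E[Y] needs no independence at all, while the variance and product identities do; enumeration over a dependent pair (e.g. Y = X) breaks the last two but not the first.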

  4. A Rapid PCR-RFLP Method for Monitoring Genetic Variation among Commercial Mushroom Species

    ERIC Educational Resources Information Center

    Martin, Presley; Muruke, Masoud; Hosea, Kenneth; Kivaisi, Amelia; Zerwas, Nick; Bauerle, Cynthia

    2004-01-01

    We report the development of a simplified procedure for restriction fragment length polymorphism (RFLP) analysis of mushrooms. We have adapted standard molecular techniques to be amenable to an undergraduate laboratory setting in order to allow students to explore basic questions about fungal diversity and relatedness among mushroom species. The…

  5. Departures from sensible play in computer blackjack.

    PubMed

    Chau, A W; Phillips, J G; Von Baggo, K L

    2000-10-01

    Gambling has been viewed as irrational, and even though blackjack offers rational strategies (i.e., Basic [E. Thorp, 1966] and card counting), people exhibit departures from rationality (e.g., "Never Bust" strategies). To determine whether departures from rational behavior reflect ignorance or fatigue, university students were provided with on-line Basic advice while playing a simplified computer blackjack. Although the on-line advice initially affected the totals these players sat on, it was eventually discarded for higher risk strategies. Irrational play did not reflect ignorance or fatigue and was not necessarily conservative. Real fluctuations of odds in blackjack may lead to situations in which Basic is not perceived by players as effective. Because Basic is not a personalized strategy, it seems less likely to be maintained in the face of losses. Players were more optimistic that they might win when utilizing their personalized strategies.

  6. A Simplified Finite Element Simulation for Straightening Process of Thin-Walled Tube

    NASA Astrophysics Data System (ADS)

    Zhang, Ziqian; Yang, Huilin

    2017-12-01

    Finite element simulation is an effective way to study thin-walled tube behavior in the two-cross-roll straightening process. To determine the accurate radius of curvature of the roll profile more efficiently, a simplified finite element model, based on the technical parameters of an actual two-cross-roll straightening machine, was developed to simulate the complex straightening process. A dynamic simulation was then carried out using the ANSYS LS-DYNA program. The results implied that the simplified finite element model is adequate for simulating the two-cross-roll straightening process, and that the radius of curvature of the roll profile can be obtained for a tube straightness of 2 mm/m.

  7. Simplified power processing for ion-thruster subsystems

    NASA Technical Reports Server (NTRS)

    Wessel, F. J.; Hancock, D. J.

    1983-01-01

    A design for a greatly simplified power-processing unit (SPPU) for the 8-cm diameter mercury-ion-thruster subsystem is discussed. This SPPU design will provide a tenfold reduction in parts count, a decrease in system mass and cost, and an increase in system reliability compared to the existing power-processing unit (PPU) used in the Hughes/NASA Lewis Research Center Ion Auxiliary Propulsion Subsystem. The simplifications achieved in this design will greatly increase the attractiveness of ion propulsion in near-term and future spacecraft propulsion applications. A description of a typical ion-thruster subsystem, an overview of the thruster/power-processor interface requirements, and a discussion of simplified thruster power processing are given.

  8. Further analytical study of hybrid rocket combustion

    NASA Technical Reports Server (NTRS)

    Hung, W. S. Y.; Chen, C. S.; Haviland, J. K.

    1972-01-01

    Analytical studies of the transient and steady-state combustion processes in a hybrid rocket system are discussed. The particular system chosen consists of a gaseous oxidizer flowing within a tube of solid fuel, resulting in a heterogeneous combustion. Finite rate chemical kinetics with appropriate reaction mechanisms were incorporated in the model. A temperature dependent Arrhenius type fuel surface regression rate equation was chosen for the current study. The governing mathematical equations employed for the reacting gas phase and for the solid phase are the general, two-dimensional, time-dependent conservation equations in a cylindrical coordinate system. Keeping the simplifying assumptions to a minimum, these basic equations were programmed for numerical computation, using two implicit finite-difference schemes, the Lax-Wendroff scheme for the gas phase, and, the Crank-Nicolson scheme for the solid phase.
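
    Of the two implicit schemes named, the Crank-Nicolson scheme is easy to illustrate on a model problem. The sketch below applies it to the 1D heat equation u_t = alpha * u_xx with zero boundary values, solving the constant-coefficient tridiagonal system with the Thomas algorithm. It is a minimal sketch of the scheme itself, not the paper's coupled gas/solid-phase solver.

```python
def crank_nicolson_step(u, alpha, dt, dx):
    """One Crank-Nicolson step for u_t = alpha * u_xx with u = 0 at both
    ends.  Averages the explicit and implicit spatial operators, then
    solves the tridiagonal system by the Thomas algorithm."""
    r = alpha * dt / (2.0 * dx * dx)
    n = len(u) - 2                          # number of interior points
    # Right-hand side: the explicit half of the scheme.
    d = [u[i] + r * (u[i - 1] - 2.0 * u[i] + u[i + 1]) for i in range(1, n + 1)]
    a, b, c = -r, 1.0 + 2.0 * r, -r         # constant tridiagonal coefficients
    # Forward elimination sweep.
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, n):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    # Back substitution.
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return [0.0] + x + [0.0]
```

    Because the scheme is implicit, the step is unconditionally stable, which is the usual motivation for choosing it for the slowly responding solid phase.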

  9. Object extraction method for image synthesis

    NASA Astrophysics Data System (ADS)

    Inoue, Seiki

    1991-11-01

    The extraction of component objects from images is fundamentally important for image synthesis. In TV program production, one useful method is the Video-Matte technique for specifying the necessary boundary of an object. This, however, involves some manually intricate and tedious processes. A new method proposed in this paper can reduce the needed level of operator skill and simplify object extraction. The object is automatically extracted from just a simple drawing of a thick boundary line. The basic principle involves thinning the thick-boundary-line binary image using the edge intensity of the original image. This method has many practical advantages, including the simplicity of specifying an object, the high accuracy of the thinned-out boundary line, its ease of application to moving images, and the lack of any need for adjustment.

  10. Organic thin film transistor with a simplified planar structure

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Yu, Jungsheng; Zhong, Jian; Jiang, Yadong

    2009-05-01

    Organic thin film transistors (OTFTs) with a simplified planar structure are described. The gate electrode and the source/drain electrodes of the OTFT are processed in one plane: all three electrodes are deposited on a glass substrate by DC sputtering from a Cr/Ni target, and electrode layouts with different width-to-length ratios are then patterned by photolithography at the same time. Only one deposition step and one photolithography step are needed, whereas the conventional process takes at least two of each: metal is first deposited on the other side of the glass substrate and the gate electrode is formed by photolithography, then the source/drain electrodes are prepared by deposition and photolithography on the side with the insulation layer. Compared to the conventional process for OTFTs, the process in this work is simplified. After the three electrodes are prepared, the insulation layer is made by spin coating, using the organic material polyimide as the insulator. A small-molecule material, pentacene, is evaporated onto the insulation layer by vacuum deposition as the active layer, so the whole OTFT process needs only three steps. A semi-automatic probe stage is used to connect the three electrodes to the probes of the test instrument. A charge-carrier mobility of 0.3 cm^2/V·s and an on/off current ratio of 10^5 are obtained from OTFTs on glass substrates. The planar-structure OTFTs made with this simplified process reduce both process complexity and fabrication cost.

  11. A Secure and Robust Compressed Domain Video Steganography for Intra- and Inter-Frames Using Embedding-Based Byte Differencing (EBBD) Scheme

    PubMed Central

    Idbeaa, Tarik; Abdul Samad, Salina; Husain, Hafizah

    2016-01-01

    This paper presents a novel secure and robust steganographic technique in the compressed video domain, namely embedding-based byte differencing (EBBD). Unlike most current video steganographic techniques, which take into account only the intra frames for data embedding, the proposed EBBD technique aims to hide information in both intra and inter frames. The information is embedded into a compressed video by simultaneously manipulating the quantized AC coefficients (AC-QTCs) of the luminance components of the frames during the MPEG-2 encoding process. Later, during the decoding process, the embedded information can be detected and extracted completely. Furthermore, EBBD basically deals with two security concepts: data encryption and data concealing. Hence, during the embedding process, the secret data is encrypted using the simplified data encryption standard (S-DES) algorithm to provide better security to the implemented system. The security of the method lies in selecting candidate AC-QTCs within each non-overlapping 8 × 8 sub-block using a pseudo-random key. The basic performance of this steganographic technique was verified through experiments on various existing MPEG-2 encoded videos over a wide range of embedded payload rates. Overall, the experimental results verify the excellent performance of the proposed EBBD, with a better trade-off in terms of imperceptibility and payload compared with previous techniques, while ensuring minimal bitrate increase and negligible degradation of PSNR values. PMID:26963093
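
    The key-based selection step described above can be sketched as follows: a shared pseudo-random key seeds a generator that picks candidate AC-coefficient positions within an 8 × 8 sub-block, excluding the DC position. This is an illustrative sketch; the function and parameter names are not taken from the paper.

```python
import random

def candidate_positions(key, count):
    """Select `count` candidate AC-coefficient positions inside one 8x8
    sub-block using a pseudo-random key.  Both embedder and extractor
    seed from the same shared key, so they agree on the positions; the
    DC coefficient at (0, 0) is never a candidate."""
    rng = random.Random(key)        # the key acts as the shared secret seed
    ac = [(r, c) for r in range(8) for c in range(8) if (r, c) != (0, 0)]
    return rng.sample(ac, count)
```

    Because the selection is reproducible from the key alone, the decoder can locate the embedded bits without any side information beyond the shared secret.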

  12. A Secure and Robust Compressed Domain Video Steganography for Intra- and Inter-Frames Using Embedding-Based Byte Differencing (EBBD) Scheme.

    PubMed

    Idbeaa, Tarik; Abdul Samad, Salina; Husain, Hafizah

    2016-01-01

    This paper presents a novel secure and robust steganographic technique in the compressed video domain, namely embedding-based byte differencing (EBBD). Unlike most current video steganographic techniques, which take into account only the intra frames for data embedding, the proposed EBBD technique aims to hide information in both intra and inter frames. The information is embedded into a compressed video by simultaneously manipulating the quantized AC coefficients (AC-QTCs) of the luminance components of the frames during the MPEG-2 encoding process. Later, during the decoding process, the embedded information can be detected and extracted completely. Furthermore, EBBD basically deals with two security concepts: data encryption and data concealing. Hence, during the embedding process, the secret data is encrypted using the simplified data encryption standard (S-DES) algorithm to provide better security to the implemented system. The security of the method lies in selecting candidate AC-QTCs within each non-overlapping 8 × 8 sub-block using a pseudo-random key. The basic performance of this steganographic technique was verified through experiments on various existing MPEG-2 encoded videos over a wide range of embedded payload rates. Overall, the experimental results verify the excellent performance of the proposed EBBD, with a better trade-off in terms of imperceptibility and payload compared with previous techniques, while ensuring minimal bitrate increase and negligible degradation of PSNR values.

  13. Simplified methods for computing total sediment discharge with the modified Einstein procedure

    USGS Publications Warehouse

    Colby, Bruce R.; Hubbell, David Wellington

    1961-01-01

    A procedure was presented in 1950 by H. A. Einstein for computing the total discharge of sediment particles of sizes that are in appreciable quantities in the stream bed. This procedure was modified by the U.S. Geological Survey and adapted to computing the total sediment discharge of a stream on the basis of samples of bed sediment, depth-integrated samples of suspended sediment, streamflow measurements, and water temperature. This paper gives simplified methods for computing total sediment discharge by the modified Einstein procedure. Each of four nomographs appreciably simplifies a major step in the computations. Within the stated limitations, use of the nomographs introduces much less error than is present in either the basic data or the theories on which the computations of total sediment discharge are based. The results are nearly as accurate mathematically as those that could be obtained from the longer and more complex arithmetic and algebraic computations of the Einstein procedure.

  14. PHONICS WITH CONTEXT CLUES, PRIMARY LEVEL.

    ERIC Educational Resources Information Center

    GUFFEY, MARY DEMAREE

    A SIMPLIFIED METHOD OF PHONICS UTILIZING THE GESTALT METHOD OF LEARNING IS PRESENTED. THE WORDS IN THIS COURSE IN PHONICS ARE TO BE TAUGHT AT A TIME DIFFERENT FROM THE READING CLASSES, BUT THE PRINCIPLES DEVELOPED ARE TO BE APPLIED DURING THE READING CLASSES. THE COURSE CAN BE USED WITH ANY BASIC TEXT AND STRESSES THE ABILITY OF CHILDREN TO…

  15. Simplified Guidelines to Hardwood Lumber Grading

    Treesearch

    Walton R. Smith

    1967-01-01

    All native hardwood lumber is graded according to the rules established by the National Hardwood Lumber Association. The rules are complete and detailed so that they permit accurate lumber grading with a minimum of personal judgment. To the student lumber grader, the many fine points and exceptions by species are often confusing and hide the basic rules of standard...

  16. Every Body Needs First Aid... A Manual for LEP Students [Draft] and Teacher Guide.

    ERIC Educational Resources Information Center

    Adachi, Patricia

    The manual contains both student and teacher materials for instruction in basic physiology and first aid. The instructional materials were developed for use with limited-English-proficient high school students, but are suitable for high school first aid classes because of their simplified English and format, sequential organization, detailed table…

  17. Concept and analytical basis for revistas - A fast, flexible computer/graphic system for generating periodic satellite coverage patterns

    NASA Technical Reports Server (NTRS)

    King, J. C.

    1976-01-01

    The generation of satellite coverage patterns is facilitated by three basic strategies: use of a simplified physical model, permitting rapid closed-form calculation; separation of earth rotation and nodal precession from initial geometric analyses; and use of symmetries to construct traces of indefinite length by repetitive transposition of basic one-quadrant elements. The complete coverage patterns generated consist of a basic nadir trace plus a number of associated off-nadir traces, one for each sensor swath edge to be delineated. Each trace is generated by transposing one or two of the basic quadrant elements into a circle on a nonrotating earth model sphere, after which the circle is expanded into the actual 'helical' pattern by adding rotational displacements to the longitude coordinates. The procedure adapts to the important periodic coverage cases by direct insertion of the characteristic integers N and R (days and orbital revolutions, respectively, per coverage period).
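
    The strategy of computing the trace on a nonrotating sphere and then adding rotational displacements to the longitudes can be sketched directly. The sketch assumes a circular orbit and a spherical, uniformly rotating Earth, consistent with the simplified physical model described above; the function name and numeric constants are illustrative.

```python
import math

def nadir_trace(inclination_deg, period_min, t_min):
    """Nadir latitude/longitude (degrees) at time t for a circular orbit.
    The trace is first computed on a nonrotating sphere, then the
    Earth-rotation displacement is subtracted from the longitude."""
    i = math.radians(inclination_deg)
    u = 2.0 * math.pi * t_min / period_min          # argument of latitude
    lat = math.asin(math.sin(i) * math.sin(u))      # trace on the sphere
    lon_inertial = math.atan2(math.cos(i) * math.sin(u), math.cos(u))
    earth_rot = 2.0 * math.pi * t_min / 1436.0      # ~sidereal day, minutes
    return math.degrees(lat), math.degrees(lon_inertial - earth_rot)
```

    For a repeating coverage case with integers N and R, evaluating this trace over N days (R revolutions) closes the pattern on itself, which is what makes the repetitive-transposition construction possible.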

  18. Mathematical model for HIV spreads control program with ART treatment

    NASA Astrophysics Data System (ADS)

    Maimunah; Aldila, Dipo

    2018-03-01

    In this article, using a deterministic approach in a seven-dimensional nonlinear ordinary differential equation, we establish a mathematical model for the spread of HIV with an ART treatment intervention. In a simplified model, when no ART treatment is implemented, the disease-free and endemic equilibrium points were established analytically, along with the basic reproduction number. The local stability criteria of the disease-free equilibrium and the existence criteria of the endemic equilibrium were analyzed. We find that the endemic equilibrium exists when the basic reproduction number is larger than one. From the sensitivity analysis of the basic reproduction number of the complete model (with ART treatment), we find that an increased number of infected humans who follow the ART treatment program will reduce the basic reproduction number. We also simulate this result in a numerical experiment on the autonomous system to show how the treatment intervention reduces the infected population during the intervention time period.
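
    The qualitative finding, that more infected individuals on ART lowers the basic reproduction number, can be illustrated with a deliberately minimal one-equation toy rather than the paper's seven-dimensional model. Here treated individuals are assumed to transmit at a reduced rate; every parameter name and value below is hypothetical.

```python
def basic_reproduction_number(beta, gamma, treated_fraction, efficacy):
    """R0 for a minimal transmission sketch in which a fraction of the
    infected population on ART transmits at a rate reduced by `efficacy`.
    Illustrative toy only, not the paper's seven-compartment model."""
    effective_beta = beta * (1.0 - treated_fraction * efficacy)
    return effective_beta / gamma       # R0 = effective transmission / removal
```

    In this toy, raising `treated_fraction` lowers R0 monotonically, mirroring the sensitivity-analysis conclusion that expanding ART coverage pushes the system toward the disease-free regime (R0 below one).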

  19. Use of Structure as a Basis for Abstraction in Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Davison, Hayley J.; Hansman, R. John

    2004-01-01

    The safety and efficiency of the air traffic control domain is highly dependent on the capabilities and limitations of its human controllers. Past research has indicated that structure provided by the airspace and procedures could aid in simplifying the controllers' cognitive tasks. In this paper, observations, interviews, voice command data analyses, and radar analyses were conducted at the Boston Terminal Radar Approach Control (TRACON) facility to determine if there was evidence of controllers using structure to simplify their cognitive processes. The data suggest that controllers do use structure-based abstractions to simplify their cognitive processes, particularly the projection task. How structure simplifies the projection task, and the implications of understanding the benefits structure provides to it, are discussed.

  20. Combining cationic and anionic mixed-mode sorbents in a single cartridge to extract basic and acidic pharmaceuticals simultaneously from environmental waters.

    PubMed

    Salas, Daniela; Borrull, Francesc; Fontanals, Núria; Marcé, Rosa Maria

    2018-01-01

    The aim of the present study is to broaden the applications of mixed-mode ion-exchange solid-phase extraction sorbents to extract both basic and acidic compounds simultaneously by combining the sorbents in a single cartridge and developing a simplified extraction procedure. Four different cartridges containing negative and positive charges in the same configuration were evaluated and compared to extract a group of basic, neutral, and acidic pharmaceuticals selected as model compounds. After a thorough optimization of the extraction conditions, the four different cartridges showed to be capable of retaining basic and acidic pharmaceuticals simultaneously through ionic interactions, allowing the introduction of a washing step with 15 mL methanol to eliminate interferences retained by hydrophobic interactions. Using the best combined cartridge, a method was developed, validated, and further applied to environmental waters to demonstrate that the method is promising for the extraction of basic and acidic compounds from very complex samples.

  1. A graphically oriented specification language for automatic code generation. GRASP/Ada: A Graphical Representation of Algorithms, Structure, and Processes for Ada, phase 1

    NASA Technical Reports Server (NTRS)

    Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.

    1989-01-01

    The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs) as well as the automatic generation of Ada source code is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a) from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.

  2. Stability of the line preserving flows

    NASA Astrophysics Data System (ADS)

    Figura, Przemysław

    2017-11-01

    We examine the equations that are used to describe flows which preserve field lines. We study what happens if we introduce perturbations to the governing equations. The stability of line preserving flows in the case of magneto-fluids permeated by magnetic fields is strictly connected to non-null magnetic reconnection processes. In most of our study we use the Euler potential representation of the external magnetic field. We provide general expressions for the perturbations of the Euler potentials that describe the magnetic field. Similarly, we provide expressions for the case of steady flow, and we obtain certain conditions required for the stability of the flow. In addition, for steady flows we formulate conditions under which the perturbations of the external field are negligible and the field may be described by its initial unperturbed form. Then we consider the flow equation that transforms quantities from the laboratory coordinate system to the related external-field coordinate system. We introduce perturbations to the equation and obtain its simplified versions for the case of a steady flow. For a given system, use of this method allows us to simplify the considerations provided that some part of the system may be described as a perturbation. Next, to study regions favourable for magnetic reconnection to occur, we introduce a deviation vector into the basic line preserving flows condition equation. We provide expressions for the vector in some simplifying cases. This method allows us to examine whether given perturbations stabilise the system or induce magnetic reconnection. To illustrate some of our results we study two examples, namely a simple laboratory plasma flow and a simple planetary magnetosphere model.

  3. Space Station Freedom environmental control and life support system phase 3 simplified integrated test detailed report

    NASA Technical Reports Server (NTRS)

    Roberts, B. C.; Carrasquillo, R. L.; Dubiel, M. Y.; Ogle, K. Y.; Perry, J. L.; Whitley, K. M.

    1990-01-01

    A description of the phase 3 simplified integrated test (SIT) conducted at the Marshall Space Flight Center (MSFC) Core Module Integration Facility (CMIF) in 1989 is presented. This was the first test in the phase 3 series integrated environmental control and life support systems (ECLSS) tests. The basic goal of the SIT was to achieve full integration of the baseline air revitalization (AR) subsystems for Space Station Freedom. Included is a description of the SIT configuration, a performance analysis of each subsystem, results from air and water sampling, and a discussion of lessons learned from the test. Also included is a full description of the preprototype ECLSS hardware used in the test.

  4. A simulation study of emergency lunar escape to orbit using several simplified manual guidance and control techniques

    NASA Technical Reports Server (NTRS)

    Middleton, D. B.; Hurt, G. J., Jr.

    1971-01-01

    A fixed-base piloted simulator investigation has been made of the feasibility of using any of several manual guidance and control techniques for emergency lunar escape to orbit with very simplified, lightweight vehicle systems. The escape-to-orbit vehicles accommodate two men, but one man performs all of the guidance and control functions. Three basic attitude-control modes and four manually executed trajectory-guidance schemes were used successfully during approximately 125 simulated flights under a variety of conditions. These conditions included thrust misalinement, uneven propellant drain, and a vehicle moment-of-inertia range of 250 to 12,000 slugs per square foot. Two types of results are presented - orbit characteristics and pilot ratings of vehicle handling qualities.

  5. 76 FR 7102 - Simplified Network Application Processing System, On-line Registration and Account Maintenance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-09

    ... DEPARTMENT OF COMMERCE Bureau of Industry and Security 15 CFR Part 748 [Docket No. 100826397-1059-02] RIN 0694-AE98 Simplified Network Application Processing System, On-line Registration and Account Maintenance AGENCY: Bureau of Industry and Security, Commerce. ACTION: Final rule. SUMMARY: The Bureau of...

  6. Scalable problems and memory bounded speedup

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He; Ni, Lionel M.

    1992-01-01

    In this paper three models of parallel speedup are studied. They are fixed-size speedup, fixed-time speedup and memory-bounded speedup. The latter two consider the relationship between speedup and problem scalability. Two sets of speedup formulations are derived for these three models. One set considers uneven workload allocation and communication overhead and gives more accurate estimation. Another set considers a simplified case and provides a clear picture on the impact of the sequential portion of an application on the possible performance gain from parallel processing. The simplified fixed-size speedup is Amdahl's law. The simplified fixed-time speedup is Gustafson's scaled speedup. The simplified memory-bounded speedup contains both Amdahl's law and Gustafson's scaled speedup as special cases. This study leads to a better understanding of parallel processing.
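
    The three simplified models can be written down in a few lines. Taking g as the factor by which the parallel workload scales with the memory of p processors, the memory-bounded formulation contains the other two as special cases: g = 1 recovers Amdahl's law and g = p recovers Gustafson's scaled speedup.

```python
def amdahl(serial_frac, p):
    """Simplified fixed-size speedup (Amdahl's law) on p processors."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / p)

def gustafson(serial_frac, p):
    """Simplified fixed-time (scaled) speedup, Gustafson's law."""
    return serial_frac + p * (1.0 - serial_frac)

def memory_bounded(serial_frac, p, g):
    """Simplified memory-bounded speedup.  g is the factor by which the
    parallel workload grows with the memory of p processors; g = 1 gives
    Amdahl's law and g = p gives Gustafson's scaled speedup."""
    grown = (1.0 - serial_frac) * g
    return (serial_frac + grown) / (serial_frac + grown / p)
```

    The fixed-size limit is visible immediately: as p grows without bound, `amdahl` approaches 1/serial_frac, whereas `gustafson` keeps growing linearly because the problem scales with the machine.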

  7. PHONICS WITH CONTEXT CLUES AS APPLIED TO LANGUAGE ARTS.

    ERIC Educational Resources Information Center

    GUFFEY, MARY DEMAREE

    A SIMPLIFIED METHOD OF PHONICS UTILIZING THE GESTALT METHOD OF LEARNING IS PRESENTED. THE WORDS IN THIS COURSE IN PHONICS ARE TO BE TAUGHT AT A TIME DIFFERENT FROM THE READING CLASSES, BUT THE PRINCIPLES DEVELOPED ARE TO BE APPLIED WITHIN THE READING CLASSES. THE COURSE CAN BE USED WITH ANY BASIC TEXT AND STRESSES THE ABILITY OF CHILDREN TO…

  8. Installation/Removal Tool for Screw-Mounted Components

    NASA Technical Reports Server (NTRS)

    Ash, J. P.

    1984-01-01

    Tweezerlike tool simplifies installation of screws in places reached only through narrow openings. With changes in size and shape, basic tool concept applicable to mounting and dismounting of transformers, sockets, terminal strips and mechanical parts. Inexpensive tool fabricated as needed by bending two pieces of steel wire. Exact size and shape selected to suit part manipulated and nature of inaccessible mounting space.

  9. On Access to Knowledge in the Social Sciences and Humanities, From the Viewpoint of Cybernetics and Information Science.

    ERIC Educational Resources Information Center

    Heilprin, Laurence B.

    The literature of knowledge is a very large system in the cybernetic sense of intractability to control. Improving access to it needs some simplifying theory. A step in this direction is a hypothesis constructed from basic concepts. These include cybernetic concepts of variety and requisite variety; a version of the mathematical concept of…

  10. Playing Music, Playing with Music: A Proposal for Music Coding in Primary School

    ERIC Educational Resources Information Center

    Baratè, Adriano; Ludovico, Luca Andrea; Mangione, Giuseppina Rita; Rosa, Alessia

    2015-01-01

    In this work we will introduce the concept of "music coding," namely a new discipline that employs basic music activities and simplified languages to teach the computational way of thinking to musically-untrained children who attend primary school. In this context, music represents both a means and a goal: in fact, from one side…

  11. Model Based Document and Report Generation for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young

    2013-01-01

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  12. Interference method for obtaining the potential flow past an arbitrary cascade of airfoils

    NASA Technical Reports Server (NTRS)

    Katzoff, S; Finn, Robert S; Laurence, James C

    1947-01-01

    A procedure is presented for obtaining the pressure distribution on an arbitrary airfoil section in cascade in a two-dimensional, incompressible, and nonviscous flow. The method considers directly the influence on a given airfoil of the rest of the cascade and evaluates this interference by an iterative process, which appeared to converge rapidly in the cases tried (about unit solidity, stagger angles of 0 degree and 45 degrees). Two variations of the basic interference calculations are described. One, which is accurate enough for most purposes, involves the substitution of sources, sinks, and vortices for the interfering airfoils; the other, which may be desirable for the final approximation, involves a contour integration. The computations are simplified by the use of a chart presented by Betz in a related paper. Illustrated examples are included.

  13. Model based document and report generation for systems engineering

    NASA Astrophysics Data System (ADS)

    Delp, C.; Lam, D.; Fosse, E.; Lee, Cin-Young

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  14. Spatial spreading model and dynamics of West Nile virus in birds and mosquitoes with free boundary.

    PubMed

    Lin, Zhigui; Zhu, Huaiping

    2017-12-01

    In this paper, a reaction-diffusion system is proposed to model the spatial spreading of West Nile virus in vector mosquitoes and host birds in North America. Transmission dynamics are based on a simplified model involving mosquitoes and birds, and a free boundary is introduced to model and explore the expanding front of the infected region. The spatial-temporal risk index R0F(t), which involves regional characteristics and time, is defined for the simplified reaction-diffusion model with the free boundary and compared with other related threshold values, including the usual basic reproduction number R0. Sufficient conditions for the virus to vanish or to spread are given. Our results suggest that the virus will vanish if R0 <= 1, will spread to the whole region if R0F(t0) >= 1 for some t0 >= 0, and that if R0F(0) < 1 < R0, the spreading or vanishing of the virus depends on the initial number of infected individuals, the area of the infected region, the diffusion rate, and other factors. Moreover, some remarks on the basic reproduction numbers and the spreading speeds are presented and compared.
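As a rough illustration of the reaction-diffusion modelling described above, the sketch below advances a single Fisher-KPP-type infected compartment on a fixed 1-D grid with explicit Euler steps. The equation, coefficients, and fixed boundary are illustrative stand-ins: the paper's actual model is a coupled bird-mosquito system whose infected region grows behind a moving free boundary, which this toy omits:

```python
def step(I, dx, dt, D, r):
    """One explicit Euler step of I_t = D * I_xx + r * I * (1 - I),
    a toy stand-in for the infected compartment (D, r and the equation
    itself are illustrative, not the paper's system)."""
    n = len(I)
    new = I[:]
    for j in range(1, n - 1):
        lap = (I[j - 1] - 2 * I[j] + I[j + 1]) / dx ** 2
        new[j] = I[j] + dt * (D * lap + r * I[j] * (1 - I[j]))
    return new  # endpoints held at 0 (fixed domain, not a free boundary)

# A localized initial infection spreads outward as a travelling front.
I = [0.0] * 101
I[50] = 0.5
for _ in range(200):  # dt*D/dx**2 = 0.2 keeps the scheme stable
    I = step(I, dx=1.0, dt=0.2, D=1.0, r=0.5)
```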

  15. Essential core of the Hawking–Ellis types

    NASA Astrophysics Data System (ADS)

    Martín-Moruno, Prado; Visser, Matt

    2018-06-01

    The Hawking–Ellis (Segre–Plebański) classification of possible stress–energy tensors is an essential tool in analyzing the implications of the Einstein field equations in a more-or-less model-independent manner. In the current article the basic idea is to simplify the Hawking–Ellis type I, II, III, and IV classification by isolating the ‘essential core’ of the type II, type III, and type IV stress–energy tensors; this being done by subtracting (special cases of) type I to simplify the (Lorentz invariant) eigenvalue structure as much as possible without disturbing the eigenvector structure. We will denote these ‘simplified cores’ type II0, type III0, and type IV0. These ‘simplified cores’ have very nice and simple algebraic properties. Furthermore, types I and II0 have very simple classical interpretations, while type IV0 is known to arise semi-classically (in renormalized expectation values of standard stress–energy tensors). In contrast, type III0 stands out in that it has neither a simple classical interpretation nor even a simple semi-classical interpretation. We also consider the robustness of this classification by examining the stability of the different Hawking–Ellis types under perturbations. We argue that types II and III are definitively unstable, whereas types I and IV are stable.

  16. Simplified dichromated gelatin hologram recording process

    NASA Technical Reports Server (NTRS)

    Georgekutty, Tharayil G.; Liu, Hua-Kuang

    1987-01-01

    A simplified method for making dichromated gelatin (DCG) holographic optical elements (HOEs) has been discovered. The method is much less tedious and requires a processing time comparable to that for processing a silver halide hologram. HOE characteristics including diffraction efficiency (DE), linearity, and spectral sensitivity have been quantitatively investigated. The quality of the holographic grating is very high: diffraction efficiency of ninety percent or higher has been achieved in simple plane gratings made by this process.

  17. Slags in a Large Variation Range of Oxygen Potential Based on the Ion and Molecule Coexistence Theory

    NASA Astrophysics Data System (ADS)

    Yang, Xue-Min; Li, Jin-Yan; Zhang, Meng; Chai, Guo-Min; Zhang, Jian

    2014-12-01

    A thermodynamic model for predicting the sulfide capacity of CaO-FeO-Fe2O3-Al2O3-P2O5 slags over a large variation range of oxygen potential, corresponding to a mass percentage of FetO from 1.88 to 55.50 pct (i.e., the IMCT-CS model), has been developed by coupling the deduced desulfurization mechanism of the slags with the ion and molecule coexistence theory (IMCT). The developed IMCT-CS model has been verified by comparing the sulfide capacity measured by Ban-ya et al.[20] with that calculated by the developed IMCT-CS model and by reported sulfide capacity models such as the KTH model. A mass percentage of FetO of 6.75 pct, corresponding to a mass action concentration of FetO of 0.0637 or an oxygen partial pressure of 2.27 × 10^-6 Pa, is the criterion for distinguishing the reducing and oxidizing zones of the slags. The sulfide capacity of the slags in the reducing zone is controlled by the reaction ability of CaO regardless of slag oxidizing ability. However, the sulfide capacity of the slags in the oxidizing zone shows an obvious increasing tendency with increasing slag oxidizing ability. The sulfide capacity of the slags in the reducing zone remains almost constant with variation of the simplified complex basicity (pct CaO)/((pct Al2O3) + (pct P2O5)), the optical basicity, or the mass action concentration ratio NFeO/NCaO and the analogous ratios of the other iron oxides to CaO. The sulfide capacity of the slags in the oxidizing zone shows an obvious increase with increasing simplified complex basicity (pct CaO)/((pct Al2O3) + (pct P2O5)), optical basicity, or the aforementioned mass action concentration ratios. Thus, the aforementioned mass action concentration ratios, and the corresponding mass percentage ratios of the various iron oxides to the basic oxide CaO, are recommended to represent the comprehensive effect of the various iron oxides and the basic oxide CaO on the sulfide capacity of the slags.
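The simplified complex basicity used above is a direct ratio of mass percentages. A minimal sketch, where the sample composition is hypothetical and not taken from the paper:

```python
def simplified_complex_basicity(pct_cao, pct_al2o3, pct_p2o5):
    """(pct CaO) / ((pct Al2O3) + (pct P2O5)), the simplified complex
    basicity quoted in the abstract."""
    return pct_cao / (pct_al2o3 + pct_p2o5)

# Hypothetical slag composition in mass percent (not from the paper):
b = simplified_complex_basicity(pct_cao=45.0, pct_al2o3=20.0, pct_p2o5=2.5)
```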

  18. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  19. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  20. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  1. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  3. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  5. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  6. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  7. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  8. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  9. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  10. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  11. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  12. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  13. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  14. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  15. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  16. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  17. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  19. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  20. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  1. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  3. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  5. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  6. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  7. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  8. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  9. Road to Grid Parity through Deployment of Low-Cost 21.5% N-Type Si Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velundur, Vijay

    This project seeks to develop and deploy differentiated 21.5% efficient n-type Si solar cells while reaching the SunShot module cost goal of ≤ $0.50/W. This objective hinges on the development of enabling low-cost technologies that simplify the manufacturing process and reduce overall processing costs. These comprise (1) boron emitter formation and passivation; (2) a simplified process for the emitter and BSF layers; and (3) advanced metallization for the front and back contacts.

  10. Future launchers strategy : the ariane 2010 initiative

    NASA Astrophysics Data System (ADS)

    Bonnal, Ch.; Eymard, M.; Soccodato, C.

    2001-03-01

    With the new cryogenic upper stage ESC, the European heavy launcher Ariane 5+ is perfectly suited to the space market envisioned for the coming decade: flexible enough to cope with any payload and commercially attractive despite fierce competition. Current Arianespace projections for the years 2010-2020 indicate two major trends: satellites may still become larger and may require very different final orbits, and today's market, largely dominated by GEO, may well evolve, influenced by LEO operations such as those linked to the ISS or by constellations; to remain competitive, the launch cost has to be reduced. The future generation of the European heavy launcher therefore has to focus on ever-increased flexibility with a drastic cost reduction. Two strategies are possible to achieve this double goal. Reusable launchers, either partial or total, may ease access to space by limiting costly expendable stages; the assessment of their technical feasibility and financial viability is ongoing in Europe under the Future Launchers Technology Program (FLTP). The alternative is expendable launchers derived from the future Ariane 5+. This second approach, started by CNES at the end of 1999, is called the "Ariane 2010 initiative". The main objectives are simultaneously an increase of 25% in performance and a reduction of 30% in launch cost with respect to Ariane 5+.
    To achieve these very ambitious goals, numerous major modifications are studied: technical improvements, such as modifications of the Solid Rocket Boosters (filament-wound casing, increased loading, simplified casting, improved grain, simplified Thrust Vector Control, …), an evolution of the Vulcain engine leading to higher efficiency despite a simplified design, a flow-separation-controlled nozzle extension, propellant management of the two cryogenic stages, a simplified electrical system, and increased standardization, for instance of flanged interfaces and manufacturing processes; operational improvements such as launch-cycle simplification and standardization of the coupled analyses; and organizational improvements such as a redistribution of responsibilities for the developments. All these modifications will of course not be implemented together; the aim is to have a coherent catalogue of improvements in order to enable future choices depending on effective requirements. These basic elements will also be considered for the development of other launchers in the small or medium size range.

  11. Soft-output decoding algorithms in iterative decoding of turbo codes

    NASA Technical Reports Server (NTRS)

    Benedetto, S.; Montorsi, G.; Divsalar, D.; Pollara, F.

    1996-01-01

    In this article, we present two versions of a simplified maximum a posteriori decoding algorithm. The algorithms work in a sliding window form, like the Viterbi algorithm, and can thus be used to decode continuously transmitted sequences obtained by parallel concatenated codes, without requiring code trellis termination. A heuristic explanation is also given of how to embed the maximum a posteriori algorithms into the iterative decoding of parallel concatenated codes (turbo codes). The performances of the two algorithms are compared on the basis of a powerful rate 1/3 parallel concatenated code. Basic circuits to implement the simplified a posteriori decoding algorithm using lookup tables, and two further approximations (linear and threshold), with a very small penalty, to eliminate the need for lookup tables are proposed.
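The lookup-table and approximation ideas mentioned above revolve around the Jacobian logarithm used in MAP decoding, max*(a, b) = max(a, b) + log(1 + e^-|a-b|). The sketch below shows the exact correction term alongside linear and threshold approximations; the slope and threshold constants are illustrative choices, not the article's exact values:

```python
import math

def max_star_exact(a, b):
    # Jacobian logarithm: log(exp(a) + exp(b))
    #   = max(a, b) + log(1 + exp(-|a - b|))
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_star_linear(a, b):
    # Linear approximation of the correction term, clamped at zero;
    # the slope 0.25 is an illustrative choice.
    return max(a, b) + max(0.0, math.log(2.0) - 0.25 * abs(a - b))

def max_star_threshold(a, b, t=1.0):
    # Threshold approximation: a fixed correction log(2) when the two
    # inputs are close, none otherwise; t is an illustrative choice.
    return max(a, b) + (math.log(2.0) if abs(a - b) < t else 0.0)
```

In hardware, the correction term is typically read from a small lookup table indexed by |a - b|; the linear and threshold forms eliminate the table at the cost of a small decoding penalty, which is the trade-off the article quantifies.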

  12. Probabilistic Structures Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The basic formulation for probabilistic finite element analysis is described and demonstrated on a few sample problems. This formulation is based on iterative perturbation that uses the factorized stiffness of the unperturbed system as the iteration preconditioner for obtaining the solution to the perturbed problem. This approach eliminates the need to compute, store and manipulate explicit partial derivatives of the element matrices and force vector, which not only reduces memory usage considerably, but also greatly simplifies the coding and validation tasks. All aspects of the proposed formulation were combined in a demonstration problem using a simplified model of a curved turbine blade discretized with 48 shell elements, and having random pressure and temperature fields with partial correlation, random uniform thickness, and random stiffness at the root.
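The iterative perturbation scheme described above can be sketched on a 2x2 system: only the unperturbed stiffness K0 is "factorized" (here simply solved directly), and the perturbed solution is recovered by repeated back-substitution against it. The matrices, perturbation, and load below are hypothetical:

```python
def solve2(K, b):
    # Direct 2x2 solve; stands in for back-substitution with the
    # factorized unperturbed stiffness.
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(K[1][1] * b[0] - K[0][1] * b[1]) / det,
            (K[0][0] * b[1] - K[1][0] * b[0]) / det]

def iterative_perturbation(K0, dK, f, tol=1e-12, max_iter=200):
    """Solve (K0 + dK) x = f via x_{k+1} = K0^{-1} (f - dK x_k).

    Only K0 is ever 'factorized'; dK is applied as a matrix-vector
    product, so no explicit derivatives of the perturbed stiffness are
    needed. Converges when the perturbation is small relative to K0."""
    x = solve2(K0, f)
    for _ in range(max_iter):
        r = [f[i] - sum(dK[i][j] * x[j] for j in range(2)) for i in range(2)]
        x_new = solve2(K0, r)
        if max(abs(x_new[i] - x[i]) for i in range(2)) < tol:
            return x_new
        x = x_new
    return x

K0 = [[4.0, 1.0], [1.0, 3.0]]     # hypothetical unperturbed stiffness
dK = [[0.2, 0.0], [0.0, -0.1]]    # hypothetical small random perturbation
f = [1.0, 2.0]                    # hypothetical load vector
x = iterative_perturbation(K0, dK, f)
```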

  13. How Do Work Stress and Coping Work? Toward a Fundamental Theoretical Reappraisal

    ERIC Educational Resources Information Center

    Briner, Rob B.; Harris, Claire; Daniels, Kevin

    2004-01-01

    The main aim of this paper is to make the case for why a fundamental reappraisal rather than incremental development of work stress and coping theory is required. In order to do this we present, in simplified form, some of the basic tenets of theory in this field. These tenets are questioned and their limitations identified in two ways. The first…

  14. What's so Simple about Simplified Texts? A Computational and Psycholinguistic Investigation of Text Comprehension and Text Processing

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Yang, Hae Sung; McNamara, Danielle S.

    2014-01-01

    This study uses a moving windows self-paced reading task to assess both text comprehension and processing time of authentic texts and these same texts simplified to beginning and intermediate levels. Forty-eight second language learners each read 9 texts (3 different authentic, beginning, and intermediate level texts). Repeated measures ANOVAs…

  15. Implementing Project SIED: Special Education Teachers' Perceptions of a Simplified Technology Decision-Making Process for App Identification and Evaluation

    ERIC Educational Resources Information Center

    Schmidt, Matthew M.; Lin, Meng-Fen Grace; Paek, Seungoh; MacSuga-Gage, Ashley; Gage, Nicholas A.

    2017-01-01

    The worldwide explosion in popularity of mobile devices has created a dramatic increase in mobile software (apps) that are quick and easy to find and install, cheap, disposable, and usually single purpose. Hence, teachers need an equally streamlined and simplified decision-making process to help them identify educational apps--an approach that…

  16. Cycling of DMSP and DMS in Surface Ocean Waters: The Impact of Microzooplankton Grazing and Metabolism

    NASA Technical Reports Server (NTRS)

    Sherr, Evelyn; Sherr, Barry; Wolfe, Gordon; Kiene, Ronald

    1997-01-01

    We have explored and identified several novel aspects of dimethylsulfoniopropionate (DMSP) metabolism and dimethylsulfide (DMS) production by microbial food web processes. Processes studied include microzooplankton herbivory, uptake and retention of dissolved DMSP by marine bacteria coupled with microzooplankton bacterivory, and generation of DMS as a byproduct of chemical grazing deterrence by Emiliania huxleyi. Our results illustrate the complexities of DMSP cycling and DMS production, and support the idea that the flux of DMS to the atmosphere is the result of many coupled trophic interactions which are not currently predictable by simple models or observations tied to a few bulk parameters. Although it is highly desirable to measure trophic interactions by remote sensing techniques, satellite methods cannot currently yield information on bacterial or microzooplankton abundances, activities, and processes. We have identified specific processes which must be included in future efforts, but we do not know yet how widespread or important these will be in many natural waters. We believe further work will enable us to simplify our model of DMS production by eliminating second-order processes, and help refine our insight into the primary biological and chemical sources of atmospheric DMS. This is fundamental work which should be supported as basic research.

  17. Scale Interactions in the Tropics from a Simple Multi-Cloud Model

    NASA Astrophysics Data System (ADS)

    Niu, X.; Biello, J. A.

    2017-12-01

    Our lack of a complete understanding of the interaction between moisture convection and equatorial waves remains an impediment to the numerical simulation of large-scale organization, such as the Madden-Julian Oscillation (MJO). The aim of this project is to understand interactions across spatial scales in the tropics from a simplified framework for scale interactions, while using a simplified description of the basic features of moist convection. Using multiple asymptotic scales, Biello and Majda[1] derived a multi-scale model of moist tropical dynamics (IMMD), which separates three regimes: the planetary-scale climatology, the synoptic-scale waves, and the planetary-scale anomalies. The scales and strength of the observed MJO would place it in the regime of planetary-scale anomalies, which are themselves forced by nonlinear upscale fluxes from the synoptic-scale waves. In order to close this model and determine whether it provides a self-consistent theory of the MJO, a model for diabatic heating due to moist convection must be implemented along with the IMMD. The multi-cloud parameterization is a model proposed by Khouider and Majda[2] to describe the three basic cloud types (congestus, deep, and stratiform) that are most responsible for tropical diabatic heating. We implement a simplified version of the multi-cloud model based on results derived from large eddy simulations of convection[3]. We present this simplified multi-cloud model and show results of numerical experiments beginning with a variety of convective forcing states. Preliminary results on upscale fluxes, from synoptic scales to planetary-scale anomalies, will be presented. [1] Biello J A, Majda A J. Intraseasonal multi-scale moist dynamics of the tropical atmosphere[J]. Communications in Mathematical Sciences, 2010, 8(2): 519-540. [2] Khouider B, Majda A J. A simple multicloud parameterization for convectively coupled tropical waves. Part I: Linear analysis[J]. Journal of the Atmospheric Sciences, 2006, 63(4): 1308-1323. [3] Dorrestijn J, Crommelin D T, Biello J A, et al. A data-driven multi-cloud model for stochastic parametrization of deep convection[J]. Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, 2013, 371(1991): 20120374.

  18. Development of augmented reality system for servicing electromechanical equipment

    NASA Astrophysics Data System (ADS)

    Zhukovskiy, Y.; Koteleva, N.

    2018-05-01

    Electromechanical equipment is widely used in industrial enterprises, in public services, in everyday life, etc. Maintenance is an important part of the life cycle of electromechanical equipment: high-quality and timely servicing can extend its working life. The creation of special systems that simplify the servicing of electromechanical equipment is therefore an urgent task. Such systems can shorten the time needed for maintenance of electrical equipment and, consequently, reduce the cost of maintenance in general. This article presents an analysis of information on the operation of maintenance and repair services for electromechanical equipment, identifies the list of services, and estimates the time required to perform basic service operations. The structure of the proposed augmented reality system is presented, along with the ways it interacts with the automated control systems operating at the enterprise.

  19. Rationalisation and Validation of an Acrylamide-Free Procedure in Three-Dimensional Histological Imaging

    PubMed Central

    Lai, Hei Ming; Liu, Alan King Lun; Ng, Wai-Lung; DeFelice, John; Lee, Wing Sang; Li, Heng; Li, Wen; Ng, Ho Man; Chang, Raymond Chuen-Chung; Lin, Bin; Wu, Wutian; Gentleman, Steve M.

    2016-01-01

    Three-dimensional visualization of intact tissues is now being achieved by turning tissues transparent. CLARITY is a unique tissue clearing technique, which features the use of detergents to remove lipids from fixed tissues to achieve optical transparency. To preserve tissue integrity, an acrylamide-based hydrogel has been proposed to embed the tissue. In this study, we examined the rationale behind the use of acrylamide in CLARITY, and presented evidence to suggest that the omission of acrylamide-hydrogel embedding in CLARITY does not alter the preservation of tissue morphology and molecular information in fixed tissues. We therefore propose a novel and simplified workflow for formaldehyde-fixed tissue clearing, which will facilitate the laboratory implementation of this technique. Furthermore, we have investigated the basic tissue clearing process in detail and have highlighted some areas for targeted improvement of technologies essential for the emerging subject of three-dimensional histology. PMID:27359336

  20. Genome-wide essential gene identification in Streptococcus sanguinis

    PubMed Central

    Xu, Ping; Ge, Xiuchun; Chen, Lei; Wang, Xiaojing; Dou, Yuetan; Xu, Jerry Z.; Patel, Jenishkumar R.; Stone, Victoria; Trinh, My; Evans, Karra; Kitten, Todd; Bonchev, Danail; Buck, Gregory A.

    2011-01-01

    A clear perception of gene essentiality in bacterial pathogens is pivotal for identifying drug targets to combat emergence of new pathogens and antibiotic-resistant bacteria, for synthetic biology, and for understanding the origins of life. We have constructed a comprehensive set of deletion mutants and systematically identified a clearly defined set of essential genes for Streptococcus sanguinis. Our results were confirmed by growing S. sanguinis in minimal medium and by double-knockout of paralogous or isozyme genes. Careful examination revealed that these essential genes were associated with only three basic categories of biological functions: maintenance of the cell envelope, energy production, and processing of genetic information. Our finding was subsequently validated in two other pathogenic streptococcal species, Streptococcus pneumoniae and Streptococcus mutans and in two other gram-positive pathogens, Bacillus subtilis and Staphylococcus aureus. Our analysis has thus led to a simplified model that permits reliable prediction of gene essentiality. PMID:22355642

  1. Study on Collision of Ship Side Structure by Simplified Plastic Analysis Method

    NASA Astrophysics Data System (ADS)

    Sun, C. J.; Zhou, J. H.; Wu, W.

    2017-10-01

    During its lifetime, a ship may encounter collision or grounding and sustain permanent damage from these types of accidents. Crashworthiness analysis has generally relied on two main methods: simplified plastic analysis and numerical simulation. A simplified plastic analysis method is presented in this paper, and numerical simulations using the non-linear finite-element software LS-DYNA are conducted to validate it. The results show that the simplified plastic analysis is in good agreement with the finite element simulation, which indicates that the simplified plastic analysis method can quickly and accurately estimate the crashworthiness of the side structure during a collision and can be used as a reliable risk assessment method.

  2. Successive membrane separation processes simplify concentration of lipases produced by Aspergillus niger by solid-state fermentation.

    PubMed

    Reinehr, Christian Oliveira; Treichel, Helen; Tres, Marcus Vinicius; Steffens, Juliana; Brião, Vandré Barbosa; Colla, Luciane Maria

    2017-06-01

    In this study, we developed a simplified method for producing, separating, and concentrating lipases derived from the solid-state fermentation of agro-industrial residues by filamentous fungi. First, we used Aspergillus niger to produce lipases with hydrolytic activity. We then analyzed the separation and concentration of the enzymes using membrane separation processes. The sequential use of microfiltration and ultrafiltration made it possible to obtain concentrates with enzymatic activities much higher than those in the initial extract. The permeate flux was higher than 60 L/m²·h during microfiltration using 20- and 0.45-µm membranes and during ultrafiltration using 100- and 50-kDa membranes; fouling was reversible during the filtration steps, indicating that it may be removed by cleaning processes. These results demonstrate the feasibility of lipase production using A. niger by solid-state fermentation of agro-industrial residues, followed by successive tangential filtration with membranes, which simplifies the separation and concentration steps typically required in downstream processing.
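The flux figure quoted above is simply permeate volume per membrane area per unit time; a minimal sketch of that arithmetic (the volume, area, and duration below are invented illustrative values, not data from the study):

```python
# Permeate flux J = V / (A * t), reported in L/(m^2.h).
def permeate_flux(volume_l: float, area_m2: float, hours: float) -> float:
    """Permeate flux in L/(m^2.h)."""
    return volume_l / (area_m2 * hours)

# e.g. 18 L of permeate collected over 2 h through a 0.14 m^2 module:
flux = permeate_flux(18.0, 0.14, 2.0)
print(round(flux, 1))  # 64.3, above the 60 L/m^2.h level quoted above
```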

  3. Electronic test instrumentation and techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Test equipment and techniques used in space research and development programs are discussed. Modifications and adaptations to enlarge the scope of usefulness or divert the basic uses to alternate applications are analyzed. The items of equipment which have been of benefit to professional personnel in the enlargement and improvement of quality control capabilities are identified. Items which have been simplified or made more accurate in conducting measurements are described.

  4. A resume or curriculum vitae for success.

    PubMed

    Markey, B T; Campbell, R L

    1996-01-01

    Nurses who are searching for new positions can enhance their job employment potential with well-written resumes. Scholarship and award recognition also can be improved by creating well-written resumes and/or curricula vitae. Appropriate cover letters effectively introduce nurses to employers or review committees. This article presents a few basic suggestions that can simplify the creation of any of these documents and help nurses produce a quality product.

  5. Phonon-defect scattering and thermal transport in semiconductors: developing guiding principles

    NASA Astrophysics Data System (ADS)

    Polanco, Carlos; Lindsay, Lucas

    First principles calculations of thermal conductivity have shown remarkable agreement with measurements for high-quality crystals. Nevertheless, most materials contain defects that provide significant extrinsic resistance and lower the conductivity from that of a perfect sample. This effect is usually accounted for with simplified analytical models that neglect the atomistic details of the defect and the exact dynamical properties of the system, which limits predictive capability. Recently, a method based on Green's functions was developed to calculate phonon-defect scattering rates from first principles. This method has shown the important role of point defects in determining thermal transport in diamond and boron arsenide, two competitors for the highest bulk thermal conductivity. Here, we study the role of point defects in other relatively high thermal conductivity semiconductors, e.g., BN, BeSe, SiC, GaN and Si. We compare their first principles defect-phonon scattering rates and effects on transport properties with those from simplified models and explore the common principles that determine them. Efforts will focus on basic vibrational properties that vary from system to system, such as the density of states, interatomic force constants and defect deformation. Research supported by the U.S. Department of Energy, Basic Energy Sciences, Materials Sciences and Engineering Division.

  6. Methods for the development of a bioregenerative life support system

    NASA Technical Reports Server (NTRS)

    Goldman, Michelle; Gomez, Shawn; Voorhees, Mike

    1990-01-01

    Presented here is a rudimentary approach to designing a life support system based on the utilization of plants and animals. The biggest stumbling block in the initial phases of developing a bioregenerative life support system is encountered in collecting and consolidating the data. If a database existed for the systems engineer so that he or she may have accurate data and a better understanding of biological systems in engineering terms, then the design process would be simplified. Also addressed is a means of evaluating the subsystems chosen. These subsystems are unified into a common metric, kilograms of mass, and normalized in relation to the throughput of a few basic elements. The initial integration of these subsystems is based on input/output masses and eventually balanced to a point of operation within the inherent performance ranges of the organisms chosen. At this point, it becomes necessary to go beyond the simplifying assumptions of simple mass relationships and further define for each organism the processes used to manipulate the throughput matter. Mainly considered here is the fact that these organisms perform input/output functions on differing timescales, thus establishing the need for buffer volumes or appropriate subsystem phasing. At each point in a systematic design it is necessary to disturb the system and discern its sensitivity to the disturbance. This can be done either through the introduction of a catastrophic failure or by applying a small perturbation to the system. One example is increasing the crew size. Here the wide range of performance characteristics once again shows that biological systems have an inherent advantage in responding to systemic perturbations. Since the design of any space-based system depends on mass, power, and volume requirements, each subsystem must be evaluated in these terms.
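The input/output mass-balance idea described above can be sketched as a toy calculation; the subsystems, substances, and kg/day figures below are invented placeholders, not data from any actual life support study:

```python
# Toy sketch of the "common metric" idea: each subsystem is reduced to
# input/output masses (kg per day) of a few basic substances, and the
# coupled system is checked for balance.
crew = {"in": {"O2": 0.84, "food": 0.62}, "out": {"CO2": 1.00}}
plants = {"in": {"CO2": 1.00}, "out": {"O2": 0.84, "food": 0.62}}

def net_flow(*subsystems):
    """Net daily production (+) or consumption (-) of each substance."""
    totals = {}
    for s in subsystems:
        for substance, kg in s["out"].items():
            totals[substance] = totals.get(substance, 0.0) + kg
        for substance, kg in s["in"].items():
            totals[substance] = totals.get(substance, 0.0) - kg
    return totals

# A closed loop balances to ~0 for every substance at this level of detail:
print(net_flow(crew, plants))
```

An unbalanced result immediately shows which substance needs a buffer volume or an additional subsystem, which is the kind of perturbation check the abstract describes.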

  7. Development of the ICD-10 simplified version and field test.

    PubMed

    Paoin, Wansa; Yuenyongsuwan, Maliwan; Yokobori, Yukiko; Endo, Hiroyoshi; Kim, Sukil

    2018-05-01

    The International Statistical Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) has been used in various Asia-Pacific countries for more than 20 years. Although ICD-10 is a powerful tool, clinical coding processes are complex; as a result, many developing countries have not been able to implement ICD-10-based health statistics (WHO-FIC APN, 2007). This study aimed to simplify ICD-10 clinical coding processes, to modify index terms to facilitate computer searching, and to provide a simplified version of ICD-10 for use in developing countries. The World Health Organization Family of International Classifications Asia-Pacific Network (APN) developed a simplified version of ICD-10 and conducted field testing in Cambodia during February and March 2016. Ten hospitals were selected to participate. Each hospital sent a team to a training workshop before using the ICD-10 simplified version to code 100 cases. All hospitals subsequently sent their coded records to the researchers. Overall, 1038 records were coded, with a total of 1099 ICD clinical codes assigned. The average accuracy rate was 80.71% (range 66.67-93.41%). Three types of clinical coding errors were found: errors attributable to the coder (14.56%), errors resulting from physician documentation (1.27%) and system errors (3.46%). The field trial results demonstrated that the APN ICD-10 simplified version is feasible to implement as an effective tool for ICD-10 clinical coding in hospitals. Developing countries may consider adopting the APN ICD-10 simplified version for ICD-10 code assignment in hospitals and health care centres. The simplified version can be viewed as an introductory tool that leads to the implementation of the full ICD-10 and may support subsequent ICD-11 adoption.

  8. Automatic Construction of 3D Basic-Semantic Models of Inhabited Interiors Using Laser Scanners and RFID Sensors

    PubMed Central

    Valero, Enrique; Adan, Antonio; Cerrada, Carlos

    2012-01-01

    This paper is focused on the automatic construction of 3D basic-semantic models of inhabited interiors using laser scanners with the help of RFID technologies. This is an innovative approach in a field where few publications exist. The general strategy consists of carrying out a selective and sequential segmentation of the point cloud by means of different algorithms, which depend on the information that the RFID tags provide. The identification of basic elements of the scene, such as walls, floor, ceiling, windows, doors, tables, chairs and cabinets, and the positioning of their corresponding models can then be calculated. The fusion of both technologies thus allows a simplified 3D semantic indoor model to be obtained. This method has been tested in real scenes under difficult clutter and occlusion conditions, and has yielded promising results. PMID:22778609

  9. Simplified estimation of age-specific reference intervals for skewed data.

    PubMed

    Wright, E M; Royston, P

    1997-12-30

    Age-specific reference intervals are commonly used in medical screening and clinical practice, where interest lies in the detection of extreme values. Many different statistical approaches have been published on this topic. The advantages of a parametric method are that it necessarily produces smooth centile curves, the entire density is estimated, and an explicit formula is available for the centiles. The method proposed here is a simplified version of a recent approach proposed by Royston and Wright. Basic transformations of the data and multiple regression techniques are combined to model the mean, standard deviation and skewness. Using these simple tools, which are implemented in almost all statistical computer packages, age-specific reference intervals may be obtained. The scope of the method is illustrated by fitting models to several real data sets and assessing each model using goodness-of-fit techniques.
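As a rough illustration of this family of methods (not the authors' exact model), one can regress the mean and the standard deviation of a suitably transformed measurement on age and read centiles off the fitted curves. The simulated data and the linear-in-age forms below are assumptions made for the sketch:

```python
import numpy as np

# Simulate already-transformed measurements whose mean and SD both vary
# linearly with age (illustrative model only).
rng = np.random.default_rng(0)
age = rng.uniform(20, 40, 500)
y = 0.25 + 0.012 * age + rng.normal(0.0, 1.0, 500) * (0.02 + 0.002 * age)

X = np.column_stack([np.ones_like(age), age])   # design matrix: 1, age

# Mean curve by ordinary least squares.
beta_mu, *_ = np.linalg.lstsq(X, y, rcond=None)
mu = X @ beta_mu

# SD curve from scaled absolute residuals: E|r| = sigma * sqrt(2/pi)
# for normal residuals, so |r| * sqrt(pi/2) estimates sigma.
resid = y - mu
beta_sd, *_ = np.linalg.lstsq(X, np.abs(resid) * np.sqrt(np.pi / 2), rcond=None)
sd = X @ beta_sd

z = 1.959964                                     # 2.5th and 97.5th centiles
lower, upper = mu - z * sd, mu + z * sd
coverage = float(np.mean((y >= lower) & (y <= upper)))
print(round(coverage, 2))                        # should sit near 0.95
```

Because the centile curves are explicit linear functions of age, a reference interval for any age is available in closed form, which is the key practical advantage the abstract notes for parametric methods.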

  10. Building Efficiency Evaluation and Uncertainty Analysis with DOE's Asset Score Preview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-08-12

    Building Energy Asset Score Tool, developed by the U.S. Department of Energy (DOE), is a program to encourage energy efficiency improvement by helping building owners and managers assess a building's energy-related systems independent of operations and maintenance. Asset Score Tool uses a simplified EnergyPlus model to provide an assessment of building systems from minimal user inputs of basic building characteristics. Asset Score Preview is a newly developed option that allows users to assess their building's systems, and the potential value of a more in-depth analysis, via an even more simplified approach. This methodology provides a preliminary approach to estimating a building's energy efficiency and potential for improvement. This paper provides an overview of the methodology used for the development of Asset Score Preview and the scoring methodology.

  11. Plant Development, Auxin, and the Subsystem Incompleteness Theorem

    PubMed Central

    Niklas, Karl J.; Kutschera, Ulrich

    2012-01-01

    Plant morphogenesis (the process whereby form develops) requires signal cross-talking among all levels of organization to coordinate the operation of metabolic and genomic subsystems operating in a larger network of subsystems. Each subsystem can be rendered as a logic circuit supervising the operation of one or more signal-activated systems. This approach simplifies complex morphogenetic phenomena and allows for their aggregation into diagrams of progressively larger networks. This technique is illustrated here by rendering two logic circuits and signal-activated subsystems, one for auxin (IAA) polar/lateral intercellular transport and another for IAA-mediated cell wall loosening. For each of these phenomena, a circuit/subsystem diagram highlights missing components (either in the logic circuit or in the subsystem it supervises) that must be identified experimentally if each of these basic plant phenomena is to be fully understood. We also illustrate the “subsystem incompleteness theorem,” which states that no subsystem is operationally self-sufficient. Indeed, a whole-organism perspective is required to understand even the simplest morphogenetic process, because, when isolated, every biological signal-activated subsystem is morphogenetically ineffective. PMID:22645582

  12. Standard Versus Simplified Consent Materials for Biobank Participation: Differences in Patient Knowledge and Trial Accrual.

    PubMed

    Garrett, Sarah B; Murphy, Marie; Wiley, James; Dohan, Daniel

    2017-12-01

    Replacing standard consent materials with simplified materials is a promising intervention to improve patient comprehension, but there is little evidence on its real-world implementation. We employed a sequential two-arm design to compare the effect of standard versus simplified consent materials on potential donors' understanding of biobank processes and their accrual to an active biobanking program. Participants were female patients of a California breast health clinic. Subjects from the simplified arm answered more items correctly (p = .064), reported "don't know" for fewer items (p = .077), and consented to donate to the biobank at higher rates (p = .025) than those from the standard arm. Replacing an extant consent form with a simplified version is feasible and may benefit patient comprehension and study accrual.

  13. Spin Choreography: Basic Steps in High Resolution NMR (by Ray Freeman)

    NASA Astrophysics Data System (ADS)

    Minch, Michael J.

    1998-02-01

    There are three orientations that NMR courses may take. The traditional molecular structure course focuses on the interpretation of spectra and the use of chemical shifts, coupling constants, and nuclear Overhauser effects (NOE) to sort out subtle details of structure and stereochemistry. Courses can also focus on the fundamental quantum mechanics of observable NMR parameters and processes such as spin-spin splitting and relaxation. More recently there are courses devoted to the manipulation of nuclear spins and the basic steps of one- and two-dimensional NMR experiments. Freeman's book is directed towards the latter audience. Modern NMR methods offer myriad ways to extract information about molecular structure and motion by observing the behavior of nuclear spins under a variety of conditions. In Freeman's words: "We can lead the spins through an intricate dance, carefully programmed in advance, to enhance, simplify, correlate, decouple, edit or assign NMR spectra." This is a carefully written, well-illustrated account of how this dance is choreographed by pulse programming, double resonance, and gradient effects. Although well written, this book is not an easy read; every word counts. It is recommended for graduate courses that emphasize the fundamentals of magnetic resonance. It is not a text on interpretation of spectra.

  14. A Non-Invasive Multichannel Hybrid Fiber-Optic Sensor System for Vital Sign Monitoring

    PubMed Central

    Fajkus, Marcel; Nedoma, Jan; Martinek, Radek; Vasinek, Vladimir; Nazeran, Homer; Siska, Petr

    2017-01-01

    In this article, we briefly describe the design, construction, and functional verification of a hybrid multichannel fiber-optic sensor system for basic vital sign monitoring. This sensor uses a novel non-invasive measurement probe based on the fiber Bragg grating (FBG). The probe is composed of two FBGs encapsulated inside a polydimethylsiloxane polymer (PDMS). The PDMS is non-reactive to human skin and resistant to electromagnetic waves, UV absorption, and radiation. We emphasize the construction of the probe to be specifically used for basic vital sign monitoring such as body temperature, respiratory rate and heart rate. The proposed sensor system can continuously process incoming signals from up to 128 individuals. We first present the overall design of this novel multichannel sensor and then elaborate on how it has the potential to simplify vital sign monitoring and consequently improve the comfort level of patients in long-term health care facilities, hospitals and clinics. The reference ECG signal was acquired with the use of standard gel electrodes fixed to the monitored person’s chest using a real-time monitoring system for ECG signals with virtual instrumentation. The outcomes of these experiments have unambiguously proved the functionality of the sensor system and will be used to inform our future research in this fast developing and emerging field. PMID:28075341

  15. Axisymmetric Plasma Equilibria in General Relativity

    NASA Astrophysics Data System (ADS)

    Elsässer, Klaus

    Axisymmetric plasma equilibria near a rotating black hole are considered within the multifluid description. An isothermal two-component plasma with electrons and positrons or ions is determined by four structure functions and the boundary conditions. These structure functions are the Bernoulli function and the toroidal canonical momentum per mass for each species; they remain arbitrary if no gain and loss processes are considered, in close analogy to the free flux functions in ideal magnetohydrodynamics. Several simplifying assumptions allow the reduction of the basic equations to one single scalar equation for the stream function χ of positrons or ions, respectively, playing the role of the Grad-Shafranov equation in magnetohydrodynamics; in particular, Maxwell's equations can be solved analytically for a quasineutral plasma when both the charge density and the toroidal electric current density are negligible (in contrast to the Tokamak situation). The basic smallness parameter is the ratio of the skin depth of electrons to the scale length of the metric and fluid quantities, and, in the case of an electron-ion plasma, the mass ratio me/mi. The χ-equation can be solved by standard methods, and simple solutions for a Kerr geometry are available; they show characteristic flow patterns, depending on the structure functions and the boundary conditions.

  16. [The subject matters concerned with use of simplified analytical systems from the perspective of the Japanese Association of Medical Technologists].

    PubMed

    Morishita, Y

    2001-05-01

    Issues concerning the use of so-called simplified analytical systems are discussed from the perspective of a laboratory technician, with a view to their effective utilization. 1. Data from simplified analytical systems should agree with those of designated reference methods, so that discrepancies do not arise between data from different laboratories. 2. The accuracy of results measured with simplified analytical systems is difficult to scrutinize thoroughly and correctly with quality control surveillance procedures based on stored pooled serum or partly-processed blood. 3. It is necessary to present guidelines on the content of evaluations that guarantee the quality of simplified analytical systems. 4. Maintenance and manual operation of simplified analytical systems should be standardized between laboratory technicians and vendor technicians. 5. Attention is also drawn to the fact that the cost of simplified analytical systems is much higher than that of routine methods with liquid reagents. 6. It is hoped that various substances in human serum, such as cytokines, hormones, tumor markers, and vitamins, will also become measurable with simplified analytical systems.

  17. Integrating the healthcare enterprise in radiation oncology plug and play--the future of radiation oncology?

    PubMed

    Abdel-Wahab, May; Rengan, Ramesh; Curran, Bruce; Swerdloff, Stuart; Miettinen, Mika; Field, Colin; Ranjitkar, Sunita; Palta, Jatinder; Tripuraneni, Prabhakar

    2010-02-01

    To describe the processes and benefits of Integrating the Healthcare Enterprise in Radiation Oncology (IHE-RO). The IHE-RO process includes five basic steps. The first step is to identify common interoperability issues encountered in radiation treatment planning and the delivery process. IHE-RO committees partner with vendors to develop solutions (integration profiles) to interoperability problems. The broad application of these integration profiles across a variety of vendor platforms is tested annually at the Connectathon event. Seamless integration and transfer of patient data are then demonstrated by vendors to potential users at the public demonstration event. Institutions can then incorporate completed integration profiles into requests for proposals and vendor contracts when purchasing new equipment. Vendors can publish IHE integration statements to document the integration profiles supported by their products. As a result, users can reference integration profiles in requests for proposals, simplifying the systems acquisition process. These IHE-RO solutions are now available in many commercial radiation oncology treatment planning, delivery, and information systems, and are implemented at cancer care sites around the world. IHE-RO serves an important purpose for the radiation oncology community at large. Copyright 2010 Elsevier Inc. All rights reserved.

  18. High-performance computational fluid dynamics: a custom-code approach

    NASA Astrophysics Data System (ADS)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFD) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order both to validate its accuracy and to investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFD, while also providing insight for those interested in more general aspects of high-performance computing.
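The simplified physics described here, pressure-driven single-phase flow in a channel, can itself be sketched in a few lines. The following is a minimal serial finite-difference analogue of the steady problem (not the TPLS code, and without the MPI layer): the flow reduces to d²u/dy² = (1/μ) dp/dx with no-slip walls, whose solution is the Poiseuille parabola. Grid size, viscosity and pressure gradient are arbitrary choices for the sketch.

```python
import numpy as np

n = 101
y = np.linspace(0.0, 1.0, n)
h = y[1] - y[0]
mu, dpdx = 1.0, -2.0           # viscosity and imposed pressure gradient

# Assemble the 1-D Laplacian with Dirichlet (no-slip) walls.
A = np.zeros((n, n))
b = np.full(n, dpdx / mu * h * h)
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
A[0, 0] = A[-1, -1] = 1.0
b[0] = b[-1] = 0.0

u = np.linalg.solve(A, b)
u_exact = dpdx / (2 * mu) * y * (y - 1.0)   # analytic Poiseuille profile

# The second-order central scheme is exact for a quadratic profile,
# so the two agree to solver roundoff.
print(float(np.max(np.abs(u - u_exact))) < 1e-10)
```

In the full solver the same discretized operator is distributed across MPI ranks; the serial version keeps the numerics visible without the communication machinery.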

  19. QRS detection based ECG quality assessment.

    PubMed

    Hayn, Dieter; Jammerbund, Bernhard; Schreier, Günter

    2012-09-01

    Although immediate feedback concerning ECG signal quality during recording is useful, up to now not much literature describing quality measures is available. We have implemented and evaluated four ECG quality measures. The empty lead criterion (A), spike detection criterion (B) and lead crossing point criterion (C) were calculated from basic signal properties. Measure D quantified the robustness of QRS detection when applied to the signal. An advanced Matlab-based algorithm combining all four measures and a simplified algorithm for Android platforms, excluding measure D, were developed. Both algorithms were evaluated by taking part in the Computing in Cardiology Challenge 2011. Each measure's accuracy and computing time was evaluated separately. During the challenge, the advanced algorithm correctly classified 93.3% of the ECGs in the training set and 91.6% in the test set. Scores for the simplified algorithm were 0.834 in event 2 and 0.873 in event 3. Computing time for measure D was almost five times higher than for the other measures. Required accuracy levels depend on the application and are related to computing time. While our simplified algorithm may be accurate enough for real-time feedback during ECG self-recordings, QRS detection based measures can further increase performance if sufficient computing power is available.
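To give a flavour of measures calculated from basic signal properties, here is a sketch of an empty-lead test and a spike test; the thresholds and function names are assumptions for illustration, not the published algorithm's parameters:

```python
import numpy as np

def empty_lead(sig: np.ndarray, min_range: float = 0.05) -> bool:
    """Flag a lead whose peak-to-peak range is too small to contain QRS complexes."""
    return float(np.ptp(sig)) < min_range

def has_spike(sig: np.ndarray, ratio: float = 10.0) -> bool:
    """Flag a lead where the largest sample-to-sample step dwarfs the median step."""
    steps = np.abs(np.diff(sig))
    med = float(np.median(steps))
    return med > 0 and float(steps.max()) > ratio * med

t = np.linspace(0, 10, 5000)
good = 0.8 * np.sin(2 * np.pi * 1.2 * t)   # crude 72-bpm surrogate signal
flat = np.full_like(t, 0.001)              # disconnected electrode
spiky = good.copy()
spiky[2500] += 25.0                        # single artifact spike

print(empty_lead(flat), empty_lead(good))  # True False
print(has_spike(spiky), has_spike(good))   # True False
```

A QRS-detection-based measure like D would instead perturb the signal and check whether the detected beat positions stay stable, which is why it costs more computing time.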

  20. Early vision and focal attention

    NASA Astrophysics Data System (ADS)

    Julesz, Bela

    1991-07-01

    At the thirty-year anniversary of the introduction of the technique of computer-generated random-dot stereograms and random-dot cinematograms into psychology, the impact of the technique on brain research and on the study of artificial intelligence is reviewed. The main finding, that stereoscopic depth perception (stereopsis), motion perception, and preattentive texture discrimination are basically bottom-up processes which occur without the help of the top-down processes of cognition and semantic memory, greatly simplifies the study of these processes of early vision and permits the linking of human perception with monkey neurophysiology. Particularly interesting are the unexpected findings that stereopsis (assumed to be local) is a global process, while texture discrimination (assumed to be a global process, governed by statistics) is local, based on some conspicuous local features (textons). It is shown that the top-down process of "shape (depth) from shading" does not affect stereopsis, and some of the models of machine vision are evaluated. The asymmetry effect of human texture discrimination is discussed, together with recent nonlinear spatial filter models and a novel extension of the texton theory that can cope with the asymmetry problem. This didactic review attempts to introduce the physicist to the field of psychobiology and its problems, including metascientific problems of brain research, problems of scientific creativity, the state of artificial intelligence research (including connectionist neural networks) aimed at modeling brain activity, and the fundamental role of focal attention in mental events.

  1. Simplified method for creating a density-absorbed dose calibration curve for the low dose range from Gafchromic EBT3 film.

    PubMed

    Gotanda, Tatsuhiro; Katsuda, Toshizo; Gotanda, Rumi; Kuwano, Tadao; Akagawa, Takuya; Tanki, Nobuyoshi; Tabuchi, Akihiko; Shimono, Tetsunori; Kawaji, Yasuyuki

    2016-01-01

    Radiochromic film dosimeters have a disadvantage in comparison with an ionization chamber in that the dosimetry process is time-consuming for creating a density-absorbed dose calibration curve. The purpose of this study was the development of a simplified method of creating a density-absorbed dose calibration curve from radiochromic film within a short time. This simplified method was performed using Gafchromic EBT3 film with a low energy dependence and step-shaped Al filter. The simplified method was compared with the standard method. The density-absorbed dose calibration curves created using the simplified and standard methods exhibited approximately similar straight lines, and the gradients of the density-absorbed dose calibration curves were -32.336 and -33.746, respectively. The simplified method can obtain calibration curves within a much shorter time compared to the standard method. It is considered that the simplified method for EBT3 film offers a more time-efficient means of determining the density-absorbed dose calibration curve within a low absorbed dose range such as the diagnostic range.

  2. Simplified method for creating a density-absorbed dose calibration curve for the low dose range from Gafchromic EBT3 film

    PubMed Central

    Gotanda, Tatsuhiro; Katsuda, Toshizo; Gotanda, Rumi; Kuwano, Tadao; Akagawa, Takuya; Tanki, Nobuyoshi; Tabuchi, Akihiko; Shimono, Tetsunori; Kawaji, Yasuyuki

    2016-01-01

    Radiochromic film dosimeters have a disadvantage in comparison with an ionization chamber in that the dosimetry process is time-consuming for creating a density-absorbed dose calibration curve. The purpose of this study was the development of a simplified method of creating a density-absorbed dose calibration curve from radiochromic film within a short time. This simplified method was performed using Gafchromic EBT3 film with a low energy dependence and step-shaped Al filter. The simplified method was compared with the standard method. The density-absorbed dose calibration curves created using the simplified and standard methods exhibited approximately similar straight lines, and the gradients of the density-absorbed dose calibration curves were −32.336 and −33.746, respectively. The simplified method can obtain calibration curves within a much shorter time compared to the standard method. It is considered that the simplified method for EBT3 film offers a more time-efficient means of determining the density-absorbed dose calibration curve within a low absorbed dose range such as the diagnostic range. PMID:28144120
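Both records describe the calibration curve as an approximately straight line. As a minimal illustration of the underlying step, a least-squares line relating film density to absorbed dose can be fitted as follows; the density and dose values below are invented for the example and are not the study's data:

```python
# Least-squares straight-line fit of dose against film density; the data
# points are made up for illustration, not taken from the study.
def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

dose = [0.0, 0.5, 1.0, 1.5, 2.0]           # known doses (Gy), hypothetical
density = [2.00, 1.95, 1.90, 1.85, 1.80]   # measured densities, hypothetical
slope, intercept = linear_fit(density, dose)  # calibration: dose from density
print(round(slope, 1))  # -10.0
```

A negative slope, as in the study's reported gradients, reflects density decreasing as absorbed dose increases for this film/readout convention.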

  3. Simplified procedures for correlation of experimentally measured and predicted thrust chamber performance

    NASA Technical Reports Server (NTRS)

    Powell, W. B.

    1973-01-01

    Thrust chamber performance is evaluated in terms of an analytical model incorporating all the loss processes that occur in a real rocket motor. The important loss processes in the real thrust chamber were identified, and a methodology and recommended procedure for predicting real thrust chamber vacuum specific impulse were developed. Simplified equations for the calculation of vacuum specific impulse are developed to relate the delivered performance (both vacuum specific impulse and characteristic velocity) to the ideal performance as degraded by the losses corresponding to a specified list of loss processes. These simplified equations enable the various performance loss components, and the corresponding efficiencies, to be quantified separately (except that interaction effects are arbitrarily assigned in the process). The loss and efficiency expressions presented can be used to evaluate experimentally measured thrust chamber performance, to direct development effort into the areas most likely to yield improvements in performance, and as a basis to predict performance of related thrust chamber configurations.
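The report's simplified equations relate delivered to ideal performance through separately quantified loss efficiencies. A minimal sketch of that multiplicative structure follows; the efficiency names and values are illustrative assumptions, not the report's numbers:

```python
# Hedged sketch: delivered vacuum specific impulse as ideal performance
# degraded by separate loss efficiencies. Names/values are illustrative.
def delivered_isp(isp_ideal, efficiencies):
    """Multiply the ideal Isp by each loss efficiency in turn."""
    out = isp_ideal
    for eta in efficiencies.values():
        out *= eta
    return out

losses = {"energy_release": 0.98, "divergence": 0.99, "boundary_layer": 0.985}
print(round(delivered_isp(450.0, losses), 1))
```

Factoring the losses this way is what lets each component be quantified separately and development effort directed at the largest one.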

  4. A multiscale approach to modelling electrochemical processes occurring across the cell membrane with application to transmission of action potentials.

    PubMed

    Richardson, G

    2009-09-01

    By application of matched asymptotic expansions, a simplified partial differential equation (PDE) model for the dynamic electrochemical processes occurring in the vicinity of a membrane, as ions selectively permeate across it, is formally derived from the Poisson-Nernst-Planck equations of electrochemistry. It is demonstrated that this simplified model reduces itself, in the limit of a long thin axon, to the cable equation used by Hodgkin and Huxley to describe the propagation of action potentials in the unmyelinated squid giant axon. The asymptotic reduction from the simplified PDE model to the cable equation leads to insights that are not otherwise apparent; these include an explanation of why the squid giant axon attains a diameter in the region of 1 mm. The simplified PDE model has more general application than the Hodgkin-Huxley cable equation and can, e.g. be used to describe action potential propagation in myelinated axons and neuronal cell bodies.
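For reference, the cable equation that the simplified PDE model reduces to can be written in its standard textbook form (symbols and form from the general literature, not transcribed from the paper):

```latex
\frac{a}{2 R_i}\,\frac{\partial^2 V}{\partial x^2}
  = C_m\,\frac{\partial V}{\partial t} + I_{\mathrm{ion}}(V)
```

where $V(x,t)$ is the transmembrane potential, $a$ the axon radius, $R_i$ the axoplasmic resistivity, $C_m$ the membrane capacitance per unit area, and $I_{\mathrm{ion}}$ the ionic current density. Because conduction velocity in this equation scales roughly as $\sqrt{a}$, larger diameters give faster unmyelinated conduction, which bears on the paper's point about why the squid giant axon attains its large diameter.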

  5. Approximations of Two-Attribute Utility Functions

    DTIC Science & Technology

    1976-09-01

    preferred to") be a binary relation on the set of simple probability measures or ’gambles’ defined on a set T of consequences. Throughout this study it...simplifying independence assumptions. Although there are several approaches to this problem, the present study will focus on approximations of u... study will elicit additional interest in the topic. 2. REMARKS ON APPROXIMATION THEORY This section outlines a few basic ideas of approximation theory

  6. 'Are you siding with a personality or the grant proposal?': observations on how peer review panels function.

    PubMed

    Coveney, John; Herbert, Danielle L; Hill, Kathy; Mow, Karen E; Graves, Nicholas; Barnett, Adrian

    2017-01-01

    In Australia, the peer review process for competitive funding is usually conducted by a peer review group in conjunction with prior assessment from external assessors. This process is quite mysterious to those outside it. The purpose of this research was to throw light on grant review panels (sometimes called the 'black box') through an examination of the impact of panel procedures, panel composition and panel dynamics on the decision-making in the grant review process. A further purpose was to compare experience of a simplified review process with more conventional processes used in assessing grant proposals in Australia. This project was one aspect of a larger study into the costs and benefits of a simplified peer review process. The Queensland University of Technology (QUT)-simplified process was compared with the National Health and Medical Research Council's (NHMRC) more complex process. Grant review panellists involved in both processes were interviewed about their experience of the decision-making process that assesses the excellence of an application. All interviews were recorded and transcribed. Each transcription was de-identified and returned to the respondent for review. Final transcripts were read repeatedly and coded, and similar codes were amalgamated into categories that were used to build themes. Final themes were shared with the research team for feedback. Two major themes arose from the research: (1) assessing grant proposals and (2) factors influencing the fairness, integrity and objectivity of review. Issues such as the quality of writing in a grant proposal, comparison of the two review methods, the purpose and use of the rebuttal, assessing the financial value of funded projects, the importance of the experience of the panel membership and the role of track record and the impact of group dynamics on the review process were all discussed. 
The research also examined the influence of research culture on decision-making in grant review panels. One of the aims of this study was to compare a simplified review process with more conventional processes. Generally, participants were supportive of the simplified process. Transparency in the grant review process will result in better appreciation of the outcome. Despite the provision of clear guidelines for peer review, reviewing processes are likely to be subjective to the extent that different reviewers apply different rules. The peer review process will come under more scrutiny as funding for research becomes even more competitive. There is justification for further research on the process, especially of a kind that taps more deeply into the 'black box' of peer review.

  7. Simplified power processing for ion-thruster subsystems

    NASA Technical Reports Server (NTRS)

    Wessel, F. J.; Hancock, D. J.

    1983-01-01

    Compared to chemical propulsion, ion propulsion offers distinct payload-mass increases for many future low-thrust earth-orbital and deep-space missions. Despite this advantage, the high initial cost and complexity of ion-propulsion subsystems reduce their attractiveness for most present and near-term spacecraft missions. Investigations have therefore been conducted to simplify the power-processing unit (PPU), the single most complex and expensive component in the thruster subsystem. The present investigation is concerned with a program to simplify the design of the PPU employed in an 8-cm mercury-ion-thruster subsystem. In this program, a dramatic simplification in the design of the PPU was achieved while retaining essential thruster control and subsystem operational flexibility.

  8. Unimolecular decomposition reactions at low-pressure: A comparison of competitive methods

    NASA Technical Reports Server (NTRS)

    Adams, G. F.

    1980-01-01

    The lack of a simple rate coefficient expression to describe the pressure and temperature dependence hampers chemical modeling of flame systems. Recently developed simplified models to describe unimolecular processes include the calculation of rate constants for thermal unimolecular reactions and recombinations at the low pressure limit, at the high pressure limit and in the intermediate fall-off region. Comparison between two different applications of Troe's simplified model and a comparison between the simplified model and the classic RRKM theory are described.
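The Lindemann-Hinshelwood expression underlying such simplified fall-off treatments can be sketched as follows; Troe's model multiplies this by a broadening factor F in the intermediate fall-off region, and all parameter values here are illustrative assumptions, not values from the report:

```python
# Hedged sketch: Lindemann-Hinshelwood fall-off between the low-pressure
# (k0*[M]) and high-pressure (kinf) limits; Troe's simplified model multiplies
# this by a broadening factor F. All parameter values are illustrative.
def falloff_k(k0, kinf, M, F=1.0):
    pr = k0 * M / kinf                # reduced pressure
    return kinf * (pr / (1.0 + pr)) * F

k0, kinf = 1.0e-30, 1.0e-11           # made-up low/high-pressure rate constants
low = falloff_k(k0, kinf, M=1.0e15)   # low-pressure limit: k approaches k0*[M]
high = falloff_k(k0, kinf, M=1.0e25)  # high-pressure limit: k approaches kinf
```

The comparisons described in the abstract concern how well such simplified interpolations between the two limits reproduce full RRKM calculations.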

  9. A Camera and Multi-Sensor Automated Station Design for Polar Physical and Biological Systems Monitoring: AMIGOS

    NASA Astrophysics Data System (ADS)

    Bohlander, J. A.; Ross, R.; Scambos, T.; Haran, T. M.; Bauer, R. J.

    2012-12-01

    The Automated Meteorology - Ice/Indigenous species - Geophysics Observation System (AMIGOS) consists of a set of measurement instruments and camera(s) controlled by a single-board computer with a simplified Linux operating system and an Iridium satellite modem supporting two-way communication. Primary features of the system relevant to polar operations are low power requirements, daily data uploading, reprogramming, tolerance for low temperatures, and various approaches for automatic resets and recovery from low power or cold shut-down. Instruments include a compact weather station, C/A or dual-frequency GPS, solar flux and reflectivity sensors, sonic snow gages, simplified radio-echo-sounder, and resistance thermometer string in the firn column. In the current state of development, there are two basic designs. One is intended for in situ observations of glacier conditions. The other design supports a high-resolution camera for monitoring biological or geophysical systems from short distances (100 m to 20 km). The stations have been successfully used in several locations for operational support, monitoring rapid ice changes in response to climate change or iceberg drift, and monitoring penguin colony activity. As of June, 2012, there are 9 AMIGOS systems installed, all on the Antarctic continent. The stations are a working prototype for a planned series of upgraded stations, currently termed 'Sentinels'. These stations would carry further instrumentation, communications, and processing capability to investigate ice - ocean interaction from ice tongue, ice shelf, or fjord coastline areas.

  10. Role of Suzanne Mubarak Science Exploration Center in Motivating Physics Learning (abstract)

    NASA Astrophysics Data System (ADS)

    Mohsen, Mona

    2009-04-01

    The role of science exploration centers in promoting learning ``beyond school walls'' is demonstrated. The Suzanne Mubarak Science Exploration Center (www.smsec.com) at Hadaek El Kobba, Cairo, was inaugurated in 1998 with the assistance of Suzanne Mubarak, the first lady of Egypt, and the minister of education. It was the first interactive science and technology center in Egypt. After 10 years, the number of centers has increased to 33 nationwide. Since its inauguration the center has received over 3 million visitors. Through different facilities, such as the internet, science cities, multimedia, and virtual reality programs, basic principles of science are simplified and their technological applications in our daily lives are explored. These facilities are fully equipped with new media such as video conferencing, videotapes, overhead projectors, data shows, and libraries, as well as demonstration tools for basic science. The main objectives of the science exploration centers are discussed, such as: (1) curricula development for on-line learning; (2) integration of e-learning programs into basic science (physics, mathematics, chemistry, and biology); and (3) workshops and organizations for students, teachers, and communities dealing with basic science programs.

  11. A simplified scheme for computing radiation transfer in the troposphere

    NASA Technical Reports Server (NTRS)

    Katayama, A.

    1973-01-01

    A scheme is presented for computing the heating of clear and cloudy air by solar and infrared radiation transfer, designed for use in tropospheric general circulation models with coarse vertical resolution. A bulk transmission function is defined for the infrared transfer. The interpolation factors required for computing the bulk transmission function are parameterized as functions of such physical parameters as the thickness of the layer, the pressure, and the mixing ratio at a reference level. The computation procedure for solar radiation is significantly simplified by the introduction of two basic concepts. The first is that the solar radiation spectrum can be divided into a scattered part, for which Rayleigh scattering is significant but absorption by water vapor is negligible, and an absorbed part, for which absorption by water vapor is significant but Rayleigh scattering is negligible. The second concept is that of an equivalent cloud water vapor amount, which absorbs the same amount of radiation as the cloud.

  12. Measuring Phantom Recollection in the Simplified Conjoint Recognition Paradigm

    ERIC Educational Resources Information Center

    Stahl, Christoph; Klauer, Karl Christoph

    2009-01-01

    False memories are sometimes strong enough to elicit recollective experiences. This phenomenon has been termed Phantom Recollection (PR). The Conjoint Recognition (CR) paradigm has been used to empirically separate PR from other memory processes. Recently, a simplification of the CR procedure has been proposed. We herein extend the simplified CR…

  13. Simplified aerosol modeling for variational data assimilation

    NASA Astrophysics Data System (ADS)

    Huneeus, N.; Boucher, O.; Chevallier, F.

    2009-11-01

    We have developed a simplified aerosol model together with its tangent linear and adjoint versions, with the ultimate aim of optimizing global aerosol and aerosol precursor emissions using variational data assimilation. The model was derived from the general circulation model LMDz; it groups together the 24 aerosol species simulated in LMDz into 4 species, namely gaseous precursors, fine mode aerosols, coarse mode desert dust and coarse mode sea salt. The emissions have been kept as in the original model. Modifications, however, were introduced in the computation of aerosol optical depth and in the processes of sedimentation, dry and wet deposition and sulphur chemistry to ensure consistency with the new set of species and their composition. The simplified model successfully manages to reproduce the main features of the aerosol distribution in LMDz. The largest differences in aerosol load are observed for fine mode aerosols and gaseous precursors. Differences between the original and simplified models are mainly associated with the new deposition and sedimentation velocities consistent with the definition of species in the simplified model and with the simplification of the sulphur chemistry. Furthermore, simulated aerosol optical depth remains within the variability of monthly AERONET observations for all aerosol types and all sites throughout most of the year. The largest differences are observed over sites with strong desert dust influence. In terms of the daily aerosol variability, the model is less able to reproduce the observed variability from the AERONET data, with larger discrepancies at stations affected by industrial aerosols. The simplified model, however, closely follows the daily simulation from LMDz. Sensitivity analyses with the tangent linear version show that the simplified sulphur chemistry is the dominant process responsible for the strong non-linearity of the model.

  14. Continuous/Batch Mg/MgH2/H2O-Based Hydrogen Generator

    NASA Technical Reports Server (NTRS)

    Kindler, Andrew; Huang, Yuhong

    2010-01-01

    A proposed apparatus for generating hydrogen by means of chemical reactions of magnesium and magnesium hydride with steam would exploit the same basic principles as those discussed in the immediately preceding article, but would be designed to implement a hybrid continuous/batch mode of operation. The design concept would simplify the problem of optimizing thermal management and would help to minimize the size and weight necessary for generating a given amount of hydrogen.
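As a back-of-envelope illustration of the chemistry involved, the two steam reactions are Mg + 2H2O -> Mg(OH)2 + H2 and MgH2 + 2H2O -> Mg(OH)2 + 2H2; the helper below and the quantities are hypothetical, not from the proposal:

```python
# Back-of-envelope sketch (not from the article): H2 yield of the two steam
# reactions Mg + 2H2O -> Mg(OH)2 + H2 and MgH2 + 2H2O -> Mg(OH)2 + 2H2.
M_MG, M_MGH2 = 24.305, 26.321      # molar masses in g/mol

def h2_moles(grams_mg, grams_mgh2):
    """Moles of H2: one per mole of Mg, two per mole of MgH2."""
    return grams_mg / M_MG + 2.0 * grams_mgh2 / M_MGH2

print(round(h2_moles(100.0, 100.0), 2))  # 11.71
```

The factor-of-two yield from MgH2 is one reason a hybrid Mg/MgH2 feed is attractive despite the thermal-management complications the concept addresses.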

  15. Journal of Special Operations Medicine. Volume 10, Edition 1, Winter 2010

    DTIC Science & Technology

    2010-01-01

    definitive veterinary care, therefore enabling a more expeditious return to duty. (Editor’s Note: Veterinary personnel have been located in 15 to 17...Abdominal distension may indicate the presence of a hemo-abdomen or uro-abdomen or gastric dilatation and volvulus. Abdominal pain, while a non...the U.S. Armed Forces, which simplifies the basics of finding a sponsoring agency and determining that agency’s ability to provide logistic support

  16. System Aware Cybersecurity: A Multi-Sentinel Scheme to Protect a Weapons Research Lab

    DTIC Science & Technology

    2015-12-07

    In the simplified deployment scenario, some sensors report their output over a wireless link and other sensors are connected via CAT 5 (Ethernet...cable to reduce the chance of a wireless ‘jamming’ event impacting ALL sensors. In addition to this first sensor suite (Sensor Suite “A”), the team...generating wind turbines, and video reconnaissance systems on unmanned aerial vehicles (UAVs). The most basic decision problem in designing a systems

  17. Manager's Role in Electromagnetic Interference (EMI) Control

    NASA Technical Reports Server (NTRS)

    Sargent, Noel B.; Lewis, Catherine C.

    2013-01-01

    This presentation captures the essence of electromagnetic compatibility (EMC) engineering from a project manager's perspective. It explains the basics of EMC and the benefits to the project of early incorporation of EMC best practices. The EMC requirement products during a project life cycle are identified, along with the requirement verification methods that should be utilized. The goal of the presentation is to raise awareness and demystify electromagnetic compatibility for managers who have little or no electromagnetics background.

  18. Quadrennial Review of Military Compensation (7th). Compensation Structure. Major Topical Summary (MTS) 1

    DTIC Science & Technology

    1992-08-01

    professional sports franchises, fast food restaurants, or a widget factory as well as the uniformed services. The 7th QRMC identified two additional...1990 ... C-8; Figure C-7. Basic Pay as a Percentage of RMC, by Grade, 1991 ... C-11; Figure C-8. Current Enlisted BAS vs...independent survey. * A separate but simplified system of special and incentive pays. * Expense reimbursements. * Other allowances and so-called fringe

  19. Simplified filtered Smith predictor for MIMO processes with multiple time delays.

    PubMed

    Santos, Tito L M; Torrico, Bismark C; Normey-Rico, Julio E

    2016-11-01

    This paper proposes a simplified tuning strategy for the multivariable filtered Smith predictor. It is shown that offset-free control can be achieved with step references and disturbances regardless of the poles of the primary controller, i.e., integral action is not explicitly required. This strategy reduces the number of design parameters and simplifies the tuning procedure because the implicit integrative poles are not considered for design purposes. The simplified approach can be used to design continuous-time or discrete-time controllers. Three case studies are used to illustrate the advantages of the proposed strategy compared with the standard approach, which is based on explicit integrative action. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Hypersonic Vehicle Propulsion System Simplified Model Development

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Raitano, Paul; Le, Dzu K.; Ouzts, Peter

    2007-01-01

    This document addresses the modeling task plan for the hypersonic GN&C GRC team members. The overall propulsion system modeling task plan is a multi-step process, and the task plan identified in this document addresses the first steps (short-term modeling goals). The procedures and tools produced from this effort will be useful for creating simplified dynamic models applicable to a hypersonic vehicle propulsion system. The document continues with the GRC short-term modeling goal. Next, a general description of the desired simplified model is presented along with simulations that are available to varying degrees. The simulations may be available in electronic form (FORTRAN, CFD, MatLab,...) or in paper form in published documents. Finally, roadmaps outlining possible avenues towards realizing a simplified model are presented.

  1. Nonlinear Dynamic Inversion Baseline Control Law: Architecture and Performance Predictions

    NASA Technical Reports Server (NTRS)

    Miller, Christopher J.

    2011-01-01

    A model reference dynamic inversion control law has been developed to provide a baseline control law for research into adaptive elements and other advanced flight control law components. This controller has been implemented and tested in a hardware-in-the-loop simulation; the simulation results show excellent handling qualities throughout the limited flight envelope. A simple angular momentum formulation was chosen because it can be included in the stability proofs for many basic adaptive theories, such as model reference adaptive control. Many design choices and implementation details reflect the requirements placed on the system by the nonlinear flight environment and the desire to keep the system as basic as possible to simplify the addition of the adaptive elements. Those design choices are explained, along with their predicted impact on the handling qualities.

  2. Thirty Years of Nearshore Research

    NASA Astrophysics Data System (ADS)

    Stanton, T. P.

    2006-12-01

    An assessment of Ed Thornton's contributions to nearshore wave, current and morphology research on the eve of his retirement reveals his ability to identify important unresolved processes, and to participate in multidisciplinary research programs that address those issues. While doing this, he has consistently helped foster the new generations of field scientists both by supervising strong masters and PhD students from around the world, and through working with successful postdoctoral students, many of whom will present talks in this session. This presentation will summarize the major field programs that he has very actively participated in starting from my first field work with him as a colleague/helper in the NSTS Blacks Beach and Santa Barbara experiments. In reviewing these experiments it is interesting to see the evolution of our understanding of the surf zone from simplified 2D wave breaking processes to the 3D wave/current/sediment transport problems seen in morphologically controlled rip currents, both through our ability to measure these processes, but also in the sophistication of numerical models of the nearshore. This review also shows how the consistent and well directed basic research funding from the Coastal Geosciences program at ONR has greatly benefited the Navy and the community at large. I know the nearshore community looks forward to continued associations with him during his very active retirement.

  3. Research on transient thermal process of a friction brake during repetitive cycles of operation

    NASA Astrophysics Data System (ADS)

    Slavchev, Yanko; Dimitrov, Lubomir; Dimitrov, Yavor

    2017-12-01

    Simplified models are used in classical engineering analyses of friction brake heating temperature during repetitive cycles of operation, basically to determine the maximum and minimum brake temperatures. The objective of the present work is to broaden and complement the possibilities for research through a model that is based on the classical scheme of Newton's law of cooling and improves on these studies by adding a disturbance function for the corresponding braking process. A general case of braking in a non-periodic repetitive mode is considered, for which a piecewise function is defined to apply pulse thermal loads to the system. Cases with rectangular and triangular waveforms are presented. A periodic repetitive braking process is also studied, using a periodic rectangular waveform until a steady thermal state is achieved. Different numerical methods, such as Euler's method, the classical fourth-order Runge-Kutta (RK4) and the Runge-Kutta-Fehlberg 4-5 (RKF45), are used to solve the non-linear differential equation of the model. The constructed model allows the time for reaching the brake's steady thermal state to be determined effectively during pre-engineering calculations, actual braking modes in vehicles and material handling machines to be simulated, and the thermal impact to be accounted for when performing fatigue calculations.
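The described approach, Newton's law of cooling driven by a rectangular pulse disturbance and integrated with classical RK4, can be sketched as follows; all parameter values are invented for illustration and are not the paper's:

```python
# Illustrative sketch: cooling law dT/dt = -k*(T - T_amb) + q(t), where q(t)
# is a periodic rectangular heat pulse standing in for repetitive braking,
# integrated with classical RK4. Parameter values are made up.
def q(t, period=10.0, on=2.0, amplitude=5.0):
    """Rectangular heat-input pulse: on for `on` seconds each period."""
    return amplitude if (t % period) < on else 0.0

def rk4_step(T, t, dt, k=0.05, T_amb=20.0):
    f = lambda tt, TT: -k * (TT - T_amb) + q(tt)
    k1 = f(t, T)
    k2 = f(t + dt / 2, T + dt / 2 * k1)
    k3 = f(t + dt / 2, T + dt / 2 * k2)
    k4 = f(t + dt, T + dt * k3)
    return T + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

T, t, dt = 20.0, 0.0, 0.1
for _ in range(2000):          # 200 s of repetitive braking
    T = rk4_step(T, t, dt)
    t += dt
```

Running long enough to reach the quasi-steady cycle, as here, is exactly the "time to reach the steady thermal state" that the paper proposes to determine in pre-engineering calculations.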

  4. Fast ray-tracing of human eye optics on Graphics Processing Units.

    PubMed

    Wei, Qi; Patkar, Saket; Pai, Dinesh K

    2014-05-01

    We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective in modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth of field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show application of the technique in patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
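As a minimal illustration of the per-surface operation at the heart of such an eye ray tracer, here is the standard vector form of Snell's law; this is textbook optics, not code from the paper's GPU implementation, and the refractive-index values are only nominal:

```python
import math

# Standard vector form of Snell's law: refract a unit ray direction d at a
# surface with unit normal n (pointing toward the ray), indices n1 -> n2.
def refract(d, n, n1, n2):
    r = n1 / n2
    cos_i = -(d[0] * n[0] + d[1] * n[1] + d[2] * n[2])
    s2 = r * r * (1.0 - cos_i * cos_i)    # sin^2 of the refraction angle
    if s2 > 1.0:
        return None                        # total internal reflection
    cos_t = math.sqrt(1.0 - s2)
    return tuple(r * di + (r * cos_i - cos_t) * ni for di, ni in zip(d, n))

# A ray entering the cornea head-on (air ~1.0 -> cornea ~1.376) is undeviated:
straight = refract((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), 1.0, 1.376)
```

A full simulator applies this at every ocular surface a ray crosses, which is why arbitrary polygon-mesh surfaces (rather than analytical ones) matter for abnormal anatomy.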

  5. Auditory Alterations in Children Infected by Human Immunodeficiency Virus Verified Through Auditory Processing Test

    PubMed Central

    Romero, Ana Carla Leite; Alfaya, Lívia Marangoni; Gonçales, Alina Sanches; Frizzo, Ana Claudia Figueiredo; Isaac, Myriam de Lima

    2016-01-01

    Introduction The auditory system of HIV-positive children may have deficits at various levels, such as the high incidence of problems in the middle ear that can cause hearing loss. Objective The objective of this study is to characterize the development of children infected by the Human Immunodeficiency Virus (HIV) in the Simplified Auditory Processing Test (SAPT) and the Staggered Spondaic Word Test. Methods We performed behavioral tests composed of the Simplified Auditory Processing Test and the Portuguese version of the Staggered Spondaic Word Test (SSW). The participants were 15 children infected by HIV, all using antiretroviral medication. Results The children had abnormal auditory processing verified by the Simplified Auditory Processing Test and the Portuguese version of the SSW. In the Simplified Auditory Processing Test, 60% of the children presented hearing impairment. In the SAPT, the memory test for verbal sounds showed more errors (53.33%), whereas in the SSW, 86.67% of the children showed deficiencies indicating deficits in figure-ground, attention, and auditory memory skills. Furthermore, there were more errors in conditions of background noise in both age groups, where most errors were in the left ear in the group of 8-year-olds, with similar results for the group aged 9 years. Conclusion The high incidence of hearing loss in children with HIV and its comorbidity with several biological and environmental factors indicate the need for: 1) family and professional awareness of the impact of auditory alterations on the development and learning of children with HIV, and 2) access to educational plans and follow-up with multidisciplinary teams as early as possible to minimize the damage caused by auditory deficits. PMID:28050213

  6. Light-Gated Memristor with Integrated Logic and Memory Functions.

    PubMed

    Tan, Hongwei; Liu, Gang; Yang, Huali; Yi, Xiaohui; Pan, Liang; Shang, Jie; Long, Shibing; Liu, Ming; Wu, Yihong; Li, Run-Wei

    2017-11-28

    Memristive devices are able to store and process information, which offers several key advantages over the transistor-based architectures. However, most of the two-terminal memristive devices have fixed functions once made and cannot be reconfigured for other situations. Here, we propose and demonstrate a memristive device "memlogic" (memory logic) as a nonvolatile switch of logic operations integrated with memory function in a single light-gated memristor. Based on nonvolatile light-modulated memristive switching behavior, a single memlogic cell is able to achieve optical and electrical mixed basic Boolean logic of reconfigurable "AND", "OR", and "NOT" operations. Furthermore, the single memlogic cell is also capable of functioning as an optical adder and digital-to-analog converter. All the memlogic outputs are memristive for in situ data storage due to the nonvolatile resistive switching and persistent photoconductivity effects. Thus, as a memdevice, the memlogic has potential for not only simplifying the programmable logic circuits but also building memristive multifunctional optoelectronics.
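As a purely behavioral illustration of the reconfigurable logic-with-memory idea the abstract describes (a toy model, not the device physics; the class and method names are invented):

```python
# Toy behavioral model (not the device physics): a "memlogic" cell as a
# nonvolatile switch whose stored configuration selects which Boolean
# operation is applied to an optical and an electrical input.
class MemlogicCell:
    def __init__(self):
        self.mode = "AND"    # nonvolatile configuration state
        self.last = None     # output is retained (memory function)

    def configure(self, mode):
        self.mode = mode     # reconfigure to "AND", "OR" or "NOT"

    def operate(self, optical, electrical=None):
        if self.mode == "AND":
            self.last = optical and electrical
        elif self.mode == "OR":
            self.last = optical or electrical
        else:                # "NOT" acts on the single optical input
            self.last = not optical
        return self.last

cell = MemlogicCell()
cell.operate(True, False)                    # AND -> False
cell.configure("OR")
print(cell.operate(True, False), cell.last)  # True True
```

The point of the device is that both the configuration and the last output persist without power, which this stateful toy model only mimics in software.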

  7. Continuous tooth generation in mouse is induced by activated epithelial Wnt/β-catenin signaling

    PubMed Central

    Järvinen, Elina; Salazar-Ciudad, Isaac; Birchmeier, Walter; Taketo, Makoto M.; Jernvall, Jukka; Thesleff, Irma

    2006-01-01

    The single replacement from milk teeth to permanent teeth makes mammalian teeth different from teeth of most nonmammalian vertebrates and other epithelial organs such as hair and feathers, whose continuous replacement has been linked to Wnt signaling. Here we show that mouse tooth buds expressing stabilized β-catenin in epithelium give rise to dozens of teeth. The molar crowns, however, are typically simplified unicusped cones. We demonstrate that the supernumerary teeth develop by a renewal process where new signaling centers, the enamel knots, bud off from the existing dental epithelium. The basic aspects of the unlocked tooth renewal can be reproduced with a computer model on tooth development by increasing the intrinsic level of activator production, supporting the role of β-catenin pathway as an upstream activator of enamel knot formation. These results may implicate Wnt signaling in tooth renewal, a capacity that was all but lost when mammals evolved progressively more complicated tooth shapes. PMID:17121988

  8. Microgravity

    NASA Image and Video Library

    2004-04-15

    Ribbons is a program developed at UAB and used worldwide to graphically depict complicated protein structures in a simplified format. The program uses sophisticated computer systems to understand the implications of protein structures. The Influenza virus remains a major causative agent for a large number of deaths among the elderly and young children and huge economic losses due to illness. Finding a cure will have a general impact both on basic research into fast-evolving infectious agents by viral pathologists and on clinical treatment of influenza virus infection. The reproduction process of all strains of influenza is dependent on the same enzyme, neuraminidase. Shown here is a segmented representation of the neuraminidase inhibitor compound sitting inside a cave-like contour of the neuraminidase enzyme surface. This cave-like formation, present in every neuraminidase enzyme, is the active site crucial to the flu's ability to infect. The space-grown crystals of neuraminidase have provided significant new details about the three-dimensional characteristics of this active site, thus allowing researchers to design drugs that fit tighter into the site. Principal Investigator: Dr. Larry DeLucas

  9. Point of Care- A Novel Approach to Periodontal Diagnosis-A Review

    PubMed Central

    Nayak, Prathibha Anand; Rana, Shivendra

    2017-01-01

    Periodontal disease, one of the most prevalent oral diseases, is characterized by gingival inflammation and periodontal tissue destruction. Diagnosing this disease is challenging for clinicians as the disease process is discontinuous and shows periods of exacerbation and remission. Traditional diagnostic methods basically tell about past tissue destruction, so new diagnostic methods are required that are able to detect the active state of the disease, determine future progression, and estimate the response to therapy, thereby helping in the better clinical management of the patient. Both saliva and gingival crevicular fluid (GCF) are believed to be reliable media for detecting the biomarkers that play a pivotal role in measuring disease activity. Keeping these observations in mind, rapid chairside tests called Point of Care (POC) diagnostics have been developed to diagnose periodontal disease, simplifying diagnosis and helping to improve prognosis. This review article highlights the biomarkers used in diagnosis and surveys the various available point-of-care diagnostic devices. PMID:28969294

  10. Understanding the Electrical Interplay Between a Firing Set and Exploding Metal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Malley, Patrick D.; Garasi, Christopher J.

    There is a significant body of work going back centuries that describes in detail the workings of metals that are rapidly transitioned from a solid to a vapor and beyond. These are known as exploding metals and have a variety of applications. A common way to cause metals to explode is through the use of a capacitive discharge circuit (CDC). In the past, methods have been used to simplify the complex, non-linear interaction between the CDC and the metal, but in the process some important physics has been lost. This report provides insight into the complex interplay of the metal and the various elements of the CDC. In explaining the basic phenomena in greater detail than has been done before, other interesting cases such as "dwell" are understood in a new light. The net result is a detailed look at the mechanisms which shape the current pulses that scientists and engineers have observed for many decades.

  11. Management information system of medical equipment using mobile devices

    NASA Astrophysics Data System (ADS)

    Núñez, C.; Castro, D.

    2011-09-01

    The large number of technologies currently incorporated into mobile devices transforms them into excellent tools for capturing and managing information, because their increasing computing power and storage allow many miscellaneous applications to be added. In order to obtain the benefits of these technologies in the biomedical engineering field, a mobile information system for medical equipment management was developed. The central platform of the system is a mobile phone which, through a connection with a web server, is capable of sending and receiving information related to any medical equipment. By decoding a type of barcode known as QR codes, the management process is simplified and improved. These barcodes identify the medical equipment in a database; when the codes are photographed and decoded with the mobile device, relevant information about the medical equipment in question can be accessed. In its current state, this project is a basic support tool for the maintenance of medical equipment. It is also a modern, competitive, and economical alternative in the current market.

  12. High-performance vertical organic transistors.

    PubMed

    Kleemann, Hans; Günther, Alrun A; Leo, Karl; Lüssem, Björn

    2013-11-11

    Vertical organic thin-film transistors (VOTFTs) are promising devices to overcome the transconductance and cut-off frequency restrictions of horizontal organic thin-film transistors. The basic physical mechanisms of VOTFT operation, however, are not well understood and VOTFTs often require complex patterning techniques using self-assembly processes which impedes a future large-area production. In this contribution, high-performance vertical organic transistors comprising pentacene for p-type operation and C60 for n-type operation are presented. The static current-voltage behavior as well as the fundamental scaling laws of such transistors are studied, disclosing a remarkable transistor operation with a behavior limited by injection of charge carriers. The transistors are manufactured by photolithography, in contrast to other VOTFT concepts using self-assembled source electrodes. Fluorinated photoresist and solvent compounds allow for photolithographical patterning directly and strongly onto the organic materials, simplifying the fabrication protocol and making VOTFTs a prospective candidate for future high-performance applications of organic transistors. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Terry Turbopump Analytical Modeling Efforts in Fiscal Year 2016 - Progress Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Douglas; Ross, Kyle; Cardoni, Jeffrey N

    This document details the Fiscal Year 2016 modeling efforts to define the true operating limitations (margins) of the Terry turbopump systems used in the nuclear industry for the Milestone 3 (full-scale component) and Milestone 4 (Terry turbopump basic science) experiments. The overall multinational-sponsored program creates the technical basis to: (1) reduce and defer additional utility costs, (2) simplify plant operations, and (3) provide a better understanding of the true margin, which could reduce the overall risk of operations.

  14. A presentation of the black hole stretching effect

    NASA Astrophysics Data System (ADS)

    Vasileios Kontomaris, Stylianos; Malamou, Anna

    2018-01-01

    Black holes and the physics behind them are a fascinating topic for students of all levels. The exotic conditions which prevail near a black hole should be discussed and presented to undergraduate students in order to increase their interest in studying physics and to provide useful insights into basic physics concepts, such as non-uniform gravitational fields. For this purpose, a simplified presentation of the stretching effect experienced by an object near a black hole is given in this paper.

  15. Algorithm Estimates Microwave Water-Vapor Delay

    NASA Technical Reports Server (NTRS)

    Robinson, Steven E.

    1989-01-01

    Accuracy equals or exceeds that of conventional linear algorithms. The "profile" algorithm is an improved algorithm that uses water-vapor-radiometer data to produce estimates of microwave delays caused by water vapor in the troposphere. It does not require site-specific and weather-dependent empirical parameters other than standard meteorological data, latitude, and altitude for use in conjunction with published standard atmospheric data. The basic premise of the profile algorithm is that the wet-path delay is approximated closely by the solution to a simplified version of the nonlinear delay problem, generated numerically from each radiometer observation and simultaneous meteorological data.

  16. Hardware implementation of Lorenz circuit systems for secure chaotic communication applications.

    PubMed

    Chen, Hsin-Chieh; Liau, Ben-Yi; Hou, Yi-You

    2013-02-18

    This paper presents the synchronization between master and slave Lorenz chaotic systems by a sliding mode controller (SMC)-based technique. A proportional-integral (PI) switching surface is proposed to simplify the task of assigning the performance of the closed-loop error system in sliding mode. Then, extending the concept of equivalent control and using some basic electronic components, a secure communication system is constructed. Experimental results show the feasibility of synchronizing two Lorenz circuits via the proposed SMC.
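    The master-slave synchronization idea in this abstract can be illustrated numerically. The sketch below is only a minimal illustration, not the paper's PI-surface sliding mode controller: it couples a slave Lorenz system to a master through a simple proportional error-feedback term and integrates both with forward Euler; the gain `k`, step size, and initial conditions are arbitrary choices.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def synchronize(steps=20000, dt=1e-3, k=100.0):
    """Drive a slave Lorenz system onto a master trajectory with plain
    proportional error feedback (illustrative gain, not the paper's SMC)."""
    master = np.array([1.0, 1.0, 1.0])
    slave = np.array([-5.0, 3.0, 10.0])
    for _ in range(steps):
        u = -k * (slave - master)              # error-feedback control input
        master = master + dt * lorenz(master)  # forward-Euler step
        slave = slave + dt * (lorenz(slave) + u)
    return float(np.linalg.norm(slave - master))

print(synchronize())  # residual error should be tiny
```

Because the feedback gain dominates the Lorenz Jacobian, the error contracts exponentially and the slave tracks the chaotic master trajectory, which is the property a chaotic secure-communication link relies on.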

  17. Retrofit implementation of Zernike phase plate imaging for cryo-TEM

    PubMed Central

    Marko, Michael; Leith, ArDean; Hsieh, Chyongere; Danev, Radostin

    2011-01-01

    In-focus phase-plate imaging is particularly beneficial for cryo-TEM because it offers a substantial overall increase in image contrast, without an electron dose penalty, and it simplifies image interpretation. We show how phase-plate cryo-TEM can be implemented with an appropriate existing TEM, and provide a basic practical introduction to use of thin-film (carbon) phase plates. We point out potential pitfalls of phase-plate operation, and discuss solutions. We provide information on evaluating a particular TEM for its suitability. PMID:21272647

  18. Translations on USSR Science and Technology Biomedical and Behavioral Sciences No. 47.

    DTIC Science & Technology

    1978-10-27

    problem is allowed to be simplified, then the basic contents of the ergonomic section may be castrated; if it is complicated, then the document will be...demands of agriculture. The rise of livestock farming does not mean only an increase in cattle stock, but also an increase in the productivity of... cattle and poultry. Leonid Il'yich Brezhnev stressed at the July Plenum: "All that we want to have from the animal husbandry is more meat, milk, and other

  19. System for rapid detection of antibiotic resistance of airborne pathogens

    NASA Astrophysics Data System (ADS)

    Fortin, M.; Noiseux, I.; Mouslinkina, L.; Vernon, M. L.; Laflamme, C.; Filion, G.; Duchaine, C.; Ho, J.

    2009-05-01

    This project uses function-based detection via a fundamental understanding of the genetic markers of AR to distinguish harmful organisms from innocuous ones. This approach circumvents complex analyses to unravel the taxonomic details of 1399 pathogen species, enormously simplifying detection requirements. Laval Hospital's fast permeabilization strategy enables AR revelation in <1hr. Packaging the AR protocols in liquid-processing cartridges and coupling these to our in-house miniature fiber optic flow cell (FOFC) provides first responders with timely information on-site. INO's FOFC platform consists of a specialty optical fiber through which a hole is transversally bored by laser micromachining. The analyte solution is injected into the hole of the fiber and the particles are detected and counted. The advantage with respect to classic free-space FC is that alignment occurs only during fabrication, and complex excitation and collection optics are replaced by optical fibers. Moreover, we use a sheathless configuration, which increases the portability of the system and reduces excess biohazard material and the need for weekly maintenance. In this paper we present the principle of our FOFC along with a demonstration of the basic capability of the platform for detection of Bacillus cereus spores using permeabilized staining.

  20. The influence of battery degradation level on the selected traction parameters of a light-duty electric vehicle

    NASA Astrophysics Data System (ADS)

    Juda, Z.; Noga, M.

    2016-09-01

    The article describes the results of an analysis of the impact of the degradation level of a lead-acid battery on selected traction parameters of an electric light-duty vehicle. Lead-acid batteries are still used in these types of vehicles: they do not require complex performance management and monitoring systems and are easy to maintain. Despite their basic disadvantage, the low energy density, low price is a decisive factor for their use in low-speed electric vehicles. The aging process of the battery, related to an increase in the internal resistance of the cells and the loss of electric capacity of the battery, was considered. A simplified model of the cooperation of the DC electric motor with the battery, assuming increased internal resistance, was presented. The paper shows the results of comparative traction tests of a light-duty vehicle equipped with a set of new batteries and with a set of batteries having a significant degradation level. The analysis of the obtained results showed that correct exploitation of the battery can slow down the degradation processes and thus extend the battery life cycle.

  1. A neural-network-based approach to the double traveling salesman problem.

    PubMed

    Plebe, Alessio; Anile, Angelo Marcello

    2002-02-01

    The double traveling salesman problem is a variation of the basic traveling salesman problem where targets can be reached by two salespersons operating in parallel. The real problem addressed by this work concerns the optimization of the harvest sequence for the two independent arms of a fruit-harvesting robot. This application poses further constraints, like a collision-avoidance function. The proposed solution is based on a self-organizing map structure, initialized with as many artificial neurons as the number of targets to be reached. One of the key components of the process is the combination of competitive relaxation with a mechanism for deleting and creating artificial neurons. Moreover, in the competitive relaxation process, information about the trajectory connecting the neurons is combined with the distance of neurons from the target. This strategy prevents tangles in the trajectory and collisions between the two tours. Results of tests indicate that the proposed approach is efficient and reliable for harvest sequence planning. Moreover, the enhancements added to the pure self-organizing map concept are of wider importance, as proved by a traveling salesman problem version of the program, simplified from the double version for comparison.
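    The self-organizing map approach described above can be sketched in a few lines: an elastic ring of neurons is repeatedly pulled toward randomly chosen targets while the learning rate and neighbourhood shrink, and the tour is read off the final ring order. This is only the basic SOM-for-TSP idea; the paper's neuron deletion/creation mechanism, trajectory-aware relaxation, and two-arm collision avoidance are not reproduced, and all hyperparameters below are ad hoc.

```python
import numpy as np

def som_tsp(cities, n_neurons=None, iters=4000, seed=0):
    """Plan a single tour over `cities` with an elastic ring of neurons."""
    rng = np.random.default_rng(seed)
    cities = np.asarray(cities, dtype=float)
    n = len(cities)
    m = n_neurons or 3 * n
    # neurons start at random positions inside the bounding box of the targets
    neurons = rng.uniform(cities.min(0), cities.max(0), size=(m, 2))
    lr, radius = 0.8, m / 2.0
    for _ in range(iters):
        city = cities[rng.integers(n)]
        winner = int(np.argmin(((neurons - city) ** 2).sum(axis=1)))
        # Gaussian neighbourhood measured along the ring (circular distance)
        d = np.abs(np.arange(m) - winner)
        d = np.minimum(d, m - d)
        h = np.exp(-(d ** 2) / (2.0 * radius ** 2))
        neurons += lr * h[:, None] * (city - neurons)
        lr *= 0.9997                          # decay learning rate
        radius = max(radius * 0.999, 1.0)     # shrink neighbourhood
    # each city is claimed by its nearest neuron; ring order gives the tour
    claims = [int(np.argmin(((neurons - c) ** 2).sum(axis=1))) for c in cities]
    return np.argsort(claims)
```

The competitive relaxation keeps neighbouring neurons close on the ring, which is what discourages tangled trajectories in the single-salesman case.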

  2. Simplified signal processing for impedance spectroscopy with spectrally sparse sequences

    NASA Astrophysics Data System (ADS)

    Annus, P.; Land, R.; Reidla, M.; Ojarand, J.; Mughal, Y.; Min, M.

    2013-04-01

    The classical method for measurement of electrical bio-impedance involves excitation with a sinusoidal waveform. Sinusoidal excitation at fixed frequency points enables a wide variety of signal processing options, the most general of them being the Fourier transform. Multiplication with two quadrature waveforms at the desired frequency can easily be accomplished both in the analogue and in the digital domain; even the simplest quadrature square waves can be considered, which reduces the signal processing task in the analogue domain to synchronous switching followed by a low-pass filter, and in the digital domain requires only additions. So-called spectrally sparse excitation sequences (SSS), which have recently been introduced into the bio-impedance measurement domain, are a very reasonable choice when simultaneous multifrequency excitation is required. They have many good properties, such as ease of generation and a good crest factor compared to similar multisinusoids. So far, the discrete or fast Fourier transform has typically been considered for the signal processing step. Simplified methods would nevertheless reduce the computational burden and enable simpler, less costly, and less energy-hungry signal processing platforms. The accuracy of measurement with SSS excitation when using different waveforms for quadrature demodulation is compared in order to evaluate the feasibility of the simplified signal processing. A sigma-delta modulated sinusoid (a binary signal) is considered to be a good alternative for synchronous demodulation.
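    The square-wave simplification described above is easy to verify numerically. The sketch below is an illustrative reconstruction, not the authors' implementation: it demodulates a synthetic sinusoidal "measurement" (all signal parameters are made up) with quadrature sine/cosine references and again with quadrature square waves. The square-wave estimate only needs a π/2 rescaling, because the fundamental of a unit square wave has amplitude 4/π and the higher odd harmonics average to zero against a pure sinusoid.

```python
import numpy as np

f, fs, A, phi = 10.0, 10000.0, 2.0, 0.7     # assumed test-signal parameters
t = np.arange(0.0, 1.0, 1.0 / fs)           # exactly 10 signal periods
s = A * np.sin(2 * np.pi * f * t + phi)     # stand-in for the measured response

def demod(signal, ref_i, ref_q):
    """Multiply by quadrature references and average (synchronous detection)."""
    return (signal * ref_i).mean(), (signal * ref_q).mean()

# sinusoidal references: I = (A/2)cos(phi), Q = (A/2)sin(phi)
i1, q1 = demod(s, np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t))
amp_sin = 2.0 * np.hypot(i1, q1)

# square-wave references: the multiplication degenerates to sign flips
# (switching), and the magnitude needs a pi/2 rescaling
i2, q2 = demod(s, np.sign(np.sin(2 * np.pi * f * t)),
                  np.sign(np.cos(2 * np.pi * f * t)))
amp_sq = (np.pi / 2.0) * np.hypot(i2, q2)
```

Both estimates recover the amplitude `A`; the square-wave path is the one that maps onto synchronous switching in analogue hardware or pure additions in digital hardware.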

  3. Baby Talk as a Simplified Register. Papers and Reports on Child Language Development, No. 9.

    ERIC Educational Resources Information Center

    Ferguson, Charles A.

    Every speech community has a baby talk register (BT) of phonological, grammatical, and lexical features regarded as primarily appropriate for addressing young children and also for other displaced or extended uses. Much BT is analyzable as derived from normal adult speech (AS) by such simplifying processes as reduction, substitution, assimilation,…

  4. Modeling and simulation of anion-exchange membrane chromatography for purification of Sf9 insect cell-derived virus-like particles.

    PubMed

    Ladd Effio, Christopher; Hahn, Tobias; Seiler, Julia; Oelmeier, Stefan A; Asen, Iris; Silberer, Christine; Villain, Louis; Hubbuch, Jürgen

    2016-01-15

    Recombinant protein-based virus-like particles (VLPs) are steadily gaining in importance as innovative vaccines against cancer and infectious diseases. Multiple VLPs are currently evaluated in clinical phases requiring a straightforward and rational process design. To date, there is no generic platform process available for the purification of VLPs. In order to accelerate and simplify VLP downstream processing, there is a demand for novel development approaches, technologies, and purification tools. Membrane adsorbers have been identified as promising stationary phases for the processing of bionanoparticles due to their large pore sizes. In this work, we present the potential of two strategies for designing VLP processes following the basic tenet of 'quality by design': High-throughput experimentation and process modeling of an anion-exchange membrane capture step. Automated membrane screenings allowed the identification of optimal VLP binding conditions yielding a dynamic binding capacity of 5.7 mg/mL for human B19 parvovirus-like particles derived from Spodoptera frugiperda Sf9 insect cells. A mechanistic approach was implemented for radial ion-exchange membrane chromatography using the lumped-rate model and stoichiometric displacement model for the in silico optimization of a VLP capture step. For the first time, process modeling enabled the in silico design of a selective, robust and scalable process with minimal experimental effort for a complex VLP feedstock. The optimized anion-exchange membrane chromatography process resulted in a protein purity of 81.5%, a DNA clearance of 99.2%, and a VLP recovery of 59%. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Simplifier: a web tool to eliminate redundant NGS contigs.

    PubMed

    Ramos, Rommel Thiago Jucá; Carneiro, Adriana Ribeiro; Azevedo, Vasco; Schneider, Maria Paula; Barh, Debmalya; Silva, Artur

    2012-01-01

    Modern genomic sequencing technologies produce a large amount of data with reduced cost per base; however, this data consists of short reads. This reduction in the size of the reads, compared to those obtained with previous methodologies, presents new challenges, including a need for efficient algorithms for the assembly of genomes from short reads and for resolving repetitions. Additionally, after ab initio assembly, curation of the hundreds or thousands of contigs generated by assemblers demands considerable time and computational resources. We developed Simplifier, a stand-alone software tool that selectively eliminates redundant sequences from the collection of contigs generated by ab initio assembly of genomes. Application of Simplifier to data generated by assembly of the genome of Corynebacterium pseudotuberculosis strain 258 reduced the number of contigs generated by ab initio methods from 8,004 to 5,272, a reduction of 34.14%; in addition, N50 increased from 1 kb to 1.5 kb. Processing the contigs of Escherichia coli DH10B with Simplifier reduced the mate-paired library by 17.47% and the fragment library by 23.91%. Simplifier removed redundant sequences from datasets produced by assemblers, thereby reducing the effort required for finalization of genome assembly in tests with data from prokaryotic organisms. Simplifier is available at http://www.genoma.ufpa.br/rramos/softwares/simplifier.xhtml. It requires Sun JDK 6 or higher.
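    The core idea, dropping contigs whose sequence is already present in another contig, can be sketched as follows. This is a naive illustration of redundancy removal, not Simplifier's actual algorithm (whose matching criteria are not given in the abstract): it treats a contig as redundant when it, or its reverse complement, is an exact substring of a longer kept contig.

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return seq[::-1].translate(str.maketrans("ACGT", "TGCA"))

def remove_redundant(contigs):
    """Keep only contigs not contained (on either strand) in a kept contig.

    Contigs are visited longest-first so containers are kept before the
    sequences they contain are examined.
    """
    kept = []
    for c in sorted(contigs, key=len, reverse=True):
        if not any(c in k or revcomp(c) in k for k in kept):
            kept.append(c)
    return kept

# GATC and CGAT are substrings of ATCGATCG and are dropped; TTTT survives
print(remove_redundant(["ATCGATCG", "GATC", "TTTT", "CGAT"]))
```

A production tool would use an index (e.g. suffix-array or k-mer based) rather than this quadratic substring scan, but the filtering logic is the same.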

  6. A novel simplified model for torsional vibration analysis of a series-parallel hybrid electric vehicle

    NASA Astrophysics Data System (ADS)

    Tang, Xiaolin; Yang, Wei; Hu, Xiaosong; Zhang, Dejiu

    2017-02-01

    In this study, based on our previous work, a novel simplified torsional vibration dynamic model is established to study the torsional vibration characteristics of a compound planetary hybrid propulsion system. The main frequencies of the hybrid driveline are determined. In contrast to the vibration characteristics of the previous 16-degree-of-freedom model, the simplified model can be used to accurately describe the low-frequency vibration properties of this hybrid powertrain. This study provides a basis for further vibration control of the hybrid powertrain during engine start/stop.

  7. A simplified and powerful image processing methods to separate Thai jasmine rice and sticky rice varieties

    NASA Astrophysics Data System (ADS)

    Khondok, Piyoros; Sakulkalavek, Aparporn; Suwansukho, Kajpanya

    2018-03-01

    A simplified and powerful image processing procedure to separate the paddy of KHAW DOK MALI 105 (Thai jasmine rice) from the paddy of the sticky rice variety RD6 is proposed. The procedure consists of image thresholding, image chain coding, and curve fitting using a polynomial function. From the fitting, three parameters of each variety, perimeter, area, and eccentricity, were calculated. Finally, the overall parameters were combined using principal component analysis. The results show that this procedure can effectively separate the two varieties.
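    The three shape descriptors named in the abstract (perimeter, area, eccentricity) can be computed directly from a binary grain mask, and the resulting feature vectors reduced with principal component analysis. The sketch below is a generic NumPy reconstruction, not the authors' chain-code-based pipeline: perimeter is approximated by counting boundary pixels, and eccentricity is derived from the central second moments of the pixel coordinates.

```python
import numpy as np

def grain_features(mask):
    """Area, boundary-pixel perimeter, and moment-based eccentricity of a
    binary grain mask (generic definitions, not the paper's exact ones)."""
    mask = mask.astype(bool)
    ys, xs = np.nonzero(mask)
    area = xs.size
    p = np.pad(mask, 1)
    # a pixel is interior if all four 4-neighbours are also grain pixels
    interior = p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:]
    perimeter = int((mask & ~interior).sum())
    xc, yc = xs.mean(), ys.mean()
    mu20 = ((xs - xc) ** 2).mean()
    mu02 = ((ys - yc) ** 2).mean()
    mu11 = ((xs - xc) * (ys - yc)).mean()
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam1 = (mu20 + mu02 + common) / 2      # major-axis variance
    lam2 = (mu20 + mu02 - common) / 2      # minor-axis variance
    ecc = np.sqrt(1 - lam2 / lam1) if lam1 > 0 else 0.0
    return area, perimeter, ecc

def pca(X, k=2):
    """Project feature rows onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    w, v = np.linalg.eigh(np.cov(Xc, rowvar=False))
    return Xc @ v[:, np.argsort(w)[::-1][:k]]
```

An elongated jasmine-rice grain yields eccentricity near 1, a more compact grain near 0, so the two varieties separate along the leading principal components of these features.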

  8. Particle circulation and solids transport in large bubbling fluidized beds. Progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Homsy, G.M.

    1982-04-01

    We have undertaken a theoretical study of the possibility of the formation of plumes or channeling when coal particles volatilize upon introduction to a fluidized bed, Fitzgerald (1980). We have completed the analysis of the basic state of uniform flow and are currently completing a stability analysis. We have modified the continuum equations of fluidization, Homsy et al. (1980), to include the source of gas due to volatilization, which we assume to be uniformly distributed spatially. Simplifying these equations and solving leads to the prediction of a basic state analogous to the state of uniform fluidization found when no source is present within the medium. We are currently completing a stability analysis of this basic state which will give the critical volatilization rate above which the above simple basic state is unstable. Because of the experimental evidence of Jewett and Lawless (1981), who observed regularly spaced plume-like instabilities upon drying a bed of saturated silica gel, we are considering two-dimensional periodic disturbances. The analysis is similar to that given by Homsy et al. (1980) and Medlin et al. (1974). We hope to determine the stability limits for this system shortly.

  9. Estimating the circuit delay of FPGA with a transfer learning method

    NASA Astrophysics Data System (ADS)

    Cui, Xiuhai; Liu, Datong; Peng, Yu; Peng, Xiyuan

    2017-10-01

    With the increase in FPGA (Field Programmable Gate Array) functionality, the FPGA has become an on-chip system platform. Owing to this increased complexity, estimating the delay of an FPGA is very challenging. To address this problem, we propose a transfer-learning estimation of delay (TLED) method to simplify delay estimation across FPGAs of different speed grades. In fact, FPGAs of the same family but different speed grades come from the same process and layout, so delay is correlated across speed grades. Therefore, one speed grade is chosen to provide the basic training samples in this paper. Training samples for other speed grades are obtained from the basic training samples through transfer learning. At the same time, we also select a few samples from the target FPGA as training samples. A general predictive model is trained on these samples; thus a single estimation model is used to estimate circuit delay for FPGAs of different speed grades. The framework of TLED includes three phases: 1) building a basic circuit delay library that includes multipliers, adders, shifters, and so on; these circuits are used to train and build the predictive model; 2) selecting, by contrasting experiments among different algorithms, the random forest algorithm to train the predictive model; 3) predicting the target circuit delay with the predictive model. The Artix-7, Kintex-7, and Virtex-7 were selected for the experiments, each including the -1, -2, -2l, and -3 speed grades. The experiments show that the delay estimation accuracy score is more than 92% with the TLED method, which shows that TLED is a feasible, efficient, and effective delay assessment method, especially in the high-level synthesis stage of FPGA tools.

  10. Children with autism spectrum disorder are skilled at reading emotion body language.

    PubMed

    Peterson, Candida C; Slaughter, Virginia; Brownell, Celia

    2015-11-01

    Autism is commonly believed to impair the ability to perceive emotions, yet empirical evidence is mixed. Because face processing may be difficult for those with autism spectrum disorder (ASD), we developed a novel test of recognizing emotion via static body postures (Body-Emotion test) and evaluated it with children aged 5 to 12 years in two studies. In Study 1, 34 children with ASD and 41 typically developing (TD) controls matched for age and verbal intelligence (VIQ [verbal IQ]) were tested on (a) our new Body-Emotion test, (b) a widely used test of emotion recognition using photos of eyes as stimuli (Baron-Cohen et al.'s "Reading Mind in the Eyes: Child" or RMEC [Journal of Developmental and Learning Disorders, 2001, Vol. 5, pp. 47-78]), (c) a well-validated theory of mind (ToM) battery, and (d) a teacher-rated empathy scale. In Study 2 (33 children with ASD and 31 TD controls), the RMEC test was simplified to the six basic human emotions. Results of both studies showed that children with ASD performed as well as their TD peers on the Body-Emotion test. Yet TD children outperformed the ASD group on ToM and on both the standard RMEC test and the simplified version. VIQ was not related to perceiving emotions via either body posture or eyes for either group. However, recognizing emotions from body posture was correlated with ToM, especially for children with ASD. Finally, reading emotions from body posture was easier than reading emotions from eyes for both groups. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Failure mode and effects analysis: a comparison of two common risk prioritisation methods.

    PubMed

    McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L

    2016-05-01

    Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. 
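    The two scoring schemes compared in this study can be mimicked in a few lines. In the sketch below, the traditional score is the standard RPN product of severity, occurrence, and detection ratings (1-10 scales); the banding rule standing in for the simplified method uses made-up cutoffs purely for illustration, since the paper does not publish its exact high/medium/low criteria. Agreement is measured as in the paper: overlap between the critical sets of the two methods.

```python
def rpn(sev, occ, det):
    """Traditional FMEA risk priority number on 1-10 rating scales."""
    return sev * occ * det

def band(sev, occ, det):
    """Simplified high/medium/low designation. The cutoffs here are
    hypothetical, for illustration only; the study does not publish its rule."""
    if sev >= 7 and occ >= 5:
        return "high"
    if sev <= 3 and occ <= 3 and det <= 3:
        return "low"
    return "medium"

# four made-up handoff failure modes rated (severity, occurrence, detection)
failures = [(9, 6, 7), (8, 5, 8), (2, 2, 3), (5, 5, 4)]
critical_rpn = {f for f in failures if rpn(*f) >= 300}   # study's RPN cutoff
critical_band = {f for f in failures if band(*f) == "high"}
agreement = len(critical_rpn & critical_band) / len(critical_rpn)
print(f"agreement on critical set: {agreement:.0%}")
```

The multiplicative RPN discriminates finely among failures, while the banding rule trades that resolution for scoring speed, which is exactly the trade-off the study quantifies.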

  12. Service management at CERN with Service-Now

    NASA Astrophysics Data System (ADS)

    Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.

    2012-12-01

    The Information Technology (IT) and the General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal - to bring the services closer to the end user based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and request fulfilment processes, which are based on a unique two-dimensional service catalogue that combines both the user and the support-team views of all services. After an extensive evaluation of the available industrial solutions, Service-now was selected as the tool to implement the CERN Service-Management processes. The initial release of the tool provided an attractive web portal for the users and successfully implemented two basic ITIL processes: incident management and request fulfilment. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools, such as the facility management systems of CERN, as well as to implement new processes such as change management. Independently of those development activities, it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that, due to the high modularity of the Service-now tool, the parallel design of ITIL processes (e.g., event management) and non-ITIL processes (e.g., computer centre hardware management) will be easily achieved. This presentation will describe the experience that we have acquired and the techniques that were followed to achieve the CERN customization of the Service-now tool.

  13. Simplified process model discovery based on role-oriented genetic mining.

    PubMed

    Zhao, Weidong; Liu, Xi; Dai, Weihui

    2014-01-01

    Process mining is the automated acquisition of process models from event logs. Although many process mining techniques have been developed, most of them are based on control flow. Meanwhile, existing role-oriented process mining methods focus on the correctness and integrity of roles while ignoring the role complexity of the process model, which directly impacts the understandability and quality of the model. To address these problems, we propose a genetic programming approach to mine simplified process models. Using a new metric of process complexity in terms of roles as the fitness function, we can find simpler process models. The new role complexity metric is derived from role cohesion and coupling, and is applied to discover roles in process models. Moreover, the fitness derived from the role complexity metric also provides a guideline for redesigning process models. Finally, we conduct a case study and experiments to show that the proposed method is more effective for streamlining the process than related approaches.
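    The cohesion/coupling-based fitness idea can be sketched in a few lines. This is an illustrative toy, not the authors' algorithm: the complexity metric here is a hypothetical weighted sum that penalizes average role coupling and rewards average role cohesion, and the genetic-programming fitness simply inverts it so that simpler models score higher.

```python
# Toy sketch of a role-complexity fitness function (illustrative only;
# the weights and the exact combination rule are assumptions, not the
# metric from the paper).

def role_complexity(cohesion, coupling, w_cohesion=1.0, w_coupling=1.0):
    """Lower is better: high coupling raises complexity, high cohesion lowers it.

    cohesion, coupling: per-role scores, each in [0, 1].
    """
    n = len(cohesion)
    avg_cohesion = sum(cohesion) / n
    avg_coupling = sum(coupling) / n
    return w_coupling * avg_coupling + w_cohesion * (1.0 - avg_cohesion)

def fitness(model):
    """Genetic-programming fitness: less complex role structure scores higher."""
    c = role_complexity(model["cohesion"], model["coupling"])
    return 1.0 / (1.0 + c)

# Two candidate models: well-factored roles vs. tangled roles.
simple  = {"cohesion": [0.9, 0.8], "coupling": [0.1, 0.2]}
tangled = {"cohesion": [0.3, 0.2], "coupling": [0.8, 0.9]}
assert fitness(simple) > fitness(tangled)
```

A search guided by this fitness would therefore prefer process models whose roles are internally cohesive and loosely coupled, which is the redesign guideline the abstract describes.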

  14. Simplified analytical model and balanced design approach for light-weight wood-based structural panel in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2016-01-01

    This paper presents a simplified analytical model and balanced design approach for modeling lightweight wood-based structural panels in bending. Because finite element analysis (FEA) models require many input parameters during the preliminary design and optimization process, an equivalent method was developed to analyze the mechanical...

  15. Simplified Interval Observer Scheme: A New Approach for Fault Diagnosis in Instruments

    PubMed Central

    Martínez-Sibaja, Albino; Astorga-Zaragoza, Carlos M.; Alvarado-Lassman, Alejandro; Posada-Gómez, Rubén; Aguila-Rodríguez, Gerardo; Rodríguez-Jarquin, José P.; Adam-Medina, Manuel

    2011-01-01

    There are different observer-based schemes to detect and isolate faults in dynamic processes. For fault diagnosis in instruments (FDI), the diagnosis schemes differ in the number of observers: the Simplified Observer Scheme (SOS) requires only one observer, uses all the inputs and a single output, and detects faults in one sensor; the Dedicated Observer Scheme (DOS) again uses all the inputs and just one output, but employs a bank of observers capable of locating multiple sensor faults; and the Generalized Observer Scheme (GOS) involves a reduced bank of observers, where each observer uses all the inputs and m-1 outputs, allowing the localization of single faults. This work proposes a new scheme, the Simplified Interval Observer (SIOS-FDI), which does not require the measurement of any input and, with just one output, allows the detection of single sensor faults. Because it requires no inputs, it greatly simplifies fault diagnosis in processes in which it is difficult to measure all the inputs, as in the case of biological reactors. PMID:22346593
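    The core idea of an interval observer that never measures the input can be illustrated with a scalar example. This is a minimal sketch under stated assumptions, not the paper's design: the plant is x[k+1] = a*x[k] + b*u[k] with 0 < a < 1, the input u is unmeasured but known to stay in [u_min, u_max], and a sensor fault is flagged whenever the measured output leaves the predicted interval.

```python
# Illustrative interval-observer fault detector for a scalar system
# (system parameters and thresholds are made up for the example).

def interval_observer(ys, a=0.9, b=1.0, u_min=0.0, u_max=1.0, margin=1e-6):
    """Return one fault flag per measurement in ys."""
    x_lo = x_up = ys[0]                # initialize both bounds at the first reading
    flags = [False]
    for y in ys[1:]:
        x_lo = a * x_lo + b * u_min    # worst-case low propagation (a > 0)
        x_up = a * x_up + b * u_max    # worst-case high propagation
        flags.append(not (x_lo - margin <= y <= x_up + margin))
        if not flags[-1]:
            x_lo = x_up = y            # trust readings consistent with the interval
    return flags

ys = [5.0, 5.0, 5.2, 50.0]             # the last sample is an obvious sensor fault
flags = interval_observer(ys)
assert flags == [False, False, False, True]
```

Note that the detector needs only the output sequence and input bounds, which is the property that makes the scheme attractive when inputs (e.g., feed composition in a bioreactor) are hard to measure.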

  16. A Fast Procedure for Optimizing Thermal Protection Systems of Re-Entry Vehicles

    NASA Astrophysics Data System (ADS)

    Ferraiuolo, M.; Riccio, A.; Tescione, D.; Gigliotti, M.

    The aim of the present work is to introduce a fast procedure for optimizing thermal protection systems for re-entry vehicles subjected to high thermal loads. A simplified one-dimensional optimization process, performed to find the optimum design variables (lengths, sections, etc.), is the first step of the proposed design procedure. Simultaneously, the most suitable materials, able to sustain high temperatures while meeting the weight requirements, are selected and positioned within the design layout. In this stage of the design procedure, simplified (generalized plane strain) FEM models are used when boundary and geometrical conditions allow a reduction of the degrees of freedom. These simplified local FEM models are useful because they are time-saving and very simple to build; they are essentially one-dimensional and can be used in optimization processes to determine the optimum configuration with regard to weight, temperature, and stresses. A triple-layer and a double-layer body, subjected to the same aero-thermal loads, have been optimized to minimize the overall weight. Full two- and three-dimensional analyses are performed to validate the simplified models. Thermal-structural analyses and optimizations are executed with the Ansys FEM code.

  17. A New Strategy in Observer Modeling for Greenhouse Cucumber Seedling Growth

    PubMed Central

    Qiu, Quan; Zheng, Chenfei; Wang, Wenping; Qiao, Xiaojun; Bai, He; Yu, Jingquan; Shi, Kai

    2017-01-01

    A state observer is an essential component in computerized control loops for greenhouse-crop systems. However, current observer modeling for greenhouse-crop systems focuses mainly on mass/energy balance, ignoring the physiological responses of crops. As a result, state observers for crop physiological responses are rarely developed, and control operations are typically based on experience rather than actual crop requirements. In addition, existing observer models require a large number of parameters, leading to heavy computational load and poor application feasibility. To address these problems, we present a new state observer modeling strategy that takes both environmental information and crop physiological responses into consideration during the observer modeling process. Using greenhouse cucumber seedlings as an example, we sample 10 physiological parameters of cucumber seedlings at different time points during the exponential growth stage and employ them, together with 8 environmental parameters, to build growth state observers. A support vector machine (SVM) serves as the mathematical tool for observer modeling. Canonical correlation analysis (CCA) is used to select the dominant environmental and physiological parameters in the modeling process. With the dominant parameters, simplified observer models are built and tested. We conduct contrast experiments with different input parameter combinations on simplified and un-simplified observers. Experimental results indicate that physiological information can improve the prediction accuracies of the growth state observers. Furthermore, the simplified observer models give equivalent or even better performance than the un-simplified ones, which verifies the feasibility of CCA. This study enables state observers to reflect crop requirements and makes them feasible for application in simplified form, which is significant for developing intelligent greenhouse control systems for modern greenhouse production. PMID:28848565

  18. A New Strategy in Observer Modeling for Greenhouse Cucumber Seedling Growth.

    PubMed

    Qiu, Quan; Zheng, Chenfei; Wang, Wenping; Qiao, Xiaojun; Bai, He; Yu, Jingquan; Shi, Kai

    2017-01-01

    A state observer is an essential component in computerized control loops for greenhouse-crop systems. However, current observer modeling for greenhouse-crop systems focuses mainly on mass/energy balance, ignoring the physiological responses of crops. As a result, state observers for crop physiological responses are rarely developed, and control operations are typically based on experience rather than actual crop requirements. In addition, existing observer models require a large number of parameters, leading to heavy computational load and poor application feasibility. To address these problems, we present a new state observer modeling strategy that takes both environmental information and crop physiological responses into consideration during the observer modeling process. Using greenhouse cucumber seedlings as an example, we sample 10 physiological parameters of cucumber seedlings at different time points during the exponential growth stage and employ them, together with 8 environmental parameters, to build growth state observers. A support vector machine (SVM) serves as the mathematical tool for observer modeling. Canonical correlation analysis (CCA) is used to select the dominant environmental and physiological parameters in the modeling process. With the dominant parameters, simplified observer models are built and tested. We conduct contrast experiments with different input parameter combinations on simplified and un-simplified observers. Experimental results indicate that physiological information can improve the prediction accuracies of the growth state observers. Furthermore, the simplified observer models give equivalent or even better performance than the un-simplified ones, which verifies the feasibility of CCA. This study enables state observers to reflect crop requirements and makes them feasible for application in simplified form, which is significant for developing intelligent greenhouse control systems for modern greenhouse production.

  19. 48 CFR 1513.507 - Clauses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... automatic data processing equipment, word processing, and similar types of commercially available equipment... CONTRACT TYPES SIMPLIFIED ACQUISITION PROCEDURES Purchase Orders 1513.507 Clauses. (a) It is the general...

  20. Diffusion of Supercritical Fluids through Single-Layer Nanoporous Solids: Theory and Molecular Simulations.

    PubMed

    Oulebsir, Fouad; Vermorel, Romain; Galliero, Guillaume

    2018-01-16

    With the advent of graphene material, membranes based on single-layer nanoporous solids appear as promising devices for fluid separation, be it of liquid or gaseous mixtures. The design of such architectured porous materials would greatly benefit from accurate models that can predict their transport and separation properties. More specifically, there is no universal understanding of how parameters such as temperature, fluid loading conditions, or the ratio of the pore size to the fluid molecular diameter influence the permeation process. In this study, we address the problem of pure supercritical fluids diffusing through simplified models of single-layer porous materials. Basically, we investigate a toy model that consists of a single-layer lattice of Lennard-Jones interaction sites with a slit gap of controllable width. We performed extensive equilibrium and biased molecular dynamics simulations to document the physical mechanisms involved at the molecular scale. We propose a general constitutive equation for the diffusional transport coefficient derived from classical statistical mechanics and kinetic theory, which can be further simplified in the ideal gas limit. This transport coefficient relates the molecular flux to the fluid density jump across the single-layer membrane. It is found to be proportional to the accessible surface porosity of the single-layer porous solid and to a thermodynamic factor accounting for the inhomogeneity of the fluid close to the pore entrance. Both quantities directly depend on the potential of mean force that results from molecular interactions between solid and fluid atoms. Comparisons with the simulation data show that the kinetic model captures how narrowing the pore size below the fluid molecular diameter dramatically lowers the value of the transport coefficient.
Furthermore, we demonstrate that our general constitutive equation allows for a consistent interpretation of the intricate effects of temperature and fluid loading conditions on the permeation process.
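    The ideal-gas limit of such a coefficient can be sketched with textbook kinetic theory. This is a back-of-the-envelope illustration, not the paper's constitutive equation: assume an effusion-like coefficient k = phi * v_bar / 4, where phi is the accessible surface porosity and v_bar the mean thermal speed, so that the molecular flux across the layer is J = k * (n_high - n_low).

```python
import math

# Effusion-style transport coefficient in the ideal-gas limit
# (formula and constants are standard kinetic theory, used here only
# to illustrate the scaling; the porosity value is made up).

KB = 1.380649e-23          # Boltzmann constant, J/K

def transport_coefficient(porosity, T, m):
    """k in m/s, for accessible porosity (dimensionless), T in K, mass m in kg."""
    v_bar = math.sqrt(8.0 * KB * T / (math.pi * m))   # mean thermal speed
    return porosity * v_bar / 4.0

m_ar = 39.95 * 1.66054e-27             # argon atom mass, kg
k300 = transport_coefficient(0.1, 300.0, m_ar)
k600 = transport_coefficient(0.1, 600.0, m_ar)

assert k300 > 0.0
# k scales as sqrt(T): doubling the temperature multiplies k by sqrt(2).
assert abs(k600 / k300 - math.sqrt(2.0)) < 1e-9
```

In the full model described above, the effective porosity would be further weighted by the thermodynamic factor derived from the potential of mean force, which is what suppresses k when the pore narrows below the molecular diameter.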

  1. Development of a one-dimensional Position Sensitive Detector for tracking applications

    NASA Astrophysics Data System (ADS)

    Lydecker, Leigh Kent, IV

    Optical Position Sensitive Detectors (PSDs) are a non-contact method of tracking the location of a light spot. Silicon-based versions of such sensors are fabricated with standard CMOS processing, are inexpensive, and provide a real-time, analog signal output corresponding to the position of the light spot. Because they are non-contact, they do not degrade over time from the surface friction caused by the repetitive sliding motion of standard full-contact sliding potentiometers. This results in long, reliable device lifetimes. In this work, an innovative PSD was developed to replace the linear hard-contact potentiometer currently used in a human-computer interface architecture. First, a basic lateral effect PSD was developed to provide real-time positioning of the mouthpiece used in the interface architecture, which tracks along a single axis. During the course of this work, multiple device geometries were fabricated and analyzed, resulting in a down-selection to a final design. This final device design was then characterized in terms of resolution and responsivity and produced in larger quantities as initial prototypes for test product integration. Finally, an electronic readout circuit was developed to interface the dual-line lateral effect PSD developed in this thesis with the specifications required for product integration. To simplify position sensing, an innovative type of optical position sensor was developed using linear photodiodes connected back-to-back. This so-called Self-Balancing Position Sensitive Detector (SBPSD) requires significantly fewer processing steps than the basic lateral effect position sensitive detector discussed above and eliminates the need for external readout circuitry entirely. Prototype devices were fabricated in this work, and their performance characteristics were established, paving the way for ultimate integration into the target product as well as additional applications.

  2. Basic science of pain.

    PubMed

    DeLeo, Joyce A

    2006-04-01

    The origin of the theory that the transmission of pain is through a single channel from the skin to the brain can be traced to the philosopher and scientist René Descartes. This simplified scheme of the reflex was the beginning of the development of the modern doctrine of reflexes. Unfortunately, Descartes' reflex theory directed both the study and treatment of pain for more than 330 years. It is still described in physiology and neuroscience textbooks as fact rather than theory. The gate control theory proposed by Melzack and Wall in 1965 rejuvenated the field of pain study and led to further investigation into the phenomena of spinal sensitization and central nervous system plasticity, which are the potential pathophysiologic correlates of chronic pain. The processing of pain takes place in an integrated matrix throughout the neuroaxis and occurs on at least three levels: peripheral, spinal, and supraspinal. Basic strategies of pain control capitalize on this concept of integration by attenuation or blockade of pain through intervention at the periphery, by activation of inhibitory processes that gate pain at the spinal cord and brain, and by interference with the perception of pain. This article discusses each level of pain modulation and reviews the mechanisms of action of opioids and potential new analgesics. A brief description of animal models frames a discussion about recent advances regarding the role of glial cells and central nervous system neuroimmune activation and innate immunity in the etiology of chronic pain states. Future investigation into the discovery and development of novel, nonopioid drug therapy may provide needed options for the millions of patients who suffer from chronic pain syndromes, including syndromes in which the pain originates from peripheral nerve, nerve root, spinal cord, bone, muscle, and disc.

  3. Modified optimal control pilot model for computer-aided design and analysis

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Schmidt, David K.

    1992-01-01

    This paper presents the theoretical development of a modified optimal control pilot model based upon the optimal control model (OCM) of the human operator developed by Kleinman, Baron, and Levison. This model is input-compatible with the OCM and retains other key aspects of the OCM, such as a linear quadratic solution for the pilot gains with inclusion of control rate in the cost function, a Kalman estimator, and the ability to account for attention allocation and perception threshold effects. An algorithm designed for easy implementation in current dynamic systems analysis and design software is presented. Example results based upon the analysis of a tracking task using three basic dynamic systems are compared with measured results and with similar analyses performed with the OCM and two previously proposed simplified optimal pilot models. The pilot frequency responses and error statistics obtained with this modified optimal control model are shown to compare more favorably to the measured experimental results than the other previously proposed simplified models evaluated.

  4. Interaction between a normal shock wave and a turbulent boundary layer at high transonic speeds. Part 1: Pressure distribution. Part 2: Wall shear stress. Part 3: Simplified formulas for the prediction of surface pressures and skin friction

    NASA Technical Reports Server (NTRS)

    Adamson, T. C., Jr.; Liou, M. S.; Messiter, A. F.

    1980-01-01

    An asymptotic description is derived for the interaction between a shock wave and a turbulent boundary layer in transonic flow, for a particular limiting case. The dimensionless difference between the external flow velocity and critical sound speed is taken to be much smaller than one, but large in comparison with the dimensionless friction velocity. The basic results are derived for a flat plate, and corrections for longitudinal wall curvature and for flow in a circular pipe are also shown. Solutions are given for the wall pressure distribution and the shape of the shock wave. Solutions for the wall shear stress are obtained, and a criterion for incipient separation is derived. Simplified solutions for both the wall pressure and skin friction distributions in the interaction region are given. These results are presented in a form suitable for use in computer programs.

  5. Characterization of impulse noise and analysis of its effect upon correlation receivers

    NASA Technical Reports Server (NTRS)

    Houts, R. C.; Moore, J. D.

    1971-01-01

    A noise model is formulated to describe the impulse noise in many digital systems. A simplified model, which assumes that each noise burst contains a randomly weighted version of the same basic waveform, is used to derive the performance equations for a correlation receiver. The expected number of bit errors per noise burst is expressed as a function of the average signal energy, signal-set correlation coefficient, bit time, noise-weighting-factor variance and probability density function, and a time range function which depends on the cross-correlation of the signal-set basis functions and the noise waveform. A procedure is established for extending the results for the simplified noise model to the general model. Unlike the performance results for Gaussian noise, it is shown that for impulse noise the error performance is affected by the choice of signal-set basis functions and that orthogonal signaling is not equivalent to on-off signaling with the same average energy.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, V.K.; Patel, A.S.; Sharma, A.

    This paper presents the design of a magnetic coil for a relativistic magnetron (RM) for the LIA (Linear Induction Accelerator)-400 system. Vacuum improves the efficiency of the RM for HPM generation. The magnetic field in an RM is a very critical parameter and should be nearly constant in the active region. Typical coils are helical in nature, with multiple turns of varying radius. Calculating the magnetic field of such coils with the basic equations for Helmholtz coils, or for a solenoid of mean radius, gives only an estimate. Field-computation software such as CST requires a small mesh size and boundaries placed very far away, so it consumes large amounts of memory and time. Here, the helical coils are simplified so that the basic law of magnetic field calculation, the Biot-Savart law, can be applied with less complexity. Pairs of spiral coils have been analyzed for magnetic field and Lorentz force. The computed field is validated experimentally. (author)

  7. A simplified boron diffusion for preparing the silicon single crystal p-n junction as an educational device

    NASA Astrophysics Data System (ADS)

    Shiota, Koki; Kai, Kazuho; Nagaoka, Shiro; Tsuji, Takuto; Wakahara, Akihiro; Rusop, Mohamad

    2016-07-01

    An educational method that includes designing, making, and evaluating actual semiconductor devices while learning the theory is one of the best ways to gain a fundamental understanding of device physics and to cultivate the ability to generate original ideas from knowledge of semiconductor devices. In this paper, a simplified boron thermal diffusion process using a sol-gel source under a normal air environment is proposed, based on a simple hypothesis, and its reproducibility and reliability are investigated in order to simplify the diffusion process for making educational devices such as p-n junctions, bipolar transistors, and pMOS devices. As a result, the method successfully formed a p+ region on the surface of n-type silicon substrates with good reproducibility, and good rectification properties of the p-n junctions were obtained. This result indicates that the process could also be applied to making pMOS or bipolar transistors, and it suggests a variety of possible applications in the educational field to foster the imagination of new devices.

  8. A simplified conjoint recognition paradigm for the measurement of gist and verbatim memory.

    PubMed

    Stahl, Christoph; Klauer, Karl Christoph

    2008-05-01

    The distinction between verbatim and gist memory traces has furthered the understanding of numerous phenomena in various fields, such as false memory research, research on reasoning and decision making, and cognitive development. To measure verbatim and gist memory empirically, an experimental paradigm and multinomial measurement model have been proposed but rarely applied. In the present article, a simplified conjoint recognition paradigm and multinomial model is introduced and validated as a measurement tool for the separate assessment of verbatim and gist memory processes. A Bayesian metacognitive framework is applied to validate guessing processes. Extensions of the model toward incorporating the processes of phantom recollection and erroneous recollection rejection are discussed.
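    The logic of such a multinomial processing tree can be sketched with a toy two-parameter version. This is an illustration in the spirit of conjoint recognition, not the authors' exact model: V is the probability of verbatim retrieval, G of gist retrieval, and b of guessing "old" when both fail.

```python
# Toy multinomial-processing-tree sketch for "old" responses
# (parameter names and values are hypothetical illustrations).

def p_old(probe, V, G, b):
    if probe == "target":              # verbatim match, else gist, else guess
        return V + (1 - V) * (G + (1 - G) * b)
    if probe == "related":             # gist-consistent lure: gist, else guess
        return G + (1 - G) * b
    return b                           # unrelated distractor: pure guessing

params = dict(V=0.6, G=0.3, b=0.2)
for probe in ("target", "related", "unrelated"):
    assert 0.0 <= p_old(probe, **params) <= 1.0

# Verbatim traces make targets easier to accept than gist-matched lures,
# which are in turn easier to accept than unrelated distractors:
assert p_old("target", **params) > p_old("related", **params) > p_old("unrelated", **params)
```

Fitting the branch probabilities of a tree like this to observed response frequencies is what allows verbatim and gist contributions to be estimated separately.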

  9. Self-assembly kinetics of microscale components: A parametric evaluation

    NASA Astrophysics Data System (ADS)

    Carballo, Jose M.

    The goal of the present work is to develop and evaluate a parametric model of a basic microscale self-assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging. Recent research efforts have proposed adapting nanoscale self-assembly (SA) processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly capillary) to spontaneously assemble components. However, there are challenges to implementing microscale SA as a commercial process. The lack of design tools prevents simple process optimization. Previous efforts have characterized specific aspects of the SA process, but existing microscale SA models do not characterize the inter-component interactions. All existing models have reduced the outcome of SA interactions to an experimentally derived value specific to a particular configuration, instead of evaluating it as a function of component-level parameters (such as speed, geometry, bonding energy, and direction). The present study parameterizes the outcome of interactions and evaluates the effect of key parameters, closing a gap in existing microscale SA models and adding a key piece toward a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for the probability of assembly of basic SA interactions. A basic SA interaction is defined as the event in which a single part arrives at an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation, and incidence angle for the component and the assembly site. Second, an experimental SA system was designed and implemented to create individual SA interactions while controlling process parameters independently. 
    SA experiments measured the outcome of SA interactions while studying the independent effect of each parameter. As a first step toward a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low-kinetic-energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. The results indicate that SA could be modeled as an energy-based process because path-dependence effects are small. Assembly probability is linearly related to orientation probability; the proportionality constant is based on the area fraction of the sites with an amplification factor. This amplification factor accounts for the ability of capillary forces to align parts with only very small areas of contact when they have low kinetic energy. These results provide unprecedented insight into SA interactions. The present study is a key step toward completing a basic model of a general SA process, and its outcome can complement existing SA process models to create a complete design tool for microscale SA systems. In addition to the SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that a major contributor to experimental variation is the stochastic nature of SA interactions combined with the limited sample size of the experiments. Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize the uncertainty in future SA experiments.
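    The linear relation reported above can be written as a one-line model. This is a hedged sketch with made-up coefficients, not the study's fitted values: assembly probability is taken proportional to orientation probability, with a slope equal to the site area fraction times a capillary-alignment amplification factor, clamped to 1.

```python
# Sketch of the linear assembly-yield relation (all numbers are
# illustrative assumptions, not experimental values from the study).

def p_assembly(p_orientation, area_fraction, amplification):
    """Probability that an arriving part assembles, saturating at 1."""
    return min(1.0, amplification * area_fraction * p_orientation)

# Example: sites cover 20% of the surface; capillary self-alignment at
# low kinetic energy amplifies the effective capture area 3x.
p = p_assembly(p_orientation=0.5, area_fraction=0.2, amplification=3.0)
assert abs(p - 0.3) < 1e-12
# Linearity below saturation: doubling orientation probability doubles yield.
assert abs(p_assembly(1.0, 0.2, 3.0) - 2 * p) < 1e-12
```

The amplification factor is the interesting physical quantity here: it encodes how capillary forces rescue marginal contacts that a purely geometric area-fraction model would count as misses.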

  10. Automatic methods of the processing of data from track detectors on the basis of the PAVICOM facility

    NASA Astrophysics Data System (ADS)

    Aleksandrov, A. B.; Goncharova, L. A.; Davydov, D. A.; Publichenko, P. A.; Roganova, T. M.; Polukhina, N. G.; Feinberg, E. L.

    2007-02-01

    New automatic methods substantially simplify and speed up the processing of data from track detectors. This makes it possible to process large data arrays and considerably improves their statistical significance, which in turn motivates new experiments planning to use large-volume targets, large-area emulsion, and solid-state track detectors [1]. In this regard, training qualified physicists capable of operating modern automatic equipment is very important. Annually, about ten Moscow students master the new methods, working at the Lebedev Physical Institute at the PAVICOM facility [2-4]. Most students specializing in high-energy physics are only given an idea of archaic manual methods of processing data from track detectors. In 2005, on the basis of the PAVICOM facility and the physics training course of Moscow State University, a new training exercise was prepared. It is devoted to determining the energy of neutrons passing through a nuclear emulsion, provides basic practical skills in processing data from track detectors using automatic equipment, and can be included in the curriculum of any physics faculty. Students who have mastered automatic data processing through the simple and pictorial example of track detectors will be able to apply their knowledge in various fields of science and technology. The formulation of training exercises for undergraduate and graduate students is a new additional application of the PAVICOM facility described earlier in [4].

  11. Quantitative accuracy of the simplified strong ion equation to predict serum pH in dogs.

    PubMed

    Cave, N J; Koo, S T

    2015-01-01

    An electrochemical approach to the assessment of acid-base status should provide a better mechanistic explanation of the metabolic component than methods that consider only pH and carbon dioxide. We hypothesized that the simplified strong ion equation (SSIE), using published dog-specific values, would predict the measured serum pH of diseased dogs. Ten dogs hospitalized for various reasons were studied, in a prospective study of a convenience sample of a consecutive series of dogs admitted to the Massey University Veterinary Teaching Hospital (MUVTH) from which serum biochemistry and blood gas analyses were performed at the same time. Serum [H+] was calculated ([H+]calc) using the SSIE and published values for the concentration and dissociation constant of the nonvolatile weak acids (Atot and Ka), and [H+]calc was then compared with each dog's measured value ([H+]measured). To determine the source of the discordance between [H+]calc and [H+]measured, the calculations were repeated using a series of substituted values for Atot and Ka. [H+]calc did not approximate [H+]measured for any dog (P = 0.499, r(2) = 0.068) and was consistently more basic. Substituting values for Atot and Ka did not significantly improve the accuracy (r(2) = 0.169 to <0.001). Substituting the effective SID (Atot-[HCO3-]) produced a strong association between [H+]calc and [H+]measured (r(2) = 0.977). Using the simplified strong ion equation with the published values for Atot and Ka does not appear to provide a quantitative explanation for the acid-base status of dogs. The efficacy of substituting the effective SID in the simplified strong ion equation suggests that the error lies in calculating the SID. Copyright © 2015 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
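    The kind of calculation the strong-ion approach involves can be sketched numerically. This is a textbook-style illustration, not the study's dog-specific model: solve the charge balance SID = [HCO3-] + [A-] for [H+], using the Henderson relation [HCO3-] = 24 * pCO2 / [H+] (with [H+] in nmol/L) and [A-] = Atot * Ka / (Ka + [H+]), then convert to pH. All constants below are generic illustrative values.

```python
import math

# Illustrative simplified strong-ion pH calculation (constants are
# generic textbook-style numbers, not the paper's published values).

def ssie_ph(sid=38.0, pco2=40.0, atot=17.4, ka_nmol=80.0):
    """sid, atot in mmol/L (mEq/L); pco2 in mmHg; Ka in nmol/L. Returns pH."""
    def excess(h):                     # positive when the trial [H+] is too low
        hco3 = 24.0 * pco2 / h         # Henderson relation, [H+] in nmol/L
        a_minus = atot * ka_nmol / (ka_nmol + h)
        return hco3 + a_minus - sid
    lo, hi = 10.0, 200.0               # bracket [H+] in nmol/L (excess is monotone)
    for _ in range(60):                # bisection
        mid = 0.5 * (lo + hi)
        if excess(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    h = 0.5 * (lo + hi)
    return -math.log10(h * 1e-9)       # nmol/L -> mol/L -> pH

ph = ssie_ph()
assert 7.30 < ph < 7.55                # physiologically plausible for these inputs
```

The study's finding that substituting the effective SID rescues the fit corresponds, in this sketch, to the output being highly sensitive to the `sid` argument: an error in SID propagates directly into the predicted pH.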

  12. Changes in the width of the tropical belt due to simple radiative forcing changes in the GeoMIP simulations

    NASA Astrophysics Data System (ADS)

    Davis, Nicholas A.; Seidel, Dian J.; Birner, Thomas; Davis, Sean M.; Tilmes, Simone

    2016-08-01

    Model simulations of future climates predict a poleward expansion of subtropical arid climates at the edges of Earth's tropical belt, which would have significant environmental and societal impacts. This expansion may be related to the poleward shift of the Hadley cell edges, where subsidence stabilizes the atmosphere and suppresses precipitation. Understanding the primary drivers of tropical expansion is hampered by the myriad forcing agents in most model projections of future climate. While many previous studies have examined the response of idealized models to simplified climate forcings and the response of comprehensive climate models to more complex climate forcings, few have examined how comprehensive climate models respond to simplified climate forcings. To shed light on robust processes associated with tropical expansion, here we examine how the tropical belt width, as measured by the Hadley cell edges, responds to simplified forcings in the Geoengineering Model Intercomparison Project (GeoMIP). The tropical belt expands in response to a quadrupling of atmospheric carbon dioxide concentrations and contracts in response to a reduction in the solar constant, with the magnitude of the response varying by a factor of 3 among the nine models. Models with more surface warming and an overall stronger temperature response to quadrupled carbon dioxide exhibit greater tropical expansion, a robust result in spite of inter-model differences in the mean Hadley cell width, parameterizations, and numerical schemes. Under a scenario where the solar constant is reduced to offset an instantaneous quadrupling of carbon dioxide, the Hadley cells remain at their preindustrial width, despite the residual stratospheric cooling associated with elevated carbon dioxide levels. Quadrupled carbon dioxide produces greater tropical belt expansion in the Southern Hemisphere than in the Northern Hemisphere. This expansion is strongest in austral summer and autumn. 
Ozone depletion has been argued to cause this pattern of changes in observations and model experiments, but the results here indicate that seasonally and hemispherically asymmetric tropical expansion can be a basic response of the general circulation to climate forcings.
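The Hadley cell edge used as the tropical-belt metric above is commonly diagnosed as the subtropical latitude where the 500 hPa mean meridional mass streamfunction changes sign poleward of its tropical extremum. A minimal sketch of that diagnostic on an idealized profile (the profile and grid below are invented, not GeoMIP output):

```python
import numpy as np

def hadley_edge(lat, psi500):
    """Northern-hemisphere Hadley cell edge: first zero crossing of the
    500 hPa streamfunction poleward of its tropical maximum, located by
    linear interpolation between grid points."""
    lat = np.asarray(lat, dtype=float)
    psi = np.asarray(psi500, dtype=float)
    imax = int(np.argmax(psi))                 # tropical maximum of the cell
    for i in range(imax, len(lat) - 1):
        if psi[i] > 0 >= psi[i + 1]:           # sign change => cell edge
            frac = psi[i] / (psi[i] - psi[i + 1])
            return lat[i] + frac * (lat[i + 1] - lat[i])
    return float("nan")

# Idealized profile peaking near 15 deg N with its edge at 30 deg N
lat = np.linspace(0.0, 60.0, 121)
psi = np.sin(np.deg2rad(6.0 * lat))
edge = hadley_edge(lat, psi)
```

Applied per model and per forcing experiment, shifts in this edge latitude quantify expansion or contraction of the tropical belt.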

  13. A simplified computational memory model from information processing.

    PubMed

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from an information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, a meta-memory is defined, on the basis of biology and graph theory, to represent a neuron or brain cortex; an intra-modular network is then developed by a modeling algorithm that maps nodes and edges, and the bi-modular network is delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information-processing algorithms. The theoretical analysis and the simulation results show that the model accords with known memory phenomena from an information-processing view.

  14. A new formal graphic language for the representation of complex energy distribution systems

    NASA Astrophysics Data System (ADS)

    Benes, E.; Viehboeck, F. P.

    A schematic notation system for the representation in design and analysis of multi-component heating systems is presented. This graphic language is clear and rigorous and allows quick changes between two basic levels of abstraction, as shown by two examples: a swimming pool with a combined solar/electric heating system and the low-temperature heating system of the Institute of Molecular Biology in Salzburg, Austria. The notation's 'energy path graphs' are better suited than commonly used simplified installation schemes for judging the relative merits of alternative system configurations.

  15. Retrofit implementation of Zernike phase plate imaging for cryo-TEM.

    PubMed

    Marko, Michael; Leith, Ardean; Hsieh, Chyongere; Danev, Radostin

    2011-05-01

    In-focus phase-plate imaging is particularly beneficial for cryo-TEM because it offers a substantial overall increase in image contrast, without an electron dose penalty, and it simplifies image interpretation. We show how phase-plate cryo-TEM can be implemented with an appropriate existing TEM, and provide a basic practical introduction to the use of thin-film (carbon) phase plates. We point out potential pitfalls of phase-plate operation, and discuss solutions. We provide information on evaluating a particular TEM for its suitability. Copyright © 2011 Elsevier Inc. All rights reserved.
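The contrast gain has a compact textbook explanation in contrast-transfer-function (CTF) terms: a Zernike plate adds roughly a π/2 phase shift, turning the sin-type transfer of a conventional in-focus image into cos-type, so low-frequency contrast no longer vanishes at zero defocus. A defocus-only sketch (spherical aberration omitted; the wavelength and feature size are illustrative):

```python
import math

def ctf(k, wavelength, defocus, phase_shift=0.0):
    """Defocus-only contrast transfer at spatial frequency k (1/m).
    phase_shift models the extra phase added by a phase plate."""
    chi = math.pi * wavelength * defocus * k ** 2
    return -math.sin(chi + phase_shift)

lam = 1.97e-12          # ~300 kV electron wavelength (m)
k_low = 1.0 / 50e-9     # spatial frequency of a 50 nm feature

# In focus: conventional imaging transfers no contrast at low frequency,
# while a ~pi/2 Zernike phase shift transfers it fully.
conventional = abs(ctf(k_low, lam, defocus=0.0))
zernike = abs(ctf(k_low, lam, defocus=0.0, phase_shift=math.pi / 2))
```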

  16. Solar dynamic heat receiver thermal characteristics in low earth orbit

    NASA Technical Reports Server (NTRS)

    Wu, Y. C.; Roschke, E. J.; Birur, G. C.

    1988-01-01

    A simplified system model is under development for evaluating the thermal characteristics and thermal performance of a solar dynamic spacecraft energy system's heat receiver. Results based on the baseline orbit, power system configuration, and operational conditions are generated for three basic receiver concepts and three concentrator surface slope errors. Receiver thermal characteristics and thermal behavior in LEO conditions are presented. The configuration in which heat is directly transferred to the working fluid is noted to generate the best system and thermal characteristics, as well as the lowest performance degradation with increasing slope error.

  17. Gasdynamic lasers and photon machines.

    NASA Technical Reports Server (NTRS)

    Christiansen, W. H.; Hertzberg, A.

    1973-01-01

    The basic operational highlights of CO2-N2 gasdynamic lasers (GDL's) are described. Features common to powerful gas lasers are indicated. A simplified model of the vibrational kinetics of the system is presented, and the importance of rapid expansion nozzles is shown from analytic solutions of the equations. A high-power pulsed GDL is described, along with estimations of power extraction. A closed-cycle laser is suggested, leading to a description of a photon generator/engine. Thermodynamic analysis of the closed-cycle laser illustrates in principle the possibility of direct conversion of laser energy to work.

  18. 3D Feature Extraction for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Silver, Deborah

    1996-01-01

    Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.
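The core of such object segmentation, thresholding followed by connectivity, can be sketched compactly. For clarity the sketch below uses a small structured 2-D grid rather than the unstructured grids treated in the article, and the field values are invented:

```python
from collections import deque

def extract_regions(field, threshold):
    """Label 4-connected coherent regions where field >= threshold
    (breadth-first flood fill; returns region count and a label grid)."""
    rows, cols = len(field), len(field[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if field[r][c] >= threshold and labels[r][c] == 0:
                count += 1
                queue = deque([(r, c)])
                labels[r][c] = count
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and field[ny][nx] >= threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return count, labels

field = [
    [0.1, 0.9, 0.8, 0.0],
    [0.0, 0.7, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.6],
    [0.5, 0.0, 0.0, 0.7],
]
n, labels = extract_regions(field, 0.5)   # three coherent regions
```

Tracking evolution then amounts to matching labeled regions between successive time steps, e.g. by overlap.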

  19. Possibility-induced simplified neutrosophic aggregation operators and their application to multi-criteria group decision-making

    NASA Astrophysics Data System (ADS)

    Şahin, Rıdvan; Liu, Peide

    2017-07-01

    The simplified neutrosophic set (SNS) is an appropriate tool for expressing the incompleteness, indeterminacy and uncertainty of the evaluation objects in a decision-making process. In this study, we define the concept of a possibility SNS, which includes two types of information: the neutrosophic performance provided by the evaluation objects and its possibility degree, expressed as a value ranging from zero to one. Because the existing neutrosophic information aggregation models for SNSs cannot effectively fuse these two different types of information, we propose two novel neutrosophic aggregation operators that account for possibility, named the possibility-induced simplified neutrosophic weighted arithmetic averaging operator and the possibility-induced simplified neutrosophic weighted geometric averaging operator, and discuss their properties. Moreover, we develop a method based on the proposed aggregation operators for solving a multi-criteria group decision-making problem with possibility simplified neutrosophic information, in which the weights of decision-makers and decision criteria are calculated from an entropy measure. Finally, a practical example is utilised to show the practicality and effectiveness of the proposed method.
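For context, the standard (non-possibility) simplified neutrosophic weighted arithmetic averaging operator that such proposals extend aggregates truth/indeterminacy/falsity triples as T = 1 − ∏(1−T_j)^{w_j}, I = ∏ I_j^{w_j}, F = ∏ F_j^{w_j}. A sketch of that baseline operator (the possibility-induced weighting of the paper is not reproduced here, and the triples are invented):

```python
import math

def snwaa(values, weights):
    """Standard simplified neutrosophic weighted arithmetic averaging of
    (T, I, F) triples in [0, 1]; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    T = 1.0 - math.prod((1.0 - t) ** w for (t, _, _), w in zip(values, weights))
    I = math.prod(i ** w for (_, i, _), w in zip(values, weights))
    F = math.prod(f ** w for (_, _, f), w in zip(values, weights))
    return (T, I, F)

# Two criterion evaluations with weights 0.6 / 0.4 (illustrative numbers)
vals = [(0.7, 0.2, 0.1), (0.5, 0.4, 0.3)]
agg = snwaa(vals, [0.6, 0.4])
```

A useful sanity property is idempotency: aggregating identical triples returns that triple regardless of the weights.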

  20. A hip joint simulator study using simplified loading and motion cycles generating physiological wear paths and rates.

    PubMed

    Barbour, P S; Stone, M H; Fisher, J

    1999-01-01

    In some designs of hip joint simulator the cost of building a highly complex machine has been offset with the requirement for a large number of test stations. The application of the wear results generated by these machines depends on their ability to reproduce physiological wear rates and processes. In this study a hip joint simulator has been shown to reproduce physiological wear using only one load vector and two degrees of motion with simplified input cycles. The actual paths of points on the femoral head relative to the acetabular cup were calculated and compared for physiological and simplified input cycles. The in vitro wear rates were found to be highly dependent on the shape of these paths, and similarities could be drawn between the shape of the physiological paths and the simplified elliptical paths.
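The head-point path computation can be pictured with a toy kinematic sketch: compose two simplified sinusoidal motion inputs as rotations and trace a marked point on the head over one cycle. The axes, amplitudes, and marked point below are invented for illustration, not the simulator's actual geometry:

```python
import math

def rot_x(p, a):
    """Rotate point p = (x, y, z) by angle a about the x axis."""
    x, y, z = p
    return (x, y * math.cos(a) - z * math.sin(a),
            y * math.sin(a) + z * math.cos(a))

def rot_z(p, a):
    """Rotate point p by angle a about the z axis."""
    x, y, z = p
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a), z)

def wear_path(point, fe_amp, rot_amp, steps=100):
    """Path of a marked head point over one cycle of two sinusoidal
    inputs: flexion-extension about x, inward-outward rotation about z."""
    path = []
    for k in range(steps + 1):
        t = 2.0 * math.pi * k / steps
        p = rot_x(point, fe_amp * math.sin(t))
        p = rot_z(p, rot_amp * math.cos(t))
        path.append(p)
    return path

# 25 deg flexion-extension, 10 deg rotation, point at the pole of the head
path = wear_path((0.0, 0.0, 1.0), math.radians(25), math.radians(10))
```

The traced loop is the simulator-frame analogue of the elliptical paths discussed in the abstract; its shape changes as the two amplitudes and their phasing change.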

  1. Unbiased methods for removing systematics from galaxy clustering measurements

    NASA Astrophysics Data System (ADS)

    Elsner, Franz; Leistedt, Boris; Peiris, Hiranya V.

    2016-02-01

    Measuring the angular clustering of galaxies as a function of redshift is a powerful method for extracting information from the three-dimensional galaxy distribution. The precision of such measurements will dramatically increase with ongoing and future wide-field galaxy surveys. However, these are also increasingly sensitive to observational and astrophysical contaminants. Here, we study the statistical properties of three methods proposed for controlling such systematics - template subtraction, basic mode projection, and extended mode projection - all of which make use of externally supplied template maps, designed to characterize and capture the spatial variations of potential systematic effects. Based on a detailed mathematical analysis, and in agreement with simulations, we find that the template subtraction method in its original formulation returns biased estimates of the galaxy angular clustering. We derive closed-form expressions that should be used to correct results for this shortcoming. Turning to the basic mode projection algorithm, we prove it to be free of any bias, whereas we conclude that results computed with extended mode projection are biased. Within a simplified setup, we derive analytical expressions for the bias and discuss the options for correcting it in more realistic configurations. Common to all three methods is an increased estimator variance induced by the cleaning process, albeit at different levels. These results enable unbiased high-precision clustering measurements in the presence of spatially varying systematics, an essential step towards realizing the full potential of current and planned galaxy surveys.
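In its simplest form, template-based cleaning of this kind reduces to linear algebra: basic mode projection removes (or, in the likelihood, strongly deweights) the part of the data vector that a least-squares fit attributes to the template maps. A minimal sketch with synthetic data and invented template columns:

```python
import numpy as np

def project_modes(data, templates):
    """Remove the least-squares fit of the template columns from data;
    the cleaned vector is orthogonal to the template subspace."""
    coeffs, *_ = np.linalg.lstsq(templates, data, rcond=None)
    return data - templates @ coeffs

rng = np.random.default_rng(0)
npix = 500
T = rng.normal(size=(npix, 3))                 # three systematics templates
signal = rng.normal(size=npix)                 # true clustering map (toy)
contaminated = signal + T @ np.array([0.5, -1.0, 2.0])
cleaned = project_modes(contaminated, T)
```

As the abstract notes, the cleaning is not free: any true signal lying in the template subspace is removed as well, which is the source of the increased estimator variance (and, for some estimators, bias) that must be accounted for.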

  2. Unit mechanisms of fission gas release: Current understanding and future needs

    DOE PAGES

    Tonks, Michael; Andersson, David; Devanathan, Ram; ...

    2018-03-01

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model on how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  3. A comparison of experimental and calculated thin-shell leading-edge buckling due to thermal stresses

    NASA Technical Reports Server (NTRS)

    Jenkins, Jerald M.

    1988-01-01

    High-temperature thin-shell leading-edge buckling test data are analyzed using NASA structural analysis (NASTRAN) as a finite element tool for predicting thermal buckling characteristics. Buckling points are predicted for several combinations of edge boundary conditions. The problem of relating the appropriate plate area to the edge stress distribution and the stress gradient is addressed in terms of analysis assumptions. Local plasticity was found to occur on the specimen analyzed, and this tended to simplify the basic problem since it effectively equalized the stress gradient from loaded edge to loaded edge. The initial loading was found to be difficult to select for the buckling analysis because of the transient nature of thermal stress. Multiple initial model loadings are likely required for complicated thermal stress time histories before a pertinent finite element buckling analysis can be achieved. The basic mode shapes determined from experimentation were correctly identified from computation.

  4. Unit mechanisms of fission gas release: Current understanding and future needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonks, Michael; Andersson, David; Devanathan, Ram

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model on how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  5. Slew maneuvers of Spacecraft Control Laboratory Experiment (SCOLE)

    NASA Technical Reports Server (NTRS)

    Kakad, Yogendra P.

    1992-01-01

    This is the final report on the dynamics and control of slew maneuvers of the Spacecraft Control Laboratory Experiment (SCOLE) test facility. The report documents the basic dynamical equation derivations for an arbitrary large-angle slew maneuver as well as the basic decentralized slew maneuver control algorithm. The set of dynamical equations incorporates the rigid-body slew maneuver and three-dimensional vibrations of the complete assembly comprising the rigid shuttle, the flexible beam, and the reflector with an offset mass. The analysis also includes kinematic nonlinearities of the entire assembly during the maneuver and the dynamics of the interactions between the rigid shuttle and the flexible appendage. The equations are simplified and evaluated numerically, including the first ten flexible modes, to yield a model for designing control systems to perform slew maneuvers. The control problem incorporates the nonlinear dynamical equations and is expressed as a two-point boundary-value problem.

  6. Mechanical modeling for magnetorheological elastomer isolators based on constitutive equations and electromagnetic analysis

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Dong, Xufeng; Li, Luyu; Ou, Jinping

    2018-06-01

    Because constitutive models are too complicated and existing mechanical models lack universality, neither is satisfactory for magnetorheological elastomer (MRE) devices. In this article, a novel universal method is proposed to build concise mechanical models. A constitutive model and electromagnetic analysis were applied in this method to ensure universality, while a series of derivations and simplifications were carried out to obtain a concise formulation. To illustrate the proposed modeling method, a conical MRE isolator was introduced. Its basic mechanical equations were built based on equilibrium, deformation compatibility, constitutive equations and electromagnetic analysis. An iteration model and a highly efficient differential-equation-editor based model were then derived to solve the basic mechanical equations. The final simplified mechanical equations were obtained by re-fitting the simulations with a novel optimization algorithm. In the end, a verification test of the isolator proved the accuracy of the derived mechanical model and the modeling method.

  7. Unit mechanisms of fission gas release: Current understanding and future needs

    NASA Astrophysics Data System (ADS)

    Tonks, Michael; Andersson, David; Devanathan, Ram; Dubourg, Roland; El-Azab, Anter; Freyss, Michel; Iglesias, Fernando; Kulacsy, Katalin; Pastore, Giovanni; Phillpot, Simon R.; Welland, Michael

    2018-06-01

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model on how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.
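The "simplified analytical models" referred to above include the classic Booth equivalent-sphere model of diffusional release from a fuel grain. A sketch of its common short-time approximation (the diffusion coefficient, grain radius, and time below are illustrative numbers, not values from this work):

```python
import math

def booth_fractional_release(D, t, a):
    """Short-time Booth equivalent-sphere approximation for fractional
    fission gas release from a grain of radius a (m), with diffusion
    coefficient D (m^2/s) over time t (s); valid while f is well below 1."""
    tau = D * t / a ** 2                       # dimensionless diffusion time
    f = 6.0 * math.sqrt(tau / math.pi) - 3.0 * tau
    return min(max(f, 0.0), 1.0)

# Illustrative only: D = 1e-20 m^2/s, 5 micrometre grain, one year at power
f_year = booth_fractional_release(1e-20, 3.15e7, 5e-6)
```

The unit-mechanism work surveyed in the paper aims, in effect, at replacing the single effective D in models like this with mechanistically grounded quantities.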

  8. Computational Unification: a Vision for Connecting Researchers

    NASA Astrophysics Data System (ADS)

    Troy, R. M.; Kingrey, O. J.

    2002-12-01

    Computational Unification of science, once only a vision, is becoming a reality. This technology is based upon a scientifically defensible, general solution for Earth Science data management and processing. The computational unification of science offers a real opportunity to foster inter and intra-discipline cooperation, and the end of 're-inventing the wheel'. As we move forward using computers as tools, it is past time to move from computationally isolating, "one-off" or discipline-specific solutions into a unified framework where research can be more easily shared, especially with researchers in other disciplines. The author will discuss how distributed meta-data, distributed processing and distributed data objects are structured to constitute a working interdisciplinary system, including how these resources lead to scientific defensibility through known lineage of all data products. Illustration of how scientific processes are encapsulated and executed illuminates how previously written processes and functions are integrated into the system efficiently and with minimal effort. Meta-data basics will illustrate how intricate relationships may easily be represented and used to good advantage. Retrieval techniques will be discussed including trade-offs of using meta-data versus embedded data, how the two may be integrated, and how simplifying assumptions may or may not help. This system is based upon the experience of the Sequoia 2000 and BigSur research projects at the University of California, Berkeley, whose goals were to find an alternative to the Hughes EOS-DIS system and is presently offered by Science Tools corporation, of which the author is a principal.

  9. The influence of a wind tunnel on helicopter rotational noise: Formulation of analysis

    NASA Technical Reports Server (NTRS)

    Mosher, M.

    1984-01-01

    An analytical model is discussed that can be used to examine the effects of wind tunnel walls on helicopter rotational noise. A complete physical model of an acoustic source in a wind tunnel is described and a simplified version is then developed. This simplified model retains the important physical processes involved, yet it is more amenable to analysis. The simplified physical model is then modeled as a mathematical problem. An inhomogeneous partial differential equation with mixed boundary conditions is set up and then transformed into an integral equation. Details of generating a suitable Green's function and integral equation are included and the equation is discussed and also given for a two-dimensional case.

  10. Simplifying the negotiating process with physicians: critical elements in negotiating from private practice to employed physician.

    PubMed

    Gallucci, Armen; Deutsch, Thomas; Youngquist, Jaymie

    2013-01-01

    The authors attempt to simplify the key elements of the process of negotiating successfully with private physicians. In their experience, the business elements that generate the most discussion center on compensation, including the incentive plan. Secondarily, how the issue of malpractice is handled will also consume a fair amount of time. The authors have also learned that intangible issues can often be the reason for an unexpectedly large amount of discussion and therefore add time to the negotiation process. To assist with this process, they have derived a negotiation checklist, which seeks to help hospital leaders and administrators set the proper framework to ensure successful negotiation conversations. More importantly, being organized, recognizing these broad issues upfront, and remaining transparent throughout the process will help to ensure a successful negotiation.

  11. A simplified computational memory model from information processing

    PubMed Central

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from an information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, a meta-memory is defined, on the basis of biology and graph theory, to represent a neuron or brain cortex; an intra-modular network is then developed by a modeling algorithm that maps nodes and edges, and the bi-modular network is delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information-processing algorithms. The theoretical analysis and the simulation results show that the model accords with known memory phenomena from an information-processing view. PMID:27876847

  12. Simplified Physics Based Models Research Topical Report on Task #2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, Srikanta; Ganesh, Priya

    We present a simplified-physics based approach, where only the most important physical processes are modeled, to develop and validate simplified predictive models of CO2 sequestration in deep saline formations. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. We use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of the fractional-flow curve, the variance of layer permeability values, and the nature of the vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. Similar correlations are also developed to predict the average pressure within the injection reservoir, and the pressure buildup within the caprock.

  13. Simplified Perovskite Solar Cell with 4.1% Efficiency Employing Inorganic CsPbBr3 as Light Absorber.

    PubMed

    Duan, Jialong; Zhao, Yuanyuan; He, Benlin; Tang, Qunwei

    2018-05-01

    Perovskite solar cells with cost-effectiveness, high power conversion efficiency, and improved stability are promising solutions to the energy crisis and environmental pollution. However, a wide-bandgap inorganic-semiconductor electron-transporting layer such as TiO2 can harvest ultraviolet light to photodegrade perovskite halides, and the high cost of a state-of-the-art hole-transporting layer is an economic burden for commercialization. Here, the building of a simplified cesium lead bromide (CsPbBr3) perovskite solar cell with fluorine-doped tin oxide (FTO)/CsPbBr3/carbon architecture by a multistep solution-processed deposition technology is demonstrated, achieving an efficiency as high as 4.1% and improved stability upon interfacial modification by graphene quantum dots and CsPbBrI2 quantum dots. This work provides new opportunities for building next-generation solar cells with significantly simplified processes and reduced production costs. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. The Condition for Generous Trust.

    PubMed

    Shinya, Obayashi; Yusuke, Inagaki; Hiroki, Takikawa

    2016-01-01

    Trust has been considered the "cement" of a society and is much studied in sociology and other social sciences. Most studies, however, have neglected one important aspect of trust: it involves an act of forgiving and showing tolerance toward another's failure. In this study, we refer to this concept as "generous trust" and examine the conditions under which generous trust becomes a more viable option when compared to other types of trust. We investigate two settings. First, we introduce two types of uncertainties: uncertainty as to whether trustees have the intention to cooperate, and uncertainty as to whether trustees have enough competence to accomplish the entrusted tasks. Second, we examine the manner in which trust functions in a broader social context, one that involves matching and commitment processes. Since we expect generosity or forgiveness to work differently in the matching and commitment processes, we must differentiate trust strategies into generous trust in the matching process and that in the commitment process. Our analytical strategy is two-fold. First, we analyze the "modified" trust game that incorporates the two types of uncertainties without the matching process. This simplified setting enables us to derive mathematical results using game theory, thereby giving basic insight into the trust mechanism. Second, we investigate socially embedded trust relationships in contexts involving the matching and commitment processes, using agent-based simulation. Results show that uncertainty about partner's intention and competence makes generous trust a viable option. In contrast, too much uncertainty undermines the possibility of generous trust. Furthermore, a strategy that is too generous cannot stand alone. Generosity should be accompanied with moderate punishment. As for socially embedded trust relationships, generosity functions differently in the matching process versus the commitment process. Indeed, these two types of generous trust coexist, and their coexistence enables a society to function well.
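A toy version of the modified trust game with the two uncertainties (all parameters invented for illustration): a trustee fulfils trust only if both willing (probability p_intent) and able (p_comp); a strict trustor exits after a single failure, while a generous trustor tolerates several failures before exiting.

```python
import random

def play(patience, p_intent, p_comp, rounds, rng):
    """Trustor payoff: +1 per fulfilled interaction, -1 per failure;
    the trustor exits once failures exceed `patience`."""
    payoff, failures = 0, 0
    for _ in range(rounds):
        success = rng.random() < p_intent and rng.random() < p_comp
        if success:
            payoff += 1
        else:
            payoff -= 1
            failures += 1
            if failures > patience:
                break
    return payoff

rng = random.Random(42)
# Average payoffs over many plays: strict (patience 0) vs generous (patience 3)
strict = sum(play(0, 0.9, 0.8, 50, rng) for _ in range(2000)) / 2000
generous = sum(play(3, 0.9, 0.8, 50, rng) for _ in range(2000)) / 2000
```

With mostly trustworthy but fallible trustees, the generous trustor outperforms the strict one, echoing the abstract's finding that moderate uncertainty makes generous trust viable; this sketch omits the matching and commitment structure of the full model.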

  15. Estimating inelastic heavy-particle-hydrogen collision data. I. Simplified model and application to potassium-hydrogen collisions

    NASA Astrophysics Data System (ADS)

    Belyaev, Andrey K.; Yakovleva, Svetlana A.

    2017-10-01

    Aims: We derive a simplified model for estimating atomic data on inelastic processes in low-energy collisions of heavy particles with hydrogen, in particular for inelastic processes with high and moderate rate coefficients. It is known that these processes are important for non-LTE modeling of cool stellar atmospheres. Methods: Rate coefficients are evaluated using the derived method, which is a simplified version of a recently proposed approach based on the asymptotic method for electronic structure calculations and the Landau-Zener model for determining nonadiabatic transition probabilities. Results: The rate coefficients are found to be expressed via statistical probabilities and reduced rate coefficients. It turns out that the reduced rate coefficients for mutual neutralization and ion-pair formation processes depend on single electronic bound energies of an atom, while the reduced rate coefficients for excitation and de-excitation processes depend on two electronic bound energies. The reduced rate coefficients are calculated and tabulated as functions of electronic bound energies. The derived model is applied to potassium-hydrogen collisions. For the first time, rate coefficients are evaluated for inelastic processes in K+H and K++H- collisions for all transitions from ground states up to and including ionic states. Tables with calculated data are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/606/A147
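The Landau-Zener ingredient of such methods is the standard single-passage transition probability at an avoided crossing, P = exp(−2πH₁₂² / (ħ v ΔF)), with a double passage per collision giving transition probability 2P(1−P). A sketch with illustrative numbers (not the paper's K+H data):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J s

def landau_zener(h12, velocity, dF):
    """Landau-Zener probabilities at an avoided crossing.
    h12: off-diagonal coupling, half the adiabatic splitting (J);
    velocity: radial collision velocity at the crossing (m/s);
    dF: |difference of diabatic potential slopes| (J/m)."""
    p_single = math.exp(-2.0 * math.pi * h12 ** 2 / (HBAR * velocity * dF))
    p_double = 2.0 * p_single * (1.0 - p_single)  # in-and-out passage
    return p_single, p_double

# Illustrative coupling, velocity, and slope difference
p1, p2 = landau_zener(h12=1e-21, velocity=5e3, dF=1e-9)
```

Rate coefficients then follow from averaging such probabilities (weighted by cross-section factors) over the collision-energy distribution, which is what the tabulated reduced rate coefficients encapsulate.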

  16. Challenges and Practices in Building and Implementing Biosafety and Biosecurity Programs to Enable Basic and Translational Research with Select Agents.

    PubMed

    Jonsson, Colleen B; Cole, Kelly Stefano; Roy, Chad J; Perlin, David S; Byrne, Gerald

    2013-04-29

    Select agent research in the United States must meet federally mandated biological surety guidelines and rules comprising two main components: biosecurity and biosafety. Biosecurity is the process employed to ensure that biological agents are properly safeguarded against theft, loss, diversion, or unauthorized access, use, or release. Biosafety comprises the processes that ensure that operations with such agents are conducted in a safe, secure and reliable manner. As such, a biological surety program is generally concerned with biological agents that present a high risk of adverse medical and/or agricultural consequences upon release outside of proper containment. The U.S. Regional and National Biocontainment Laboratories (RBLs, NBLs) represent expertise in this type of research and are actively engaged in the development of programs to address these critical needs and federal requirements. While this remains an ongoing activity for the RBLs, NBLs and other facilities that handle select agents as new guidelines and regulations are implemented, the present article is written with the goal of presenting a simplified yet comprehensive review of these requirements. Herein, we discuss the requirements and the various activities that the RBL/NBL programs have implemented to achieve the metrics set forth by various agencies within the U.S. Federal government.

  17. Chemometric aided NIR portable instrument for rapid assessment of medicine quality.

    PubMed

    Zontov, Y V; Balyklova, K S; Titova, A V; Rodionova, O Ye; Pomerantsev, A L

    2016-11-30

    The progress in instrumentation technology has led to the miniaturization of NIR instruments. Fast systems that contain no moving parts have been developed for use in the field, in warehouses, in drugstores, etc. At the same time, these portable/handheld spectrometers generally have a lower spectral resolution and a narrower spectral region than stationary ones. Vendors of portable instruments supply their equipment with special software for spectra processing, which aims at simplifying the analyst's work to the highest degree possible. Often such software is not fully capable of solving complex problems. In application to the real-world problem of counterfeit drug detection, we demonstrate that even impaired spectral data carry information sufficient for drug authentication. The chemometrics-aided approach helps to extract this information and thus extends the applicability of miniaturized NIR instruments. The MicroPhazir-RX NIR spectrometer is used as an example of a portable instrument. The data driven soft independent modeling of class analogy (DD-SIMCA) method is employed for data processing. A representative set of tablets of a calcium channel blocker from 6 different manufacturers is used to illustrate the proposed approach. It is shown that the DD-SIMCA approach yields a better result than the basic method provided by the instrument vendor. Copyright © 2016 Elsevier B.V. All rights reserved.
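A heavily simplified one-class classifier in the spirit of DD-SIMCA: fit a PCA subspace to target-class spectra and accept a new sample if its distance from that subspace is small. The real method models score and orthogonal distances with chi-squared-type distributions; this sketch uses a plain empirical cutoff on the orthogonal distance, and the "spectra" are synthetic:

```python
import numpy as np

def fit_target_class(X, n_pc=2, quantile=0.95):
    """Fit a PCA subspace to target-class data X (samples x variables);
    the acceptance cutoff is an empirical quantile of the training
    orthogonal distances (a simplification of DD-SIMCA's chi-squared fit)."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_pc].T                              # loadings (variables x n_pc)
    od = np.sum((Xc - Xc @ P @ P.T) ** 2, axis=1)
    return mean, P, np.quantile(od, quantile)

def accepts(x, mean, P, cutoff):
    """True if sample x is close enough to the target-class subspace."""
    xc = x - mean
    od = np.sum((xc - P @ (P.T @ xc)) ** 2)
    return od <= cutoff

rng = np.random.default_rng(1)
basis = rng.normal(size=(2, 50))                 # genuine spectra near a plane
train = rng.normal(size=(100, 2)) @ basis + 0.01 * rng.normal(size=(100, 50))
mean, P, cutoff = fit_target_class(train)
fake = rng.normal(size=50)                       # off-model "counterfeit"
```

Genuine-looking samples stay near the fitted subspace and are accepted; an off-model sample has a large orthogonal distance and is rejected.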

  18. Modeling Boulder Transport by Smooth Particle Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Karpytchev, M.

    2017-12-01

    Large coastal boulders are often believed to have been transported by strong tsunami and storm waves. Understanding and quantifying the boulder transport processes is, therefore, crucial for evaluating the strength and timing of past tsunamis and storms. Over the last 10-15 years, a series of studies have obtained estimates of the basic wave parameters needed to set in motion a boulder of given size, shape and mass by using simplified parameterizations of fluid-particle interactions. Although parameterizing the principal hydraulic forces driving boulder transport has been successful in reproducing the effects of several historical tsunamis, some important details about the initiation of boulder motion and the contribution of coastal wave transformations, as well as of suspended sediment, to enhancing coastal currents are still lacking. These essentially non-linear processes can be particularly important for distinguishing, in each particular case, whether it was a storm wave or a tsunami (or both) that was capable of transporting a specific boulder to a given site. In this study, we employ the Smooth Particle Hydrodynamics (SPH) method in order to gain new insights into the interaction of waves with boulders in the nearshore area. We first compare the SPH predictions with available laboratory experiments and then explore the effects of realistic 3D coastal bathymetry, the non-linear behaviour of coastal waves, boulder shape, and the impact of bedload and suspended sediment on the dislodgement and initiation of boulder transport.

  19. A novel VLSI processor architecture for supercomputing arrays

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Pattabiraman, S.; Devanathan, R.; Ahmed, Ashaf; Venkataraman, S.; Ganesh, N.

    1993-01-01

    Design of the processor element for general purpose massively parallel supercomputing arrays is highly complex and cost ineffective. To overcome this, the architecture and organization of the functional units of the processor element should be such as to suit the diverse computational structures and simplify the mapping of the complex communication structures of different classes of algorithms. This demands that the computation and communication structures of different classes of algorithms be unified. While unifying the different communication structures is a difficult process, analysis of a wide class of algorithms reveals that their computation structures can be expressed in terms of basic IP,IP,OP,CM,R,SM, and MAA operations. The execution of these operations is unified on the PAcube macro-cell array. Based on this PAcube macro-cell array, we present a novel processor element called the GIPOP processor, which has dedicated functional units to perform the above operations. The architecture and organization of these functional units are such as to satisfy the two important criteria mentioned above. The structure of the macro-cell and the unification process have led to a very regular and simple design of the GIPOP processor. The production cost of the GIPOP processor is drastically reduced as it is designed on high-performance mask-programmable PAcube arrays.

  20. Touchless attitude correction for satellite with constant magnetic moment

    NASA Astrophysics Data System (ADS)

    Ao, Hou-jun; Yang, Le-ping; Zhu, Yan-wei; Zhang, Yuan-wen; Huang, Huan

    2017-09-01

    Rescue of a satellite with an attitude fault is of great value. A satellite with an improper injection attitude may lose contact with the ground as its antenna points in the wrong direction, or encounter energy problems as its solar arrays do not face the sun. An improperly uploaded command may also put the attitude out of control, as exemplified by the Japanese Hitomi spacecraft. In engineering practice, traditional physical-contact approaches have been applied, yet with a potential risk of collision and a lack of versatility, since the mechanical systems are mission-specific. This paper puts forward a touchless attitude correction approach in which three satellites are considered: one having a constant dipole and two having magnetic coils to control the attitude of the first. Particular correction configurations are designed and analyzed to maintain the target's orbit during the attitude correction process. A reference coordinate system is introduced to simplify the control process and avoid the singular-value problem of Euler angles. Based on basic spherical-triangle relations, the accurately varying geomagnetic field is considered in the attitude dynamic model. A sliding mode control method is utilized to design the correction law. Finally, numerical simulation is conducted to verify the theoretical derivation. It can be safely concluded that the non-contact attitude correction approach for a satellite with a uniaxial constant magnetic moment is feasible and potentially applicable to on-orbit operations.

  1. Challenges and Practices in Building and Implementing Biosafety and Biosecurity Programs to Enable Basic and Translational Research with Select Agents

    PubMed Central

    Jonsson, Colleen B.; Cole, Kelly Stefano; Roy, Chad J.; Perlin, David S.; Byrne, Gerald

    2014-01-01

    Select agent research in the United States must meet federally mandated biological surety guidelines and rules comprising two main components: biosecurity and biosafety. Biosecurity is the process employed to ensure that biological agents are properly safeguarded against theft, loss, diversion, or unauthorized access, use, or release. Biosafety comprises the processes that ensure operations with such agents are conducted in a safe, secure and reliable manner. As such, a biological surety program is generally concerned with biological agents that present a high risk of adverse medical and/or agricultural consequences upon release outside of proper containment. The U.S. Regional and National Biocontainment Laboratories (RBL, NBL) represent expertise in this type of research, and are actively engaged in the development of programs to address these critical needs and federal requirements. While this remains an ongoing activity for the RBLs, NBLs and other facilities that handle select agents as new guidelines and regulations are implemented, the present article is written with the goal of presenting a simplified yet comprehensive review of these requirements. Herein, we discuss the requirements and the various activities that the RBL/NBL programs have implemented to achieve the metrics set forth by various agencies within the U.S. Federal government. PMID:24900945

  2. An agent-based approach to modelling the effects of extreme events on global food prices

    NASA Astrophysics Data System (ADS)

    Schewe, Jacob; Otto, Christian; Frieler, Katja

    2015-04-01

    Extreme climate events such as droughts or heat waves affect agricultural production in major food producing regions and can therefore influence the price of staple foods on the world market. There is evidence that recent dramatic spikes in grain prices were at least partly triggered by actual and/or expected supply shortages. The reaction of the market to supply changes is, however, highly nonlinear and depends on complex and interlinked processes such as warehousing, speculation, and export restrictions. Here we present for the first time an agent-based modelling framework that accounts, in simplified terms, for these processes and makes it possible to estimate the reaction of world food prices to supply shocks on a short (monthly) timescale. We test the basic model using observed historical supply, demand, and price data of wheat as a major food grain. Further, we illustrate how the model can be used in conjunction with biophysical crop models to assess the effect of future changes in extreme event regimes on the volatility of food prices. In particular, the explicit representation of storage dynamics makes it possible to investigate the potentially nonlinear interaction between simultaneous extreme events in different food producing regions, or between several consecutive events in the same region, both of which may occur more frequently under future global warming.
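
    The storage-mediated, nonlinear price response described above can be illustrated with a toy market model (a sketch under invented parameters, not the authors' calibrated framework): price scales with a power of the stock-to-use ratio, so a single supply shock depletes storage and keeps prices elevated in later months.

```python
def simulate_prices(supplies, demand=100.0, stock0=60.0, p_ref=200.0, k=1.5):
    """Toy monthly market: warehouses buffer supply shocks; price responds
    nonlinearly to the stock-to-use ratio. All parameters are illustrative."""
    stock, prices = stock0, []
    r0 = stock0 / demand                              # reference stock-to-use ratio
    for supply in supplies:
        stock = max(stock + supply - demand, 1e-6)    # storage absorbs the shock
        ratio = stock / demand
        prices.append(p_ref * (r0 / ratio) ** k)      # power-law price response
    return prices

# One bad harvest month (supply 80 vs demand 100) depletes storage,
# and the price stays elevated even after supply recovers.
prices = simulate_prices([100, 80, 100, 100])
print([round(p) for p in prices])
```

The persistence of the price spike after the shock is exactly the storage effect the abstract highlights: stocks, not current supply alone, set the price.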

  3. Wave interactions in a three-dimensional attachment line boundary layer

    NASA Technical Reports Server (NTRS)

    Hall, Philip; Mackerrell, Sharon O.

    1988-01-01

    The 3-D boundary layer on a swept wing can support different types of hydrodynamic instability. Attention is focused on the so-called spanwise contamination problem, which occurs when the attachment line boundary layer on the leading edge becomes unstable to Tollmien-Schlichting waves. In order to gain insight into the interactions important in that problem, a simplified basic state is considered. This simplified flow corresponds to the swept attachment line boundary layer on an infinite flat plate. The basic flow here is an exact solution of the Navier-Stokes equations and its stability to 2-D waves propagating along the attachment can be considered exactly at finite Reynolds number. This has been done in the linear and weakly nonlinear regimes. The corresponding problem is studied for oblique waves and their interaction with 2-D waves is investigated. In fact, oblique modes cannot be described exactly at finite Reynolds number so it is necessary to make a high Reynolds number approximation and use triple deck theory. It is shown that there are two types of oblique wave which, if excited, cause the destabilization of the 2-D mode and the breakdown of the disturbed flow at a finite distance from the leading edge. First, a low frequency mode related to the viscous stationary crossflow mode is a possible cause of breakdown. Second, a class of oblique wave with frequency comparable with that of the 2-D mode is another cause of breakdown. It is shown that the relative importance of the modes depends on the distance from the attachment line.

  4. Application of the top specified boundary layer (TSBL) approximation to initial characterization of an inland aquifer mineralization 1. Direct contact between fresh and saltwater

    USGS Publications Warehouse

    Rubin, H.; Buddemeier, R.W.

    1998-01-01

    This paper presents a basic study in generalized terms that originates from two needs: (1) to understand the major mechanisms involved in the mineralization of groundwater of the Great Bend Prairie aquifer of Kansas by saltwater originating from a deeper Permian bedrock formation, and (2) to develop simple, robust tools that can readily be used for local assessment and management activities in the salt-affected region. A simplified basic conceptual model is adopted, incorporating two horizontal layers of porous medium which come into contact at a specific location within the model domain. The top layer is saturated with freshwater, and the bottom layer is saturated with saltwater. The paper considers various stages of approximation which can be useful for simplified simulation of the build-up of the transition zone (TZ) between the freshwater and the saltwater. The hierarchy of approximate approaches leads to the development of the top specified boundary layer (TSBL) method, which is the major tool used in this study for initial characterization of the development of the TZ. It is shown that the thickness of the TZ is mainly determined by the characteristic dispersivity. The build-up of the TZ is completed after a time period equal to the time needed to advect a fluid particle along the whole extent of the TZ. Potential applications and the effects of natural recharge and pumpage on salinity transport in the domain are discussed and evaluated in the context of demonstrating the practicality of the TSBL approach.

  5. Development and validation of a simplified titration method for monitoring volatile fatty acids in anaerobic digestion.

    PubMed

    Sun, Hao; Guo, Jianbin; Wu, Shubiao; Liu, Fang; Dong, Renjie

    2017-09-01

    The volatile fatty acid (VFA) concentration has been considered one of the most sensitive process performance indicators in the anaerobic digestion (AD) process. However, accurate determination of VFA concentration in AD processes normally requires advanced equipment and complex pretreatment procedures. A simplified method with fewer sample pretreatment procedures and improved accuracy is greatly needed, particularly for on-site application. This report outlines improvements to the Nordmann method, one of the most popular titrations used for VFA monitoring. The influence of ion and solid interfering subsystems in titrated samples on the accuracy of results was discussed. The total solids content in titrated samples was the main factor affecting accuracy in VFA monitoring. Moreover, a high linear correlation was established between the total solids content and the VFA measurement differences between the traditional Nordmann equation and gas chromatography (GC). Accordingly, a simplified titration method was developed and validated using a semi-continuous experiment of chicken manure anaerobic digestion with various organic loading rates. The good fit of the results obtained by this method in comparison with GC results strongly supports the potential application of this method to VFA monitoring. Copyright © 2017. Published by Elsevier Ltd.
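
    A total-solids-based correction of the kind described above can be sketched as a simple linear calibration. The calibration numbers below are invented for illustration; the paper's actual regression coefficients are not reproduced here.

```python
import numpy as np

# Hypothetical calibration data: total solids content (% TS) of each sample
# and the difference between the Nordmann-titration VFA value and the GC
# reference (g/L). The values are invented for illustration only.
ts      = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
vfa_err = np.array([0.11, 0.19, 0.32, 0.41, 0.49])

slope, intercept = np.polyfit(ts, vfa_err, 1)   # least-squares calibration line

def corrected_vfa(vfa_titration, total_solids):
    """Subtract the TS-dependent bias from the raw titration result."""
    return vfa_titration - (slope * total_solids + intercept)

print(round(corrected_vfa(1.50, 6.0), 2))
```

Once the line is fitted on samples measured by both titration and GC, only the routine titration plus a TS measurement is needed on site.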

  6. A simplified boron diffusion for preparing the silicon single crystal p-n junction as an educational device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiota, Koki, E-mail: a14510@sr.kagawa-nct.ac.jp; Kai, Kazuho; Nagaoka, Shiro, E-mail: nagaoka@es.kagawa-nct.ac.jp

    An educational method that includes designing, making, and evaluating actual semiconductor devices while learning the theory is one of the best ways to obtain a fundamental understanding of device physics and to cultivate the ability to develop unique ideas using knowledge of semiconductor devices. In this paper, a simplified boron thermal diffusion process using a sol-gel material under a normal air environment is proposed based on a simple hypothesis, and its reproducibility and reliability are investigated in order to simplify the diffusion process for making educational devices such as p-n junction, bipolar, and pMOS devices. As a result, this method successfully produced a p+ region on the surface of n-type silicon substrates with good reproducibility, and good rectification properties of the p-n junctions were obtained. This result indicates that the process could be applied to making pMOS or bipolar transistors, and suggests a variety of possible applications in the educational field to foster the imagination of new devices.

  7. Single-Scale Fusion: An Effective Approach to Merging Images.

    PubMed

    Ancuti, Codruta O; Ancuti, Cosmin; De Vleeschouwer, Christophe; Bovik, Alan C

    2017-01-01

    Due to its robustness and effectiveness, multi-scale fusion (MSF) based on the Laplacian pyramid decomposition has emerged as a popular technique that has shown utility in many applications. Guided by several intuitive measures (weight maps), the MSF process is versatile and straightforward to implement. However, the number of pyramid levels increases with the image size, which implies sophisticated data management and memory accesses, as well as additional computations. Here, we introduce a simplified formulation that reduces MSF to only a single-level process. Starting from the MSF decomposition, we explain both mathematically and intuitively (visually) a way to simplify the classical MSF approach with minimal loss of information. The resulting single-scale fusion (SSF) solution is a close approximation of the MSF process that eliminates important redundant computations. It also provides insights regarding why MSF is so effective. While our simplified expression is derived in the context of high dynamic range imaging, we show its generality on several well-known fusion-based applications, such as image compositing, extended depth of field, medical imaging, and blending thermal (infrared) images with visible light. Besides visual validation, quantitative evaluations demonstrate that our SSF strategy is able to yield results that are highly competitive with traditional MSF approaches.
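
    As an illustration of the general idea (not the paper's derived SSF operator), a naive one-level fusion simply normalizes the per-pixel weight maps and blends the inputs:

```python
import numpy as np

def single_level_fuse(imgs, weights):
    """Naive one-level weighted fusion: normalize the weight maps per pixel
    and blend. Illustrative only; the paper's SSF is derived from the
    Laplacian-pyramid MSF decomposition, not from this direct blend."""
    W = np.stack(weights).astype(float)
    W /= W.sum(axis=0, keepdims=True) + 1e-12        # per-pixel normalization
    return sum(w * img for w, img in zip(W, np.stack(imgs).astype(float)))

dark   = np.full((2, 2), 0.2)                        # under-exposed input
bright = np.full((2, 2), 0.9)                        # over-exposed input
# Well-exposedness-style weights favoring pixels near mid-gray (assumption)
wd = np.exp(-((dark - 0.5) ** 2) / 0.08)
wb = np.exp(-((bright - 0.5) ** 2) / 0.08)
fused = single_level_fuse([dark, bright], [wd, wb])
print(round(float(fused[0, 0]), 3))
```

A direct blend like this tends to produce halos at strong edges, which is precisely why MSF blends per pyramid level; SSF's contribution is collapsing those levels without reintroducing such artifacts.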

  8. Neuraminidase Ribbon Diagram

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Ribbons is a program developed at UAB and used worldwide to graphically depict complicated protein structures in a simplified format. The program uses sophisticated computer systems to understand the implications of protein structures. The influenza virus remains a major cause of death among the elderly and young children, and of huge economic losses due to illness. Finding a cure will have a general impact both on basic research into the viral pathology of fast-evolving infectious agents and on the clinical treatment of influenza virus infection. The reproduction process of all strains of influenza is dependent on the same enzyme, neuraminidase. Shown here is a segmented representation of a neuraminidase inhibitor compound sitting inside a cave-like contour of the neuraminidase enzyme surface. This cave-like formation, present in every neuraminidase enzyme, is the active site crucial to the flu's ability to infect. The space-grown crystals of neuraminidase have provided significant new details about the three-dimensional characteristics of this active site, thus allowing researchers to design drugs that fit more tightly into the site. Principal Investigator: Dr. Larry DeLucas

  9. New approaches for calculating Moran's index of spatial autocorrelation.

    PubMed

    Chen, Yanguang

    2013-01-01

    Spatial autocorrelation plays an important role in geographical analysis; however, there is still room for improvement of this method. The formula for Moran's index is complicated, and several basic problems remain to be solved. Therefore, I will reconstruct its mathematical framework using derivations based on linear algebra and present four simple approaches to calculating Moran's index. Moran's scatterplot will be ameliorated, and new test methods will be proposed. The relationship between the global Moran's index and Geary's coefficient will be discussed from two different vantage points: spatial population and spatial sample. The sphere of applications for both Moran's index and Geary's coefficient will be clarified and defined. One of the theoretical findings is that Moran's index is a characteristic parameter of spatial weight matrices, so the selection of weight functions is very significant for autocorrelation analysis of geographical systems. A case study of 29 Chinese cities in 2000 will be employed to validate the innovatory models and methods. This work is a methodological study, which will simplify the process of autocorrelation analysis. The results of this study will lay the foundation for the scaling analysis of spatial autocorrelation.
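
    The global Moran's index itself is compact enough to compute directly from a spatial weight matrix; a minimal sketch, assuming the standard formulation I = (n/S0) * (z'Wz)/(z'z) with z the mean-centered values:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's index for values x under spatial weight matrix W."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()                      # deviations from the mean
    W = np.asarray(W, dtype=float)
    s0 = W.sum()                          # sum of all weights, S0
    n = len(x)
    return (n / s0) * (z @ W @ z) / (z @ z)

# Four nodes on a chain, with low values clustered at one end and high
# values at the other: positive spatial autocorrelation.
x = [1.0, 1.0, 5.0, 5.0]
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i(x, W), 3))
```

As the abstract notes, the result depends on the chosen weight matrix W: replacing the binary adjacency above with, say, row-standardized or distance-decay weights changes the index for the same data.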

  10. CRISPR-Cas9 Toolkit for Actinomycete Genome Editing.

    PubMed

    Tong, Yaojun; Robertsen, Helene Lunde; Blin, Kai; Weber, Tilmann; Lee, Sang Yup

    2018-01-01

    Bacteria of the order Actinomycetales are one of the most important sources of bioactive natural products, which are the source of many drugs. However, many of them still lack efficient genome editing methods, and some strains cannot be manipulated at all. This restricts systematic metabolic engineering approaches for boosting known and discovering novel natural products. In order to facilitate genome editing for actinomycetes, we developed a CRISPR-Cas9 toolkit with high efficiency for actinomycete genome editing. This basic toolkit includes software for spacer (sgRNA) identification, a system for in-frame gene/gene cluster knockout, a system for gene loss-of-function study, a system for generating a random size deletion library, and a system for gene knockdown. For the latter, a uracil-specific excision reagent (USER) cloning technology was adapted to simplify the CRISPR vector construction process. The application of this toolkit was successfully demonstrated by perturbation of the genomes of Streptomyces coelicolor A3(2) and Streptomyces collinus Tü 365. The CRISPR-Cas9 toolkit and related protocol described here can be widely used for metabolic engineering of actinomycetes.
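
    The spacer-identification step can be illustrated with a minimal scan for 20-nt sequences immediately 5' of an NGG PAM. This is a generic sketch, not the toolkit's actual software: real tools also scan the reverse strand, filter by GC content, and score off-targets.

```python
import re

def find_spacers(seq, spacer_len=20):
    """Return candidate Cas9 spacers: the 20 nt directly 5' of each NGG PAM
    on the forward strand (simplified; no reverse strand, no off-target check)."""
    spacers = []
    for m in re.finditer(r'(?=([ACGT])GG)', seq):   # every NGG PAM position
        start = m.start() - spacer_len
        if start >= 0:                              # need a full-length spacer
            spacers.append(seq[start:m.start()])
    return spacers

seq = "ATGCTAGCTAGGCTAGCTAACGATCGATCGTAGCTAGCTAGG"
for s in find_spacers(seq):
    print(s)
```

The zero-width lookahead lets overlapping PAMs (e.g. in a GGG run) each yield their own candidate spacer.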

  11. Development and implementation of the NMR-spectrometer on the basis of the National Instruments technologies

    NASA Astrophysics Data System (ADS)

    Narakidze, N. D.; Shaykhutdinov, D. V.; Shirokov, K. M.; Gorbatenko, N. I.; Yanvarev, S. G.

    2017-02-01

    In mechanical engineering, and particularly in transmission gear equipment, the quality of lubricating oil is a factor that considerably determines the reliability and safety of the whole propulsion system and of the lubricated components. There are many soluble oil additives, such as additives for extreme-pressure conditions or anti-wear additives. Additives are used with mineral oils, mineral oil products or synthetic oils to improve lubricant action or chemical properties. Currently, the most exact way to determine the chemical composition of a substance is the method of nuclear magnetic resonance (NMR). The first section of this article provides a brief and very simplified review of the basic principles of NMR using classical physics. The second section describes the hardware solutions and the architecture of the NMR spectrometer. The software developments (LabVIEW programs) for data acquisition and signal processing are presented in the third section. Finally, measurement results are provided.
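
    The classical picture reviewed in the first section reduces to the Larmor relation: nuclear magnetic moments precess about the applied field B0 at frequency f = γB0/2π. A quick check for protons (the gyromagnetic ratio is a standard constant; the field strength is an arbitrary example, not the instrument's actual field):

```python
import math

GAMMA_1H = 267.522e6    # proton gyromagnetic ratio, rad s^-1 T^-1
B0 = 1.5                # example applied field strength, tesla

# Larmor precession frequency f = gamma * B0 / (2*pi), in MHz
f_larmor_mhz = GAMMA_1H * B0 / (2 * math.pi) / 1e6
print(round(f_larmor_mhz, 1))
```

This is the resonance frequency the spectrometer's RF chain must generate and detect, which is why the field strength fixes the hardware design.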

  12. A solid-state digital temperature recorder for space use

    NASA Technical Reports Server (NTRS)

    Westbrook, R. M.; Bennett, L. D.; Steinhaver, R. A.; Deboo, G. J.

    1981-01-01

    A solid-state, digital, temperature recorder has been developed for use in space experiments. The recorder is completely self-contained and includes a temperature sensor; all necessary electronics for signal conditioning, processing, storing, control and timing; and a battery power supply. No electrical interfacing with the particular spacecraft on which the unit is used is required. The recorder is small, light, and sturdy, and has no moving parts. It uses only biocompatible materials and has passed vibration and shock spaceflight qualification tests. The unit is capable of storing 2048, -10 to +45 C, 8-bit temperature measurements taken at intervals selectable by factors of 2 from 1.875 to 240 min; data can be retained for at least 6 months. The basic recorder can be simplified to accommodate a variety of applications by adding memory to allow more data to be recorded, by changing the front end to permit measurements other than temperature to be made, and by using different batteries to realize various operating periods. Stored flight data are read out from the recorder by means of a ground read-out unit.
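
    A linear 8-bit coding of the recorder's -10 to +45 C range would look like the following. The abstract does not specify the actual encoding, so the linear mapping here is an assumption for illustration:

```python
def encode_temp(t_celsius, t_min=-10.0, t_max=45.0, bits=8):
    """Quantize a temperature in the -10..+45 C range to an 8-bit code
    (assumed linear mapping; the recorder's real coding is not documented)."""
    levels = (1 << bits) - 1                       # 255 quantization steps
    t = min(max(t_celsius, t_min), t_max)          # clamp to the valid range
    return round((t - t_min) / (t_max - t_min) * levels)

def decode_temp(code, t_min=-10.0, t_max=45.0, bits=8):
    """Map an 8-bit code back to degrees Celsius."""
    levels = (1 << bits) - 1
    return t_min + code / levels * (t_max - t_min)

code = encode_temp(20.0)
print(code, round(decode_temp(code), 2))
```

With 255 steps over a 55 C span, the quantization step is about 0.22 C, which bounds the round-trip error of any such 8-bit scheme.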

  13. Transcriptomic responses of a simplified soil microcosm to a plant pathogen and its biocontrol agent reveal a complex reaction to harsh habitat.

    PubMed

    Perazzolli, Michele; Herrero, Noemí; Sterck, Lieven; Lenzi, Luisa; Pellegrini, Alberto; Puopolo, Gerardo; Van de Peer, Yves; Pertot, Ilaria

    2016-10-27

    Soil microorganisms are key determinants of soil fertility and plant health. Soil phytopathogenic fungi are one of the most important causes of crop losses worldwide. Microbial biocontrol agents have been extensively studied as alternatives for controlling phytopathogenic soil microorganisms, but molecular interactions between them have mainly been characterised in dual cultures, without taking into account the soil microbial community. We used an RNA sequencing approach to elucidate the molecular interplay of a soil microbial community in response to a plant pathogen and its biocontrol agent, in order to examine the molecular patterns activated by the microorganisms. A simplified soil microcosm containing 11 soil microorganisms was incubated with a plant root pathogen (Armillaria mellea) and its biocontrol agent (Trichoderma atroviride) for 24 h under controlled conditions. More than 46 million paired-end reads were obtained for each replicate and 28,309 differentially expressed genes were identified in total. Pathway analysis revealed complex adaptations of soil microorganisms to the harsh conditions of the soil matrix and to reciprocal microbial competition/cooperation relationships. Both the phytopathogen and its biocontrol agent were specifically recognised by the simplified soil microcosm: defence reaction mechanisms and neutral adaptation processes were activated in response to competitive (T. atroviride) or non-competitive (A. mellea) microorganisms, respectively. Moreover, activation of resistance mechanisms dominated in the simplified soil microcosm in the presence of both A. mellea and T. atroviride. Biocontrol processes of T. atroviride were already activated during incubation in the simplified soil microcosm, possibly to occupy niches in a competitive ecosystem, and they were not further enhanced by the introduction of A. mellea. This work represents an additional step towards understanding molecular interactions between plant pathogens and biocontrol agents within a soil ecosystem. Global transcriptional analysis of the simplified soil microcosm revealed complex metabolic adaptation in the soil environment and specific responses to antagonistic or neutral intruders.

  14. Induced simplified neutrosophic correlated aggregation operators for multi-criteria group decision-making

    NASA Astrophysics Data System (ADS)

    Şahin, Rıdvan; Zhang, Hong-yu

    2018-03-01

    The induced Choquet integral is a powerful tool for dealing with information of an imprecise or uncertain nature. This study proposes a combination of the induced Choquet integral and neutrosophic information. We first give the operational properties of simplified neutrosophic numbers (SNNs). Then, we develop some new information aggregation operators, including an induced simplified neutrosophic correlated averaging (I-SNCA) operator and an induced simplified neutrosophic correlated geometric (I-SNCG) operator. These operators not only consider the importance of elements or their ordered positions, but also take into account the interaction phenomena among decision criteria or their ordered positions under multiple decision-makers. Moreover, we present a detailed analysis of the I-SNCA and I-SNCG operators, including the properties of idempotency, commutativity and monotonicity, and study the relationships among the proposed operators and existing simplified neutrosophic aggregation operators. In order to handle multi-criteria group decision-making (MCGDM) situations where the weights of criteria and decision-makers are usually correlated and the criterion values are given as SNNs, an approach is established based on the I-SNCA operator. Finally, a numerical example is presented to demonstrate the proposed approach and to verify its effectiveness and practicality.
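
    The non-induced building block of such operators, the simplified neutrosophic weighted averaging of (T, I, F) triples, can be sketched as follows. The induced, Choquet-integral variants studied in the paper additionally reorder the arguments and aggregate over a fuzzy measure, which this sketch omits:

```python
import math

def snwa(snns, weights):
    """Simplified neutrosophic weighted averaging of (T, I, F) triples:
    T aggregates via 1 - prod((1-T_j)^w_j); I and F via prod(x_j^w_j).
    Plain weighted form only; not the paper's induced/correlated operators."""
    t = 1.0 - math.prod((1.0 - T) ** w for (T, _, _), w in zip(snns, weights))
    i = math.prod(I ** w for (_, I, _), w in zip(snns, weights))
    f = math.prod(F ** w for (_, _, F), w in zip(snns, weights))
    return (t, i, f)

a = (0.6, 0.2, 0.3)          # example SNN: (truth, indeterminacy, falsity)
b = (0.8, 0.1, 0.1)
T, I, F = snwa([a, b], [0.5, 0.5])
print(round(T, 3), round(I, 3), round(F, 3))
```

Note the asymmetry: truth aggregates through the complement so that any highly true input pulls T up, while indeterminacy and falsity aggregate multiplicatively and are pulled down.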

  15. Simplified Method for Preparing Methylene-Blue-Sensitized Dichromated Gelatin

    NASA Astrophysics Data System (ADS)

    Kurokawa, Kazumasa; Koike, Satoshi; Namba, Sinji; Mizuno, Toru; Kubota, Toshihiro

    1998-05-01

    Methylene-blue-sensitized dichromated gelatin (MBDCG) is a suitable material for recording full-color holograms in a single layer. However, a drying process in an ammonia atmosphere is necessary to prepare the MBDCG plate. This process is time-consuming and unstable. A simplified method for preparing the MBDCG plate is presented in which the MBDCG can be dried without ammonia. Elimination of the drying process is possible when the methylene blue in MBDCG does not separate. This is achieved by a decrease in the concentration of dichromate in the photosensitized solution and the addition of an ammonia solution to the photosensitized solution. Last, the gelatin is allowed to gel. A Lippmann color hologram grating with a diffraction efficiency of more than 80% is obtained by use of this MBDCG.

  16. Low energy production processes in manufacturing of silicon solar cells

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, A. R.

    1976-01-01

    Ion implantation and pulsed energy techniques are being combined for fabrication of silicon solar cells totally under vacuum and at room temperature. Simplified sequences allow very short processing times with small process energy consumption. Economic projections for fully automated production are excellent.

  17. Analysis of mode-locked and intracavity frequency-doubled Nd:YAG laser

    NASA Technical Reports Server (NTRS)

    Siegman, A. E.; Heritier, J.-M.

    1980-01-01

    The paper presents analytical and computer studies of the CW mode-locked and intracavity frequency-doubled Nd:YAG laser which provide new insight into the operation, including the detuning behavior, of this type of laser. Computer solutions show that the steady-state pulse shape for this laser is much closer to a truncated cosine than to a Gaussian; there is little spectral broadening for on-resonance operation; and the chirp is negligible. This leads to a simplified analytical model carried out entirely in the time domain, with atomic linewidth effects ignored. Simple analytical results for on-resonance pulse shape, pulse width, signal intensity, and harmonic conversion efficiency in terms of basic laser parameters are derived from this model. A simplified physical description of the detuning behavior is also developed. Agreement is found with experimental studies showing that the pulsewidth decreases as the modulation frequency is detuned off resonance; the harmonic power output initially increases and then decreases; and the pulse shape develops a sharp-edged asymmetry of opposite sense for opposite signs of detuning.

  18. Simplified Aircraft-Based Paired Approach: Concept Definition and Initial Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Lohr, Gary W.; McKissick, Burnell T.; Abbott, Terence S.; Geurreiro, Nelson M.; Volk, Paul

    2013-01-01

    Simplified Aircraft-based Paired Approach (SAPA) is an advanced concept proposed by the Federal Aviation Administration (FAA) to support dependent parallel approach operations to runways with lateral spacing closer than 2500 ft. At the request of the FAA, NASA performed an initial assessment of the potential performance and feasibility of the SAPA concept, including developing and assessing an operational implementation of the concept and conducting a Monte Carlo wake simulation study to examine the longitudinal spacing requirements. The SAPA concept was shown to have significant operational advantages in supporting the pairing of aircraft with dissimilar final approach speeds. The wake simulation study showed that support for dissimilar final approach speeds could be significantly enhanced through the use of a two-phased altitude-based longitudinal positioning requirement, with larger longitudinal positioning allowed for higher altitudes out of ground effect and tighter longitudinal positioning defined for altitudes near and in ground effect. While this assessment is preliminary and there are a number of operational issues still to be examined, it has shown the basic SAPA concept to be technically and operationally feasible.

  19. Prediction of high temperature metal matrix composite ply properties

    NASA Technical Reports Server (NTRS)

    Caruso, J. J.; Chamis, C. C.

    1988-01-01

    The application of the finite element method (superelement technique) in conjunction with basic concepts from mechanics of materials theory is demonstrated to predict the thermomechanical behavior of high temperature metal matrix composites (HTMMC). The simulated behavior is used as a basis to establish characteristic properties of a unidirectional composite idealized as an equivalent homogeneous material. The ply properties predicted include: thermal properties (thermal conductivities and thermal expansion coefficients) and mechanical properties (moduli and Poisson's ratio). These properties are compared with those predicted by a simplified, analytical composite micromechanics model. The predictive capabilities of the finite element method and the simplified model are illustrated through the simulation of the thermomechanical behavior of a P100-graphite/copper unidirectional composite at room temperature and near matrix melting temperature. The advantage of the finite element analysis approach is its ability to more precisely represent the composite local geometry and hence capture the subtle effects that are dependent on this. The closed form micromechanics model does a good job at representing the average behavior of the constituents to predict composite behavior.
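
    Simplified analytical micromechanics models of this kind are in the spirit of the classic rule of mixtures; a minimal sketch with illustrative constituent moduli (the values below are generic graphite-fiber and copper numbers, not the paper's data):

```python
def rule_of_mixtures(Ef, Em, Vf):
    """Classic micromechanics bounds for a unidirectional ply:
    longitudinal modulus E11 (Voigt, parallel) and a simple transverse
    estimate E22 (Reuss, series). Illustrative only."""
    Vm = 1.0 - Vf
    E11 = Vf * Ef + Vm * Em                   # fibers and matrix strain together
    E22 = 1.0 / (Vf / Ef + Vm / Em)           # fibers and matrix stress together
    return E11, E22

# Illustrative constituent moduli in GPa: high-modulus graphite fiber
# and a copper matrix, at 50% fiber volume fraction (assumed values).
E11, E22 = rule_of_mixtures(Ef=724.0, Em=117.0, Vf=0.5)
print(round(E11, 1), round(E22, 1))
```

The large gap between the longitudinal and transverse estimates is exactly where finite element models add value: local geometry effects that the closed-form averages cannot capture.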

  20. The influence of wind-tunnel walls on discrete frequency noise

    NASA Technical Reports Server (NTRS)

    Mosher, M.

    1984-01-01

    This paper describes an analytical model that can be used to examine the effects of wind-tunnel walls on discrete frequency noise. First, a complete physical model of an acoustic source in a wind tunnel is described, and a simplified version is then developed. This simplified model retains the important physical processes involved, yet it is more amenable to analysis. Second, the simplified physical model is formulated as a mathematical problem. An inhomogeneous partial differential equation with mixed boundary conditions is set up and then transformed into an integral equation. The integral equation has been solved with a panel program on a computer. Preliminary results from a simple model problem will be shown and compared with the approximate analytic solution.

  1. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture that involves a tight coupling between optimization and analysis is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  2. Using color histogram normalization for recovering chromatic illumination-changed images.

    PubMed

    Pei, S C; Tseng, C L; Wu, C C

    2001-11-01

    We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates when the illumination is changed. We propose a simplified affine model for application with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing some operations of simple linear algebra, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model without the use of skew rotation is better than the general affine model for such applications.
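    One generic way to estimate a translation-scaling-rotation ("similarity") transform between two sets of corresponding R-G-B samples is the Umeyama/Procrustes construction sketched below. This is an illustrative stand-in, not the paper's covariance-matrix/tensor algorithm, and it assumes point correspondences are known:

```python
import numpy as np

def estimate_similarity(src, dst):
    """Estimate s, R, t such that dst ≈ s * (src @ R.T) + t (Umeyama-style)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    sc, dc = src - mu_s, dst - mu_d
    cov = dc.T @ sc / len(src)                    # cross-covariance
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))            # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / (sc ** 2).sum() * len(src)
    t = mu_d - s * R @ mu_s
    return s, R, t
```

    The paper's simplified affine model restricts the recovered transform to exactly these three distortion types (translation, scaling, rotation), avoiding the noise-sensitive skew-rotation term of the general affine model.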

  3. A programmable computational image sensor for high-speed vision

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian

    2013-08-01

    In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array and a RISC core. The pixel-parallel PE array is responsible for transferring, storing and processing raw image data in a SIMD fashion with its own programming language. The RP array is a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. Together, the PE array and RP array can complete a large amount of computation in a few instruction cycles and therefore satisfy the requirements of low- and middle-level high-speed image processing. The RISC core controls the operation of the whole system and executes some high-level image processing algorithms. We utilize a simplified AHB bus as the system bus to connect the major components. A programming language and corresponding tool chain for this computational image sensor have also been developed.
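    A pixel-parallel PE array applies the same low-level operation at every pixel in lockstep. The flavor of such a SIMD step can be sketched in NumPy, where each shifted multiply-accumulate stands in for one instruction broadcast to all PEs (illustrative only; not the sensor's actual instruction set):

```python
import numpy as np

def conv3x3(img, kernel):
    """3x3 convolution over the image interior, expressed as nine
    SIMD-style shifted multiply-accumulate steps."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            # one "instruction": every output pixel accumulates the same
            # kernel tap applied to its shifted neighborhood, in parallel
            out += kernel[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
    return out
```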

  4. Ultralow percolation threshold of single walled carbon nanotube-epoxy composites synthesized via an ionic liquid dispersant/initiator

    NASA Astrophysics Data System (ADS)

    Watters, Arianna L.; Palmese, Giuseppe R.

    2014-09-01

    Uniform dispersion of single walled carbon nanotubes (SWNTs) in an epoxy was achieved by a streamlined mechano-chemical processing method. SWNT-epoxy composites were synthesized using a room temperature ionic liquid (IL) with an imidazolium cation and dicyanamide anion. The novel approach of using an ionic liquid that behaves as a dispersant for SWNTs and an initiator for epoxy polymerization greatly simplifies nanocomposite synthesis. The material was processed using simple and scalable three-roll milling. The SWNT dispersion of the resultant composite was evaluated by electron microscopy and electrical conductivity measurements in conjunction with percolation theory. Processing conditions were optimized to achieve the lowest possible percolation threshold, 4.29 × 10⁻⁵ volume fraction SWNTs. This percolation threshold is among the best reported in the literature, yet it was obtained using a streamlined method that greatly simplifies processing.
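    Percolation-theory analyses of this kind model conductivity above the threshold as σ = σ₀(φ − φc)ᵗ. A minimal sketch of extracting φc and the exponent t from conductivity-versus-loading data by log-log least squares with a grid search over φc (synthetic values, not the paper's measurements):

```python
import numpy as np

def fit_percolation(phi, sigma, phic_grid):
    """Fit sigma = sigma0 * (phi - phic)**t by log-log least squares,
    grid-searching the percolation threshold phic."""
    best = None
    for phic in phic_grid:
        mask = phi > phic                      # only points above threshold
        x = np.log(phi[mask] - phic)
        y = np.log(sigma[mask])
        A = np.vstack([x, np.ones_like(x)]).T  # y = t*x + log(sigma0)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = ((A @ coef - y) ** 2).sum()
        if best is None or rss < best[0]:
            best = (rss, phic, coef[0], np.exp(coef[1]))
    _, phic, t, sigma0 = best
    return phic, t, sigma0
```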

  5. Analytical model for investigation of interior noise characteristics in aircraft with multiple propellers including synchrophasing

    NASA Technical Reports Server (NTRS)

    Fuller, C. R.

    1986-01-01

    A simplified analytical model of transmission of noise into the interior of propeller-driven aircraft has been developed. The analysis includes directivity and relative phase effects of the propeller noise sources, and leads to a closed form solution for the coupled motion between the interior and exterior fields via the shell (fuselage) vibrational response. Various situations commonly encountered in considering sound transmission into aircraft fuselages are investigated analytically and the results obtained are compared to measurements in real aircraft. In general the model has proved successful in identifying basic mechanisms behind noise transmission phenomena.

  6. Procedures and equipment for staining large numbers of plant root samples for endomycorrhizal assay.

    PubMed

    Kormanik, P P; Bryan, W C; Schultz, R C

    1980-04-01

    A simplified method of clearing and staining large numbers of plant roots for vesicular-arbuscular (VA) mycorrhizal assay is presented. Equipment needed for handling multiple samples is described, and two formulations for the different chemical solutions are presented. Because one formulation contains phenol, its use should be limited to basic studies for which adequate laboratory exhaust hoods are available and great clarity of fungal structures is required. The second staining formulation, utilizing lactic acid instead of phenol, is less toxic, requires less elaborate laboratory facilities, and has proven to be completely satisfactory for VA assays.

  7. A Hospital Local Area Communication Network—The First Year's Experience

    PubMed Central

    Simborg, D. W.; Chadwick, M.; Whiting-O'Keefe, Q. E.; Tolchin, S. G.; Stewart, R. L.; Kahn, S. A.; Bergan, E. S.; Gafke, G. P.

    1982-01-01

    A local area communications network has been implemented at the University of California, San Francisco Hospital to integrate major components of the hospital's information system. This microprocessor-based network technology was developed by The Applied Physics Laboratory of the Johns Hopkins University. The first year's experience has demonstrated the basic feasibility of this technology in simplifying the integration of diverse hardware and software systems. Four minicomputer-based UCSF systems now use the network to synchronize key patient identification and registration information among the systems. Clinical uses of the network will begin during the second year of the project.

  8. Estimation of Dynamic Discrete Choice Models by Maximum Likelihood and the Simulated Method of Moments

    PubMed Central

    Eisenhauer, Philipp; Heckman, James J.; Mosso, Stefano

    2015-01-01

    We compare the performance of maximum likelihood (ML) and simulated method of moments (SMM) estimation for dynamic discrete choice models. We construct and estimate a simplified dynamic structural model of education that captures some basic features of educational choices in the United States in the 1980s and early 1990s. We use estimates from our model to simulate a synthetic dataset and assess the ability of ML and SMM to recover the model parameters on this sample. We investigate the performance of alternative tuning parameters for SMM. PMID:26494926
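    The basic SMM idea, choosing the parameters whose simulated moments best match the data moments, can be sketched for a toy normal model (illustrative only; not the authors' dynamic education model or their tuning-parameter study):

```python
import numpy as np

def smm_estimate(data, candidates, n_sim=5000, seed=0):
    """Pick the (mu, sigma) candidate whose simulated mean and variance
    best match the data moments (toy normal model)."""
    rng = np.random.default_rng(seed)
    m_data = np.array([data.mean(), data.var()])
    best = None
    for mu, s in candidates:
        sim = rng.normal(mu, s, n_sim)          # simulate the model
        m_sim = np.array([sim.mean(), sim.var()])
        loss = ((m_sim - m_data) ** 2).sum()    # moment-matching criterion
        if best is None or loss < best[0]:
            best = (loss, (mu, s))
    return best[1]
```

    A real SMM application would weight the moments (e.g., by an estimate of their covariance) and optimize over a continuous parameter space rather than a discrete candidate list.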

  9. International Conference on the Methods of Aerophysical Research 98 "ICMAR 98". Proceedings, Part 1

    DTIC Science & Technology

    1998-01-01

    pumping air through the device and air drying due to vapour condensation on cooled surfaces. Fig. 1 In this report, approximate estimates are presented...picture is used for the flow field between disks and for water vapor condensation on cooled moving surfaces. Shown in Fig. 1 is a simplified flow...frequency of disk rotation), thus breaking away from channel walls. Regarding the condensation process, a number of the usual simplifying assumptions are made

  10. Advanced wastewater treatment simplified through research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souther, R.H.

    A waste water treatment plant was built based on the results of a small-scale pilot plant study, conducted largely in a search for efficiency as well as economy. Results were that 98 percent of carbonaceous BOD (BODc) and nitrogenous BOD (BODn) was removed in a simplified, low-cost, single-stage advanced treatment process, surpassing even some of the most sophisticated complex advanced waste treatment methods. The single-stage process treats domestic waste alone or combined with very high amounts of textile, electroplating, chemical, food, and other processing industrial wastewater. The process removed 100 percent of the sulfides, above 98 percent of NH3-N, and over 90 percent of COD and phenols; chromium was converted from highly toxic hexavalent Cr(VI) to nearly nontoxic trivalent Cr(III). A pH up to 12 may be tolerated if no free hydroxyl (OH) ions are present. Equalization ponds, primary settling tanks, trickling filters, extra nitrogen removal tanks, carbon columns, and chemical treatment are not required. Color removal is excellent, with clear effluent suitable for recycling after chlorination to water supply lakes. The construction cost of the single-stage advanced treatment plant is surprisingly low, about 1/2 to 1/6 as much as most conventional, ineffective complex plants. This simplified, innovative process, developed in independent research at Guilford College, is considered by some a breakthrough in waste treatment efficiency and economy. (MU)

  11. Development of a global aerosol model using a two-dimensional sectional method: 1. Model design

    NASA Astrophysics Data System (ADS)

    Matsui, H.

    2017-08-01

    This study develops an aerosol module, the Aerosol Two-dimensional bin module for foRmation and Aging Simulation version 2 (ATRAS2), and implements the module into a global climate model, Community Atmosphere Model. The ATRAS2 module uses a two-dimensional (2-D) sectional representation with 12 size bins for particles from 1 nm to 10 μm in dry diameter and 8 black carbon (BC) mixing state bins. The module can explicitly calculate the enhancement of absorption and cloud condensation nuclei activity of BC-containing particles by aging processes. The ATRAS2 module is an extension of a 2-D sectional aerosol module ATRAS used in our previous studies within a framework of a regional three-dimensional model. Compared with ATRAS, the computational cost of the aerosol module is reduced by more than a factor of 10 by simplifying the treatment of aerosol processes and 2-D sectional representation, while maintaining good accuracy of aerosol parameters in the simulations. Aerosol processes are simplified for condensation of sulfate, ammonium, and nitrate, organic aerosol formation, coagulation, and new particle formation processes, and box model simulations show that these simplifications do not substantially change the predicted aerosol number and mass concentrations and their mixing states. The 2-D sectional representation is simplified (the number of advected species is reduced) primarily by the treatment of chemical compositions using two interactive bin representations. The simplifications do not change the accuracy of global aerosol simulations. In part 2, comparisons with measurements and the results focused on aerosol processes such as BC aging processes are shown.
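    The 2-D sectional representation described above amounts to a lookup from (dry diameter, BC mass fraction) to a bin on a 12 × 8 grid. The sketch below uses hypothetical bin edges consistent with the stated 1 nm-10 μm size range and 8 mixing-state bins; the model's actual grid and edge placement are not reproduced here:

```python
import numpy as np

# 12 log-spaced dry-diameter bins spanning 1 nm to 10 um (13 edges),
# crossed with 8 linear BC-mass-fraction bins (9 edges) -- placeholder edges.
size_edges = np.logspace(np.log10(1e-9), np.log10(1e-5), 13)
bc_edges = np.linspace(0.0, 1.0, 9)

def bin_index(diameter, bc_frac):
    """Map a particle to its (size bin, BC mixing-state bin), clamping
    values at or beyond the last edge into the top bin."""
    i = np.searchsorted(size_edges, diameter, side="right") - 1
    j = np.searchsorted(bc_edges, bc_frac, side="right") - 1
    return min(i, 11), min(j, 7)
```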

  12. 25 CFR 15.11 - What are the basic steps of the probate process?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What are the basic steps of the probate process? 15.11 Section 15.11 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR PROBATE PROBATE OF INDIAN... are the basic steps of the probate process? The basic steps of the probate process are: (a) We learn...

  13. Molecular dynamics of conformational substates for a simplified protein model

    NASA Astrophysics Data System (ADS)

    Grubmüller, Helmut; Tavan, Paul

    1994-09-01

    Extended molecular dynamics simulations covering a total of 0.232 μs have been carried out on a simplified protein model. Despite its simplified structure, the model exhibits properties similar to those of more realistic protein models. In particular, the model was found to undergo transitions between conformational substates on a time scale of several hundred picoseconds. The computed trajectories turned out to be sufficiently long to permit a statistical analysis of that conformational dynamics. To check whether effective descriptions neglecting memory effects can reproduce the observed conformational dynamics, two stochastic models were studied. A one-dimensional Langevin effective potential model, derived by elimination of subpicosecond dynamical processes, could not describe the observed conformational transition rates. In contrast, a simple Markov model describing the transitions between conformational substates, but neglecting dynamical processes within them, reproduced the observed distribution of first passage times. These findings suggest that protein dynamics generally does not exhibit memory effects on time scales above a few hundred picoseconds, but they confirm the existence of memory effects on a picosecond time scale.
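    A minimal numerical sketch of the memoryless (Markov) picture: with a constant per-step hop probability k (hypothetical value, not fitted to the paper's trajectories), first-passage times out of a substate are geometrically distributed with mean 1/k, the discrete analogue of an exponential distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 0.01                            # hypothetical hop probability per time step
fpt = rng.geometric(k, size=20000)  # first-passage times out of a substate
mean_fpt = fpt.mean()               # close to 1/k for a memoryless process
```

    A trajectory with memory would show systematic deviations from this single-exponential first-passage distribution, which is exactly what the Langevin-versus-Markov comparison above probes.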

  14. A Simplified Program Needs Assessment Process.

    ERIC Educational Resources Information Center

    Clark, Larry

    A rationale, background information, and a discussion of methodology are presented for a needs assessment process intended for pilot implementation at Western Piedmont Community College (WPCC). This process was designed to assess the local need for paraprofessional programs in the Human Services area, i.e., Early Childhood Associate, Mental Health…

  15. The Complexity of Developmental Predictions from Dual Process Models

    ERIC Educational Resources Information Center

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  16. Applying the Theory of Constraints to a Base Civil Engineering Operations Branch

    DTIC Science & Technology

    1991-09-01

    Front-matter figure list (residue): Fig. 1, Typical Work Order Processing; Fig. 2, Typical Job Order Processing; Fig. 3, Typical Simplified In-Service Work Plan for... Figure 1 labels include Customer Request, Service Unit, Planning, Production Control Center, Material Control, Scheduling, and CE Shops.

  17. Simulation of Simple Controlled Processes with Dead-Time.

    ERIC Educational Resources Information Center

    Watson, Keith R.; And Others

    1985-01-01

    The determination of closed-loop response of processes containing dead-time is typically not covered in undergraduate process control, possibly because the solution by Laplace transforms requires the use of Pade approximation for dead-time, which makes the procedure lengthy and tedious. A computer-aided method is described which simplifies the…

  18. Emotional valence of stimuli modulates false recognition: Using a modified version of the simplified conjoint recognition paradigm.

    PubMed

    Gong, Xianmin; Xiao, Hongrui; Wang, Dahua

    2016-11-01

    False recognition results from the interplay of multiple cognitive processes, including verbatim memory, gist memory, phantom recollection, and response bias. In the current study, we modified the simplified Conjoint Recognition (CR) paradigm to investigate the way in which the valence of emotional stimuli affects the cognitive processes and behavioral outcome of false recognition. In Study 1, we examined the applicability of the modification to the simplified CR paradigm and model. Twenty-six undergraduate students (13 females, aged 21.00 ± 2.30 years) learned and recognized both large and small categories of photo objects. The applicability of the paradigm and model was confirmed by a fair goodness-of-fit of the model to the observational data and by their competence in detecting the memory differences between the large- and small-category conditions. In Study 2, we recruited another sample of 29 undergraduate students (14 females, aged 22.60 ± 2.74 years) to learn and recognize categories of photo objects that were emotionally provocative. The results showed that negative valence increased false recognition, particularly the rate of false "remember" responses, by facilitating phantom recollection; positive valence did not significantly influence false recognition, though it enhanced gist processing. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Key issues in the design of NO{sub x} emission trading programs to reduce ground-level ozone. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, A.; Harrison, D.

    1994-07-01

    This report is the first product of a study being conducted by National Economic Research Associates for the Electric Power Research Institute to evaluate various market-based alternatives for managing emissions of nitrogen oxides (NO{sub x}) as part of strategies to achieve the ambient ozone standard. The report focuses on choices in the design of relatively broad, ambitious emission trading programs, rather than on more modest programs designed to generate offsets within a regulatory framework that continues to rely primarily on traditional emission standards and nontransferable permits. After a brief introductory chapter, Chapter 2 reviews both the conceptual underpinnings of emission trading and prior experience. This review suggests the need for clear initial allocations-generally based on emission caps-to simplify trading while assuring the achievement of emission-reduction goals. Chapter 3 lays out the basic choices required in establishing an emission trading program. For concreteness, the basic design is discussed in terms of trading among utilities and other large stationary sources of NO{sub x}, generally the most promising candidates for trading. Chapter 4 discusses various ways in which a basic trading program could be extended to other source categories and to volatile organic compounds (VOCs), the other major precursor of ozone. Chapter 5 analyzes various ways in which trading programs can be refined to focus control efforts on those times and at those locations where ozone problems are most severe. Although highly refined targeting programs are unlikely to be worth the effort, modest differentials can be implemented by making the number of allowances required for each ton of emissions vary with the time and location of emissions.
Chapter 6 reviews various alternatives for making the initial allocation of emission allowances among sources in the trading program, breaking the process into two components, an emission rate and an activity level.

  20. Simplifying the complexity surrounding ICU work processes--identifying the scope for information management in ICU settings.

    PubMed

    Munir, Samina K; Kay, Stephen

    2005-08-01

    A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care, and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified-complexity', and that these processes change with the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity, the complexity is rooted in the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that consider integrating all clinical information requirements is not only immense but also very plausible.

  1. Computational modeling of the pressurization process in a NASP vehicle propellant tank experimental simulation

    NASA Technical Reports Server (NTRS)

    Sasmal, G. P.; Hochstein, J. I.; Wendl, M. C.; Hardy, T. L.

    1991-01-01

    A multidimensional computational model of the pressurization process in a slush hydrogen propellant storage tank was developed and its accuracy evaluated by comparison to experimental data measured for a 5 ft diameter spherical tank. The fluid mechanic, thermodynamic, and heat transfer processes within the ullage are represented by a finite-volume model. The model was shown to be in reasonable agreement with the experimental data. A parameter study was undertaken to examine the dependence of the pressurization process on initial ullage temperature distribution and pressurant mass flow rate. It is shown that for a given heat flux rate at the ullage boundary, the pressurization process is nearly independent of initial temperature distribution. Significant differences were identified between the ullage temperature and velocity fields predicted for pressurization of slush and those predicted for pressurization of liquid hydrogen. A simplified model of the pressurization process was constructed in search of a dimensionless characterization of the pressurization process. It is shown that the relationship derived from this simplified model collapses all of the pressure history data generated during this study into a single curve.
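    For intuition only, a zero-dimensional (lumped) sketch of ullage pressurization under strong assumptions (ideal gas, fixed ullage volume, isothermal ullage); the paper's finite-volume model resolves the multidimensional temperature and velocity fields that this ignores:

```python
R_GH2 = 4124.0  # J/(kg K), specific gas constant of gaseous hydrogen

def ullage_pressure(P0, V, T, mdot, t):
    """P V = m R T with m(t) = m0 + mdot*t at constant V and T
    gives P(t) = P0 + mdot * R * T * t / V."""
    return P0 + mdot * R_GH2 * T * t / V
```

    The dimensionless collapse mentioned above generalizes this idea: scaling pressure rise by the pressurant and heat-flux parameters folds many such histories onto one curve.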

  2. Information content and sensitivity of the 3β + 2α lidar measurement system for aerosol microphysical retrievals

    NASA Astrophysics Data System (ADS)

    Burton, Sharon P.; Chemyakin, Eduard; Liu, Xu; Knobelspiesse, Kirk; Stamnes, Snorre; Sawamura, Patricia; Moore, Richard H.; Hostetler, Chris A.; Ferrare, Richard A.

    2016-11-01

    There is considerable interest in retrieving profiles of aerosol effective radius, total number concentration, and complex refractive index from lidar measurements of extinction and backscatter at several wavelengths. The combination of three backscatter channels plus two extinction channels (3β + 2α) is particularly important since it is believed to be the minimum configuration necessary for the retrieval of aerosol microphysical properties and because the technological readiness of lidar systems permits this configuration on both an airborne and a future spaceborne instrument. The second-generation NASA Langley airborne High Spectral Resolution Lidar (HSRL-2) has been making 3β + 2α measurements since 2012. The planned NASA Aerosol/Clouds/Ecosystems (ACE) satellite mission also recommends the 3β + 2α combination. Here we develop a deeper understanding of the information content and sensitivities of the 3β + 2α system in terms of aerosol microphysical parameters of interest. We use a retrieval-free methodology to determine the basic sensitivities of the measurements independent of retrieval assumptions and constraints. We calculate information content and uncertainty metrics using tools borrowed from the optimal estimation methodology based on Bayes' theorem, using a simplified forward model look-up table, with no explicit inversion. The forward model is simplified to represent spherical particles, monomodal log-normal size distributions, and wavelength-independent refractive indices. Since we only use the forward model with no retrieval, the given simplified aerosol scenario is applicable as a best case for all existing retrievals in the absence of additional constraints. Retrieval-dependent errors due to mismatch between retrieval assumptions and true atmospheric aerosols are not included in this sensitivity study, and neither are retrieval errors that may be introduced in the inversion process.
The choice of a simplified model adds clarity to the understanding of the uncertainties in such retrievals, since it allows for separately assessing the sensitivities and uncertainties of the measurements alone that cannot be corrected by any potential or theoretical improvements to retrieval methodology but must instead be addressed by adding information content. The sensitivity metrics allow for identifying (1) information content of the measurements vs. a priori information; (2) error bars on the retrieved parameters; and (3) potential sources of cross-talk or "compensating" errors wherein different retrieval parameters are not independently captured by the measurements. The results suggest that the 3β + 2α measurement system is underdetermined with respect to the full suite of microphysical parameters considered in this study and that additional information is required, in the form of additional coincident measurements (e.g., sun-photometer or polarimeter) or a priori retrieval constraints. A specific recommendation is given for addressing cross-talk between effective radius and total number concentration.
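    The optimal-estimation metrics referred to here (degrees of freedom for signal, Shannon information content, cross-talk via the averaging kernel) can be sketched generically from an assumed Jacobian K, measurement-noise covariance Se, and prior covariance Sa. This is the standard Bayesian construction, with hypothetical matrices rather than the paper's forward-model look-up table:

```python
import numpy as np

def info_metrics(K, Se, Sa):
    """Averaging kernel A, degrees of freedom for signal, and Shannon
    information content (in nats) for a linear Gaussian retrieval."""
    Sei = np.linalg.inv(Se)
    Sai = np.linalg.inv(Sa)
    G = np.linalg.inv(K.T @ Sei @ K + Sai) @ K.T @ Sei   # gain matrix
    A = G @ K                                            # averaging kernel
    dfs = np.trace(A)                                    # deg. of freedom for signal
    H = -0.5 * np.log(np.linalg.det(np.eye(len(A)) - A)) # Shannon information
    return A, dfs, H
```

    Off-diagonal structure in A is one way to quantify the cross-talk between retrieved parameters (e.g., effective radius versus number concentration) discussed above.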

  3. Practical modeling approaches for geological storage of carbon dioxide.

    PubMed

    Celia, Michael A; Nordbotten, Jan M

    2009-01-01

    The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, with the overall mathematical description of the complete system becoming very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to model geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.

  4. Simplifying the interaction between cognitive models and task environments with the JSON Network Interface.

    PubMed

    Hope, Ryan M; Schoelles, Michael J; Gray, Wayne D

    2014-12-01

    Process models of cognition, written in architectures such as ACT-R and EPIC, should be able to interact with the same software with which human subjects interact. By eliminating the need to simulate the experiment, this approach would simplify the modeler's effort, while ensuring that all steps required of the human are also required by the model. In practice, the difficulties of allowing one software system to interact with another present a significant barrier to any modeler who is not also skilled at this type of programming. The barrier increases if the programming language used by the modeling software differs from that used by the experimental software. The JSON Network Interface simplifies this problem for ACT-R modelers, and potentially, modelers using other systems.
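    The general pattern of exchanging structured messages between a cognitive model and a task environment can be sketched as newline-delimited JSON. This is illustrative only; it is not necessarily the JSON Network Interface's actual wire format or message schema:

```python
import json

def encode(msg: dict) -> bytes:
    # one JSON object per line ("newline-delimited JSON")
    return (json.dumps(msg) + "\n").encode("utf-8")

def decode_stream(buf: bytes) -> list:
    # split a received byte stream back into individual messages
    return [json.loads(line) for line in buf.splitlines() if line.strip()]
```

    Because JSON parsers exist for essentially every language, this style of interface lets a model written in one language (e.g., Lisp for ACT-R) drive experimental software written in another.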

  5. Simplified analysis and optimization of space base and space shuttle heat rejection systems

    NASA Technical Reports Server (NTRS)

    Wulff, W.

    1972-01-01

    A simplified radiator system analysis was performed to predict steady state radiator system performance. The system performance was found to be describable in terms of five non-dimensional system parameters. The governing differential equations are integrated numerically to yield the enthalpy rejection for the coolant fluid. The simplified analysis was extended to produce the derivatives of the coolant exit temperature with respect to the governing system parameters. A procedure was developed to find the optimum set of system parameters which yields the lowest possible coolant exit temperature for either a given projected area or a given total mass. The process can be inverted to yield either the minimum area or the minimum mass, together with the optimum geometry, for a specified heat rejection rate.

  6. 7 CFR 4280.102 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Improvements Program § 4280.102 General. (a) Sections 4280.103 through 4280.106 discuss definitions, exception... evaluation process, and post-grant Federal requirements for both the simplified and full application processes. Sections 4280.115 through 4280.117 address project planning, development, and completion as...

  7. On Anthologies.

    ERIC Educational Resources Information Center

    Jones, Nick

    1983-01-01

    Discusses the form and function of anthologies by distinguishing three "orders" of anthology, together with a fourth, or preliminary category, within a broadly simplified model of the anthological process. (HOD)

  8. Managing the construction bidding process : a move to simpler construction plan sets

    DOT National Transportation Integrated Search

    2001-01-31

    This project was conducted to determine whether construction plan sets could be significantly simplified to speed the process of moving projects to construction. The work steps included a literature review, a telephone survey of highway agencies in s...

  9. TOPICAL REVIEW: Physics and phenomena in pulsed magnetrons: an overview

    NASA Astrophysics Data System (ADS)

    Bradley, J. W.; Welzel, T.

    2009-05-01

    This paper reviews the contribution made to the observation and understanding of the basic physical processes occurring in an important type of magnetized low-pressure plasma discharge, the pulsed magnetron. In industry, these plasma sources are typically operated in reactive mode, where a cathode is sputtered in the presence of both chemically reactive and noble gases, typically with the power modulated in the mid-frequency (5-350 kHz) range. In this review, however, we concentrate mostly on physics-based studies carried out on magnetron systems operated in argon. This simplifies the physical-chemical processes occurring and makes interpretation of the observations somewhat easier. Since their first recorded use in 1993, there have been more than 300 peer-reviewed publications concerned with pulsed magnetrons, dealing wholly or in part with fundamental observations and basic studies. The fundamentals of these plasmas and the relationship between the plasma parameters and thin-film quality regularly have whole sessions at international conferences devoted to them; however, since many different types of magnetron geometries have been used worldwide with different operating parameters, the important results are often difficult to tease out. For example, we find that detailed observations of plasma parameter (particle density and temperature) evolution are at best difficult to compare from experiment to experiment and at worst contradictory. We review in turn five major areas of study addressed in the literature and try to draw out the major results: fast electron generation, bulk plasma heating, short- and long-term plasma parameter rise and decay rates, plasma potential modulation, and transient phenomena. The influence of these phenomena on the ion energy and ion energy flux at the substrate is discussed.
This review, although not exhaustive, will serve as a useful guide for more in-depth investigations using the referenced literature and also hopefully as an inspiration for future studies.

  10. Study on diesel vertical migration characteristics and mechanism in water-bearing sand stratum using an automated resistivity monitoring system.

    PubMed

    Pan, Yuying; Jia, Yonggang; Wang, Yuhua; Xia, Xin; Guo, Lei

    2018-02-01

    Oil spills frequently occur on both land and sea. Petroleum in the mobile phase causes serious pollution in sediments and can form a secondary pollution source. Therefore, it is very important to study the migration of petroleum in sediments, ideally using a rapid and simplified approach. The release of diesel was simulated using fine beach sand to construct a model aquifer, and dynamic monitoring was carried out using an automated monitoring system including a resistivity probe originally developed by our research group. The mobile phase migration fronts were determined accurately using the wavelet analysis method combined with the resistivity curve method. Then, a relationship between resistivity and the joint oil-water content was established. The main conclusions were as follows. The seepage velocity of the highly mobile diesel was fastest at the initial stage of infiltration, followed by a period when gravity seepage was dominant, and finally a redistribution period at the later stage, which was mainly an oil-water displacement process. The resistivity trends for diesel infiltration in different water-saturated soil layers varied with depth. The resistivity in the vadose zone fluctuated significantly, increasing initially and later decreasing. The resistivity change in the capillary zone was relatively small and constant in the initial stage; then, it increased and subsequently decreased. The resistivity in the saturated zone was basically unchanged with depth, and the value became slightly larger than the background value over time. Overall, for a large volume of mobile-phase diesel leakage, the arriving migration fronts can be detected by wavelet analysis combined with resistivity curves. The thickness of the oil slick in the capillary zone can be estimated from resistivity changes. The relationships between resistivity and both the moisture content and the joint oil-water saturation agree with linear models.
The research results provide basic data and a new data processing method for monitoring of contaminated sites following major oil spills using the resistivity method.
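The front-detection idea, locating a sharp resistivity change with a wavelet decomposition, can be sketched with the simplest wavelet of all. This toy example (entirely synthetic data; the study's actual wavelet choice and probe geometry are not specified here) uses one level of Haar detail coefficients:

```python
def haar_detail(signal):
    """One-level Haar wavelet detail coefficients: large magnitudes flag sharp jumps."""
    return [(signal[2 * i + 1] - signal[2 * i]) / 2 for i in range(len(signal) // 2)]

def front_index(resistivity):
    """Return the index (in the original profile) of the pair with the sharpest change."""
    d = haar_detail(resistivity)
    k = max(range(len(d)), key=lambda i: abs(d[i]))
    return 2 * k

# Synthetic resistivity-vs-depth profile with a step at the diesel front:
profile = [10, 10, 10, 40, 42, 41, 40, 40]
idx = front_index(profile)
```

In practice the profile would come from the automated resistivity probe, and multi-level coefficients would separate the front from measurement noise.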

  11. Intelligent Robotic Systems Study (IRSS), phase 3

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This phase of the Intelligent Robotic Systems Study (IRSS) examines some basic dynamics and control issues for a space manipulator attached to its worksite through a compliant base. One example of this scenario is depicted, which is a simplified, planar representation of the Flight Telerobotic Servicer (FTS) Development Test Flight 2 (DTF-2) experiment. The system consists of four major components: (1) dual FTS arms to perform dextrous tasks; (2) the main body to house power and electronics; (3) an Attachment Stabilization and Positioning Subsystem (ASPS) to provide coarse positioning and stabilization of the arms; and (4) the Worksite Attachment Mechanism (WAM), which anchors the system to its worksite, such as a Space Station truss node or Shuttle bay platform. The analysis is limited to the DTF-2 scenario. The goal is to understand the basic interaction dynamics between the arm, the positioner and/or stabilizer, and the worksite. The dynamics and controls simulation model is described. Analysis and simulation results are presented.

  12. Physical layer simulation study for the coexistence of WLAN standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howlader, M. K.; Keiger, C.; Ewing, P. D.

    This paper presents the results of a study on the performance of wireless local area network (WLAN) devices in the presence of interference from other wireless devices. To understand the coexistence of these wireless protocols, simplified physical-layer-system models were developed for the Bluetooth, Wireless Fidelity (WiFi), and Zigbee devices, all of which operate within the 2.4-GHz frequency band. The performances of these protocols were evaluated using Monte-Carlo simulations under various interference and channel conditions. The channel models considered were basic additive white Gaussian noise (AWGN), Rayleigh fading, and site-specific fading. The study also incorporated the basic modulation schemes, multiple access techniques, and channel allocations of the three protocols. This research is helping the U.S. Nuclear Regulatory Commission (NRC) understand the coexistence issues associated with deploying wireless devices and could prove useful in the development of a technical basis for guidance to address safety-related issues with the implementation of wireless systems in nuclear facilities. (authors)
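The Monte-Carlo approach described can be illustrated in miniature. The sketch below simulates an uncoded BPSK link over the basic AWGN channel, a deliberately simplified stand-in for the protocol-specific Bluetooth/WiFi/Zigbee physical-layer models used in the study:

```python
import math
import random

def ber_bpsk_awgn(eb_n0_db, n_bits=200_000, seed=1):
    """Monte-Carlo bit-error rate for BPSK over AWGN.

    Each trial sends the symbol +1 with unit energy, adds Gaussian noise of
    variance N0/2, and decides by sign. Illustrative toy model only.
    """
    random.seed(seed)
    eb_n0 = 10 ** (eb_n0_db / 10)
    sigma = math.sqrt(1 / (2 * eb_n0))          # noise standard deviation
    errors = sum(
        1 for _ in range(n_bits)
        if (1.0 + random.gauss(0, sigma)) < 0   # decision error
    )
    return errors / n_bits

ber = ber_bpsk_awgn(4.0)
```

At 4 dB Eb/N0 the theoretical BER is Q(sqrt(2*Eb/N0)), roughly 1.3e-2, which the simulation approaches as n_bits grows; the coexistence study layers interference and fading models on top of this basic loop.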

  13. DNA-Based Nanobiosensors as an Emerging Platform for Detection of Disease

    PubMed Central

    Abu-Salah, Khalid M.; Zourob, Mohammed M.; Mouffouk, Fouzi; Alrokayan, Salman A.; Alaamery, Manal A.; Ansari, Anees A.

    2015-01-01

    Detection of disease at an early stage is one of the biggest challenges in medicine. Different disciplines of science are working together in this regard. The goal of nanodiagnostics is to provide more accurate tools for earlier diagnosis, to reduce cost and to simplify healthcare delivery of effective and personalized medicine, especially with regard to chronic diseases (e.g., diabetes and cardiovascular diseases) that have high healthcare costs. Up-to-date results suggest that DNA-based nanobiosensors could be used effectively to provide simple, fast, cost-effective, sensitive and specific detection of some genetic, cancer, and infectious diseases. In addition, they could potentially be used as a platform to detect immunodeficiency, and neurological and other diseases. This review examines different types of DNA-based nanobiosensors, the basic principles upon which they are based and their advantages and potential in diagnosis of acute and chronic diseases. We discuss recent trends and applications of new strategies for DNA-based nanobiosensors, and emphasize the challenges in translating basic research to the clinical laboratory. PMID:26102488

  14. 3D visual mechanism by neural networkings

    NASA Astrophysics Data System (ADS)

    Sugiyama, Shigeki

    2007-04-01

    Some computer vision systems are available on the market, but they remain far from everyday use in applications such as security surveillance or recognition of a target object's behaviour. Such environmental sensing requires a detailed description of an object, such as "the distance to an object", "an object's detailed figure" and "its edging", for which the mechanisms of present recognition systems cannot provide a clear picture. This paper therefore studies how a pair of human eyes recognizes distance, object edging, and the object itself, in order to extract the basic essences of vision mechanisms. These basic mechanisms of object recognition are then simplified and extended logically for application to a computer vision system. Some results of these studies are introduced in this paper.

  15. Moist convection: a key to tropical wave-moisture interaction in Indian monsoon intraseasonal oscillation

    NASA Astrophysics Data System (ADS)

    Wu, Longtao; Wong, Sun; Wang, Tao; Huffman, George J.

    2018-01-01

    Simulation of moist convective processes is critical for accurately representing the interaction among tropical wave activities, atmospheric water vapor transport, and clouds associated with the Indian monsoon Intraseasonal Oscillation (ISO). In this study, we apply the Weather Research and Forecasting (WRF) model to simulate Indian monsoon ISO with three different treatments of moist convective processes: (1) the Betts-Miller-Janjić (BMJ) adjustment cumulus scheme without explicit simulation of moist convective processes; (2) the New Simplified Arakawa-Schubert (NSAS) mass-flux scheme with simplified moist convective processes; and (3) explicit simulation of moist convective processes at convection permitting scale (Nest). Results show that the BMJ experiment is unable to properly reproduce the equatorial Rossby wave activities and the corresponding phase relationship between moisture advection and dynamical convergence during the ISO. These features associated with the ISO are approximately captured in the NSAS experiment. The simulation with resolved moist convective processes significantly improves the representation of the ISO evolution, and has good agreements with the observations. This study features the first attempt to investigate the Indian monsoon at convection permitting scale.

  16. Influence of mass transfer resistance on overall nitrate removal rate in upflow sludge bed reactors.

    PubMed

    Ting, Wen-Huei; Huang, Ju-Sheng

    2006-09-01

    A kinetic model with intrinsic reaction kinetics and a simplified model with apparent reaction kinetics for denitrification in upflow sludge bed (USB) reactors were proposed. USB-reactor performance data with and without sludge wasting were also obtained for model verification. An independent batch study showed that the apparent kinetic constant k' did not differ from the intrinsic k, but the apparent Ks' was significantly larger than the intrinsic Ks, suggesting that the intra-granule mass transfer resistance can be modeled by changes in Ks. Calculations of the overall effectiveness factor, Thiele modulus, and Biot number, combined with parametric sensitivity analysis, showed that the influence of internal mass transfer resistance on the overall nitrate removal rate in USB reactors is more significant than that of the external mass transfer resistance. The simulated residual nitrate concentrations using the simplified model were in good agreement with the experimental data; the simulated results using the simplified model were also close to those using the kinetic model. Accordingly, the simplified model adequately described the overall nitrate removal rate and can be used for process design.
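The diagnostic quantities mentioned, the Thiele modulus and the internal effectiveness factor, can be computed directly for a spherical granule with first-order kinetics. The numbers below are hypothetical, purely to show the calculation, not the paper's fitted constants:

```python
import math

def thiele_sphere(radius, k, d_eff):
    """Thiele modulus phi = (R/3) * sqrt(k / De) for a first-order reaction in a sphere."""
    return (radius / 3) * math.sqrt(k / d_eff)

def effectiveness(phi):
    """Internal effectiveness factor for a sphere, first-order kinetics:
    eta = (1/phi) * (1/tanh(3*phi) - 1/(3*phi)).
    """
    return (1 / phi) * (1 / math.tanh(3 * phi) - 1 / (3 * phi))

# Hypothetical granule: 1 mm radius, rate constant 0.05 1/s, De = 1e-9 m^2/s
phi = thiele_sphere(1e-3, 0.05, 1e-9)
eta = effectiveness(phi)
```

Small phi gives eta near 1 (kinetics-limited), while large phi gives eta well below 1, which is the internal mass-transfer limitation the paper folds into the apparent Ks'.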

  17. The lean service machine.

    PubMed

    Swank, Cynthia Karen

    2003-10-01

    Like many U.S. service companies at the end of the 1990s, Jefferson Pilot Financial, a life insurance and annuities firm, was looking for new ways to grow. Its top managers recognized that JPF needed to differentiate itself in the eyes of its customers, the independent life-insurance advisers who sell and service policies. To establish itself as these advisers' preferred partner, it set out to reduce the turnaround time on policy applications, simplify the submission process, and reduce errors. JPF's managers looked to the "lean production" practices that U.S. manufacturers adopted in response to competition from Japanese companies. Lean production is built around the concept of continuous-flow processing--a departure from traditional production systems, in which large batches are processed at each step. JPF appointed a "lean team" to reengineer its New Business unit's operations, beginning with the creation of a "model cell"--a fully functioning microcosm of JPF's entire process. This approach allowed managers to experiment and smooth out the kinks while working toward an optimal design. The team applied lean-manufacturing practices, including placing linked processes near one another, balancing employees' workloads, posting performance results, and measuring performance and productivity from the customer's perspective. Customer-focused metrics helped erode the employees' "My work is all that matters" mind-set. The results were so impressive that JPF is rolling out similar systems across many of its operations. To convince employees of the value of lean production, the lean team introduced a simulation in which teams compete to build the best paper airplane based on invented customer specifications. This game drives home lean production's basic principles, establishing a foundation for deep and far-reaching changes in the production system.

  18. Workbook, Basic Mathematics and Wastewater Processing Calculations.

    ERIC Educational Resources Information Center

    New York State Dept. of Environmental Conservation, Albany.

    This workbook serves as a self-learning guide to basic mathematics and treatment plant calculations and also as a reference and source book for the mathematics of sewage treatment and processing. In addition to basic mathematics, the workbook discusses processing and process control, laboratory calculations and efficiency calculations necessary in…

  19. A novel method of the image processing on irregular triangular meshes

    NASA Astrophysics Data System (ADS)

    Vishnyakov, Sergey; Pekhterev, Vitaliy; Sokolova, Elizaveta

    2018-04-01

    The paper describes a novel method of image processing based on irregular triangular meshes. The triangular mesh is adaptive to the image content, and least-mean-square linear approximation is proposed for the basic interpolation within each triangle. Triangular numbers are proposed to simplify the use of local (barycentric) coordinates in further analysis: a triangular element of the initial irregular mesh is represented through a set of four equilateral triangles. This allows fast and simple pixel indexing in local coordinates, e.g. "for" or "while" loops for access to the pixels. Moreover, the proposed representation allows a discrete cosine transform of the simple "rectangular" symmetric form without additional pixel reordering (as used in shape-adaptive DCT forms). Furthermore, this approach leads to a simple form of the wavelet transform on a triangular mesh. The results of applying the method are presented. Its advantage is shown to be the combination of the flexibility of image-adaptive irregular meshes with simple pixel indexing in local triangular coordinates and the use of common forms of discrete transforms on triangular meshes. The method is proposed for image compression, pattern recognition, image quality improvement, and image search and indexing. It may also be used as a part of video coding (intra-frame or inter-frame coding, motion detection).
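The triangular-number indexing mentioned can be made concrete. Assuming row r of a triangular raster holds r+1 pixels (a simplifying assumption for illustration), the r-th triangular number gives a linear pixel index and its inverse, so pixels can be visited with plain loops:

```python
import math

def tri_index(row, col):
    """Linear index of pixel (row, col): T(row) + col, with T(r) = r*(r+1)//2."""
    return row * (row + 1) // 2 + col

def tri_coords(index):
    """Inverse mapping: recover (row, col) from the linear index via isqrt."""
    row = (math.isqrt(8 * index + 1) - 1) // 2
    return row, index - row * (row + 1) // 2

# Enumerate the first three rows of the triangle with a plain loop:
pairs = [tri_coords(i) for i in range(6)]
```

Both directions are O(1), which is what makes simple "for"/"while" pixel access possible without storing an explicit coordinate table.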

  20. MM wave SAR sensor design: Concept for an airborne low level reconnaissance system

    NASA Astrophysics Data System (ADS)

    Boesswetter, C.

    1986-07-01

    The basic system design considerations for a high resolution SAR system operating at 35 GHz or 94 GHz are given. First it is shown that only the focussed SAR concept in the side-looking configuration matches the requirements and constraints. After definition of the illumination geometry and airborne modes, the fundamental SAR parameters in the range and azimuth directions are derived. A review of the performance parameters of some critical mm wave components (coherent pulsed transmitters, front ends, antennas) establishes the basis for further analysis. The power and contrast budget in the processed SAR image shows the feasibility of a 35/94 GHz SAR sensor design. The discussion of the resulting system parameters points out that this unusual system design implies both benefits and new risk areas. One of the benefits, besides the compactness of the sensor hardware, turns out to be the short synthetic aperture length, which simplifies the design of the digital SAR processor, preferably operating in real time. A possible architecture based on current state-of-the-art correlator hardware is shown. One of the potential risk areas in achieving high resolution SAR imagery in the mm wave frequency band is motion compensation. However, it is shown that the short range and short synthetic aperture lengths ease the problem, so that correction of motion-induced phase errors, and thus focussed synthetic aperture processing, should be possible.
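The link between wavelength and aperture length can be shown with the standard focused-SAR relation delta_az = lambda*R/(2*L), rearranged for L. The geometry values below are illustrative, not taken from the paper:

```python
def synthetic_aperture_length(freq_hz, slant_range_m, azimuth_res_m):
    """Aperture length needed for a focused SAR: L = lambda * R / (2 * delta_az)."""
    c = 3e8                      # speed of light, m/s
    lam = c / freq_hz            # wavelength
    return lam * slant_range_m / (2 * azimuth_res_m)

# Illustrative geometry: 3 km slant range, 1 m azimuth resolution
l_35 = synthetic_aperture_length(35e9, 3000, 1.0)
l_94 = synthetic_aperture_length(94e9, 3000, 1.0)
```

At the same range and resolution, the 94 GHz aperture is roughly 35/94 of the 35 GHz one; this shortening of the synthetic aperture is the processing and motion-compensation benefit the paper notes for mm-wave SAR.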

  1. Application of the top specified boundary layer (TSBL) approximation to initial characterization of an inland aquifer mineralization: 2. Seepage of saltwater through semi-confining layers

    USGS Publications Warehouse

    Rubin, H.; Buddemeier, R.W.

    1998-01-01

    This paper presents a generalized basic study that addresses practical needs for an understanding of the major mechanisms involved in the mineralization of groundwater in the Great Bend Prairie aquifer in south- central Kansas. This Quaternary alluvial aquifer and associated surface waters are subject to contamination by saltwater, which in some areas seeps from the deeper Permian bedrock formation into the overlying freshwater aquifer through semiconfining layers. A simplified conceptual model is adopted. It incorporates the freshwater aquifer whose bottom is comprised of a semiconfining layer through which a hydrologically minor but geochemically important saline water discharge seeps into the aquifer. A hierarchy of approximate approaches is considered to analyze the mineralization processes taking place in the aquifer. The recently developed top specified boundary layer (TSBL) approach is very convenient to use for the initial characterization of these processes, and is further adapted to characterization of head-driven seepage through semi-confining layers. TSBL calculations indicate that the seeping saline water may create two distinct new zones in the aquifer: (1) a completely saline zone (CSZ) adjacent to the semiconfining bottom of the aquifer, and (2) a transition zone (TZ) which develops between the CSZ and the freshwater zone. Some possible scenarios associated with the various mineralization patterns are analyzed and discussed.

  2. Development of Vehicle Model Test for Road Loading Analysis of Sedan Model

    NASA Astrophysics Data System (ADS)

    Mohd Nor, M. K.; Noordin, A.; Ruzali, M. F. S.; Hussen, M. H.

    2016-11-01

    The Simple Structural Surfaces (SSS) method is offered as a means of organizing the process of rationalizing the load paths of the basic vehicle body structure. This simplified approach is highly beneficial in the design development of a modern passenger car structure, especially during the conceptual stage. In Malaysia, however, no physical SSS model has been available to provide insight into the function of each major subassembly in the whole vehicle structure. Motivated by this, a physical SSS model of a sedan, with corresponding model vehicle tests in bending and torsion, is proposed in this work. The proposed approach is easier to understand than the Finite Element Method (FEM). The results show that the proposed vehicle model test can demonstrate that satisfactory load paths give sufficient structural stiffness within the vehicle structure. It is clearly observed that the global bending stiffness reduces significantly when more panels are removed from a complete SSS model. The parcel shelf is identified as an important subassembly in sustaining bending load. The results also match the theoretical hypothesis: the stiffness of the structure in an open-section condition is weak under torsion load compared with bending load. The proposed approach can potentially be integrated with FEM to speed up the automotive vehicle design process.

  3. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    NASA Astrophysics Data System (ADS)

    Faqih, A.

    2017-03-01

    Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to its coarse resolution, the data have to be downscaled and bias corrected in order to obtain scenario data with a better spatial resolution that matches the characteristics of the observed data. Generating such downscaled data is difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called “Statistical Bias Correction for Climate Scenarios (SiBiaS)”. The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to perform statistical bias correction relative to reference observational data. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
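As one concrete example of statistical bias correction, the sketch below applies simple variance scaling, mapping model output onto the observed mean and standard deviation estimated over a common baseline period. This is one standard method among several; the paper does not specify which algorithms SiBiaS implements:

```python
def bias_correct(model_series, obs_mean, obs_std, ref_mean, ref_std):
    """Variance-scaling bias correction.

    ref_mean/ref_std describe the model over the baseline period;
    obs_mean/obs_std describe the observations over the same period.
    Each model value is shifted and rescaled accordingly.
    """
    return [obs_mean + (x - ref_mean) * (obs_std / ref_std) for x in model_series]

# Hypothetical GCM temperatures (deg C) with a cold bias and inflated variance:
raw = [24.0, 26.0, 28.0]
corrected = bias_correct(raw, obs_mean=27.0, obs_std=1.0, ref_mean=26.0, ref_std=2.0)
```

Quantile mapping generalizes this idea by matching the full distribution rather than just the first two moments, at the cost of needing the empirical CDFs.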

  4. Numerical simulation and analysis of impact of non-orographic gravity waves drag of middle atmosphere in framework of a general circulation model

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Wang, S.

    2017-12-01

    Gravity wave drag (GWD) is among the drivers of meridional overturning in the middle atmosphere, also known as the Brewer-Dobson Circulation, and of the quasi-biennial oscillation (QBO). The small spatial scales and complications due to wave breaking require their effects to be parameterised. GWD parameterizations are usually divided into two parts, orographic and non-orographic. The basic dynamical and physical processes of the middle atmosphere and the mechanism of the interactions between the troposphere and the middle atmosphere were studied in the framework of a general circulation model. The tropospheric model was expanded to a global model including the middle atmosphere, capable of describing the basic processes in the middle atmosphere and the troposphere-middle atmosphere interactions. Currently, it is too costly to include full non-hydrostatic and rotational wave dynamics in an operational parameterization; hydrostatic, non-rotational wave dynamics allow an efficient implementation that is suitably fast for operational use. The simplified parameterization of non-orographic GWD follows the WM96 scheme, in which a framework is developed using conservative propagation of gravity waves, critical-level filtering, and non-linear dissipation. To simulate and analyse the influence of non-orographic GWD on the stratospheric wind and temperature fields, experiments on the Stratospheric Sudden Warming (SSW) event of January 2013 were carried out, and objective weather forecast verifications over the two-month period were compared in detail. Verification of the monthly mean forecast anomaly correlation (ACC) and root-mean-square (RMS) errors shows a consistently positive impact of non-orographic GWD on forecast skill scores for days three to eight, both in the stratosphere and the troposphere, and a visible positive impact on prediction of the stratospheric wind and temperature fields. Numerical simulation during the SSW event demonstrates that the influence on the temperature of the middle stratosphere is mainly positive, and that there were larger departures in both the wind and temperature fields when the non-orographic GWD was included during the warming process.

  5. Process for making carbon foam

    DOEpatents

    Klett, James W.

    2000-01-01

    The process obviates the need for conventional oxidative stabilization. The process employs mesophase or isotropic pitch and a simplified process using a single mold. The foam has a relatively uniform distribution of pore sizes and a highly aligned graphic structure in the struts. The foam material can be made into a composite which is useful in high temperature sandwich panels for both thermal and structural applications.

  6. Rotor design for maneuver performance

    NASA Technical Reports Server (NTRS)

    Berry, John D.; Schrage, Daniel

    1986-01-01

    A method of determining the sensitivity of helicopter maneuver performance to changes in basic rotor design parameters is developed. Maneuver performance is measured by the time required, based on a simplified rotor/helicopter performance model, to perform a series of specified maneuvers. This method identifies parameter values which result in minimum time quickly because of the inherent simplicity of the rotor performance model used. For the specific case studied, this method predicts that the minimum time required is obtained with a low disk loading and a relatively high rotor solidity. The method was developed as part of the winning design effort for the American Helicopter Society student design competition for 1984/1985.

  7. Targeted Therapy: Attacking Cancer with Molecular and Immunological Targeted Agents.

    PubMed

    Wilkes, Gail M

    2018-01-01

    Today, personalized cancer therapy with targeted agents has taken center stage, and offers individualized treatment to many. As the mysteries of the genes in a cell's DNA and their specific proteins are defined, advances in the understanding of cancer gene mutations and how cancer evades the immune system have been made. This article provides a basic and simplified understanding of the available (Food and Drug Administration- approved) molecularly and immunologically targeted agents in the USA. Other agents may be available in Asia, and throughout the USA and the world, many more agents are being studied. Nursing implications for drug classes are reviewed.

  8. Targeted Therapy: Attacking Cancer with Molecular and Immunological Targeted Agents

    PubMed Central

    Wilkes, Gail M.

    2018-01-01

    Today, personalized cancer therapy with targeted agents has taken center stage, and offers individualized treatment to many. As the mysteries of the genes in a cell's DNA and their specific proteins are defined, advances in the understanding of cancer gene mutations and how cancer evades the immune system have been made. This article provides a basic and simplified understanding of the available (Food and Drug Administration- approved) molecularly and immunologically targeted agents in the USA. Other agents may be available in Asia, and throughout the USA and the world, many more agents are being studied. Nursing implications for drug classes are reviewed. PMID:29607374

  9. Actin-based propulsion of a microswimmer.

    PubMed

    Leshansky, A M

    2006-07-01

    A simple hydrodynamic model of actin-based propulsion of microparticles in dilute cell-free cytoplasmic extracts is presented. Under the basic assumption that actin polymerization at the particle surface acts as a force dipole, pushing apart the load and the free (nonanchored) actin tail, the propulsive velocity of the microparticle is determined as a function of the tail length, porosity, and particle shape. The anticipated velocities of the cargo displacement and the rearward motion of the tail are in good agreement with recently reported results of biomimetic experiments. A more detailed analysis of the particle-tail hydrodynamic interaction is presented and compared to the prediction of the simplified model.

  10. Improved NASTRAN plotting

    NASA Technical Reports Server (NTRS)

    Chan, Gordon C.

    1991-01-01

    The new 1991 COSMIC/NASTRAN version, compatible with the older versions, tries to remove some old constraints and make it easier to extract information from the plot file. It also includes some useful improvements and new enhancements. New features available in the 1991 version are described. They include a new PLT1 tape with simplified ASCII plot commands and short records, combined hidden and shrunk plots, an x-y-z coordinate system on all structural plots, element offset plots, improved character size control, improved FIND and NOFIND logic, a new NASPLOT post-processor to perform screen plotting or generate PostScript files, and a BASIC/NASTPLOT program for PC.

  11. Navigation in large information spaces represented as hypertext: A review of the literature

    NASA Technical Reports Server (NTRS)

    Brown, Marcus

    1990-01-01

    The problem addressed is the failure of information-space navigation tools when the space grows too large. The basic goal is to provide the power of the hypertext interface in such a way as to be most easily comprehensible to the user. It was determined that the optimal structure for information is an overlapping, simplified hierarchy. The hierarchical structure should be made obvious to the user, and many of the non-hierarchical links in the information space should either be eliminated or de-emphasized so that the novice user is not confused by them. Only one of the hierarchies should be very simple.

  12. Connections for solid oxide fuel cells

    DOEpatents

    Collie, Jeffrey C.

    1999-01-01

    A connection for fuel cell assemblies is disclosed. The connection includes compliant members connected to individual fuel cells and a rigid member connected to the compliant members. Adjacent bundles or modules of fuel cells are connected together by mechanically joining their rigid members. The compliant/rigid connection permits construction of generator fuel cell stacks from basic modular groups of cells of any desired size. The connections can be made prior to installation of the fuel cells in a generator, thereby eliminating the need for in-situ completion of the connections. In addition to allowing pre-fabrication, the compliant/rigid connections also simplify removal and replacement of sections of a generator fuel cell stack.

  13. A study of an alignment-less lithography method as an educational resource

    NASA Astrophysics Data System (ADS)

    Kai, Kazuho; Shiota, Koki; Nagaoka, Shiro; Mahmood, Mohamad Rusop Bin Haji; Kawai, Akira

    2016-07-01

    A simplification of the lithography process was studied. The proposed method, named "alignment-less lithography," omits the photomask alignment step of photolithography: the photomask and substrate are aligned mechanically using a simple jig in which countersinks are formed. Photomasks made of glass and photomasks made of transparent plastic (OHP) sheets were prepared for the process. As a result, repetitive accuracies of approximately 5 µm for the glass mask and 20 µm for the OHP mask were obtained, confirming that the alignment-less lithography method is viable. Using the method with the OHP mask, the possibility of application to an educational program, such as a heuristic for solving problems, was suggested. The nMOS FET fabrication process was successfully demonstrated using this method, confirming its feasibility. A totally simplified device fabrication process is expected to be achievable when combined with other simplifications, such as simplified impurity diffusion processes using PSG and BSG thin films, prepared from sol-gel materials under a normal air environment, as diffusion sources.

  14. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    PubMed

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool's impact on the task of protein-protein interaction (PPI) extraction, and it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
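
    As a toy illustration of the "shot-gun" idea, the sketch below generates several simpler variants of a coordinated sentence by splitting on conjunctions. This is not BioSimplify's actual algorithm; the splitting rule and example sentence are invented for illustration.

```python
# Toy "shot-gun" sentence simplifier: emit the original sentence plus several
# simpler variants obtained by splitting coordinated clauses. Illustrative
# sketch only; not the algorithm used by BioSimplify.
import re

def shotgun_variants(sentence):
    """Return the original sentence together with each clause obtained by
    splitting on coordinating conjunctions or semicolons."""
    clauses = [c.strip() for c in re.split(r"\band\b|;", sentence) if c.strip()]
    variants = {sentence}
    variants.update(c.rstrip(".") + "." for c in clauses)
    return sorted(variants)

# Example: two candidate interactions become two simpler sentences, each
# easier for a downstream extractor to match.
print(shotgun_variants("RAD51 binds BRCA2 and TP53 activates MDM2."))
```

    Each variant can then be fed independently to the extraction pipeline, which is the sense in which simplification improves recall.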

  15. Simplified Processing Method for Meter Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Kimberly M.; Colotelo, Alison H. A.; Downs, Janelle L.

    2015-11-01

    A simple, quick processing method for metered data that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets where the analyst has little information about the buildings.

  16. Neutron residual stress measurement and numerical modeling in a curved thin-walled structure by laser powder bed fusion additive manufacturing

    DOE PAGES

    An, Ke; Yuan, Lang; Dial, Laura; ...

    2017-09-11

    Severe residual stresses in metal parts made by laser powder bed fusion additive manufacturing (LPBFAM) processes can cause both distortion and cracking during fabrication. Limited data are currently available for iterating through process conditions and design and, in particular, for validating numerical models to accelerate process certification. In this work, residual stresses of a curved thin-walled structure, made of the Ni-based superalloy Inconel 625™ and fabricated by LPBFAM, were resolved along both the build and transverse directions by neutron diffraction, without measuring the stress-free lattices. The stresses of the entire part during fabrication and after cooling down were predicted by a simplified layer-by-layer finite-element numerical model. The simulated and measured stresses were found to be in good quantitative agreement. The validated simplified simulation methodology will make it possible to assess residual stresses in more complex structures and to significantly reduce manufacturing cycle time.

  17. Neutron residual stress measurement and numerical modeling in a curved thin-walled structure by laser powder bed fusion additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Ke; Yuan, Lang; Dial, Laura

    Severe residual stresses in metal parts made by laser powder bed fusion additive manufacturing (LPBFAM) processes can cause both distortion and cracking during fabrication. Limited data are currently available for iterating through process conditions and design and, in particular, for validating numerical models to accelerate process certification. In this work, residual stresses of a curved thin-walled structure, made of the Ni-based superalloy Inconel 625™ and fabricated by LPBFAM, were resolved along both the build and transverse directions by neutron diffraction, without measuring the stress-free lattices. The stresses of the entire part during fabrication and after cooling down were predicted by a simplified layer-by-layer finite-element numerical model. The simulated and measured stresses were found to be in good quantitative agreement. The validated simplified simulation methodology will make it possible to assess residual stresses in more complex structures and to significantly reduce manufacturing cycle time.

  18. Agile green process design for the intensified Kolbe-Schmitt synthesis by accompanying (simplified) life cycle assessment.

    PubMed

    Kressirer, Sabine; Kralisch, Dana; Stark, Annegret; Krtschil, Ulrich; Hessel, Volker

    2013-05-21

    In order to investigate the potential for process intensification, various reaction conditions were applied to the Kolbe-Schmitt synthesis starting from resorcinol. Different CO₂ precursors such as aqueous potassium hydrogencarbonate, hydrogencarbonate-based ionic liquids, DIMCARB, or sc-CO₂, the application of microwave irradiation for fast volumetric heating of the reaction mixture, and the effect of harsh reaction conditions were investigated. The experiments, carried out in conventional batch-wise as well as in continuously operated microstructured reactors, aimed at the development of an environmentally benign process for the preparation of 2,4-dihydroxybenzoic acid. To provide decision support toward a green process design, a research-accompanying simplified life cycle assessment (SLCA) was performed throughout the whole investigation. Following this approach, it was found that convective heating methods such as oil bath or electrical heating were more beneficial than the application of microwave irradiation. Furthermore, the consideration of workup procedures was crucial for a holistic view on the environmental burdens.

  19. Research methodology simplification for teaching purposes illustrated by clutch automatic control device testing

    NASA Astrophysics Data System (ADS)

    Wojs, J.

    2016-09-01

    The paper shows that a simplified, shorter examination of an object, feasible in laboratory classes, can produce results similar to those reached in a full scientific investigation of the device using extensive equipment. A thorough investigation of the object, an automatic clutch control device in this case, enabled identification of the magnitudes that most significantly affect its operation. Knowing these most sensitive magnitudes allows the teaching process to focus on simplified measurement of only the selected magnitudes and on verifying whether the given object passes or fails.

  20. The new American Heart Association cardiopulmonary resuscitation guidelines: should children and adults have to share?

    PubMed

    Sherman, Mindy

    2007-06-01

    The latest American Heart Association guidelines for pediatric cardiopulmonary resuscitation (CPR) were published in December 2005. Changes from the 2000 guidelines were directed toward simplifying CPR. Infants, children, and adults now share the same recommendation for the initial compression:ventilation ratio. This is a significant change for pediatricians trained in the importance of a respiratory etiology of pediatric cardiopulmonary arrest. The present review will focus on the rationale behind these guideline changes. The new guidelines for single rescuer CPR include a compression:ventilation ratio of 30:2 for both adult and pediatric victims. The impetus for this recommendation is based on recent appreciation for the deleterious effects of hyperventilation as well as an attempt to increase bystander delivery of CPR. The physiologic results of hyperventilation are discussed. The new pediatric basic life support guideline changes are underscored. Research representing the spectrum of opinions on the optimal compression:ventilation ratio, including compression-only CPR, is presented. Although based primarily on adult, animal, and computational models, the new compression:ventilation ratio, recommended for both initial pediatric and adult CPR, is a reasonable recommendation. The simplified CPR guidelines released in 2005 will hopefully contribute to improved bystander delivery of CPR and improved outcome.

  1. A (137)Cs erosion model with moving boundary.

    PubMed

    Yin, Chuan; Ji, Hongbing

    2015-12-01

    A novel quantitative model of the relationship between diffused concentration changes and erosion rates was developed for the assessment of soil losses. It derives from analysis of surface-soil (137)Cs flux variation under a persistent erosion effect and is based on the geochemical-kinetics principle of a moving boundary. The new moving-boundary model improves on the basic simplified transport model (Zhang et al., 2008) and mainly applies to uniform-rainfall areas subject to long-term soil erosion. The simulation results show that under long-term soil erosion, the (137)Cs concentration decreases exponentially with increasing depth. Fitting the new model to the measured (137)Cs depth-distribution data at the Zunyi site, Guizhou Province, China, which has typical uniform rainfall, provided a good fit (R² = 0.92). To compare the soil erosion rates calculated by the simple transport model and the new model, we take the Kaixian reference profile as an example. The soil losses estimated by the previous simplified transport model are greater than those estimated by the new moving-boundary model, which is consistent with our expectations. Copyright © 2015 Elsevier Ltd. All rights reserved.
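
    The reported exponential decrease of (137)Cs concentration with depth can be recovered from profile data by a log-linear least-squares fit. The sketch below is a generic illustration with synthetic numbers, not the paper's moving-boundary model.

```python
# Fit C(z) = C0 * exp(-lam * z) to a depth profile via linear regression on
# log C. Illustrative sketch with synthetic data; not the paper's model.
import math

def fit_exponential(depths, concentrations):
    """Return (C0, lam) from a log-linear least-squares fit."""
    ys = [math.log(c) for c in concentrations]
    n = len(depths)
    zbar = sum(depths) / n
    ybar = sum(ys) / n
    slope = (sum((z - zbar) * (y - ybar) for z, y in zip(depths, ys))
             / sum((z - zbar) ** 2 for z in depths))
    intercept = ybar - slope * zbar
    return math.exp(intercept), -slope

# Synthetic profile generated from C0 = 10, lam = 0.5 per unit depth.
depths = [0.0, 2.0, 4.0, 6.0, 8.0]
concs = [10.0 * math.exp(-0.5 * z) for z in depths]
C0, lam = fit_exponential(depths, concs)
print(C0, lam)   # recovers approximately (10, 0.5)
```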

  2. Development of Generation System of Simplified Digital Maps

    NASA Astrophysics Data System (ADS)

    Uchimura, Keiichi; Kawano, Masato; Tokitsu, Hiroki; Hu, Zhencheng

    In recent years, digital maps have been used in a variety of scenarios, including car navigation systems and map information services over the Internet. These digital maps are formed by multiple layers of maps of different scales; the map data most suitable for the specific situation are used. Currently, the production of map data of different scales is done by hand due to constraints related to processing time and accuracy. We conducted research concerning technologies for automatic generation of simplified map data from detailed map data. In the present paper, the authors propose the following: (1) a method to transform data related to streets, rivers, etc. containing widths into line data, (2) a method to eliminate the component points of the data, and (3) a method to eliminate data that lie below a certain threshold. In addition, in order to evaluate the proposed method, a user survey was conducted; in this survey we compared maps generated using the proposed method with the commercially available maps. From the viewpoint of the amount of data reduction and processing time, and on the basis of the results of the survey, we confirmed the effectiveness of the automatic generation of simplified maps using the proposed methods.
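
    The point-elimination step described above can be realized by a standard line-simplification algorithm such as Ramer-Douglas-Peucker. The sketch below is a generic implementation, not the authors' method, and the tolerance value is arbitrary.

```python
# Ramer-Douglas-Peucker line simplification: drop points whose perpendicular
# distance from the chord is below a tolerance. Generic sketch; the paper's
# own elimination rules are not reproduced here.
import math

def _perp_dist(p, a, b):
    """Perpendicular distance of point p from the line through a and b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / math.hypot(dx, dy)

def rdp(points, tol):
    """Simplify a polyline, keeping points farther than tol from the chord."""
    if len(points) < 3:
        return list(points)
    dists = [_perp_dist(p, points[0], points[-1]) for p in points[1:-1]]
    imax = max(range(len(dists)), key=dists.__getitem__)
    if dists[imax] <= tol:
        return [points[0], points[-1]]
    split = imax + 1
    return rdp(points[:split + 1], tol)[:-1] + rdp(points[split:], tol)

# Nearly collinear points collapse to their endpoints.
print(rdp([(0, 0), (1, 1.02), (2, 1.98), (3, 3)], tol=0.1))  # → [(0, 0), (3, 3)]
```

    Raising the tolerance removes more component points, which is the trade-off the user survey in the paper evaluates.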

  3. Next Generation Remote Agent Planner

    NASA Technical Reports Server (NTRS)

    Jonsson, Ari K.; Muscettola, Nicola; Morris, Paul H.; Rajan, Kanna

    1999-01-01

    In May 1999, as part of a unique technology validation experiment onboard the Deep Space One spacecraft, the Remote Agent became the first complete autonomous spacecraft control architecture to run as flight software onboard an active spacecraft. As one of the three components of the architecture, the Remote Agent Planner had the task of laying out the course of action to be taken, which included activities such as turning, thrusting, data gathering, and communicating. Building on the successful approach developed for the Remote Agent Planner, the Next Generation Remote Agent Planner is a completely redesigned and reimplemented version of the planner. The new system provides all the key capabilities of the original planner, while adding functionality, improving performance and providing a modular and extendible implementation. The goal of this ongoing project is to develop a system that provides both a basis for future applications and a framework for further research in the area of autonomous planning for spacecraft. In this article, we present an introductory overview of the Next Generation Remote Agent Planner. We present a new and simplified definition of the planning problem, describe the basics of the planning process, lay out the new system design and examine the functionality of the core reasoning module.

  4. Basic principles to consider when opening a nurse practitioner-owned practice in Texas.

    PubMed

    Watson, Michael

    2015-12-01

    Advanced Practice Registered Nurse (APRN)-owned clinics in Texas are becoming more common, and because of the success of these early clinics, more APRNs are considering opening their own practice; but Texas remains one of the most restrictive states for APRN practice, and many questions remain. What are the regulations about physician delegation? Will you get reimbursed from insurance companies, and at what rates? Can you be a primary care provider (PCP)? Changes enacted after the adoption of Senate Bill 406 improved the opportunities for APRNs in Texas, yet several requirements must be met, and early consultation with a lawyer and accountant can facilitate the initial business setup. The Prescriptive Authority Agreement simplified the delegation requirements and allows the APRN increased flexibility in obtaining and consulting with a delegating physician. Becoming credentialed as a PCP with private insurance companies is often complicated; however, utilizing the Council for Affordable Quality Healthcare's Universal Provider Datasource for initial credentialing can facilitate this. Although this article does not discuss the financial implications of opening a practice, it does cover many aspects, including legislative and regulatory requirements for practice, the credentialing process and its challenges, business structure, and tax implications. ©2015 American Association of Nurse Practitioners.

  5. Hypersonic Shock/Boundary-Layer Interaction Database

    NASA Technical Reports Server (NTRS)

    Settles, G. S.; Dodson, L. J.

    1991-01-01

    Turbulence modeling is generally recognized as the major problem obstructing further advances in computational fluid dynamics (CFD). A closed solution of the governing Navier-Stokes equations for turbulent flows of practical consequence is still far beyond grasp. At the same time, the simplified models of turbulence which are used to achieve closure of the Navier-Stokes equations are known to be rigorously incorrect. While these models serve a definite purpose, they are inadequate for the general prediction of hypersonic viscous/inviscid interactions, mixing problems, chemical nonequilibria, and a range of other phenomena which must be predicted in order to design a hypersonic vehicle computationally. Due to the complexity of turbulence, useful new turbulence models are synthesized only when great expertise is brought to bear and considerable intellectual energy is expended. Although this process is fundamentally theoretical, crucial guidance may be gained from carefully-executed basic experiments. Following the birth of a new model, its testing and validation once again demand comparisons with data of unimpeachable quality. This report concerns these issues which arise from the experimental aspects of hypersonic modeling and represents the results of the first phase of an effort to develop compressible turbulence models.

  6. Molluscan cells in culture: primary cell cultures and cell lines

    PubMed Central

    Yoshino, T. P.; Bickham, U.; Bayne, C. J.

    2013-01-01

    In vitro cell culture systems from molluscs have significantly contributed to our basic understanding of complex physiological processes occurring within or between tissue-specific cells, yielding information unattainable using intact animal models. In vitro cultures of neuronal cells from gastropods show how simplified cell models can inform our understanding of complex networks in intact organisms. Primary cell cultures from marine and freshwater bivalve and gastropod species are used as biomonitors for environmental contaminants, as models for gene transfer technologies, and for studies of innate immunity and neoplastic disease. Despite efforts to isolate proliferative cell lines from molluscs, the snail Biomphalaria glabrata Say, 1818 embryonic (Bge) cell line is the only existing cell line originating from any molluscan species. Taking an organ systems approach, this review summarizes efforts to establish molluscan cell cultures and describes the varied applications of primary cell cultures in research. Because of the unique status of the Bge cell line, an account is presented of the establishment of this cell line, and of how these cells have contributed to our understanding of snail host-parasite interactions. Finally, we detail the difficulties commonly encountered in efforts to establish cell lines from molluscs and discuss how these difficulties might be overcome. PMID:24198436

  7. Pleated and Creased Structures

    NASA Astrophysics Data System (ADS)

    Dudte, Levi; Wei, Zhiyan; Mahadevan, L.

    2012-02-01

    The strategic placement of curved folds on a paper annulus produces saddle-shaped origami. These exotic geometries resulting from simple design processes motivate our development of a computational tool to simulate the stretching, bending and folding of thin sheets of material. We seek to understand the shape of the curved origami figure by applying the computational tool to simulate a thin annulus with single or multiple folds. We aim to quantify the static geometry of this simplified model in order to delineate methods for actuation and control of similar developable structures with curved folds. The Miura-ori pattern is a periodic pleated structure defined in terms of two angles and two lengths. The unit cell embodies the basic element in all non-trivial pleated structures - the mountain or valley folds, wherein four folds come together at a single vertex. The ability of this structure to pack and unpack with a few degrees of freedom leads to its use in deployable structures such as solar sails and maps, just as this feature is useful in insect wings, plant leaves and flowers. We probe the qualitative and quantitative aspects of the mechanical behavior of these structures with a view to optimizing material performance.

  8. Aligning interprofessional education collaborative sub-competencies to a progression of learning.

    PubMed

    Patel Gunaldo, Tina; Brisolara, Kari Fitzmorris; Davis, Alison H; Moore, Robert

    2017-05-01

    In the United States, the Interprofessional Education Collaborative (IPEC) developed four core competencies for interprofessional collaborative practice. Even though the IPEC competencies and respective sub-competencies were not created in a hierarchical manner, one might reflect upon a logical progression of learning as well as learners accruing skills allowing them to master one level of learning and building on the aggregate of skills before advancing to the next level. The Louisiana State University Health-New Orleans Center for Interprofessional Education and Collaborative Practice (CIPECP) determined the need to align the sub-competencies with the level of behavioural expectations in order to simplify the process of developing an interprofessional education experience targeted to specific learning levels. In order to determine the most effective alignment, CIPECP discussions revolved around current programmatic expectations across the institution. Faculty recognised the need to align sub-competencies with student learning objectives. Simultaneously, a progression of learning existing within each of the four IPEC domains was noted. Ultimately, the faculty and staff team agreed upon categorising the sub-competencies in a hierarchical manner for the four domains into either a "basic, intermediate, or advanced" level of competency.

  9. New Approaches for Calculating Moran’s Index of Spatial Autocorrelation

    PubMed Central

    Chen, Yanguang

    2013-01-01

    Spatial autocorrelation plays an important role in geographical analysis; however, there is still room for improvement of this method. The formula for Moran’s index is complicated, and several basic problems remain to be solved. Therefore, I will reconstruct its mathematical framework using mathematical derivation based on linear algebra and present four simple approaches to calculating Moran’s index. Moran’s scatterplot will be ameliorated, and new test methods will be proposed. The relationship between the global Moran’s index and Geary’s coefficient will be discussed from two different vantage points: spatial population and spatial sample. The sphere of applications for both Moran’s index and Geary’s coefficient will be clarified and defined. One of the theoretical findings is that Moran’s index is a characteristic parameter of spatial weight matrices, so the selection of weight functions is very significant for autocorrelation analysis of geographical systems. A case study of 29 Chinese cities in 2000 will be employed to validate the innovatory models and methods. This work is a methodological study, which will simplify the process of autocorrelation analysis. The results of this study will lay the foundation for the scaling analysis of spatial autocorrelation. PMID:23874592
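
    For reference, the global Moran's index in its textbook form (not the author's reconstructed linear-algebra formulation) is I = (n / S0) * sum_ij w_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2, where S0 is the sum of all weights. A minimal sketch with an invented toy layout:

```python
# Global Moran's I in its standard form. Illustrative sketch; the paper
# derives four alternative computations of the same quantity.
def morans_i(values, weights):
    """values: list of n observations; weights: n x n spatial weight matrix."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    s0 = sum(sum(row) for row in weights)           # total weight
    num = sum(weights[i][j] * z[i] * z[j]
              for i in range(n) for j in range(n))  # cross-products
    den = sum(zi * zi for zi in z)                  # variance term
    return (n / s0) * (num / den)

# Four sites on a line with binary nearest-neighbour (rook) weights:
# monotonically increasing values give positive spatial autocorrelation.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i([1, 2, 3, 4], w))   # → 0.333...
```

    The dependence on `w` in this sketch mirrors the paper's point that Moran's index is a characteristic parameter of the spatial weight matrix.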

  10. Developing a Mobile Application "Educational Process Remote Management System" on the Android Operating System

    ERIC Educational Resources Information Center

    Abildinova, Gulmira M.; Alzhanov, Aitugan K.; Ospanova, Nazira N.; Taybaldieva, Zhymatay; Baigojanova, Dametken S.; Pashovkin, Nikita O.

    2016-01-01

    Nowadays, when there is a need to introduce various innovations into the educational process, most efforts are aimed at simplifying the learning process. To that end, electronic textbooks, testing systems and other software is being developed. Most of them are intended to run on personal computers with limited mobility. Smart education is…

  11. Short Shot Tower for Silicon

    NASA Technical Reports Server (NTRS)

    Bates, H. E.; Hill, D. M.; Jewett, D. N.

    1983-01-01

    Drop length necessary to convert molten silicon to shot reduced by proposed new process. Conversion of silicon from powder or chunks to shot often simplifies processing. Shot is more easily handled in most processing equipment. Drops of liquid silicon fall through protective cloud of argon, then through rapidly cooling bath of methanol, where they quickly turn into solid shot.

  12. The Clone Factory

    ERIC Educational Resources Information Center

    Stoddard, Beryl

    2005-01-01

    Have humans been cloned? Is it possible? Immediate interest is sparked when students are asked these questions. In response to their curiosity, the clone factory activity was developed to help them understand the process of cloning. In this activity, students reenact the cloning process, in a very simplified simulation. After completing the…

  13. A transfer function type of simplified electrochemical model with modified boundary conditions and Padé approximation for Li-ion battery: Part 1. lithium concentration estimation

    NASA Astrophysics Data System (ADS)

    Yuan, Shifei; Jiang, Lei; Yin, Chengliang; Wu, Hongjie; Zhang, Xi

    2017-06-01

    To guarantee safety, high efficiency, and long lifetime for a lithium-ion battery, an advanced battery management system requires a physically meaningful yet computationally efficient battery model. The pseudo-two-dimensional (P2D) electrochemical model can provide physical information about the lithium concentration and potential distributions across the cell dimension. However, the extensive computational burden caused by the temporal and spatial discretization limits its real-time application. In this research, we propose a new simplified electrochemical model (SEM) by modifying the boundary conditions for the electrolyte diffusion equations, which significantly facilitates the analytical solving process. Then, to obtain a reduced-order transfer function, the Padé approximation method is adopted to simplify the derived transcendental impedance solution. The proposed model with the reduced-order transfer function is computationally compact and preserves physical meaning through the presence of parameters such as the solid/electrolyte diffusion coefficients (Ds & De) and the particle radius. The simulation illustrates that the proposed simplified model maintains high accuracy for electrolyte-phase concentration (Ce) predictions, with 0.8% and 0.24% modeling error respectively, when compared to the rigorous model under 1C-rate pulse charge/discharge and urban dynamometer driving schedule (UDDS) profiles. Meanwhile, this simplified model yields a significantly reduced computational burden, which benefits its real-time application.
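
    The Padé reduction step can be illustrated generically: given Taylor coefficients c0..c(L+M) of a transcendental transfer function, the denominator coefficients of the [L/M] Padé approximant solve a small linear system. The sketch below uses exact rational arithmetic and a toy input (the exponential series), not the battery model's actual impedance solution.

```python
# [L/M] Pade approximant from Taylor coefficients, using exact rational
# arithmetic. Generic sketch; the paper applies the same idea to a
# transcendental impedance transfer function.
from fractions import Fraction

def pade(c, L, M):
    """c: Taylor coefficients c[0..L+M]. Returns (p, q) with q[0] = 1 such
    that (sum p_k x^k) / (sum q_k x^k) matches the series to order L+M."""
    c = [Fraction(x) for x in c]
    # Solve for q[1..M]: c[k] + sum_{j=1..M} q[j]*c[k-j] = 0, k = L+1..L+M.
    A = [[(c[k - j] if k - j >= 0 else Fraction(0)) for j in range(1, M + 1)]
         for k in range(L + 1, L + M + 1)]
    b = [-c[k] for k in range(L + 1, L + M + 1)]
    q = [Fraction(1)] + _solve(A, b)
    p = [sum(q[j] * c[k - j] for j in range(0, min(k, M) + 1))
         for k in range(L + 1)]
    return p, q

def _solve(A, b):
    """Gauss-Jordan elimination over Fractions for a small linear system."""
    n = len(b)
    aug = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col] / aug[col][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [aug[r][n] / aug[r][r] for r in range(n)]

# [1/1] approximant of exp(x) from c = [1, 1, 1/2]:
# expected (1 + x/2) / (1 - x/2).
p, q = pade([1, 1, Fraction(1, 2)], 1, 1)
print(p, q)
```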

  14. A practical method of determining water current velocities and diffusion coefficients in coastal waters by remote sensing techniques

    NASA Technical Reports Server (NTRS)

    James, W. P.

    1971-01-01

    A simplified procedure is presented for determining water current velocities and diffusion coefficients. Dye drops which form dye patches in the receiving water are made from an aircraft. The changes in position and size of the patches are recorded from two flights over the area. The simplified data processing procedure requires only that the ground coordinates about the dye patches be determined at the time of each flight. With an automatic recording coordinatograph for measuring coordinates and a computer for processing the data, this technique provides a practical method of determining circulation patterns and mixing characteristics of large aquatic systems. This information is useful in assessing the environmental impact of waste water discharges and for industrial plant siting.

  15. Navigation system and method

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.; Sennott, J. W. (Inventor)

    1984-01-01

    In a global positioning system (GPS), such as the NAVSTAR/GPS system, wherein the position coordinates of user terminals are obtained by processing multiple signals transmitted by a constellation of orbiting satellites, an acquisition-aiding signal generated by an earth-based control station is relayed to user terminals via a geostationary satellite to simplify user equipment. The aiding signal is FSK modulated on a reference channel slightly offset from the standard GPS channel. The aiding signal identifies satellites in view having best geometry and includes Doppler prediction data as well as GPS satellite coordinates and identification data associated with user terminals within an area being served by the control station and relay satellite. The aiding signal significantly reduces user equipment by simplifying spread spectrum signal demodulation and reducing data processing functions previously carried out at the user terminals.

  16. Development of quantitative security optimization approach for the picture archives and carrying system between a clinic and a rehabilitation center

    NASA Astrophysics Data System (ADS)

    Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko

    2002-05-01

    The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures, and to optimize the distribution of security across every portion of the medical practice. Quantitative expression is introduced where possible, to enable simplified follow-up security procedures and easy evaluation of security outcomes or results. Using fault tree analysis (FTA), system analysis showed that subdividing system elements into detailed groups results in a much more accurate analysis. Such subdivided composition factors depend greatly on the behavior of staff, interactive terminal devices, the kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security, and proposed security measures for each medical information system together with the basic events, and combinations of events, that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, the number of elements for each composition factor, and potential security measures were found. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors for basic events.
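
    For independent basic events, the fault-tree computation itself reduces to product rules at AND gates and complement-products at OR gates. The sketch below, including the event names and probabilities, is an illustrative assumption, not a figure from the study.

```python
# Top-event probability of a fault tree with independent basic events.
# AND gate: product of child probabilities; OR gate: 1 - prod(1 - p).
# Illustrative sketch; events and probabilities are invented.
from math import prod

def evaluate(node):
    """node is either a number (basic-event probability) or a tuple
    ("AND" | "OR", [children])."""
    if isinstance(node, (int, float)):
        return float(node)
    gate, children = node
    ps = [evaluate(c) for c in children]
    if gate == "AND":
        return prod(ps)
    return 1.0 - prod(1.0 - p for p in ps)

# Hypothetical threat: disclosure occurs if a terminal is left unlocked
# AND (staff error OR network eavesdropping).
tree = ("AND", [0.2, ("OR", [0.1, 0.05])])
print(evaluate(tree))   # 0.2 * (1 - 0.9 * 0.95) = 0.029
```

    Subdividing a composition factor into finer basic events, as the study recommends, simply adds depth to this tree without changing the evaluation rules.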

  17. PLYMAP : a computer simulation model of the rotary peeled softwood plywood manufacturing process

    Treesearch

    Henry Spelter

    1990-01-01

    This report documents a simulation model of the plywood manufacturing process. Its purpose is to enable a user to make quick estimates of the economic impact of a particular process change within a mill. The program was designed to simulate the processing of plywood within a relatively simplified mill design. Within that limitation, however, it allows a wide range of...

  18. The Condition for Generous Trust

    PubMed Central

    Shinya, Obayashi; Yusuke, Inagaki; Hiroki, Takikawa

    2016-01-01

    Trust has been considered the “cement” of a society and is much studied in sociology and other social sciences. Most studies, however, have neglected one important aspect of trust: it involves an act of forgiving and showing tolerance toward another’s failure. In this study, we refer to this concept as “generous trust” and examine the conditions under which generous trust becomes a more viable option when compared to other types of trust. We investigate two settings. First, we introduce two types of uncertainties: uncertainty as to whether trustees have the intention to cooperate, and uncertainty as to whether trustees have enough competence to accomplish the entrusted tasks. Second, we examine the manner in which trust functions in a broader social context, one that involves matching and commitment processes. Since we expect generosity or forgiveness to work differently in the matching and commitment processes, we must differentiate trust strategies into generous trust in the matching process and that in the commitment process. Our analytical strategy is two-fold. First, we analyze the “modified” trust game that incorporates the two types of uncertainties without the matching process. This simplified setting enables us to derive mathematical results using game theory, thereby giving basic insight into the trust mechanism. Second, we investigate socially embedded trust relationships in contexts involving the matching and commitment processes, using agent-based simulation. Results show that uncertainty about partner’s intention and competence makes generous trust a viable option. In contrast, too much uncertainty undermines the possibility of generous trust. Furthermore, a strategy that is too generous cannot stand alone. Generosity should be accompanied with moderate punishment. As for socially embedded trust relationships, generosity functions differently in the matching process versus the commitment process. 
Indeed, these two types of generous trust coexist, and their coexistence enables a society to function well. PMID:27893759
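
    A stripped-down repeated trust game with the two uncertainty types can be sketched as follows. All payoffs, probabilities, and the forgiveness rule here are illustrative stand-ins, not the paper's actual model:

```python
import random

def play_rounds(p_intent, p_competence, forgiveness, rounds=200, seed=1):
    """Toy repeated trust game with two kinds of uncertainty.

    A failure can come from bad intent (the trustee does not cooperate)
    or low competence (the task fails anyway). The trustor keeps trusting
    until it has seen more than `forgiveness` consecutive failures, then
    withdraws for good."""
    rng = random.Random(seed)
    payoff, failures = 0, 0
    for _ in range(rounds):
        if failures > forgiveness:
            continue              # trust withdrawn: opt out, neutral payoff
        cooperates = rng.random() < p_intent
        succeeds = rng.random() < p_competence
        if cooperates and succeeds:
            payoff += 2           # entrusted task accomplished
            failures = 0
        else:
            payoff -= 1           # betrayal and incompetence look the same
            failures += 1
    return payoff

# A generous trustor (forgives 3 failures in a row) vs. an unforgiving one.
generous = play_rounds(0.9, 0.8, forgiveness=3)
strict = play_rounds(0.9, 0.8, forgiveness=0)
```

    Varying `forgiveness` against the two probabilities reproduces the qualitative trade-off described above: some generosity keeps profitable relationships alive, while unlimited generosity never punishes exploitation.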

  19. RILS: What are they, what are they good for, and do we have any?

    USDA-ARS?s Scientific Manuscript database

    RILs, or recombinant inbred lines, are a set of genetically related individuals that can simplify the gene discovery process. They are constructed using regular breeding processes rather than using tissue culture or other advanced biotechnology. Operationally, a hybrid is made, and this hybrid is se...

  20. 75 FR 77649 - Agency Information Collection Activities: Proposed Collection: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-13

    ... Division of Independent Review Grant Reviewer Recruitment Form (OMB No. 0915-0295)--[Extension] HRSA's... of all eligible applications submitted to HRSA. DIR ensures that the independent review process is... experience; and allows maximum use of drop-down menus to simplify the data collection process. The Web-based...

  1. Operational Control Procedures for the Activated Sludge Process, Part III-A: Calculation Procedures.

    ERIC Educational Resources Information Center

    West, Alfred W.

    This is the second in a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. This document deals exclusively with the calculation procedures, including simplified mixing formulas, aeration tank…

  2. Evaluation of Models of the Reading Process.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  3. Initial Crisis Reaction and Poliheuristic Theory

    ERIC Educational Resources Information Center

    DeRouen, Karl, Jr.; Sprecher, Christopher

    2004-01-01

    Poliheuristic (PH) theory models foreign policy decisions using a two-stage process. The first step eliminates alternatives on the basis of a simplifying heuristic. The second step involves a selection from among the remaining alternatives and can employ a more rational and compensatory means of processing information. The PH model posits that…

  4. Polymer flammability

    DOT National Transportation Integrated Search

    2005-05-01

    This report provides an overview of polymer flammability from a material science perspective and describes currently accepted test methods to quantify burning behavior. Simplifying assumptions about the gas and condensed phase processes of flaming co...

  5. The Application of a Massively Parallel Computer to the Simulation of Electrical Wave Propagation Phenomena in the Heart Muscle Using Simplified Models

    NASA Technical Reports Server (NTRS)

    Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.

    1995-01-01

The simulation of heart arrhythmia and fibrillation is a very important and challenging task. The solution of these problems using sophisticated mathematical models is beyond the capabilities of modern supercomputers. To overcome these difficulties it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well known simplified models are compared and modified to bring the rate of depolarization and action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat and the subsequent traveling of the spiral wave inside the simulated tissue are studied.
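
    One of the well-known simplified action-potential models alluded to above is the FitzHugh-Nagumo system, which captures the qualitative shape of the action potential with just two variables. A minimal Euler sketch with textbook parameter values (not necessarily those used in the study):

```python
def fhn_step(v, w, dt=0.05, i_ext=0.5, a=0.7, b=0.8, eps=0.08):
    """One Euler step of the FitzHugh-Nagumo equations:
        dv/dt = v - v^3/3 - w + I_ext   (fast voltage-like variable)
        dw/dt = eps * (v + a - b * w)   (slow recovery variable)
    """
    dv = v - v ** 3 / 3 - w + i_ext
    dw = eps * (v + a - b * w)
    return v + dv * dt, w + dw * dt

# With a constant stimulus the cell fires repeatedly (relaxation oscillations):
# v periodically jumps to the depolarized branch near v = 2.
v, w, v_max = -1.0, 1.0, -10.0
for _ in range(4000):
    v, w = fhn_step(v, w)
    v_max = max(v_max, v)
```

    Models of this kind are cheap enough to integrate over a whole simulated tissue, which is why the paper uses them for the propagation stage.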

  6. Simplified model of mean double step (MDS) in human body movement

    NASA Astrophysics Data System (ADS)

    Dusza, Jacek J.; Wawrzyniak, Zbigniew M.; Mugarra González, C. Fernando

In this paper we present a simplified and useful model of human body movement based on the full gait cycle description, called the Mean Double Step (MDS). It enables the parameterization and simplification of human movement. Furthermore, it allows a description of the gait cycle by providing standardized estimators that transform the gait cycle into a periodic movement process. Moreover, a method of simplifying the MDS model and compressing it is demonstrated. The simplification is achieved by reducing the number of bars in the spectrum and/or by reducing the number of samples describing the MDS, which lowers both the computational burden and the data-storage requirements. Our MDS model, which is applicable to the gait cycle method for examining patients, is non-invasive and provides the additional advantage of featuring a functional characterization of the relative or absolute movement of any part of the body.
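
    Reducing the number of spectral bars amounts to truncating the Fourier series of the periodic gait signal. A minimal illustration with a synthetic cycle (the signal and harmonic counts are invented for the example, not taken from the paper):

```python
import math

def fourier_compress(samples, n_harmonics):
    """Compress a periodic signal (one gait cycle) by keeping only the
    first n_harmonics of its Fourier series; returns the reconstruction."""
    n = len(samples)
    mean = sum(samples) / n
    coeffs = []
    for h in range(1, n_harmonics + 1):
        a = 2 / n * sum(x * math.cos(2 * math.pi * h * t / n)
                        for t, x in enumerate(samples))
        b = 2 / n * sum(x * math.sin(2 * math.pi * h * t / n)
                        for t, x in enumerate(samples))
        coeffs.append((a, b))
    return [mean + sum(a * math.cos(2 * math.pi * (h + 1) * t / n)
                       + b * math.sin(2 * math.pi * (h + 1) * t / n)
                       for h, (a, b) in enumerate(coeffs))
            for t in range(n)]

# A synthetic "gait" cycle: one dominant harmonic plus a small 7th-harmonic ripple.
cycle = [math.sin(2 * math.pi * t / 64)
         + 0.1 * math.sin(2 * math.pi * 7 * t / 64) for t in range(64)]
approx = fourier_compress(cycle, 1)  # keep only the fundamental
```

    Keeping one harmonic drops the ripple but preserves the dominant movement pattern; keeping seven reconstructs the cycle exactly, which is the storage-vs-fidelity trade-off the MDS compression exploits.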

  7. Examination of simplified travel demand model. [Internal volume forecasting model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, R.L. Jr.; McFarlane, W.J.

    1978-01-01

A simplified travel demand model, the Internal Volume Forecasting (IVF) model, proposed by Low in 1972, is evaluated as an alternative to the conventional urban travel demand modeling process. The calibration of the IVF model for a county-level study area in Central Wisconsin results in what appears to be a reasonable model; however, analysis of the structure of the model reveals two primary mis-specifications. Correction of the mis-specifications leads to a simplified gravity model version of the conventional urban travel demand models. Application of the original IVF model to "forecast" 1960 traffic volumes based on the model calibrated for 1970 produces accurate estimates. Shortcut and ad hoc models may appear to provide reasonable results in both the base and horizon years; however, as shown by the IVF model, such models will not always provide a reliable basis for transportation planning and investment decisions.
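
    A gravity model of the kind mentioned above distributes trips from each origin in proportion to destination attractiveness discounted by travel cost. A minimal singly-constrained sketch (the exponential impedance function and all numbers are illustrative, not from the report):

```python
import math

def gravity_trips(origins, destinations, cost, beta=0.1):
    """Singly-constrained gravity model:
        T_ij = O_i * D_j * f(c_ij) / sum_k D_k * f(c_ik)

    origins: trip productions O_i; destinations: trip attractions D_j;
    cost: matrix of travel costs c_ij; beta: decay parameter in the
    impedance f(c) = exp(-beta * c)."""
    trips = []
    for i, o_i in enumerate(origins):
        weights = [d_j * math.exp(-beta * cost[i][j])
                   for j, d_j in enumerate(destinations)]
        total = sum(weights)
        trips.append([o_i * w / total for w in weights])
    return trips

# Two zones: each origin row sums to its production O_i, and nearby
# (low-cost) destinations attract more of the trips.
T = gravity_trips([100, 50], [80, 20], [[1, 5], [5, 1]])
```

    The "singly-constrained" property (rows summing to productions) is what distinguishes a proper gravity formulation from the shortcut volume-forecasting models the report criticizes.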

  8. Simplified Model and Response Analysis for Crankshaft of Air Compressor

    NASA Astrophysics Data System (ADS)

    Chao-bo, Li; Jing-jun, Lou; Zhen-hai, Zhang

    2017-11-01

The original crankshaft model is simplified to an appropriate degree to balance calculation precision against calculation speed, and the finite element method is then used to analyse the vibration response of the structure. To study the simplification and stress concentration for an air compressor crankshaft, this paper compares the calculated and experimental mode frequencies of the crankshaft before and after simplification, uses the simplified model to calculate the vibration response at a reference point under the constraint conditions, and calculates the stress distribution of the original model. The results show that the error between calculated and experimental mode frequencies is less than 7%, that the constraints change the modal density of the system, and that stress concentration appears where the crank arm joins the shaft, so that part of the crankshaft should be treated carefully during manufacture.

  9. Translating patient reported outcome measures: methodological issues explored using cognitive interviewing with three rheumatoid arthritis measures in six European languages.

    PubMed

Hewlett, Sarah; Nicklin, Joanna; Bode, Christina; Carmona, Loreto; Dures, Emma; Engelbrecht, Matthias; Hagel, Sofia; Kirwan, John; Molto, Anna; Redondo, Marta; Gossec, Laure

    2016-06-01

    Cross-cultural translation of patient-reported outcome measures (PROMs) is a lengthy process, often performed professionally. Cognitive interviewing assesses patient comprehension of PROMs. The objective was to evaluate the usefulness of cognitive interviewing to assess translations and compare professional (full) with non-professional (simplified) translation processes. A full protocol used for the Bristol RA Fatigue Multi-dimensional Questionnaire and Numerical Rating Scale (BRAF-MDQ, BRAF-NRS) was compared with a simplified protocol used for the RA Impact of Disease scale (RAID). RA patients in the UK, France, the Netherlands, Germany, Spain and Sweden completed the PROMs during cognitive interviewing (BRAFs in the UK were omitted as these were performed during development). Transcripts were deductively analysed for understanding, information retrieval, judgement and response options. Usefulness of cognitive interviewing was assessed by the nature of problems identified, and translation processes by percentage of consistently problematic items (⩾40% patients per country with similar concerns). Sixty patients participated (72% women). For the BRAFs (full protocol) one problematic item was identified (of 23 items × 5 languages, 1/115 = 0.9%). For the RAID (simplified protocol) two problematic items were identified (of 7 items × 6 languages, 2/42 = 4.8%), of which one was revised (Dutch). Coping questions were problematic in both PROMs. Conceptual and cultural challenges though rare were important, as identified by formal evaluation, demonstrating that cognitive interviewing is crucial in PROM translations. Proportionately fewer problematic items were found for the full than for the simplified translation procedure, suggesting that while both are acceptable, professional PROM translation might be preferable. Coping may be a particularly challenging notion cross-culturally. © The Author 2016. 
Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Application of a simplified theory of ELF propagation to a simplified worldwide model of the ionosphere

    NASA Astrophysics Data System (ADS)

    Behroozi-Toosi, A. B.; Booker, H. G.

    1980-12-01

    The simplified theory of ELF wave propagation in the earth-ionosphere transmission lines developed by Booker (1980) is applied to a simplified worldwide model of the ionosphere. The theory, which involves the comparison of the local vertical refractive index gradient with the local wavelength in order to classify the altitude into regions of low and high gradient, is used for a model of electron and negative ion profiles in the D and E regions below 150 km. Attention is given to the frequency dependence of ELF propagation at a middle latitude under daytime conditions, the daytime latitude dependence of ELF propagation at the equinox, the effects of sunspot, seasonal and diurnal variations on propagation, nighttime propagation neglecting and including propagation above 100 km, and the effect on daytime ELF propagation of a sudden ionospheric disturbance. The numerical values obtained by the method for the propagation velocity and attenuation rate are shown to be in general agreement with the analytic Naval Ocean Systems Center computer program. It is concluded that the method employed gives more physical insights into propagation processes than any other method, while requiring less effort and providing maximal accuracy.

  11. No Pixel Left Behind - Peeling Away NASA's Satellite Swaths

    NASA Astrophysics Data System (ADS)

    Cechini, M. F.; Boller, R. A.; Schmaltz, J. E.; Roberts, J. T.; Alarcon, C.; Huang, T.; McGann, M.; Murphy, K. J.

    2014-12-01

Discovery and identification of Earth Science products should not be the majority effort of scientific research. Search aids based on text metadata go to great lengths to simplify this process. However, the process is still cumbersome and requires too much data download and analysis to down select to valid products. The EOSDIS Global Imagery Browse Services (GIBS) is attempting to improve this process by providing "visual metadata" in the form of full-resolution visualizations representing geophysical parameters taken directly from the data. Through the use of accompanying interpretive information such as color legends and the natural visual processing of the human eye, researchers are able to search and filter through data products in a more natural and efficient way. The GIBS "visual metadata" products are generated as representations of Level 3 data or as temporal composites of the Level 2 granule- or swath-based data products projected across a geographic or polar region. Such an approach allows for low-latency tiled access to pre-generated imagery products. For many GIBS users, the resulting image suffices for a basic representation of the underlying data. However, composite imagery presents an insurmountable problem: for areas of spatial overlap within the composite, only one observation is visually represented. This is especially problematic in the polar regions where a significant portion of sensed data is "lost." In response to its user community, the GIBS team coordinated with its stakeholders to begin developing an approach to ensure that there is "no pixel left behind." In this presentation we will discuss the use cases and requirements guiding our efforts, considerations regarding standards compliance and interoperability, and near term goals. We will also discuss opportunities to actively engage with the GIBS team on this topic to continually improve our services.

  12. Modelling and simulation of cure in pultrusion processes

    NASA Astrophysics Data System (ADS)

    Tucci, F.; Rubino, F.; Paradiso, V.; Carlone, P.; Valente, R.

    2017-10-01

A trial-and-error approach is not suitable for optimizing the pultrusion process because of the long start-up times and the wide range of possible matrix and reinforcement combinations. Numerical approaches, on the other hand, make it practical to test different parameter configurations. One of the main tasks in pultrusion is to obtain complete and homogeneous resin polymerization. The formation of cross-links between polymer chains is thermally induced, but it generates strong exothermic heat, so the thermal and chemical phenomena affect each other and the two problems must be modelled in a coupled way. The mathematical model used in this work treats the composite as a lumped material whose thermal and mechanical properties are evaluated as functions of the resin and fiber properties. The numerical scheme is based on a quasi-static approach in a three-dimensional Eulerian domain that describes both the thermal and the chemical phenomena. The resulting data feed a simplified C.H.I.L.E. (Cure Hardening Instantaneous Linear Elastic) model to compute the mechanical properties of the resin fraction in the pultruded profile. The two combined approaches yield a numerical model that accounts for the normal (no-penetration) and tangential (viscosity/friction) interactions between die and profile, the pulling force, and the hydrostatic pressure of the liquid resin, in order to evaluate the stress and strain fields induced by the process within the pultruded profile. The numerical models were implemented in the ABAQUS finite element suite by means of several user subroutines (in Fortran) that extend the basic capabilities of the software.
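
    The thermally activated cure described above is typically written as a rate law in the degree of cure alpha. A minimal nth-order sketch (the Arrhenius constants and exponent below are generic stand-ins, not the paper's coupled model):

```python
import math

def cure_step(alpha, temp_k, dt, a1=1e5, e1=60e3, n=1.5):
    """One explicit Euler step of a simple nth-order cure-kinetics model:
        d(alpha)/dt = k(T) * (1 - alpha)^n,   k(T) = A * exp(-E / (R * T))
    alpha is the degree of cure in [0, 1]; temp_k is temperature in kelvin."""
    R = 8.314  # gas constant, J/(mol K)
    k = a1 * math.exp(-e1 / (R * temp_k))
    return min(1.0, alpha + k * (1.0 - alpha) ** n * dt)

# Isothermal cure at 160 C (433.15 K): alpha rises monotonically toward 1.
alpha = 0.0
for _ in range(600):
    alpha = cure_step(alpha, temp_k=433.15, dt=1.0)
```

    In the real coupled problem the exothermic term feeds back into the temperature field, which is why the paper solves the thermal and chemical equations together rather than in sequence.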

  13. Monitoring children's health and well-being by indicators and index: apples and oranges or fruit salad?

    PubMed

    Köhler, L

    2016-11-01

The use of indicators is a fast and widely used way to monitor the health and well-being of groups of children. Indicators are useful in research, but they are also important tools for planners and politicians. Although they are constructed to simplify reality, in many reports they still offer a complex and confusing picture, not least by their sheer numbers. Therefore, there is an increasing demand for even further simplification, where the indicators are combined into single summary numbers, composite indices. At the same time, while a composite index summarizes a complex and sometimes elusive process, making it more accessible for advocacy and political interventions, the combining of very dissimilar components makes the results difficult to interpret and use. There is an obvious dilemma between the need for rigour and evidence, the research orientation, and the wish for a simple and summarizing overview of the findings, the policy orientation. Models have been created to form indicator sets, either by combining them by simple addition, by weighting them, or by just leaving them as separate indicators. Most index systems in operation use an equal weighting system after standardization, once the components have been selected. Examples of these models are described, along with their pros and cons, and a summary of suitable ways of handling the problems of indicators and composite indices is offered. Some surveys have taken the best from different approaches, presenting the results as a summary index for the big picture, as subindices for the various domains of child health, and as separate indicators for the detailed study of the basic components. A Swedish Child Health Index is presented as an example of such a solution. © 2016 John Wiley & Sons Ltd.
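
    The equal-weighting-after-standardization scheme that most index systems use can be sketched in a few lines (the indicator names and values below are hypothetical):

```python
import statistics

def composite_index(indicators):
    """Combine indicator series into a composite index by z-score
    standardization followed by equal weighting (the common scheme)."""
    n_units = len(indicators[0])
    z_scores = []
    for series in indicators:
        mu = statistics.mean(series)
        sd = statistics.pstdev(series)
        z_scores.append([(x - mu) / sd for x in series])
    # Equal weights: the index is simply the mean z-score per unit.
    return [sum(z[i] for z in z_scores) / len(z_scores)
            for i in range(n_units)]

# Three hypothetical indicators measured for four regions. Note that
# "lower is better" indicators (e.g. mortality) would be reversed first.
index = composite_index([
    [70, 80, 90, 60],
    [5, 3, 2, 6],
    [0.8, 0.9, 0.95, 0.7],
])
```

    The sketch also makes the article's dilemma concrete: the single index number ranks the regions, but only the separate z-scores reveal which component drives each rank.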

  14. Population-level differences in disease transmission: A Bayesian analysis of multiple smallpox epidemics

    PubMed Central

    Elderd, Bret D.; Dwyer, Greg; Dukic, Vanja

    2013-01-01

    Estimates of a disease’s basic reproductive rate R0 play a central role in understanding outbreaks and planning intervention strategies. In many calculations of R0, a simplifying assumption is that different host populations have effectively identical transmission rates. This assumption can lead to an underestimate of the overall uncertainty associated with R0, which, due to the non-linearity of epidemic processes, may result in a mis-estimate of epidemic intensity and miscalculated expenditures associated with public-health interventions. In this paper, we utilize a Bayesian method for quantifying the overall uncertainty arising from differences in population-specific basic reproductive rates. Using this method, we fit spatial and non-spatial susceptible-exposed-infected-recovered (SEIR) models to a series of 13 smallpox outbreaks. Five outbreaks occurred in populations that had been previously exposed to smallpox, while the remaining eight occurred in Native-American populations that were naïve to the disease at the time. The Native-American outbreaks were close in a spatial and temporal sense. Using Bayesian Information Criterion (BIC), we show that the best model includes population-specific R0 values. These differences in R0 values may, in part, be due to differences in genetic background, social structure, or food and water availability. As a result of these inter-population differences, the overall uncertainty associated with the “population average” value of smallpox R0 is larger, a finding that can have important consequences for controlling epidemics. In general, Bayesian hierarchical models are able to properly account for the uncertainty associated with multiple epidemics, provide a clearer understanding of variability in epidemic dynamics, and yield a better assessment of the range of potential risks and consequences that decision makers face. PMID:24021521
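
    The SEIR structure underlying the fitted models can be sketched with a simple Euler integration; the rates below are illustrative, not the paper's estimates, and for this model R0 = beta/gamma:

```python
def seir_step(s, e, i, r, beta, sigma, gamma, dt=0.1):
    """One Euler step of a basic SEIR model (compartments as population
    fractions). beta: transmission rate; sigma: 1/latent period;
    gamma: 1/infectious period."""
    new_exposed = beta * s * i * dt
    new_infectious = sigma * e * dt
    new_recovered = gamma * i * dt
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_recovered,
            r + new_recovered)

# Simulate an outbreak with R0 = beta/gamma = 5 (a roughly smallpox-like
# magnitude, chosen for illustration only).
state = (0.999, 0.0, 0.001, 0.0)
for _ in range(2000):
    state = seir_step(*state, beta=0.5, sigma=0.2, gamma=0.1)
```

    Because the epidemic's final size depends nonlinearly on beta/gamma, averaging R0 across populations with different transmission rates misstates the outbreak risk, which is the paper's central point.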

  15. Solar Process Heat Basics | NREL

    Science.gov Websites

Commercial and industrial buildings may use solar energy to supply process heat. One simple design uses a black metal panel mounted on a south-facing wall to absorb the sun's heat; air passes through the panel and is warmed. A typical system for nonresidential buildings includes solar collectors that work along with a pump and a heat exchanger.

  16. Curves showing column strength of steel and duralumin tubing

    NASA Technical Reports Server (NTRS)

    Ross, Orrin E

    1929-01-01

    Given here are a set of column strength curves that are intended to simplify the method of determining the size of struts in an airplane structure when the load in the member is known. The curves will also simplify the checking of the strength of a strut if the size and length are known. With these curves, no computations are necessary, as in the case of the old-fashioned method of strut design. The process is so simple that draftsmen or others who are not entirely familiar with mechanics can check the strength of a strut without much danger of error.

  17. A simplified analytical solution for thermal response of a one-dimensional, steady state transpiration cooling system in radiative and convective environment

    NASA Technical Reports Server (NTRS)

    Kubota, H.

    1976-01-01

    A simplified analytical method for calculation of thermal response within a transpiration-cooled porous heat shield material in an intense radiative-convective heating environment is presented. The essential assumptions of the radiative and convective transfer processes in the heat shield matrix are the two-temperature approximation and the specified radiative-convective heatings of the front surface. Sample calculations for porous silica with CO2 injection are presented for some typical parameters of mass injection rate, porosity, and material thickness. The effect of these parameters on the cooling system is discussed.

  18. Spacecraft transformer and inductor design

    NASA Technical Reports Server (NTRS)

    Mclyman, W. T.

    1977-01-01

    The conversion process in spacecraft power electronics requires the use of magnetic components which frequently are the heaviest and bulkiest items in the conversion circuit. This handbook pertains to magnetic material selection, transformer and inductor design tradeoffs, transformer design, iron core dc inductor design, toroidal power core inductor design, window utilization factors, regulation, and temperature rise. Relationships are given which simplify and standardize the design of transformers and the analysis of the circuits in which they are used. The interactions of the various design parameters are also presented in simplified form so that tradeoffs and optimizations may easily be made.

  19. Developing a simplified consent form for biobanking.

    PubMed

    Beskow, Laura M; Friedman, Joëlle Y; Hardy, N Chantelle; Lin, Li; Weinfurt, Kevin P

    2010-10-08

    Consent forms have lengthened over time and become harder for participants to understand. We sought to demonstrate the feasibility of creating a simplified consent form for biobanking that comprises the minimum information necessary to meet ethical and regulatory requirements. We then gathered preliminary data concerning its content from hypothetical biobank participants. We followed basic principles of plain-language writing and incorporated into a 2-page form (not including the signature page) those elements of information required by federal regulations and recommended by best practice guidelines for biobanking. We then recruited diabetes patients from community-based practices and randomized half (n = 56) to read the 2-page form, first on paper and then a second time on a tablet computer. Participants were encouraged to use "More information" buttons on the electronic version whenever they had questions or desired further information. These buttons led to a series of "Frequently Asked Questions" (FAQs) that contained additional detailed information. Participants were asked to identify specific sentences in the FAQs they thought would be important if they were considering taking part in a biorepository. On average, participants identified 7 FAQ sentences as important (mean 6.6, SD 14.7, range: 0-71). No one sentence was highlighted by a majority of participants; further, 34 (60.7%) participants did not highlight any FAQ sentences. Our preliminary findings suggest that our 2-page form contains the information that most prospective participants identify as important. Combining simplified forms with supplemental material for those participants who desire more information could help minimize consent form length and complexity, allowing the most substantively material information to be better highlighted and enabling potential participants to read the form and ask questions more effectively.

  20. Impact and Cost-effectiveness of 3 Doses of 9-Valent Human Papillomavirus (HPV) Vaccine Among US Females Previously Vaccinated With 4-Valent HPV Vaccine.

    PubMed

    Chesson, Harrell W; Laprise, Jean-François; Brisson, Marc; Markowitz, Lauri E

    2016-06-01

We estimated the potential impact and cost-effectiveness of providing 3 doses of nonavalent human papillomavirus (HPV) vaccine (9vHPV) to females aged 13-18 years who had previously completed a series of quadrivalent HPV vaccine (4vHPV), a strategy we refer to as "additional 9vHPV vaccination." We used 2 distinct models: (1) the simplified model, which is among the most basic of the published dynamic HPV models, and (2) the US HPV-ADVISE model, a complex, stochastic, individual-based transmission-dynamic model. When assuming no 4vHPV cross-protection, the incremental cost per quality-adjusted life-year (QALY) gained by additional 9vHPV vaccination was $146 200 in the simplified model and $108 200 in the US HPV-ADVISE model ($191 800 when assuming 4vHPV cross-protection). In 1-way sensitivity analyses in the scenario of no 4vHPV cross-protection, the simplified model results ranged from $70 300 to $182 000, and the US HPV-ADVISE model results ranged from $97 600 to $118 900. The average cost per QALY gained by additional 9vHPV vaccination exceeded $100 000 in both models. However, the results varied considerably in sensitivity and uncertainty analyses. Additional 9vHPV vaccination is likely not as efficient as many other potential HPV vaccination strategies, such as increasing primary 9vHPV vaccine coverage. Published by Oxford University Press for the Infectious Diseases Society of America 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
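
    The cost-per-QALY figures quoted above are incremental cost-effectiveness ratios (ICERs). A sketch with hypothetical inputs chosen only to land on the simplified model's $146,200 figure:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra dollars spent per
    quality-adjusted life-year (QALY) gained by the new strategy."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical totals: the new strategy costs $14.62M more and gains
# 100 QALYs relative to the comparator.
ratio = icer(20_620_000, 1_100, 6_000_000, 1_000)  # → 146200.0
```

    Because the ratio divides two model-derived differences, uncertainty in either numerator or denominator propagates directly into the wide sensitivity ranges the abstract reports.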

  1. SOPanG: online text searching over a pan-genome.

    PubMed

    Cislak, Aleksander; Grabowski, Szymon; Holub, Jan

    2018-06-22

The many thousands of high-quality genomes available nowadays imply a shift from single-genome to pan-genomic analyses. A basic algorithmic building block for such a scenario is online search over a collection of similar texts, a problem with surprisingly few solutions presented so far. We present SOPanG, a simple tool for exact pattern matching over an elastic-degenerate string, a recently proposed simplified model for the pan-genome. Thanks to bit-parallelism, it achieves pattern matching speeds above 400 MB/s, more than an order of magnitude higher than that of other software. SOPanG is available for free from: https://github.com/MrAlexSee/sopang. Supplementary data are available at Bioinformatics online.
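
    The bit-parallelism SOPanG exploits can be illustrated with the classic Shift-And algorithm over a plain string (a simplification: SOPanG matches over elastic-degenerate strings, not plain text, but the word-level trick is the same):

```python
def shift_and(text, pattern):
    """Bit-parallel exact pattern matching (Shift-And).

    Bit j of the state word is set when pattern[0..j] matches the text
    ending at the current position; one machine-word operation advances
    all candidate prefixes at once. Returns match start positions."""
    m = len(pattern)
    # Precompute a bitmask per character: bit j set if pattern[j] == c.
    masks = {}
    for j, c in enumerate(pattern):
        masks[c] = masks.get(c, 0) | (1 << j)
    state, accept, hits = 0, 1 << (m - 1), []
    for pos, c in enumerate(text):
        state = ((state << 1) | 1) & masks.get(c, 0)
        if state & accept:
            hits.append(pos - m + 1)
    return hits

print(shift_and("ACGTACGT", "CGT"))  # → [1, 5]
```

    Processing each text character with a constant number of word operations, independent of how many prefixes are alive, is what pushes throughput into the hundreds of MB/s.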

  2. Resizing procedure for optimum design of structures under combined mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Narayanaswami, R.

    1976-01-01

    An algorithm is reported for resizing structures subjected to combined thermal and mechanical loading. The algorithm is applicable to uniaxial stress elements (rods) and membrane biaxial stress members. Thermal Fully Stressed Design (TFSD) is based on the basic difference between mechanical and thermal stresses in their response to resizing. The TFSD technique is found to converge in fewer iterations than ordinary fully stressed design for problems where thermal stresses are comparable to the mechanical stresses. The improved convergence is demonstrated by example with a study of a simplified wing structure, built-up with rods and membranes and subjected to a combination of mechanical loads and a three dimensional temperature distribution.

  3. A pilot training manual for the terminal configured vehicle electronic horizontal situation indicator

    NASA Technical Reports Server (NTRS)

    Houck, J. A.

    1981-01-01

The initial phase of a training program for the Terminal Configured Vehicle Electronic Horizontal Situation Indicator (EHSI) is presented. The EHSI and its symbology are introduced and interpretation of the symbols is explained. Basic symbols shown on the display at all times are first presented. Additional optional symbols to be used as appropriate during different portions of a flight are then introduced and various display configurations interpreted. The upper half of each page is a reproduction of the EHSI display or other pertinent instructional material and the bottom half contains explanatory text, simplifying production of an audiovisual package for use with large training classes. Two quizzes on the course material are included.

  4. The computation of standard solar models

    NASA Technical Reports Server (NTRS)

    Ulrich, Roger K.; Cox, Arthur N.

    1991-01-01

    Procedures for calculating standard solar models with the usual simplifying approximations of spherical symmetry, no mixing except in the surface convection zone, no mass loss or gain during the solar lifetime, and no separation of elements by diffusion are described. The standard network of nuclear reactions among the light elements is discussed including rates, energy production and abundance changes. Several of the equation of state and opacity formulations required for the basic equations of mass, momentum and energy conservation are presented. The usual mixing-length convection theory is used for these results. Numerical procedures for calculating the solar evolution, and current evolution and oscillation frequency results for the present sun by some recent authors are given.

  5. The ANMLite Language and Logic for Specifying Planning Problems

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Siminiceanu, Radu I.; Munoz, Cesar A.

    2007-01-01

    We present the basic concepts of the ANMLite planning language. We discuss various aspects of specifying a plan in terms of constraints and checking the existence of a solution with the help of a model checker. The constructs of the ANMLite language have been kept as simple as possible in order to reduce complexity and simplify the verification problem. We illustrate the language with a specification of the space shuttle crew activity model that was constructed under the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project. The main purpose of this study was to explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language.

  6. An improved task-role-based access control model for G-CSCW applications

    NASA Astrophysics Data System (ADS)

    He, Chaoying; Chen, Jun; Jiang, Jie; Han, Gang

    2005-10-01

    Access control is an important and popular security mechanism for multi-user applications. GIS-based Computer Supported Cooperative Work (G-CSCW) application is one of such applications. This paper presents an improved Task-Role-Based Access Control (X-TRBAC) model for G-CSCW applications. The new model inherits the basic concepts of the old ones, such as role and task. Moreover, it has introduced two concepts, i.e. object hierarchy and operation hierarchy, and the corresponding rules to improve the efficiency of permission definition in access control models. The experiments show that the method can simplify the definition of permissions, and it is more applicable for G-CSCW applications.

  7. An RC active filter design handbook

    NASA Technical Reports Server (NTRS)

    Deboo, G. J.

    1977-01-01

    The design of filters is described. Emphasis is placed on simplified procedures that can be used by the reader who has minimum knowledge about circuit design and little acquaintance with filter theory. The handbook has three main parts. The first part is a review of some information that is essential for work with filters. The second part includes design information for specific types of filter circuitry and describes simple procedures for obtaining the component values for a filter that will have a desired set of characteristics. Pertinent information relating to actual performance is given. The third part (appendix) is a review of certain topics in filter theory and is intended to provide some basic understanding of how filters are designed.
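    To give a flavour of this kind of simplified procedure, here is a standard equal-component Sallen-Key low-pass recipe (a textbook result offered as an illustration, not necessarily the handbook's own procedure): with R1 = R2 = R and C1 = C2 = C, the cutoff is f_c = 1/(2*pi*R*C) and the quality factor is set by the amplifier gain K through Q = 1/(3 - K).

```python
import math

# Illustrative sketch of a simplified design procedure: pick C, compute R for
# the desired cutoff, then set the amplifier gain for the desired Q.
# Equal-component Sallen-Key low-pass: f_c = 1/(2*pi*R*C), Q = 1/(3 - K).

def sallen_key_equal_rc(f_c, C, Q):
    """Return (R, K) for a desired cutoff f_c [Hz], chosen C [F], and Q."""
    R = 1.0 / (2.0 * math.pi * f_c * C)
    K = 3.0 - 1.0 / Q          # amplifier gain that realises the requested Q
    return R, K

# example: 1 kHz cutoff with 10 nF capacitors, Q = 0.707 (Butterworth-like)
R, K = sallen_key_equal_rc(f_c=1000.0, C=10e-9, Q=0.707)
```

    For the example values this yields R of roughly 15.9 kOhm and a gain just under 1.6, which is the sort of two-line calculation the handbook's tables reduce filter design to.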

  8. Kernelization

    NASA Astrophysics Data System (ADS)

    Fomin, Fedor V.

    Preprocessing (data reduction or kernelization) as a strategy for coping with hard problems is used in almost every practical implementation. The history of preprocessing, such as applying reduction rules to simplify truth functions, can be traced back to the 1950's [6]. A natural question in this regard is how to measure the quality of preprocessing rules proposed for a specific problem. For a long time the mathematical analysis of polynomial-time preprocessing algorithms was neglected. The basic reason for this anomaly was that, if we start with an instance I of an NP-hard problem and can show that in polynomial time we can replace it with an equivalent instance I' with |I'| < |I|, then repeating the shrinking step would solve the problem in polynomial time, and that would imply P = NP in classical complexity.
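    A classic concrete example of such reduction rules (our illustration, not taken from the chapter) is Buss's kernelization for Vertex Cover parameterised by k: any vertex of degree greater than k must belong to the cover, and after exhaustively applying this rule a yes-instance can have at most k*k edges. Isolated vertices vanish automatically here because the graph is represented by its edge set.

```python
# Minimal sketch of kernelization: Buss's reduction rule for Vertex Cover.
# Any vertex of degree > k is forced into the cover (otherwise all of its
# > k neighbours would be needed), so take it and decrease k.  A reduced
# yes-instance has at most k*k edges, i.e. a polynomial kernel.

def buss_kernel(edges, k):
    """Return (reduced_edges, k', feasible) for Vertex Cover with parameter k."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:                      # rule: v must be in the cover
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    feasible = k >= 0 and len(edges) <= k * k
    return edges, k, feasible
```

    On a star with five leaves and k = 2, the rule immediately takes the centre, leaving an empty graph with k = 1, so preprocessing alone settles the instance.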

  9. Tinkering With AGCMs To Investigate Atmospheric Behavior

    NASA Astrophysics Data System (ADS)

    Bitz, C. M.

    2014-12-01

    My experience teaching a course in global climate modeling has shown that students (and instructors) with wide-ranging backgrounds in earth science learn effectively about the complexity of climate by tinkering with model components. As an example, I will present a series of experiments in an AGCM with highly simplified geometries for ocean and land to test the response of the atmosphere to variations in basic parameters. The figure below shows an example of how the zonal wind changes with surface roughness and orography. The pinnacle of the experiments explored in my course was the outcome of a homework assignment in which students reduced the cloud droplet radius by 40% over the ocean, and the results surprised students and instructor alike.

  10. Memory-Intensive Benchmarks: IRAM vs. Cache-Based Machines

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Gaeke, Brian R.; Husbands, Parry; Li, Xiaoye S.; Oliker, Leonid; Yelick, Katherine A.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The increasing gap between processor and memory performance has led to new architectural models for memory-intensive applications. In this paper, we explore the performance of a set of memory-intensive benchmarks and use them to compare the performance of conventional cache-based microprocessors to a mixed logic and DRAM processor called VIRAM. The benchmarks are based on problem statements, rather than specific implementations, and in each case we explore the fundamental hardware requirements of the problem, as well as alternative algorithms and data structures that can help expose fine-grained parallelism or simplify memory access patterns. The benchmarks are characterized by their memory access patterns, their basic control structures, and their ratio of computation to memory operations.
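    The computation-to-memory ratio mentioned above can be made concrete with a back-of-envelope sketch (our illustration, not one of the paper's benchmarks): for an AXPY-style kernel, counting flops against bytes moved shows immediately that the kernel is memory-bound, which is exactly the regime where a high-bandwidth design such as VIRAM is expected to help.

```python
# Back-of-envelope characterisation of a kernel by arithmetic intensity
# (flops per byte of memory traffic).  For y = a*x + y over n doubles:
# 2n flops, and 3n eight-byte accesses (read x[i], read y[i], write y[i]).

def arithmetic_intensity(flops, bytes_moved):
    return flops / bytes_moved

def axpy_intensity(n):
    flops = 2 * n              # one multiply and one add per element
    bytes_moved = 3 * n * 8    # read x[i], read y[i], write y[i]
    return arithmetic_intensity(flops, bytes_moved)
```

    The result, 1/12 flop per byte regardless of n, means performance is set almost entirely by memory bandwidth rather than by the arithmetic units.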

  11. Modeling of Radiative Heat Transfer in an Electric Arc Furnace

    NASA Astrophysics Data System (ADS)

    Opitz, Florian; Treffinger, Peter; Wöllenstein, Jürgen

    2017-12-01

    Radiation is an important means of heat transfer inside an electric arc furnace (EAF). To gain insight into the complex processes of heat transfer inside the EAF vessel, not only radiation from the surfaces but also emission and absorption of the gas phase and the dust cloud need to be considered. Furthermore, the radiative heat exchange depends on the geometrical configuration which is continuously changing throughout the process. The present paper introduces a system model of the EAF which takes into account the radiative heat transfer between the surfaces and the participating medium. This is attained by the development of a simplified geometrical model, the use of a weighted-sum-of-gray-gases model, and a simplified consideration of dust radiation. The simulation results were compared with the data of real EAF plants available in literature.
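    The weighted-sum-of-gray-gases idea can be sketched in a few lines. The weights and absorption coefficients below are placeholders for illustration, not the values used in the paper: total emissivity is a weighted sum over a few gray gases, eps = sum_i a_i * (1 - exp(-k_i * pL)), with the weights summing to less than one (the remainder acts as a transparent "clear gas").

```python
import math

# Illustrative weighted-sum-of-gray-gases (WSGG) emissivity sketch.
# eps = sum_i a_i * (1 - exp(-k_i * pL)) over a small number of gray gases.

def wsgg_emissivity(pL, weights, kappas):
    """Total emissivity for partial-pressure path length pL [bar*m]."""
    return sum(a * (1.0 - math.exp(-k * pL))
               for a, k in zip(weights, kappas))

# Placeholder coefficients for a 3-gray-gas fit (assumed, for illustration)
A = [0.4, 0.2, 0.1]     # gray-gas weights; remainder is the clear gas
K = [0.5, 5.0, 50.0]    # absorption coefficients [1/(bar*m)]

eps = wsgg_emissivity(2.0, A, K)
```

    The three exponentials saturate on very different path lengths, which is what lets a handful of gray gases approximate a real gas spectrum over the whole optical-depth range.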

  12. Climate Change, Nutrition, and Bottom-Up and Top-Down Food Web Processes.

    PubMed

    Rosenblatt, Adam E; Schmitz, Oswald J

    2016-12-01

    Climate change ecology has focused on climate effects on trophic interactions through the lenses of temperature effects on organismal physiology and phenological asynchronies. Trophic interactions are also affected by the nutrient content of resources, but this topic has received less attention. Using concepts from nutritional ecology, we propose a conceptual framework for understanding how climate affects food webs through top-down and bottom-up processes impacted by co-occurring environmental drivers. The framework integrates climate effects on consumer physiology and feeding behavior with effects on resource nutrient content. It illustrates how studying responses of simplified food webs to simplified climate change might produce erroneous predictions. We encourage greater integrative complexity of climate change research on trophic interactions to resolve patterns and enhance predictive capacities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doughty, Benjamin; Simpson, Mary Jane; Yang, Bin

    Our work aims to simplify multi-dimensional femtosecond transient absorption microscopy (TAM) data into decay associated amplitude maps that describe the spatial distributions of dynamical processes occurring on various characteristic timescales. Application of this method to TAM data obtained from a model methyl-ammonium lead iodide (CH3NH3PbI3) perovskite thin film allows us to simplify a dataset consisting of 68 time-resolved images into 4 decay associated amplitude maps. Furthermore, these maps provide a simple means to visualize the complex electronic excited-state dynamics in this system by separating distinct dynamical processes evolving on characteristic timescales into individual spatial images. This approach provides new insight into subtle aspects of ultrafast relaxation dynamics associated with excitons and charge carriers in the perovskite thin film, which have recently been found to coexist at spatially distinct locations.
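    The core idea can be sketched as follows (our reconstruction of the concept, not the authors' processing code): once the characteristic timescales are fixed, the amplitude of each decay component at a pixel follows from linear least squares, so a stack of time-resolved images reduces to one amplitude map per timescale.

```python
import math

# Sketch: fit signal(t) ~ a1*exp(-t/tau1) + a2*exp(-t/tau2) per pixel by
# solving the 2x2 normal equations, producing decay associated amplitude maps.

def decay_amplitudes(signal, times, tau1, tau2):
    """Return (a1, a2) for a two-exponential fit with fixed timescales."""
    b1 = [math.exp(-t / tau1) for t in times]
    b2 = [math.exp(-t / tau2) for t in times]
    s11 = sum(x * x for x in b1)
    s12 = sum(x * y for x, y in zip(b1, b2))
    s22 = sum(y * y for y in b2)
    r1 = sum(x * v for x, v in zip(b1, signal))
    r2 = sum(y * v for y, v in zip(b2, signal))
    det = s11 * s22 - s12 * s12
    return (s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det

def amplitude_maps(stack, times, tau1, tau2):
    """stack[frame][pixel] -> two per-pixel amplitude maps."""
    n_pix = len(stack[0])
    maps = ([0.0] * n_pix, [0.0] * n_pix)
    for p in range(n_pix):
        a1, a2 = decay_amplitudes([frame[p] for frame in stack],
                                  times, tau1, tau2)
        maps[0][p], maps[1][p] = a1, a2
    return maps
```

    Each returned map shows where in the image the corresponding timescale contributes, which is exactly the separation of processes the abstract describes.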

  14. Testing of a simplified LED based vis/NIR system for rapid ripeness evaluation of white grape (Vitis vinifera L.) for Franciacorta wine.

    PubMed

    Giovenzana, Valentina; Civelli, Raffaele; Beghi, Roberto; Oberti, Roberto; Guidetti, Riccardo

    2015-11-01

    The aim of this work was to test a simplified optical prototype for rapid estimation of the ripening parameters of white grape for Franciacorta wine directly in the field. Spectral acquisition based on reflectance at four wavelengths (630, 690, 750 and 850 nm) was proposed. The integration of a simple processing algorithm in the microcontroller software would allow real-time values of spectral reflectance to be visualized. Non-destructive analyses were carried out on 95 grape bunches, for a total of 475 berries. Samplings were performed weekly during the last ripening stages. Optical measurements were carried out both with the simplified system and with a portable commercial vis/NIR spectrophotometer as a reference instrument for performance comparison. Chemometric analyses were performed in order to extract the maximum useful information from the optical data. Principal component analysis (PCA) was performed for a preliminary evaluation of the data. Correlations between the optical data matrix and ripening parameters (total soluble solids content, SSC; titratable acidity, TA) were computed using partial least squares (PLS) regression for the spectra and multiple linear regression (MLR) for data from the simplified device. Classification analyses were also performed with the aim of discriminating ripe from unripe samples. PCA, MLR and classification analyses show the effectiveness of the simplified system in separating samples among different sampling dates and in discriminating ripe from unripe samples. Finally, simple equations for SSC and TA prediction were calculated. Copyright © 2015 Elsevier B.V. All rights reserved.
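    The MLR step for the four-wavelength device can be sketched generically (the four wavelengths are the paper's; the data, coefficients and helper names below are synthetic, not the authors' calibration): an ordinary-least-squares fit with an intercept maps the four reflectance readings to a ripeness parameter such as SSC.

```python
# Hedged sketch of the MLR calibration: ordinary least squares via the
# normal equations, solved by Gaussian elimination with partial pivoting.

def fit_mlr(X, y):
    """Fit y ~ b0 + b1*x1 + ... ; X is a list of feature rows."""
    rows = [[1.0] + list(r) for r in X]          # prepend intercept column
    n = len(rows[0])
    # normal equations  (X^T X) beta = X^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    for col in range(n):                          # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * n                              # back substitution
    for i in reversed(range(n)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, n))) / A[i][i]
    return beta

def predict(beta, row):
    """Predicted ripeness value (e.g. SSC) from four reflectance readings."""
    return beta[0] + sum(w * x for w, x in zip(beta[1:], row))
```

    In practice the four features would be the reflectances at 630, 690, 750 and 850 nm and y the laboratory SSC or TA values; the fit then yields the kind of simple prediction equation the abstract mentions.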

  15. Connecting the Library's Patron Database to Campus Administrative Software: Simplifying the Library's Accounts Receivable Process

    ERIC Educational Resources Information Center

    Oliver, Astrid; Dahlquist, Janet; Tankersley, Jan; Emrich, Beth

    2010-01-01

    This article discusses the processes that occurred when the Library, Controller's Office, and Information Technology Department agreed to create an interface between the Library's Innovative Interfaces patron database and campus administrative software, Banner, using file transfer protocol, in an effort to streamline the Library's accounts…

  16. Seven Ways to Streamline Student Services

    ERIC Educational Resources Information Center

    Fredette, Michelle

    2011-01-01

    To meet the expectations of today's tech-savvy students, colleges and universities are looking for ways to speed up their processes and provide better services for their No. 1 customer. They have turned to technology to simplify processes, reduce costs, and meet the high expectations of a technically literate student body. In this article, the…

  17. Strategies of Clarification in Judges' Use of Language: From the Written to the Spoken.

    ERIC Educational Resources Information Center

    Philips, Susan U.

    1985-01-01

    Reports on a study of judges' strategies in clarifying their verbal explanations of constitutional rights to criminal defendants. Identifies six clarification processes and compares them with other studies of clarification processes and with the properties of simplified registers, particularly speech addressed to first- and second-language…

  18. Basic Functional Capabilities for a Military Message Processing Service

    DTIC Science & Technology

    1974-09-01

    BASIC FUNCTIONAL CAPABILITIES FOR A MILITARY MESSAGE PROCESSING SERVICE. Ronald Tugender, et al., University of Southern California. Research report. Keywords: automated message processing, command and control, writer-to-reader service.

  19. Learning Computers, Speaking English: Cooperative Activities for Learning English and Basic Word Processing.

    ERIC Educational Resources Information Center

    Quann, Steve; Satin, Diana

    This textbook leads high-beginning and intermediate English-as-a-Second-Language (ESL) students through cooperative computer-based activities that combine language learning with training in basic computer skills and word processing. Each unit concentrates on a basic concept of word processing while also focusing on a grammar topic. Skills are…

  20. A Simplified Baseband Prefilter Model with Adaptive Kalman Filter for Ultra-Tight COMPASS/INS Integration

    PubMed Central

    Luo, Yong; Wu, Wenqi; Babu, Ravindra; Tang, Kanghua; Luo, Bing

    2012-01-01

    COMPASS is an indigenously developed Chinese global navigation satellite system that will share many features in common with GPS (Global Positioning System). Since ultra-tight GPS/INS (Inertial Navigation System) integration shows its advantage over independent GPS receivers in many scenarios, the federated ultra-tight COMPASS/INS integration has been investigated in this paper, in particular by proposing a simplified prefilter model. Compared with a traditional prefilter model, the state space of this simplified system contains only carrier phase, carrier frequency and carrier frequency rate tracking errors. A two-quadrant arctangent discriminator output is used as the measurement. Since the code-tracking-error parameters present in traditional prefilter models were excluded from the state space, code/carrier divergence would destroy the carrier tracking process, and therefore an adaptive Kalman filter algorithm that tunes the process noise covariance matrix based on the state correction sequence was incorporated to compensate for the divergence. The federated ultra-tight COMPASS/INS integration was implemented with a hardware COMPASS intermediate frequency (IF) and INS accelerometer and gyroscope signal sampling system. Field and simulation test results showed almost identical tracking and navigation performance for the traditional prefilter model and the proposed system; however, the latter greatly decreased the computational load. PMID:23012564

  1. Improving the Simplified Acquisition of Base Engineering Requirements (SABER) Delivery Order Award Process: Results of a Process Improvement Plan

    DTIC Science & Technology

    1991-09-01

    putting all tasks directed towards achieving an outcome in sequence. The tasks can be viewed as steps in the process (39:2.3). Using this...improvement opportunity is investigated. A plan is developed, root causes are identified, and solutions are tested and implemented. The process is... solutions, check for actual improvement, and integrate the successful improvements into the process. ?UP 7. Check Improvement Performance. Finally, the

  2. Enzyme catalysis with small ionic liquid quantities.

    PubMed

    Fischer, Fabian; Mutschler, Julien; Zufferey, Daniel

    2011-04-01

    Enzyme catalysis with minimal ionic liquid quantities improves reaction rates, stereoselectivity and enables solvent-free processing. In particular the widely used lipases combine well with many ionic liquids. Demonstrated applications are racemate separation, esterification and glycerolysis. Minimal solvent processing is also an alternative to sluggish solvent-free catalysis. The method allows simplified down-stream processing, as only traces of ionic liquids have to be removed.

  3. Development of integrated control system for smart factory in the injection molding process

    NASA Astrophysics Data System (ADS)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we proposed an integrated control system for automation of the injection molding process, as required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be reduced.

  4. Cysteine Racemization on IgG Heavy and Light Chains

    PubMed Central

    Zhang, Qingchun; Flynn, Gregory C.

    2013-01-01

    Under basic pH conditions, the heavy chain 220-light chain 214 (H220-L214) disulfide bond, found in the flexible hinge region of an IgG1, can convert to a thioether. Similar conditions also result in racemization of the H220 cysteine. Here, we report that racemization occurs on both H220 and L214 on an IgG1 with a λ light chain (IgG1λ) but almost entirely on H220 of an IgGl with a κ light chain (IgG1κ) under similar conditions. Likewise, racemization was detected at significant levels on H220 and L214 on endogenous human IgG1λ but only at the H220 position on IgG1κ. Low but measurable levels of d-cysteines were found on IgG2 cysteines in the hinge region, both with monoclonal antibodies incubated under basic pH conditions and on antibodies isolated from human serum. A simplified reaction mechanism involving reversible β-elimination on the cysteine is presented that accounts for both base-catalyzed racemization and thioether formation at the hinge disulfide. PMID:24142697

  5. This Is Rocket Science!

    NASA Astrophysics Data System (ADS)

    Keith, Wayne; Martin, Cynthia; Veltkamp, Pamela

    2013-09-01

    Using model rockets to teach physics can be an effective way to engage students in learning. In this paper, we present a curriculum developed in response to an expressed need for helping high school students review physics equations in preparation for a state-mandated exam. This required a mode of teaching that was more advanced and analytical than that offered by Estes Industries, but more basic than the analysis of Nelson et al. In particular, drag is neglected until the very end of the exercise, which allows the concept of conservation of energy to be shown when predicting the rocket's flight. Also, the variable mass of the rocket motor is assumed to decrease linearly during the flight (while the propulsion charge and recovery delay charge are burning) and handled simplistically by using an average mass value. These changes greatly simplify the equations needed to predict the times and heights at various stages of flight, making the exercise more useful as a review of basic physics. Details about model rocket motors, range safety, and other supplemental information may be found online at Apogee Components and the National Association of Rocketry.
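    The simplified analysis described above can be sketched in a few lines; the motor numbers in the example call are illustrative, not taken from the article.

```python
# Simplified flight prediction: drag neglected, propellant mass replaced by
# its average over the burn, conservation of energy for the coast phase.

G = 9.81  # m/s^2

def predict_flight(m_liftoff, m_prop, thrust, t_burn):
    """Return (v_burnout, h_burnout, h_apogee); constant thrust, no drag."""
    m_avg = m_liftoff - 0.5 * m_prop   # average mass during the burn
    a = thrust / m_avg - G             # constant net acceleration
    v_bo = a * t_burn
    h_bo = 0.5 * a * t_burn ** 2
    # coast: kinetic energy at burnout converts to potential energy
    return v_bo, h_bo, h_bo + v_bo ** 2 / (2.0 * G)

# illustrative (made-up) motor: 100 g liftoff mass, 12 g propellant,
# 6 N average thrust for 1.6 s
v_bo, h_bo, h_apogee = predict_flight(0.1, 0.012, 6.0, 1.6)
```

    Because drag is ignored, the predicted apogee is a deliberate overestimate, which is itself a useful discussion point when students compare the prediction with an actual flight.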

  6. Genetically engineered pigs as models for human disease

    PubMed Central

    Perleberg, Carolin; Kind, Alexander

    2018-01-01

    ABSTRACT Genetically modified animals are vital for gaining a proper understanding of disease mechanisms. Mice have long been the mainstay of basic research into a wide variety of diseases but are not always the most suitable means of translating basic knowledge into clinical application. The shortcomings of rodent preclinical studies are widely recognised, and regulatory agencies around the world now require preclinical trial data from nonrodent species. Pigs are well suited to biomedical research, sharing many similarities with humans, including body size, anatomical features, physiology and pathophysiology, and they already play an important role in translational studies. This role is set to increase as advanced genetic techniques simplify the generation of pigs with precisely tailored modifications designed to replicate lesions responsible for human disease. This article provides an overview of the most promising and clinically relevant genetically modified porcine models of human disease for translational biomedical research, including cardiovascular diseases, cancers, diabetes mellitus, Alzheimer's disease, cystic fibrosis and Duchenne muscular dystrophy. We briefly summarise the technologies involved and consider the future impact of recent technical advances. PMID:29419487

  7. Patterned control of human locomotion

    PubMed Central

    Lacquaniti, Francesco; Ivanenko, Yuri P; Zago, Myrka

    2012-01-01

    There is much experimental evidence for the existence of biomechanical constraints which simplify the problem of control of multi-segment movements. In addition, it has been hypothesized that movements are controlled using a small set of basic temporal components or activation patterns, shared by several different muscles and reflecting global kinematic and kinetic goals. Here we review recent studies on human locomotion showing that muscle activity is accounted for by a combination of few basic patterns, each one timed at a different phase of the gait cycle. Similar patterns are involved in walking and running at different speeds, walking forwards or backwards, and walking under different loading conditions. The corresponding weights of distribution to different muscles may change as a function of the condition, allowing highly flexible control. Biomechanical correlates of each activation pattern have been described, leading to the hypothesis that the co-ordination of limb and body segments arises from the coupling of neural oscillators between each other and with limb mechanical oscillators. Muscle activations need only intervene during limited time epochs to force intrinsic oscillations of the system when energy is lost. PMID:22411012

  8. Patterned control of human locomotion.

    PubMed

    Lacquaniti, Francesco; Ivanenko, Yuri P; Zago, Myrka

    2012-05-15

    There is much experimental evidence for the existence of biomechanical constraints which simplify the problem of control of multi-segment movements. In addition, it has been hypothesized that movements are controlled using a small set of basic temporal components or activation patterns, shared by several different muscles and reflecting global kinematic and kinetic goals. Here we review recent studies on human locomotion showing that muscle activity is accounted for by a combination of few basic patterns, each one timed at a different phase of the gait cycle. Similar patterns are involved in walking and running at different speeds, walking forwards or backwards, and walking under different loading conditions. The corresponding weights of distribution to different muscles may change as a function of the condition, allowing highly flexible control. Biomechanical correlates of each activation pattern have been described, leading to the hypothesis that the co-ordination of limb and body segments arises from the coupling of neural oscillators between each other and with limb mechanical oscillators. Muscle activations need only intervene during limited time epochs to force intrinsic oscillations of the system when energy is lost.

  9. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PSAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc. on design parameters, then the response sensitivities calculated may be of significantly higher accuracy.
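    The basic idea can be caricatured with a Monte Carlo sketch (a toy stand-in for the fast probability integration algorithm, which is far more efficient; the distributions, limit state and numbers below are invented for illustration): evaluate a cheap, simplified response function many times with random inputs to approximate the probability that a performance limit is exceeded.

```python
import random

# Toy probabilistic structural analysis: stress = load / area, failure when
# stress exceeds strength.  All distributions are illustrative assumptions.

def failure_probability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        load = rng.gauss(100.0, 15.0)      # applied load [kN], assumed Normal
        area = rng.gauss(2.0, 0.1)         # section area [cm^2]
        strength = rng.gauss(70.0, 5.0)    # material strength [kN/cm^2]
        if load / area > strength:         # simplified response: stress = P/A
            failures += 1
    return failures / n_samples
```

    For these made-up distributions the failure probability comes out near a few percent; FPI reaches comparable answers with orders of magnitude fewer response evaluations, which is what makes the approach affordable with expensive structural codes.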

  10. 6 Source Categories - Boilers (Proposed Action)

    EPA Pesticide Factsheets

    EPA is proposing options to simplify the Clean Air Act permitting process for certain smaller sources of air pollution commonly found in Indian country. This action would ensure that air quality in Indian country is protected.

  11. A Sequential Shifting Algorithm for Variable Rotor Speed Control

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Edwards, Jason M.; DeCastro, Jonathan A.

    2007-01-01

    A proof of concept of a continuously variable rotor speed control methodology for rotorcraft is described. Variable rotor speed is desirable for several reasons including improved maneuverability, agility, and noise reduction. However, it has been difficult to implement because turboshaft engines are designed to operate within a narrow speed band, and a reliable drive train that can provide continuous power over a wide speed range does not exist. The new methodology proposed here is a sequential shifting control for twin-engine rotorcraft that coordinates the disengagement and engagement of the two turboshaft engines in such a way that the rotor speed may vary over a wide range, but the engines remain within their prescribed speed bands and provide continuous torque to the rotor; two multi-speed gearboxes facilitate the wide rotor speed variation. The shifting process begins when one engine slows down and disengages from the transmission by way of a standard freewheeling clutch mechanism; the other engine continues to apply torque to the rotor. Once one engine disengages, its gear shifts, the multi-speed gearbox output shaft speed resynchronizes and it re-engages. This process is then repeated with the other engine. By tailoring the sequential shifting, the rotor may perform large, rapid speed changes smoothly, as demonstrated in several examples. The emphasis of this effort is on the coordination and control aspects for proof of concept. The engines, rotor, and transmission are all simplified linear models, integrated to capture the basic dynamics of the problem.
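    The shifting sequence described above can be captured as a toy state machine (our illustration, not the paper's controller): each engine in turn disengages via its freewheeling clutch, shifts gear, resynchronises its output shaft speed, and re-engages, so the rotor always receives torque from at least one engine.

```python
# Toy sequential-shift schedule: engines cycle through the shift phases one
# at a time, so at least one engine is always engaged and driving the rotor.

PHASES = ["engaged", "disengaging", "shifting", "resync", "engaging"]

def sequential_shift(n_engines=2):
    """Return the ordered (engine, phase) steps of one full shift sequence."""
    steps = []
    for eng in range(n_engines):
        for phase in PHASES[1:]:
            steps.append((eng, phase))
        steps.append((eng, "engaged"))    # back on line before the next engine
    return steps

def engaged_count_ok(steps, n_engines=2):
    """Check that at least one engine stays engaged at every step."""
    state = {e: "engaged" for e in range(n_engines)}
    for eng, phase in steps:
        state[eng] = phase
        if sum(1 for p in state.values() if p == "engaged") < 1:
            return False
    return True
```

    The invariant checked here, at least one engine engaged at all times, is the property that lets the rotor speed vary widely while torque delivery remains continuous.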

  12. A low-cost PC-based telemetry data-reduction system

    NASA Astrophysics Data System (ADS)

    Simms, D. A.; Butterfield, C. P.

    1990-04-01

    The Solar Energy Research Institute's (SERI) Wind Research Branch is using Pulse Code Modulation (PCM) telemetry data-acquisition systems to study horizontal-axis wind turbines. PCM telemetry systems are used in test installations that require accurate multiple-channel measurements taken from a variety of different locations, and SERI has found them ideal for tests requiring concurrent acquisition of many channels. SERI has developed a low-cost PC-based data-reduction system to facilitate quick, in-the-field multiple-channel data analysis. Called the PC-PCM System, it consists of two basic components. First, AT-compatible hardware boards are used for decoding and combining PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for the DOS operating system was developed to simplify data-acquisition control and management. The software provides a quick, easy-to-use interface between the PC and the PCM data streams. Called the Quick-Look Data Management Program, it is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. This paper describes both hardware and software aspects of the SERI PC-PCM system, concentrating on features that make it useful in a test environment to quickly examine and verify incoming data. Also discussed are problems and techniques associated with PC-based telemetry data acquisition, processing, and real-time display.

  13. Energy alternative for industry: the high-temperature gas-cooled reactor steamer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMain, A.T. Jr.; Blok, F.J.

    1978-04-01

    Large industrial complexes are faced with new requirements that will lead to a transition from such fluid fuels as natural gas and oil to such solid fuels as coal and uranium for supply of industrial energy. Power plants using these latter fuels will be of moderate size (800 to 1200 MW(thermal)) and will generally have the capability of co-generating electric power and process steam. A study has been made regarding use of the 840-MW(thermal) Fort St. Vrain high-temperature gas-cooled reactor (HTGR) design for industrial applications. The initial conceptual design (referred to as the HTGR Steamer) is substantially simplified relative to Fort St. Vrain in that outlet helium and steam temperatures are lower and the reheat section is deleted from the steam generators. The Steamer has four independent steam generating loops producing a total of 277 kg/s (2.2 x 10^6 lb/h) of prime steam at 4.5 MPa/672 K (650 psia/750°F). The unit co-generates 46 MW(electric) and provides process steam at 8.31 MPa/762 K (1200 psia/912°F). The basic configuration and much of the equipment are retained from the Fort St. Vrain design. The system has inherent safety features important for industrial applications. These and other features indicate that the HTGR Steamer is an industrial energy option deserving additional evaluation. Subsequent work will focus on parallel design optimization and application studies.

  14. Population-expression models of immune response

    NASA Astrophysics Data System (ADS)

    Stromberg, Sean P.; Antia, Rustom; Nemenman, Ilya

    2013-06-01

    The immune response to a pathogen has two basic features. The first is the expansion of a few pathogen-specific cells to form a population large enough to control the pathogen. The second is the process of differentiation of cells from an initial naive phenotype to an effector phenotype which controls the pathogen, and subsequently to a memory phenotype that is maintained and responsible for long-term protection. The expansion and the differentiation have been considered largely independently. Changes in cell populations are typically described using ecologically based ordinary differential equation models. In contrast, differentiation of single cells is studied within systems biology and is frequently modeled by considering changes in gene and protein expression in individual cells. Recent advances in experimental systems biology make available for the first time data to allow the coupling of population and high dimensional expression data of immune cells during infections. Here we describe and develop population-expression models which integrate these two processes into systems biology on the multicellular level. When translated into mathematical equations, these models result in non-conservative, non-local advection-diffusion equations. We describe situations where the population-expression approach can make correct inference from data while previous modeling approaches based on common simplifying assumptions would fail. We also explore how model reduction techniques can be used to build population-expression models, minimizing the complexity of the model while keeping the essential features of the system. While we consider problems in immunology in this paper, we expect population-expression models to be more broadly applicable.
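    To make the named equation class concrete, here is a minimal numerical sketch (our construction, not the paper's model) of a non-conservative advection-diffusion equation for a cell density rho(x, t) over a one-dimensional expression coordinate x: advection represents directed differentiation, diffusion represents expression noise, and the non-conservative growth term g*rho represents proliferation, so total cell number is not conserved.

```python
# Explicit finite-difference sketch of rho_t = -(v*rho)_x + D*rho_xx + g*rho.
# The growth term makes the equation non-conservative: total mass grows
# roughly like exp(g*t), mimicking clonal expansion during the response.

def step(rho, v, D, g, dx, dt):
    """Advance the density one time step (crude copy boundaries)."""
    n = len(rho)
    new = rho[:]
    for i in range(1, n - 1):
        adv = -v * (rho[i + 1] - rho[i - 1]) / (2 * dx)       # advection
        diff = D * (rho[i + 1] - 2 * rho[i] + rho[i - 1]) / dx ** 2
        new[i] = rho[i] + dt * (adv + diff + g * rho[i])      # + growth
    new[0], new[-1] = new[1], new[-2]
    return new
```

    Starting from a narrow pulse of naive-like cells, the pulse drifts and spreads along the expression axis while its integral grows, which is the coupling of expansion and differentiation the population-expression models are built to describe.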

  15. Learning at work: competence development or competence-stress.

    PubMed

    Paulsson, Katarina; Ivergård, Toni; Hunt, Brian

    2005-03-01

    Changes in work and the ways in which it is carried out bring a need for upgrading workplace knowledge, skills and competencies. In today's workplaces, and for a number of reasons, workloads are higher than ever and stress is a growing concern (Health Risk Soc. 2(2) (2000) 173; Educat. Psychol. Meas. 61(5) (2001) 866). Increased demand for learning brings a risk that this will be an additional stress factor and thus a risk to health. Our research study is based on the control-demand-support model of Karasek and Theorell (Health Work: Stress, Productivity and the Reconstruction of Working Life, Basic Books/Harper, New York, 1990). We have used this model for our own empirical research with the aim to evaluate the model in the modern workplace. Our research enables us to expand the model in the light of current workplace conditions, especially those relating to learning. We report empirical data from a questionnaire survey of working conditions in two different branches of industry. We are able to define differences between companies in terms of working conditions and competence development. We describe and discuss the effects these conditions have on workplace competence development. Our research results show that increased workers' control of the learning process makes competence development more stimulating, is likely to simplify the work and reduces (learning-related) stress. It is therefore important that learning at work allows employees to control their learning and also allows time for the process of learning and reflection.

  16. How much expert knowledge is it worth to put in conceptual hydrological models?

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Zappa, Massimiliano

    2017-04-01

    Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations of ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and invest most of their knowledge in constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed and qualitative knowledge about processes to obtain a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and to define plausible narrow value ranges for each model parameter. Since the modelling goal is most often exclusively to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfactory results, owing to the equifinality of hydrological models, overfitting problems and the numerous sources of uncertainty affecting runoff simulations. Therefore, to test the extent to which expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' framework, relying on parameter and process constraints defined from expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes for which even a simplified mapping approach can lead to satisfactory results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.

  17. 47 CFR 69.119 - Basic service element expedited approval process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Basic service element expedited approval process. 69.119 Section 69.119 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) ACCESS CHARGES Computation of Charges § 69.119 Basic service element...

  18. Environmental Education Project for Developing Supersphere Characters in Children's Books

    NASA Astrophysics Data System (ADS)

    Vasconcelos, G. F.

    2014-12-01

    To raise awareness and provide basic knowledge about the environment, three children's books have been created to explain, in simplified language, the basic processes of the formation of the Earth, the origin of the universe, the planets and the moon, volcanism, rock formation and the appearance of water, the earliest life forms and their evolution, and the main elements of Earth. These phenomena are represented through the main characters, super heroes designated Superspheres, such as Hydrosphere (water), Lithosphere (rocks), Biosphere (Polite, a stromatolite, and Sarite, the mineral dolomite) and Atmosphere (air), which are the 4 components of the Earth System, and the Pyrosphere (fire), which supplies the energy to drive the Earth System through volcanic activity. The characters have each developed super powers that evolved over geological time as they are transformed. They are the basic elements of nature and appear in a specific chronological order. With the emergence of ancient life in the seas, the Biosphere begins to use the energy of the sun, through the photosynthetic activity of stromatolites, one of the friends of the superheroes, to produce the oxygen for the Atmosphere. Over a vast period of time, the evolution of life continues with the formation of the supercontinent Gondwana. With the arrival of man and his interaction with the Earth, the villain appears in the book as "Homo incorrectus" (incorrect man), who abuses and tries to destroy the super heroes, the "Superspheres". The importance of these characters for the living Earth is emphasized, creating a connection between children and the characters. The aim of the story is to foster a greater ecological conscience in children and to show them that they should help save the "Superspheres", who are in danger and need to be preserved. Projects for primary schools in the State of Rio de Janeiro have been designed and implemented around these characters, focusing on the rescue of each of these "Superspheres" and their importance for the local geological heritage of the region.

  19. Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation

    NASA Astrophysics Data System (ADS)

    Tchiguirinskaia, Ioulia; Scherzer, Daniel

    2016-04-01

    Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. An alternative approach to nonlinear variability is based on a fundamental property of the nonlinear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals, in which extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but several more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. It is therefore no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In a certain sense, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate a cyclone with the help of a 3D ordinary differential system. Well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for the predictability and prediction of processes with long-range dependences.

  20. Concept for a fast analysis method of the energy dissipation at mechanical joints

    NASA Astrophysics Data System (ADS)

    Wolf, Alexander; Brosius, Alexander

    2017-10-01

    When designing hybrid parts and structures, one major challenge is the design, production and quality assessment of the joining points. While the polymeric composites themselves have excellent material properties, the necessary joints are often the weak link in assembled structures. This paper presents a method of measuring and analysing the energy dissipation at mechanical joining points of hybrid parts. A simplified model is applied based on the characteristic response to different excitation frequencies and amplitudes. The dissipation from damage is the result of relative movements between the joining partners and damaged fibres within the composite, whereas the visco-elastic material behaviour causes the intrinsic dissipation. The ambition is to transfer these research findings to the characterisation of mechanical joints in order to quickly assess the general quality of a joint with this non-destructive testing method. The inherent challenge in realising this method is the correct interpretation of the measured energy dissipation and its attribution to either a bad joining point or intrinsic material properties. In this paper the authors present the concept for energy dissipation measurements at different joining points. By inverse analysis, a simplified, fast, semi-analytical model will be developed that allows for a quick basic quality assessment of a given joining point.

  1. Studying Weather and Climate Using Atmospheric Retrospective Analyses

    NASA Astrophysics Data System (ADS)

    Bosilovich, M. G.

    2014-12-01

    Over the last 35 years, tremendous amounts of satellite observations of the Earth's atmosphere have been collected alongside the much longer and more diverse record of in situ measurements. The satellite data records have disparate qualities, structures and uncertainties, which makes comparing weather from the 1980s and the 2000s a challenging prospect. Likewise, in situ data records lack complete coverage of the Earth in both space and time. Atmospheric reanalyses combine the observations with numerical models and data assimilation to produce continuous and consistent weather data records for periods longer than decades. The result is a simplified data format with a relatively straightforward learning curve and many more variables available (through the modeling component of the system), yet driven by a full suite of observational data. This simplified format offers an accessible entry point into weather and climate data analysis. Some examples are provided from undergraduate meteorology program internship projects. We will present the students' progression through the projects, from their initial understanding and competencies to some final results and the skills learned along the way. Reanalyses are a leading research tool in weather and climate, but they can also provide an introductory experience, allowing students to develop an understanding of the physical system while learning basic programming and analysis skills.

  2. A Process for the Critical Analysis of Instructional Theory

    ERIC Educational Resources Information Center

    Bostwick, Jay A.; Calvert, Isaac Wade; Francis, Jenifer; Hawkley, Melissa; Henrie, Curtis R.; Hyatt, Frederick R.; Juncker, Janeel; Gibbons, Andrew S.

    2014-01-01

    Some have argued for a common language in the field of instructional design in an effort to reduce misunderstandings and simplify a multitude of synonymous terms and concepts. Others feel that this goal is undesirable in that it precludes development and flexibility. In this article we propose an ontology-building process as a way for readers to…

  3. Etching and Growth of GaAs

    NASA Technical Reports Server (NTRS)

    Seabaugh, A. C.; Mattauch, R. J.

    1983-01-01

    In-place process for etching and growth of gallium arsenide calls for presaturation of etch and growth melts by arsenic source crystal. Procedure allows precise control of thickness of etch and newly grown layer on substrate. Etching and deposition setup is expected to simplify processing and improve characteristics of gallium arsenide lasers, high-frequency amplifiers, and advanced integrated circuits.

  4. A Rotating Plug Model of Friction Stir Welding Heat Transfer

    NASA Technical Reports Server (NTRS)

    Raghulapadu J. K.; Peddieson, J.; Buchanan, G. R.; Nunes, A. C.

    2006-01-01

    A simplified rotating plug model is employed to study the heat transfer phenomena associated with the friction stir welding process. An approximate analytical solution is obtained based on this idealized model and used both to demonstrate the qualitative influence of process parameters on predictions and to estimate temperatures produced in typical friction stir welding situations.

  5. Cutting the Cost of New Community College Facilities: Streamlining the Facilities Approval Process. Commission on Innovation Policy Discussion Paper Number 3.

    ERIC Educational Resources Information Center

    BW Associates, Berkeley, CA.

    Intended to provide background information and preliminary options for the California Community Colleges' Commission on Innovation, this document proposes that approval processes for new facilities be simplified and that restrictions on the lease or purchase of off-campus facilities be eased. Following introductory materials detailing the…

  6. A Debugger for Computational Grid Applications

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele

    2000-01-01

    The p2d2 project at NAS has built a debugger for applications running on heterogeneous computational grids. It employs a client-server architecture to simplify the implementation. Its user interface has been designed to provide process control and state examination functions on a computation containing a large number of processes. It can find processes participating in distributed computations even when those processes were not created under debugger control. These process identification techniques work both on conventional distributed executions as well as those on a computational grid.

  7. Guidance on including ITS elements in transportation projects

    DOT National Transportation Integrated Search

    2001-01-01

    The purpose of this document is to provide guidance for including ITS equipment and technologies as part of traditional transportation construction or maintenance projects. This document is not intended to simplify the planning process, rather it is ...

  8. Simplifying the construction of domain-specific automatic programming systems: The NASA automated software development workstation project

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1987-01-01

    An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.

  9. Simplifying the construction of domain-specific automatic programming systems: The NASA automated software development workstation project

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1988-01-01

    An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.

  10. Simplification of femtosecond transient absorption microscopy data from CH3NH3PbI3 perovskite thin films into decay associated amplitude maps

    NASA Astrophysics Data System (ADS)

    Doughty, Benjamin; Simpson, Mary Jane; Yang, Bin; Xiao, Kai; Ma, Ying-Zhong

    2016-03-01

    This work aims to simplify multi-dimensional femtosecond transient absorption microscopy (TAM) data into decay associated amplitude maps (DAAMs) that describe the spatial distributions of dynamical processes occurring on various characteristic timescales. Application of this method to TAM data obtained from a model methyl-ammonium lead iodide (CH3NH3PbI3) perovskite thin film allows us to simplify the data set comprising 68 time-resolved images into four DAAMs. These maps offer a simple means to visualize the complex electronic excited-state dynamics in this system by separating distinct dynamical processes evolving on characteristic timescales into individual spatial images. This approach provides new insight into subtle aspects of ultrafast relaxation dynamics associated with excitons and charge carriers in the perovskite thin film, which have recently been found to coexist at spatially distinct locations.
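The decomposition into decay associated amplitude maps can be sketched as a per-pixel linear fit against a basis of exponential decays with fixed lifetimes. The function name, the lifetimes, and the synthetic data stack below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def decay_associated_amplitude_maps(data, times, taus):
    """Decompose a TAM stack (n_times, ny, nx) into one amplitude map per
    fixed lifetime tau, via linear least squares at each pixel."""
    # Basis of exponential decays sampled at the measurement delays.
    basis = np.exp(-np.outer(times, 1.0 / np.asarray(taus)))  # (n_times, n_taus)
    nt, ny, nx = data.shape
    flat = data.reshape(nt, ny * nx)
    amps, *_ = np.linalg.lstsq(basis, flat, rcond=None)       # (n_taus, ny*nx)
    return amps.reshape(len(taus), ny, nx)

# Synthetic demo: two spatial regions decaying with different lifetimes.
times = np.linspace(0.0, 10.0, 68)
taus = [1.0, 5.0]
ny = nx = 4
true = np.zeros((2, ny, nx))
true[0, :, :2] = 1.0   # fast component on the left half
true[1, :, 2:] = 1.0   # slow component on the right half
stack = np.einsum('tk,kyx->tyx',
                  np.exp(-np.outer(times, 1.0 / np.array(taus))), true)
maps = decay_associated_amplitude_maps(stack, times, taus)
```

Each returned map shows where in the image the corresponding timescale contributes, which is the spatial separation of dynamical processes the abstract describes.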

  11. A computationally efficient description of heterogeneous freezing: A simplified version of the Soccer ball model

    NASA Astrophysics Data System (ADS)

    Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank

    2014-01-01

    In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
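A minimal sketch of the key idea, assuming a single nucleation site per particle and a purely illustrative nucleation-rate function (not the CNT parameterization used in the paper): the ensemble frozen fraction follows by averaging the per-site survival probability over a Gaussian contact-angle distribution, replacing the original Monte Carlo sampling with a deterministic quadrature.

```python
import numpy as np

def frozen_fraction(t, mu, sigma, rate_fn, n_theta=2001):
    """Ensemble frozen fraction at time t for particles whose single
    nucleation site has a Gaussian-distributed contact angle theta.
    rate_fn(theta) is a hypothetical nucleation rate per site."""
    theta = np.linspace(mu - 6 * sigma, mu + 6 * sigma, n_theta)
    pdf = np.exp(-0.5 * ((theta - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    survive = np.exp(-rate_fn(theta) * t)        # unfrozen probability at each theta
    dtheta = theta[1] - theta[0]
    return 1.0 - np.sum(pdf * survive) * dtheta  # average over the population

# Illustrative rate: nucleation gets exponentially slower at larger theta.
rate = lambda th: 1e3 * np.exp(-10.0 * th)
f1 = frozen_fraction(1.0, mu=1.0, sigma=0.1, rate_fn=rate)
f2 = frozen_fraction(10.0, mu=1.0, sigma=0.1, rate_fn=rate)
```

The quadrature is evaluated once per time step, which is what makes this style of formulation cheap enough for cloud parcel simulations.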

  12. Natural-Annotation-based Unsupervised Construction of Korean-Chinese Domain Dictionary

    NASA Astrophysics Data System (ADS)

    Liu, Wuying; Wang, Lin

    2018-03-01

    Large-scale bilingual parallel resources are important for statistical learning and deep learning in natural language processing. This paper addresses the automatic construction of a Korean-Chinese domain dictionary, and presents a novel unsupervised construction method based on natural annotation in a raw corpus. We first extract all Korean-Chinese word pairs from Korean texts according to natural annotations, then transform the traditional Chinese characters into simplified ones, and finally distill out a bilingual domain dictionary after retrieving the simplified Chinese words in an external Chinese domain dictionary. The experimental results show that our method can automatically and efficiently build multiple Korean-Chinese domain dictionaries.
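The first two steps can be sketched as follows. The annotation pattern "Korean(Chinese)" and the tiny traditional-to-simplified mapping are assumptions for illustration; a real system would use a full conversion table (such as the one behind OpenCC) and the paper's actual extraction rules:

```python
import re

# Tiny illustrative traditional -> simplified mapping; a real system
# would use a complete conversion table.
T2S = {"漢": "汉", "語": "语", "經": "经", "濟": "济",
       "學": "学", "韓": "韩", "國": "国"}

def to_simplified(text):
    return "".join(T2S.get(ch, ch) for ch in text)

def extract_pairs(korean_text):
    """Harvest naturally annotated pairs of the form '한국어단어(中文)'.
    The parenthetical pattern is an assumption about how annotations appear."""
    pairs = []
    for ko, zh in re.findall(r"([가-힣]+)\(([\u4e00-\u9fff]+)\)", korean_text):
        pairs.append((ko, to_simplified(zh)))
    return pairs

text = "경제학(經濟學)은 한국어(韓國語) 자료로 연구된다."
pairs = extract_pairs(text)
```

The final distillation step would then keep only pairs whose simplified Chinese side appears in an external Chinese domain dictionary.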

  13. Making Cancer Health Text on the Internet Easier to Read for Deaf People Who Use American Sign Language.

    PubMed

    Kushalnagar, Poorna; Smith, Scott; Hopper, Melinda; Ryan, Claire; Rinkevich, Micah; Kushalnagar, Raja

    2018-02-01

    People with relatively limited English language proficiency find the Internet's cancer and health information difficult to access and understand. The presence of unfamiliar words and complex grammar make this particularly difficult for Deaf people. Unfortunately, current technology does not support low-cost, accurate translations of online materials into American Sign Language. However, current technology is relatively more advanced in allowing text simplification, while retaining content. This research team developed a two-step approach for simplifying cancer and other health text. They then tested the approach, using a crossover design with a sample of 36 deaf and 38 hearing college students. Results indicated that hearing college students did well on both the original and simplified text versions. Deaf college students' comprehension, in contrast, significantly benefitted from the simplified text. This two-step translation process offers a strategy that may improve the accessibility of Internet information for Deaf, as well as other low-literacy individuals.

  14. Basic visual perceptual processes in children with typical development and cerebral palsy: The processing of surface, length, orientation, and position.

    PubMed

    Schmetz, Emilie; Magis, David; Detraux, Jean-Jacques; Barisnikov, Koviljka; Rousselle, Laurence

    2018-03-02

    The present study aims to assess how the processing of basic visual perceptual (VP) components (length, surface, orientation, and position) develops in typically developing (TD) children (n = 215, 4-14 years old) and adults (n = 20, 20-25 years old), and in children with cerebral palsy (CP) (n = 86, 5-14 years old) using the first four subtests of the Battery for the Evaluation of Visual Perceptual and Spatial processing in children. Experiment 1 showed that these four basic VP processes follow distinct developmental trajectories in typical development. Experiment 2 revealed that children with CP present global and persistent deficits for the processing of basic VP components when compared with TD children matched on chronological age and nonverbal reasoning abilities.

  15. Effect of spine motion on mobility in quadruped running

    NASA Astrophysics Data System (ADS)

    Chen, Dongliang; Liu, Qi; Dong, Litao; Wang, Hong; Zhang, Qun

    2014-11-01

    Most current running quadruped robots have a similar construction: a stiff body and four compliant legs. Many studies have indicated that a stiff body without spine motion is a main factor limiting robots' mobility. Investigating spine motion is therefore very important for building robots with better mobility. A planar quadruped robot was designed based on the cheetah's morphology, with a spinal driving joint in its body. When the spinal driving joint acts, the robot has spine motion; otherwise, it does not. Six groups of prototype experiments were carried out with the robot to study the effect of spine motion on mobility. Each group comprises two comparative experiments: the spinal driving joint acts in one experiment but not in the other. The results of the prototype experiments indicate that the average speeds of the robot with spine motion are 8.7%-15.9% higher than those of the robot without spine motion. Furthermore, a simplified sagittal-plane model of quadruped mammals is introduced, which also has a spinal driving joint. Using a process similar to the prototype experiments, six groups of simulation experiments were conducted with the simplified model. The results of the simulation experiments show that the maximum rear-leg horizontal thrusts of the simplified model with spine motion are 68.2%-71.3% larger than those of the simplified model without spine motion. Hence, spine motion can increase the average running speed, and the underlying reason for the speed increase is the improvement in maximum rear-leg horizontal thrust.

  16. Efficient Execution Methods of Pivoting for Bulk Extraction of Entity-Attribute-Value-Modeled Data

    PubMed Central

    Luo, Gang; Frey, Lewis J.

    2017-01-01

    Entity-attribute-value (EAV) tables are widely used to store data in electronic medical records and clinical study data management systems. Before they can be used by various analytical (e.g., data mining and machine learning) programs, EAV-modeled data usually must be transformed into conventional relational table format through pivot operations. This time-consuming and resource-intensive process is often performed repeatedly on a regular basis, e.g., to provide a daily refresh of the content in a clinical data warehouse. Thus, it would be beneficial to make pivot operations as efficient as possible. In this paper, we present three techniques for improving the efficiency of pivot operations: 1) filtering out EAV tuples related to unneeded clinical parameters early on; 2) supporting pivoting across multiple EAV tables; and 3) conducting multi-query optimization. We demonstrate the effectiveness of our techniques through implementation. We show that our optimized execution method of pivoting using these techniques significantly outperforms the current basic execution method of pivoting. Our techniques can be used to build a data extraction tool to simplify the specification of and improve the efficiency of extracting data from the EAV tables in electronic medical records and clinical study data management systems. PMID:25608318
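The pivot operation the abstract describes can be illustrated with a minimal sketch. The schema, attribute names, and values below are invented for illustration and are not from the paper; the conditional-aggregation query shows the basic transform, with early filtering of unneeded clinical parameters (the paper's first technique):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE eav (patient_id INTEGER, attribute TEXT, value REAL);
INSERT INTO eav VALUES
  (1, 'heart_rate', 72), (1, 'temp_c', 36.6), (1, 'unused_param', 9),
  (2, 'heart_rate', 88), (2, 'temp_c', 38.1), (2, 'unused_param', 7);
""")

# Pivot EAV rows into one conventional relational row per entity,
# via conditional aggregation.
rows = conn.execute("""
SELECT patient_id,
       MAX(CASE WHEN attribute = 'heart_rate' THEN value END) AS heart_rate,
       MAX(CASE WHEN attribute = 'temp_c'     THEN value END) AS temp_c
FROM eav
WHERE attribute IN ('heart_rate', 'temp_c')   -- early filtering of unneeded tuples
GROUP BY patient_id
ORDER BY patient_id
""").fetchall()
# rows -> [(1, 72.0, 36.6), (2, 88.0, 38.1)]
```

Filtering in the WHERE clause before grouping means tuples for unneeded parameters never enter the aggregation, which is the intuition behind the paper's first optimization.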

  17. Three-tier multi-granularity switching system based on PCE

    NASA Astrophysics Data System (ADS)

    Wang, Yubao; Sun, Hao; Liu, Yanfei

    2017-10-01

    With the growing demand for business communications, optical path switching controlled by electrical signal processing can no longer meet demand. Multi-granularity switching systems, which improve node routing and switching capabilities, have therefore emerged. In a traditional network, each node is responsible for calculating paths and synchronizing the state of the whole network, which increases the burden on the network; the concept of the path computation element (PCE) was therefore proposed. The PCE is responsible for routing and allocating resources in the network [1]. In traditional band-switched optical networks, the wavelength is used as the basic routing unit, resulting in relatively low wavelength utilization. Owing to the wavelength-continuity constraint, the routing design of waveband technology becomes complicated, which directly affects system utilization. In this paper, optical-code granularity is adopted. Optical codes are free of the continuity constraint, and their number is more flexible than that of wavelengths. For the introduction of optical code switching, we propose a Code Group Routing Entity (CGRE) algorithm. In short, combining a three-tier multi-granularity optical switching system with PCE can simplify the network structure, reduce node load, and enhance network scalability and survivability, realizing an intelligent optical network.

  18. Modeling and simulation of dense cloud dispersion in urban areas by means of computational fluid dynamics.

    PubMed

    Scargiali, F; Grisafi, F; Busciglio, A; Brucato, A

    2011-12-15

    The formation of toxic heavy clouds as a result of sudden accidental releases from mobile containers, such as road tankers or railway tank cars, may occur inside urban areas, so the problem of evaluating their consequences arises. Due to the semi-confined nature of the dispersion site, simplified models may often be inappropriate. As an alternative, computational fluid dynamics (CFD) has the potential to provide realistic simulations even for geometrically complex scenarios, since the heavy gas dispersion process is described by basic conservation equations with a reduced number of approximations. In the present work a commercial general purpose CFD code (CFX 4.4 by Ansys®) is employed for the simulation of dense cloud dispersion in urban areas. The simulation strategy proposed involves a stationary pre-release flow field simulation followed by a dynamic after-release flow and concentration field simulation. To generalize the results, the computational domain is modeled as a simple network of straight roads with regularly distributed blocks mimicking the buildings. Results show that the presence of buildings lowers concentration maxima and enlarges the side spread of the cloud. Dispersion dynamics is also found to be strongly affected by the quantity of heavy gas released. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. NASA Manufacturing and Test Requirements for Normally Closed Pyrovalves for Hazardous Flight Systems Applications

    NASA Technical Reports Server (NTRS)

    McDougle, Stephen H.

    2015-01-01

    Pyrovalves (Figure 1, Basic Pyrovalve Design and Features) are typically lighter, more reliable, and in most cases less expensive than other types of valves. They also consume less electrical power. They are single-use devices that are used in propulsion systems to isolate propellants or pressurant gases. These fluids may be hazardous because of their toxicity, reactivity, temperature, or high pressure. Note that in the simplified block diagram below not all detail features are shown, so that those of major interest are more prominent. The diagram is provided to point out the various features that are discussed in this Specification. Features of some NC parent metal valve designs may differ. In 2013, the NESC concluded an extensive study of the reliability and safety of NC parent metal valves used in payloads carried aboard ELVs. The assessment successfully evaluated technical data to determine the risk of NC parent metal valve leakage or inadvertent activation in ELV payloads. The study resulted in numerous recommendations to ensure personnel and hardware/facility safety during ground processing of ELV payloads. One of those recommendations was to establish a NASA specification for NC parent metal valves. This Specification is a result of that recommendation, which is documented in NESC-RP-10-00614.

  20. Development of vehicle model test-bending of a simple structural surfaces model for automotive vehicle sedan

    NASA Astrophysics Data System (ADS)

    Nor, M. K. Mohd; Noordin, A.; Ruzali, M. F. S.; Hussen, M. H.; Mustapa@Othman, N.

    2017-04-01

    The Simple Structural Surfaces (SSS) method is offered as a means of organizing the process of rationalizing the basic vehicle body structure load paths. The application of this simplified approach is highly beneficial in the development of modern passenger car structure design. In Malaysia, the SSS topic has been widely adopted and is effectively compulsory in automotive programs related to vehicle structures at many higher education institutions. However, no real physical model of SSS has been available to give considerable insight into, and understanding of, the function of each major subassembly in the whole vehicle structure. Motivated by this, a real physical SSS sedan model and the corresponding vehicle-model bending tests are proposed in this work. The proposed approach is relatively easy to understand compared to the Finite Element Method (FEM). The results show that the proposed vehicle model test is useful for physically demonstrating the importance of providing a continuous load path, using the necessary structural components, within the vehicle structure. It is clearly observed that the global bending stiffness reduces significantly as more panels are removed from the complete SSS model. The analysis shows the front parcel shelf is an important subassembly for sustaining bending load.

  1. A procedure for classifying textural facies in gravel‐bed rivers

    USGS Publications Warehouse

    Buffington, John M.; Montgomery, David R.

    1999-01-01

    Textural patches (i.e., grain‐size facies) are commonly observed in gravel‐bed channels and are of significance for both physical and biological processes at subreach scales. We present a general framework for classifying textural patches that allows modification for particular study goals, while maintaining a basic degree of standardization. Textures are classified using a two‐tier system of ternary diagrams that identifies the relative abundance of major size classes and subcategories of the dominant size. An iterative procedure of visual identification and quantitative grain‐size measurement is used. A field test of our classification indicates that it affords reasonable statistical discrimination of median grain size and variance of bed‐surface textures. We also explore the compromise between classification simplicity and accuracy. We find that statistically meaningful textural discrimination requires use of both tiers of our classification. Furthermore, we find that simplified variants of the two‐tier scheme are less accurate but may be more practical for field studies which do not require a high level of textural discrimination or detailed description of grain‐size distributions. Facies maps provide a natural template for stratifying other physical and biological measurements and produce a retrievable and versatile database that can be used as a component of channel monitoring efforts.
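
    A minimal sketch of how a first-tier facies name might be assigned from relative size-class abundances. The class names, adjective forms, and the 30% modifier threshold below are invented for illustration; the paper's actual ternary-diagram boundaries differ:

```python
def classify_texture(fractions, modifier_threshold=0.3):
    """Classify a bed-surface patch by the relative abundance of
    major grain-size classes (first tier of a two-tier scheme).

    fractions: dict mapping a size class (e.g. 'gravel', 'sand') to
    its proportion of the surface; proportions should sum to ~1.
    Returns a facies name such as 'gravel' or 'sandy gravel'.
    """
    ranked = sorted(fractions.items(), key=lambda kv: kv[1], reverse=True)
    dominant = ranked[0][0]
    if len(ranked) > 1 and ranked[1][1] >= modifier_threshold:
        # An abundant subordinate class appears as an adjective.
        subordinate = ranked[1][0]
        adjective = {'sand': 'sandy', 'gravel': 'gravelly',
                     'cobble': 'cobbly'}.get(subordinate, subordinate)
        return f"{adjective} {dominant}"
    return dominant

print(classify_texture({'gravel': 0.6, 'sand': 0.35, 'cobble': 0.05}))
# -> 'sandy gravel'
```

    The second tier of the published scheme would further subdivide the dominant class; the same pattern (rank, then threshold) extends naturally.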

  2. Status of BOUT fluid turbulence code: improvements and verification

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.

    2006-10-01

    BOUT is an electromagnetic fluid turbulence code for the tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases the calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand the code results better and to gain more confidence in them motivated investing effort in rigorous verification of BOUT. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. With the testing complete, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.

  3. Towards automatic patient positioning and scan planning using continuously moving table MR imaging.

    PubMed

    Koken, Peter; Dries, Sebastian P M; Keupp, Jochen; Bystrov, Daniel; Pekar, Vladimir; Börnert, Peter

    2009-10-01

    A concept is proposed to simplify patient positioning and scan planning to improve ease of use and workflow in MR. After patient preparation in front of the scanner the operator selects the anatomy of interest by a single push-button action. Subsequently, the patient table is moved automatically into the scanner, while real-time 3D isotropic low-resolution continuously moving table scout scanning is performed using patient-independent MR system settings. With a real-time organ identification process running in parallel and steering the scanner, the target anatomy can be positioned fully automatically in the scanner's sensitive volume. The desired diagnostic examination of the anatomy of interest can be planned and continued immediately using the geometric information derived from the acquired 3D data. The concept was implemented and successfully tested in vivo in 12 healthy volunteers, focusing on the liver as the target anatomy. The positioning accuracy achieved was on the order of several millimeters, which turned out to be sufficient for initial planning purposes. Furthermore, the impact of nonoptimal system settings on the positioning performance, the signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) was investigated. The present work proved the basic concept of the proposed approach as an element of future scan automation. (c) 2009 Wiley-Liss, Inc.

  4. A micropatterning and image processing approach to simplify measurement of cellular traction forces

    PubMed Central

    Polio, Samuel R.; Rothenberg, Katheryn E.; Stamenović, Dimitrije; Smith, Michael L.

    2012-01-01

    Quantification of the traction forces that cells apply to their surroundings has been critical to the advancement of our understanding of cancer, development and basic cell biology. This field was made possible through the development of engineered cell culture systems that permit optical measurement of cell-mediated displacements and computational algorithms that allow conversion of these displacements into stresses and forces. Here, we present a novel advancement of traction force microscopy on polyacrylamide (PAA) gels that addresses limitations of existing technologies. Through an indirect patterning technique, we generated PAA gels with fluorescent 1 μm dot markers in a regularized array. This improves existing traction measurements since (i) multiple fields of view can be measured in one experiment without the need for cell removal; (ii) traction vectors are modeled as discrete point forces, and not as a continuous field, using an extremely simple computational algorithm that we have made available online; and (iii) the pattern transfer technique is amenable to any of the published techniques for producing patterns on glass. In the future, this technique will be used for measuring traction forces on complex patterns with multiple, spatially distinct ligands in systems for applying strain to the substrate, and in sandwich cultures that generate quasi-three-dimensional environments for cells. PMID:21884832
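
    The point-force idea in (ii) can be illustrated with a hypothetical sketch: if each dot in the regular array is treated as an independent linear elastic marker (a deliberate simplification of the substrate's true elastic response, with an invented stiffness constant), force estimates follow directly from the measured dot displacements:

```python
import numpy as np

def point_forces(rest_xy, deformed_xy, stiffness=2.5):
    """Estimate discrete point forces from marker displacements.

    rest_xy, deformed_xy: (N, 2) arrays of dot positions without and
    with the cell attached (um); stiffness is an assumed, illustrative
    spring constant (nN/um), not a value from the paper.
    """
    disp = np.asarray(deformed_xy) - np.asarray(rest_xy)
    forces = stiffness * disp                      # F = k * d per marker
    magnitudes = np.linalg.norm(forces, axis=1)    # per-dot force (nN)
    return forces, magnitudes

rest = np.array([[0.0, 0.0], [2.0, 0.0]])
pulled = np.array([[0.4, 0.0], [2.0, 0.3]])
forces, mags = point_forces(rest, pulled)
print(mags)  # per-dot force magnitudes in nN
```

    The real computation couples markers through the gel's elasticity, but the discrete-point-force framing is what keeps the published algorithm simple.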

  5. Aeroacoustic Simulation of Nose Landing Gear on Adaptive Unstructured Grids With FUN3D

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Khorrami, Mehdi R.; Park, Michael A.; Lockard, David P.

    2013-01-01

    Numerical simulations have been performed for a partially-dressed, cavity-closed nose landing gear configuration that was tested in NASA Langley's closed-wall Basic Aerodynamic Research Tunnel (BART) and in the University of Florida's open-jet acoustic facility known as the UFAFF. The unstructured-grid flow solver FUN3D, developed at NASA Langley Research Center, is used to compute the unsteady flow field for this configuration. Starting with a coarse grid, a series of successively finer grids was generated using the adaptive gridding methodology available in the FUN3D code. A hybrid Reynolds-averaged Navier-Stokes/large eddy simulation (RANS/LES) turbulence model is used for these computations. Time-averaged and instantaneous solutions obtained on these grids are compared with the measured data. In general, the correlation with the experimental data improves with grid refinement. A similar trend is observed for sound pressure levels obtained by using these CFD solutions as input to a Ffowcs Williams-Hawkings noise propagation code to compute the farfield noise levels. In general, the numerical solutions obtained on adapted grids compare well with the hand-tuned enriched fine grid solutions and experimental data. In addition, the grid adaption strategy discussed here simplifies the grid generation process and results in improved computational efficiency of CFD simulations.

  6. Theoretical and numerical studies on physics and dynamics of orographic precipitation associated with tropical cyclones over mesoscale mountains

    NASA Astrophysics Data System (ADS)

    Sever, Gokhan

    A series of systematic two- and three-dimensional (2D/3D) idealized numerical experiments was conducted to investigate the combined effects of dynamical and physical processes on orographic precipitation (OP) with varying incoming basic flow speed (U) and CAPE in a conditionally unstable uniform flow. The three moist flow regimes identified by Chu and Lin are reproduced using the CM1 model in low-resolution (Δx = 1 km) 2D simulations. A new flow regime, namely Regime IV (U > 36 m s-1), is characterized by gravity waves, heavy precipitation, and a lack of upper-level wave breaking and turbulence over the lee slope. The regime transition from III to IV at about 36 m s-1 can be explained by the transition from upward-propagating gravity waves to evanescent flow, which can be predicted using a moist mountain wave theory. Although the basic features are captured well at low grid resolution, high-resolution (Δx = 100 m) 2D/3D simulations are required to resolve precipitation distribution and intensity at higher basic winds (U > 30 m s-1). These findings may be applied to examine the performance of moist and turbulence parameterization schemes. Based on 3D simulations, gravity wave-induced severe downslope winds and turbulent mixing within the hydraulic jump reduce OP in Regime III; in Regime IV, precipitation amount and spatial extent are intensified as the upper-level wave breaking vanishes and updrafts strengthen. Similar experiments were performed with a low-CAPE sounding to assess the evolution of OP in an environment similar to that observed in tropical cyclones. These low-CAPE simulations show that precipitation is nearly doubled at high wind speeds compared to the high-CAPE results. Based on a microphysics budget analysis, two factors are identified to explain this difference: 1) warm-rain formation processes (auto-conversion and accretion) are more effective in the low-CAPE environment, and 2) even though rain production (via graupel and snow melting) is intense in high CAPE, strong downdrafts and advection-induced evaporation tend to deplete precipitation before it reaches the ground. Overall, in both 2D and 3D high-wind-speed simulations, the pattern of the precipitation distribution resembles the bell-shaped mountain profile, with the maximum located over the mountain peak. This result has the potential to simplify the parameterization of OP in terms of two control parameters and might be applicable to global weather and climate modeling.

  7. Sketchcode: A Documentation Technique for Computer Hobbyists and Programmers

    ERIC Educational Resources Information Center

    Voros, Todd L.

    1978-01-01

    Sketchcode is a metaprogramming pseudo-language documentation technique intended to simplify the process of program writing and debugging for both high- and low-level users. Helpful hints and examples for the use of the technique are included. (CMV)

  8. Flexible Data Link

    DTIC Science & Technology

    2015-04-01

    …DDC) results in more complicated digital (FPGA) processing, yet simplifies the analog design significantly while improving the quality of the… CP: Cyclic Prefix; DAC: Digital to Analog Converter; DDC: Digital Down Converter; DDR: Double Data Rate; DUC: Digital Up Converter; ENOB: Effective…

  9. Update on Canada.

    ERIC Educational Resources Information Center

    Hochstadt, John Webster

    1994-01-01

    Gift planning is increasing in Canada's colleges and universities to offset effects of retrenchment. New annuity vehicles and the emergence of university Crown Foundations offer tax breaks that support private giving to institutions. In addition, a simplified process for gifts is anticipated. (MSE)

  10. Simplified process for preparation of schizophyllan solutions for biomaterial applications

    USDA-ARS?s Scientific Manuscript database

    Schizophyllan is a biopolymer commercially produced for pharmaceutical and cosmetics uses. However, schizophyllan also has potential biomaterial applications. Schizophyllan is conventionally produced from glucose and recovered by diafiltration and ultrafiltration to produce a highly purified product...

  11. A simplified rotor system mathematical model for piloted flight dynamics simulation

    NASA Technical Reports Server (NTRS)

    Chen, R. T. N.

    1979-01-01

    The model was developed for real-time pilot-in-the-loop investigation of helicopter flying qualities. The mathematical model included the tip-path plane dynamics and several primary rotor design parameters, such as flapping hinge restraint, flapping hinge offset, blade Lock number, and pitch-flap coupling. The model was used in several exploratory studies of the flying qualities of helicopters with a variety of rotor systems. The basic assumptions used and the major steps involved in the development of the set of equations listed are described. The equations consisted of the tip-path plane dynamic equation, the equations for the main rotor forces and moments, and the equation for control phasing required to achieve decoupling in pitch and roll due to cyclic inputs.

  12. TNSPackage: A Fortran2003 library designed for tensor network state methods

    NASA Astrophysics Data System (ADS)

    Dong, Shao-Jun; Liu, Wen-Yuan; Wang, Chao; Han, Yongjian; Guo, G.-C.; He, Lixin

    2018-07-01

    Recently, tensor network states (TNS) methods have proven to be very powerful tools for investigating strongly correlated many-particle physics in one and two dimensions. The implementation of TNS methods depends heavily on operations on tensors, including contraction, permutation, reshaping, SVD, and so on. Unfortunately, the most popular computer languages for scientific computation, such as Fortran and C/C++, lack a standard library for such operations, which makes coding TNS methods very tedious. We have developed a Fortran2003 package that includes all the basic tensor operations needed for TNS. It is user-friendly and flexible enough for different forms of TNS, and therefore greatly simplifies the coding work for TNS methods.
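
    The basic operations such a library wraps can be illustrated in NumPy; this is a sketch of the generic TNS workflow (contract, permute, reshape, SVD), not of TNSPackage's Fortran API:

```python
import numpy as np

# A rank-3 tensor, e.g. a matrix product state (MPS) site tensor
# with (left bond, physical, right bond) indices.
A = np.random.rand(4, 2, 5)
B = np.random.rand(5, 2, 3)

# Contraction: sum A's right bond index against B's left bond index.
C = np.tensordot(A, B, axes=([2], [0]))        # shape (4, 2, 2, 3)

# Permutation (index reordering) followed by reshaping into a matrix.
M = C.transpose(0, 1, 3, 2).reshape(4 * 2, 3 * 2)

# SVD, the workhorse of bond truncation in TNS algorithms.
U, S, Vt = np.linalg.svd(M, full_matrices=False)

assert C.shape == (4, 2, 2, 3)
assert np.allclose(U @ np.diag(S) @ Vt, M)     # exact reconstruction
```

    A TNS code repeats exactly these four steps thousands of times per sweep, which is why bundling them behind a uniform interface pays off.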

  13. Avoidable errors in dealing with anaphylactoid reactions to iodinated contrast media.

    PubMed

    Segal, Arthur J; Bush, William H

    2011-03-01

    Contrast reactions are much less common today than in the past, principally because of the current predominant use of low- and iso-osmolar contrast media in place of the high-osmolality contrast media used previously. As a result of the significantly diminished frequency, there are now fewer opportunities for physicians to recognize and appropriately treat such adverse reactions. In a review of the literature, combined with our own clinical and legal experience, 12 potential errors were identified; these are reviewed in detail so that they can be avoided by the physician-in-charge. Basic treatment considerations are presented, along with a plan to systematize an approach to contrast reactions, simplify treatment options and plans, and schedule periodic drills.

  14. Fossil fuel and biomass burning effect on climate - Heating or cooling?

    NASA Technical Reports Server (NTRS)

    Kaufman, Yoram J.; Fraser, Robert S.; Mahoney, Robert L.

    1991-01-01

    The basic theory of the effect of pollution on cloud microphysics and its global implications is applied to compare the relative effect of a small increase in the consumption rate of oil, coal, or biomass burning on cooling and heating of the atmosphere. The characteristics of and evidence for the SO2 induced cooling effect are reviewed. This perturbation analysis approach permits linearization, therefore simplifying the analysis and reducing the number of uncertain parameters. For biomass burning the analysis is restricted to burning associated with deforestation. Predictions of the effect of an increase in oil or coal burning show that within the present conditions the cooling effect from oil and coal burning may range from 0.4 to 8 times the heating effect.

  15. Liquid lens based on electrowetting: actual developments on larger aperture and multiple electrodes design for image stabilization or beam steering

    NASA Astrophysics Data System (ADS)

    Berge, Bruno; Broutin, Jérôme; Gaton, Hilario; Malet, Géraldine; Simon, Eric; Thieblemont, Florent

    2013-03-01

    This paper presents experimental results on several commercially available liquid lenses based on electrowetting. It is shown that larger-aperture lenses are of essentially the same optical quality as smaller lenses, sometimes reaching the diffraction limit, opening new application areas for variable lenses in laser science. Regarding response time, the actual performance of electrowetting liquid lenses is presented and compared to a model simulating the internal fluid reorganization, seen as the main source of delay between electrical actuation and the optical response of the lens. This simplified analytical model supports the experimental results in various situations (focus and tilt variations), in both static and dynamic regimes.

  16. TLS from fundamentals to practice

    PubMed Central

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Adams, Paul D.

    2014-01-01

    The Translation-Libration-Screw-rotation (TLS) model of rigid-body harmonic displacements, introduced in crystallography by Schomaker & Trueblood (1968), is now a routine tool in macromolecular studies and is a feature of most modern crystallographic structure refinement packages. In this review we consider a number of simple examples that illustrate important features of the TLS model. Based on these examples, simplified formulae are given for several special cases that may occur in structure modeling and refinement. The derivation of general TLS formulae from basic principles is also provided. This manuscript describes the principles of TLS modeling, as well as selected algorithmic details for practical application. An extensive list of references to applications of TLS in macromolecular crystallography refinement is provided. PMID:25249713

  17. Aeroelastic Considerations For Rotorcraft Primary Control with On-Blade Elevons

    NASA Technical Reports Server (NTRS)

    Ormiston, Robert A.; Rutkowski, Michael (Technical Monitor)

    2001-01-01

    Replacing the helicopter rotor swashplate and blade pitch control system with on-blade elevon control surfaces for primary flight control may significantly reduce weight and drag to improve mission performance. Simplified analyses are used to examine the basic aeroelastic characteristics of such rotor blades, including pitch and flap dynamic response, elevon reversal, and elevon control effectiveness. The profile power penalty associated with deflections of elevon control surfaces buried within the blade planform is also evaluated. Results suggest that with aeroelastic design for pitch frequencies in the neighborhood of 2/rev, reasonable elevon control effectiveness may be achieved and that, together with collective pitch indexing, the aerodynamic profile power penalty of on-blade control surface deflections may be minimized.

  18. Intelligent monitoring system of bedridden elderly

    NASA Astrophysics Data System (ADS)

    Dong, Rue Shao; Tanaka, Motohiro; Ushijima, Miki; Ishimatsu, Takakazu

    2005-12-01

    In this paper we propose a system to detect the physical behavior of bedridden elderly patients. The system is intended to prevent them from falling and being injured. The basic idea of our approach is to measure the body movements of the patient using an acceleration sensor. Based on the measured data, dangerous actions are detected and wireless warning signals are sent to the caseworkers. One feature of the system is that the sensor part is compactly assembled as a wearable unit; another is that it adopts a simplified wireless network, which allows the system to monitor the physical movements of multiple patients. The applicability of the system is now being examined at hospitals.

  19. A stratospheric aerosol model with perturbations induced by the space shuttle particulate effluents

    NASA Technical Reports Server (NTRS)

    Rosen, J. M.; Hofmann, D. J.

    1977-01-01

    A one dimensional steady state stratospheric aerosol model is developed that considers the subsequent perturbations caused by including the expected space shuttle particulate effluents. Two approaches to the basic modeling effort were made: in one, enough simplifying assumptions were introduced so that a more or less exact solution to the descriptive equations could be obtained; in the other approach very few simplifications were made and a computer technique was used to solve the equations. The most complex form of the model contains the effects of sedimentation, diffusion, particle growth and coagulation. Results of the perturbation calculations show that there will probably be an immeasurably small increase in the stratospheric aerosol concentration for particles larger than about 0.15 micrometer radius.
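
    The "simplifying assumptions" branch of such a model can be illustrated with one classical special case: with constant eddy diffusivity K, constant settling speed v_s, and zero net vertical flux, the steady balance of sedimentation against diffusion has an exact exponential solution. The numerical values below are illustrative, not taken from the paper:

```python
import math

def aerosol_profile(z_m, n0=100.0, v_s=1e-4, K=0.5):
    """Steady-state aerosol number density at height z_m above a
    reference level, from the zero-flux balance K dn/dz + v_s n = 0,
    whose exact solution is n(z) = n0 * exp(-v_s * z / K).

    n0: density at the reference level (cm^-3); v_s: settling
    speed (m/s); K: eddy diffusivity (m^2/s) -- all assumed constant.
    """
    return n0 * math.exp(-v_s * z_m / K)

scale_height = 0.5 / 1e-4          # K / v_s: the e-folding height (m)
print(aerosol_profile(scale_height) / 100.0)  # ~ 1/e of the reference value
```

    The paper's "few simplifications" branch drops the constant-coefficient and zero-flux assumptions and adds growth and coagulation, at which point only a numerical solution remains tractable.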

  20. A multitasking finite state architecture for computer control of an electric powertrain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burba, J.C.

    1984-01-01

    Finite state techniques provide a common design language between the control engineer and the computer engineer for event driven computer control systems. They simplify communication and provide a highly maintainable control system understandable by both. This paper describes the development of a control system for an electric vehicle powertrain utilizing finite state concepts. The basics of finite state automata are provided as a framework to discuss a unique multitasking software architecture developed for this application. The architecture employs conventional time-sliced techniques with task scheduling controlled by a finite state machine representation of the control strategy of the powertrain. The complexities of excitation variable sampling in this environment are also considered.
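
    The scheduling idea can be illustrated with a minimal sketch: the current state of a finite state machine selects which control tasks run in each time slice. The state, event, and task names below are invented for illustration, not taken from the paper:

```python
# Transition table: (current state, event) -> next state.
TRANSITIONS = {
    ('idle',  'key_on'):  'ready',
    ('ready', 'accel'):   'drive',
    ('drive', 'brake'):   'regen',
    ('regen', 'stopped'): 'ready',
    ('drive', 'fault'):   'shutdown',
}

# Tasks scheduled into the time slice while in each state.
TASKS = {
    'idle':     [],
    'ready':    ['monitor_battery'],
    'drive':    ['monitor_battery', 'torque_control'],
    'regen':    ['monitor_battery', 'regen_control'],
    'shutdown': ['log_fault'],
}

class PowertrainController:
    def __init__(self):
        self.state = 'idle'

    def handle(self, event):
        # Unrecognized events leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)

    def time_slice(self):
        # In the real system each task would run in its time slot;
        # here we just return the schedule for the current state.
        return TASKS[self.state]

ctrl = PowertrainController()
ctrl.handle('key_on')
ctrl.handle('accel')
print(ctrl.state, ctrl.time_slice())
```

    Keeping the strategy in data tables rather than in nested conditionals is what makes such a controller maintainable by both the control and the computer engineer.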
